I have noticed that when using the built-in pause cutscene action, there is a delay between the action triggering and the cutscene actually pausing.
The length of the delay is not consistent. Without digging too deep, I assume this is due to floating point precision.
Any tips to mitigate this issue?
Worst case I will have to pad the pauses.
If it is the float issue, maybe a long-term fix could be to store the time values as milliseconds in a uint64 or something.
I attached an image of where the scrubber stops after hitting a pause action.
Windows 7 64bit
Latest version of Slate
Thanks for bringing this to my attention, and very sorry for the late reply, but somehow I missed your post.
The reason this is happening is that the OnEnter of an action clip is called after the cutscene's time has already been advanced by deltaTime, and since deltaTime varies from frame to frame, the length of the delay varies as well.
The best way to work around this, would be to change the PauseCutscene.cs clip to the following code:
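A sketch of the idea follows. The exact names here (`DirectorActionClip` as the base class, a settable `root.currentTime`, and `Pause` on the Cutscene) are assumptions about Slate's public API, so treat this as an illustration of the approach rather than the exact replacement:

```csharp
using Slate;

namespace Slate.ActionClips
{
    [Category("Control")]
    public class PauseCutscene : DirectorActionClip
    {
        protected override void OnEnter()
        {
            // By the time OnEnter fires, the cutscene time has already been
            // advanced past this clip's startTime by up to one frame's
            // deltaTime. Snap the time back to the exact clip start time
            // before pausing, so the scrubber stops on the clip boundary
            // instead of overshooting by a variable amount.
            var cutscene = (Cutscene)root;
            cutscene.currentTime = startTime; // assumed settable; re-samples at startTime
            cutscene.Pause();
        }
    }
}
```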
Apparently, this will snap back and sample the cutscene time at the exact start time of the action clip. 🙂
Sorry for the late reply.
Hmm. The PauseCutscene clip should not work on/pause a sub-cutscene in the first place, since the sub-cutscene is not considered to be playing; it only gets sampled by its parent cutscene, so ‘root.Pause()’ should not have any effect. This is weird 🙂
Can you please provide a bit more information about your setup and the calls you make, so that I can better understand what you mean?