[UT2K4] Unclock/StopWatch weirdness


pingz

New Member
Aug 2, 2003
I'm trying to time some code and I'm getting some weird results. I'm using code like the following...


Code:
exec function DoMyTest()
{
	local float elapsed;

	Clock( elapsed );

	// do some junk...

	UnClock( elapsed );

	ClientMessage( elapsed$ "ms" );
}
The problem is that I get what seem to be garbage elapsed times. For instance, I'll get 324.34ms, 322.47ms, then -500.23ms. Negative time makes no sense to me. I even tried using StopWatch and got similar results. I've also noticed that timing something noticeably longer than another function will sometimes return less elapsed time.

I'm running on a P4 2.8GHz machine, and in my own C++ code I've never noticed any timer error.

Anyone seen these issues before? Is there another method to get timing in at least 100 millisecond resolution in UT2K4?
 

BinarySystem

Banned
Jun 10, 2004
I have noticed that behaviour in those functions before. I can only speculate on the internal workings of the Unreal Engine, but it doesn't appear to use a simple interval timer, as it has more resolution than the interval timer does.

The Clock function puts a value in the timer variable which appears to correspond to the current time in some format. Then Unclock takes the difference between the current time in that format and the stored value.

I believe that it is possible (and even somewhat common) for the timer in this format to "wrap around," thus occasionally giving a negative value. Whenever I time things I time it many times, discard any negative values, and then average them over thousands of iterations.
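For what it's worth, that discard-and-average filtering can be sketched in a few lines (Python standing in for UnrealScript here; the sample values are just the readings from the first post, purely for illustration):

```python
# Sketch: average repeated timing samples, discarding the negative
# readings that show up when the counter wraps mid-measurement.
def average_valid_samples(samples):
    """Mean of the non-negative samples, or None if every sample was bad."""
    valid = [s for s in samples if s >= 0]
    if not valid:
        return None
    return sum(valid) / len(valid)

# Readings like the ones reported above (in ms, as UnClock returned them).
readings = [324.34, 322.47, -500.23, 318.90]
print(average_valid_samples(readings))
```

Averaging over thousands of iterations, as described above, just means feeding in a much longer list.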

As for getting a lesser time from the function when using a slower function, I have never had that problem.
 

pingz

New Member
Aug 2, 2003
"occasionally giving a negative value" is an understatement in my situation: 3 out of 5 values I get are negative, and I suspect they're also wildly incorrect. So I thought I'd do the operation 100 times, time that as a whole, and divide by 100. Timing it with my watch, it took 63 seconds... what does UnClock give me?

-223.67ms!!!

This thing is totally broken even on long intervals. How can I use it if it's wrong more than half the time?
 

aardvarkk

New Member
May 7, 2004
This may be useless to you, but for debugging purposes you could set up a variable that accumulates DeltaTime in Tick, like this:

Code:
var float MyVar;

function Tick(float DeltaTime)
{
	MyVar += DeltaTime;
}

Then you can output MyVar and see how it matches up with Clock and UnClock, just to see what sorts of values you SHOULD be getting, and maybe spot a pattern as to when it's negative...

hope this helps!
 

Shambler[sixpack]

New Member
May 3, 2001
That wouldn't be very helpful, because Clock and UnClock usually both run within a single tick... so DeltaTime will be the same.

EDIT: More clearly put :p Clock and UnClock should ideally be called DURING a tick, not across 2 or more ticks.
 

pingz

New Member
Aug 2, 2003
Yeah, I'd have to break the work up over several frames to time something like that, and I don't see that as a reasonable solution.
 

BinarySystem

Banned
Jun 10, 2004
I usually do roughly what you did and take an average, but I discard negative values. It may be that the longer the stretch of time you measure, the more likely a negative result becomes. I use Clock and UnClock only for critical function optimization, so I'm usually only clocking a few lines of code at a time.
 

pingz

New Member
Aug 2, 2003
Well, once I can time something with my watch as taking more time than what Unreal is reporting, I have to lose all faith in Clock/UnClock being right at all. It may be OK for timing things relative to each other, but it isn't giving me real-world time values.
 

BinarySystem

Banned
Jun 10, 2004
Timed with your stopwatch? What sort of time periods are you measuring? I believe Clock/UnClock are only intended for timing very short intervals, on the order of a few milliseconds. Otherwise you are likely running into the counter "wrapping around."

Also, I'm not sure timing with a stopwatch is even valid, as the Clock/UnClock functions probably take a measurable amount of time themselves to execute.
 

pingz

New Member
Aug 2, 2003
I've timed all kinds of things: very short sub-second things, long 60-second things, and stuff in between... all return a mix of positive and negative times. The longer operations not only return negative times, but dead-wrong positive times... 942ms != 54 seconds.

OK, so I decided to dig deeper. I examined UT2004.exe with the Dependency Walker that comes with VC++ .NET 2003. CORE.DLL uses timeGetTime() and ENGINE.DLL uses QueryPerformanceCounter(). I use both of these in my own C++ code without any problems timing anything. Unless Epic invented a new magical timer, Clock/UnClock and StopWatch (the UnrealScript function) are FUBAR'd.
 

BinarySystem

Banned
Jun 10, 2004
Umm, Clock and UnClock probably won't work for things that take a second or more. I believe they're meant for measuring very, very small amounts of time, i.e. fractions of a millisecond. Like I said, they seem to be meant for measuring the elapsed time of critical code sections in loops and such. I wouldn't trust them for anything over a few milliseconds.

Clock stores the current time, in some unknown format, as an int. UnClock takes the current time in that same format, subtracts its argument from the current value, and multiplies the difference by some unknown factor to get the time in milliseconds. The counter is updated extremely frequently, possibly in hardware, so it will roll over very fast if you are measuring any considerable length of time. If it rolls over, you may get a negative value or an erroneous positive one.
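The rollover described here is plain modular arithmetic: if the stop value has wrapped past the top of a signed 32-bit range while the start value hasn't, the subtraction comes out negative. A small sketch (Python; the specific counter values are made up for illustration):

```python
# Sketch: why a wrapping 32-bit counter produces negative elapsed times.
def to_int32(n):
    """Reinterpret an unsigned value as a signed 32-bit integer."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

start = 0x7FFFF000        # counter reading at Clock() time
stop = start + 0x2000     # counter has crossed 2**31 by UnClock() time
elapsed = to_int32(stop) - to_int32(start)
print(elapsed)            # negative: the counter "wrapped around"
```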

If you need to measure a long period of time, you may be able to use Clock and UnClock on individual iterations of a loop and add the values up yourself. You may still get the occasional negative value, which could be discarded, or you could call Clock twice and try to figure out the factor the difference needs to be multiplied by.

If you are timing something longer than a few milliseconds, you might as well just track the Delta argument of Tick: note it normally, then take the Delta of the tick AFTER you've run your test and measure the difference. The resolution of this method is not good enough for measuring the performance of individual instructions or small groups of them, which is what Clock and UnClock are meant for, but it will give you a FAIRLY accurate measure of the time elapsed during the function. It's not perfect, of course, because not all ticks are the same, but if your function takes more than a couple of milliseconds the difference in Delta will be noticeable.
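That Tick-based idea amounts to subtracting a baseline frame time from the frame in which the test ran. A sketch of the arithmetic (Python; the numbers and the helper name are illustrative, not anything from the engine):

```python
# Sketch: estimate a function's cost from the jump in Tick's DeltaTime.
def estimate_cost_ms(baseline_delta, spike_delta):
    """Approximate extra time (ms) spent in the frame that ran the test."""
    return max(0.0, (spike_delta - baseline_delta) * 1000.0)

# A ~16 ms frame that ballooned to ~71 ms suggests ~55 ms of extra work.
print(estimate_cost_ms(0.016, 0.071))
```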
 

pingz

New Member
Aug 2, 2003
64
0
0
Umm, Clock and UnClock probably won't work for things that take a second or more.
That may be true only if Clock/UnClock are unnecessarily crippled.

http://support.microsoft.com/support/kb/articles/Q172/3/38.asp

This article uses VB code, but QueryPerformanceCounter() works with 64-bit integers. Even when properly converted to the 32-bit floats returned by Clock/UnClock(), you can still time intervals of over an hour at a few milliseconds' resolution.

I suspect it's a bug. Most likely they didn't fix Clock/UnClock to work on hyperthreading systems, but did fix the timer used to calculate delta times.

http://support.microsoft.com/default.aspx?scid=kb;en-us;327809

In particular the part...

Some programs work with the result of QueryPerformanceFrequency as a signed integer value, which has only 31 bits of precision and a sign flag. These programs behave incorrectly on these faster CPUs.
That sounds like the behavior I'm getting from Clock/UnClock(). Gonna see if I can make this work with the program compatibility settings.
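The KB article's failure mode is easy to demonstrate: a performance-counter frequency in the gigahertz range simply doesn't fit in 31 bits, so code that stores it in a signed 32-bit integer sees a negative frequency and computes nonsense times. A quick sketch (Python; the 2.8 GHz figure just mirrors the CPU mentioned earlier in the thread):

```python
import struct

# Sketch: a ~2.8 GHz counter frequency overflows a signed 32-bit integer.
freq_hz = 2_800_000_000                    # counter ticks per second
packed = struct.pack("<I", freq_hz & 0xFFFFFFFF)
as_int32 = struct.unpack("<i", packed)[0]  # reinterpret as signed 32-bit
print(as_int32)        # negative: the sign bit is set
print(2**31 - 1)       # the largest value 31 bits can represent
```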
 

BinarySystem

Banned
Jun 10, 2004
I'm not sure how they implemented Clock and Unclock, but it doesn't have "a few millisecond resolution." It has a few MICROSECOND resolution. This may be the reason it rolls over so quickly.
 

pingz

New Member
Aug 2, 2003
64
0
0
Heh... you quote like a Republican. OK, that's a little harsh. =) I said...


you still are able to time increments of over an hour with a few millisecond resolution
What I was illustrating there is that even with longer measurement durations you can still get millisecond resolution without wraparound. The reason it drops to milliseconds is that they convert a 64-bit integer into a float, which has only about 6 significant digits of precision. Once you get past one second you start losing microsecond precision... but it does not wrap around. The 64-bit integer returned from QPC/QPF is accurate to around a microsecond (depending on your specific processor) and doesn't wrap around for about 39 days.
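The precision claim is easy to check: a 32-bit float keeps roughly 6-7 significant decimal digits, so a microsecond riding on top of a value much larger than a second doesn't survive the conversion. A sketch (Python, round-tripping values through a 32-bit float via struct):

```python
import struct

def to_float32(x):
    """Round a Python float down to 32-bit float precision."""
    return struct.unpack("<f", struct.pack("<f", x))[0]

# One microsecond on top of one second survives (just barely)...
print(to_float32(1.000001) != 1.0)                    # True
# ...but one microsecond on top of 100 seconds is lost entirely.
print(to_float32(100.000001) == to_float32(100.0))    # True
```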

Back to the real point. The only way a wraparound could occur is because of a bug, or because they deliberately went out of their way to prevent Clock/UnClock from measuring more than some arbitrarily small amount of time. There's no reasonable point to that... so I say it's a bug.

My only option is to run my tests on a slower PC.
 

BinarySystem

Banned
Jun 10, 2004
Well, the 64-bit point is somewhat moot, because the number that Clock sets into the argument variable is an UnrealScript integer, so you have only 16 bits on either side of zero (not sure if the clock value can be negative or not). This may be the source of the quick wraparound.
 

pingz

New Member
Aug 2, 2003
First, Clock doesn't take an integer; it takes a float, and both ints and floats are 32-bit types. I'm not sure where you learned binary math, but you should ask for your money back... "16 bits on either side of zero" has nothing to do with how negative values work, or with your insistence on wraparound.

This thread is long past pointless, and I regret continuing to respond to it. I won't do so again.
 

BinarySystem

Banned
Jun 10, 2004
Oh, you're right. It is 32 bits, though one is a sign bit. I'm just saying: the values returned by Clock DO roll over, and they do so quickly. My attempt at explaining why is just speculation, but it's not fair to call it a bug. Clock and UnClock both do exactly what they are supposed to do, though perhaps the documentation on how to use them could be improved.