When calculating a 32-bit ID from a timestamp (TDateTime), I get a strange discrepancy: in certain situations the value differs between processors.
The fTimeStamp field is read from a Double field in a SQLite database.
The code below calculates a 32-bit ID (lIntStamp) from fTimeStamp, but in some (rare) situations the value differs between computers even though the source database file is exactly the same (i.e. the Double stored in the file is the same).
lIntStamp := Round(fTimeStamp * 864000); //864000 = 24*60*60*10 = steps of 1/10th second
lIntStamp := lIntStamp and $FFFFFFFF;
The precision of TDateTime (Double) is 15 significant digits, and the rounded value in the code uses only 11 digits, so there should be enough information to round correctly.
To mention an example of values: in a specific test run the value of lIntStamp was $74AE699B on a Windows computer and $74AE699A on an iPad (= only last bit is different).
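Those two hex IDs are consistent with the Round result differing by exactly one unit on the two platforms. A quick check (in Python, with the rounded products 36317325723 and 36317325722 from the test run described in this post) shows the 32-bit mask producing precisely the two observed values:

```python
# Rounded products from the two platforms; masking to 32 bits yields
# the two IDs reported in the question.
win_round = 36317325723   # x86 (Windows) rounds the product up
arm_round = 36317325722   # ARM (iPad) rounds it down

print(hex(win_round & 0xFFFFFFFF))  # 0x74ae699b
print(hex(arm_round & 0xFFFFFFFF))  # 0x74ae699a
```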
Is the Round function implemented differently on each platform?
PS. Our target platforms are currently Windows, macOS and iOS.
I made a small test program based on the comments:
var
  d: Double;
  id: Int64 absolute d;
  lDouble: Double;
begin
  id := $40E4863E234B78FC;
  lDouble := d * 864000;
  Label1.Text := IntToStr(Round(d * 864000)) + ' ' + FloatToStr(lDouble) + ' ' + IntToStr(Round(lDouble));
end;
The output on Windows is:
36317325723 36317325722.5 36317325722
On the iPad the output is:
36317325722 36317325722.5 36317325722
The difference is in the first number, which shows the rounding of the intermediate calculation. The problem happens because x86 evaluates the intermediate product in 80-bit extended precision, while ARM computes it in 64-bit doubles: the extended-precision result is slightly above ...722.5 and rounds up, whereas the 64-bit product lands exactly on ...722.5, and round-half-to-even then gives the even value ...722. Once the product is stored in a Double variable (lDouble), both platforms round the same 64-bit value and agree.
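The effect can be reproduced entirely in 64-bit arithmetic, for example in Python (used here purely as a stand-in for the Delphi code, since Python floats are IEEE doubles and Python's round, like Delphi's, rounds halves to even): the double product is exactly ...722.5 and rounds to the ARM result, while exact rational arithmetic shows the true product is slightly above the halfway point, which is what the 80-bit x87 intermediate sees:

```python
import struct
from fractions import Fraction

# Reconstruct the exact Double from the bit pattern in the test program.
bits = 0x40E4863E234B78FC
d = struct.unpack('<d', struct.pack('<Q', bits))[0]

# In pure 64-bit double arithmetic the product rounds to exactly ...722.5,
# and round-half-to-even then gives the ARM/iPad result.
product = d * 864000
print(product)         # 36317325722.5
print(round(product))  # 36317325722

# The mathematically exact product is a tiny amount ABOVE the halfway
# point, which is what the 80-bit extended intermediate on x86 sees,
# so the x87 Round goes up to ...723 instead.
excess = Fraction(d) * 864000 - Fraction('36317325722.5')
print(excess > 0)      # True
```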