SERAPHIM, my custom webserver, logs how long requests take. When I'm developing on UniVerse I can see how many milliseconds a request takes, but when I run it on ScarletDME I see every request taking just 0ms. I wish I could take that at face value, but unfortunately my slow site is slapping me in the face.
What's really happening is that the TIME function differs between UniVerse and ScarletDME. In UniVerse I get the milliseconds as a fraction, while in ScarletDME I get just whole seconds. As the requests all finish in under a second, ScarletDME is giving me something accurate but not what I want.
Time in Pick, and by extension ScarletDME, is measured in seconds since midnight. Later versions of OpenQM already support millisecond times, so I'll need to patch the feature in. The way OpenQM does it is with a mode you can set in a program, or set in the VOC so that it takes effect for all programs. This is the TIME.MS mode, and it enables millisecond resolution from the TIME function.
This seems like the correct way to go about things, and the rest of this post will hopefully outline how I get there. The first thing I did was grep for an existing mode, and I found that the implementation of the modes is quite straightforward: you add a mode definition and then you can use the BITTEST function to test for it.
I think the best place is going to be in the locate statement that checks and triggers an opcode. I also see examples of the opcodes getting flipped based on the mode, so it looks like variations aren't done in the same function but rather as separate ops.
I was debating whether to add an if statement to the op_time function or to create an op_timems function, and reading the code made it clear that a separate op_timems function makes more sense. The logic is going to be: check for the TIME.MS mode, then set the intrinsic to be used based on that mode.
I was originally going to put my logic in the set.kernel.modes gosub, but then I realized that you could toggle the mode multiple times in the same program, so it's likely that the mode needs to get checked multiple times. This means the best place is going to be right when the time opcode is about to be triggered. This assumes that the mode can be toggled within a single program, which I haven't confirmed.
Before I start on my op_timems function, I'm going to move op_time to Zig and make sure that's working. After that I'll create an op_timems function, add it to the system, and then finally figure out how to actually get seconds.ms working.