I have an application with a lot of screens (following the MVC pattern), and I want a clean way to find out that the last key was pressed x seconds ago (120 seconds, let's say). Is there a standard way to do this, or do I have to start a timer, overwrite a variable every time a key is pressed, and check the difference between that time and the current time in the timer?
Yes, just record the system time when a key is pressed.
long epoch = System.currentTimeMillis();
When a key is pressed again, you need to check the time difference to see how long it's been idle for.
If you need to trigger things without keypresses, then you need to start a thread which wakes now and again to check the elapsed time, and trigger an event of some kind when the time period has elapsed.
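A minimal sketch of that watchdog approach in Java (the class and names here are illustrative, not a standard API):

import java.util.concurrent.atomic.AtomicLong;

public class IdleWatchdog {
    private final AtomicLong lastKeyTime = new AtomicLong(System.currentTimeMillis());
    private final long idleLimitMs;

    public IdleWatchdog(long idleLimitMs) {
        this.idleLimitMs = idleLimitMs;
    }

    // Call this from your key listener on every keypress.
    public void keyPressed() {
        lastKeyTime.set(System.currentTimeMillis());
    }

    // Background thread that wakes now and again and fires when idle too long.
    public void start(Runnable onIdle) {
        Thread t = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    Thread.sleep(1000); // check once per second
                } catch (InterruptedException e) {
                    return;
                }
                if (System.currentTimeMillis() - lastKeyTime.get() >= idleLimitMs) {
                    onIdle.run();
                    lastKeyTime.set(System.currentTimeMillis()); // avoid re-firing every second
                }
            }
        });
        t.setDaemon(true);
        t.start();
    }
}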
I'm not quite sure how timekeeping works in Linux, short of configuring an NTP server and such.
I am wondering if there is a way for me to make time tick faster in Linux. I would like, for example, for 1 second to tick 10000 times faster than normal.
For clarification, I don't want to make time jump like resetting a clock; I would like to increase the tick rate, whatever it may be.
This functionality is often needed for simulations and for replaying incoming data or events as fast as possible.
The way people solve this is with an event loop, e.g. libevent or boost::asio. The current time is obtained from the event loop (e.g. the time when epoll returned) and stored in an event-loop "current time" variable. Instead of calling gettimeofday or clock_gettime, the time is read from that variable, and all timers are driven by the event loop's current time.
When simulating/replaying, the event loop's current time is assigned the timestamp of the next event, eliminating the delays between events and replaying them as fast as possible. Your timers still work and fire between events just as they would in real time, only without the waiting. For this to work, the saved event stream you replay must of course contain a timestamp for each event.
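A minimal sketch of that idea (the VirtualClock class and its methods are hypothetical, not part of libevent or boost::asio):

import java.util.PriorityQueue;

// A clock owned by the event loop: replayed events advance it,
// and all timers compare against it instead of the wall clock.
public class VirtualClock {
    private long nowMs;
    private final PriorityQueue<ScheduledTimer> timers =
            new PriorityQueue<>((a, b) -> Long.compare(a.deadlineMs, b.deadlineMs));

    static class ScheduledTimer {
        final long deadlineMs;
        final Runnable action;
        ScheduledTimer(long deadlineMs, Runnable action) {
            this.deadlineMs = deadlineMs;
            this.action = action;
        }
    }

    public long now() { return nowMs; }

    public void scheduleAt(long deadlineMs, Runnable action) {
        timers.add(new ScheduledTimer(deadlineMs, action));
    }

    // Replay: jump straight to the next event's timestamp, firing any
    // timers whose deadlines fall in between, in order.
    public void advanceTo(long eventTimestampMs) {
        while (!timers.isEmpty() && timers.peek().deadlineMs <= eventTimestampMs) {
            ScheduledTimer t = timers.poll();
            nowMs = t.deadlineMs;   // a timer observes its own deadline as "now"
            t.action.run();
        }
        nowMs = eventTimestampMs;
    }
}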
I'm working on a balanced ternary watch face.
How can I set the time to the next update?
I'd like to update in the middle of every second, minute, hour and night.
Or, during conservation mode, only in the middle of every minute, hour and night.
The best way to do this would be to find a way to tell the system not to redraw until the desired time has elapsed.
There are different answers to different parts of your question.
While the watch is awake, you are free to update the face whenever you want, at whatever interval you want. Google's analog watch face sample updates once per second, and that's a fine starting point. You'll just want to modify mUpdateTimeHandler with a 500ms offset, and then add checks to determine whether you're also in the middle of a minute, hour, or night.
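For instance, the delay computation might look like this (assuming the sample's handler loop, where mUpdateTimeHandler reschedules itself and MSG_UPDATE_TIME is the sample's message constant):

// Schedule the next update for the middle of the next second.
long nowMs = System.currentTimeMillis();
long msPastSecond = nowMs % 1000;
long delayMs = (msPastSecond < 500)
        ? 500 - msPastSecond      // haven't reached this second's midpoint yet
        : 1500 - msPastSecond;    // aim for the middle of the next second
mUpdateTimeHandler.sendEmptyMessageDelayed(MSG_UPDATE_TIME, delayMs);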
When in ambient mode, WatchFaceService has an onTimeTick() method - but that fires when the minute changes, not "in the middle" of it. My guess is you'll ignore onTimeTick() and roll your own code, using AlarmManager with an offset so it fires in the middle of each minute to do your updates.
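Something along these lines, perhaps, inside the service (a fragment; updatePendingIntent is assumed to be a PendingIntent you've created elsewhere, and the android.app/android.content imports are omitted):

// Fire an exact alarm at the middle of the next minute.
// getSystemService() is available here because a Service is a Context.
AlarmManager alarmManager =
        (AlarmManager) getSystemService(Context.ALARM_SERVICE);
long nowMs = System.currentTimeMillis();
long msPastMinute = nowMs % 60000;
long triggerAtMs = nowMs - msPastMinute + 30000;  // this minute's midpoint
if (msPastMinute >= 30000) {
    triggerAtMs += 60000;                         // already past it; use the next minute
}
alarmManager.setExact(AlarmManager.RTC_WAKEUP, triggerAtMs, updatePendingIntent);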
We have a (very) Legacy application written in VB6 (15 years old?).
The application contains a timer with 300ms interval. The Sub called when the timer ticks executes a batch of code that talks to some SQL servers, prints some labels and so on.
When everything is working OK, this Sub executes in 5ms to 10ms - i.e. before the next timer interval occurs - but it also wastes 290ms before the next tick.
We need to make this application a little faster, and one option is to change the interval to 1ms. Before we do so, I would just like to confirm: will the timer skip the tick entirely if the previous interval is still executing, or will it start building up a stack of calls to the Sub, resulting in a hang after a while? (I am of course assuming all ticks are executed on the same thread as the GUI, so we'll need to use DoEvents after every tick to ensure the UI doesn't hang.)
I’ve tried looking into this, but finding reliable information on the old VB6 timers is proving tricky.
We do have this scheduled in to be re-written in .net using threading & background worker threads - this is just a short term fix that we're looking into.
That's not how VB6 timers work: the Tick event can only fire when your program goes idle and stops executing code. The technical term is "pumps the message loop again". DoEvents pumps the message loop. It is a very dangerous function, since it doesn't only dispatch timers' Tick events, it dispatches all events - including the ones that let the user close your window or start a feature again while it is still busy executing. Don't use DoEvents unless you like to live dangerously or thoroughly understand its consequences.
Your quest to make it 300 times faster is also doomed. For starters, you cannot get a 1 millisecond timer. The clock resolution on Windows isn't nearly high enough. By default it increments 64 times per second. The smallest interval you can get is therefore 16 milliseconds. Secondly, you just can't expect to make slow code arbitrarily faster, not in the least because Tick events don't stack up.
You can ask Windows to increase the clock resolution, it takes a call to timeBeginPeriod(). This is not something you ought to contemplate. If that would actually work, you are bound to get a visit from a pretty crossed dbase admin carrying a blunt instrument when you hit that server every millisecond.
If the timer is a GUI component (i.e. not a thread-pool timer) and fired by WM_TIMER "messages", then the OnTimer events cannot stack up. WM_TIMER is not actually queued to the Windows message queue; it is synthesized when the main thread returns to the message queue AND the timer interval has expired.
"When everything is working OK, this Sub executes in 5ms to 10ms - i.e. before the next timer interval occurs - but it also wastes 290ms before the next tick."
This is exactly what you have set it up to do if the timer interval is 300ms. It is not wasting 290ms, it is waiting until 300ms has elapsed before firing the Tick event again.
If you want it to execute more often, set the Timer interval to 1ms, stop the timer at the start of the Timer event, and start it again when you have finished processing. That way there will only ever be 1ms of idle time between operations.
If you set your timer interval shorter than your execution time, this lock will probably allow you to execute your code as quickly as you can in VB6:
Private isRunning As Boolean

Private Sub Timer1_Timer()
    ' Note: VB6 Timer controls raise the Timer event (Tick is the .NET name)
    On Error GoTo Cleanup   ' make sure the flag is reset even if an error occurs
    If Not isRunning Then
        isRunning = True
        'do stuff
        isRunning = False
    End If
    Exit Sub
Cleanup:
    isRunning = False
End Sub
However, if you are inside this event handler close to 100% of the time, your application will become slow to respond or unresponsive to UI events. If you put a DoEvents inside 'do stuff', you will give the UI a chance to process events, but UI events will then halt execution inside 'do stuff' - imagine moving the window and halting execution. In that case, you probably want to spawn another thread to do the work outside the UI thread, but good luck doing that in VB6 (I hear it's not impossible).
To maximize speed with a looping set of instructions, remove the timer altogether and have the function called once at the end of the program entry point (Sub Main or Form_Load).
Within the function, run a loop and use QueryPerformanceCounter to manage the repeat interval. This way you remove the overhead of the timer message system and can get around the minimum timer interval the Timer control imposes.
Add a DoEvents once at the top of the loop so other events can fire; it also consumes idle time while waiting.
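The pacing idea is language-agnostic, so here is a sketch of the same loop in Java, with System.nanoTime standing in for QueryPerformanceCounter (the 1ms interval and names are illustrative; the VB6 version would Declare QueryPerformanceCounter from kernel32):

// Pacing loop: do the work, then wait on a high-resolution counter until
// the next interval boundary, instead of relying on a timer event.
public class PacedLoop {
    public static void main(String[] args) {
        final long intervalNanos = 1_000_000; // 1ms target interval (illustrative)
        long next = System.nanoTime() + intervalNanos;
        while (true) {
            doStuff();
            // Consume the idle time until the next boundary
            // (this is where the VB6 version would call DoEvents).
            while (System.nanoTime() < next) {
                Thread.onSpinWait();
            }
            next += intervalNanos;
        }
    }

    private static void doStuff() {
        // placeholder for the real work
    }
}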
Our game is an MMO, and our logic server has a game loop, of course. To make the logic modules easy to write, we provide a timer module, which supports registering a real-time timer and triggering it when due. In the game loop, we pass the system time (in milliseconds) to the timer module, and the timer manager checks whether any timers can be triggered. For example, to update a player/monster position, when the player starts to move we update the player position every 200ms.
But when the game loop runs too much logic, it uses too much time in a single frame, and in the next frame some timers fire later than their scheduled time. That actually causes bugs. For example, if one frame takes 1 second and the move timer was scheduled for 800ms, the timer isn't triggered until 1000ms, later than expected.
So is there any better solution to this problem? For example, can we implement a timer that depends only on our game, not on the real computer time?
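One possibility, in the spirit of the event-loop technique described earlier: drive all timers from a game-owned clock that advances a fixed step per logic tick rather than by wall-clock time. A minimal sketch (names hypothetical):

// A game-owned clock: it advances by a fixed step per logic tick, so timer
// deadlines are measured in game time rather than wall-clock time.
public class GameClock {
    private long gameTimeMs = 0;
    private final long tickStepMs;

    public GameClock(long tickStepMs) {
        this.tickStepMs = tickStepMs;
    }

    public long now() { return gameTimeMs; }

    // Called once per iteration of the game loop, regardless of how long
    // the frame really took; a slow frame no longer skews timer deadlines.
    public void tick() { gameTimeMs += tickStepMs; }
}

The trade-off is that game time drifts behind real time under sustained overload; a common refinement is to run several catch-up ticks after a slow frame so game time converges back toward the wall clock.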
I started a new Windows Forms project in Visual Studio 2010 using C++.
There is only one timer, configured to generate an event every 1ms (1 millisecond).
Inside the timer event handler, I just increment a variable named Counter (which is used only in this event) and write its current value to a textbox, so that I can see it.
Considering that the timer event occurs every 1ms, the Counter variable should increment 1000 times per second, but it takes around 15 seconds to increment 1000 times. After 15 seconds the value shown in the textbox is 1000.
I set the timer interval to 1ms, but it seems the event occurs only every 15ms, because Counter took 15 times longer (15 seconds) than it should in theory to reach 1000 (1 second = 1000 * 1ms).
Does someone have an idea how to solve this problem?
I need to generate an event every 1ms, in which I will call another function.
How could I generate an event every 1ms? Or at an even shorter interval, if possible?
A person on another forum told me to create a thread to do this job, but I don't know how to do that.
I'm using Windows 7 Professional 64-bit; I don't know if a 64-bit OS has anything to do with this issue. I think the PC hardware is enough to generate the event: Core 2 Duo 2GHz and 3GB RAM.
http://img716.imageshack.us/img716/3627/teste1ms.png
The documentation for System.Windows.Forms.Timer states that
The Windows Forms Timer component is single-threaded, and is limited to an accuracy of 55 milliseconds
So that should explain the discrepancy. Your approach seems a little wrong, IMHO: having a thread wake up every 1ms, and precisely at that, is very hard to do in a preemptive multitasking OS.
What you can do instead is:
1. Initialize a counter to zero and a high-precision time variable to the current time.
2. Have a timer wake you up periodically.
3. When your timer fires, use a high-precision clock to read the current time.
4. Compute the delta between the new and old high-precision times, and increment the counter by as much as you expect it to actually be - or call your callback function that many times.
This approach will be way more precise than any timer event.
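A sketch of that approach (in Java for illustration, since the technique is platform-agnostic; a Windows Forms version would pair a coarse timer with Stopwatch or QueryPerformanceCounter as the high-precision clock):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Coarse timer + high-resolution clock: the timer only wakes us up;
// the counter is advanced by the measured elapsed time, so its accuracy
// does not depend on the timer firing precisely every 1ms.
public class PreciseCounter {
    public static void main(String[] args) {
        final long[] lastNanos = { System.nanoTime() };
        final long[] counter = { 0 };

        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        timer.scheduleAtFixedRate(() -> {
            long nowNanos = System.nanoTime();
            long elapsedMs = (nowNanos - lastNanos[0]) / 1_000_000;
            if (elapsedMs > 0) {
                counter[0] += elapsedMs;            // one increment per elapsed millisecond
                lastNanos[0] += elapsedMs * 1_000_000; // keep the sub-ms remainder
                System.out.println("counter = " + counter[0]);
            }
        }, 15, 15, TimeUnit.MILLISECONDS);          // coarse wake-up, like the observed 15ms
    }
}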