Calculate the stop time for a stepper motor after the stop command is initiated - motordriver

Currently we have a stepper motor with a load configured in open loop, so there is no feedback to determine the angle the motor turns due to inertia after the stop command is given.
Is there a way, using the open-loop system, to find the angle the motor rotates before it comes to rest after the stop command is initiated?
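With no feedback, one back-of-envelope estimate (an assumption, not a measured answer) is to treat the rotor plus load as decelerating at a roughly constant rate α (rad/s²) set by friction and the motor's detent/holding torque, with values taken from the datasheet or measured. Ordinary kinematics then gives:

t_stop = ω0 / α
θ_overrun = ω0^2 / (2·α)

where ω0 is the angular speed (rad/s) when the stop command is issued. The real deceleration of a stepper is not constant, so this is only a first estimate and should be checked against a measurement.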

Related

Godot 3.4.4 How to stop audio in html export?

There is a problem with audio playback in the exported instance of the game. The sound is played using the AudioStreamPlayer node. I export the game to HTML5 and run it either on a local server or on any service that executes HTML code.
If during the game you minimize the browser, change the tab, or otherwise make the game window NOT active, then all game processes are suspended: the movement of all objects stops and all timers are paused. BUT the music continues to play.
I tried handling a focus notification, and handling cursor input to turn off the music, but this is not an option: if you minimize the browser via Alt+Tab while on the tab with the game, the focus does not go away and input is still expected (while, as I said earlier, all other processes related to movement and timers are suspended).
How do I stop this audio stream in this case?
Thanks.

How to make a function run at a custom fps in Godot using GDScript

I want to make a timer function in Godot that would use the computer's frame rate in order to run code at whatever fps I choose (e.g. 60 fps). So far I have code for a basic timer:
var t = Timer.new()
t.set_wait_time(time) # 'time' is the wait duration in seconds
t.set_one_shot(true) # fire once, then stop
self.add_child(t) # the Timer only counts down while in the scene tree
t.start()
yield(t, "timeout") # suspend this function until the timer fires
t.queue_free() # free the node once we are done with it
However, rather than having the time variable be a set number, I would like it to change based on how fast the computer can run, or on the time between frames.
I want to make a timer function in Godot that would use the computers frame rate
That would be code in _process. If you have VSync enabled in project settings (under Display -> Window -> Vsync; it is enabled by default), _process will run once per frame.
run code at whatever fps I choose (ex. 60 fps)
Then that is not the computer's frame rate, but the one you choose. For that you can use _physics_process, and configure the rate in project settings (under Physics -> Common -> Physics Fps; it is 60 by default).
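For reference, a minimal sketch of both callbacks (update_game is a hypothetical function standing in for your own logic):

func _process(delta):
    # called once per rendered frame; 'delta' is the seconds since the last frame
    update_game(delta)

func _physics_process(delta):
    # called at the fixed physics rate (60 fps by default); 'delta' is constant
    update_game(delta)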
If what you want is to have something run X times each frame, then I'm going to suggest just calling it X times either in _process or _physics_process depending on your needs.
I would like it to change based on (...) time between frames.
You could also use delta to know the time between frames and decide, based on that, how many times to run your code.
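For example, a minimal accumulator sketch along those lines (custom_fps and tick are illustrative names, not from the question):

var custom_fps = 60 # the rate you want, independent of the real frame rate
var accumulator = 0.0

func _process(delta):
    accumulator += delta
    var step = 1.0 / custom_fps
    # run tick() as many times as fit into the time since the last frame
    while accumulator >= step:
        accumulator -= step
        tick()

func tick():
    pass # your code here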
If you want to use a Timer, you can set the process_mode to Physics to tie it to the physics frame rate. Idle will use the graphics frame rate.
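Setting that on the timer from the earlier snippet looks like this (Godot 3 constants):

t.process_mode = Timer.TIMER_PROCESS_PHYSICS # tick with the physics frames
t.process_mode = Timer.TIMER_PROCESS_IDLE # or tick with the graphics frames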
From your code it appears you don't want to connect to the "timeout" signal, but to yield instead. In that case, you might be interested in:
yield(get_tree(), "idle_frame")
Which will resume execution the next graphics frame.
Or…
yield(get_tree(), "physics_frame")
Which will resume execution the next physics frame.
Heck, by doing that, you can put all your logic in _ready and not use _process or _physics_process at all. I don't recommend it, of course.
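For illustration, that (not recommended) pattern would look like this:

func _ready():
    while true:
        yield(get_tree(), "idle_frame") # resumes once per rendered frame
        # per-frame logic goes here, as if this were _process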
If you want a faster rate than the graphics and physics frames, then I suggest you check OS.get_ticks_msec or OS.get_ticks_usec to decide when to yield.
Assuming you want to avoid both physics and graphics frames entirely, you can instead create a Thread and have it sleep with OS.delay_msec or OS.delay_usec. In the abstract, you would run an infinite loop, call a delay on each iteration, and then do whatever you want to do.
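A minimal sketch of that thread pattern in Godot 3 (_worker and the 5ms interval are illustrative; note that touching the scene tree from a thread is unsafe, so use call_deferred for that):

var thread = Thread.new()
var running = true

func _ready():
    thread.start(self, "_worker")

func _worker(_userdata):
    while running:
        OS.delay_msec(5) # sleep ~5ms between iterations
        # do the work here; use call_deferred to touch nodes safely

func _exit_tree():
    running = false # ask the worker to stop...
    thread.wait_to_finish() # ...and join it before the node is freed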
I made a stopwatch class for an answer on Gamedev that you may use or base your code on. It has a "tick" signal and measures running time, with support for pausing, resuming, stopping, and starting again. It is NOT optimized for short intervals. Its design goal was to achieve a lower rate with lower CPU usage than just checking the time each frame.
I would like it to change based on how fast the computer can run
If you use a Thread with an infinite loop and no delays… well, it will run as fast as it can.

How can I choose which frame I pick from the camera texture using the Delay Frame patch (or any other)?

I am building an effect which needs to "freeze" the camera texture for a few seconds on certain occasions, triggered by a pulse (just like a "snapshot" effect where you trigger it and then the image doesn't move anymore for a few seconds; if you remember Pokemon Snap on N64).
I am currently trying to build it using the Delay Frame patch. I render the scene using a scene render pass, then it goes through the Delay Frame patch's first frame input. The problem is that I don't want to freeze the first frame; I want to freeze an arbitrary frame at T = X seconds, triggered by a user-defined pulse.
Do you have any insight on how I should achieve that?
Thanks a lot, have a great day!
What you can do is use the Runtime patch to get how much time has passed since the effect started; with that and a Greater Than patch you can build a trigger with the Pulse patch to make the freeze.

Designing the structure of a NinJump-like game using J2ME

I'm trying to create a NinJump-like game using J2ME, and I've run into some problems with the animation.
My game is built this way:
A thread is started as soon as the game starts. A while loop runs infinitely with a 20ms delay using Thread.sleep().
The walls constantly move down: each time the main while loop runs, the walls are animated.
The ninja is animated using a TimerTask with a 30ms interval.
Each time the player jumps, the player sprite is hidden and another sprite appears, which performs the jump using a TimerTask with a 20ms interval: each time the task executes, the sprite advances to its next frame and also moves (2px each time).
The problem is that when the player jumps, the wall animation suddenly gets slow. Also, the jumping animation is not smooth, and I just can't seem to be able to fix it using different animation time intervals.
I guess there's something wrong with the way I implemented it. How can I fix the problems I mentioned above?
Don't use TimerTask to animate the sprites; each TimerTask runs on its own thread and drifts out of step with your main loop. Drive all animation from the main game loop instead, advancing each sprite based on the time elapsed since the last iteration.

How to implement a relative timer in a game

Our game is an MMO, and our logic server has a game loop, of course. To make the logic modules easy to write, we provide a timer module, which supports registering a real-time timer and triggering it when due. In the game loop, we pass the system time (in milliseconds) to the timer module, and the timer manager checks whether any timers can be triggered. For example, to update a player/monster position: when the player starts to move, we update the player position every 200ms.
But when the game loop runs too much logic, it uses too much time in a single frame, and in the next frame some timers fire later than scheduled. That actually causes bugs. For example, if one frame takes 1 second, then by the next frame 1000ms of real time have passed, and a move timer scheduled for 800ms is only triggered at 1000ms, later than expected.
So is there any better solution to this problem? For example, could we implement a timer that depends only on our game, not on the real computer time?
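One hedged sketch of the idea the question itself proposes: drive the timer manager from a logical clock that advances a fixed step per loop iteration, so a timer never observes a jump larger than one tick. All names here are illustrative, and the snippet is written in GDScript only for consistency with the rest of this page; the idea is language-agnostic.

const TICK_MS = 50 # the logical clock advances exactly 50ms per iteration

var game_time_ms = 0 # logical time, independent of the wall clock
var timers = [] # each entry is [fire_at_ms, funcref]

func schedule(delay_ms, fn):
    timers.append([game_time_ms + delay_ms, fn])

func loop_iteration():
    # A slow real frame never makes logical time jump: after a 1-second
    # stall the server just runs extra catch-up iterations until the
    # logical clock has caught up with the wall clock.
    game_time_ms += TICK_MS
    for t in timers.duplicate():
        if t[0] <= game_time_ms:
            timers.erase(t)
            t[1].call_func()

The outer server loop would compare game_time_ms with the real elapsed time and call loop_iteration as many times as needed to catch up.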
