I have been working with Enterprise Architect 13.5 for a while now, simulating state machines.
Until now I have managed transitions with simple triggers, which are available in the Simulation Events window.
I am looking for a way to make a time-based transition between two states, but I cannot figure out how to do it.
While the simulation is running, I can't find a way to implement a 30 s timeout between two states.
From https://sparxsystems.com/resources/user-guides/15.1/simulation/executable-state-machines.pdf, page 8:
"Trigger and Events -- An Executable StateMachine supports event handling for Signals only. To use Call, Timing or Change Event types you must define an outside mechanism
to generate signals based on these events."
Well, you have already answered your question yourself. When you open the properties of the transition, you can enter the trigger specification there.
The specification is as simple as that; once you save it, the trigger name appears along the transition.
Pretty straightforward, isn't it?
I need to do some basic networking for a Pygame project.
Basically, it's a 2D single player or cooperative game. The networking only needs to support two players, with one as a host.
The only information that needs to be sent is the positions of players, creeps and bullets.
I've been reading around and Twisted keeps coming up, but I haven't done networking before, and I'm not sure if that might be overkill.
So, is it possible for a relative newbie to implement networking in Pygame?
This was asked recently on Reddit, so I'll more or less just copy my answer over from there. I apologize for not being able to provide more links; I have <10 rep, so I can only post two at a time.
Twisted might work, but I don't have a whole lot of experience with it. I'd recommend going with sockets, as that's what Twisted uses under the hood anyway. Beej's guide (google it) is pretty much the Holy Bible of sockets if you want to learn how they work (it is written for C, but the concepts extend everywhere). Python abstracts some of the complexity away, but it's still a good idea to know what's going on in the background.
For Python specific sockets, you can go ahead and just use the howto (user745294 posted a link above). Here's a nice article titled "What every programmer needs to know about Game Networking". It goes into the different types of major networking styles (client-server, p2p, udp v. tcp, etc.) and the history behind what some major games used for their networking.
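To give a flavour of what plain sockets look like in Python, here is a minimal, hypothetical exchange of a single position update over TCP on localhost. The JSON message format is made up for illustration; a real game would keep the connection open and loop.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 0  # port 0 lets the OS pick a free port

def run_server(server_sock):
    """Accept one client and echo its position update back.
    In a real game you would broadcast it to the other player instead."""
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_server, args=(server,), daemon=True).start()

# Client side: send one position update and read the echo back.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, port))
client.sendall(json.dumps({"player": 1, "x": 100, "y": 200}).encode())
reply = json.loads(client.recv(1024).decode())
client.close()
print(reply)  # {'player': 1, 'x': 100, 'y': 200}
```

That is essentially all the demo below does, just wrapped in a loop and hooked into the Pygame event handling.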
Below is a link to a demo I did on making a networked "game" in Python 2.6/Pygame. It's not actually a game, but each client you create connects to the server and controls a character. You can move your character with the arrow keys and the character will move on all connected clients. I tried commenting the source code with some indication of what I'm sending back and forth, but you may need a little knowledge about sockets to understand it.
The source code is provided in the codepad links in the comment below this post. You will need to provide two images in the same directory as the scripts:
bg.png is the background sprite. It should be an image 400px wide and 300px tall (this can be changed in the GameClient class if needed)
sprite.png is the player character. It should be smaller than the background so that you can see it moving around.
You can use Twisted for networking with Pygame. The "game" project on Launchpad has some examples of how one might integrate the two main loops; basically, use twisted.internet.task.LoopingCall to draw Pygame frames and handle input, while letting the Twisted reactor of your choice run normally.
Since you are already using Pygame, I think this lightweight networking library made for Pygame will do what you need and teach you without overwhelming you.
"Mastermind Networking Lib" via pygame.org
There is Pyro (Python remote objects) as another solution for networking in Python.
http://irmen.home.xs4all.nl/pyro/
Using raw sockets is low-level and full of danger. As said before, Twisted is complex and takes time to get up and running. To save yourself some headaches I'd try something like zerorpc.
You need solutions for the following:
discovering the other player(s) on the (local) network; you don't want players to have to enter an IP address
handling network errors
serializing messages containing your data (positions, player name, etc.)
handling threading, since networking is asynchronous I/O
All of the above should still be called 'basic'; you should really use a networking library with an idiomatic API.
Essentially you need to expose a network service (in its own thread) that pushes incoming messages onto a Python Queue, and then read that same queue from your Pygame code; if there is a message, you update whatever structures you use to store the players' positions and draw them on screen.
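A minimal sketch of that pattern, using only the standard library (the message format is invented for illustration, and the network thread is a stand-in that would block on a socket in a real game):

```python
import queue
import threading

messages = queue.Queue()

def network_service():
    """Stand-in for the socket-reading thread: in a real game this
    would block on recv() and push each decoded message it receives."""
    messages.put({"player": 2, "x": 50, "y": 75})

threading.Thread(target=network_service, daemon=True).start()

# Inside your Pygame main loop (drawing code omitted):
positions = {}
try:
    msg = messages.get(timeout=1.0)  # in the real loop: get_nowait()
    positions[msg["player"]] = (msg["x"], msg["y"])
except queue.Empty:
    pass  # no update this frame, keep drawing the old positions
print(positions)  # {2: (50, 75)}
```

The queue is the only point of contact between the two threads, which keeps the Pygame side free of any socket code.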
You shouldn't send stuff like bullet positions over the network, since they can be calculated locally more easily (and faster). Just send an event like bullet_shot over the network, with a source position and a velocity vector.
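For example, a hypothetical bullet_shot event could be serialized once and then integrated locally on every client (the field names and numbers here are invented):

```python
import json

# Send one event when the bullet is fired, not a position every frame.
event = {"type": "bullet_shot", "pos": [120, 80], "vel": [3, -1]}
wire = json.dumps(event).encode()      # what actually crosses the network

# Each client decodes the event once...
received = json.loads(wire.decode())

# ...and then advances the bullet deterministically every frame.
dt = 2  # frames elapsed since the shot
x = received["pos"][0] + received["vel"][0] * dt
y = received["pos"][1] + received["vel"][1] * dt
print((x, y))  # (126, 78)
```

One small message replaces a continuous stream, and every client computes the same trajectory.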
I am working on a VST3-compatible plugin scanner, which basically looks for .vst3 files at predefined locations on the filesystem. If anybody answering this question has ever looked at a plugin scanner in a DAW: what it does, basically, is maintain a table with columns for plugin name, vendor, version, pathname, etc. When it finds a plugin, it gets all these details and populates the table one row at a time. The table will be in the GUI thread and the scanning will be done on a separate thread. Note that scanning is an expensive operation, as it takes up to about 5-10 seconds per plugin, and a typical producer has about 100 plugins. Now how do I do that with Qt? I read the Threading Basics part (I am new to threading); it would be good if I could get it done with the QtConcurrent class, but I am not sure. Sorry if this is the wrong place to ask this question; my questions usually get answered only here.
QtConcurrent seems like the correct tool to get the job done without getting your hands too dirty with the low-level details of threading.
My advice is to first design your data structures and GUI. For instance, your description maps to a custom model class AudioPluginsTableModel (a subclass of QAbstractTableModel) with a QTableView for presentation and user interaction.
You may define a scanning function returning an instance of AudioPluginsTableModel, and execute this function with QtConcurrent::run, which returns a QFuture<AudioPluginsTableModel> that you can watch for completion with a QFutureWatcher.
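The pattern is easier to see in miniature. This is not Qt code — it is a Python analogue using concurrent.futures, with invented plugin data — but the shape is the same: submit the expensive scan to a worker thread (the role of QtConcurrent::run) and react when the future completes (the role of QFutureWatcher):

```python
from concurrent.futures import ThreadPoolExecutor

def scan_plugins():
    """Stand-in for the expensive .vst3 filesystem scan; the data
    returned here is invented for illustration."""
    return [{"name": "ExampleSynth", "vendor": "Acme", "version": "1.0"}]

results = []
with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(scan_plugins)            # like QtConcurrent::run
    future.add_done_callback(                     # like QFutureWatcher's
        lambda f: results.extend(f.result()))     # finished() signal

# Back on the "GUI thread": populate the table model from the results.
print(results[0]["name"])  # ExampleSynth
```

In real Qt code the watcher's finished() signal would land back on the GUI thread, where you append the rows to the model; the key point is that the scan itself never blocks the GUI.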
I am using REDHAWK 1.9 on CentOS 6.3 32-bit...
I have a REDHAWK component that takes in one data stream. The waveform may need more than one instance of this component, depending on the data. Is it possible to do the following:
Create an instance of a component on the fly when the waveform is running?
Create dynamic connections between components when the waveform is running?
Jonathan, I'm not sure I totally understand your question, but let me try an answer and you can clarify if I'm misunderstanding. It sounds like you want to have a waveform running and, depending on what the waveform does to the data flowing into it, launch more waveforms to perform other tasks on the data. Is that correct?
Dynamic launching of waveforms based on the meeting of certain conditions is not included natively with REDHAWK. However, it would be possible to create a component to do this and include it in one of your waveforms.
When stringing together multiple waveforms, make sure the connecting ports are configured as external ports.
I'm trying to interface a Nexys3 board with a VmodTFT via a VHDCI connector. I am pretty new to FPGA design, although I have experience with microcontrollers. I am trying to approach the whole problem as an FSM. However, I've been stuck on this for quite some time now. What signals constitute my power-up sequence? When do I start sampling data? I've looked at the relevant datasheets and they don't make things much clearer. Any help would be greatly appreciated. (P.S.: I use Verilog for the design.)
EDIT:
Sorry for the vagueness of my question. Here's specifically what I am looking at.
For starters, I am going to overlook the touch module. I want to look at the whole setup as an FSM. I am assuming the following states:
1. Setup connection or handshake signals
2. Switch on the LCD
3. Receive pixel data
4. Display video
5. Power off the LCD
Would this be a reasonable FSM? My main concerns are with interpreting the signals. Table 5 in the VmodTFT_rm manual shows a list of signals; however, I am having trouble understanding which signals are for what (this is my first time with display modules). I am going to assume everything prefixed with TFT_ is for the display and everything with TP_ is for the touch panel (please correct me if I'm wrong). So what signals would I be changing in each state, and what would act as inputs?
Now what changes should I make to accommodate the touch panel too?
I understand I am probably asking for too much, but I would greatly appreciate a push in the right direction, as I have been stuck on this for a long time.
Your question could be filled out a little better; it's not clear exactly what's giving you trouble.
I see two relevant docs online (you may have seen these):
Schematic: https://digilentinc.com/Data/Products/VMOD-TFT/VmodTFT_sch.pdf
User Guide: https://digilentinc.com/Data/Products/VMOD-TFT/VmodTFT_rm.pdf
The user guide explains which signals are part of the power-up sequence:
You must wait between 0.5 ms and 100 ms after driving TFT-EN before you can drive DE and the pixel bus.
You must wait 0 to 200 ms after setting up valid pixel data before enabling the display (with DISP).
You must wait 160 ms after enabling DISP before you start pulsing LED-EN (PWM controls the backlight).
Admittedly the documentation doesn't look great and some of the signal names are not consistent, but I think you can figure it out from there.
After looking at the user guide to understand what the signals do, look at the schematic to find the mapping between the signal names and the VHDCI pinout. Then, when you connect the VHDCI pinout to your FPGA board, look at the board's manual to find the mapping between pins on the VHDCI connector and balls of the FPGA, and use the FPGA's constraint settings to map an FPGA ball to a logical Verilog input of your top module.
Hope that clears things up a bit, but please clarify your question about what you don't understand.
On Windows I do:
HGLRC glContext = wglGetCurrentContext();
HDC deviceGLContext = wglGetCurrentDC();
wglMakeCurrent(deviceGLContext, glContext); // HDC first, then HGLRC
On Linux there are analogous functions for getting the current GL context and the current device context: glXGetCurrentContext and glXGetCurrentDisplay, respectively. But I am stuck with
Bool glXMakeCurrent( Display *dpy,
GLXDrawable drawable,
GLXContext ctx )
I don't know how to deal with the second parameter. I use Qt for the GUI, but I still need several Windows API functions, among which are the three mentioned above.
How do I invoke glXMakeCurrent in the same fashion as described at the beginning of the post? The problem is that I don't know how to get a GLXDrawable.
I need to get a GLXContext, then create another one to share display lists, make it current in another thread, and add it to the OpenCL context attributes. The point is that I need to be able to make it current.
That 'GLXDrawable' is the X11 window for which you have got the context.
If you are using Qt, I would have assumed it provides a 'myWindow.makeCurrent()' function, or something to that effect.
You can make a window using XCreateWindow (there is also a function for making a basic window with fewer options). Before this, you will need to have opened a connection to the display using XOpenDisplay.
I have been very short on the details here, as there are a lot of steps to getting an OpenGL context in an X11 window, and while it is not hard, it does involve a lot of error checking. I suggest you make use of a library that handles this for you.
Contrary to Windows, in X11 you are dealing with a client-server model. The "display" represents the connection to the X11 server. In X11 there are Drawables, which can be used interchangeably; windows are one kind of Drawable.
You might want to have a look at
https://github.com/datenwolf/codesamples/tree/master/samples/OpenGL/x11argb_opengl
for an example on how to create OpenGL window with a transparent background using plain X11/GLX, that can be used in compositing.
--
Update
I need to get a GLXContext, then create another one to share display lists, make it current in another thread, and add it to the OpenCL context attributes. The point is that I need to be able to make it current.
Familiar problem. My solution is to treat a QGLWidget as if it were a context. In your other thread, create another QGLWidget that will never be shown, and pass the visible QGLWidget instance into the sharing parameter of the constructor. Then you can use that QGLWidget as if it were a context. It's dirty and not really to the point, but Qt's internal OpenGL system is that way.