Observer pattern in wxPython - multithreading

I am trying to implement the Observer design pattern with wxPython.
I have a modelling application that computes vast amounts of data in the background. Sometimes I would like to display the output of the model in the GUI, which is just a grid of squares of different colours. Other times I need to do the computation without displaying the GUI.
The advantage of the observer pattern is that you can plug a GUI in (or leave it out) just by adding or removing one line of code, something like
self.observers.append(MyWxGui())
or similar.
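To make that concrete, here is a stripped-down sketch of what I mean by the observer hook (the grid contents and method names are just placeholders):
class Model(object):
    def __init__(self):
        self.observers = []

    def run(self):
        for step in range(10):
            grid = self.compute_step(step)          # the heavy computation
            for observer in self.observers:
                observer.update(grid)               # e.g. a GUI repaint

    def compute_step(self, step):
        return [[step % 3] * 4 for _ in range(4)]   # placeholder grid of colours

model = Model()
# model.observers.append(MyWxGui())   # plug the GUI in, or leave this line out
model.run()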
Now, to do that I need my computation to run on one thread, and the wx GUI to run in a different one.
I tried doing this with wxPython but I always get a Fatal I/O error:
python: Fatal IO error 11 (Resource temporarily unavailable) on X server :0.0.
I read tutorials on multithreading in wxPython, such as http://wiki.wxpython.org/LongRunningTasks, but they all have MainLoop() running in the main thread and then the long-running task in a secondary thread, while I need it the other way round. This is because if I have MainLoop() in the main thread, the program hangs waiting for events from the GUI instead of proceeding with the computation.
I also saw that I cannot manipulate device contexts (DCs) such as ClientDC or PaintDC from a secondary thread, but I am running all of the wx code inside that one thread.
Can MainLoop() and all of the wx GUI code run in their own thread, separate from the application's main thread?
Running wxPython 2.8.11.0 on Ubuntu 10.10 maverick.

If you read that wiki page, then you should know that you can communicate back to the wx thread using wx.CallAfter, wx.CallLater or wx.PostEvent in a thread-safe manner. I have a simple tutorial here:
http://www.blog.pythonlibrary.org/2010/05/22/wxpython-and-threads/
Personally, I would use something like Pubsub + one of the thread-safe methods mentioned above to communicate with the wx MainLoop. The nice thing about Pubsub is that it can listen for messages and react to them appropriately. The tutorial above actually shows one way to do just that. Hopefully that will help you. Otherwise, I highly recommend joining the wxPython mailing list and asking there: http://groups.google.com/group/wxpython-users/topics?pli=1
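Here is a bare-bones sketch of the usual arrangement (the frame and the fake computation loop are only placeholders, not your model): MainLoop() stays in the main thread, the model runs in a worker thread, and every GUI update goes through wx.CallAfter.
import threading
import wx

class ModelFrame(wx.Frame):
    def __init__(self):
        super(ModelFrame, self).__init__(None, title="Model output")
        self.label = wx.StaticText(self, label="waiting...")

    def on_result(self, step):
        # Runs in the wx (main) thread because it was invoked via wx.CallAfter.
        self.label.SetLabel("model step: %s" % step)

def compute(frame):
    # Long-running computation in a background thread.
    for step in range(100):
        # ... heavy modelling work would go here ...
        wx.CallAfter(frame.on_result, step)   # thread-safe hand-off to the GUI

if __name__ == "__main__":
    app = wx.App(False)
    frame = ModelFrame()
    frame.Show()
    threading.Thread(target=compute, args=(frame,)).start()
    app.MainLoop()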

Related

Articulating non-blocking and blocking events in pyglet

I'm developing a game in pyglet that is scheduled by a simple text file like:
0:00:01;event1
0:00:02;event2
0:00:03;event3
The fact is that, among these events, some might be blocking (for instance, event2 might consist of displaying instructions until a key is pressed). As a consequence, event3 might not be executed at the proper time (its scheduled time falls while event2 is still running). For now, my strategy is to schedule one event after the other:
Execute the first event
Once the first event is finished, compute the remaining duration between the first and the second event (delta_duration)
Schedule the second event with a delay of delta_duration
... and so on
So far I have not managed to implement a blocking event properly with this strategy. It seems that anything blocking the event loop (like a sleep call during event2) prevents even the graphical elements of event2 (the text instructions) from being displayed. On the other hand, if I do not put any blocking routine (sleep) in event2, I can see the vertices, but the scheduler keeps on scheduling (!), so event3 comes too soon.
My question is: what would be a general strategy, in pyglet, for combining non-blocking and blocking events? More precisely, is it possible (or desirable) to use multiple clocks for that purpose? The pyglet documentation mentions that multiple clocks can be used, but it is not very well explained.
I don't want a solution that is specific to my example events but, rather, general guidance about the way to go.
What blocks really depends on your program. If you are using Python's input() for the console window, then yes, that will block, because it blocks execution of Python in general. If you have a label pop up in the window that waits for an on_key_press window event, that is completely different: it does not block the pyglet loop, because it was scheduled within it.
If your event is a 20-second-long math calculation, then it should probably be run in a thread. You will probably have to separate the types of events in order to differentiate how they should be run. It's hard to say, because without a runnable example or a sample of code I am just guessing at your intentions.
It sounds more like you want some sort of callback system: when func1 is declared done, go to func2. There is nothing built into pyglet for this; you would have to make clever use of scheduling, roughly as sketched below. There are examples of this in pure Python, though. I personally use Twisted Deferreds for this.
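A rough sketch of that kind of chaining with pyglet.clock.schedule_once (the event names and the one-second delays are only placeholders): the "blocking" step waits for a key press through a pushed handler instead of sleeping, and only then schedules the next event, so the event loop keeps drawing the whole time.
import pyglet

window = pyglet.window.Window()
label = pyglet.text.Label("", x=20, y=20)

@window.event
def on_draw():
    window.clear()
    label.draw()

def event1(dt):
    label.text = "event1"
    pyglet.clock.schedule_once(event2, 1.0)      # next step one second later

def event2(dt):
    # A "blocking" step: show instructions and wait for a key press instead of
    # sleeping, so the event loop keeps running and drawing.
    label.text = "event2 - press any key"
    def on_key_press(symbol, modifiers):
        window.pop_handlers()                    # stop listening for the key
        pyglet.clock.schedule_once(event3, 1.0)  # resume the schedule
    window.push_handlers(on_key_press=on_key_press)

def event3(dt):
    label.text = "event3"

pyglet.clock.schedule_once(event1, 1.0)
pyglet.app.run()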

I want to use PyQt5 to design an interactive GUI

Disclaimer: I would like general guidance on an approach to a project I am working on, so the question is very broad.
I am currently trying to build a GUI that handles serial communication with an Arduino, a USB camera (the camera has its own Python library for controls), and real-time data in .dat format that gets updated while the GUI is running.
Right now I am using Python's threading module to do all of this simultaneously, and I only interact with the script through Python's input() function. Once the threads start running, I cannot really interact with the script.
I have 3 separate threads running:
1. a thread that saves images from the camera
2. a thread that sends signals to the Arduino at random intervals
3. a thread that waits for an input to terminate the main thread
Everything works as I want, but I wish to add a GUI to make the program more straightforward for others to use.
I realized that Qt actually offers all the capabilities I wish to implement as part of this program. Yet I cannot fully work out which parts of the Qt library I will need to implement everything.
My understanding is that I could use a combination of QWidget, QTimer and QThread to try something, but I would like some guidance on a more conventional approach to designing such a GUI for multitasking. I would also like to display real-time data on graphs, including images from the camera and voltage readings from the data files that get updated by another program (the data gets written to another folder). The program requires tracking time from start to end, and I know that threading can be very confusing when it comes to tracking these times. Any reference will be greatly appreciated.
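For concreteness, here is a minimal PyQt5 sketch of the kind of structure I have in mind (a QThread worker emitting a signal that a label in the GUI thread picks up); the class names and the fake readings are just placeholders, not my actual code.
import sys
import time
from PyQt5.QtCore import QThread, pyqtSignal, pyqtSlot
from PyQt5.QtWidgets import QApplication, QLabel

class Worker(QThread):
    new_reading = pyqtSignal(float)          # emitted from the worker thread

    def run(self):
        for i in range(10):
            time.sleep(0.5)                  # stands in for camera/serial/file work
            self.new_reading.emit(float(i))

class ReadingLabel(QLabel):
    @pyqtSlot(float)
    def show_reading(self, value):
        # Qt delivers the cross-thread signal to the GUI thread,
        # so touching the widget here is safe.
        self.setText("reading: %.1f" % value)

if __name__ == "__main__":
    app = QApplication(sys.argv)
    label = ReadingLabel("waiting for data...")
    label.show()
    worker = Worker()
    worker.new_reading.connect(label.show_reading)
    worker.start()
    sys.exit(app.exec_())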
Thank you all.

Is this a decent structure for a multithreaded videocoacher program?

Hi, I'm currently working on a video coaching program for recording and replaying video, showing delayed real-time video, and tracking placement via color.
The software runs on Linux on a 4-core ODROID. Initially I made the program multithreaded, with the threads implemented as part of each new class and each thread taking care of its own GUI elements.
I've since found out that I need to show all GUI elements/video in the main/GUI thread. Earlier I used OpenCV and Boost, but it seems like using Qt might be a better idea, since some of the code already depends on the Qt library. I am currently a novice at programming and not very familiar with OpenCV, Qt, or threading.
My question is:
Is this relatively sound as a structure for the program, or is there something inherently wrong with how I am planning to do it now?
Main/GUI Thread
will show all visual & video content
will start a thread for ButtonControl object
ButtonControl
will handle all button input, controlling what happens in the program
depending on which buttons are pressed, it will start and end threads
like:
StoreToFile object (starts storing video to a file, while sending a video stream to the GUI thread to show what it is storing in real time)
ReadFromFile object (reads the file currently stored and sends the data to display it in the GUI thread)
DelayedVideoStream object (stores video to a buffer and shows a continuous delayed view of what happened 5 seconds in the past)
ColorTracking object (tracks where a color placement is in the image)
Kind regards, and thank you for taking the time to look at my question.
TL;DR: is a structure where threads are implemented as classes and the image data is sent back to the GUI/main thread a decent way to write a multithreaded program?
Performance-wise, the best approach is not to deal with threads directly at all, but to use QtConcurrent::run. It is safe to paint QImages that are simply passed via signals to a GUI object for display. I wrote a complete example demonstrating that approach. It leads to some very concise and easy-to-understand code, thanks to related code being adjacent.
If you do want to use explicit threads, it will be much easier not to derive from QThread, but to simply move various worker objects into their threads, and have them communicate via signals and slots. I have a complete example for that approach as well.
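For reference, a rough sketch of that worker-object pattern, written in PyQt5 here for brevity (the same structure applies in C++); the class and signal names are only illustrative, not code from either of the examples mentioned above.
import sys
from PyQt5.QtCore import QObject, QThread, pyqtSignal, pyqtSlot
from PyQt5.QtWidgets import QApplication, QLabel

class FrameGrabber(QObject):
    frame_ready = pyqtSignal(int)            # a real app would send image data

    @pyqtSlot()
    def start_grabbing(self):
        # Runs in the worker thread, because the object was moved there.
        for n in range(5):
            QThread.msleep(200)              # stands in for grabbing/processing a frame
            self.frame_ready.emit(n)

class VideoLabel(QLabel):
    @pyqtSlot(int)
    def show_frame(self, n):
        self.setText("frame %d" % n)         # runs in the GUI thread

if __name__ == "__main__":
    app = QApplication(sys.argv)
    label = VideoLabel("no frames yet")
    label.show()

    thread = QThread()
    grabber = FrameGrabber()
    grabber.moveToThread(thread)
    thread.started.connect(grabber.start_grabbing)   # kick off work inside the thread
    grabber.frame_ready.connect(label.show_frame)
    thread.start()
    sys.exit(app.exec_())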

Outputting console data from a process to a GUI in wxWidgets

I'm running a long process in the background. I've managed to output the console data to the GUI, but the problem is that the data only comes back after the process is finished. I need to display the data in real time, i.e. every time the process produces some output on the console. I'm running the process within my GUI from a separate thread.
I mean, it would be like building a GUI for the ping command, where output is displayed on the console after each packet is sent, i.e. in real time. I just need to redirect that to the GUI, in real time. I'm implementing the GUI in wxWidgets. Any help would be greatly appreciated.
Thank you,
Jvc
Is the output you wish to display generated in a separate process from the process running the GUI? Or in a separate thread in the same process?
I ask because most people, when they ask this question, mean a separate thread. Since you have tagged your question with "process", I will assume that is what you mean.
You need some inter-process communication. There is a bewildering variety of techniques to do this. Personally, I always use sockets.
wxWidgets has simple, easy to use socket classes wxSocketClient and wxSocketServer.
The background process is probably not running wxWidgets, so you will need something else there. I recommend boost::asio. I know it looks intimidating, but in fact the tutorial code can be used as is.
There is a lot more to be said, but I risk straying away from the point, since there are so few details in your question.
You can have an output queue protected by a wxMutex. The thread doing the computation writes to the queue, then signals the GUI thread using wxQueueEvent with a custom event to let it know that the queue is not empty. The GUI thread then reads the queue and outputs the data.
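In wxPython the same idea can be sketched without an explicit mutex by letting a custom event carry each line of output to the GUI thread; the frame, the event name and the ping command below are only placeholders.
import subprocess
import threading
import wx
import wx.lib.newevent

# Custom event carrying one line of console output.
OutputEvent, EVT_OUTPUT = wx.lib.newevent.NewEvent()

class ConsoleFrame(wx.Frame):
    def __init__(self):
        super(ConsoleFrame, self).__init__(None, title="ping output")
        self.text = wx.TextCtrl(self, style=wx.TE_MULTILINE | wx.TE_READONLY)
        self.Bind(EVT_OUTPUT, self.on_output)

    def on_output(self, event):
        self.text.AppendText(event.line + "\n")   # runs in the GUI thread

def pump_output(frame):
    # Worker thread: read the child process's output as it appears,
    # not after the process has finished.
    proc = subprocess.Popen(["ping", "-c", "4", "localhost"],
                            stdout=subprocess.PIPE, universal_newlines=True)
    for line in proc.stdout:
        wx.PostEvent(frame, OutputEvent(line=line.rstrip()))

if __name__ == "__main__":
    app = wx.App(False)
    frame = ConsoleFrame()
    frame.Show()
    threading.Thread(target=pump_output, args=(frame,)).start()
    app.MainLoop()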

Using threads and pyGST in a wxPython app

OK, so I am writing an app that plays music with the pyGST bindings.
This requires the use of threads to handle playback. The bindings library handles most of the thread control for me, which is nice (and what I was looking for in them).
Now, I don't have a full grasp of this concept, so I would be eager for some references. But the way I understand it, I basically have to inform the app that it can use multiple threads.
I gathered this from the examples on the gstreamer site, where they use this call:
gtk.gdk.threads_init()
gtk.main()
According to here, this tells the app it can use multiple threads (more or less), which is where my assumption above came from.
That is the background. Now get this. I have placed those lines in my code, and they work fine. My app plays music rather than crashing whenever it tries. But something doesn't feel right.
In the examples that I got those lines from, they use gtk for the whole GUI, but I want to use wxWidgets, so it feels wrong calling this gtk function to do this.
Is there a wx equivalent to this? Or is it OK to use this, and will it still work cross-platform?
Also, I have to figure out how to kill all these threads on exit (which it does not do right now). I see how they do it in the example, using a gtk method again, so once more I am looking for a wx equivalent.
PS: I think this (or the solution) may be related to the wx.App.MainLoop() function, but I am lost trying to understand how this loop works, so good references about this would be appreciated, though I suppose they are not necessary as long as I have a good solution.
Try using this instead:
import gobject
gobject.threads_init()
I wonder why this is not written in large print at the start of every piece of Python GStreamer documentation: it only took me several hours to find it.
A bit more detail here.
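A minimal sketch of where that call fits, assuming the old pygst 0.10 bindings the question refers to; the playbin URI is just a placeholder. Setting the pipeline back to NULL on exit is also what stops GStreamer's playback threads.
import gobject
gobject.threads_init()                 # let GStreamer use threads safely

import pygst
pygst.require("0.10")
import gst
import wx

app = wx.App(False)
frame = wx.Frame(None, title="player")
frame.Show()

player = gst.element_factory_make("playbin2", "player")
player.set_property("uri", "file:///tmp/song.mp3")   # placeholder path
player.set_state(gst.STATE_PLAYING)

app.MainLoop()                         # wx runs its own loop in the main thread
player.set_state(gst.STATE_NULL)       # tear down GStreamer's threads on exit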
I have no experience with pyGST, but the general advice for using threads and wxPython is to only update the GUI from the main thread (i.e. the thread that starts the MainLoop). See http://wiki.wxpython.org/LongRunningTasks for more information.
I have no experience with the Python bindings, but I have had success using wxWidgets and GStreamer together on Windows. The problem is that wxWidgets runs a Windows event loop while GStreamer uses a GLib event loop. If you don't care about any of the GStreamer events, you shouldn't need to do anything. However, if you want to receive any of the GStreamer events, you will have to run your own GLib event loop (GMainLoop) in a separate thread with a separate GMainContext. Use gst_bus_create_watch to create a GST event source, add a callback to the source with g_source_set_callback, and then attach it to the main context of your GLib event loop with g_source_attach. You can then handle the GST events in the callback, for example by forwarding them to the wx main event loop.
