I have an MCU connected to the computer through a serial interface. The MCU might send data at regular intervals or very seldom depending on the type of sensor connected to it.
So I want a Python function that gets called whenever there is data incoming from the serial port, instead of polling all the time.
After reading a lot of similar questions (Small Example for pyserial using Threading, PySerial/Arduino, PySerial/interrupt mode, Python Serial listener, and so on), I came to the conclusion that the solution to this is threading. So I came up with 3 different pieces of code that work:
First:
import time
import serial
import threading

ser = serial.Serial("/dev/ttyUSB0", 19200)
datos = ""

class SerialReaderThread(threading.Thread):
    '''
    The class with the method that reads the serial port in the background.
    '''
    def __init__(self):
        super().__init__()
        self._stop_event = threading.Event()

    def run(self):
        '''
        The method that actually gets data from the port
        '''
        global ser, datos
        while not self.stopped():
            datos = ser.readline().decode('ascii').strip()

    def stop(self):
        self._stop_event.set()

    def stopped(self):
        return self._stop_event.is_set()

serial_thread = SerialReaderThread()
serial_thread.start()

i = 0
while i < 5:
    if datos != "":
        print(datos)
        datos = ""
        i += 1

serial_thread.stop()
while serial_thread.is_alive():
    pass

print("Thread stopped.")
ser.close()
Second:
import serial
import threading
import time

ser = serial.Serial("/dev/ttyUSB0", 19200)
read = True
datos = ""

def serialEvent():
    global ser, read, datos
    while read is True:
        datos = ser.read_until().decode('ascii').strip()
    return

t = threading.Thread(target=serialEvent)
t.start()

i = 0
while i < 5:
    if datos != "":
        print(datos)
        datos = ""
        i += 1

read = False
t.join()
while t.is_alive():
    pass

print("Thread stopped.")
ser.close()
Third:
import serial
import concurrent.futures

ser = serial.Serial("/dev/ttyUSB0", 19200)
datos = ""
readData = True

def serialReadEvent():
    global ser, readData, datos
    while readData is True:
        datos = ser.read_until().decode('ascii').strip()
    return

executor = concurrent.futures.ThreadPoolExecutor()
serialData = executor.submit(serialReadEvent)

i = 0
while i < 5:
    if datos != "":
        print(datos)
        datos = ""
        i += 1

readData = False
while serialData.running():
    pass

print('Thread stopped.')
ser.close()
Question 1: is any of those codes better than the others?
Question 2: is using a global variable the best way to pass data between the thread and the main process?
I've also read that the PySerial API provides a way to work with threads, but I don't understand the documentation.
Question 3: can anybody give me an example of a reading thread using PySerial API?
Finally, I've read that Qt Serial Port also provides a way to process incoming data in a thread (example).
Question 4: is this the best way to solve my problem if I'm going to have a GUI written in PyQt5 as well?
You got to the correct conclusion that the solution to this is threading, but I would recommend making maximum use of the APIs of the framework/library you are using.
In light of what you are trying to achieve, pySerial provides in_waiting, which returns the number of bytes in the receive buffer.
Making use of this, you can fire up a thread that continuously monitors the underlying receive buffer and only reads when your desired number of bytes is present.
To share the received data between the serial-read thread and main, the better way is to use a queue. Based on the FIFO principle, your serial-read thread is responsible only for the enqueue operation and main only for the dequeue operation, so there is no risk of data duplication. In main, you can decide how many bytes (or items) must be in the queue before you dequeue.
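A minimal sketch of that pattern (my own illustration, not from the post; the port name and baud rate are taken from the question, everything else is assumed): the reader thread watches in_waiting and enqueues complete lines, while the main thread only dequeues.

import threading
import queue
import serial

ser = serial.Serial("/dev/ttyUSB0", 19200, timeout=1)
rx_queue = queue.Queue()          # thread-safe FIFO shared with main
stop_event = threading.Event()

def reader():
    # Enqueue-only side: poll in_waiting and push complete lines.
    while not stop_event.is_set():
        if ser.in_waiting > 0:
            rx_queue.put(ser.readline().decode('ascii').strip())

threading.Thread(target=reader, daemon=True).start()

# Dequeue-only side: the main thread blocks here until a line is available.
for _ in range(5):
    print(rx_queue.get())

stop_event.set()
ser.close()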
This should work well with your GUI written in PyQt5.
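Regarding the PyQt5 GUI (Question 4), one common arrangement, shown here only as a sketch under the same assumptions, is a QThread that emits a signal for every received line, so the GUI gets the data through the Qt event loop instead of polling:

import serial
from PyQt5.QtCore import QThread, pyqtSignal

class SerialWorker(QThread):
    line_received = pyqtSignal(str)   # delivered to connected slots in the GUI thread

    def __init__(self, port="/dev/ttyUSB0", baud=19200, parent=None):
        super().__init__(parent)
        self.ser = serial.Serial(port, baud, timeout=1)
        self._running = True

    def run(self):
        while self._running:
            line = self.ser.readline().decode('ascii').strip()
            if line:
                self.line_received.emit(line)

    def stop(self):
        self._running = False
        self.wait()                   # run() returns within the 1 s read timeout
        self.ser.close()

# usage in the GUI code (hypothetical widget):
# worker = SerialWorker()
# worker.line_received.connect(some_text_widget.append)
# worker.start()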
Related
Here is my problem: I have 2 programs communicating via zmq over an arbitrary TCP port.
When #1 receives a message from #2, it has to call some function.
If #1 receives a message before the current function ends, I'd like #1 to interrupt the current function and call the new one.
I tried to use threading.Event to interrupt the function.
I don't know if zmq is the right option for my needs, or if the socket type is fine.
To simplify, I show the simplest version possible; here is what I tried:
p1.py
import time
import zmq
from threading import Event

port_p2 = "6655"
context = zmq.Context()
socket = context.socket(zmq.PAIR)
socket.connect("tcp://localhost:%s" % port_p2)
print("port 6655")

__exit1 = Event()
__exit2 = Event()

def action1():
    __exit1.clear()
    __exit2.set()
    while not __exit1.is_set():
        for i in range(1, 20):
            print(i)
            time.sleep(1)
        __exit1.set()

def action2():
    __exit2.clear()
    __exit1.set()
    while not __exit2.is_set():
        for i in range(1, 20):
            print(i * 100)
            time.sleep(1)
        __exit2.set()

if __name__ == "__main__":
    try:
        while True:
            try:
                string = socket.recv(flags=zmq.NOBLOCK)
                # message received, process it
                string = str(string, 'utf-8')
                if "Action1" in string:
                    action1()
                if "Action2" in string:
                    action2()
            except zmq.Again as e:
                # No messages waiting to be processed
                pass
            time.sleep(0.1)
    except (KeyboardInterrupt, SystemExit):
        print("exit")
and p2.py
import time
import random
import zmq

port_p1 = "6655"
context = zmq.Context()
socket_p1 = context.socket(zmq.PAIR)
socket_p1.bind("tcp://*:%s" % port_p1)
print("port 6655")

if __name__ == "__main__":
    while True:
        i = random.choice(range(1, 10))
        print(i)
        try:
            if random.choice([True, False]):
                print("Action 1")
                socket_p1.send(b'Action1')
            else:
                socket_p1.send(b'Action2')
                print("Action 2")
        except zmq.Again as e:
            pass
        time.sleep(i)
For my purposes I didn't want to / can't use system signals.
I'd appreciate any input, and don't hesitate to ask for clarification; I have to confess that I had trouble writing this down.
Thank you
Q : …like #1 to interrupt the current function…
Given you have forbidden signals, #1 can only passively signal the function (be it over the present ZeroMQ infrastructure or not) not to continue further and to return prematurely. The fun() therefore has to be modified to do that active re-checking itself, best in some reasonably granular, progressive fashion: regularly and actively checking whether #1 has passively signalled ("told" the fun()) to return early, for whatever reason and by whatever means #1 chose to do so.
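A minimal sketch of that first approach (my own illustration; the listener thread is an assumption, not something spelled out above): a small thread owns socket.recv() and sets an Event whenever another command arrives, and the running action re-checks that Event between steps so it can return early.

import time
from threading import Event, Thread

new_command = Event()
latest = []                        # last received command, appended by the listener

def listener():
    while True:
        latest.append(socket.recv())   # `socket` is the PAIR socket from p1.py
        new_command.set()

def action1():
    new_command.clear()
    for i in range(1, 20):
        if new_command.is_set():   # granular, regular re-check
            return                 # premature return; the caller dispatches `latest`
        print(i)
        time.sleep(1)

Thread(target=listener, daemon=True).start()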
The other option is to extend the already present ZeroMQ infrastructure (the Context()-instance(s)) with a socket monitor and have the fun() .connect() directly to the socket-monitor resources, so that it learns autonomously (i.e. without #1's initiative) about any new message arriving at #1 and can decide to return prematurely in those cases where that is feasible according to your application logic.
For the socket-monitor case, the API documentation has all the details needed for an implementation, which would otherwise go far beyond the scope of this post.
I have snort logging DDoS alerts to a file; I use syslog-ng to parse the logs and output them in JSON format into redis (I wanted to set it up as a buffer, using the 'setex' command with an expiry of 70 secs).
The whole thing seems not to be working well; any ideas to make it easier are welcome.
I wrote a simple Python script to listen to redis keyspace events and count the number of snort alerts per second. I tried creating two other threads: one to retrieve the JSON-formatted alerts from snort and the second to count the alerts; the third is supposed to plot a graph using matplotlib.pyplot.
#import time
from redis import StrictRedis as sr
import os
import json
import matplotlib.pyplot as plt
import threading as th
import time

redis = sr(host='localhost', port=6379, decode_responses=True)
#file = open('/home/lucidvis/vis_app_py/log.json','w+')

# This function is still being worked on
def do_plot():
    print('do_plot loop running')
    while accumulated_data:
        x_values = [int(x['time_count']) for x in accumulated_data]
        y_values = [y['date'] for y in accumulated_data]
        plt.title('Attacks Alerts per time period')
        plt.xlabel('Time', fontsize=14)
        plt.ylabel('Snort Alerts/sec')
        plt.tick_params(axis='both', labelsize=14)
        plt.plot(y_values, x_values, linewidth=5)
        plt.show()
        time.sleep(0.01)

def accumulator():
    # first off, check the current json data and see if its 'sec' value is the same
    # as the last one in the accumulated data list;
    # if it is the same, increase time_count by one, else pop that value
    pointer_data = {}
    print('accumulator loop running')
    while True:
        # pointer_data is the current sec of json data used for comparison
        # new_data is the latest json-formatted alert received
        # received_from_redis is a list declared in the main block
        if received_from_redis:
            new_data = received_from_redis.pop(0)
            if not pointer_data:
                pointer_data = new_data.copy()
                print(">>", type(pointer_data), " >> ", pointer_data)
            if pointer_data and pointer_data['sec'] == new_data["sec"]:
                pointer_data['time_count'] += 1
            elif pointer_data:
                accumulated_data.append(pointer_data)
                pointer_data = new_data.copy()
                pointer_data.setdefault('time_count', 1)
        else:
            time.sleep(0.01)

# main function creates the redis pubsub object and receives messages based on events;
# the two other functions run in threads so they appear to run concurrently
def main():
    p = redis.pubsub()
    p.psubscribe('__keyspace#0__*')
    print('Starting message loop')
    while True:
        try:
            time.sleep(2)
            message = p.get_message()
            # Obtain the key from the redis-emitted event if the event is a set event
            if message and message['data'] == 'set':
                # the format emitted by redis is a dict;
                # the key is the value for the key 'channel'
                # The key is in '__keyspace#0__*' form;
                # obtain the last field of the list returned by split
                key = message['channel'].split('__:')[-1]
                data_redis = json.loads(redis.get(str(key)))
                received_from_redis.append(data_redis)
        except Exception as e:
            print(e)
            continue

if __name__ == "__main__":
    accumulated_data = []
    received_from_redis = []
    # start the worker threads once, then run the redis message loop in the main thread
    thread_accumulator = th.Thread(target=accumulator, name='accumulator')
    do_plot_thread = th.Thread(target=do_plot, name='do_plot')
    thread_accumulator.start()
    do_plot_thread.start()
    main()
    thread_accumulator.join()
    do_plot_thread.join()
I don't currently get errors per se; I just can't tell if the threads are created or are working well. I need ideas to make things work better.
A sample of the alert, formatted in JSON and obtained from redis, is below:
{"victim_port":"","victim":"192.168.204.130","protocol":"ICMP","msg":"Ping_Flood_Attack_Detected","key":"1000","date":"06/01-09:26:13","attacker_port":"","attacker":"192.168.30.129","sec":"13"}
I'm not sure I understand your scenario exactly, but if you want to count events that are essentially log messages, you can probably do that within syslog-ng itself: either in a Python destination (since you are already working in Python), or maybe even without additional programming using the grouping-by parser.
I have a simple twisted TCP server running absolutely fine; it basically deals with database requests and displays the right things, and it's essentially an echo server with a bunch of functions. The database that is being read also updates, so I have a refresh function to open the database and refresh it. However, if I add this to the message functions it takes too long to respond, as the refresh function takes around 6-7 seconds to complete. My initial idea was to have this function in a while loop, running constantly and refreshing every 5-10 minutes, but after reading about the global interpreter lock I started to think that isn't possible. Any suggestions on how to run this function in the background of my code would be greatly appreciated.
I've tried having it in a thread, but it doesn't seem to run at all when I start the thread. I put it under the if __name__ == '__main__': block and had no luck!
Here is my refresh function
import win32com.client
import pandas as pd
from threading import Thread
from twisted.internet.protocol import Protocol, Factory
from twisted.internet import reactor

def refreshit():
    Application = win32com.client.Dispatch("Excel.Application")
    Workbook = Application.Workbooks.open(database)
    Workbook.RefreshAll()
    Workbook.Save()
    Application.Quit()
    xlsx = pd.ExcelFile(database)
    global datess
    global refss
    df = pd.read_excel(xlsx, sheet_name='Sheet1')
    datess = df.groupby('documentDate')
    refss = df.groupby('reference')

class Echo(Protocol):
    global Picked_DFS
    Picked_DFS = None
    label = None
    global errors
    global picked
    errors = []
    picked = []

    def dataReceived(self, data):
        """
        As soon as any data is received, write it back.
        """
        response = self.handle_message(data)
        print('responding with this')
        print(response)
        self.transport.write(response)

def main():
    f = Factory()
    f.protocol = Echo
    reactor.listenTCP(8000, f)
    reactor.run()

if __name__ == '__main__':
    main()
I had tried this to no avail
if __name__ == '__main__':
    main()
    thread = Thread(target = refreshit())
    thread.start()
    thread.join()
You have an important error on this line:
thread = Thread(target = refreshit())
Though you have not included the definition of refreshit (perhaps a function to consider renaming), I assume refreshit is a function that performs your refresh.
In this case, what you are doing here is calling refreshit and waiting for it to return a value. Then, the value it returns is used as the target of the Thread you create here. This is probably not what you meant. Instead:
thread = Thread(target = refreshit)
That is, refreshit itself is what you want the target of the thread to be.
You also need to be sure to sequence your operations so that everything gets to run concurrently:
if __name__ == '__main__':
    # Start your worker/background thread.
    thread = Thread(target = refreshit)
    thread.start()

    # Run Twisted
    main()

    # Cleanup/wait on your worker/background thread.
    thread.join()
You may also just want to use Twisted's thread support instead of using the threading module directly (but this is not mandatory).
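For example, a sketch of that variant (not the original answer's code; it assumes main() and refreshit are defined as above), using reactor.callInThread to run the blocking refresh in Twisted's thread pool:

from twisted.internet import reactor

if __name__ == '__main__':
    # Hand the blocking refresh to Twisted's own thread pool once the
    # reactor is running, instead of managing a Thread object by hand.
    reactor.callWhenRunning(reactor.callInThread, refreshit)
    main()   # sets up the listener and calls reactor.run()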
I have a serial device (Arduino) regularly outputting log data, which should be written to a log file. The device also takes spontaneous commands over serial. I send the commands to a Raspberry Pi over Telegram; they are handled and sent to the Arduino by telepot, which runs in a separate thread.
How can I make sure that the two threads get along with each other?
I am a complete beginner in multithreading.
Here is a shortened version of my code:
import time
import datetime
import telepot
import os
import serial
from time import sleep

ser = None
bot = None

def log(data):
    with open('logfile', 'w') as f:
        f.write("Timestamp" + data)

#The handle function is called by the telepot thread,
#whenever a message is received from Telegram
def handle(msg):
    chat_id = msg['chat']['id']
    command = msg['text']
    print('Command Received: %s' % command)
    if command == '/start':
        bot.sendMessage(chat_id, 'welcome')
    elif command == 'close_door':
        #This serial write could possibly happen while a
        #ser.readline() is executed, which would crash my program.
        ser.write("Close Door")
    elif command == 'LOG':
        #Here I should make sure that nothing
        #is waiting from the Arduino,
        #so that the next two serial lines are the Arduino's
        #response to the "LOG" command,
        #and that handle is the only
        #function talking to the serial port now.
        ser.write("LOG")
        response = ser.readline()
        response += "\0000000A" + ser.readline()
        #The Arduino's response is now saved as one string
        #and sent to the user.
        bot.sendMessage(chat_id, response)
    print("Command Processed.")

bot = telepot.Bot('BOT TOKEN')
bot.message_loop(handle)
ser = serial.Serial("Arduino Serial Port", 9600)
print('I am listening ...')

while True:
    #anything to make it not run at full speed (recommendations welcome)
    #The log updates are only once an hour.
    sleep(10)
    #here I need to make sure it does not collide with the other thread.
    while ser.in_waiting > 0:
        data = ser.readline()
        log(data)
This code is not my actual code, but it should represent exactly what I'm trying to do.
My last resort would be to put the serial code in the thread's loop function, but this would require me to change the library, which would be ugly.
I looked up some stuff about queues in asyncio and locking functions, but I don't really understand how to apply them. Also, I don't use the async version of telepot.
After reading more on locking and threads, I found an answer with the help of the links provided in this question: Locking a method in Python?
It was often recommended to use queues, but I didn't know how.
My solution (the code may have errors, but the principle works):
import time
import random
import datetime
import telepot
import os
import serial
from time import sleep
#we need to import the Lock from threading
from threading import Lock

ser = None
bot = None

def log(data):
    with open('logfile', 'w') as f:
        f.write("Timestamp" + data)

#create a lock:
ser_lock = Lock()

#The handle function is called by the telepot thread,
#whenever a message is received from Telegram
def handle(msg):
    #let the handle function use the same lock:
    global ser_lock
    chat_id = msg['chat']['id']
    command = msg['text']
    print('Command Received: %s' % command)
    if command == '/start':
        bot.sendMessage(chat_id, 'welcome')
    elif command == 'close_door':
        #This serial write could possibly happen while a
        #ser.readline() is executed, which would crash my program.
        with ser_lock:
            ser.write("Close Door")
    elif command == 'LOG':
        #Here I should make sure that nothing
        #is waiting from the Arduino,
        #so that the next two serial lines are the Arduino's
        #response to the "LOG" command,
        #and that handle is the only
        #function talking to the serial port now.
        #The lock will only be open when no other thread is using the port.
        #This thread will wait until it's open.
        with ser_lock:
            while ser.in_waiting > 0:
                data = ser.readline()
                log(data)
                #Should there be any old data, just write it to a file
            #now I can safely execute serial writes and reads.
            ser.write("LOG")
            response = ser.readline()
            response += "\0000000A" + ser.readline()
            #The Arduino's response is now saved as one string
            #and sent to the user.
            bot.sendMessage(chat_id, response)
    print("Command Processed.")

bot = telepot.Bot('BOT TOKEN')
bot.message_loop(handle)
ser = serial.Serial("Arduino Serial Port", 9600)
print('I am listening ...')

while True:
    #anything to make it not run at full speed (recommendations welcome)
    #The log updates are only once an hour.
    sleep(10)
    #here I need to make sure it does not collide with the other thread.
    with ser_lock:
        while ser.in_waiting > 0:
            data = ser.readline()
            log(data)
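For completeness, here is a rough sketch of the queue idea mentioned above (my own illustration, not part of the solution above, and it ignores sending the LOG response back to Telegram): a single worker thread owns the serial port and takes commands from a Queue, so no lock is needed at all.

import queue
import threading

command_queue = queue.Queue()        # other threads only enqueue commands

def serial_worker():
    # The only thread that touches `ser`: execute queued commands, then drain logs.
    while True:
        try:
            cmd = command_queue.get(timeout=1)
            ser.write(cmd)           # e.g. b"Close Door" or b"LOG"
        except queue.Empty:
            pass
        while ser.in_waiting > 0:
            log(ser.readline())

threading.Thread(target=serial_worker, daemon=True).start()

# handle() would then simply do: command_queue.put(b"Close Door")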
I am trying to write a simple server. The server has to load data as the client provides it. I want to use a non-blocking 'recv' call, but whatever I do it still blocks. My code is below.
import socket
import fcntl
import os

server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_socket.bind(("localhost", 8888))
server_socket.listen()

client_socket, addr = server_socket.accept()
#fcntl.fcntl(client_socket, fcntl.F_SETFL, os.O_NONBLOCK) #mark1
client_socket.setblocking(False) #mark2
#client_socket.settimeout(0.0) #mark3

while True:
    data = client_socket.recv(10)
    if len(data) != 0:
        print(data.decode("UTF8"))
    else:
        break

print("Exit")
I tried using the '#mark1', '#mark2' and '#mark3' lines. They seem right to me, but my program still gets stuck on the second iteration at the data = client_socket.recv(10) line.
Why don't client_socket.setblocking(False) and the others have any effect?