Looking for some code samples to solve this problem:
I'd like to write some code (Python or JavaScript) that acts as a subscriber to a RabbitMQ queue, so that on receiving a message it broadcasts the message via WebSockets to any connected clients.
I've looked at Autobahn and node.js (using "amqp" and "ws") but cannot get things to work as needed. Here's the server code in JavaScript using node.js:
var amqp = require('amqp');
var WebSocketServer = require('ws').Server;
var connection = amqp.createConnection({host: 'localhost'});
var wss = new WebSocketServer({port: 8000});

wss.on('connection', function(ws) {
    ws.on('open', function() {
        console.log('connected');
        ws.send(Date.now().toString());
    });
    ws.on('message', function(message) {
        console.log('Received: %s', message);
        ws.send(Date.now().toString());
    });
});

connection.on('ready', function() {
    connection.queue('MYQUEUE', {durable: true, autoDelete: false}, function(queue) {
        console.log(' [*] Waiting for messages. To exit press CTRL+C');
        queue.subscribe(function(msg) {
            console.log(" [x] Received from MYQUEUE %s", msg.data.toString('utf-8'));
            payload = msg.data.toString('utf-8');
            // HOW DOES THIS NOW GET SENT VIA WEBSOCKETS ??
        });
    });
});
Using this code, I can successfully subscribe to a queue in RabbitMQ and receive any messages sent to it. Similarly, I can connect a WebSocket client (e.g. a browser) to the server and send/receive messages. BUT... how can I send the payload of the RabbitMQ message as a WebSocket message at the point indicated ("HOW DOES THIS NOW GET SENT VIA WEBSOCKETS")? I think I'm stuck in the wrong callback, or the callbacks need to be nested somehow...?
Alternatively, if this can be done more easily in Python (via Autobahn and pika), that would be great.
Thanks!
One way to implement your system is to use Python with Tornado.
Here is the server:
import tornado.ioloop
import tornado.web
import tornado.websocket
import os
import pika
from threading import Thread
clients = []
io_loop = None  # set in __main__, used to hand work back to Tornado's thread

def threaded_rmq():
    # blocking pika consumer, runs in its own thread
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    print('Connected: localhost')
    channel = connection.channel()
    channel.queue_declare(queue="my_queue")
    print('Consumer ready, on my_queue')
    # pika >= 1.0 API; older versions used basic_consume(callback, queue=..., no_ack=True)
    channel.basic_consume(queue="my_queue",
                          on_message_callback=consumer_callback,
                          auto_ack=True)
    channel.start_consuming()

def consumer_callback(ch, method, properties, body):
    print(" [x] Received %r" % (body,))
    # write_message is not thread-safe, so schedule it on Tornado's IOLoop
    for itm in clients:
        io_loop.add_callback(itm.write_message, body)

class SocketHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        print("WebSocket opened")
        clients.append(self)

    def on_message(self, message):
        self.write_message(u"You said: " + message)

    def on_close(self):
        print("WebSocket closed")
        clients.remove(self)

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        print("get page")
        self.render("websocket.html")

application = tornado.web.Application([
    (r'/ws', SocketHandler),
    (r"/", MainHandler),
])

if __name__ == "__main__":
    io_loop = tornado.ioloop.IOLoop.current()
    thread = Thread(target=threaded_rmq, daemon=True)
    thread.start()
    application.listen(8889)
    io_loop.start()
and here is the HTML page:
<html>
<head>
<script src="//code.jquery.com/jquery-1.11.0.min.js"></script>
<script>
$(document).ready(function() {
    var ws;
    if ('WebSocket' in window) {
        ws = new WebSocket('ws://localhost:8889/ws');
    } else if ('MozWebSocket' in window) {
        ws = new MozWebSocket('ws://localhost:8889/ws');
    } else {
        alert("your browser doesn't support web sockets");
        return;
    }

    ws.onopen = function(evt) { alert("Connection open ..."); };
    ws.onmessage = function(evt) {
        alert(evt.data);
    };

    function closeConnect() {
        ws.close();
    }
});
</script>
</head>
</html>
So when you publish a message to "my_queue", the message is redirected to all connected web pages.
I hope it can be useful.
EDIT: you can find the complete example at https://github.com/Gsantomaggio/rabbitmqexample
Currently, I have a simple websocket server that can handle recv and send operations. The code is as follows:
import asyncio
import json
import websockets
from datetime import datetime

async def recv_handler(websocket):
    while True:
        try:
            message = await websocket.recv()
            print(message)
        except Exception as e:
            print(e)
        await asyncio.sleep(0.01)

async def send_handler(websocket):
    while True:
        try:
            data = {
                "type": "send",
                "time": datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            }
            await websocket.send(json.dumps(data))
        except Exception as e:
            print(e)
        await asyncio.sleep(0.01)

async def main(websocket):
    recv_task = asyncio.create_task(recv_handler(websocket))
    send_task = asyncio.create_task(send_handler(websocket))
    await asyncio.gather(recv_task, send_task)

async def start_server():
    server = await websockets.serve(main, "", 3001)
    await server.wait_closed()

if __name__ == "__main__":
    asyncio.run(start_server())
This successfully runs a server that can handle messages sent from a client node.js application over WebSockets, as well as send updates to the client periodically.
// receive a message from the server
socket.addEventListener("message", ({ data }) => {
    const packet = JSON.parse(data);
    switch (packet.type) {
        case "send":
            console.log(packet.time);
            break;
        default:
            break;
    }
});

// send message to server
const onClickSend = () => {
    if (socket.readyState !== WebSocket.OPEN) {
        console.log("socket not open");
        return;
    }
    socket.send(JSON.stringify({
        type: "hello from client",
    }));
}
Now, I want to include a blocking function call that sends a pycurl (or any HTTP) request, then use the result of that request, package it into the JSON object, and send it to the client.
I have a sample pycurl request that gets the weather from wttr.in:
import pycurl
import certifi
from io import BytesIO

def getWeather():
    # create a buffer, since cURL does not allocate one for the network response
    buffer = BytesIO()
    c = pycurl.Curl()
    # initialize the request URL
    c.setopt(c.URL, 'wttr.in/Beijing?format="%l:+\%c+%t+%T\\n"')
    # write the response into the buffer
    c.setopt(c.WRITEDATA, buffer)
    # set the file holding the CA certificates
    c.setopt(c.CAINFO, certifi.where())
    # perform the transfer
    c.perform()
    # end the session and free the resources
    c.close()
    # retrieve and decode the response body
    body = buffer.getvalue()
    return body.decode('utf-8')
So if we change data to include our weather:
data = {
    "type": "send",
    "weather": getWeather(),
}
and slightly change the node.js application's case statement to print it:
case "send":
    console.log(packet.weather);
    break;
The problem with this, I believe, is that we are making a blocking request, but I don't know enough to fix it. Currently, I can make requests, but every time onClickSend is called (by pressing a button in the frontend), we get an error saying "socket not open", meaning the backend is no longer handling receive messages.
So how do I handle pycurl requests in an asyncio websocket program?
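One common pattern (a sketch, not tested against wttr.in) is to push the blocking call into a thread with loop.run_in_executor (or asyncio.to_thread on Python 3.9+), so the event loop keeps serving the sockets while the request runs. Here getWeather is replaced by a stand-in blocking function, and the websocket send by a plain callable, to keep the example self-contained:

```python
import asyncio
import json
import time

def blocking_request():
    # stand-in for getWeather(); any blocking I/O call goes here
    time.sleep(0.2)
    return "Beijing: sunny +25"

async def send_weather(send):
    # run the blocking call in the default thread pool so the
    # event loop stays free to handle recv/send on the sockets
    loop = asyncio.get_running_loop()
    weather = await loop.run_in_executor(None, blocking_request)
    data = {"type": "send", "weather": weather}
    await send(json.dumps(data))

async def demo():
    sent = []
    async def fake_send(payload):
        sent.append(payload)
    # while the executor thread runs, other coroutines keep making progress
    await asyncio.gather(send_weather(fake_send), asyncio.sleep(0.05))
    return sent

if __name__ == "__main__":
    print(asyncio.run(demo()))
```

In the real server, send_weather would receive the actual websocket and call websocket.send; the key point is that the blocking call never runs directly inside a coroutine.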
Here's the problem. I have a simple Flask web server with the SocketIO module connected. On the client side there is a container that uses a JS script to get messages from the server and show them in the browser in real time. On the server side I need to listen on a UDP port to receive messages and transmit them to the client's browser. When I used BackgroundScheduler() to generate periodic test messages, it worked fine. Now I'm adding my port listener to the web server, executed in another thread, and it doesn't transmit messages. It listens on the UDP port, it can print received messages to the Python console, and it still serves the HTML page as before, but the messages are not passed from the thread to the Flask app. I've tried to use multiprocessing and moved the port listener to a separate process, but it didn't work either. It doesn't raise any exceptions, it just does nothing. How can I solve this?
class TestUdp(BaseRequestHandler):
    def handle(self):
        data = self.request[0].strip()
        socket = self.request[1]
        if data:
            socketio.send(data.decode('utf-8'))
            socket.sendto(data.upper(), self.client_address)

def run():
    HOST, PORT = "127.0.0.1", 7030
    with UDPServer((HOST, PORT), TestUdp) as server:
        server.serve_forever()
This listener is started from my Flask route:
@app.route('/')
@app.route('/index')
@login_required
def index():
    t1 = Thread(target=run, daemon=True)
    t1.start()
    return render_template('main.html')
JS code from the HTML:
document.addEventListener('DOMContentLoaded', () => {
    var socket = io.connect('http://' + document.domain + ':' + location.port);
    socket.on('connect', () => {
        socket.send('I\'m connected!');
    });
    socket.on('message', data => {
        console.log(`${data}`);
    });
    socket.on('message', data => {
        const p = document.createElement('li');
        // const br = document.createElement('br');
        p.innerHTML = "<li>" + data + "</li>";
        // document.querySelector('#message').append(p);
        var resultList = document.getElementById('resultList');
        resultList.append(p);
    });
})
I'm trying to implement an asynchronous client and server using pyzmq and asyncio in Python 3.5. I've used the asyncio support provided by pyzmq. Below is my code for the client (requester.py) and server (responder.py). My requirement is to use only REQ and REP ZMQ sockets to achieve async client-server communication.
requester.py:
import asyncio
import zmq
import zmq.asyncio

async def receive():
    message = await socket.recv()
    print("Received reply ", "[", message, "]")
    return message

async def send(i):
    print("Sending request ", i, "...")
    request = "Hello:" + str(i)
    await socket.send(request.encode('utf-8'))
    print("sent:", i)

async def main_loop_num(i):
    await send(i)
    # Get the reply.
    message = await receive()
    print("Message :", message)

async def main():
    await asyncio.gather(*(main_loop_num(i) for i in range(1, 10)))

port = 5556
context = zmq.asyncio.Context.instance()
socket = context.socket(zmq.REQ)
socket.connect("tcp://localhost:%d" % port)
asyncio.get_event_loop().run_until_complete(asyncio.wait([main()]))
responder.py:
import asyncio
import zmq
import zmq.asyncio

async def receive():
    message = await socket.recv()
    print("Received message:", message)
    await asyncio.sleep(10)
    print("Sleep complete")
    return message

async def main_loop():
    while True:
        message = await receive()
        print("back to main loop")
        await socket.send(("World from %d" % port).encode('utf-8'))
        print("sent back")

port = 5556
context = zmq.asyncio.Context.instance()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:%d" % port)
asyncio.get_event_loop().run_until_complete(asyncio.wait([main_loop()]))
The output that I'm getting is:
requester.py:
Sending request 5 ...
sent: 5
Sending request 6 ...
Sending request 1 ...
Sending request 7 ...
Sending request 2 ...
Sending request 8 ...
Sending request 3 ...
Sending request 9 ...
Sending request 4 ...
responder.py:
Received message: b'Hello:5'
Sleep complete
back to main loop
sent back
From the output, I assume that the requester has sent multiple requests, but only the first one reached the responder. Also, the response sent by the responder for the first request never reached the requester. Why does this happen? I have used async methods everywhere possible, yet the send() and recv() methods are not behaving asynchronously. Is it possible to do async REQ-REP without using any other sockets like ROUTER, DEALER, etc.?
ZMQ's REQ-REP sockets expect a strict order of one request, one reply, one request, one reply, ...
Your requester.py starts all of its requests in parallel:
await asyncio.gather(*(main_loop_num(i) for i in range(1,10)))
When sending the second request, ZMQ complains about this:
zmq.error.ZMQError: Operation cannot be accomplished in current state
Try changing your main function to send one request at a time:
async def main():
    for i in range(1, 10):
        await main_loop_num(i)
If you need to send several requests in parallel, then you can't use a REQ-REP socket pair; use, for example, a DEALER-REP socket pair.
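A minimal sketch of that DEALER-REP variant, using an inproc transport so it runs in one process (endpoint name and message contents are illustrative): the DEALER side can have several requests in flight because it sends the envelope explicitly (an empty delimiter frame before the payload) instead of following REQ's strict lockstep.

```python
import asyncio
import zmq
import zmq.asyncio

async def responder(sock, n):
    # plain REP: one recv, one send, repeated
    for _ in range(n):
        msg = await sock.recv()
        await sock.send(b"World " + msg)

async def requester(sock, n):
    # DEALER can fire all requests before reading any reply
    for i in range(n):
        # DEALER must add the empty delimiter frame that REQ adds implicitly
        await sock.send_multipart([b"", b"Hello:%d" % i])
    replies = []
    for _ in range(n):
        _empty, reply = await sock.recv_multipart()
        replies.append(reply)
    return replies

async def main(endpoint="inproc://reqrep-demo"):
    ctx = zmq.asyncio.Context()
    rep = ctx.socket(zmq.REP)
    rep.bind(endpoint)
    dealer = ctx.socket(zmq.DEALER)
    dealer.connect(endpoint)
    rep_task = asyncio.ensure_future(responder(rep, 3))
    replies = await requester(dealer, 3)
    await rep_task
    dealer.close()
    rep.close()
    ctx.term()
    return replies

if __name__ == "__main__":
    print(asyncio.run(main()))
```

The REP side still replies strictly one at a time; what changes is that the DEALER queues all its requests up front and collects the replies as they come back.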
How to use Flask-SocketIO for heartbeat detection.
The current requirement is to create an interface. When the client connects to it, the server periodically sends a message to the client, and the client responds with a message to the server; that proves the client is alive: heartbeat detection.
I have tried to write the code using Flask-SocketIO and it runs successfully, but I don't know how to test multiple connections. If a connection is broken, I don't know how the thread stops.
class MyCustomNamespace(Namespace):
    clients = []

    def on_connect(self):
        sid = request.sid
        print('client connected websocket: {}'.format(sid))
        self.clients.append(sid)
        global thread
        with thread_lock:
            if thread is None:
                thread = socketio.start_background_task(self.background_thread,
                                                        current_app._get_current_object())

    def on_disconnect(self):
        sid = request.sid
        print('close websocket: {}'.format(sid))
        self.clients.remove(sid)

    def on_message(self, data):
        print('received message: ' + data["param"])
        if data["param"] is not None:
            print('websocket {} is alive'.format(request.sid))
        # else:
        #     print('websocket {} is dead'.format(*rooms()))
        #     self.disconnect(*rooms())

    def background_thread(self, app):
        with app.app_context():
            while True:
                time.sleep(6)
                data = {"status": "ok"}
                socketio.emit('server_detect', data, namespace='/testnamespace', broadcast=True)
                print("probing whether clients are still alive")

socketio.on_namespace(MyCustomNamespace("/testnamespace"))
html
$(document).ready(function () {
    namespace = '/testnamespace';
    webSocketUrl = location.protocol + '//' + document.domain + ':' + location.port + namespace;
    console.log(webSocketUrl);
    var socket = io.connect(webSocketUrl);
    socket.on('server_detect', function (res) {
        console.log(res);
        socket.emit('message', {'param': 'value'});
    });
});
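To answer the "how do I know a client is dead" part, one option (a sketch, assuming you record a timestamp whenever a client answers server_detect) is to keep a last-seen time per sid and treat any client whose last reply is older than a timeout as dead; the background thread can then disconnect those sids after each probe round:

```python
import time

class HeartbeatTracker:
    """Track the last heartbeat reply per client sid."""

    def __init__(self, timeout=15.0):
        self.timeout = timeout
        self.last_seen = {}

    def beat(self, sid, now=None):
        # call this from on_message when a client answers server_detect
        self.last_seen[sid] = time.monotonic() if now is None else now

    def dead_clients(self, now=None):
        # call this from the background thread after each probe round;
        # any sid returned here can be disconnect()-ed and removed
        now = time.monotonic() if now is None else now
        return [sid for sid, seen in self.last_seen.items()
                if now - seen > self.timeout]

if __name__ == "__main__":
    tracker = HeartbeatTracker(timeout=10)
    tracker.beat("alive-sid", now=100.0)
    tracker.beat("dead-sid", now=80.0)
    print(tracker.dead_clients(now=95.0))
```

Using time.monotonic() rather than wall-clock time keeps the timeout arithmetic immune to system clock adjustments.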
I'm new to asynchronous programming in Python and I'm trying to write a script that starts a websocket server, listens for messages, and also sends messages when certain events (e.g. pressing the 's' key) are triggered in a GTK window. Here's what I have so far:
import gi
gi.require_version('Gtk', '3.0')
from gi.repository import Gtk
import asyncio
import websockets
import threading

ws = None

async def consumer_handler(websocket, path):
    global ws
    ws = websocket
    await websocket.send("Hello client")
    while True:
        message = await websocket.recv()
        print("Message from client: " + message)

def keypress(widget, event):
    global ws
    if event.keyval == 115 and ws:  # 's' key pressed, connection open
        asyncio.get_event_loop().create_task(ws.send("key press"))
        print("Message to client: key press")

def quit(widget):
    Gtk.main_quit()

window = Gtk.Window(Gtk.WindowType.TOPLEVEL)
window.connect("destroy", quit)
window.connect("key_press_event", keypress)
window.show()

start_server = websockets.serve(consumer_handler, 'localhost', 8765)
asyncio.get_event_loop().run_until_complete(start_server)
wst = threading.Thread(target=asyncio.get_event_loop().run_forever)
wst.daemon = True
wst.start()

Gtk.main()
And here's the client webpage:
<!DOCTYPE html>
<html>
<head>
    <title>Websockets test page</title>
    <meta charset="UTF-8" />
    <script>
        var exampleSocket = new WebSocket("ws://localhost:8765");

        function mylog(msg) {
            document.getElementById("log").innerHTML += msg + "<br/>";
        }

        function send() {
            mylog("Message to server: Hello server");
            exampleSocket.send("Hello server");
        }

        exampleSocket.onopen = function (event) {
            mylog("Connection opened");
        };
        exampleSocket.onmessage = function (event) {
            mylog("Message from server: " + event.data);
        };
    </script>
</head>
<body>
    <p id="log"></p>
    <input type="button" value="Send message" onclick="send()"/>
</body>
</html>
Run the Python code, then load the webpage in a browser; any messages the browser sends show up on the Python stdout, so far so good. But if you hit the 's' key in the GTK window, Python doesn't send the message until another message is received from the browser (from pressing the 'Send message' button). I thought await websocket.recv() was meant to return control to the event loop until a message was received? How do I get it to send messages while it's waiting to receive?
But if you hit the 's' key in the gtk window, python doesn't send the message until another message is received from the browser
The problem is in this line:
asyncio.get_event_loop().create_task(ws.send("key press"))
Since the asyncio event loop and the GTK main loop are running in different threads, you need to use run_coroutine_threadsafe to submit the coroutine to asyncio. Something like:
asyncio.run_coroutine_threadsafe(ws.send("key press"), loop)
create_task adds the coroutine to the queue of runnable coroutines, but fails to wake up the event loop, which is why your coroutine is only run when something else happens in asyncio. Also, create_task is not thread-safe, so calling it while the event loop itself is modifying the run queue could corrupt its data structures. run_coroutine_threadsafe has neither of these problems, it arranges for the event loop to wake up as soon as possible, and it uses a mutex to protect the event loop's data structures.