Sending updated messages to Socket Server using selectors library - python-3.x

I have followed a tutorial on Python sockets and am trying to modify the client side to send updated data, so far in vain. The source code of the tutorial can be found here. The problem is that I am not able to understand how to send updated messages with Python selectors after registering the connection. The example I am trying to modify is:
Server works fine.
Client: here I want to send data continuously at runtime.
In the example, two messages are sent from the client side, but they are defined at the time the connection is registered; what I want to do is send updated data at runtime.
I have tried to modify the registered object using the modify() method from the selectors library, but that didn't work.
One idea I have is to trigger a write event from the client side after updating the message, but even after spending quite some time I have not been able to find out how to do that.
Any ideas on how to send updated messages at runtime using selectors would be highly appreciated.
Edit 01
On both the server and client side, this and this line need to be commented out to prevent the connection from closing after the first message.
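A minimal, self-contained sketch of that write-trigger idea (independent of the tutorial's Message class; the address and the queue_message helper below are assumptions, not part of the tutorial): keep the socket registered for reads, switch the event mask to include EVENT_WRITE whenever new data is queued, and switch back once the queue is drained.

import queue
import selectors
import socket

sel = selectors.DefaultSelector()
outgoing = queue.Queue()   # filled at runtime by whatever produces new data

sock = socket.create_connection(("127.0.0.1", 65432))
sock.setblocking(False)
sel.register(sock, selectors.EVENT_READ)

def queue_message(payload: bytes):
    # Call this whenever updated data becomes available at runtime.
    outgoing.put(payload)
    # Ask the selector to also report writability, so the loop below
    # gets a chance to send the newly queued data.
    sel.modify(sock, selectors.EVENT_READ | selectors.EVENT_WRITE)

while True:
    for key, mask in sel.select(timeout=1):
        if mask & selectors.EVENT_READ:
            data = key.fileobj.recv(4096)
            if data:
                print("received", repr(data))
        if mask & selectors.EVENT_WRITE:
            try:
                # Partial sends are not handled in this sketch.
                key.fileobj.send(outgoing.get_nowait())
            except queue.Empty:
                # Nothing left to send: stop watching for writability.
                sel.modify(sock, selectors.EVENT_READ)

If the updated data is produced on another thread, the loop may be blocked in select() at that moment; a common trick is to also register one end of a socket.socketpair() and write a byte to it to wake the selector up.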

Related

How is it possible that node.js creates 2 instances of a singleton class?

Setup: typescript 4.9.4, next.js 13.0.6
What I'm trying to do: a messenger app. The messaging system is based on Server-sent events (SSE), not WS.
What's the idea: incoming messages are handled by an SSE endpoint: https://github.com/sincerely-manny/messenger.nextjs/blob/main/pages/api/messenger/incoming.ts
Outgoing messages are accepted as POST requests here: https://github.com/sincerely-manny/messenger.nextjs/blob/main/pages/api/messenger/outgoing.ts
A singleton class keeps the list of clients/connections and their response objects: https://github.com/sincerely-manny/messenger.nextjs/blob/main/lib/sse/serverSentEvents.ts
Whenever anything needs to send a message, it grabs the instance of the SSE class and calls the "send" method.
Front-end part: https://github.com/sincerely-manny/messenger.nextjs/blob/main/app/messenger/page.tsx
Expected behaviour: upon establishing the first connection, an instance of the SSE class is created. Then every call to the send method finds the response object corresponding to the client and puts the message onto the stream.
Actual behaviour: upon connecting to the SSE endpoint, instance (1) of the class is created and the client is registered in its list. But (!) sending a message creates another (2) instance of the singleton class with an empty clients list, so the sent message is lost. But (!!) after refreshing the page and creating a new connection, the app picks up this second (2) instance, puts the client there, and everything starts working as expected.
The question: how is that possible, and what should I do to avoid this unwanted behaviour?
Update: it turns out the problem occurs only in dev mode, while pages are compiled on the fly. That makes it easier to live with, but doesn't explain why it happens.

Asynchronous Streaming with Flask

So, I have a socket.io connection with a React.js frontend that sends a blob of wav audio data every 100 ms to my Flask backend. On the backend, I am maintaining a thread-safe queue:
import queue

audio_buffer = queue.Queue()

# socketio is the Flask-SocketIO instance created elsewhere
@socketio.on('stream')
def stream(data):
    # data is a properly formatted wav file in bytes
    audio_buffer.put(data)
Now, my goal is that while my backend is receiving the streamed data from the frontend, I also want to send the data to google ASR from my backend at the same time. I am basing my code on the documentation here: https://cloud.google.com/speech-to-text/docs/streaming-recognize
Basically, I am going to create a generator with a reference to the queue and give that to the google api.
audio_generator = stream.generator()  # Here, I wrote a new method for creating a generator from audio_buffer
requests = (
    speech.StreamingRecognizeRequest(audio_content=content)
    for content in audio_generator
)
responses = client.streaming_recognize(streaming_config, requests)
# Now, put the transcription responses to use.
listen_print_loop(responses)
Now the issue I have is that as soon as I try to send things to the google api, my backend stops listening to the frontend and just waits for the queue to be populated, ignoring socket.io requests from the client. For example, I tried to have the client send a start command telling the backend to create the generator and start streaming data to google, while the client also sends data to populate the queue through the stream handler. What happened was that the backend got stuck trying to stream data to google and never got around to actually listening for the data blobs.
My question is: should I attempt to use threading or Celery to run the google api code simultaneously? Basically, I am thinking of spawning a thread or worker to do the google api streaming recognition, while the main flask thread keeps listening to the client and populating the queue. Is this a sound approach? Is there a library that implements this for me?
Update 5/16:
So I am attempting to run the google api on a standard thread. My code is pretty simple. On start, I start a thread with
yourThread = threading.Thread(target=recognize, args=[stream])
and my recognize function takes the thread-safe queue and actually sends the data to the google api properly. However, my issue now is that the emits I put in the code don't actually make it to the client. My code is based on the google api sample code:
for response in responses:
    # Some other code from the tutorial
    socketio.emit("transcription", transcript)
That last emit call never actually gets to the frontend at all. Looking around, I found the same issue reported here: https://github.com/miguelgrinberg/Flask-SocketIO/issues/1239
import httplib2shim
httplib2shim.patch()
but when I tried that fix and switched async_mode to eventlet, the google api calls still worked, yet none of the emits reached the client.
Okay, so I solved my own issue. The problem is simply that flask_socketio and the python google speech api don't work together, due to an incompatibility between eventlet/gevent and grpc.
My solution was to switch to using regular websockets. By using a regular websocket on the frontend and flask_sockets on the backend, I managed to get the google cloud speech python api to work without issue. I've placed some sample code for this on my github: https://github.com/dragon18456/flask-sockets-google-speech-api
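The linked repo has the full working example; the flask_sockets side of it is, roughly, a receive loop that feeds the same queue. A minimal sketch with an assumed route name (not the repo's exact code):

from flask import Flask
from flask_sockets import Sockets
import queue

app = Flask(__name__)
sockets = Sockets(app)
audio_buffer = queue.Queue()

@sockets.route('/stream')
def stream_socket(ws):
    # Read binary audio frames from the browser's plain WebSocket and
    # push them onto the queue consumed by the recognize thread.
    while not ws.closed:
        data = ws.receive()
        if data is not None:
            audio_buffer.put(data)

The app then has to be served with gevent's pywsgi server and geventwebsocket's WebSocketHandler, as shown in the flask_sockets README.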

Socket Send(2) - How to send an error message from server to client

I'm doing some very simple TCP socket programming right now and I have a small problem I can't seem to find the answer to. Basically, I'm building a server program and a client program that allow the client to request a file from the server, which the server will then send.
My problem is that if the client requests a file that the server doesn't have I need to send an error message back to the client. Is there a flag that I can set to do this? I can't just send a string containing an error message because it is possible that that message could appear in a file transfer at some point and trigger an incorrect response.
I have looked through the MAN pages and some other resources but I couldn't seem to figure it out. I'm working in a Linux environment.
Thank you!
See comments on my original post from kaylum and Remy Lebeau for the solution.
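The comments aren't reproduced here, so the following is just one common way to handle this (not necessarily what was suggested there): define a small application-level protocol in which every reply starts with a fixed-size header carrying a status code and the payload length, so error text can never be mistaken for file bytes. The framing idea is language-agnostic; a minimal Python sketch, with the status codes assumed:

import socket
import struct

STATUS_OK = 0
STATUS_NOT_FOUND = 1

def send_response(conn: socket.socket, status: int, payload: bytes = b"") -> None:
    # Fixed-size header: 1-byte status code + 4-byte payload length,
    # so the receiver never has to guess whether the bytes that follow
    # are file data or an error message.
    conn.sendall(struct.pack("!BI", status, len(payload)) + payload)

def recv_response(conn: socket.socket):
    header = _recv_exact(conn, 5)
    status, length = struct.unpack("!BI", header)
    return status, _recv_exact(conn, length)

def _recv_exact(conn: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

With a header like this, the client always knows how many payload bytes follow and whether they are file content or an error message.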

Can I have different websocket inputs to the same server with output to different clients?

Let me clarify the title of the question.
On the client side I have two different html files, client1.html and client2.html. They send data via websockets to the same server.js file, in node.js.
On the server side, if the data came from client1.html I want to perform a query and send the outcome back to client1.html. On the same server, if the data came from client2.html I want to perform another query and send the message "Data Saved" to client2.html.
I guess I have to create two different functions in server.js. OK.
But my problem is: on the server side, how do I tell which data came from which client?
Also, how can the server send the right message back to the right client?
Thanks in advance
You have to register your clients. For example, if user A is on page client1.html, then you send a message (via websockets), for example as JSON (or any other format you like):
{ "user": "A", "page": "client1.html" }
Now on the server side you just mark that this user/connection came from client1.html. You can, for example, add a custom property:
conn.source = "client1.html";
Or do it in any other way (depending, for example, on the framework).
You might even use a handshake for this (instead of sending JSON): when connecting to the server, do something like this on the client side:
var ws = new WebSocket("ws://myserver/client1.html");
Now you just have to do the same in the handshake code (client1.html is now part of the URL in the handshake).
As for the other question: on the server side you keep lists of all users for client1.html, client2.html, etc. The rest is straightforward: you loop over the target list and send a notification to those users.
Of course there are many small details here, like having to remove users from the lists when a connection is dead (so you need a background task that checks whether connections are alive), etc. But that's the general idea.
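The bookkeeping itself is framework-agnostic; a minimal sketch of the idea, shown in Python just to make it concrete (the conn object and names are assumed, not tied to any particular websocket library):

from collections import defaultdict

# page name -> set of live connections
clients = defaultdict(set)

def register(conn, page):
    # Mark where the connection came from and remember it.
    conn.source = page
    clients[page].add(conn)

def unregister(conn):
    clients[conn.source].discard(conn)

def notify(page, message):
    # Loop over the target list and push the message to every client
    # that registered from that page.
    for conn in list(clients[page]):
        conn.send(message)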

node.js event streaming as a client

I have a net stream that I want to use; however, I can't use it directly due to CORS limitations and the fact that EventSource doesn't support authentication.
I want to use node.js to get the stream from the source (domainA) and relay it to the requester.
so the flow would be: domainA --> node.js --> client
I was using https.request to get the data, but the problem is that it closes the connection after the first response.
So my question is: how would I properly implement this in node.js? How do I keep the connection alive and the data coming?
I was actually able to resolve this myself. What I was trying to do is simply 'mirror' an event source due to CORS restrictions. The connection kept closing after the first request because I was actually sending the wrong headers. After sending "Accept": "text/event-stream" the connection stayed open and the data kept flowing. Thanks anyway!
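The header is the whole trick and is independent of node.js; for illustration only (in Python, to match the other examples on this page, with a hypothetical upstream URL), the relaying request just needs Accept: text/event-stream and a streaming read:

import requests

# Hypothetical upstream SSE endpoint on domainA.
resp = requests.get(
    "https://domainA.example/stream",
    headers={"Accept": "text/event-stream"},
    stream=True,
)
for line in resp.iter_lines(decode_unicode=True):
    if line:
        print(line)  # relay each event line to the downstream client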
