Scratch API websocket with Rust

I am trying to create a Rust program that can read and change cloud variables in Scratch. I have read through the various Python libraries that do this; they send request headers when connecting to the clouddata.scratch.mit.edu websocket. Does anyone know how to do this with Rust's tungstenite?
I have tried using the http request builder for this but have only gotten errors. I need a way to connect and send messages to the Scratch cloud variable websocket. I have also tried the approach from "How to set origin header to websocket client in Rust?" but it has not worked.
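For reference, this is roughly the handshake those Python libraries perform, written as a minimal sketch using the websocket-client package; the session cookie, username, project ID, variable name and exact header set are assumptions based on those libraries, not something verified against the Scratch API. The same Origin and Cookie headers can be attached to a tungstenite client request (built via into_client_request(), then modified with headers_mut()) before calling connect.

import json
import websocket  # pip install websocket-client

SESSION_ID = "your scratchsessionsid cookie value"  # placeholder
USERNAME = "your-username"                          # placeholder
PROJECT_ID = "123456789"                            # placeholder

ws = websocket.WebSocket()
ws.connect(
    "wss://clouddata.scratch.mit.edu",
    origin="https://scratch.mit.edu",
    cookie="scratchsessionsid=" + SESSION_ID + ";",
)

# Cloud messages are newline-terminated JSON objects.
ws.send(json.dumps({
    "method": "handshake",
    "user": USERNAME,
    "project_id": PROJECT_ID,
}) + "\n")

# Set a cloud variable; the "☁ " prefix is part of the variable's name in Scratch.
ws.send(json.dumps({
    "method": "set",
    "name": "☁ my variable",
    "value": "1234",
    "user": USERNAME,
    "project_id": PROJECT_ID,
}) + "\n")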

Related

Reusing RabbitMQ connection per expressjs api request

We are trying to use RabbitMQ through the amqplib library (not node-amqplib). All the docs and examples I have read say the connections should be long-lived. My original thought was that as soon as someone makes a request to our web server (Express), it would open a connection, then a channel, then a queue, and so on. But that means we would be constantly opening and closing a connection per request, which is said to not be how you should do it.
So, with that said, if you have an Express server or some Node.js architecture that involves publishing data per HTTP request, is there a standard approach? Do I make the connection object global and pass it into any function that needs it? Is there some better way to get that connection instance per request without re-creating it?
Thanks in advance!

Can I use STOMP in nodejs without message broker?

I'm working on creating a mock server for my Angular application. On the frontend I have a library for STOMP, and normally my frontend communicates with an API written in Java.
Additionally, I start a mock Node.js API which returns hard-coded JSON files when the remote server is down.
Now I'm trying to write a mock Node.js websocket server which will communicate with the Angular client when the remote server is down, but I would like to keep it simple.
I found the StompJs library, but it seems like it needs a STOMP message broker (like RabbitMQ?). That seems a bit complicated for a mock server. Is there any option to skip the broker step and keep it as simple as possible?

Asynchronous Streaming with Flask

So, I have a socket.io connection with a React.js frontend that sends a blob of WAV audio data every 100 ms to my Flask backend. On my backend, I am maintaining a thread-safe queue:
audio_buffer = queue.Queue()

@socketio.on('stream')
def stream(data):
    # data is a properly formatted wav file in bytes
    audio_buffer.put(data)
Now, my goal is that while my backend is taking the streamed data from the frontend, I also want to send that data to Google ASR at the same time. I am basing my code on the documentation here: https://cloud.google.com/speech-to-text/docs/streaming-recognize
Basically, I am going to create a generator with a reference to the queue and hand that to the Google API.
audio_generator = stream.generator()  # Here, I wrote a new method for creating a generator from audio_buffer
requests = (
    speech.StreamingRecognizeRequest(audio_content=content)
    for content in audio_generator
)
responses = client.streaming_recognize(streaming_config, requests)

# Now, put the transcription responses to use.
listen_print_loop(responses)
Now the issue I have is that if I try to send things to the Google API, my backend stops listening to the frontend and just waits for the queue to be populated, ignoring socket.io requests from the client. For example, I tried to have the client send a start command so the backend would create the generator and start streaming data to Google while the client also sends data to populate the queue through the stream handler. What happened was that the backend got stuck streaming data to Google and never got around to actually listening for the data blobs.
My question is: should I use threading or Celery to run the Google API code simultaneously? Basically, I am thinking of instantiating a thread or worker to do this Google API streaming recognize, while the main Flask thread keeps listening to the client and populating the queue. Is this a sound approach? Is there a library that implements this for me?
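For what it's worth, one way to structure that approach is to let Flask-SocketIO spawn the worker itself with start_background_task, so the event handlers keep running while the Google stream is consumed. This is only a sketch assembled from the snippets above (recognize, audio_buffer, speech, client and streaming_config are the objects already introduced; the 'start' event name and the sentinel handling are assumptions), and the updates below describe the eventlet/grpc problems the author ultimately ran into with this kind of setup.

@socketio.on('start')
def start_recognition():
    # start_background_task uses whichever async mode the server was
    # configured with (threading, eventlet or gevent).
    socketio.start_background_task(recognize, audio_buffer)

def recognize(buffer):
    def request_generator():
        while True:
            chunk = buffer.get()   # blocks until the frontend sends audio
            if chunk is None:      # sentinel pushed by a hypothetical 'stop' handler
                return
            yield speech.StreamingRecognizeRequest(audio_content=chunk)

    responses = client.streaming_recognize(streaming_config, request_generator())
    for response in responses:
        for result in response.results:
            if result.is_final:
                socketio.emit("transcription", result.alternatives[0].transcript)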
Update 5/16:
So I am attempting to run the Google API on a standard thread. My code is pretty simple. On start, I start a thread with
yourThread = threading.Thread(target=recognize, args=[stream])
yourThread.start()
and my recognize code takes in the thread-safe queue and sends the audio to the Google API properly. However, my issue now is that the emits I put in that code don't actually make it to the client. My code is based on the Google API sample:
for response in responses:
    # Some other code from the tutorial
    socketio.emit("transcription", transcript)
That last emit call never actually reaches the frontend at all. Looking around, I found the same issue described here: https://github.com/miguelgrinberg/Flask-SocketIO/issues/1239 where the suggested fix was:
import httplib2shim
httplib2shim.patch()
but when I tried that fix and switched async_mode to eventlet, the Google API calls still worked, yet none of the emits reached the client.
Okay, so I solved my own issue. The problem is that flask_socketio plus the Python Google Speech API simply doesn't work, due to an incompatibility between eventlet/gevent and grpc.
My solution was to switch to regular websockets. By using a plain WebSocket on the frontend and flask_sockets on the backend, I managed to get the Google Cloud Speech Python API to work without issue. I've placed some sample code for this on my GitHub: https://github.com/dragon18456/flask-sockets-google-speech-api
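For readers who don't want to dig through the repository, here is a minimal sketch of what the flask_sockets receiving side of that setup can look like; the route name and the assumption that each message is a raw chunk of audio bytes are illustrative, not taken from the linked repo.

import queue
from flask import Flask
from flask_sockets import Sockets

app = Flask(__name__)
sockets = Sockets(app)
audio_buffer = queue.Queue()

@sockets.route('/stream')
def stream_socket(ws):
    # ws is a plain (non-socket.io) WebSocket; push each audio chunk onto the queue.
    while not ws.closed:
        message = ws.receive()
        if message is not None:
            audio_buffer.put(message)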

Node.js pipelining HTTP client agent?

The HTTP client built into Node.js doesn't seem to support pipelining requests. However, it occurred to me that it may be possible to create an Agent that sets up pipelining in the background. There may be issues getting the response data back the way it should be, but perhaps the Agent could fake out some socket objects to make the HTTP client work like normal?
Has this been done? Alternatively, is there an alternative HTTP client that is a drop-in replacement for the built-in one and supports pipelining? (Ultimately, I'd like to use it with the AWS SDK, so it needs to be compatible.)

node.js event streaming as a client

I have a remote event stream that I want to use; however, I can't consume it directly because of CORS limitations and because EventSource doesn't support authentication.
I want to use node.js to get the stream from the source (domainA) and relay it to the requester.
So the flow would be: domainA --> node.js --> client.
I was using https.request to get the data, but it closes the connection after the first response.
So my question is: how would I properly implement this in Node.js? How do I keep the connection alive and the data coming?
I was actually able to resolve this myself. What I was trying to do was simply 'mirror' an event source because of the CORS restrictions. The connection kept closing after the first request because I was sending the wrong headers. After sending "Accept": "text/event-stream", the connection stayed open and the data kept flowing. Thanks anyway!

Resources