How to use Socket.IO in an Azure Web App for Containers - Node.js

I am using Web App for Containers in Azure:
- platform: Linux

My application is written in Node.js with Socket.IO.
I use Docker to build the container.
I've already tested on my local machine and the WebSocket works fine, but when I deploy to Web App for Containers and try to connect the WebSocket, I always get error code 503 Service Temporarily Unavailable.

Related

NodeJs + Puppeteer on Azure App Services fails to run

I've written a simple Node.js (Express) server that uses Puppeteer to generate PDF files from JSON data passed to it. While locally everything works like a charm, I'm struggling to run this server on Azure App Services.
I've created a resource group, and within it an App Service instance (running on Linux) that is connected to my repo at Azure DevOps (via the Deployment Center).
My server has two endpoints:
/ - returns the JSON { status: "ok" }. I'm using this to validate that the server instance is running.
/generate-pdf - uses the Puppeteer to generate and return a PDF file.
After successfully starting the App Service instance, I'm able to access the "/" route and get a valid response, but accessing the "/generate-pdf" route results in "502 - Bad Gateway".
Does my instance require some additional configuration that I haven't done?
Can App Services not run Puppeteer? Perhaps there is a different Azure service that I need to use?
Is there a way to automate the process via the Azure DevOps pipeline or release?
Any questions/thoughts/FAQs are more than welcome. Thanks =)
I'm answering my own question: as was mentioned here https://stackoverflow.com... Azure App Services does not allow the use of GDI (which is required by Chrome), regardless of whether you're using a Linux- or Windows-based system. The solution was to put the Node.js application into a Docker container and manually install Chrome. Once you have a container, just upload it to Azure App Services and voilà!
By default App Service exposes ports 80 and 443, so if your application listens on a different port, be sure to specify it via the WEBSITES_PORT environment variable.
In my case I had to upload the Docker image to Docker Hub, but you can also set up a pipeline to automate the process.
I built the Docker image on my M1 Pro, which led to some architecture issues when the container was uploaded to Azure. Be sure to pass --platform linux/amd64 in the image-building step if you're building for Linux.
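A sketch of the kind of Dockerfile this answer describes. The base image and package names are assumptions; Debian-based images typically get Chromium from apt, and the two PUPPETEER_* variables tell Puppeteer to use the system browser instead of downloading its own:

```dockerfile
# Assumed base image and package names; adjust to your distro.
FROM node:18-slim

# Install Chromium and fonts - the system libraries App Service
# itself does not provide.
RUN apt-get update && apt-get install -y chromium fonts-liberation \
    && rm -rf /var/lib/apt/lists/*

# Skip Puppeteer's bundled download and point it at the system Chromium.
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium

WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .

# App Service forwards to this port when WEBSITES_PORT=3000 is set.
EXPOSE 3000
CMD ["node", "server.js"]
```

Build with `docker build --platform linux/amd64 -t myimage .` per the note above. Inside a container, Chrome usually also needs to be launched with the `--no-sandbox` argument, since the process runs as root by default.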

Azure: change an existing Windows Web App to a Linux Web App (docker)

I have an existing Windows Web App in Azure which I want to migrate to a Linux (Docker) Web App. It is easy to set up a new Linux Web App with a new URI, and that all works. However, I need to preserve the existing URI (myservice.azurewebsites.net) that I have on the Windows Web App and use it in the Linux Web App.
Any downtime is not acceptable, so I cannot just "test" if I can remove the current Windows Web App and "re-use" the same URI.
Created an ASP.NET Core web app and deployed it to an Azure Windows App Service.
Tried to deploy the same web app to a Linux App Service (Docker) and got a warning.
Tried to continue the steps to provide the configuration access to the Docker container and got an error.
For apps we want to deploy to Azure Linux Docker, we need to enable Docker and provide the Docker OS, which is not required for an Azure Windows App Service.
As for removing the current Windows Web App and "re-using" the same URI: it is suggested to create a new App Service plan and deploy to a Linux (Docker) app.
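The reason a new plan is needed: an App Service plan is either Windows or Linux, and all apps in it share that OS. A sketch with the Azure CLI, with hypothetical resource, plan, and image names:

```shell
# Hypothetical names; a Linux (Docker) app cannot live in a Windows plan,
# so create a separate Linux App Service plan first.
az appservice plan create --resource-group my-rg --name my-linux-plan \
    --is-linux --sku B1

az webapp create --resource-group my-rg --plan my-linux-plan \
    --name my-linux-app \
    --deployment-container-image-name myregistry/myimage:latest
```

Note that the myservice.azurewebsites.net hostname is derived from the app name and is globally unique, which is what makes preserving it across the migration awkward.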

Access local services from firebase cloud functions emulator (ECONNREFUSED connecting to localhost)

Obviously Firebase Cloud Functions cannot access http services on localhost, once deployed (as per this answer). However, is there a way to test local http services when running the cloud functions emulator locally?
I have a node.js app and various firebase emulators running in docker containers (with docker-compose), all using different ports. I need my cloud function to send a POST request to the node app, and I'd like to test this all on my local machine.
The cloud function (hosted at http://0.0.0.0:3318) gets an ECONNREFUSED error when I attempt to post a request to my node app (at http://localhost:2018) using axios.
Using the magic of docker-compose networking, I found I could connect to my node.js service (named api in my docker-compose.yml) from my emulated function simply by using the URL http://api:2018
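A sketch of the docker-compose.yml layout this answer depends on. The ports come from the question; the service names and build paths are assumptions:

```yaml
# Inside the compose network, each service name is a hostname:
# "api" resolves to the node app's container, while "localhost" inside
# the functions container is the functions container itself.
services:
  api:
    build: ./api
    ports:
      - "2018:2018"
  functions:
    build: ./functions
    ports:
      - "3318:3318"
    environment:
      - API_URL=http://api:2018   # call the api service by name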

Connecting Application to Azure SignalR Service

I'm trying to connect my application to Azure SignalR Service. The application works fine using SignalR locally or even using Azure App Services.
services.AddSignalR();
The code above is what I had and everything works.
services.AddSignalR().AddAzureSignalR(Configuration.GetValue<string>("SignalRConnectionString"));
This is the change to be able to use Azure SignalR Service.
In the logs, I can see that the request is being forwarded to the Azure End Point.
I keep getting this error.
failed: HTTP Authentication failed; no valid credentials available
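This failure often points at the connection string rather than the hub code, since the AccessKey in it signs the tokens presented to the service (an assumption - the question does not show the configuration). A sketch of the appsettings.json entry the code above reads, with placeholder endpoint and key in the standard Azure SignalR connection-string shape:

```json
{
  "SignalRConnectionString": "Endpoint=https://<your-instance>.service.signalr.net;AccessKey=<key-from-portal>;Version=1.0;"
}
```

If the access key was regenerated in the portal, the old connection string keeps failing with exactly this kind of error, so re-copying it from the portal is a cheap first check.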

How to deploy realm object server to cloud foundry

https://realm.io/docs/get-started/installation/developer-edition/
It seems the Realm Object Server is based on Node.js; however, I could not get it working.
So far I can deploy it to Cloud Foundry successfully, but it just doesn't work.
Realm studio just hangs there.
Apps are deployed to Cloud Foundry with the cf push command and run inside a Linux container. The contents of the container are put together by a buildpack. Since Realm is a Node app, you should look at the Node.js buildpack documentation: https://docs.cloudfoundry.org/buildpacks/node/node-tips.html
Apps running in a Cloud Foundry container have an ephemeral disk, so if your Realm application needs a persistent disk you should not run it in a container on Cloud Foundry, unless you have a service broker that can mount NFS volumes into the container.
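For reference, a push like the answer describes would use a manifest along these lines (the app name, memory, and buildpack label are assumptions):

```yaml
# Hypothetical manifest.yml for cf push; the ephemeral-disk caveat above
# still applies, so Realm's data would not survive a restart.
applications:
  - name: realm-object-server
    memory: 1G
    buildpacks:
      - nodejs_buildpack
```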
