How to deploy a PHP application in Web App for Containers - Azure

I have a containerised PHP application (Symfony) in a PHP-FPM container. Currently we expose this website with an Nginx server acting as a reverse proxy, connecting via FastCGI to the PHP-FPM container.
Does someone know how I can bring this PHP-FPM container to the Azure Web App for Containers service?
Do I need to include a web server in my container to publish the website (something like Apache + mod_php)?
I believe there should be some way to connect the Azure web server to my container.
Thanks,

You're only allowed one image, so you'll have to add a web server to your existing PHP-FPM container. I assume you've already got all of the Nginx configs, so try installing Nginx in your PHP-FPM container.
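A minimal sketch of that approach, assuming the image builds on the official php:fpm base and your existing site config lives at docker/nginx.conf (both names are hypothetical); for anything serious, a process supervisor such as supervisord is the more robust way to run two processes:

FROM php:8.2-fpm

# Install Nginx next to PHP-FPM in the same image
RUN apt-get update && apt-get install -y nginx && rm -rf /var/lib/apt/lists/*

# Reuse the existing site config (fastcgi_pass 127.0.0.1:9000)
COPY docker/nginx.conf /etc/nginx/sites-available/default

COPY . /var/www/app

# Web App for Containers expects port 80 (or whatever WEBSITES_PORT is set to)
EXPOSE 80

# Daemonize PHP-FPM, keep Nginx in the foreground
CMD php-fpm -D && nginx -g 'daemon off;'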

Related

How to access a host Node.js app from a Docker Nuxt app?

I am trying to get my Nuxt app working on the production servers. On my local machine the generated Docker image runs well, and it can access the Node.js app that runs on localhost; the axios 'baseURL: http://127.0.0.1:6008/' seems to work fine there. On the production servers I used Docker to set up the Nuxt app the same way I tested it on my local machine, yet the Docker Nuxt app cannot reach the Node.js app on the host server. I can see this must be some kind of network setting issue.
In a Vue.js app I usually set up a ProxyPass in the Apache web conf to match incoming backend queries and rewrite them to the localhost address:
ProxyPass /app/query http://localhost:6008/query
The axios settings in the nuxt.config file look like this:
axios: {
  baseURL: 'http://127.0.0.1:6008/',
  browserBaseURL: ''
},
Does Docker need additional settings, or should I configure my Apache for this communication between my Docker container and a Node app running on the host under Apache/PM2?
localhost or 127.0.0.1 from within the Docker container will not resolve to the server's localhost. Instead you need to specify the name of the Node.js service (if you are using docker-compose) or the Node.js container's name (if you are just using docker run).
You can also try giving the IP of the server where Docker is running instead of 127.0.0.1.
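For example (a sketch assuming Docker 20.10+ and a hypothetical image name my-nuxt-app), you can map the special name host.docker.internal onto the host's gateway and point axios at it:

# Make the host reachable from inside the container as host.docker.internal
docker run -d --add-host=host.docker.internal:host-gateway my-nuxt-app

# Then in nuxt.config.js:
# axios: { baseURL: 'http://host.docker.internal:6008/' }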

Two Docker containers (Nginx and a web app) not working together (Linux)

I built both containers using a Dockerfile (one for each). I have the Nginx container pointing (proxy_pass http://localhost:8080) to the port that the web app is exposed on (via -p 8080:80). I can get it to work when I just install Nginx on the Linux machine, but when I use a dockerized Nginx, I just get the default Nginx index.html. Do I have to build both containers with a docker-compose.yml file (as opposed to a Dockerfile) if I want the containers working together? Sorry I didn't include any code, but at this point I just want to know whether I'm taking the correct approach (Dockerfile vs. Docker Compose).
The Nginx proxy needs access to the host (!) network for this to work, e.g.:
docker container run ... --net=host ... nginx
Without it, localhost refers to the proxy container itself, which likely has nothing listening on :8080 and certainly not your web app.
Alternatively, if the proxy's container (!) can resolve or access the host, then processes in the container can refer to host-accessible ports using the host's DNS name or IP.
Docker Compose (conventionally) solves this by putting the containers onto a new virtual network. The difference then would be that, rather than mapping everything onto host ports, each container (called a service) gets a unique name and a container called proxy could refer to a container called web on port 8080 as http://web:8080.
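A minimal sketch of that layout (the image name my-web-app, and the assumption that it listens on 8080 inside the container, are mine):

version: '3'
services:
  web:
    image: my-web-app   # assumed to listen on 8080 in-container
  proxy:
    image: nginx
    ports:
      - "80:80"
    volumes:
      - ./default.conf:/etc/nginx/conf.d/default.conf   # contains: proxy_pass http://web:8080;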
You can achieve similar results with Docker alone by creating a network and then running the containers on it, e.g.:
docker network create ${NETWORK}
docker container run ... --net=${NETWORK} --name=proxy ...
...
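A fuller sketch of the Docker-only variant (the network name webnet and the image my-web-app are placeholders), mounting the proxy config where the stock nginx image already includes it:

docker network create webnet
docker container run -d --net=webnet --name=web my-web-app
docker container run -d --net=webnet --name=proxy -p 80:80 \
  -v $(pwd)/default.conf:/etc/nginx/conf.d/default.conf nginx

with default.conf using proxy_pass http://web:8080; in place of http://localhost:8080;.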

How to set up an Nginx Docker reverse proxy

I am trying to use an Nginx reverse proxy container in front of my web application (another Docker container), which runs on a non-standard port.
Unfortunately I cannot edit the web application container, as it was developed by a vendor, so I have a plain request: I need to set up Nginx as a frontend on 80/443 and forward all requests to 10.0.0.0:10101 (the web app container).
I have tried jwilder/nginx-proxy and the default Docker nginx container but could not get the right configuration. Any lead would be great.
At the moment I haven't shared any conf files; I can share them on demand. Here are the environment details:
OS - Ubuntu
Azure
Use the Nginx proxy_pass feature.
Assuming you have both containers linked and the web app container's name is webapp, use this configuration in the Nginx container:
upstream backend {
    server webapp:10101;
}

server {
    listen 80;

    location / {
        proxy_pass http://backend;
    }
}
NOTE: I am skipping some configuration here, as this is just an example.
Put the configuration in a default.conf file and then deploy the container like this (mounted into conf.d, where the stock image expects extra server blocks):
docker run -d -v $(pwd)/default.conf:/etc/nginx/conf.d/default.conf -p 80:80 nginx
Then you'll be able to access your web app at http://localhost.
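If the two containers aren't linked yet, a sketch of wiring them up with a user-defined network instead (the network name appnet and the vendor image name are placeholders):

docker network create appnet

# the vendor's container, reachable on the network as "webapp"
docker run -d --net=appnet --name=webapp vendor-webapp-image

# the Nginx frontend on 80/443, with the configuration above mounted in
docker run -d --net=appnet -p 80:80 -p 443:443 \
  -v $(pwd)/default.conf:/etc/nginx/conf.d/default.conf nginx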

Docker request to own server

I have a Docker instance running Apache on port 80 and Node.js + Express running on port 3000. I need to make an AJAX request from the Apache-served website to the Node server running on port 3000.
I don't know what the appropriate URL to use is. I tried localhost, but that resolved to the localhost of the client browsing the webpage (i.e. the end user) instead of the localhost of the Docker image.
Thanks in advance for your help!
First you should split your containers: it is good Docker practice to have one process per container.
Then you will need a tool to orchestrate these containers. You can start with docker-compose, which is IMO the simplest one.
It will launch all your containers and manage their network settings for you by default.
So, imagine you have the following docker-compose.yml file for launching your apps:
docker-compose.yml
version: '3'
services:
  apache:
    image: apache
  node:
    image: node # or whatever
With this simple configuration you will have the host names apache and node on your network, so from inside your Node application you will see the Apache container as the host apache.
Just launch it all with docker-compose up.
make an AJAX request from the [...] website to the node server
The JavaScript, HTML, and CSS that Apache serves up is all read and interpreted by the browser, which may or may not be running on the same host as the servers. Once you're at the browser level, code has no idea that Docker is involved with any of this.
If you can get away with only using hostname-less links such as <img src="/assets/foo.png">, that will always work without any configuration. Otherwise you need to use the DNS name or IP address of the host, in exactly the same way you would if you were running the two services directly on the host without Docker.
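For instance (a sketch built on the Compose file above; the /api prefix is an assumption), Apache can proxy the AJAX path itself so the browser only ever sees relative URLs:

# in the apache service's httpd config, with mod_proxy and mod_proxy_http enabled
ProxyPass        "/api/" "http://node:3000/"
ProxyPassReverse "/api/" "http://node:3000/"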

External access to a Node.js app within a Docker container

I have a Node app running within a Docker container, hosted on Elastic Beanstalk (single instance). The container has port 3000 exposed to access the app within Docker, and I can 'curl 172.17.0.32:3000/test' from the host, which returns the expected response.
The problem I have is accessing this port externally using the Elastic Beanstalk URL, i.e.
http://XXXXXX-env.elasticbeanstalk.com:3000/test
This will time out. Can anyone recommend how to gain access to this port externally?
Thanks
Check this for reference
http://victorlin.me/posts/2014/11/26/running-docker-with-aws-elastic-beanstalk
See what your docker ps command returns.
The IP you have shared looks like the private IP address that the Docker service uses for its internal network. You have to enable a bridge between your host and the Docker container by supplying -p 3000:3000 to the run command, and finally enable the app in your Elastic Beanstalk console.
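A sketch of that mapping (the image name my-node-app is a placeholder):

docker run -d -p 3000:3000 my-node-app

# the app now answers on the host itself, not just the container's private IP
curl http://localhost:3000/test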
