How to run PHP files on an AWS S3 bucket?

I have developed my website on the AWS platform using AWS S3 buckets. How do I run PHP files on my domain?

S3 doesn't run any sort of CGI script (PHP, Perl, Ruby, etc.). Think of it as a static HTML and image repository.
If you want to host your PHP application on AWS, consider using AWS Elastic Beanstalk. It will launch an environment (server, IP, etc.) where you can deploy and run your PHP application easily.
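For example, a minimal sketch using the EB CLI, assuming it is installed and you are in the directory containing your PHP application (the application and environment names below are placeholders):
eb init -p php my-php-app
eb create my-php-env
eb deploy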

Related

How to hide `/var/app/current` source code in Elastic Beanstalk?

I am trying to deploy a basic React application in Elastic Beanstalk, and when I inspect the page in the browser I can see the source code.
The content of the React app is the default one from create-react-app with the default package.json scripts.
The platform on Elastic Beanstalk is Node.js 16 running on 64bit Amazon Linux 2/5.6.3.
Although I tried adding GENERATE_SOURCEMAP=false to the npm build script and configuring nginx using proxy.conf, I'm unable to hide the /var/app/current folder that is deployed and contains the source code of my React application.
Does someone know how to avoid the source code being exposed on Elastic Beanstalk when deploying on the Node.js platform?
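For reference, the build-script change mentioned above was set roughly like this (a sketch of the create-react-app package.json scripts section; the rest of the file is omitted):
"scripts": {
  "build": "GENERATE_SOURCEMAP=false react-scripts build"
}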

Running Node app in Docker container with aws-sdk

Background
I have a node app that essentially needs to use aws-sdk to access an S3 bucket and perform other operations. On my local machine I have an AWS config file that sits in ~/.aws/config. I want to run this node app in a Docker container (so that I can deploy it to EC2 later).
The problem
How do I configure my docker container to use the aws config file so that the node app running in it can use aws-sdk?
Clarification
I do not want to use IAM. I specifically want to use the AWS config file, which has the secret access key, etc.
You can do what AWS does when they explain how to use their containers on local machines. For example, for local AWS Glue they simply share ~/.aws/ with the Docker container using:
-v ~/.aws:/root/.aws:ro
Obviously you would have to adjust the paths above to match your local and Docker setup.
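Put together, a minimal sketch of the run command (my-node-app is a placeholder image name):
docker run -v ~/.aws:/root/.aws:ro my-node-app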
The other way is to pass the AWS credentials using Docker environment variables.
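For example, a sketch that forwards the credentials from the host environment (the aws-sdk picks these standard variable names up automatically; my-node-app is again a placeholder):
docker run \
  -e AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY \
  -e AWS_REGION=us-east-1 \
  my-node-app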

How to store environment variables on AWS EC2?

I'm using dotenv to fetch and use environment variables from a .env file in my node app while running it locally. But this .env file can't be committed to GitHub for security reasons. I'm using CodePipeline and CodeDeploy to deploy the app continuously to EC2, but then the environment variables are missing on the EC2 instance.
How do I configure the environment variables for my node.js app on AWS EC2 (Ubuntu AMI)?
The most secure way is to use AWS Systems Manager Parameter Store.
Reference:
https://aws.amazon.com/blogs/mt/use-parameter-store-to-securely-access-secrets-and-config-data-in-aws-codedeploy/
It's secure and fully compatible with CodeDeploy.
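As a sketch, assuming a SecureString parameter named /myapp/DB_PASSWORD has already been created in Parameter Store, a CodeDeploy hook script (or the app's start script) can read it with the AWS CLI and export it as an environment variable:
export DB_PASSWORD=$(aws ssm get-parameter --name /myapp/DB_PASSWORD \
  --with-decryption --query Parameter.Value --output text)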

When I SSH into my EC2 instance, how do I figure out what server it is currently running?

When I ssh into the EC2 instance and look around the server configuration files, I see both Apache and NGINX config files. Why are both servers there? Which one takes priority? Do both servers run simultaneously? Do they work together? Where would my server-side code go for Node.js?
It is Node.js running on 64bit Amazon Linux/4.10.2 in an Elastic Beanstalk environment.
To quote the documentation:
The AWS Elastic Beanstalk Node.js platform is a platform version for
Node.js web applications that can run behind an nginx proxy server,
behind an Apache server, or standalone.
I would assume that to simplify deployment, Elastic Beanstalk chooses to deploy both NGINX and Apache, regardless of which mechanism you choose to serve content.
The configuration options are ProxyServer=apache/nginx/none.
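As a sketch, on the pre-Amazon Linux 2 Node.js platform this option lives in the aws:elasticbeanstalk:container:nodejs namespace and can be set from an .ebextensions config file (the file name is arbitrary):
# .ebextensions/proxy.config
option_settings:
  aws:elasticbeanstalk:container:nodejs:
    ProxyServer: nginx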
I asked this question on Server Fault and got this answer. You can run the command ps ax | grep -E '(apache2|httpd|nginx)' to see which one is running.

NodeJS on AWS EC2: Cannot load public/images folder

I'm developing a NodeJS app and it looks good on my localhost, so I decided to deploy it on AWS EC2.
I followed the AWS instructions to deploy my NodeJS app on EC2. When I started the server on EC2 by running node server.js, I found that although the JavaScript and CSS resources under the public folder were loaded, none of the images from public/images were loaded, and the images folder was missing under Chrome developer tools > Sources. When running the app locally, all the images from public/images load correctly. There is also a 500 Internal Server Error in the Chrome console.
Here is an example of my html for one of the images:
<img src="public/images/my_logo.png">
I solved the problem. It turns out this is a file permission issue on the EC2 instance. Somehow all images uploaded to EC2 have
-rw-------
permission. So I changed the permission to solve the 500 error:
chmod 744 my_logo.png
It's worth noting that only the images uploaded to EC2 have the above mode. Other files have the following mode, which need not be changed:
-rw-r--r--
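If many files are affected, a sketch that fixes them all in one pass (assuming the images live under public/images as in the question):
find public/images -type f -exec chmod 644 {} \;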