Background
I have a Node app that needs to use aws-sdk to access an S3 bucket and perform other operations. On my local machine I have an AWS config file that sits in ~/.aws/config. I want to run this Node app in a Docker container (so that I can deploy it to EC2 later).
The problem
How do I configure my docker container to use the aws config file so that the node app running in it can use aws-sdk?
Clarification
I do not want to use IAM. I specifically want to use the AWS config file, which contains the secret access key etc.
You can do what AWS does when they explain how to use their containers on local machines. For example, for local AWS Glue they simply share ~/.aws/ with the Docker container using:
-v ~/.aws:/root/.aws:ro
Obviously you would have to adjust the paths above to match your local and Docker setup.
The other way is to pass the AWS credentials to the container as environment variables.
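A minimal sketch of both approaches, assuming the image is called my-node-app (the image name, paths, and region are placeholders; adjust them to your setup):
# Option 1: share your local AWS config/credentials read-only
docker run --rm -v ~/.aws:/root/.aws:ro my-node-app
# Option 2: pass the credentials as environment variables instead
docker run --rm \
  -e AWS_ACCESS_KEY_ID=<your-key-id> \
  -e AWS_SECRET_ACCESS_KEY=<your-secret-key> \
  -e AWS_REGION=us-east-1 \
  my-node-app
Note that if the Node aws-sdk should also read settings from ~/.aws/config (not just ~/.aws/credentials), you may need to set AWS_SDK_LOAD_CONFIG=1 as well.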
Related
I'm using dotenv to fetch and use environment variables from a .env file in my Node app while running it locally. But this .env file can't be committed to GitHub for security reasons. I'm using CodePipeline and CodeDeploy to deploy the app continuously to EC2, but then the environment variables are missing on the EC2 instance.
How do I configure the environment variables for my Node.js app on AWS EC2 (Ubuntu AMI)?
The most secure way is to use AWS Systems Manager Parameter Store.
Reference:
https://aws.amazon.com/blogs/mt/use-parameter-store-to-securely-access-secrets-and-config-data-in-aws-codedeploy/
It's secure and fully compatible with CodeDeploy.
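For example, a sketch of a CodeDeploy hook script that reads a secret from Parameter Store with the AWS CLI and appends it to the .env file that dotenv loads (the parameter name and file path are placeholders, and the instance profile needs ssm:GetParameter permission):
#!/bin/bash
# Fetch a decrypted value from Parameter Store (placeholder parameter name)
DB_PASSWORD=$(aws ssm get-parameter --name /myapp/prod/DB_PASSWORD \
  --with-decryption --query 'Parameter.Value' --output text)
# Write it to the .env file the Node app reads (placeholder path)
echo "DB_PASSWORD=${DB_PASSWORD}" >> /home/ubuntu/myapp/.env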
Can I run a Docker command on the host? I installed the AWS CLI inside my Docker container; can I somehow use the aws command on the host (so that under the hood it uses the container's AWS CLI)?
My situation is this: I have database backups on the production host. I have a Jenkins cron job that takes the SQL file from the DB container and puts it into a folder on the server. Now I also want Jenkins to upload this backup file to AWS storage, but the AWS CLI is not installed on the host, and I don't want to install anything except Docker on my host, so I think the AWS CLI should be installed inside a container.
You can't directly do this. Docker containers and images have isolated filesystems, and the host and containers can't directly access each other's filesystems and binaries.
In theory you could write a shell script that wraps docker run, name it aws, and put it in your $PATH:
#!/bin/sh
exec docker run --rm -it awscli aws "$@"
but this doesn't scale well, requires root-level permissions on the host, and you won't be able to access files on the host (like ~/.aws/config) or environment variables (like $AWS_ACCESS_KEY_ID) without additional setup.
You can just install software on your host instead, and it will work normally. There's no requirement to use Docker for absolutely everything.
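That said, if the goal is only to have Jenkins upload the backup without installing the AWS CLI on the host, a throwaway container works as long as you mount the backup folder and credentials into it. A sketch, assuming the official amazon/aws-cli image (whose entrypoint is already aws) and placeholder paths and bucket name:
docker run --rm \
  -v ~/.aws:/root/.aws:ro \
  -v /var/backups/db:/backups:ro \
  amazon/aws-cli s3 cp /backups/backup.sql s3://my-backup-bucket/backup.sql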
I have a .NET Core app hosted on PCF, and I also have a Config Server installed.
I want to run this application locally with IIS Express and load the same config values it will have when deployed to PCF, and I do not want to deploy it to PCF Dev because I want to debug it.
Is it possible? The only workaround I have is to fetch all the variables into user-managed secrets, but that's awful.
Steeltoe and SCS Client look at the VCAP_SERVICES environment variable to load the configuration they use to talk with Config Server. On PCF, this environment variable is automatically populated with information based on the services that you bind to your app.
I do not know of any tool to manage/bind services locally, but you can always set environment variables manually. If you run cf env <app> for an app that is bound to your Config Server, it will list the contents of the VCAP_SERVICES environment variable. Copy that output and paste it into an environment variable on your local machine. Fire up your app, and Steeltoe or the SCS Client should pick that information up automatically.
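For example (the app name and JSON are placeholders; on Windows with IIS Express you can put the variable in launchSettings.json instead of exporting it in a shell):
cf env my-app
# Copy the VCAP_SERVICES JSON from the output, then before launching locally:
export VCAP_SERVICES='{ "p-config-server": [ { ...copied JSON... } ] }'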
Hope that helps!
If you don't want to connect to the exact same Config Server, you can run the config server locally with Java or Docker and point it at the same back end. The Steeltoe docs include instructions for running the config server with Maven, and the Music Store sample includes cmd and sh scripts that show running a config server via Docker, though they may be slightly out of date. The most recent way I've run the Docker command is something like this:
docker run --rm -ti -p 8888:8888 -v $PWD/config-repo:/config --name steeltoe-config steeltoeoss/configserver --spring.profiles.active=native
run from a directory that contains a folder named config-repo holding the relevant config files.
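As a quick sanity check once the container is up (the application name, profile, and file names below are placeholders), you can hit the standard Spring Cloud Config endpoint:
# Expected layout next to where you run the docker command, e.g.:
#   config-repo/application.yml
#   config-repo/myapp-Development.yml
curl http://localhost:8888/myapp/Development   # should return the resolved config as JSON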
I have built a Node.js application in Docker. Every time I need to run it, I start the container and run the command node app.js.
I have set it up on Amazon EC2, but in a manual way: I register and log into the Amazon EC2 instance, pull the Docker image, run the container, log into it, and run the command node app.js.
Now, since Amazon has the EC2 Container Service, I was told that I can do these two things automatically:
EC2 runs the Docker container
Docker runs node app.js
The advantage of doing this is that whenever either the container or the app crashes, EC2 can automatically run the command again and recover them.
How can I set this function up?
You get this by default when you set up an ECS task. Make sure the container is marked as 'essential' in your task definition and that you have at least one task requested in your ECS service, and ECS will automatically restart a failed/crashed container for you.
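A sketch of the relevant pieces, with placeholder names for the task family, image, cluster, and service (container-level memory is required here for the EC2 launch type):
cat > taskdef.json <<'EOF'
{
  "family": "node-app",
  "containerDefinitions": [
    {
      "name": "node-app",
      "image": "my-repo/node-app:latest",
      "command": ["node", "app.js"],
      "essential": true,
      "memory": 256
    }
  ]
}
EOF
# Register the task definition and keep one copy running via a service;
# ECS replaces the task whenever the essential container exits.
aws ecs register-task-definition --cli-input-json file://taskdef.json
aws ecs create-service --cluster my-cluster --service-name node-app \
  --task-definition node-app --desired-count 1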
I have a newbie question regarding Docker. I would like to know if it is possible to export a Docker image created for AWS to Bluemix or Azure. My Docker image contains a WebSocket server under Node.js and a MongoDB database.
Thank you for your help
On your AWS machine, use:
docker save -o image.tar image:1.0   # export the Docker image
Once that's done, access your new cloud and use:
docker load -i image.tar   # load your image on the new cloud
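If the two environments don't share a filesystem, the tar just needs to be copied over first; a sketch with placeholder hostnames and paths:
scp image.tar user@new-cloud-host:/tmp/image.tar
ssh user@new-cloud-host 'docker load -i /tmp/image.tar'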
If you have the Dockerfile you used to create your AWS container, you can simply use it to build the image on Bluemix with the cf ic client or the native Docker one.
Here are the reference docs for the Bluemix Docker CLI:
https://www.ng.bluemix.net/docs/containers/container_cli_reference_ov.html
https://www.ng.bluemix.net/docs/containers/container_cli_ov.html
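A sketch of building with the IBM Containers plugin from the directory containing your Dockerfile (the namespace and image name are placeholders, and the registry hostname may differ by region):
cf ic build -t registry.ng.bluemix.net/my_namespace/my-node-app .
# The same Dockerfile also works with plain Docker, e.g. when targeting Azure:
docker build -t my-node-app .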