I am trying to download files from S3 to an EC2 instance from Azure DevOps, using the AWS CLI in a Build Pipeline. I can see the download logs, but the data never appears on my EC2 instance.
The same command works when executed directly on the EC2 instance. Azure DevOps does not report any error, but I do not see the data on my instance.
Thanks for your help.
You could refer to this ticket:
The AWS CLI calls the AWS API, and the Amazon S3 APIs have no ability to interact with the operating system on an Amazon EC2 instance.
That is why the same command works as expected when you run it on the EC2 instance itself, but not from an Azure DevOps pipeline: the pipeline agent downloads the files onto its own machine, not onto your instance.
You could use AWS Systems Manager (`aws ssm send-command`) to run the `aws s3 cp` command on the EC2 instance itself.
Here is an example:
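A hedged sketch, assuming the SSM Agent is running on the instance and its instance profile allows S3 read access (the instance ID, bucket name, and target path are placeholders):

```shell
# Send "aws s3 cp" to the instance itself via Systems Manager,
# so the files land on the EC2 instance rather than on the build agent.
# i-0123456789abcdef0, my-bucket, and the target path are placeholders.
aws ssm send-command \
  --instance-ids "i-0123456789abcdef0" \
  --document-name "AWS-RunShellScript" \
  --parameters 'commands=["aws s3 cp s3://my-bucket/data/ /home/ec2-user/data/ --recursive"]'

# Check the result using the command ID printed by the call above:
# aws ssm list-command-invocations --command-id <command-id> --details
```

The pipeline agent only needs AWS credentials that allow calling SSM; the copy itself runs on the instance.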
We have a Node application running as a Docker container in AWS Elastic Beanstalk. The application has access to a PostgreSQL RDS instance. We want to use AWS Secrets Manager so that our container can access RDS without the credentials being exposed in code.
When we create a secret in AWS Secrets Manager, the console generates sample retrieval code (in Java, JavaScript, etc.). Do we add that code to our source, and attach a Secrets Manager policy to both aws-elasticbeanstalk-ec2-role and aws-elasticbeanstalk-service-role?
Please advise how this can be done.
We have created the secrets in Secrets Manager but have not proceeded further, since the application is up and running and any change may affect it.
As this is our first time, we need help.
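For reference (this is not the console-generated sample itself), a secret can also be fetched at runtime with the AWS CLI; a minimal sketch, assuming a hypothetical secret name my-app/rds-credentials and an instance role allowed to read it:

```shell
# The instance profile (e.g. aws-elasticbeanstalk-ec2-role) must allow
# secretsmanager:GetSecretValue on this secret.
# my-app/rds-credentials is a placeholder name.
aws secretsmanager get-secret-value \
  --secret-id my-app/rds-credentials \
  --query SecretString \
  --output text
```

The returned JSON string contains the credential fields (username, password, host, etc.), which the container can parse at startup instead of hard-coding them.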
I'm trying to add a pipeline to Azure DevOps that pushes my code updates to an AWS private subnet. My app runs on AWS Fargate.
I don't see any options to do this.
I would like to deploy to my AWS site automatically.
If you want to deploy your code to AWS, you must first create a service connection to AWS with a set of valid AWS credentials. You can follow this doc, AWS Tools for Microsoft VSTS, for more details about building this service connection.
You also need to install some extensions in Azure DevOps so that you can use the AWS-related tasks: AWS Tools for Microsoft Visual Studio Team Services and AWS S3 Upload. The first extension, AWS Tools for Microsoft Visual Studio Team Services, provides tasks such as AWS CLI and AWS Shell Script for executing commands against AWS. With AWS S3 Upload, you can upload files to AWS.
In the pipeline, add the tasks to build your repo as normal. Then log in to your AWS account with the AWS CLI task, and copy and upload your build output to AWS. For more details, I recommend this blog: Deploy app to AWS.
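Since the question targets Fargate, here is a hedged sketch of what the pipeline's deployment step might run (the account ID, region, repository, cluster, and service names are all placeholders):

```shell
# Push the rebuilt image to ECR, then force a new deployment so the
# Fargate service pulls the updated image. All names are placeholders.
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

docker build -t my-app .
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest

# Restart the service's tasks so they pick up the new image
aws ecs update-service \
  --cluster my-cluster \
  --service my-service \
  --force-new-deployment
```

These commands assume the service's task definition references the `latest` tag; if you pin image tags, you would register a new task definition revision instead.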
I have a custom CentOS 7.4 AMI that I use to launch VMs in AWS, and I want to use the same custom image in Azure. Can I export that AMI to Azure to launch VMs?
Generally, you cannot migrate an AWS AMI to an Azure VM image directly. However, you can use a tool such as packer.io to minimize the pain of supporting both.
Packer lets you keep a single code base that builds images on both Azure and AWS.
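For example, one Packer template can declare two builders, one per cloud. A sketch of the invocation, assuming a hypothetical centos.pkr.hcl template with amazon-ebs and azure-arm sources both named centos:

```shell
# One template, two targets. The template file name and source
# names are hypothetical.
packer init .                                          # install builder plugins
packer build -only='amazon-ebs.centos' centos.pkr.hcl  # produces an AWS AMI
packer build -only='azure-arm.centos' centos.pkr.hcl   # produces an Azure image
```

Running `packer build` without `-only` would build both images from the same provisioning steps, which is what keeps the two clouds in sync.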
For a Node.js project I want to deploy a MongoDB instance on Amazon EC2 using the Amazon API or something like that. Is that possible? I found nothing about it.
Thanks for your time.
You have many options:
Vagrant with the AWS provider
Terraform
CloudFormation
the AWS CLI (with user data): $ aws ec2 run-instances help
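The AWS CLI option could look roughly like this. The AMI ID, instance type, and key name are placeholders, and the install step assumes the MongoDB yum repository is already configured in the image or added by the script:

```shell
# user-data runs as root on first boot (via cloud-init) and installs MongoDB.
cat > user-data.sh <<'EOF'
#!/bin/bash
yum install -y mongodb-org     # assumes the MongoDB yum repo is set up
systemctl enable --now mongod
EOF

# All IDs and names below are placeholders.
aws ec2 run-instances \
  --image-id ami-0123456789abcdef0 \
  --instance-type t3.micro \
  --key-name my-key \
  --user-data file://user-data.sh
```

This can be driven from Node.js as well via the AWS SDK's EC2 `runInstances` call, passing the same user data base64-encoded.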
I have an application in a CentOS VM running on Amazon EC2, and now I need to migrate it to Windows Azure.
Is there a way to copy a snapshot to Azure?
I wanted to answer this step by step, but I found a link that covers in detail, with a video, how to migrate an existing instance from Amazon EC2 to Windows Azure: Guided Hands-on Lab: Migrate VMs to Windows Azure from Amazon AWS [ 20 Key Cloud Scenarios with Windows Azure Infrastructure Services ]
I hope it will help.
Well, this is only possible if you are running Windows Server on your EC2 instance, by following this link:
https://convective.wordpress.com/2014/07/04/migrating-a-vm-from-ec2-to-azure-at-300-mbps/
If you're running Linux, there is currently no simple tool that does it, but you can go to your Azure account and follow these steps:
1- Mimic your server architecture on your Azure account, matching the number of VMs, networks, storage, and any other services you use.
2- Perform the same setup on those servers (configure your web server, DB server, etc.).
3- Zip all of your data files on EC2 (/var/www/Web_Folder) and use mysqldump to back up your database as well.
4- Create a Windows Server VM on Azure that you can connect to remotely (to benefit from the cloud's network speed), use FileZilla to download your zipped files from EC2, and then upload them to the newly created VMs on Azure. Upload your DB backup file there as well.
5- Create a new database on your Azure VM with the same name as the old one, grant user access, exit MySQL, and then restore the DB backup file that you uploaded using: mysql -u root -p DB_Name
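The backup and restore halves of steps 3 and 5 could look like this (shown with tar, though zip works equally; the paths, database name, and file names follow the examples above and are placeholders):

```shell
# On the EC2 instance: archive the web root and dump the database
tar -czf web_backup.tar.gz -C /var/www Web_Folder
mysqldump -u root -p DB_Name > db_backup.sql

# On the Azure VM, after transferring both files:
#   tar -xzf web_backup.tar.gz -C /var/www
#   mysql -u root -p -e "CREATE DATABASE DB_Name;"
#   mysql -u root -p DB_Name < db_backup.sql
```

Note that the restore reads the dump from standard input; running `mysql -u root -p DB_Name` alone only opens an interactive session.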
Just an update: you can now accomplish this with Azure Site Recovery, which makes it a very easy task. Once you perform a failover in Site Recovery, all the virtual machines are created automatically, so the migration can be done with minimal or no downtime: https://azure.microsoft.com/en-in/documentation/articles/site-recovery-migrate-aws-to-azure/