How can I build a front-end on AWS EC2 with CodeDeploy? - node.js

I am using Node.js, webpack, EC2, CodeDeploy with BitBucket.
In the BeforeInstall script I put:
#!/bin/bash
cd /home/ec2-user/papirux/
sudo npm install
sudo NODE_ENV='production' webpack -p
The deploy was successful, but the folder built by webpack did not appear.
I haven't been able to resolve this for about two days...

There are a couple of things that come to mind.
1. Are you copying your source code to the right directory?
I would assume that your script would fail if you don't, but are you copying your code to the directory you want via your appspec? It should have a section like this:
files:
  - source: /
    destination: /home/ec2-user/papirux/
2. Look at the logging information for the deployment to see what happened with your script.
You can see the logging for deployments described here. You should be able to see something useful about what happened with your script at /opt/codedeploy-agent/deployment-root/deployment-logs/codedeploy-agent-deployments.log or /opt/codedeploy-agent/deployment-root/deployment-group-ID/deployment-ID/logs/scripts.log on your EC2 instance. If it failed or copied the files to an unexpected directory, you should see something useful there.
For convenience, you can also install the CloudWatch logs agent and have the logs uploaded to CloudWatch directly.
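For reference, a minimal sketch of a complete appspec.yml could look like the following. The hook name, script path and timeout here are illustrative assumptions, not taken from your question; running the build in an AfterInstall hook (i.e. after CodeDeploy has copied the files to the destination) rather than in BeforeInstall is one way to make sure the webpack output ends up under /home/ec2-user/papirux/.
version: 0.0
os: linux
files:
  - source: /
    destination: /home/ec2-user/papirux/
hooks:
  AfterInstall:
    - location: scripts/build.sh
      timeout: 300
      runas: root
And scripts/build.sh would contain roughly what you already have in your BeforeInstall script:
#!/bin/bash
# Runs after the revision has been copied to the destination directory
cd /home/ec2-user/papirux/
npm install
NODE_ENV='production' webpack -p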

Related

Angular Universal - Deploying to AWS Elastic Beanstalk

I have been trying and failing for over three days now to get this working, and am growing increasingly frustrated with my own lack of understanding on the topic - so this is my search for an answer that I've not yet found.
I am using Angular 9.x and Angular Universal 9.x and am unable to work out how to deploy this to Elastic Beanstalk on a server running node. There are zero tutorials that explain how this should be done, as they are all aimed at those wanting to use Lambda on AWS. If someone could please point me in the right direction that would be great. I run npm run build:ssr --prod, and get the following in my dist folder:
[screenshot of the dist folder contents]
I have tried deploying this folder by uploading it zipped, as well as trying eb deploy with my whole app - but all of these result in errors like the following (for the eb deploy method):
> blush-front-end#0.0.0 start /var/app/current
> ng serve
sh: ng: command not found
Could someone please point me in the correct direction?
I struggled for months too because of the lack of tutorials online on how to deploy Angular Universal to AWS Elastic Beanstalk. And you will now be very happy to know how easy it is.
First, run the command npm run build:ssr to build for production.
Inside the dist folder, you will probably find a folder with your project name. Inside this folder you will find a "browser" folder and a "server" folder. Inside the "server" folder is the main.js file.
Your setup might be slightly different, but you will be able to adjust this explanation to your situation after you read my entire answer.
Zip the dist folder.
Let's now configure the environment in AWS Elastic Beanstalk.
1) When you create an environment in Elastic Beanstalk, choose "Web server environment", and then on Platform branch config, choose the last option: "Node.js running on 64bit Amazon Linux". This is a very important step, since this is the only option that will enable you to configure the Container Options.
2) On the Application code, choose "Upload your code" and upload your zip file.
3) Click on Configure more options
4) Click on the Edit button on the Software box.
5) In the Node command field, type node dist/yourProjectFolderName/server/main.js
That's it!! Save and create your environment. Your app will work now. :-)
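To make the path in step 5 concrete, the dist folder produced by npm run build:ssr usually looks roughly like this (yourProjectFolderName comes from your angular.json; your layout may differ slightly):
dist/
  yourProjectFolderName/
    browser/        (static client-side files served by the SSR server)
    server/
      main.js       (the bundle that the Node command starts)
So the Node command simply points at the server bundle:
node dist/yourProjectFolderName/server/main.js
If your structure is different, adjust the path after node accordingly.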

Plesk Git additional deploy actions npm not executed

I've connected Plesk to GitHub. Plesk gives the opportunity to run additional deployment actions after a branch was pulled. Pulling the branch works fine.
But it seems that these actions are not triggered.
I want to run the install:prod task from my package.json file.
I can run this successfully via ssh.
I've also tried skipping the prepended "npm run" part, but without success.
My current configuration looks like this.
npm run install:prod
The logs are showing no error message. It seems to silently fail.
I had the same problem. The solution for me was to use the correct paths in the additional deploy actions; you can find them by running "which npm" for the npm binary and "pwd" (from your website directory) for the site path. Once you have those, you can use these lines as your deploy actions, for example, assuming "httpdocs" is where your website is:
cd /var/www/vhosts/<your vhost>/httpdocs
/usr/bin/npm run install:prod
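For completeness, install:prod is just a script defined in package.json; a hypothetical entry (the exact commands are an assumption, the real script may differ) could look like this:
"scripts": {
  "install:prod": "npm install --production && npm run build"
}
The key point is to invoke it from the deploy action with the absolute npm path (e.g. /usr/bin/npm run install:prod), since the additional deploy actions apparently run with a very limited PATH.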

How to deploy Angular SPA from BitBucket to Azure?

I'm creating a vanilla Angular project and uploading it to Bitbucket. It runs locally and I can build it into dist with no errors or warnings. Now, I'd like to expose it on my Azure account. There's quite a lot of material showing how, but most of it is a bit aged (the options in Azure have changed) and/or the authors make it easy and use other options for the source (I target specifically Bitbucket).
Optimally, I'd like the following to happen.
Trigger by a push, get the files from the BitBucket repo.
Execute the command ng build --prod (or npm run build).
Copy over the artifacts from dist to the root of the app (see the sketch below for what I imagine).
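The file names, the dist path and the use of xcopy instead of KuduSync in this sketch are just my guesses at how it could look, based on Kudu's support for a custom deployment command:
.deployment
[config]
command = deploy.cmd
deploy.cmd
@echo off
rem Fall back to sensible defaults if Kudu has not set these variables
IF NOT DEFINED DEPLOYMENT_SOURCE SET DEPLOYMENT_SOURCE=%~dp0.
IF NOT DEFINED DEPLOYMENT_TARGET SET DEPLOYMENT_TARGET=%HOME%\site\wwwroot
rem Install dependencies and build the production bundle
call npm install
IF %ERRORLEVEL% NEQ 0 exit /b 1
call node_modules\.bin\ng build --prod
IF %ERRORLEVEL% NEQ 0 exit /b 1
rem Copy only the build output to wwwroot
xcopy /s /y "%DEPLOYMENT_SOURCE%\dist\*" "%DEPLOYMENT_TARGET%"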
Checking the logs, I see two sections of relevance. The first one is Generating deployment script, while the second is Running deployment command. The end of those, as well as the label in the portal, implies that it's all good and dandy. Well, it's not.
Using the following command to generate deployment script: 'azure site deploymentscript -y --no-dot-deployment -r "D:\home\site\repository" -o "D:\home\site\deployments\tools" --node --sitePath "D:\home\site\repository"'.
Generating deployment script for node.js Web Site
Generated deployment script files
Command: "D:\home\site\deployments\tools\deploy.cmd"
Handling node.js deployment.
KuduSync.NET from: 'D:\home\site\repository' to: 'D:\home\site\wwwroot'
Deleting file: 'hostingstart.html'
Copying file: '.angular-cli.json'
Copying file: '.editorconfig'
...
Copying file: 'src\index.html'
Copying file: 'src\assets\logo.png'
Copying file: 'src\assets\favicon.ico'
Copying file: 'src\environments\environment.prod.ts'
Copying file: 'src\environments\environment.ts'
Invalid start-up command "ng serve" in package.json. Please use the format "node <script relative path>".
Looking for app.js/server.js under site root.
Missing server.js/app.js files, web.config is not generated
The package.json file does not specify node.js engine version constraints.
The node.js application will run with the default node.js version 6.9.1.
Selected npm version 3.10.8
web#0.0.0 D:\home\site\wwwroot
+-- #angular/animations#5.2.9
...
`-- zone.js#0.8.26
Finished successfully.
However, when I access the page, only the default document provided by MS shows. I've tried accessing the image files but failed (not sure if I got the link wrong or if those aren't there). All in all, I feel that I'm barking up the wrong tree. Trying to repeat the steps (possibly with slight changes), produced a website that says You do not have permission to view this directory or page, which gets me to a confused position where I see no rational next step in troubleshooting.
Suggestions on what I might be missing?

NPM errors and control in Azure Websites

I want to build my Node.JS application in an Azure Website.
It uses various NPM packages via my package.json file.
My problem is that I often receive error messages related to missing NPM files.
Normally I put my files on the server via FTP or edit them directly there with the VS Studio 15 Azure plugin. This may be the reason why npm isn't being triggered as Microsoft intended.
I would prefer a way in which I can just run commands with elevated privileges to have full control over NPM myself.
Which ways are possible to avoid these problems?
If you're publishing your Node.js application 'manually' via FTP, there are a few things to be aware of.
First of All, 'manually' means manually.
Git
If you use continuous deployment via Git, the final deployment step is to call npm install in your application folder; this installs all the packages listed in the package.json file.
The node_modules folder is excluded by default via the .gitignore file, so all packages are downloaded on the server.
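For illustration, the relevant entry in a typical Node .gitignore is just this single line, which is what forces the server to restore the packages itself:
node_modules/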
Web deployment
If you're using web deployment from Visual Studio or the command line, all the files contained in your solution are copied to the hosting environment, including the node_modules folder; because of this, the deployment can take a long time to finish due to the huge number of dependencies and files that the folder contains.
Even worse: this could lead you to the same scenario you're facing right now.
FTP deployment
You're copying everything yourself, so the same thing that occurs with web deployment happens with the FTP deployment method.
--
The thing is that when you copy all those node_modules folder contents, you're assuming that those dependencies remain the same in the target environment; in most cases that's true, but not always.
Some dependencies are platform dependent, so maybe a dependency works fine on x86 in your dev environment, but what if your target machine or website (or some mix between them) is x64? (A real case; I have already suffered it.)
Other related issues can happen. Maybe your direct dependencies don't have the problem, but the dependencies linked to them could.
So it is always strongly recommended to run npm install in your target environment and to avoid copying the dependencies directly from your dev environment.
That way, you copy the folder structure to your target environment excluding the node_modules folder, and then, once the files are copied, you run npm install on the server.
To achieve that you could go to
yoursitename.scm.azurewebsites.net
There you can go to the "Debug Console" tab, then go to the directory D:\home\site\wwwroot and run
npm install
After that the packages and dependencies are downloaded for the server/website architecture.
Hope this helps.
Azure tweaks the Kudu output settings; in local Kudu installations the output looks normalized.
A workaround (not perfect) could be this:
npm install --dd
Or, for even more detail:
npm install --ddd
The most relevant answer from Microsoft itself is this:
Using Node.js Modules with Azure applications
Regarding control via a console with elevated privileges, there is the option of using the Kudu console. But the error output is quite weird; it's kind of like blindly putting commands into the console without much feedback.
Maybe this is a way to go, but I haven't tried it yet.
Regarding deployment, it looks like Azure wants you to prefer Continuous Deployment.
The suggested way is described here.

cloud_package.cspkg file was not created after installing "socket.io-servicebus"

I'd like to use the "socket.io-servicebus" module in my node.js application.
But I encountered a problem.
After installing "socket.io-servicebus", the cloud_package.cspkg file was not created by the "Publish-AzureServiceProject" command.
I'm using "Windows Azure PowerShell" on the Windows 7 64-bit edition.
Here is the procedure.
New-AzureServiceProject test1
Add-AzureNodeWebRole www
cd www
npm install socket.io-servicebus
Publish-AzureServiceProject -ServiceName xxx ...
[ cloud_package.cspkg will not be created ]
By the way, "Start-AzureEmulator -Launch" succeeds and we can test our own application.
Please give me some advice. Thank you.
It looks like the issue here is due to a known path length limitation. Azure has a limitation on paths in the package being no more than 255 characters, and in this case bringing in socket.io WITH all of its dependencies is hitting that limit.
There are several possible workarounds here.
A. - Zip up node modules and extract on the server.
Basically you zip up your modules and publish the module zip within the package. Then you can use an Azure startup task (defined in your csdef) on the server to unzip the files.
Publish-AzureServiceProject will grab anything in the project, so in this case you just have a little script that you run before publishing which creates the node_modules archive and deletes node_modules.
I am planning to do a blog post on this, but it is actually relatively easy to do.
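As a rough sketch of option A (the Task element follows the standard ServiceDefinition.csdef schema, but the script name and the use of a bundled 7za.exe are my assumptions):
In ServiceDefinition.csdef, inside the WebRole element:
<Startup>
  <!-- Runs once when the role instance starts, before the app is launched -->
  <Task commandLine="setup_modules.cmd" executionContext="elevated" taskType="simple" />
</Startup>
setup_modules.cmd:
rem Extract the node_modules archive that was published with the package
rem (assumes 7za.exe was included next to this script)
cd /d "%~dp0"
7za.exe x node_modules.zip -y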
B. - Download node modules dynamically
You can download modules in the cloud. This can also be done with a startup task, as shown here.
If you look in that post you will see how you can author a startup task if you decide to do the archive route.
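A minimal sketch of that variant (the script name and the assumption that npm is available on the role instance are mine):
setup_modules.cmd:
rem Restore the modules on the role instance instead of shipping them in the package
cd /d "%~dp0"
call npm install --production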
Feel free to ping me with any questions.
