I am developing an app with NW.js, and now I am thinking about the deployment process. I need to install the app on the different machines that will use it. The problem I see is that if I change some file, I will have to reinstall the app on each machine. I was reading about Docker, and if I understood correctly, I can build an image and have each machine pull the latest version of the app from it.
The question is: can I upload the app into a container and download it onto each machine? And how should I search the documentation to do that?
Thanks for any help
I think I've cheated my way into a solution; this could work for you, depending on what your exact requirements are.
In one scenario, I have a shared network folder that lets machines launch the NW.js app over the network share, so every time I update the file and someone relaunches their shortcut, they get a fresh copy.
The remote users, who are not on our network, have their copy in a Dropbox folder, which, of course, automatically updates as I drop the new copy into it.
None of these solutions is as "clean" as an installer, but for our use case they work rather well. It's a bonus that Dropbox handles downloading the new copy of the file automatically.
I'm using Docker for a new project for learning purposes. I thought it would make things a lot easier, since I only have to set up the environment once.
I've created a setup like this: a base image that installs Ubuntu and Node.js, and a development image that copies a src (web application) folder into the container, runs npm install, and starts server.js. That's basically it in a nutshell.
But now, whenever I make a change to my source code, I have to stop the running container, rebuild the image, and run it again. It doesn't take long to rebuild and run the development image, but it gets a bit annoying to do that every time I change my code during development.
What I normally had was a Gulp task or Browsersync watching my local files. Every time I made a change, those changes were automatically visible in the browser. That really speeds up the development process.
I could still work like this during development by installing everything locally, but that kind of defeats the purpose of having a "development image". It would mean I still have to configure every machine that works on this web application with the appropriate Node version, database schemas, port mappings, SSL settings, certificates, etc.
So my question is: is there a way to run a container so that, whenever I change the source code locally, the change is automatically pushed to the running container? In other words, "continuous development"?
If I understood you correctly, you don't want to rebuild the development image each time you update the src. If that's the situation, here is what you can do:
For the development phase (when the source code is updated frequently):
Write your Dockerfile so that it makes use of a shared volume (where the source code will reside).
You can then update the source code in the shared volume, and the running container will pick up the changes.
Also, if you need to perform some additional task, you can put it in a script and call that script each time you update the source.
Later on, when the development phase ends, you can use your current Dockerfile to build the development image (see the sketch below).
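A minimal sketch of what that could look like, assuming the source lives in ./src on the host; the image name, Dockerfile name, and port are placeholders, not from the question:

# build a development image that contains the dependencies but not the source
docker build -t myapp-dev -f Dockerfile.dev .
# bind-mount the local src folder into the container; edits made on the host
# are visible inside the container immediately, with no rebuild required
docker run --rm -it \
  -p 3000:3000 \
  -v "$(pwd)/src:/usr/src/app" \
  myapp-dev node /usr/src/app/server.js

Combined with a file watcher such as nodemon running inside the container, this gives you roughly the same live-reload workflow as Gulp or Browsersync.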
The question:
Is there any possibility to "watch" specific folders on my workspace for new files and automatically download them to my local project folder?
I would prefer a solution using only PhpStorm, if that's possible, but I am also fine with a Linux one!
The situation:
I work with PhpStorm 2016.1.1 for Windows 8.1 on several different projects. Some of these projects are developed using Laravel, a very nice PHP framework.
All of my projects are cloned via Git to an openSUSE workspace server in my LAN.
I import every project by using the "Create Project from existing Files" functionality and choosing the option "Files are accessible via network share or mounted drive".
I created the mounted drive using Samba.
As long as I keep developing in PhpStorm, everything works like a charm. Saved files are uploaded to my workspace automatically so I can debug my PHP projects in the browser very easily.
The problem:
Laravel offers a very nice command line tool to use called "artisan". This tool can, amongst other functionality, create specific classes for your projects like events, jobs, migrations, seeds, and so on.
These files, created on the command line, are of course not visible to me in PhpStorm, because they are not in my local project folder until I manually download them from my workspace.
I do not know if it will help you, but there is a ticket in the PhpStorm issue tracker for a similar feature: WI-1284
It is about 6 years old, so I don't think this is coming soon. Perhaps there is another solution for it.
This could help with synchronization against a remote host: configuring-synchronization-with-a-web-server. A rough Linux-side alternative is sketched below.
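Since you said a Linux solution would also be fine: a small watcher on the workspace server could mirror newly created files back into your local project. This is only a sketch; it assumes inotify-tools is installed and that your Windows project folder is reachable from the server (here mounted at /mnt/local-project, which is an assumption), and both paths are illustrative:

#!/usr/bin/env bash
# Watch the Laravel project on the workspace server for new files
# and copy them back to the (mounted) local project folder.
SRC=/srv/www/my-laravel-project      # assumed path on the workspace server
DEST=/mnt/local-project              # assumed mount of the local project
inotifywait -m -r -e create -e moved_to --format '%w%f' "$SRC" |
while read -r newfile; do
    # keep the path relative to $SRC when copying
    rsync -a --relative "${SRC}/./${newfile#$SRC/}" "$DEST/"
done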
Maybe I am blind or I don't understand something right. I have created a HelloWorld app and now I would like to test it on my device directly (not via Visual Studio's Remote Tools).
So I created my app package in VS but selected "No" for "Uploading to Windows Store", since I want to try it out locally.
The build and verification are successful and all, but at the end I get a folder ("HelloWorld_1.0.1.0_Test") in the "AppPackages" folder. It contains a couple of files: .appxbundle and .appxsym (one for each architecture).
But if I want to install an app via the device manager, it requires an .appx file. Where do I get this one?
I googled a lot, but I only found the descriptions for using the Windows Store.
Isn't it possible without it or am I missing something?
Kind Regards
Pavel
I don't know which device manager you install it through, but an appxbundle should be fine. It's a ZIP file which contains several appx files (for several display scales, languages, ...).
But generally, inside the AppPackages folder there should be a folder like "AppName_1.0.0.0_Test". VS creates not only the appxbundle there, but also a PowerShell script, Add-AppDevPackage.ps1. Run it as admin and it installs the app if sideloading is enabled. This is the easiest option for testing apps on other machines without Store submission.
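If you prefer the command line, the script can be run roughly like this (assuming you open an elevated prompt in the generated *_Test folder; the Bypass flag is only needed if script execution is restricted on that machine):

powershell -ExecutionPolicy Bypass -File .\Add-AppDevPackage.ps1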
What I want and achieved so far:
I want to create a custom Vagrant box, including a configuration and an application, to reuse it in different client or server environments.
Specifically, I managed to create a Vagrant box based on Ubuntu (precise64) that has Node.js installed, and to package it on my dev machine with
vagrant package my-box --output filename.box
I am able to copy the filename.box to a remote server and vagrant up the box there. Node.js is installed within the vagrant box as expected.
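(For reference, on the remote server this is roughly what that looks like; the project directory name is arbitrary:)

vagrant box add my-box filename.box
mkdir my-project && cd my-project
vagrant init my-box
vagrant up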
The problem is that I am not able to package the files in the synced folder /vagrant. After starting the box on the remote server, the synced folder is empty.
Therefore the application I developed on the local machine is not included in the box.
I tried to find a solution or any information about this behavior, but apart from this unanswered post I couldn't find anything on the net.
My questions:
How can I preserve the files in the synced folder and package them into the filename.box for reuse in the server environment?
Is this even possible? Is the behavior I see a bug, or is Vagrant simply not meant to package these files as well?
I didn't do any configuration for the synced folders so far. Is it possible to package files from a different synced folder than the regular /vagrant?
If this is not possible at all, what are best practices for deployment or reuse of Vagrant environments including applications?
1-3) No. This is not possible and not intended to work in the way you expect it to work.
Think of VirtualBox's shared folder as a mounted volume on a remote machine. It's not part of the file system of your virtual machine. The actual data is saved on the host machine, not the virtual one.
4) If you want to add data to your box, just copy the data over to the VM before you package it. No need to use shared folders.
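For example, something along these lines (the /opt/app target is just an illustrative location inside the VM):

# on the dev machine, from the project directory
vagrant up
# copy the app out of the synced folder into the VM's own file system
vagrant ssh -c "sudo mkdir -p /opt/app && sudo cp -r /vagrant/. /opt/app"
# the data now lives on the VM disk and gets packaged with the box
vagrant package my-box --output filename.box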
You cannot package a synced folder, but what you want is absolutely possible.
The easiest way to accomplish this is to put the data in some other directory in the box (thus ensuring it gets packaged with the box), and then, during the Vagrant box's provisioning, move or copy the data into the synced directory (see the sketch below).
Once the box is up and running, the synced directory will have the files you want in it.
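A rough sketch of such a provisioning step, written as a shell provisioning script (the /opt/app location is an assumption; the script would be wired up as a shell provisioner in the Vagrantfile):

#!/usr/bin/env bash
# provision.sh -- runs inside the guest during provisioning
# Copy the application that was baked into the box into the synced folder
# so it is available on the host side as well.
cp -r /opt/app/. /vagrant/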
I am doing my first deployment on AWS (using Elastic Beanstalk), and I am completely new to this.
I built a personal website using Node.js / Express, and on my local machine it loads just fine. Once I was ready to deploy a v1, I created an AWS account and set up a new Elastic Beanstalk application environment for Node. I set the static files to load from /public, set my Node version, and set the launch command to node app.js, but those were the only options I changed.
I zipped up my site (using Ctrl + click -> Compress on a selection of all site files) and uploaded that zip, and after some time it came up all green. Clicking the link to load my site, though, I get a half-finished version. Looking at my console, I see that four files return 404, and because of that, four failures from RequireJS.
These 4 files are backbone views, and are contained in a folder with 4 other JS files that are all loading just fine (I can open them in the chrome dev tools source tab from the deployed version). I am confused how just these 4 files would go missing.
Is there some way to FTP into wherever my files are stored, to confirm that the files are in fact not present? And barring that, what steps can I take to figure out what is going on here? Like I said, it looks and loads just fine locally, and I am at a loss as to where to even start debugging something like this. The AWS docs I have read so far only tell me to do exactly what I have been doing.
Repo for the project is here: https://github.com/RyanMG/trustycode
And the deployment is here: http://trustycode.elasticbeanstalk.com/
The files it is having trouble with are under public/javascript/views/ (CodeView, AboutView, PhotoView, DesignView)
Any ideas / advice?
Is there some way to FTP into wherever my files are stored, to confirm that the files are in fact not present?
You can SSH into the EC2 instance behind the Elastic Beanstalk app using your .pem file.
Check the files in /var/app/current.
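For example (the key file name and host are placeholders; the SSH user depends on the AMI, e.g. ec2-user on Amazon Linux):

# placeholders: your key pair and the instance's public DNS name
ssh -i ~/.ssh/my-keypair.pem ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
# then, on the instance, check whether the views made it into the bundle
ls /var/app/current/public/javascript/views/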
I don't have the reputation to comment, but this is one of those common gotchas I ran into myself when switching from GNU/Linux to OS X at work: the OS X file system is case-insensitive (by default), while the Linux world is case-sensitive.
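A quick way to spot it on the Linux instance (the path mirrors the one from the question):

# on a case-sensitive file system only the exact casing resolves
ls /var/app/current/public/javascript/views/
# if RequireJS requests e.g. "views/codeview" while the file on disk is
# CodeView.js, the request 404s on Linux even though it worked on the
# case-insensitive OS X dev machine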