I have a web app in my local environment that needs to upload some files to a path like /images/app/customer. That path does exist on the production server but obviously doesn't on my machine.
Is there a way to "simulate" the existence of that directory on my environment?
I'll answer my own question after a year of experiments. It turned out that the best and cleanest way is to create the directory in a sandboxed environment such as Docker or any other virtual machine.
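For example, here is a minimal sketch of the Docker route, assuming a containerised copy of the web app (the image name and host folder are placeholders; only the /images/app/customer path comes from the question):

# Create a local folder to back the uploads, then bind-mount it at the production path
mkdir -p ~/dev-uploads
docker run -d --name webapp-dev \
  -v ~/dev-uploads:/images/app/customer \
  -p 8080:80 \
  my-webapp-image   # placeholder image that runs the local web app

Inside the container the application sees /images/app/customer exactly as it would in production, while the uploaded files end up in ~/dev-uploads on the host.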
So I'm trying to do something that is probably pretty simple but, of course, can't figure it out. I have a VM on my network running CentOS, and I installed Git on it following the guide. Now I'm on my Windows PC using PyCharm trying to set up Git, but the setup asks where the git.exe file is, and I can't seem to navigate to my CentOS VM within PyCharm to point it anywhere. I tried \\IP Address but that didn't work. Is there something I need to do on the Linux VM to allow the Windows PyCharm to reach it? I'm new to this on both sides lol.
If I understand your question correctly, you want to access a Git repository on your local Linux network. In that case you need to be able to reach those files from your Windows machine in order to push/fetch your changes to/from it. A simple way to do that could be to set up file sharing as explained here (link) and then clone and handle it like any local Git repository (see e.g. link; with local repositories the file path is used instead of a URL).
An even simpler way could be to create a hosted online Git repository (e.g. on GitHub), even if you are the sole contributor; make it private if you don't want the contents to be public.
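As a rough sketch (assuming Git for Windows is installed locally so that PyCharm has a git.exe to point at, and using placeholder paths and names), the clone could then look like either of these:

# Clone from the CentOS share after mapping it to a drive letter (placeholder path)
git clone Z:/repos/myproject.git

# Or clone a hosted repository instead (placeholder URL)
git clone https://github.com/your-user/your-repo.git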
Is there a way to set up a project in Eclipse so that if my code has a reference to the system root directory, it will point to my workspace instead? (I am not seeing anything in the Run Configurations that would help me with this.) Something like the equivalent of making a symlink / that points to my workspace directory.
I'm working on a Perl project that has absolute references to the hosting Linux file system in what would be the production environment. Those directories don't exist in my development Eclipse environment. My workspace is located in NFS space mounted on a cluster of servers that run Eclipse, which I access from my laptop via client software.
So root can be any server's local space within the cluster, and I don't have access to anything above the workspace, so I can't create the directory structures I need. I would rather not hard-code alternate directory paths to accommodate differences between the sandbox and production environments and then have to comment them out when deploying to production.
I'm not finding a straightforward answer online. Maybe I'm not articulating the question correctly; if that is the case, help with that would also be appreciated.
No. Good practice is to have paths like that configurable at runtime, usually via an environment variable or command line argument, specifically to accommodate changes between development, sandbox, and production environments.
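For instance, in a Perl script (the variable name and fallback path below are illustrative, not taken from the original project), the base directory could be resolved like this:

use strict;
use warnings;

# Use the environment variable when it is set (development), otherwise fall back to the production default
my $base_dir = $ENV{APP_BASE_DIR} // '/srv/app/data';

my $upload_dir = "$base_dir/uploads";
print "Using upload directory: $upload_dir\n";

In Eclipse the variable can typically be set per Run Configuration (Environment tab), while production simply leaves it unset.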
I am developing an app with NWJS, and now I am thinking about the deployment process. I need to install the app on the different machines that will use it. The problem I see is that if I change a file, I will need to install it again on each machine. I was reading about Docker, and if I understood correctly, I can make an image and download the latest version of the app onto each machine that uses the app.
The question is: can I upload the app into a container and download it onto each machine? And how can I search the documentation for how to do that?
Thanks for any help
I think I've cheated my way into a solution; it could work for you, depending on what your exact requirements are.
In one scenario, I have a shared network folder that allows machines to launch the NWJS app via the network share, so every time I update the file and someone relaunches their shortcut, they get a fresh copy.
The remote users, who are not directly on our network, have their copy in a Dropbox folder, which of course updates automatically as I drop the new copy into that folder.
None of these solutions is as "clean" as an installer, but for our use case they work rather well. It's a bonus that Dropbox handles the downloading of the new copy of the file automatically.
What I want and achieved so far:
I want to create a custom Vagrant box, including a configuration and an application, to reuse it in different client or server environments.
Specifically, I managed to create a Vagrant box based on Ubuntu (precise64) that has Node.js installed, and to package it on my dev machine with
vagrant package my-box --output filename.box
I am able to copy the filename.box to a remote server and vagrant up the box there. Node.js is installed within the vagrant box as expected.
The problem is that I am not able to package the files in the synced folder /vagrant. After starting the box on the remote server, the synced folder is empty.
Therefore the application I developed on the local machine is not included in the box.
I tried to find a solution or any information about this behavior, but apart from this unanswered post I couldn't find anything on the net.
My questions:
How can I preserve the files in the synced folder and package them into filename.box for reuse in the server environment?
Is this even possible? Is the behavior I see a bug, or is Vagrant simply not meant to package these files?
I haven't done any configuration for the synced folders so far. Is it possible to package files from a different synced folder than the regular /vagrant?
If this is not possible at all, what are best practices for deployment or reuse of Vagrant environments including applications?
1-3) No. This is not possible and not intended to work in the way you expect it to work.
Think of VirtualBox's shared folder as a mounted volume on a remote machine. It's not part of the file system of your virtual machine. The actual data is saved on the host machine, not the virtual one.
4) If you want to add data to your box, just copy it over to the VM before you package it. There is no need to use shared folders.
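A rough sketch of that workflow (the /opt/myapp directory is a placeholder; my-box and filename.box are from the question):

# Copy the application from the synced folder into the guest's own filesystem
vagrant ssh -c "sudo mkdir -p /opt/myapp && sudo cp -r /vagrant/. /opt/myapp/"

# Package the box; /opt/myapp is now baked into the VM image
vagrant package my-box --output filename.box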
You cannot package a synced folder, but what you want is absolutely possible.
The easiest way to accomplish this is to put the data in some other directory in the box (thus ensuring it gets packaged with the box). And during the Vagrant box's provisioning, move or copy the data to the synced directory.
Once the box is up and running, the synced directory will have the files you want in it.
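A minimal sketch of that provisioning step, assuming the application data was baked into the box under /opt/myapp (the box and directory names are placeholders):

Vagrant.configure("2") do |config|
  config.vm.box = "my-box"   # added beforehand with: vagrant box add my-box filename.box

  # Copy the bundled application into the synced folder when the box is provisioned
  config.vm.provision "shell", inline: <<-SHELL
    cp -r /opt/myapp/. /vagrant/
  SHELL
end

After vagrant up (which provisions on first boot), the synced directory on the host contains the application files.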
I use web deploy command line batch files to sync Staging and Production servers.
The production server has some virtual directories that are shared between nodes (on a shared drive).
I would like to skip the virtual directory bindings so they are not synced.
Can anyone help me with a switch that does this efficiently?
I appreciate your help.
Example:
The virtual directory Images on the development server is bound to a share on the local NAS, whereas on production it is bound to a shared drive in a different location.
When I sync the production environment with development, the virtual directory binding on production changes to the local NAS.
I would like to stop the virtual directory bindings from being synchronized.
Please help.
I think you can use the -skip parameters (although I have never applied them to virtual directories, as I usually sync specific web apps, not entire web servers); there is a fuller command sketch after the shortcut list below. E.g.
-skip:objectname='filePath',absolutepath='logs\\.*\\someNameToExclude\.txt'
See the manual on TechNet:
-skip:skipAction=<action>,objectName=<objectName>,keyAttribute=<key>,absolutePath=<absolutePath>,attributes.<attributeName>=<attributeValue>,xPath=<xpathExpression>
Special Shortcuts
-skip:ApplicationPool=<applicationPoolName>
-skip:Directory=<directoryPath>
-skip:File=<filePath>
-skip:WebApplication=<webApplicationName>
-skip:WebSite=<webSitelName>
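As an untested sketch (server names, site name, and virtual directory name are placeholders, and it is worth adding -whatif first to confirm what gets skipped), a sync that leaves a virtual directory alone might look like:

REM Sync the site from staging to production, skipping the Images virtual directory
msdeploy.exe -verb:sync ^
  -source:appHostConfig="Default Web Site",computerName=StagingServer ^
  -dest:appHostConfig="Default Web Site",computerName=ProductionServer ^
  -skip:objectName=virtualDirectory,absolutePath="Images"

Dropping the absolutePath part should skip all virtual directories rather than only the one matching that regular expression.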