I have nearly identical Linux (Fedora) machines at home and at work, and I keep my files on both machines synchronized using the excellent Unison program. I have been trying to keep an Eclipse workspace synchronized across the two machines, but this has failed. I tried both:
Synchronizing just the /workspace directory, which went badly due to plugin upgrades
Synchronizing both /workspace and my .eclipse/ directory.
What happens is that I work on one machine, create new projects in Eclipse, etc. Then I run Unison. Then, when I go to the other machine, the projects will sometimes not appear, sometimes they will appear but Eclipse cannot find the files, and sometimes (rarely) it works.
I don't understand why Eclipse gets so confused, since I have identical workspaces, Eclipse versions, and even .eclipse directories.
Have you considered going through a source control repository? If privacy is a concern, there are private SVN spaces available (e.g. Assembla).
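For example, assuming both machines hold a working copy of the same repository, the day-to-day round trip is just a commit on the machine you are leaving and an update on the machine you arrive at (the commit message is only a placeholder):
# on the machine you are leaving
svn commit -m "end-of-day sync"
# on the machine you are sitting down at
svn update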
I understand this technique will (at least) make it possible to synchronize the projects but probably not all the settings related to a workspace. It might be an option, no?
Take a look at Pulse. It's an Eclipse distribution that can handle synchronization of workspace preferences across users and machines. It might be what you need.
I have been using Dropbox to synchronize my workspace. I have been able to work on three different computers so far without any issues.
I used to store my workspace(s) and sometimes the Eclipse installation itself on a USB stick drive and use that for project portability from Windows machine to Windows machine. You can then just run Eclipse from the stick and mount the workspace on the same stick.
I have also heard that Dropbox (http://getdropbox.com - they have a 2 GB free plan) is useful for this, though I have not tried it.
It's odd that it does not work with your sync software.
I've had issues with Unison and Eclipse and have them mostly worked out, though it still needs to refresh the entire workspace when I switch systems.
There are two things I've discovered that need to be configured before it is at all happy:
1) Sync your workspace, your Eclipse install, and ~/.eclipse.
2) Specify "ignorenot" rules in your Unison "prf" files so that no files in these directories are ignored. This is necessary because, by default, Unison excludes files it thinks are build products, using rules similar to CVS, which causes issues.
For example, in the Unison profile (.prf) file:
path = eclipse
path = workspace
path = .eclipse
# don't let the default "built file" heuristics skip anything under these paths
ignorenot = Regex eclipse/.*
ignorenot = Regex workspace/.*
ignorenot = Regex .eclipse/.*
Have you considered setting up a network drive and installing Eclipse on that drive (along with your workspace)? That way, when you open Eclipse on either machine, it will be pointing at the network path for your workspace. I've successfully used this solution in the past.
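As a rough illustration on the Linux side (the server name, export path, and mount point here are all placeholders), mounting such a shared drive could look like:
# mount the share that holds both the Eclipse install and the workspace
sudo mount -t nfs fileserver:/export/eclipse /mnt/eclipse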
I used the Mercurial DVCS on a USB stick as the transfer between home and work. I had three Mercurial repositories: one on the USB stick, one at home, and one at the office sharing the same space as my Subversion checkout. If you're up to speed with DVCS concepts, I'd push/pull changes from office->USB->home.
It worked fine, but the first check-in was a pain as USB flash writes have crappy speeds. Pushing/pulling deltas was fairly quick afterwards.
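A rough sketch of that office -> USB -> home round trip (all the paths here are made up):
# at the office: push the day's changes onto the stick
cd ~/src/project
hg push /media/usbstick/project
# at home: pull them off the stick and update the working copy
cd ~/src/project
hg pull /media/usbstick/project
hg update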
I believe the Mozilla guys use a similar hybrid approach: SVN for the 'official' repository, while the developers use Mercurial for their development environment.
I'm working in an environment where files live on the Windows side (because I like to work with phpStorm from that side) and on the Linux side (because I have a virtual machine running CentOS 6.6, and that is where the LAMP environment is). The phpStorm project is a remote-files one. These are the steps I followed to create the project:
Clone the repo into a Windows directory
Copy the files to Linux using WinSCP
Create the remote project using phpStorm; this step copies all the files from Linux back to Windows.
I'm using SmartGit to manage my repos and do Git/SVN tasks (the easy way). But surprise: the files haven't been changed, yet SmartGit says they have. How is that possible if the only steps I took were the ones described above? Even if you open a file, SmartGit says the content is the same. So, how do I avoid this behavior? How do I avoid committing all the files? If I have already made such a commit, how do I undo it? This isn't the first time I've run into this problem, but before, the repos were mine and I could afford to lose everything; now it is a serious project and I have to take care not to damage others' work. Any advice?
(A screenshot was attached here showing what I'm talking about.)
I want to be able to work across multiple workstations in sync, jumping from one to the other without having to worry about committing.
I have personal and work Windows desktops and a Mac OS X laptop. At the moment, I point my project to a cloud directory and have the local install of Android Studio pointing to a Gradle offline cache in another cloud directory. This keeps failing: it tells me that the path to Gradle is invalid, which I understand, because Gradle is referenced in different locations on different machines (considering the differing file systems on Mac OS X and Windows 7).
Edit: When I try to open the project, it brings up the "Import Project from Gradle" screen, which gives me the option to select "Use local gradle distribution" and choose the Gradle home directory. I pointed it to the cache directory, and it tells me:
Cannot Save Settings
Gradle location is incorrect.
Location:C:/Users/Username/.gradle
All my research (including these answers here and here) suggests that VCS is the way to go. However, I don't see this as a solution to my problem. I'm not looking for version control; I'm looking to transition seamlessly across workstations. Of course I will still use a version control system for the purpose of saving a working version of my code, or sharing it with other developers, but there has to be a better way when I simply want to keep all workstations synced.
I come from web development, and I synchronise my local AMPPS environment across multiple computers without any issue. This means I can transition between my personal desktop, laptop, and work desktop instantly. It frustrates me to have to remember to commit every time I move around. If I have to do this 20 times a day, and it takes about a minute each time, that's 20 minutes that could have been spent writing a couple of functions. And what if I forget to commit? Then I get to work, or home, and that's a day wasted, because I won't actually have the current, up-to-date code...
So the question remains: is there a way to instantly synchronise Android Studio projects? How do I keep my whole code base (i.e. Gradle) in sync?
OK, thanks to the comments above, which pointed me in the right direction.
Android Studio creates some local files that are specific to the machine you are on. Following this principle, to sync the "source" files (the files that are specific to your application only), you must ignore all of these local files. This is similar to what you would store on GitHub. I followed the answer to this question to apply the ignore rules.
Having ignored all the "local files", when I create a new project, the source files are synchronised across all my workstations. In order to establish a local version, I need to "import" the project first. Once it has been imported, "local files" will be created for that particular machine. From then on, I can "open" the project locally.
To summarise:
Set your sync to ignore files as per .gitignore, or refer to this question (a sketch of typical entries follows this list).
Create a project on one of your workstations and save it in the cloud.
When you are ready to work on the project for the first time on another workstation, "import" the project.
Once the project has been imported, all local files should have been created.
From then on, use the "open" option to continue working on the project.
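For illustration, the machine-specific files that usually get excluded for an Android Studio project look something like this (treat it as a starting point rather than an official list, and adapt it to your own project):
.gradle/
.idea/
build/
local.properties
*.iml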
I hope this helps somebody else, saving hours on googling.
I have created a synchronized project in Eclipse so that I can develop on my Windows workstation without the overhead caused by running Eclipse on our company's build server. However, the problem I'm having is that the indexer is using my Cygwin includes for things such as the stdlib, which aren't the ones I wanted to include. Is there a way to pull in remote includes from the Linux build server for things like the std lib? The only idea I have right now would be to create a mapped CIFS mount to my Windows machine that has access to the header files, but I don't know if that would work.
Look at "Remote Include Paths" (bottom of page). Let us know on the ptp-users mailing-list if it doesn't work.
I like Subversion except for renaming and deleting folders: it's such a nightmare for me, and I do it a lot, that I want to drop Subversion for something else, as close to Subversion as possible, but able to cope with folders being renamed, deleted, and moved in Explorer without obliging me to launch a special command.
Is there an alternative ?
Several times now I've had to create a new repo after Subversion corruption, and now it says the working copy lock seems to be broken. I'm exhausted :)
Update: for Windows ;)
For the Mac there are two great options for what I think you are looking for. First there is Tower for Mac, and then there is Versions for Mac, which really is great if you want to step away from GitHub and SVN. So either try to find the Windows alternative to these two, or switch over to the Mac way of life. Best wishes. Good luck.
I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks... you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
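A very rough sketch of such a batch file (the paths are made up, and the %date% parsing depends on your locale's date format, so adjust the tokens and delimiters to match what your machine prints):
rem assumes %date% looks like dd/mm/yyyy; tweak if yours starts with the day of the week
for /f "tokens=1-3 delims=/. " %%a in ("%date%") do set stamp=%%c.%%b.%%a
rem push the new build to the web server
robocopy C:\build\MyWebsite \\webserver\wwwroot\MyWebsite /MIR
rem keep a dated copy of exactly what was deployed, for emergencies
robocopy C:\build\MyWebsite D:\deployments\website_%stamp% /MIR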
For some scenarios I used a freeware product called SyncBack (Download here).
It provides complex, multi-step file synchronization (filesystem or FTP etc., compression etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc. and execute commands/programs after the job runs. There is also a job log provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites (a sample invocation is shown below).
Of course you can create a web deployment package and deploy as an MSI as well.
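As a small example of the aspnet_compiler route (the virtual path and directories are hypothetical):
aspnet_compiler.exe -v /MyWebsite -p C:\dev\MyWebsite C:\build\MyWebsite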
I have used a combination of CruiseControl.NET, NAnt, and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name, I know), which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will back up your production files before deployment, so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. Seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
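For example (the host and paths are invented), something along these lines copies only the changed files, and the first command gives you the "what if" preview asked for above:
# preview what would change, without touching anything
rsync -avzn --delete ./website/ deploy@staging:/var/www/website/
# do it for real
rsync -avz --delete ./website/ deploy@staging:/var/www/website/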
Try RepliWeb. It handles full rollback to previous versions of files. I've used it whilst working for a client who demanded its use, and I've become a big fan of it, particularly for:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email / logs stating what has changed, what the current version is, etc.