Linux (CLI): download files via a shared Dropbox (folder) link without an account

I was thinking of using Dropbox to upload the source code of my web application. For this folder I would create a shared link, which I'd like to use to download all the latest source files onto my test server (instead of using SFTP/FTP).
Now, I know you can use Dropbox on Linux by installing their client, but that requires creating an account. I don't want to create an account, and I certainly don't want to use my own account.
Is there any way to use a shared (folder) link and download all the files in that folder from the command line, without an account (maybe something like wget)? There is no need for live syncing; it would be fine to trigger the download with a bash script.
Thanks.

If you're OK with your links being public (which I don't think is a good idea), then you can just create a file with a list of links to your files and write a bash script that loops over each line of the file and fetches the link with wget.
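For example, a minimal sketch of such a script (the links.txt file name is made up, and the dl=0 to dl=1 substitution to force a direct download is an assumption you should verify against your own shared links):
#!/bin/bash
# links.txt contains one Dropbox shared link per line
while read -r url; do
    # rewrite dl=0 to dl=1 so the link serves the file directly instead of the preview page
    wget --content-disposition "${url/dl=0/dl=1}"
done < links.txt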
If you want to use authentication, you'll have to register for a Dropbox API key and then write a script (in Python, Ruby, Java, etc.) to authenticate and fetch the files.
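One possible sketch of what such a script boils down to, using curl against the Dropbox HTTP API v2 files/download endpoint (the access token and file path are placeholders):
# download a single file with an OAuth access token (placeholder values)
curl -X POST https://content.dropboxapi.com/2/files/download \
    --header "Authorization: Bearer YOUR_ACCESS_TOKEN" \
    --header "Dropbox-API-Arg: {\"path\": \"/webapp/index.php\"}" \
    --output index.php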
If you don't have a specific need for Dropbox, I'd recommend you use Git (or similar). With Git you just create a repository on your server and clone it on your desktop. Then you can edit your files and push them to the server... it's so much easier.
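To illustrate, a minimal sketch of that workflow (the server hostname and paths are made up):
# on the server: create a bare repository
ssh user@testserver 'git init --bare /srv/git/webapp.git'
# on your desktop: clone it, add your code, push
git clone user@testserver:/srv/git/webapp.git
cd webapp
git add . && git commit -m "initial import"
git push origin master
# on the server: check out a working copy to deploy from
git clone /srv/git/webapp.git /var/www/webapp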

Rogier, GitHub has become the norm for hosting code. There are other options (SourceForge, Google Code, Beanstalk), or you can set up a private Git repository on your own computer.
Somewhere deep in my browser history there's an article about how to do that.
However, a little googling turned up http://news.ycombinator.com/item?id=1652414. Let me know if you can't find satisfactory instructions on your own for setting up a Git repo on your computer.

Related

How to tell Gitbook where to store files locally?

I just want to create a simple gitbook on my local drive, not attached to any remote repository. I went to File->Change Library Path... and pointed to the place where I want my files to go. When I create a new book, Gitbook puts some stuff in that library path that looks right: README.md, etc. But when I change README.md in Gitbook and save the file, it doesn't save to the README.md it created when I created the book. In fact I can't even figure out where it saved my changes, even after doing a find/grep on my entire hard drive.
Edit: I need to know where it saves, so I can run the command-line gitbook to create a pdf.
Which OS are you using?
On Windows, the default directory is C:\Users\username\GitBook
I had troubles with this when I first started too.
Here is some background that may help you
There are three flavors of GitBook: the online editor (storage is on GitBook.com or GitHub), the offline editor (same storage + local at C:\Users\Documents\Gitbook), and the CLI toolchain (any storage you desire).
If you're talking about issuing commands, you want to install the toolchain.
If you are running Windows, I can't help, but when you switch to Ubuntu :) go to toolchain.gitbook.com to see the instructions.
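For reference, a rough sketch of installing the toolchain and building a PDF (this assumes Node.js/npm are available, and gitbook pdf additionally needs Calibre's ebook-convert on the PATH):
npm install -g gitbook-cli
cd /path/to/your/book    # the directory containing README.md and SUMMARY.md
gitbook pdf ./ mybook.pdf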
Hope this helps
Yes, it's weird that the changes are not immediately reflected in the physical file. I changed something and saved again, and that actually made the first save visible. :D Try it once; this may work for you too.
The GitBook Editor works with Git history only, so files in that folder that were edited externally and not committed will not be picked up in the editor. Conversely, when you save a file in the GitBook Editor, it creates a commit, and only then is the change reflected at the physical drive location.
You can find more about the issue here: https://github.com/GitbookIO/feedback/issues/101
Currently there's no way to explicitly tell GitBook where to store files; however, we use a simple workaround which lets us use the GitBook Editor to edit a book in our own Git repository while keeping control of how and when we commit changes.
We have a docs directory in our project which contains the GitBook, and it is symlinked into the folder GitBook uses to store its own books. This directory has its own .git folder, which is ignored from the parent repository since we don't need GitBook's commits.
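A minimal sketch of that setup (the project path and the GitBook library path are placeholders; use whatever you picked under File->Change Library Path...):
# expose the project's docs directory inside the folder GitBook stores its books in
ln -s ~/projects/myapp/docs ~/GitBook/myapp-docs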

How to access code of my Azure website?

I dropped my hard drive which contained all my code, and now it won't plug in to my computer. I need the code for my Azure website, which I deployed earlier, ASAP. Is there a way for me to access this code?
Yes, there are multiple ways.
Using the deployment credentials you can connect via FTP and download the files.
Using Git source control you can add your Azure web app as a remote reference and clone it locally (see the sketch at the end of this answer).
It's not going to be pretty, but once you get your binaries via FTP you can use a decompiler such as .NET Reflector to decompile them and generate source code from them.
It won't be as good as the original source code, but it'll be functionally equivalent and will make it easier for you to get back to where you were.
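As a rough sketch of the Git route mentioned above (the site name, username, and SCM URL pattern are assumptions based on the usual *.scm.azurewebsites.net deployment endpoint; check the app's deployment settings in the portal for the exact URL and credentials):
# clone the site's deployment repository locally (placeholder values)
git clone https://deployuser@mysite.scm.azurewebsites.net/mysite.git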

Deploy Mercurial Changes to Website Hosting Account

I want to move only the website files changed since the published revision to a hosting account using SSH or FTP. The hosting account is Linux based but does not have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? I'm not that great with bash) since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than one for hg, given that the target doesn't have hg installed.
Something like so:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This transfers all files recursively from /path/to/local/files/ and places them in /remote/path/. The -a flag preserves file attributes (and implies recursion), -v gives verbose output, and -z compresses data during transfer.
rsync takes care of only transferring files that have changed. Be sure to watch for trailing slashes when specifying source paths; they matter (see the rsync man page).
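If you want to preview exactly what would be sent before committing to it, a sketch like this may help (host and paths are placeholders):
# dry run first to see what would change, then do the real transfer,
# removing remote files that no longer exist locally
rsync -avzn --delete /path/to/local/files/ user@remote_host:/remote/path/
rsync -avz --delete /path/to/local/files/ user@remote_host:/remote/path/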

How to do version control via ftp?

I have a web dev client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via FTP? Locally I have both Mac and Windows, and the staging server is Linux, so something that works on any of those platforms...
Using your Linux staging server you could keep a separate checked-out copy that you use specifically for that host, and then use a utility to mirror that directory to the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
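A rough sketch of using it for this (hostname, credentials, and paths are placeholders):
# push the checked-out copy up to the FTP host, only sending files that changed
lftp -u ftpuser,ftppass -e "mirror -R --only-newer /var/www/staging-checkout /public_html; quit" ftp.example.com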
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search gave me this Java application. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and doing merges to that branch for uploading to the production server.
You probably need to write a batch file or script that is able to (see the sketch after this list):
Export the SVN repository
Upload the exported files to your Linux server via FTP
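A minimal sketch of such a script in bash, combining svn export with the lftp mirror approach mentioned above (repository URL, credentials, and paths are placeholders):
#!/bin/bash
# export a clean copy of the repository (no .svn folders), then mirror it to the host via FTP
svn export http://svn.example.com/myproject/trunk /tmp/site-export
lftp -u ftpuser,ftppass -e "mirror -R --only-newer /tmp/site-export /public_html; quit" ftp.example.com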
Short of finding / implementing some FUSE based CoW file system that supports immutable versions .. I'd just find another (more developer friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here’s what I did:
Set up gitolite on my Ubuntu staging server
Created the base repo (i.e. foo.git) on the staging server
Cloned foo.git into a working directory on the staging server
Cloned foo.git into a working directory on the local development machine
Developed locally
Pushed changes to the foo.git repo on the staging server
On the staging server, logged into the working directory and pulled in changes from foo.git
lftp-ed into the shared host (like you mention above)
Once in the shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R - pushes the /source/directory/ to the /target/directory (without it, mirror pulls from the target to the source; think of R as "reverse").
--only-newer - without this option, even if you only changed one file, the mirror command will send all the files in the source directory over to the target directory. With this option, only the changed (newer) files are transferred over the wire.
--delete - deletes files that are no longer in the source directory but are still in the target directory. One of my pushes involved deleting expired assets; without this option, the same files would have stayed put on my shared host after executing the mirror command.
--parallel=10 - transfers 10 files at once (instead of 1 by default). This made the process much faster.
While this is what worked for me, I’m sure there are ways to improve on it. I was grateful for this question and thought I’d share my experience.
Rsync will also do this, though note it doesn't speak plain FTP; the remote end needs SSH access or an rsync daemon. You probably already have it installed if you’re on a Unix-like system.

How to download/checkout a project from Google Code in Windows?

How do I download a ZIP file of an entire project from Google Code when there are no prepared downloads available?
This is what I see on the checkout page:
Command-line access
Use this command to anonymously check out the latest project source code:
svn checkout http://myproject.googlecode.com/svn/trunk/ myproject-read-only
But I'm working on Windows and I don't have the svn binaries ... do I need these?
I can access individual source code file or view the Subversion HTML pages, but that just allows me to access source code files one-by-one.
If you don't want to install anything but do want to download an SVN or Git repository, then you can use this: http://downloadsvn.codeplex.com/
I have nothing to do with this project, but I just used it now and it saved me a few minutes. Maybe it will help someone.
If you install TortoiseSVN you can use SVN under windows. It also gives you the SVN binaries. You needn't do the checkout from the command-line though as it integrates into Windows Explorer for you.
If you don't want to install TortoiseSVN, you can simply install 'Subversion for Windows' from here:
http://sourceforge.net/projects/win32svn/
After installing, just open up a command prompt, go to the folder you want to download into, then paste in the checkout command as indicated on the project's 'source' page. E.g.
svn checkout http://projectname.googlecode.com/svn/trunk/ projectname-read-only
Note that the space between the URL and the last string is intentional; the last string is the folder name into which the source will be downloaded.
Thanks, Mr. Tom Chantler.
To add to that: when using the exe from http://downloadsvn.codeplex.com/ to pull the SVN source, note that if you're downloading a project like the one above, you have to enter exactly the following as the URL in the exe:
http://myproject.googlecode.com/svn/trunk/
The developer didn't take care of prepending the http:// if it's missing.
Hope it saves somebody's time.
Another simple solution without the TortoiseSVN overhead is RapidSVN. It is a lightweight open-source SVN client that is easy to install and easy to use.
The DownloadSVN tool also worked quite well, but it had problems with SVN repositories that don't provide a web interface. RapidSVN works fine with those.
If you have a GitHub account and don't want to download software, you can export the project to GitHub, then download a ZIP from GitHub.
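Once the project is on GitHub, the ZIP can also be fetched from the command line; a rough sketch (the username, project name, branch, and archive URL pattern are placeholders/assumptions to adapt to your repository):
wget -O myproject.zip https://github.com/username/myproject/archive/refs/heads/master.zip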
