Yocto local repository - linux

In our office, employees use Yocto for project development, and everyone downloads sources from the upstream repositories.
I want to set up a repository-style server (much like apt-cacher) so that all client machines connect to the local repository and download whatever they need. Is this possible?
Please correct me if I am asking for the wrong thing or have misunderstood something.

One idea is to create a local cloud drive and attach it to each computer as a folder.
In conf/local.conf of your build directory, set:
SSTATE_DIR = "/path/to/your/sstate-repository"
DL_DIR ?= "/path/to/your/download/repository"
Please note that the sstate cache will build up over time, so create a cron job that deletes stale files with a command like this:
find /path/to/your/sstate-repository -name 'sstate*' -atime +3 -delete
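For example, a crontab entry along these lines would run the cleanup nightly (the schedule and path are just placeholders):
# run at 02:30 every night; prune sstate objects not accessed in 3+ days
30 2 * * * find /path/to/your/sstate-repository -name 'sstate*' -atime +3 -delete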
More information can be found on page 27 HERE

Assuming that the only thing you care about here is sources (and not built packages), you should probably take a look at the PREMIRRORS variable. One thing I'm not clear about in your question is "source repository" (accompanied by the git tag). Your project is typically built from lots of components; about 95% of those come from external sources, while some come from internal repositories (Git/SVN/whatever). Internal repositories are usually not a big problem: you have them nearby, they tend to work, and everyone involved should have some access to them anyway. Most of the problems actually occur with external fetching, and that's where mirroring is handy.
The way I usually set things up with regard to source file management is as follows:
set up some internal FTP server, say "ftp://oe-src.example.com/" (with anonymous read-only access)
use DL_DIR ?= "${HOME}/sources" in your local.conf (which is the way it used to be back in the OE Classic days); it's not strictly necessary, I just like being able to clean up the build directory without wiping downloaded source files at the same time
set PREMIRRORS variable in the local.conf like this:
PREMIRRORS = "\
git://.*/.* ftp://oe-src.example.com/ \n \
ftp://.*/.* ftp://oe-src.example.com/ \n \
http://.*/.* ftp://oe-src.example.com/ \n \
https://.*/.* ftp://oe-src.example.com/ \n \
"
add an action to your CI tool to synchronize (in "add to" mode, not removing old sources) ~/sources with your FTP after the build (via FTP/SCP/rsync/whatever)
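As a rough sketch of that CI step (the host name comes from the example above; the destination path on the mirror host is an assumption), an rsync push in "add to" mode could look like this:
# copy newly downloaded tarballs/checkout mirrors to the FTP host without touching or removing existing files
rsync -av --ignore-existing "${HOME}/sources/" oe-src.example.com:/srv/ftp/oe-src/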
This way you always have a nice set of source files on the FTP (including, by the way, tarballs of VCS checkouts for your internal software, which somewhat helps reduce the load on your internal VCS), and most source file requests are satisfied by this FTP, so you no longer have problems with missing sources, broken checksums, slow downloads, etc.
The only downside I see is that this FTP is open to everyone and has all the sources, including your internal ones, which may or may not be a problem depending on your security policies. This can be mitigated by using per-project SFTP mirrors, which would incur the additional overhead of user key management, but I've also done such setups in practice, so when it's needed it can be done.

Related

Using haxe to edit remote file?

I've searched haxelib for a library I could use to remotely edit a file on a server over an SSH connection with Haxe, or to list the files in a directory.
Has anyone done this with Haxe?
I want to build a desktop app: a YAML editor that will change the settings files of several servers, using a frontend like haxe-ui.
Ok, there are probably a lot of ways you could do it, but I would suggest separating your concerns:
desktop app to create a yaml editor
Ok, that's a fine use case for Haxe / a programming language. Build an editor, check.
change settings files (located on) several servers
Ok, so you have options here. Either
Make the remote files appear as local files via some network file system, or
Copy the files locally, edit them, and copy them back, or
Roll your own network-enabled service that runs on each server, receives commands, and modifies the files.
Random aside: Given that these are settings files, you probably also want to restart some service after changes are made.
I'd say option 2 is the easiest. There are even many ways to do that:
Use scp to bring the settings files to a local location, edit them locally, and then push them back (a sketch follows below). If you set up SSH keys, you won't have to bother with passwords.
Netcat is another tool for pushing bytes (aka files) over the network. It's simpler than scp, but with no security measures.
Or, get creative / crazy, and say, "my settings files will all be stored in a git repo. The 'sync' process will be a push / pull setup."
There are simply lots of ways to get this done.
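For the scp route, the round trip might look roughly like this (the host, path, and service name are made up for the example):
# pull the file down, edit it (in the Haxe-built editor or any editor), and push it back
scp admin@server1.example.com:/etc/myapp/settings.yaml ./settings.yaml
$EDITOR ./settings.yaml
scp ./settings.yaml admin@server1.example.com:/etc/myapp/settings.yaml
# optionally restart the service that reads the settings
ssh admin@server1.example.com 'sudo systemctl restart myapp'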

Using git to just monitor changes on a webserver

I am tasked with monitoring the changes made to the source files of a website. I am not developing the website, just watching it. I am a firm believer in using version control, and am a fan of git, but the developer who is actually maintaining the site is not, and I have decided it is better to let him continue to work however he wants (don't ask). I do not want to have to give him any instructions whatsoever (except possibly telling him that I am adding files or directories that he can ignore).
I consider myself an intermediate-level user of git, so I want to run this by an expert or two.
I am thinking I can install git on the (Linux) server, and then ask for status, and do commits, via SSH. Will this work without jeopardizing the normal operation of the web server?
Yes, using Git on a server should not interfere with the normal operation of the server (as mentioned in the comments, doing this on a production server is dodgy, but I'll leave that to one side).
Note that using Git normally will create a .git directory at the root of whatever you're tracking. If that is your web server root directory, you might want to consider whether this is a risk as far as external access to the contents of the .git directory (depending on your server setup, this may or may not be a concern).
If you want to create the .git directory somewhere else outside your working tree, see the GIT_DIR environment variable.
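As a minimal sketch (the paths here are hypothetical), you could keep the repository outside the web root and drive it over SSH like this:
# create a bare repository outside the web root, then point git at it explicitly
git init --bare /home/monitor/site-history.git
alias sitegit='git --git-dir=/home/monitor/site-history.git --work-tree=/var/www/html'
sitegit add -A && sitegit commit -m "initial snapshot"
# later, to see and record what the developer changed:
sitegit status
sitegit add -A && sitegit commit -m "snapshot $(date +%F)"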

Deploy Mercurial Changes to Website Hosting Account

I want to move only the website files changed since the published revision to a hosting account using SSH or FTP. The hosting account is Linux based but does not have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? not that great with bash) since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than one for hg, given that the target doesn't have hg installed.
Something like so:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This transfers all files recursively from /path/to/local/files/ and places them in /remote/path/. The -a flag preserves file attributes (and implies recursion), and -z compresses data during transfer.
rsync takes care of only transferring files that have changed. Be sure to watch the trailing slashes when specifying source paths; they matter (see the rsync man page for details).
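If you want to preview what would be sent before touching the host, rsync's dry-run flag helps:
# -n (--dry-run) lists the files that would be transferred without copying anything
rsync -avzn /path/to/local/files/ remote_host:/remote/path/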

How to do version control via ftp?

I have a web dev. client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via ftp? Locally I have both mac & windows, the staging server is linux, so something that works on any of those platforms....
Using your Linux staging server, you could keep a separate checked-out copy that you use specifically for that host and then use a utility to mirror that directory to the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search turned up a Java application that does this. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and merging into that branch for uploads to the production server.
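If you go the separate-branch route, the workflow is roughly as follows (the repository URL is a placeholder):
# create the production branch once
svn copy https://svn.example.com/repo/trunk https://svn.example.com/repo/branches/production -m "Create production branch"
# later, merge the changes you want to release into a working copy of that branch and commit
svn checkout https://svn.example.com/repo/branches/production prod-wc
cd prod-wc && svn merge https://svn.example.com/repo/trunk && svn commit -m "Release to production"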
You probably need to write a batch file that is able to:
Export the SVN repository
Upload the exported files to your Linux server via FTP
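A rough sketch of such a script, written here as a shell script for the Linux staging box (the repository URL, credentials, and paths are placeholders):
#!/bin/sh
# export a clean copy of the repository (no .svn metadata), then mirror it up to the FTP host
svn export --force https://svn.example.com/repo/trunk /tmp/site-export
lftp -u ftpuser,ftppass ftp.example.com -e "mirror -R --only-newer /tmp/site-export /public_html; quit"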
Short of finding or implementing some FUSE-based copy-on-write file system that supports immutable versions, I'd just find another (more developer-friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here’s what I did:
Set up gitolite on my Ubuntu staging server
created base repo (i.e. foo.git) on staging server
cloned foo.git into working directory on staging server
cloned foo.git into working directory on local development machine
Developed locally
Pushed changes to foo.git repo on staging server
On staging server, logged into working directory, and pulled in changes from foo.git
lftp-ed into shared host (like you mention above)
Once in shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R - this pushes the source/directory to the target/directory (without it, mirror pulls from the target to the source; think reverse).
--only-newer - without this option, even if you only changed one file, the mirror command sends all the files in the source directory over to the target directory. With this option, only the changed (newer) files are transferred over the wire.
--delete - deletes files that are no longer in the source directory but are still in the target directory. One of my pushes involved deleting expired assets; without this option, those files would have stayed put on my shared host after the mirror command.
--parallel=10 - transfers 10 files at once (instead of 1 by default). This made the process much faster.
While this is what worked for me, I'm sure there are ways to improve on it. I was grateful for this question and thought I'd share my experience.
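For reference, the same push can be done non-interactively in one shot (the host, user, and paths below are placeholders):
# log in, mirror the working directory up to the shared host, then quit
lftp -u shelluser,shellpass sftp://shared-host.example.com -e "mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory; quit"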
rsync will also do this kind of incremental transfer, although it runs over SSH or its own protocol rather than plain FTP. You probably already have it installed if you're on a Unix-like system.

What is a good deployment tool for websites on Windows?

I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter: can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks go... you are using source control, right? Just pull the old source out of there. Or, in your batch file, also copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date:
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
For some scenarios I used a freeware product called SyncBack.
It provides complex, multi-step file synchronization (filesystem or FTP etc., compression etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc., and execute commands/programs after the job runs. There is also a job log provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites.
Of course you can create a web deployment package and deploy as an MSI as well.
I have used a combination of CruiseControl.NET, NAnt, and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name, I know), which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will back up your production files before deployment, so rollback should be pretty easy.
I had given up trying to find a good free product that works.
Then I found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can share my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
Try Repliweb. It handles full rollback to previous versions of files. I've used it while working for a client who demanded its use, and I've become a big fan of it, particularly for:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email/logs stating what has changed, what the current version is, etc.
