How can I change the default temp directory used by Release Management Server 2013 Release 4

Release Management Server defaults to the system drive, in the current user's local temp directory. I was scanning through its various configuration files but could not figure out whether you can repoint its working directory to another drive. The builds are eating up space on my C drive. Is there any way to repoint it to another drive?

Sure, this is actually really easy: Change the %TEMP% environment variable for the deployer user.
That said, the defaults for temp file retention are a bit crazy: 7 days!
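For example, if you can open a command prompt as the account the deployer service runs under, something along these lines repoints its user-level temp directory (D:\RMTemp is just an illustrative location on a drive with more space):
mkdir D:\RMTemp
setx TEMP "D:\RMTemp"
setx TMP "D:\RMTemp"
Restart the deployment agent service afterwards so it picks up the new values.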

Release Management Server is mostly a database that can be put wherever you like using standard SQL Server techniques for moving databases. Pretty sure it doesn't have a working directory - or are you doing something exotic?
The RM Deployment Agent receives builds on target nodes at C:\Users\$AccountNameUsedToRunDeployerService$\AppData\Local\Temp\RM\T\RM. I haven't seen anywhere to change this, but you can configure cleanup settings from the RM Client under Administration > Settings > Deployer Settings, and choosing something more aggressive may help.

Related

NT Authority/System can't see protected OS files

The Question:
Why can't the LocalSystem account (NT Authority\System) see files in the Recycle Bins or the Temporary Internet Files directory?
Background:
I created a scheduled task that runs under the System account. The purpose of the task is to execute the Disk Cleanup utility with predefined settings (for example: cleanmgr.exe /sagerun:1). When it executes, it seems to run with no errors, but when I check the resources it's supposed to clean (Temporary Internet Files, Recycle Bin, etc.), they're still there.
So I thought cleaning up the two resources manually might work. I developed a console application in C# that clears the Recycle Bin and the Temporary Internet Files. I tested it and it works just fine. But again, when I run it as a scheduled task under the System account, I hit the same issue.
Going by the log, it looks like when the application runs under the System account it sees no files in the Recycle Bin or the Temporary Internet Files directory.
Checking the Security tab for the Temporary Internet Files directory shows System as having full access to that directory.
I'm so puzzled by this issue. I may be missing something but I assumed the LocalSystem account has the highest privilege on a machine. Is that not the case?
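For reference, a task like the one described could be registered from the command line along these lines (the task name and schedule are illustrative):
schtasks /create /tn "DiskCleanup" /tr "cleanmgr.exe /sagerun:1" /sc daily /st 02:00 /ru SYSTEM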

Eclipse workspace as project's system root reference

Is there a way to set up a project in Eclipse so that if my code has a reference to the system root directory then it will point to my workspace instead? (I am not seeing anything in the Run Configurations that would help me with this.) Something like the equivalent of making a sym link / that points to my workspace directory.
I'm working on a Perl project that has absolute references to the hosting Linux file system as it would exist in the production environment. Those directories don't exist in my development Eclipse environment. My workspace is located on NFS space mounted on a cluster of servers that run Eclipse, which I access from my laptop via client software.
So root can be any server's local space within the cluster, I don't have access to anything above the workspace, and I can't create the directory structures I need. I would rather not hard-code alternate directory paths to accommodate differences between the sandbox and production environments and then have to comment them out when deploying to production.
I'm not finding a straightforward answer online. Maybe I'm not articulating the question correctly and help with that would also be appreciated if that is the case.
No. Good practice is to have paths like that configurable at runtime, usually via an environment variable or command line argument, specifically to accommodate changes between development, sandbox, and production environments.
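For example (the variable name and paths here are hypothetical), the application could read its base path from an environment variable such as APP_ROOT and fall back to / when it is unset:
# Development: point the application at the workspace checkout.
APP_ROOT="$HOME/workspace/myproject" perl bin/app.pl
# Production: leave APP_ROOT unset so the code falls back to the real root.
perl bin/app.pl
Inside the Perl code the directories would then be built from $ENV{APP_ROOT} (with a default of /) rather than from hard-coded absolute paths.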

Deploy Mercurial Changes to Website Hosting Account

I want to move only the website files changed since the published revision to a hosting account using SSH or FTP. The hosting account is Linux based but does not have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? I'm not that great with bash) since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than for hg, given that the target doesn't have hg installed.
Something like so:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This transfers all files, recursively, from .../local/files/ and places them in /remote/path/. The -a flag preserves file attributes (and implies recursive transfer), and -z compresses data during transfer.
rsync takes care of only transferring files that have changed. Be sure to watch for trailing slashes when specifying source paths; they matter (see the rsync documentation).
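If you specifically want to restrict the transfer to files changed since the published revision, one possible sketch is to feed hg's file list into rsync (run from the repository root; PUBLISHED is a placeholder for that revision, and files removed since then would still need separate handling):
hg status --rev PUBLISHED --rev tip --modified --added --no-status \
  | rsync -avz --files-from=- . remote_host:/remote/path/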

How to use IIS app_offline.htm file with Azure

I have a brilliantly designed app_offline.htm file that I'd like to display on my site periodically when I'm doing things like backing up the DB. On a server with a real file system, this wouldn't be a problem: I'd just copy app_offline.htm to my app's root, and IIS would work its magic and redirect all requests to this file.
However, I'm using Azure, so there's no real file system and there's no easy way to move files around from one location to another.
How can I make app_offline.htm play nicely with Azure?
I figured I'd add this since I haven't seen it mentioned yet. You can actually do this via web publish from Visual Studio (or WebMatrix) as well: just put app_offline.htm in the root of your project, at the same level as your main web.config. When done, rename it and redeploy to go back online. Two clicks - easy.
The manual option is to drop it into your /site/wwwroot via FTP.
A little personal secret: while app_offline.htm is in place, none of your site files will be accessible (style sheets, etc.), so put your includes into an Azure blob container, and voilà.
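If you prefer to script the manual FTP drop mentioned above, curl can do the upload; the host name and credentials below are placeholders for your site's FTP endpoint:
curl -T app_offline.htm "ftp://deployuser:password@ftp.example.com/site/wwwroot/"
Delete or rename the file from your FTP client afterwards to bring the site back online.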
Actually there is a real file system, as each VM instance runs on Windows 2008 Server (SP2 or R2 SP1). To see this for yourself, enable Remote Desktop for your deployment and connect to a running instance.
Knowing this, you should be able to set up a mechanism to perform a file-copy of your app_offline.htm to your app root based on some type of administrative command. You'll just need to make sure each of your web role instances perform this action.
David has provided you with a good answer. However, you might be missing out on what Azure can do for you. You should be able to virtually eliminate downtime with Azure by running multiple instances and using SQL Azure, which is triple backed up for you. You can also back up SQL Azure using http://msdn.microsoft.com/en-us/library/ff951624.aspx

How to do version control via ftp?

I have a web dev. client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via FTP? Locally I have both Mac and Windows, the staging server is Linux, so something that works on any of those platforms....
Using your Linux staging server you could keep a separate checked out copy that you use specifically for that host and then use a utility to mirror that directory with the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
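For example, a one-liner along those lines (credentials, host, and paths are placeholders) would reverse-mirror your checked-out copy up to the host:
lftp -u ftpuser,ftppass -e "mirror -R --only-newer /path/to/checkout /public_html; quit" ftp.example.com
The -R flag reverses the mirror so the local directory is the source and the remote directory is the target.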
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search turned up a Java application for this. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and doing merges to that branch for uploading to the production server.
You probably need to write a batch file (sketched below) that is able to:
Export the SVN repository
Upload the exported files to your Linux server via FTP
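A minimal sketch of such a script (a shell flavour, since the staging server is Linux); the repository URL, FTP credentials, and paths are placeholders:
# Export a clean copy of the repository (no .svn metadata).
svn export --force http://svn.example.com/repo/trunk /tmp/site-export
# Push the exported tree to the host over FTP (lftp's reverse mirror).
lftp -u ftpuser,ftppass -e "mirror -R --only-newer /tmp/site-export /public_html; quit" ftp.example.com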
Short of finding / implementing some FUSE based CoW file system that supports immutable versions .. I'd just find another (more developer friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here’s what I did:
Set up gitolite on my Ubuntu staging server
created base repo (i.e. foo.git) on staging server
cloned foo.git into working directory on staging server
cloned foo.git into working directory on local development machine
Developed locally
Pushed changes to foo.git repo on staging server
On staging server, logged into working directory, and pulled in changes from foo.git
lftp-ed into shared host (like you mention above)
Once in shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R - this pushes the source/directory to the target/directory. (mirror pulls in from target to source without this, think reverse)
--only-newer - without this option, even if you only changed one file, the mirror command will send all the files in the source directory over to the target directory. With this option, only the changed (newer) files are transferred over the wire.
--delete - deletes files that are no longer in the source directory but still in the target directory. One of my pushes involved deleting expired assets; without this option, the same files would have stayed put on my shared host after executing the mirror command.
--parallel=10 - transfers 10 files at once (instead of 1 by default). This made the process much faster.
While this is what worked for me, I'm sure there are ways to improve on this. I was grateful for this question and thought I'd share my experience.
Rsync will also do this, although it needs SSH access (or an rsync daemon) on the remote end rather than plain FTP. You probably already have it installed if you're on a Unix-like system.
