How to list CACLS (rights) for a folder and log to disk - security

What I want to know is how I can write the ACL (CACLS) rights of a folder recursively (folders and files) out to disk. I want to write two different folders out to two output files and compare them in a tool like WinMerge. I have a website that works when it's set up using some manual steps, but when I publish from the build server something gets changed in the ACLs automatically and the site gets access denied again when you try to browse to it. I know the TFS build server is doing something when it publishes, but I am trying to figure out what is different after it publishes. I made a backup of the good folder that works, so I need to output the ACLs, etc. of each folder and do a text compare. I already went through by eye, checked all the folders and files with the file properties viewer, and looked for missing files or altered web.configs. It's not that. I have looked all over Google and can't find a good solution. Can someone help me?

From PowerShell you could run something like this:
gci -Recurse | % { cacls $_.FullName } > C:\temp\cacls.txt
That should give you something you can compare in a diff tool.
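For example, to capture both trees and diff them (the folder paths here are placeholders for your good backup and the freshly published site):

gci -Recurse C:\inetpub\good-site | % { cacls $_.FullName } > C:\temp\good.txt
gci -Recurse C:\inetpub\published-site | % { cacls $_.FullName } > C:\temp\published.txt
# The two dumps will differ in their root paths, so strip those prefixes (or use a
# WinMerge line filter) before comparing; a quick console diff looks like this:
Compare-Object (Get-Content C:\temp\good.txt) (Get-Content C:\temp\published.txt)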

Related

How persistent is data I put on my Azure WebApp via FTP?

I've been searching around and can't find any clear answers to this. I need a small amount of data - talking kilobytes, probably not ever reaching megabyte range - available as a file on my Azure instance, outside the web app itself, for a web job to work with. I won't get into why this is necessary, but it is (alternatives have been explored), and the question is now where to put those files. The obvious answer seems to be to connect to the FTP, create a directory, plop them there and work with them there.
I did a quick test and I'm able to create a "downloads" directory within the "data" directory, drop some files in it, and work with them there. It works great for this very small, simple need that I have.
How long will that data stay there? Is that directory purged at any point automatically by the servers? Is that directory part of any backups that are maintained? How "safe" is something I manually put outside the wwwroot folder?
It will never be purged. The only folder that can get purged is the %TEMP% folder. All other folders that you have write access to will be persisted forever.
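As a minimal sketch of that (assuming a Windows App Service and a PowerShell WebJob; the folder names just follow the question), the HOME environment variable points at the persisted d:\home share, so anything written under it survives restarts:

# Build a path under the persisted home share (d:\home) rather than under %TEMP%.
$downloads = Join-Path $env:HOME 'data\downloads'
New-Item -ItemType Directory -Path $downloads -Force | Out-Null
Set-Content -Path (Join-Path $downloads 'state.txt') -Value 'this survives restarts'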

Folder permissions in Azure web sites

Just getting my head around the new Azure web sites feature and hitting my first obstacle. I'm deploying a PHP site which writes cache data to the file system, but the app is throwing an error because the folder it wants to write to does not have write permission. Is it possible to set permissions on folders or is this a no-no?
I can probably work round this but would like to know if it's possible.
Folder permissions cannot be set/customized. This means whatever location your app writes to should be under your site root.
Your site can only write to locations under C:\DWASFiles\Sites\[siteName]\VirtualDirectory0 and to the %TEMP% folder.
Two caveats here:
Stuff can't be written directly under VirtualDirectory0; you have to create a subfolder under there and place your files in that subfolder
The %TEMP% folder really is temporary! If your site instance goes down for any reason and is brought back up somewhere else then everything in your %TEMP% folder will be gone. Use it only for files that really are temporary.
Is the folder that the app is trying to write to under the site's folder?
It's my understanding that folder permissions cannot be set/changed. But I haven't seen anything from Microsoft that definitively says "yes" or "no" to that.
It should be possible using webdeploy.
However, I don't think there is a way to do it without manually setting up the Web Deploy package, as described in this post: http://blogs.msdn.com/b/azureappgallery/archive/2013/04/03/set-file-folder-permissions-for-your-content-on-azure-website-using-web-deployable-package.aspx.
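Roughly, what that post describes is adding a setAcl entry to the package's manifest. The paths and the cache folder name below are placeholders, so treat this as a sketch rather than a drop-in file:

<!-- manifest.xml (hypothetical paths): setAcl grants the site's user write access to the cache folder -->
<sitemanifest>
  <contentPath path="C:\build\output\MySite" />
  <setAcl path="C:\build\output\MySite\cache"
          setAclUser="anonymousAuthenticationUser"
          setAclAccess="Write" />
</sitemanifest>

Packaging that with something like msdeploy -verb:sync -source:manifest=manifest.xml -dest:package=site.zip should then produce a package that applies the ACL when the site is published.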

Deploy Mercurial Changes to Website Hosting Account

I want to move only the website files changed since the published revision to a hosting account using SSH or FTP. The hosting account is Linux based but does not have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? I'm not that great with bash) since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than one for hg, given that the target doesn't have hg installed.
Something like so:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This transfers all files recursively from .../local/files/ and places them in /remote/path. The -a flag preserves file attributes (and implies recursion), -z compresses the transfer, and -v makes it verbose.
rsync takes care of only transferring files that have changed. Be sure to watch for trailing slashes when specifying source paths; they matter (see the rsync man page).
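If you really do want to push only the files touched since the published revision (rather than letting rsync compare everything), one rough approach is to feed hg's change list into rsync. This is just a sketch, assuming the last published changeset is tagged "published" and that you run it from the repository root; note it won't remove files that were deleted:

# List files modified or added since the "published" tag and copy only those.
hg status --rev published --no-status --modified --added \
  | rsync -avz --files-from=- . remote_host:/remote/path/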

ftp push changed files to server

Is there an easy way for me to automatically search "recursively" through a directory and put changed files up through ftp to my live server in their correct spots?
CLI is ideally what I'm looking for.
I'm tired of manually searching out the files I need to upload and doing it individually or by queue; I'm trying to make this quick and painless.
If the server is under your control, you might want to try rsync instead of FTP.
rsync is the way to go for keeping directories balanced. +1 for Frederic.
The other way to go is change management, like Subversion. Once you set it up, files checked in over time can be brought to production with a simple "svn up" command.
Subversion: http://www.wandisco.com/subversion/os/downloads
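If SSH isn't available and plain FTP is all the host offers, lftp's mirror mode can do a one-command "upload only what's newer" push from the CLI. A sketch with placeholder credentials and paths:

# Mirror the local site up to the server, sending only files newer than the remote copies.
lftp -u USER,PASSWORD ftp.example.com -e "mirror --reverse --only-newer --verbose ./public_html /public_html; quit"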

Cruisecontrol, deployment, folder permissions

We're using CruiseControl.NET; it builds the version, creates a zip file, and then 15 minutes later unzips the file on the integration server. But when the folder gets to the integration server, the security permissions on one of the folders are often totally hosed. The domain admin and folder owner can't even open the folder in Explorer. After a reboot the folder permissions are good again; we can delete the folder, redeploy the zip file, and it's okay.
Does anyone have any idea what or how the folder permissions are getting so messed up?
Any tools to use to diagnose/watch what exactly is messing it up?
Have you tried using PsExec from Sysinternals to unzip the file on the remote machine rather than on the build machine?
Also, it seems to me that rather than zipping and then unzipping, you could just copy the files directly to the remote server. I'm not seeing the reason to zip it and then just unzip it.
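For the PsExec suggestion, something along these lines would run the extraction on the integration server itself, so the extracted folders pick up their ACLs there. The server name and paths are placeholders, and it assumes the zip has already been copied over and PowerShell 5+ is installed for Expand-Archive:

REM Run the unzip as SYSTEM on the integration server rather than on the build box.
psexec \\INTEGRATION01 -s powershell -Command "Expand-Archive -Path D:\drops\build.zip -DestinationPath D:\sites\myapp -Force"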
