Linux: Traverse Server Directory and Build List of Checksums for all Files

I am running a web server with several CMS sites.
To become aware of hacks on my web server, I am looking for a mechanism that lets me detect changed files on my web server.
I am thinking of a tool / script which traverses the directory structure, builds a checksum for each file, and writes out a list of files with file size, last-modified date and checksum.
At the next execution, I would then be able to compare this list with the previous one and detect new or modified files.
Does anyone know a script or tool which can accomplish this?
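Tools such as Tripwire and AIDE exist for exactly this kind of file-integrity monitoring. For reference, a minimal hand-rolled sketch in Python of what is described above (walk the document root, record size, last-modified date and a SHA-256 checksum per file, then diff against the previous run); the paths and the manifest format are only placeholders:

import hashlib
import json
import os
import sys

def build_manifest(root):
    """Walk `root` and return {relative path: {size, mtime, sha256}}."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(65536), b""):
                    digest.update(chunk)
            stat = os.stat(path)
            manifest[os.path.relpath(path, root)] = {
                "size": stat.st_size,
                "mtime": int(stat.st_mtime),
                "sha256": digest.hexdigest(),
            }
    return manifest

def report_changes(old, new):
    """Print files that appeared, disappeared, or changed since the last run."""
    for path in sorted(set(old) | set(new)):
        if path not in old:
            print("NEW      " + path)
        elif path not in new:
            print("DELETED  " + path)
        elif old[path]["sha256"] != new[path]["sha256"]:
            print("MODIFIED " + path)

if __name__ == "__main__":
    # usage: python integrity_check.py /var/www /var/lib/integrity/manifest.json
    root, manifest_file = sys.argv[1], sys.argv[2]
    current = build_manifest(root)
    if os.path.exists(manifest_file):
        with open(manifest_file) as fh:
            report_changes(json.load(fh), current)
    with open(manifest_file, "w") as fh:
        json.dump(current, fh, indent=2)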

Related

Using haxe to edit remote file?

I've searched haxelib for a library I could use to remotely edit a file on a server over an SSH connection with Haxe, or to list the files in a directory.
Has anyone done this with Haxe?
I want to build a desktop app: a YAML editor that will change the settings files of several servers, using a frontend like haxe-ui.
Ok, there are probably a lot of ways you could do it, but I would suggest separating your concerns:
desktop app to create a yaml editor
Ok, that's a fine use case for Haxe / a programming language. Build an editor, check.
change settings files (located on) several servers
Ok, so you have options here. Either
Make the remote files appear as local files via some network file system, or
Copy the files locally, edit them, and copy them back, or
Roll your own network-enabled service that runs on each server, receives commands, and modifies the files.
Random aside: Given that these are settings files, you probably also want to restart some service after changes are made.
I'd say option 2 is the easiest. There are even many ways to do that:
Use scp to bring the settings files to a local location, edit them locally, and then push them back (see the sketch after this list). And if you set up SSH keys, you won't have to bother with passwords.
Netcat is another tool for pushing bytes (aka files) over the network. It's simpler than scp, but with no security measures.
Or, get creative / crazy, and say, "my settings files will all be stored in a git repo. The 'sync' process will be a push / pull setup."
There are simply lots of ways to get this done.
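As a rough illustration of option 2, here is how the pull / edit / push cycle might look. It is shown in Python rather than Haxe purely to keep the sketch short; the host, paths and restart command are hypothetical, and SSH keys are assumed to be set up already:

import subprocess

HOST = "deploy@server.example.com"     # hypothetical server
REMOTE = "/etc/myapp/settings.yaml"    # hypothetical settings file
LOCAL = "settings.yaml"

def pull():
    # Bring the remote settings file to a local location.
    subprocess.run(["scp", HOST + ":" + REMOTE, LOCAL], check=True)

def push():
    # Push the edited file back to the server.
    subprocess.run(["scp", LOCAL, HOST + ":" + REMOTE], check=True)

def restart_service():
    # Per the aside above, settings changes usually need a service restart.
    # The restart command is an assumption; adjust for the real service.
    subprocess.run(["ssh", HOST, "sudo systemctl restart myapp"], check=True)

pull()
# ... let the user edit LOCAL in the YAML editor UI ...
push()
restart_service()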

Standard log locations for a cross platform application

I'm developing a cross-platform desktop application for Mac, Linux and Windows. The application will create a plain-text log file to help with debugging, amongst other things. What are people's recommendations for a sensible place to store the log on each of the platforms?
Here is my guess so far, based on web searches:
Mac: ~/Library/Logs/MY-APP-NAME/system.log
Linux: ~/.MY-APP-NAME/logs/system.log
Windows: %APPDATA%\MY-APP-NAME\logs\system.log
For Linux, the XDG Base Directory Specification is followed by some applications. Log files are not specifically called out as such. You can put them either into a subdirectory of the data directory ($XDG_DATA_HOME or $HOME/.local/share), where they will not be deleted automatically, or into a subdirectory of the cache directory ($XDG_CACHE_HOME or $HOME/.cache). In the latter case, the files could be automatically expired after some time.
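Put together, a small sketch of resolving a per-platform log directory along those lines (Python; APP_NAME is a placeholder, and the Linux branch uses the XDG data-directory option described above):

import os
import sys

APP_NAME = "MY-APP-NAME"  # placeholder

def log_dir():
    home = os.path.expanduser("~")
    if sys.platform == "darwin":                       # Mac
        return os.path.join(home, "Library", "Logs", APP_NAME)
    if sys.platform == "win32":                        # Windows
        return os.path.join(os.environ.get("APPDATA", home), APP_NAME, "logs")
    # Linux / other: XDG data directory, falling back to ~/.local/share
    data_home = os.environ.get("XDG_DATA_HOME",
                               os.path.join(home, ".local", "share"))
    return os.path.join(data_home, APP_NAME, "logs")

print(log_dir())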

JSON must be no more than 1000000 bytes

We have a Jenkins-Chef setup with a QA build project for a client's website. The build gets the code from Bitbucket, and a script uploads the cookbooks from the Chef Client to the Chef Server.
These builds ran fine for a long time. Two days ago the automated and manual builds started failing with the following error (taken from the Jenkins console output):
Updated Environment qa
Uploading example-deployment [0.1.314]
ERROR: Request Entity Too Large
Response: JSON must be no more than 1000000 bytes.
From what I understand, JSON files are supposed to be related to Node.js, which is what the developers use on this web server.
We looked all over the config files for Jenkins, the Chef-Server and the QA server. We couldn't find a way to change this 1MB limit that is causing this error.
We tried changing client_max_body_size, but it didn't work.
We checked the sizes of the JSON files; none of them reach this limit.
Any idea where we can find a solution? Can this limit be changed? Is there anything we can do (Infrastructure wise) or should this be fixed from the developer side?
So first of all, the 1M value is more or less hardcoded; the Chef server is not intended to store large objects.
What happens is that before a cookbook is uploaded, a JSON file with its information is created. As this file will be stored in the DB and indexed, it should not grow too large, to avoid performance problems.
The idea is to upload to the Chef server only what is absolutely necessary: strip version-control directories, any IDE build/project files, etc.
The simplest way to achieve this is with a chefignore file. It has to be created just under the cookbook_path.
Its content is wildcard patterns to ignore while uploading the cookbook, so an example could be:
# Strip Subversion directories
*/.svn/*
# Strip Git directories
*/.git/*
# Ignore Vim backup files
*~
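If it is not obvious what is inflating the cookbook, a quick way to decide what to add to chefignore is to count files per directory, since each file included in the upload adds an entry to the JSON the server stores and indexes. A small helper, sketched here in Python with a placeholder cookbook path:

import os
from collections import Counter

def file_counts(cookbook_path):
    """Count files per top-level directory of the cookbook."""
    counts = Counter()
    for dirpath, _dirs, files in os.walk(cookbook_path):
        rel = os.path.relpath(dirpath, cookbook_path)
        top = "." if rel == "." else rel.split(os.sep)[0]
        counts[top] += len(files)
    for directory, count in counts.most_common():
        print("%6d  %s" % (count, directory))

file_counts("cookbooks/example-deployment")  # placeholder path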

Coldfusion security issue...how to hide directory of files?

So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for view under a specific directory.
What kind of script / countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status, but this doesn't seem to be working. I've looked all over for solutions but can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.
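The pattern in that last paragraph (files stored outside the web root, streamed back only after an access check) is expressed with ColdFusion tags in the answer, but the logic is the same in any language. A minimal sketch in Python, with a hypothetical upload directory, just to show the shape of it:

import os

UPLOAD_ROOT = "/var/app/private_uploads"   # outside the web root (assumption)

def serve_upload(filename, user_is_logged_in):
    """Return (status, body) for a requested uploaded file."""
    if not user_is_logged_in:
        return 403, b"Forbidden"
    # Resolve the requested name and refuse anything escaping UPLOAD_ROOT.
    full_path = os.path.realpath(os.path.join(UPLOAD_ROOT, filename))
    if not full_path.startswith(os.path.realpath(UPLOAD_ROOT) + os.sep):
        return 400, b"Bad path"
    if not os.path.isfile(full_path):
        return 404, b"Not found"
    with open(full_path, "rb") as fh:
        return 200, fh.read()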

Creating a testing environment to test fixes without duplicating entire site

I have a rather large site with hundreds of files and a footprint of many hundreds of MB. I have in place an assignment system that we utilize to work on enhancements and bug fixes. What I'd like to do is set up a system in which each assignment gets pushed to the web server (testing server) in its own "sandbox".
Typically I'd just create a copy of the site under a virtual directory, replace the files affected by the assignment and proceed with testing. The problem here is that we would be making many copies of massive amounts of files.
What I have in my head would be a system where a "master" copy of the site contains all the current files (presumably from source control). From there create a virtual directory for each assignment with symbolic links to all files and folders except for those actually changed for the assignment.
I essentially want to create an integrated build process that will create the virtual directory, pull the symlinks from master, and then replace the links for the files that changed with the actual changed versions from the assignment.
Is this a possibility with Windows Server 2003 and IIS?
Probably possible, but sounds like a nightmare. Must all of these files be copied to each testing site? If they are merely content files (htm, gif, jpeg, etc), leverage VDIRS to a common location. It can even be a network location.
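For what it's worth, the link-farm step described in the question is mechanically simple. A rough sketch in Python (the paths are placeholders; creating symlinks needs appropriate privileges, and Windows Server 2003 in particular has no native file symlinks, only directory junctions, so treat this purely as an illustration of the idea):

import os
import shutil

def build_sandbox(master, assignment, sandbox, changed):
    """Link every file from `master` into `sandbox`, except the relative paths
    in `changed`, which are copied from the `assignment` working copy instead."""
    master = os.path.abspath(master)
    for dirpath, _dirs, files in os.walk(master):
        rel = os.path.relpath(dirpath, master)
        out_dir = sandbox if rel == "." else os.path.join(sandbox, rel)
        os.makedirs(out_dir, exist_ok=True)
        for name in files:
            rel_file = os.path.normpath(os.path.join(rel, name))
            dest = os.path.join(out_dir, name)
            if rel_file in changed:
                shutil.copy2(os.path.join(assignment, rel_file), dest)
            else:
                os.symlink(os.path.join(dirpath, name), dest)

# Hypothetical usage: only css/site.css comes from the assignment.
# build_sandbox(r"C:\sites\master", r"C:\work\assignment-123",
#               r"C:\sites\sandbox-123", {os.path.normpath("css/site.css")})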
