We created an OpenGrok server and indexed our sources. The problem is that the SCM we use (Perforce here, but I guess this doesn't apply only to Perforce, since file/folder-level permissions are widely used) restricts access per file or folder, and OpenGrok doesn't!
So today, any user performing a search with OpenGrok will retrieve all files, even the ones they should not have access to! Which is, in my opinion, a blocker: we will never release such a security breach into production.
Do you know of any setup/workaround to implement such security?
EDIT: this OpenGrok instance should be usable by anyone. I could of course sync the Perforce sources with my own permissions (so I would only get the files I have access to) and search those, but that would not meet the requirement of a widespread audience.
https://github.com/OpenGrok/OpenGrok/issues/503
Feel free to join the debate there (or in similar requests), and possibly send patches.
Did you consider creating a separate p4 user just for syncing your source code for OpenGrok indexing? That user can have limited access based on the entries in the Perforce protections table. That way you can sync the code at the folder level while hiding subfolders based on the protections table permissions.
I have done a similar setup for my OpenGrok instance :)
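As a minimal sketch (the user name opengrok-sync and the depot paths are hypothetical), the protections entries could look like this; in Perforce, an exclusion line removes all access granted by earlier lines, so the indexer simply never sees the hidden paths:

read user opengrok-sync * //depot/...
list user opengrok-sync * -//depot/project/secret/...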
Well, I finally found a workaround:
locate your Tomcat server XML config file (mine is located in .../apache-tomcat-8.0.52/conf/server.xml)
add the following element under Server > Service > Engine > Host:
<Valve className="org.apache.catalina.valves.RemoteAddrValve" allow="<list of allowed IPs>" deny="" />
I have a daily script that generates this list of IPs from the allowed workstations and updates the file accordingly. The list looks like "1.2.3.4|5.6.7.8|6.2.5.3".
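Note that in Tomcat 8 the allow attribute of RemoteAddrValve is a regular expression, so it is safer to escape the literal dots. With the example list above, the generated element would look roughly like this (the addresses are, of course, illustrative):

<Valve className="org.apache.catalina.valves.RemoteAddrValve" allow="1\.2\.3\.4|5\.6\.7\.8|6\.2\.5\.3" />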
Related
I've started working on an Azure project. In terms of config, I currently have three files: ServiceConfiguration.Cloud.cscfg, ServiceConfiguration.Local.cscfg and ServiceDefinition.csdef.
ServiceDefinition.csdef is the template file for the cscfg files. ServiceConfiguration.Cloud.cscfg contains all the actual Azure configuration, including DB passwords, SAS keys etc.
Should ServiceConfiguration.Cloud.cscfg be checked into source control? I wouldn't have thought so, but a quick search on GitHub for the file shows that it is.
If it should be checked in, how should the sensitive password data be managed?
I typically check in the configurations. The reason is that the behavior of your application will change dramatically depending on them. For example, the number of role instances for a distributed application directly affects how you process incoming messages, and the VM size directly affects how much memory you have. You may encounter issues debugging problems if each developer is using a different configuration. Checking them in standardizes your deployment.
Anything with plain-text password information shouldn't be checked into a public repo unless you want people to have access to that information.
You can add this file to your .gitignore file to prevent it from being checked in.
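For instance, an entry like this in .gitignore (adjust the path to your project layout) keeps the live configuration out of the repository:

# live Azure configuration contains secrets; never commit it
ServiceConfiguration.Cloud.cscfg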
Provide a different ServiceConfiguration.Cloud.cscfg, named something like ServiceConfiguration.Cloud.cscfg.template, with all the config info of your cloud service minus the password values (see the sketch below). Anyone who forks your project then fills in the appropriate values and renames the file.
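A minimal sketch of such a template, with hypothetical service, role and setting names, and the secrets left as placeholders:

<?xml version="1.0" encoding="utf-8"?>
<!-- ServiceConfiguration.Cloud.cscfg.template: copy, fill in the values, rename -->
<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="2" />
    <ConfigurationSettings>
      <Setting name="DbPassword" value="FILL_ME_IN" />
      <Setting name="StorageSasKey" value="FILL_ME_IN" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>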
Do this, and change all your passwords to something else. Even if you delete the file from the repo, it still exists in the history, and anyone can view it.
What are the best steps to take to prevent bugs and/or data loss when moving servers?
EDIT: Solved, but I should specify that I mean a typical shared-hosting environment, e.g. DreamHost or GoDaddy.
Bootstrap config is the smartest method (Newism has a free bootstrap config module). I think it works best on fresh installs myself, but YMMV.
If you've been given an existing EE system and need to move it, there are a few simple tools that can help:
REElocate: all the EE 2.x path and config options in one place. Swap one URL for another in setup, check what's being set, and push the button.
Greenery: Again, one module to rule them all. I've not used this but it's got a good rating.
So install, set permissions, move the files and DB, and then use either free module. If you find that not all the images or CSS instantly come back online, check your template base paths (in template preferences) and permissions.
I'm also presuming you have access to the old DB. If not, and you can't add something simple like phpMyAdmin to back it up, try:
Backup Pro(ish): a free backup module for files and DB. Easy enough that you should introduce it to the site users (most never consider backups). All done through the EE CP. The zipped output can easily be moved to the new server.
The EE User Guide offers a reasonably extensive guide to Moving ExpressionEngine to Another Server, and if you follow all of these steps you will have everything you need to try again should any bugs or data loss occur.
Verify Server Compatibility
Synchronize Templates
Back-up Database and Files
Prepare the New Database
Copy Files and Folders
Verify File Permissions
Update database.php (see the sketch after this list)
Verify index.php and admin.php
Log In and Update Paths
Clear Caches
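For the "Update database.php" step, EE 2.x keeps its connection settings in system/expressionengine/config/database.php; a sketch with hypothetical credentials for the new server:

<?php
// system/expressionengine/config/database.php (EE 2.x)
$active_group = 'expressionengine';
$db['expressionengine']['hostname'] = 'localhost';       // new DB host
$db['expressionengine']['username'] = 'new_db_user';     // hypothetical
$db['expressionengine']['password'] = 'new_db_password'; // hypothetical
$db['expressionengine']['database'] = 'new_db_name';     // hypothetical
$db['expressionengine']['dbdriver'] = 'mysql';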
As suggested by Bitmanic, a dynamic config.php file helps tremendously with moving environments; a rough sketch follows. Check out Leevi Graham's Config Bootstrap for a quick and simple solution. This is helpful for dev/staging/prod environments too!
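The idea, in its simplest form (the hostnames and file names below are hypothetical; Leevi Graham's module is more complete):

<?php
// config.php bootstrap: load environment-specific overrides by hostname
switch ($_SERVER['HTTP_HOST']) {
    case 'dev.example.com':     $env = 'dev';     break;
    case 'staging.example.com': $env = 'staging'; break;
    default:                    $env = 'prod';
}
require dirname(__FILE__) . '/config.' . $env . '.php';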
I'd say the answer is the same as for any other system: export your entire database and download all of your files (both system files and anything uploaded by users, such as images). Then mirror this process by importing/uploading to the new server.
Before I run my export, I like to use the Deeploy Helper module to change all of my file paths in EE to the new server's settings.
Preventing data loss primarily revolves around the database and upload directories.
Does your website allow users to interact with the database? If so, at some point you'll need to turn off EE to prevent DB changes. If not, you don't have too much to worry about, as you can track any changes on the database end between the old and new servers.
Both Philip and Derek offer good advice for migrating EE. I've also found that having a bootstrap config file helps tremendously, especially since you can configure your file upload directories directly via config values now (as of EE 2.4, I think).
For related information, please check out the answers to this similar Stack Overflow question.
I have SquirrelMail installed and configured on my Linux server, which I use to send and receive emails.
Now I have to format the Linux server... so before formatting, how can I back up my emails and configuration so that they can be used again?
Backing up your email messages is not a SquirrelMail issue. SquirrelMail is an IMAP client and does not store email itself. You need to determine what kind of storage is used for your particular email system. If it's a very simple/default *nix email setup, you might start by looking in /home/ for a directory with a name indicative of its purpose, such as "Mail", "Maildir" or similar. You might also look in /var/mail or /var/spool/mail.
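A quick way to check the usual locations (treat these paths as a starting point, since setups vary):

ls /var/mail /var/spool/mail
ls -d /home/*/Maildir /home/*/Mail 2>/dev/null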
There is some starter information on some ways to migrate email between servers here: http://squirrelmail.org/docs/admin/admin-11.html#ss11.2
Also, you might want to re-think why you need to format the whole system. *nix systems don't need to be treated like Windows systems do. They can usually be rearranged, expanded, tweaked and otherwise changed without the need for reformatting.
As for SquirrelMail itself, there are a few things you may want to back up:
- any configuration files for SquirrelMail itself (in its "config" directory);
- any plugins you'd installed (you can usually just copy the entire plugin directory for most typical plugins and transport them to the new system with minimal hassle);
- any custom themes you may have had in the "themes" directory;
- finally, all user preferences. The location of your user preferences depends on your configuration: they might be in a database, or wherever the "$data_dir" setting points to (find this by looking in "config/config.php" or by using SquirrelMail's configuration tool, "config/conf.pl"). If your user preferences are stored in a directory, you can normally just copy the whole directory. Note that even if you have SquirrelMail configured to keep user preferences in a database, some plugins will still use the data directory setting for some purposes, so it's advisable to back up that directory no matter what.
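Assuming a typical package layout (the paths below are examples; adjust them to your install), a single tarball can capture all of the above:

tar czf squirrelmail-backup.tar.gz \
    /usr/share/squirrelmail/config \
    /usr/share/squirrelmail/plugins \
    /usr/share/squirrelmail/themes \
    /var/local/squirrelmail/data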
The SquirrelMail wiki page detailing upgrades covers the same things you need: http://squirrelmail.org/docs/admin/admin-4.html
When backing up and migrating things like this between servers, you need to be very careful about file/directory ownership/permissions on both your email data and your application configuration and preferences data. If user and system account names and UIDs are not the same between the servers, you'll need to adjust the ownership of the files to suit the destination server.
I have Mercurial set up by following these instructions.
I'm trying to understand where, or in what file, to set up the users. Everything I've read seems kind of cryptic... it gives all these snippets of code saying "use this", but it seems to leave out the steps of how it's all connected and what file to put the snippets of code in... can someone please de-mystify all this for the ID10T#TheKeyboard?
Keep in mind that the basic model of Mercurial cannot actually prevent anybody from checking something in. The only thing it can do is prevent those users from uploading something to your copy of the repository.
IIS can be set up with authentication so that Mercurial knows which user is doing the uploading, and so only certain users are even allowed to try to upload. If all you care about is limiting who has commit access to your repository, you can stop right here. But if you want something finer-grained, I think you are currently out of luck.
But if it ever ends up working with web-server authentication, you'll have to use the ACL extension if you want finer-grained access control than simply who's allowed to send changesets to your repository.
When changes are sent over a network, the ACL extension works as a pre-transaction hook on changegroups (sets of Mercurial revisions). It can look through these changegroups to make sure all the changes satisfy a given set of criteria, and a wide variety of criteria can be specified.
The ACL extension can be configured either in the global hgrc file, in which case it applies to all repositories, or the .hg/hgrc file of the repository you want to control access to. In my opinion the global option isn't terribly useful.
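A minimal per-repository example (the path glob and user name are hypothetical), placed in .hg/hgrc:

[extensions]
acl =

[hooks]
pretxnchangegroup.acl = python:hgext.acl.hook

[acl]
# check changes arriving over the network
sources = serve pull push

[acl.allow]
# everyone may touch everything...
** = *

[acl.deny]
# ...except frank, who may not touch secret/
secret/** = frank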
Check out the "Securing Mercurial" section here:
http://win1337ist.wordpress.com/tag/mercurial-iis7/
Also see this related question that has a lot of good info:
How to setup Mercurial and hgwebdir on IIS?
For example, some parts of the client spec map files from a 'Published' depot. Is there a way for these so-called published mappings to be forced read-only, i.e. so they cannot be opened for edit?
For example (ignoring [not editable]):
//Development/foo/... //client/foo/...
//Published/bar/1.0/... //client/bar/... [not editable]
//Published/qux/2.0/... //client/qux/... [not editable]
In other words, I want to prevent the files from being opened (say, have them locked by default), and I am wondering if this can be enforced at the client spec level.
If not, is there a way available without making the 'Published' depot read-only to certain users?
The only way to do this properly is via the Perforce protections table. You get to it either with the p4 protect command or via the Admin menu in P4V.
Just open it up (you need admin rights) and add a line to the table marking those files as read-only. That will allow your clients to sync the files but not open them for edit (or delete, etc.).
You have ultimate control in the protections table. You could also refine this to allow just a subset of users to modify the files, while everyone else sees them as read-only.
The Perforce admin guide is pretty good on the protections table. Direct link here.
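As a sketch (the group and depot names are illustrative): since later matching lines in the protections table take precedence over earlier ones, you can grant write access broadly and then downgrade the Published depot to read, which keeps it syncable but not editable:

write group dev-team * //...
read group dev-team * //Published/...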
You can use p4 lock, which prohibits others from submitting changes.
You can also manage user access; see p4 protect and p4 group. I believe this can be used on a file-by-file basis.
You could create a dummy client and lock those files on the dummy client. Then, just don't let anyone use that client, e.g. by putting it on a server.