My site is undergoing a security audit to obtain a security certification. After the audit, they gave me two security issues to address.
Stored Cross-Site Scripting: The application must implement server-side validation for all user-entered inputs. Only expected values should be accepted, script tags should be rejected, and all user inputs should be sanitized.
Malicious File Upload
I have added the filter tags in Joomla's Global Configuration text filters. Also, although I have clearly restricted all file upload elements to the .jpg, .jpeg, and .png extensions, I can still upload files with a .php extension.
How can we rectify these two issues?
Regards
Use the defines.php file to clean the POST data before it reaches the Joomla site, and block any request with $_FILES in it.
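A minimal sketch of what that defines.php could do (illustrative only: Joomla's index.php includes defines.php from the site root before the application boots, and a blanket $_FILES block will break any legitimate upload form, so scope it to your needs):

<?php
// defines.php -- loaded by Joomla's index.php before the application starts.
// Strip <script> tags from every POST value and encode what remains.
foreach ($_POST as $key => $value) {
    if (is_string($value)) {
        $value = preg_replace('#<script\b[^>]*>.*?</script>#is', '', $value);
        $_POST[$key] = htmlspecialchars($value, ENT_QUOTES, 'UTF-8');
    }
}
// Reject any request that tries to upload a file.
// WARNING: this blocks *all* uploads; whitelist your own upload URLs if needed.
if (!empty($_FILES)) {
    header('HTTP/1.1 403 Forbidden');
    exit('File uploads are not permitted.');
}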
If your website needs to allow users to upload files, then make sure only specific file types are accepted, and, if the outside world doesn't need access to these files, block access to them (in the folder they are uploaded to) using an .htaccess rule.
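For example, an .htaccess file placed in the upload folder (Apache 2.4 syntax, assuming AllowOverride permits these directives) can keep scripts from ever executing there and refuse to serve anything script-like that slips past validation:

# .htaccess in the uploads folder
# Make sure nothing here is executed as PHP, even if a .php file gets through.
RemoveHandler .php .phtml .php3 .php4 .php5
RemoveType .php .phtml .php3 .php4 .php5
# Refuse to serve script files outright.
<FilesMatch "\.(?i:php\d?|phtml|phar)$">
    Require all denied
</FilesMatch>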
I have recently launched a website on GoDaddy hosting. I keep some images and JavaScript files used on the website in separate folders. I want to prevent users from browsing those images and files by simply appending the folder and file name to the website URL. For example:
www.example.com/images/logo.png
If I understand correctly, you want an HTML page with images, where the images shouldn't be accessible on their own? If so, it cannot be done reliably. You can check for the correct HTTP Referer header, but it can easily be faked, and checking it also makes the images inaccessible to browsers that don't send a referrer or that suppress it for privacy reasons.
If you want the files to be accessible only by server-side scripts or via FTP/SCP, you can try .htaccess (if GoDaddy runs your site on Apache) with the correct configuration: https://httpd.apache.org/docs/2.2/howto/access.html
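For instance, a one-directive .htaccess file in that folder makes it unreachable over HTTP while server-side scripts and FTP/SCP can still read the files (Apache 2.4 syntax; 2.2 uses Order deny,allow plus Deny from all):

# Block all HTTP access to this folder.
Require all denied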
Another way would be to hide the files and generate a one-shot token, like this:
<img src="<?php echo gen_token('file.jpg'); ?>" /> with a separate script serving the hidden files only for a valid generated token, then deleting the token from the database. Nevertheless, this will not stop anybody from downloading or saving these files if they really want to...
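A rough PHP sketch of that token scheme (the download_tokens table and helper name are made up; $db is assumed to be a PDO connection):

<?php
// Used in the page template: returns a URL that works exactly once.
function gen_token(string $file): string {
    global $db;                            // assumed PDO connection
    $token = bin2hex(random_bytes(16));    // 32 hex chars, unguessable
    $db->prepare('INSERT INTO download_tokens (token, file) VALUES (?, ?)')
       ->execute([$token, $file]);
    return 'serve.php?t=' . $token;
}

// serve.php -- streams the hidden file for a valid token, then burns the token.
$token = $_GET['t'] ?? '';
$stmt  = $db->prepare('SELECT file FROM download_tokens WHERE token = ?');
$stmt->execute([$token]);
$file = $stmt->fetchColumn();
if ($file === false) {
    http_response_code(404);
    exit;
}
$db->prepare('DELETE FROM download_tokens WHERE token = ?')->execute([$token]);
header('Content-Type: image/jpeg');                // set per file type in practice
readfile(__DIR__ . '/hidden/' . basename($file));  // basename() blocks ../ tricks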
But anyway, try to clarify your question a bit...
If you are keeping images/files in a folder which is open to the public, I guess you kept them there on purpose: you want the public to access those images and files.
How would the public know the image file names? Disable directory content listing for your website.
I don't know which language you are using on the web server, but in ASP.NET you can write a module/middleware that intercepts incoming requests and, based on your logic (e.g. authentication and authorization), restricts access. All modern frameworks support this kind of functionality.
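The same idea on an Apache/PHP stack, as a sketch (the folder names and session check are assumptions): route every request for the protected folder through a small gatekeeper script.

# .htaccess in the site root
RewriteEngine On
RewriteRule ^images/(.*)$ gatekeeper.php?file=$1 [L]

<?php
// gatekeeper.php -- serves an image only when the visitor is authenticated.
session_start();
if (empty($_SESSION['user_id'])) {       // placeholder authentication check
    http_response_code(403);
    exit('Forbidden');
}
$file = basename($_GET['file'] ?? '');   // basename() blocks ../ path traversal
$path = __DIR__ . '/private_images/' . $file;
if (!is_file($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: ' . mime_content_type($path));
readfile($path);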
Say I have folderX in the public_html folder of my web server.
A) If I rename folderX to something very long and random, is it technically possible for someone to access the files in that folder (other than by brute-forcing the folder name, the chances of which should be slim)?
B) Since there is no link to the files in the renamed folder, or to the folder itself, web crawlers and search engines won't be able to index its content, right?
I understand that this is not a normal way to secure content, and it is recommended to move non-public data above the web server's document root (above public_html), or to password-protect it with .htaccess or the like. But here I am asking what the chances are, whether it is technically possible, and how?
Edit.
I thought about putting the name of the folder in the robots.txt file to make sure it is also excluded from web-crawling bots. But that seems counterproductive! Robots are not obliged to follow robots.txt, and by revealing the folder name in that file, a malicious bot can deliberately go there and crawl it. Am I missing something?
Yes it is.
If the connection is over plain HTTP, then any network sniffer could determine the URL that is being accessed. The solution to this is to implement certificates and TLS so that the URL is served over HTTPS, protecting the path and query string portions.
Even if the connection is HTTPS, many corporate networks decrypt the connection on an outbound proxy because the certificate the proxy server uses is trusted by the client. This may reveal your URL path to network administrators if your URLs are accessed from corporate locations.
If there are any outbound links or external resources on your "hidden" pages, the Referer header will leak the URL of your hidden pages to those sites.
Tools such as Nikto or DirBuster can find common hidden URLs.
Don't use robots.txt for the reasons you describe. However, meta tags can be used to prevent the indexing of HTML pages.
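For example, in the <head> of each hidden page:

<meta name="robots" content="noindex, nofollow">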
Let's say there's a website www.example.com/user/john. Accessing that link takes you to www.example.com/user/john/index.html.
There are files like www.example.com/user/john/picture.png and www.example.com/user/john/document.html. These are also accessible to the public, but there's no link to these from index.html.
Is there a systematic way to find these files? I'm asking because I'm going to set up my website, and I also want to put up a few files that I don't necessarily want everyone to see, only people I give the link to. So I'm wondering how easy or hard it is to find out that those files exist in my directory.
Most importantly, you have to switch off the ability to simply browse the directory listing in the browser. Every server has its own way to switch this off. Then you can use the proposed approach of "security through obscurity".
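On Apache, for instance, that is a one-line .htaccess directive (assuming AllowOverride Options is permitted):

# Disable automatic directory listings for this folder and below.
Options -Indexes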
Another way is to have a specific folder whose access is restricted by HTTP Basic Authentication. This can be configured in an .htaccess file placed in the root of the directory you want to share only with specific people.
Just google ".htaccess" and "basic authentication".
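A typical setup looks like this (the AuthUserFile path is an assumption; point it at an .htpasswd file kept outside the document root and created with the htpasswd utility):

# .htaccess in the protected folder
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/user/.htpasswd
Require valid-user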
HTTP does not provide a directory listing method as such. Most web servers can be set up to provide an HTML-formatted directory listing if an index file such as index.html does not exist. If you provide an index file so that autoindexing does not happen (or if you disable autoindexing in the web server configuration), and if your "hidden" file names are hard to guess, they should be pretty hard to find. I sometimes publish such files in a directory with a random gibberish name.
"Share links" used by Dropbox, Picasa and other services do the same, they just use longer random file/directory names or random parameters in the URL.
To provide real security you'll want to set up https (SSL/TLS) so that any eavesdroppers on the network cannot easily look at the requested URLs, and authentication such as HTTP Basic Authentication with username/password. For less sensitive stuff, http to a random hidden directory will often do fine.
I am stuck in a situation where I want a URL, which points to a folder containing some files (HTML, SWF, etc.), to be accessible only after I have validated the user.
For example, the URL to access is:
A - http://mysite.com/files/version/1/file.swf
And this URL is reached from the link:
B - http://mysite.com/view/1
I have implemented a way to hide URL A from a normal user, but a semi-technical user can still discover the SWF file's location with Firebug or other tools. So what should I do to make access to the file secure?
If a user somehow knows the first URL (A) and enters it in the browser, I have to check whether the user is logged in, and only if validation succeeds should URL A be allowed to load.
Since, in CodeIgniter, a controller cannot have the same name as a folder in the root directory, in this case I cannot have a controller called "files". So the only option left to make this secure access work seems to be an .htaccess rule/condition. If that is the only option, how can it be achieved with .htaccess, and if not, what other options do I have?
Will CodeIgniter's URI routes work? I tried this:
$route['files/version/1/(:any)'] = "view/$1";
but it doesn't work, maybe because there is no controller/function/parameter matching files/version/1...
Looking for quick help. Thanks.
There isn't a sure-fire way to do it without, for example, using .htpasswd.
One thing you could implement is a sort of "security by obscurity". In that case you would redirect all requests for a file to a URL like http://mysite.com/view/file-id and then, instead of serving the requested file directly, load a .php script that sends the appropriate headers, be it for an image, a Flash file, or anything else.
But it really depends on how the files are going to be managed, since every file will need an entry in the database, and you will have to output different headers for different types of files. And if someone still manages to guess the direct path to a file, it will be directly accessible.
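A rough sketch of such a serving script (the session check, database credentials, and table/column names are all assumptions):

<?php
// view.php -- serves a protected file only to logged-in users.
session_start();
if (empty($_SESSION['logged_in'])) {
    http_response_code(403);
    exit('Please log in first.');
}

// Map the public file id to the real path and MIME type.
$id   = (int)($_GET['id'] ?? 0);
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT path, mime FROM files WHERE id = ?');
$stmt->execute([$id]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if (!$row) {
    http_response_code(404);
    exit;
}

header('Content-Type: ' . $row['mime']);  // e.g. application/x-shockwave-flash
readfile('/var/www/protected/' . basename($row['path']));  // kept outside public_html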
I have a folder on my web server used for users to upload photos through an ASP page.
Is it safe enough to give IUSR write permissions to the folder? Do I need to secure anything else?
I am afraid of hackers bypassing the ASP page and uploading content directly to the folder.
I'm using ASP classic and IIS6 on Windows 2003 Server. The upload is through HTTP, not FTP.
Edit: Changed the question for clarity and moved my answers into the comments.
Also, I would recommend not letting users upload into a folder that's accessible from the web. Even the best MIME type detection may fail, and you absolutely don't want users to upload, say, an executable disguised as a JPEG in a case where your MIME sniffing fails but the one in IIS works correctly.
In the PHP world it's even worse, because an attacker could upload a malicious PHP script and later access it via the webserver.
Always, always store the uploaded files in a directory somewhere outside the document root and access them via some accessing script which does additional sanitizing (and at least explicitly sets an image/whatever MIME type).
How will the users upload the photos? If you are writing an ASP page to accept the uploaded files, then only the user account that IIS runs as will need write permission to the folder, since IIS will be doing the file I/O. Your ASP page should check the file size and use some form of authentication to prevent hackers from filling your hard drive.
If you are setting up an FTP server or some other file transfer method, then the answer will be specific to the method you choose.
You'll have to grant write permissions, but you can check the file's type to ensure it's an image. You can use the FileSystemObject (FSO) like so:
Set fs = Server.CreateObject("Scripting.FileSystemObject")
Set f = fs.GetFile(Server.MapPath("upload.jpg"))
' Image types read like "JPEG Image" or "GIF Image", so do a case-insensitive
' check for "image". Note: FSO's Type property comes from the file extension's
' registry description, not from the file's contents.
If InStr(1, f.Type, "image", vbTextCompare) = 0 Then
    f.Delete
End If
Set f = Nothing
Set fs = Nothing
Also, most upload COM objects have a type property that you could check against before writing the file.
Your best bang for the buck would probably be to use an upload component (I've used ASPUpload) that allows you to upload/download files from a folder that isn't accessible from the website.
You'll get some authentication hooks and won't have to worry about someone casually browsing the folder and downloading the files (or uploading in your case), since the files are only available through the component.