I have recently launched a website on GoDaddy hosting. I have kept some images and JavaScript files used in the website in separate folders. I want to prevent users from viewing those images and files by simply appending the folder and file name to the website URL. For example:
www.example.com/images/logo.png
If I understand correctly, you want an HTML page with images whose files shouldn't be accessible on their own? If so, it cannot be done reliably. You can check for the correct HTTP Referer header, but it can easily be faked, and it also makes the files inaccessible to browsers that don't send the referrer or have it disabled for "privacy" reasons.
If you want the files to be accessible only by server-side scripts or via FTP/SCP, then you can try .htaccess (if GoDaddy runs on Apache) with the correct configuration: https://httpd.apache.org/docs/2.2/howto/access.html
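For instance, a minimal .htaccess sketch (Apache 2.2 syntax, matching the linked documentation) placed inside the folder you want to protect might look like this:

    # .htaccess inside the protected folder (Apache 2.2 syntax)
    # Deny all direct HTTP access; server-side scripts and FTP/SCP still work.
    Order deny,allow
    Deny from all

On Apache 2.4 the equivalent directive is Require all denied.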
Another way could be hiding those files and creating a one-shot token, like this:
<img src=<?pseudocode GEN_TOKEN("file.jpg") ?> /> with another script serving those hidden files only for a valid generated token, then deleting the token from the DB. Nevertheless, this will not prevent anybody from downloading or accessing these files if they want to...
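A minimal PHP sketch of that idea might look like the following; the token table, file locations, and script names are all illustrative assumptions:

    <?php
    // page.php - generates a one-shot token and embeds it in the page.
    // Assumes a SQLite "tokens" table; all names here are illustrative.
    $db = new PDO('sqlite:/path/outside/webroot/tokens.db');
    $db->exec('CREATE TABLE IF NOT EXISTS tokens (token TEXT PRIMARY KEY, path TEXT)');

    function gen_token(PDO $db, string $path): string {
        $token = bin2hex(random_bytes(32));  // unguessable random token
        $db->prepare('INSERT INTO tokens (token, path) VALUES (?, ?)')
           ->execute([$token, $path]);
        return $token;
    }
    ?>
    <img src="serve.php?t=<?= gen_token($db, 'file.jpg') ?>" />

And the serving script, which deletes the token after the first use:

    <?php
    // serve.php - serves the hidden file once, then invalidates the token.
    $db = new PDO('sqlite:/path/outside/webroot/tokens.db');
    $token = $_GET['t'] ?? '';
    $stmt = $db->prepare('SELECT path FROM tokens WHERE token = ?');
    $stmt->execute([$token]);
    $path = $stmt->fetchColumn();
    if ($path === false) { http_response_code(404); exit; }

    $db->prepare('DELETE FROM tokens WHERE token = ?')->execute([$token]);  // one-shot

    header('Content-Type: image/jpeg');
    // Hidden files live outside the web root; basename() blocks path traversal.
    readfile('/path/outside/webroot/hidden/' . basename($path));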
But, anyway, try to clarify your question...
If you are keeping images/files in a folder which is open to the public, I guess you kept them in that folder on purpose: you want the public to access those images and files.
How would the public know the image file names? Disable directory content listing for your website.
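On Apache, for example, directory listing can be switched off with a one-line .htaccess rule (assuming your host allows overrides):

    # .htaccess - stop the server from listing this folder's contents
    Options -Indexes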
I am not aware of which language you are using on the web server, but in ASP.NET you may write a module/middleware which can intercept incoming requests and, based on your logic (e.g. authentication and authorization), restrict access. All modern languages support this kind of functionality.
I'm new to web development and I want to ask why some websites have the "/".
For example https://www.roblox.com/home; notice the "/home". What is that called?
I have tried to search on Google and I can't find the answer.
And some websites have things like "/login.php" or "/index.html". Can it also be HTML?
These are URLs (https://en.wikipedia.org/wiki/URL) and they identify the resource you are trying to reach. I would suggest reading more about how web pages work to get a better general overview of things (e.g. https://developer.mozilla.org/en-US/docs/Learn/Getting_started_with_the_web/How_the_Web_works).
How these resources are actually interpreted depends on the server-side implementation:
.php files are usually processed by a PHP-enabled web server
Other static files such as images (*.png, *.jpg, etc.), HTML files, SVGs, CSS, JS, etc. are usually located on the server's disk by the web server (httpd, Tomcat, IIS, Node.js, and many others) and transmitted to the client as-is
When using online tools to build websites, these complexities are usually abstracted away, and in the end a URL is just a resource identifier:
[domain]/[section]/[page(.html|.php)|resource(.js|.css)]
domain: the address of the website
section: a way to navigate inside the website itself
page: the user interface, which might be rendered server side or client side, holding the controls shown to the user
resource: files that change how the content in the pages looks and behaves
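For instance, in a made-up URL like https://www.example.com/user/login.php, the domain is www.example.com, the section is user, and the page is login.php.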
My site is under a security audit to obtain a security certification. After the audit, they gave me two security issues to look at.
Stored Cross Site Scripting: The application must implement server side validation for all user-entered inputs. Only expected values should be accepted. Script tags should be rejected. All user inputs should be sanitized.
Malicious File Upload
I have added the filter tags in the Joomla Global Configuration text filters. And although I have clearly restricted all file upload elements to the .jpg, .jpeg, and .png extensions, I can still upload files with a .php extension.
How can we rectify these two issues?
Regards
Use the defines.php file to clean the POST data before it reaches the Joomla site, and block any request with $_FILES in it.
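A minimal sketch of such a defines.php might look like this (Joomla's index.php includes the file early in its bootstrap if it exists; the filtering rules below are illustrative, not a complete defense):

    <?php
    // defines.php - loaded by Joomla's index.php before the framework boots.
    // Illustrative hardening: strip script tags from POST input and
    // reject any request that carries an uploaded file.
    foreach ($_POST as $key => $value) {
        if (is_string($value)) {
            // Drop <script> blocks, then remove any remaining tags.
            $value = preg_replace('#<script\b[^>]*>.*?</script>#is', '', $value);
            $_POST[$key] = strip_tags($value);
        }
    }

    if (!empty($_FILES)) {
        http_response_code(403);
        exit('File uploads are blocked at this entry point.');
    }

If legitimate image uploads must keep working, restrict the $_FILES check to unexpected entry points or extensions instead of blocking every upload.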
If your website needs to allow users to upload files, then make sure that these files are restricted to specific file types, and, if you don't need the external world to have access to these files, block access to them (in the folder they are uploaded to) using an .htaccess rule.
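For example, an .htaccess file in the upload folder (assuming Apache; the folder location is yours to choose) could either deny all direct access or at least refuse to serve anything with a .php extension:

    # .htaccess in the upload folder (Apache)
    # Option 1: no direct web access at all
    Order deny,allow
    Deny from all

    # Option 2: keep downloads working but never serve .php files
    # <FilesMatch "\.php$">
    #     Deny from all
    # </FilesMatch>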
Say I have folderX in the public_html folder of my web server.
A) If I rename folderX to something very long and random, is it technically possible for someone to access the files in that folder (other than by brute-forcing the folder name, which should have slim chances)?
B) Since there is no link to the files in the renamed folder, or to the folder itself, web crawlers and search engines won't be able to index its content, right?
I understand that this is not a normal way to secure content, and that it is recommended to move non-public data above the web-server root (above public_html), or to password-protect it with .htaccess or similar. But here I am asking what the chances are, whether it is technically possible, and how?
Edit.
I thought about putting the name of the folder in the robots.txt file to also make sure it is excluded by web-crawling bots. But it seems counterproductive! The robots.txt file is not obligatory for robots to follow, and by revealing the name of the folder in that file, a malicious bot can intentionally go there and crawl it. Am I missing something?
Yes, it is.
If the connection is over plain HTTP, then any network sniffer could determine the URL that is being accessed. The solution to this is to implement certificates and TLS so that the URL is served over HTTPS, protecting the path and query string portions.
Even if the connection is HTTPS, many corporate networks decrypt the connection on an outbound proxy because the certificate the proxy server uses is trusted by the client. This may reveal your URL path to network administrators if your URLs are accessed from corporate locations.
If there are any outbound links or external resources on your "hidden" pages, the Referer header will leak the URL of your hidden pages to those sites.
Tools such as Nikto or DirBuster can find common hidden URLs.
Don't use robots.txt, for the reasons you describe. However, meta tags can be used to prevent the indexing of HTML pages.
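For example, adding the standard robots meta tag to a page's <head> asks crawlers not to index it, without advertising the URL in a central file:

    <meta name="robots" content="noindex, nofollow">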
Let's say there's a website www.example.com/user/john. Accessing that link takes you to www.example.com/user/john/index.html.
There are files like www.example.com/user/john/picture.png and www.example.com/user/john/document.html. These are also accessible to the public, but there's no link to them from index.html.
Is there a systematic way to find these files? I'm asking because I'm going to set up my website, and I also want to put up a few files that I don't necessarily want everyone to see, only people I give the link to. So I'm wondering how easy or hard it is to find out that those files exist in my directory.
Most importantly, you have to switch off the possibility of simply browsing the directory with the browser. Every server has its own way to switch this off. Then you can use the proposed way of "security through obscurity".
Another way is to have a specific folder whose access is restricted by HTTP basic authentication. This can be configured in the .htaccess file which you put in the root of the directory you want to share only with specific people.
Just google ".htaccess" and "basic authentication".
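A typical sketch looks like this (the .htpasswd path is an assumption; create the file with Apache's htpasswd utility and keep it above public_html):

    # .htaccess in the restricted folder
    AuthType Basic
    AuthName "Restricted area"
    AuthUserFile /home/youraccount/.htpasswd
    Require valid-user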
HTTP does not provide a directory listing method as such. Most web servers can be set up to provide an HTML-formatted directory listing if an index file such as index.html does not exist. If you provide an index file so that autoindexing does not happen (or if you disable autoindexing in the web server configuration), and if your "hidden" file names are hard to guess, they should be pretty hard to find. I sometimes publish such files in a directory with a random gibberish name.
"Share links" used by Dropbox, Picasa, and other services do the same; they just use longer random file/directory names or random parameters in the URL.
To provide real security you'll want to set up HTTPS (SSL/TLS), so that eavesdroppers on the network cannot easily look at the requested URLs, plus authentication such as HTTP Basic Authentication with a username/password. For less sensitive stuff, HTTP to a random hidden directory will often do fine.
Hi, suppose my site is www.xyz.com and I have a folder _Userfile which holds files uploaded by my users; when they download their files, the link is www.xyz.com/_Userfile/userfile.doc. Now I want to learn:
1: If someone has the link to another user's file, they can download it. I want to solve this (privacy).
2: How to protect my site's files from website downloaders.
ASAP, please.
Also, I am using a virtual directory to save my users' files, so I need a way to protect any type of file from being downloaded by any kind of software.
You'll have to implement an authentication mechanism and serve those files through a server-side application (in PHP, Java, or whatever) that checks whether the authenticated user has the right to access a resource, then reads the resource from disk and writes it to the HTTP response. The documents should be placed in a location that is not directly accessible through HTTP.
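A minimal PHP sketch of that pattern (the script name, session key, and storage path are illustrative assumptions):

    <?php
    // download.php?f=userfile.doc - serves a file only to its owner.
    session_start();

    if (!isset($_SESSION['user_id'])) {
        http_response_code(401);
        exit('Please log in.');
    }

    $name = basename($_GET['f'] ?? '');  // basename() blocks path traversal
    // Files are stored outside the web root, in a per-user folder.
    $path = '/home/site/userfiles/' . $_SESSION['user_id'] . '/' . $name;

    if ($name === '' || !is_file($path)) {
        http_response_code(404);
        exit('Not found.');
    }

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);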
Just add an index.html file in the _Userfile folder... This will prevent others from viewing the whole directory listing of the _Userfile folder! Simple, isn't it?