I want to protect files on my server, such as font files and images, from being downloaded directly, but they still need to be accessible to the server itself. I was wondering whether that is possible with .htaccess.
I have tried the above but without luck. If you know a solution with .htaccess or any other way, please reply!
One way I can see this working is if you are using a server-side language like PHP; you could do something like this in your HTML pages:
<!-- Your font(s) -->
<link href='http://yoursite.com/font-file?request-token=a3DdsS2n89SDF4sdf345' rel='stylesheet' type='text/css'>
where you generate a unique token with PHP every time you serve an HTML page and save that token on the server. Then use another PHP script to serve the font file, authenticating the token it receives via $_GET['request-token']. If you only allow a token to be used once before refusing any further requests for it, you can embed the font in your HTML, and any user attempts to download it afterwards will fail.
This isn't a completely bulletproof method, but it will deter the majority of attempts to steal your font assets.
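For illustration, here is a minimal sketch of that token check in PHP. The file names (serve-font.php, tokens.txt, fonts/myfont.woff2) are assumptions, not part of any existing setup, and a real application would keep the tokens in a database or session rather than a flat file:

<?php
// serve-font.php -- sketch of the token-checking endpoint (assumed names).
// Tokens are assumed to have been written, one per line, to tokens.txt by the
// script that rendered the HTML page, e.g.:
//   $token = bin2hex(random_bytes(16));
//   file_put_contents('tokens.txt', $token . PHP_EOL, FILE_APPEND | LOCK_EX);
//   echo '<link href="serve-font.php?request-token=' . $token . '" rel="stylesheet" type="text/css">';

$token  = $_GET['request-token'] ?? '';
$store  = __DIR__ . '/tokens.txt';
$tokens = is_file($store) ? file($store, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) : [];

if ($token === '' || !in_array($token, $tokens, true)) {
    http_response_code(403);    // unknown or already-used token
    exit;
}

// One-time use: remove the token so any later request with it is refused.
file_put_contents($store, implode(PHP_EOL, array_diff($tokens, [$token])) . PHP_EOL, LOCK_EX);

header('Content-Type: font/woff2');    // adjust to your actual font format
readfile(__DIR__ . '/fonts/myfont.woff2');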
I'm new to web development and I want to ask why some websites have the "/"?
For example https://www.roblox.com/home; notice the "/home". What is that part called?
I have tried to search on Google and I can't find the answer.
And some websites have things like "/login.php" or "/index.html"; can it also be HTML?
These are URLs (https://en.wikipedia.org/wiki/URL) and they identify the resource you are trying to reach. I would suggest reading more about how web pages work to get a better general overview (e.g. https://developer.mozilla.org/en-US/docs/Learn/Getting_started_with_the_web/How_the_Web_works).
How these resources are actually interpreted depends on the server-side implementation:
.php files are usually processed by a PHP interpreter on the web server
Other static files such as images (*.png, *.jpg, etc.), HTML files, SVGs, CSS, JS, etc. are usually located on the server's filesystem by the web server (httpd, Tomcat, IIS, Node.js, and many others) and transmitted to the client 'as-is'
When using online tools to build websites, these complexities are usually abstracted away, and in the end a URL just means a resource identifier:
[domain]/[section]/[page(.html|.php)|resource(.js|.css)]
domain: the address of the website
section: a way to navigate inside the website itself
page: the user interface, which might be rendered server side or client side, and holds the controls shown to the user
resource: files that change how the content of the pages looks and behaves
In your example, www.roblox.com is the domain and /home identifies the page within the site.
I have a problem with an .htaccess file.
To prevent downloading or printing of PDF documents, I am using PDF.js for reading the contents.
Now I want to disable direct HTTP access to those files.
Inside the PDF.js folders, I put a directory called "doc" that contains all the documents, plus this .htaccess:
Order allow,deny
Deny from all
<Files ~ "viewer\.html$">
Allow from all
</Files>
where viewer.html is the page that contains the document reader.
So, when I try to access
localhost:8080/test/pdfjs/web/viewer.html?file=doc/mondia.pdf
from my browser, I get:
Unexpected server response (403) while retrieving PDF "../test/pdfjs/web/mondia.pdf"
What am I doing wrong?
If PDF.js is running inside the user's web browser, then the user needs to be able to download the PDF document. Apache can't (reliably) tell the difference between "PDF.js on the user's computer" and "Google Chrome on the user's computer" - both are HTTP requests from the user's computer for the resource.
If you really wanted to, you might be able to detect some header set by PDF.js when it requests the PDF, and refuse requests without that header. That would stop casual users directly accessing the file, but anyone who presses F12 in their browser could see the PDF being downloaded by PDF.js and save the contents from there.
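As a rough sketch of that idea in PHP, assuming the PDF is served through a script rather than directly by Apache: the header name X-Pdf-Viewer is invented here, PDF.js does not send it by default, so you would have to make the viewer's requests include it yourself, and anyone who inspects the traffic can forge it.

<?php
// serve-pdf.php -- sketch only: refuse requests that lack a marker header.
// X-Pdf-Viewer is a hypothetical header you would configure the viewer to send.
if (($_SERVER['HTTP_X_PDF_VIEWER'] ?? '') !== 'pdfjs') {
    http_response_code(403);
    exit;
}

$doc  = basename($_GET['file'] ?? '');          // strip any path components
$path = __DIR__ . '/doc/' . $doc;

if (!is_file($path) || pathinfo($path, PATHINFO_EXTENSION) !== 'pdf') {
    http_response_code(404);
    exit;
}

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));
readfile($path);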
Even if you served it in some form other than PDF, the user could copy and paste the resulting HTML, or take a screenshot of how it renders to the screen.
Stopping a user doing something with their own computer is fundamentally hard; if they can read something on their screen, you have sent it to them in some form. To really block them, you need a trusted "DRM" encryption system that renders directly to screen without ever making decrypted data accessible to the user. In the vast majority of cases, that would be completely overkill, and just annoy your users (for instance, blind users probably won't be able to access the content, as their screen reader software will not be trusted).
You can try this plugin
https://it.wordpress.org/plugins/editionguard-for-woocommerce-ebook-sales-with-drm/#description
or a similar one; DRM is the best solution for a WordPress site.
Or try setting this header in PDF.js:
How to set range header from client with pdf.js?
Please edit the .htaccess file present in Vtiger_root_location/storage
and add the 'pdf' option as follows:
I have recently launched a website on GoDaddy hosting. I have kept some images and JavaScript files used in the website in separate folders. I want to prevent users from browsing those images and files by simply appending the folder and file name to the website URL. For example:
www.example.com/images/logo.png
If I understand correctly, you want an HTML file with images that shouldn't be accessible on their own? If yes, then it cannot be done. You can check for a correct HTTP Referer header, but it can easily be faked, and it also makes the files inaccessible for browsers that don't send a referrer or have sending it disabled for "privacy" reasons.
If you want the files to be accessible only by server-side scripts or FTP/SCP, then you can try .htaccess (if GoDaddy runs on Apache) with the correct configuration: https://httpd.apache.org/docs/2.2/howto/access.html
Another way could be hiding those files and creating a one-shot token, like this:
<img src="<?= gen_token_url('file.jpg') ?>" /> (pseudocode), with another script serving those hidden files only for a freshly generated token and then deleting it from the DB; a sketch follows below. Nevertheless, this will not stop anybody from downloading or accessing these files if they really want to...
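A rough sketch of the generating side in PHP; gen_token_url, img-tokens.csv and serve-image.php are hypothetical names, and as noted above this only raises the bar rather than blocking a determined user:

<?php
// Each page view mints a fresh token, records which file it grants access to,
// and emits an <img> tag pointing at a serving script (all names hypothetical).
function gen_token_url(string $filename): string {
    $token = bin2hex(random_bytes(16));
    // Record "token,filename"; a real site would use a database table instead.
    file_put_contents(__DIR__ . '/img-tokens.csv', $token . ',' . $filename . "\n", FILE_APPEND | LOCK_EX);
    return 'serve-image.php?token=' . urlencode($token);
}
?>
<img src="<?= gen_token_url('file.jpg') ?>" alt="">

The serving script would then look up the token, stream the matching file with the right Content-Type, and delete the record so the URL works only once.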
But anyway, try to clarify your question a bit more...
If you are keeping images/files in a folder which is open to the public, I guess you kept them in that folder on purpose: you want the public to access those images and files.
How would the public know the image file names? Turn off directory listing for your website.
I am not sure which language you are using on the web server, but in ASP.NET you can write a module/middleware which intercepts incoming requests and, based on your logic (e.g. authentication and authorization), restricts access. All modern languages support this kind of functionality.
I am stuck in a situation where I want the URL, which points to a folder containing some files (HTML, SWF, etc.), to be accessible only after I validate the user.
For example.
The URL to access is:
A - http://mysite.com/files/version/1/file.swf
And this above url is accessible from the link,
B - http://mysite.com/view/1
I have implemented a way to hide URL A from a normal user, but if the user is a semi-techie person he can find the SWF file location with Firebug or other tools. So, what should I do to make access to the file secure?
If a user somehow knows the first URL (A) and enters it in the browser, I have to check whether the user is logged in, and only if validation succeeds should URL A be allowed to load.
Since, in CI, controller names cannot be the same as folders in the root directory, in this case I cannot have a controller called "files". So the only option left to make this secure access work seems to be an .htaccess rule/condition. If that is the only option, how can it be achieved with .htaccess, and if not, what other options do I have?
Will CodeIgniter's URI routes work? Because when I tried this:
$route['files/version/1/(:any)'] = "view/$1";
it doesn't work, maybe because there is no controller/function/param such as files/version/1...
Looking for quick help. Thanks.
There isn't a sure-fire way to do it without, for example, using .htpasswd.
One thing you could implement is a sort of "security by obscurity". In that case you could redirect all requests for a file to a URL like http://mysite.com/view/file-id and then, instead of loading the requested file directly, load a .php script that sends the appropriate headers, be it for an image, a flash file or anything else.
But it really depends on how the files are going to be managed, since every file will need an entry in the database and you would have to output different headers for different types of files. And if someone still manages to guess the path to the file, it will be directly accessible.
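As a sketch of that gatekeeper idea in plain PHP (it could equally live in a CodeIgniter controller method); the session check, the file map and the paths are assumptions for illustration:

<?php
// view.php -- serve a protected file only to logged-in users (sketch).
session_start();
if (empty($_SESSION['logged_in'])) {            // replace with your real auth check
    http_response_code(403);
    exit('Please log in first.');
}

// Map public IDs to real paths kept outside the web root (hypothetical data).
$files = [
    '1' => ['path' => '/var/private/files/version/1/file.swf',
            'type' => 'application/x-shockwave-flash'],
];

$id = $_GET['id'] ?? '';
if (!isset($files[$id]) || !is_file($files[$id]['path'])) {
    http_response_code(404);
    exit;
}

header('Content-Type: ' . $files[$id]['type']);
header('Content-Length: ' . filesize($files[$id]['path']));
readfile($files[$id]['path']);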
In Firefox or Chrome I'd like to prevent a private web page from making outgoing connections. That is, if the URL in a browser tab starts with http://myprivatewebpage/ or https://myprivatewebpage/, then that tab must be restricted so that it is allowed to load images, CSS, fonts, JavaScript, XmlHttpRequests, Java applets, flash animations and all other resources only from http://myprivatewebpage/ or https://myprivatewebpage/. For example, an <img src="http://www.google.com/images/logos/ps_logo.png"> (or the corresponding <script>new Image(...)</script>) must not be able to load that image, because it's not on myprivatewebpage. I need a 100% foolproof solution: not even a single resource outside myprivatewebpage may be accessible, not even with low probability. There must be no resource-loading restrictions on web pages other than myprivatewebpage; e.g. http://otherwebpage/ must still be able to load images from google.com.
Please note that I assume that the users of myprivatewebpage are willing to cooperate to keep the web page private unless it's too much work for them. For example, they would be happy to install a Chrome or Firefox extension once, and they wouldn't be offended if they see an error message stating that access is denied to myprivatewebpage until they install the extension in a supported browser.
The reason why I need this restriction is to keep myprivatewebpage really private, without exposing any information about its use to webmasters of other web pages. If http://www.google.com/images/logos/ps_logo.png was allowed, then the use of myprivatewebpage would be logged in the access.log of Google's ps_logo.png, so Google's webmasters would have some information how myprivatewebpage is used, and I don't want that. (In this question I'm not interested in whether the restriction is reasonable, but I'm only interested in the technical solutions and its strengths and weaknesses.)
My ideas how to implement the restriction:
Don't impose any restrictions, just rely on the same origin policy. (This doesn't provide the necessary protection, the same origin policy lets all images pass through.)
Change the web application on the server so it generates HTML, JavaScript, Java applets, flash animations etc. which never attempt to load anything outside myprivatewebpage. (This is almost impossibly hard to foolproof everywhere on a complicated web application, especially with user-generated content.)
Over-sanitize the web page using an HTML output filter on the server, i.e. remove all <script>, <embed> and <object> tags, restrict the targets of <img src=, <link rel=, <form action= etc., and also restrict the links in the CSS files. (This can prevent all unwanted resources if I remember all HTML tags properly, e.g. I mustn't forget about <video>. But this is too restrictive: it removes all dynamic web page functionality like JavaScript, Java applets and flash animations; without these, most web applications are useless.)
Sanitize the web page, i.e. add an HTML output filter into the webserver which removes all offending URLs from the generated HTML. (This is not foolproof, because there can be a tricky JavaScript which generates a disallowed URL. It also doesn't protect against URLs loaded by Java applets and flash animations.)
Install a HTTP proxy which blocks requests based on the URL and the HTTP Referer, and force all browser traffic (including myprivatewebpage, otherwebpage, google.com) through that HTTP proxy. (This would slow down traffic to other than myprivatewebpage, and maybe it doesn't protect properly if XmlHttpRequest()s, Java applets or flash animations can forge the HTTP Referer.)
Find or write a Firefox or Chrome extension which intercepts all outgoing connections, and blocks them based on the URL of the tab and the target URL of the connection. I've found https://developer.mozilla.org/en/Setting_HTTP_request_headers and thinkahead.js in https://addons.mozilla.org/en-US/firefox/addon/thinkahead/ and http://thinkahead.mozdev.org/ . Am I correct that it's possible to write a Firefox extension using that? Is there such a Firefox extension already?
Some links I've found for the Chrome extension:
http://www.chromium.org/developers/design-documents/extensions/notifications-of-web-request-and-navigation
https://groups.google.com/a/chromium.org/group/chromium-extensions/browse_thread/thread/90645ce11e1b3d86?pli=1
http://code.google.com/chrome/extensions/trunk/experimental.webRequest.html
As far as I can see, only the Firefox or Chrome extension is feasible from the list above. Do you have any other suggestions? Do you have some pointers how to write or where to find such an extension?
I've found https://developer.mozilla.org/en/Setting_HTTP_request_headers and thinkahead.js in https://addons.mozilla.org/en-US/firefox/addon/thinkahead/ and http://thinkahead.mozdev.org/ . Am I correct that it's possible to write a Firefox extension using that? Is there such a Firefox extension already?
I am the author of the latter extension, though I have yet to update it to support newer versions of Firefox. My initial guess is that, yes, it will do what you want:
User visits your web page without the plugin. The web page contains a ThinkAhead block that would send a simple version header to the server, but this is ignored as the plugin is not installed.
Since the server does not see that header, it redirects the client to a page to install the plugin.
User installs the plugin.
User visits the web page with the plugin. The page sends the version header to the server, so the server allows access.
The ThinkAhead block matches all pages that are not myprivatewebpage, and does something like set the HTTP status to 403 Forbidden. Thus:
When the user visits any webpage that is in myprivatewebpage, there is normal behaviour.
When the user visits any webpage outside of myprivatewebpage, access is denied.
If you want to catch bad requests earlier, instead of modifying incoming headers, you could modify outgoing headers, perhaps screwing up "If-Match" or "Accept" so that the request is never honoured.
This solution is extremely lightweight, but might not be strong enough for your concerns. This depends on what you want to protect: given the above, the client would not be able to see blocked content, but external "blocked" hosts might still notice that a request has been sent, and might be able to gather information from the request URL.