I want to secure static files (images, .txt files) from unauthenticated users. How can I implement user authentication on the website so that the static files in a specific folder are also secured? I have used simple authentication in a login.asp file, I start a session for the authenticated user, and I check the session value in protected .asp files. But I have no idea how to secure static content on a Classic ASP website.
The website is hosted on IIS 7 with Integrated pipeline mode.
You already asked this, and I answered it, and I will give you the same answer.
You will need to use BASIC AUTHENTICATION to restrict access to static files in IIS (Classic ASP). Otherwise, you need to save the static content in another format, encrypt it, and only make it viewable to users authenticated by your application.
Please don't ask this again, the answers will not be different.
If using Basic Authentication is not your cup of tea, one possibility would be to replace your static files with an ASP file that, upon authorization, outputs the correct file. If necessary, you can set the ContentType of the Response to the appropriate type. The link http://support2.microsoft.com/kb/173308 shows you how to do that with an image stored inside a database, but of course you can use whatever you want as the source of the file. In the case of .TXT files, you can even take the file directly and simply add a small section of ASP code at the beginning to perform the check.
All of this requires extra work. There is no way to simply enable some sort of session-based protection for static files without extra work.
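A minimal sketch of that gatekeeper approach, assuming the file lives on disk rather than in a database and that login.asp sets a session flag (the Session("Authenticated") key, the folder path and the content type are placeholders to adapt):

<%
' getfile.asp - serves a protected static file only to authenticated users.
' Session("Authenticated") and the folder path are assumptions; adjust them
' to match your own login.asp and storage location.
If Session("Authenticated") <> True Then
    Response.Status = "403 Forbidden"   ' or Response.Redirect to your login page
    Response.End
End If

Dim stream
Set stream = Server.CreateObject("ADODB.Stream")
stream.Type = 1   ' adTypeBinary
stream.Open
stream.LoadFromFile "D:\protected\manual.txt"

Response.ContentType = "text/plain"   ' e.g. image/png or image/jpeg for pictures
Response.BinaryWrite stream.Read

stream.Close
Set stream = Nothing
%>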
Old question, but -- most MS servers with Classic ASP installed have several default folders which cannot be accessed except via ASP. They are /bin, /app_code and /app_data, and there may be others; it depends on your hosting company. Windows 10 IIS (their cut-down dev & test suite) locks these by default. Using ASP code to retrieve and display text and HTML is very easy, but I'm not sure how to do images. If you have very low traffic, one way would be to copy the image file to an unlocked folder and give it a random name, then access it normally in an IMG tag, then delete it after use. (I came here looking for a better method.)
Update: The answer to loading images via ASP is here -- displaying images from sql database with classic asp ... see the bottom answer by "HeavenCore" and, instead of Response.BinaryWrite rs("ImageBlob"), get the binary of the image into your own variable, e.g. BinaryImageData, and do Response.BinaryWrite BinaryImageData.
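For example, a rough sketch of reading an image out of one of those locked folders into a variable and streaming it back (the folder and file names are just examples):

<%
' image.asp - streams an image stored in a folder IIS will not serve directly.
Dim imgStream, BinaryImageData
Set imgStream = Server.CreateObject("ADODB.Stream")
imgStream.Type = 1   ' binary
imgStream.Open
imgStream.LoadFromFile Server.MapPath("/App_Data/picture.jpg")
BinaryImageData = imgStream.Read
imgStream.Close
Set imgStream = Nothing

Response.ContentType = "image/jpeg"
Response.BinaryWrite BinaryImageData
%>

An IMG tag can then point at image.asp instead of the file itself.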
I have a big dilemma and I need help.
Basically we have a Sitecore web app, which is our main web service. Currently my app works with the main app via static .html pages (it works as an SPA; JS calls the backend for the needed HTML content).
But the database I work with keeps growing, and to make certain elements accessible by URL I need to create ~70,000+ static files. These static files are also needed for Google indexing, so we can advertise our products. Whenever new metadata is needed or a new item is added, I have to run my other program that recreates these static files from a txt file containing all items. And we have 2 reserve servers where our Sitecore web is hosted, so it's 70k+ files for 9 languages and 3 web servers. It takes a day to recreate everything...
That's why I decided to make a clean MVC SPA application, and it works great. But...
I can't add my MVC application, or anything except .html files, to the current Sitecore main app.
And the question is: how can this be done without losing Google indexing and without changing the main domain?
For example we have now:
www.ourdomain.com/foldername/mystaticfile.html
What I want:
www.ourdomain.com/mynewmvcapplication
Sitecore has a setting called IgnoreUrlPrefixes. You can add mynewmvcapplication to this setting; in that case Sitecore will ignore that path as well as anything under it. Here is a good article which shows you how to update this setting without making an update to Sitecore's config files.
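The conventional alternative is a patch file dropped into App_Config/Include, roughly like the sketch below (the value shown is abbreviated; patch:attribute replaces the whole attribute, so the full existing prefix list from your instance must be kept and /mynewmvcapplication appended to it):

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="IgnoreUrlPrefixes">
        <!-- Keep the complete original value and append the new prefix. -->
        <patch:attribute name="value">/sitecore/default.aspx|/trace.axd|/webresource.axd|...|/mynewmvcapplication</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>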
Take a look at the Sitecore Redirect Manager on the Sitecore Marketplace. It has the capability to create your custom URLs and keeps your search engine ranking.
https://marketplace.sitecore.net/en/Modules/Sitecore_Redirect_Manager.aspx
Otherwise you can look into a custom Link Provider and a custom Item Resolver. This will need more coding than the previous option. A Google search with those keywords brings back many results.
Best wishes.
I'm wondering if my website needs to be hosted on different servers for load balancing purposes, as in the picture below:
I'm thinking of installing a Kentico project on each of the 3 servers, then exporting and importing the site into each Kentico project and linking them with the same database connection string.
But what if one of the web parts (.ascx) gets updated? Does that mean I will need to update all 3 Kentico projects? What about other files like JS, CSS, or media?
Is there a proper way to host on different servers yet still manage the content so that every Kentico project gets the update?
What you are describing is the exact purpose of Kentico's web farm feature, where you can have multiple servers (a web farm) connected to a single database. The main purpose of web farms is to ensure that cache and files (not code files, but media files such as the ones you upload as attachments, media library files, meta files...) are synchronized across all servers.
Each server in your scenario has its own memory and if you change an object, you want all other servers to reflect the change because otherwise some visitors might end up seeing "old" data, while others wouldn't.
You are also correct in the assumption that all code files (ascx, cs, aspx...) will need to be uploaded to all servers. The best way to approach this is to have a tool such as TeamCity which is able to deploy your changes to multiple servers simultaneously.
With JS, CSS, HTML, images... it depends on where you store them. If you store them in the database (not usually the best thing to do) you don't need to update them on particular servers, but if you store them on the file system, you might need to. There are many variables here, but a deployment tool will probably be the best bet.
One note here: try not to install Kentico directly on each of those servers and use export and import to set up the site. Simply make a copy of the website's physical files from your DEV server and paste them onto each of those servers, then connect them all to the same database.
Why not use export and import?
1. You will get a different hash salt string in each web.config, which will cause macro security errors, so you would have to replace them all with the same key.
2. You may miss objects during export and import.
3. Export and import are mostly for objects stored in the database, and in a web farm setup the servers share the same database, so there is no point in doing that.
You can easily achieve this by moving from on-premises to Windows Azure.
-- You can deploy your website/web project as a Cloud Service or App Service.
-- Kentico on Azure supports both development and deployment scenarios.
-- Built-in scalability.
For more details refer to the links below:
Hosting options: https://docs.kentico.com/k10/running-kentico-on-microsoft-azure/microsoft-azure-web-hosting-options
https://devnet.kentico.com/articles/deploying-kentico-to-microsoft-azure-know-your-web-hosting-options
I have a folder that contains log files. They're not super critical, but I don't want total strangers looking through them. I'd like to put a password on that one folder. The folder and its contents are served straight up from IIS, so I'm not looking for a coding solution.
With Apache I'd use a .htaccess file.
With IIS it's possible to use multiple Web.config files at various levels to control this kind of thing.
So, what goes in the Web.config file that allows me to require a password when accessing this folder?
I'm happy for the password to pop up in a dialog like old-school websites used to do (not sure what this is called -- I think it is digest authentication) and so avoid any loginUrl redirection stuff
I'm happy to put the password in the Web.config file in plain text if it's easier
The application is internet facing and running on shared hosting, so I don't have much control over the box beyond what I can configure in Web.config.
You can achieve this using the <location path="..."/> element of the web.config file.
Check this link for step-by-step instructions.
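A rough sketch, assuming the folder is called logs and that ASP.NET URL authorization is in play (on IIS 7 static files also have to be pushed through the managed pipeline for the rule to apply to them, hence the modules line; how users actually authenticate -- Basic auth set up by the host, or forms authentication -- is configured separately):

<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <!-- Route static files through ASP.NET so the authorization rule applies. -->
    <modules runAllManagedModulesForAllRequests="true" />
  </system.webServer>
  <location path="logs">
    <system.web>
      <authorization>
        <!-- Deny anonymous users; only authenticated users may browse /logs. -->
        <deny users="?" />
      </authorization>
    </system.web>
  </location>
</configuration>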
I'm relatively new to dotnetnuke and am trying to set up a simple site which will have multiple user groups with their own set of files and then another user that has access to all files.
I'm currently playing with doing this with the "documents" module and hiding the module from all but the everything user and the specific company user. This works fine but the security seems to be just security by obscurity.
If I log in as User A, get access to file A and copy its URL, then log out and log in as user B, who can't see that file, and then put the file URL into the browser, it still seems to download fine.
Can anybody tell me if I am doing something wrong, or is there no actual user-based security on file downloads? I've tried going to the actual file manager and making the directories explicitly not viewable to user B (they are secure directories too) but still it persists. Am I missing a permissions option at the file level somewhere, or is the security designed just to prevent you finding the right links to the files? I'll admit the links aren't guessable (no sequential IDs in the URL or anything silly like that) but I'm still a little uncomfortable with the security working like this...
DNN FileManager Module
Hi Chris,
Please check out the FileManager module via the above link. You are correct that the current FileManager module does not allow access per user role. You might check Snowcovered for possible substitutes?
It seems that I was doing something wrong. I was referencing a different version of the file which didn't have any permissions attached to it. It also seems that I don't need to have multiple Documents modules, since if a file doesn't have read permission it will just be hidden in the list.
So, to summarise: the DNN Documents module will do role-based security to prevent unauthorised users from downloading a file and from seeing it in the documents view.
The Documents module provides security for LinkClick.aspx URLs, which are routed through ASP.NET.
If the actual files reside in the file system under the site's root folder, direct URLs to these files are served and secured by IIS.
To prevent unauthorized access to direct URLs you can disable anonymous authentication and set up Basic authentication with NTFS permissions, for example.
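For illustration, the IIS switch looks roughly like this in web.config (on shared hosting these sections are often locked at the server level, so they may have to be changed in IIS Manager instead, and NTFS permissions on the folder are granted separately):

<configuration>
  <system.webServer>
    <security>
      <authentication>
        <!-- Require Windows credentials instead of allowing anonymous access. -->
        <anonymousAuthentication enabled="false" />
        <basicAuthentication enabled="true" />
      </authentication>
    </security>
  </system.webServer>
</configuration>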
If you don't want to touch IIS or administer Windows accounts, you can't store the files directly under any publicly available IIS folder. Security at the ASP.NET application level is implemented by encrypting the files or by storing them outside the public IIS folders, for example in the database. The DNN File Manager offers both of these options: secure folders in the file system and secure folders in the database.
There are also 3rd party modules to manage file security and sharing, like NukeTransfer.
I'm making some changes to a legacy classic ASP application. I've made the changes locally, and now I want to copy the changed files to the server. At the same time, I need to download the Access database, add some fields to some tables, and upload it again. For this reason, I need to be able to stop visitors from modifying the database while this is happening.
My main question is, what is the best way to setup a quick "Down for Maintenance" page that will be shown immediately and no matter which page the visitor requests. The application is already established, so I'd rather an answer that didn't require me to rework the application's architecture.
My second question (maybe this should be a separate question):
Is there a better way to add fields to a db table than to copy it down, modify, and stick it up again? Please forgive if that's a dumb question - I'm new to ASP - new to Windows too.
I only have FTP access to the remote server.
Thanks.
Two ways:
1. If you do a server-side include in every ASP page, you can do a Response.Redirect to /upgrading.html in that include (see the sketch after this list).
2. In Global.asa you can do a Response.Redirect in the Session_OnStart event. This is probably the best way. It will only work for .asp pages, though, not if the client comes to a .html page.
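A minimal sketch of option 1, assuming an include file named maintenance.inc.asp that every page pulls in near the top (the file name and flag are just examples):

<%
' maintenance.inc.asp - included at the top of every ASP page.
' Flip the constant to True before starting the update, back to False afterwards.
Const MAINTENANCE_MODE = True

If MAINTENANCE_MODE Then
    Response.Redirect "/upgrading.html"
End If
%>

Each page then starts with <!--#include virtual="/maintenance.inc.asp" -->, so only this one file has to be re-uploaded to take the site down or bring it back up.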
Do you have any control panel access to the site at all?
When I used to run a number of ASP Classic sites I often turned them off for the five minutes required to do what I needed.
Rude to do to your visitors I know.
As others have said, you could redirect to a page, but that won't stop people visiting static content in HTML pages. That probably won't matter, though; at least it stops them making changes to the .mdb while you download it.
It's a pity that ASP.net's app_offline.htm doesn't work for ASP classic.
Another option I used to use was to create a default.htm file that had the offline message; the way IIS was set up, default.htm overrode default.asp, so simply uploading default.htm changed the homepage. This of course doesn't stop anyone using any of the other .asp pages.
So no real answer! Sorry.
If you have just FTP access to the server (and no control over IIS), just insert a Response.Redirect to the "down for maintenance" page at the top of all the ASP pages, and remove it when the update is completed.
The changes to the database can be performed with the ALTER TABLE statement.
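If you would rather not download the .mdb at all, a throwaway ASP page along these lines could run the statement in place (the database path, table and column names are made up; delete the page as soon as you are done, since anyone who finds it could alter your database):

<%
' alter.asp - one-off schema change; remove this file immediately after use.
Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & Server.MapPath("/db/site.mdb")

' Jet SQL: add a 50-character text column to the Customers table.
conn.Execute "ALTER TABLE Customers ADD COLUMN Notes TEXT(50)"

conn.Close
Set conn = Nothing
Response.Write "Done"
%>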
With regard to the "Down for Maintenance" page issue, and taking mapache's idea a step further: if there is an include file (for a header) in each of the pages, you can put the Response.Redirect in that one file and upload just that file in place. This avoids making changes to all pages.
Another option is to upload a temporary HTML file which IIS will find first. In IIS you can set which default document names are looked for in a domain/folder. For example, when you browse to www.example.com you don't specify the page you are looking for, so it could load index.html or index.htm depending on the setup. It will depend on your host's configuration, but with a bit of trial and error I'm sure you can find out which one they use. Common ones for IIS are default.htm, default.html, index.html and index.htm. You can then put the file in each of the folders in the website (not ideal, I know) and carry out your maintenance.
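If the host allows it to be overridden, the default document order can also be set per site or folder in web.config, roughly like this (the file names are examples):

<configuration>
  <system.webServer>
    <defaultDocument>
      <files>
        <!-- Put the maintenance page ahead of default.asp while it exists. -->
        <clear />
        <add value="default.htm" />
        <add value="default.asp" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>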
When updating databases you can run a migration script, written in SQL, to update the schema and data of the db. As you only have FTP access, this would require some sort of page you can paste the SQL into and run. That, however, opens up security issues, so downloading the db, making the changes and then uploading it again is probably easier. Doing it this way also means you keep a copy of the file, so you'll have a backup :-)
Hope this helps.
Better than an include file, just use the Global.asa.
In the Global.asa's Application_onStart, add
Application("Offline")= True
at the top of all of your ASP files, add
If VarType(Application("Offline")) = vbBoolean Then If Application("Offline") Then Response.Redirect "App_Offline.htm"
(The double If works around VBScript's lack of short-circuit operators, and thereby avoids any data type errors when the application variable is not a Boolean.)
You could even set the Global.asa code to
Set fso= Server.CreateObject("Scripting.FileSystemObject")
Application("Offline")= fso.FileExists(Server.MapPath("App_Offline.htm"))
Set fso= Nothing
This would enable the offline page if it exists, like ASP.NET does. However, the application start code is only re-run when the server is reset (using iisreset) or when the Global.asa file is modified, so merely adding App_Offline.htm will not be enough.
Add the code below to web.config:
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <modules runAllManagedModulesForAllRequests="true" />
  </system.webServer>
</configuration>
And place app_offline.htm under the root folder. This will work.