I'm relatively new to DotNetNuke and am trying to set up a simple site with multiple user groups, each with its own set of files, plus one user who has access to all files.
I'm currently experimenting with the Documents module, hiding each module instance from everyone except the all-access user and the relevant company's users. This works, but the security seems to be just security by obscurity.
If I log in as User A, open file A, and copy its URL, then log out and log in as User B (who can't see that file), pasting the file URL into the browser still downloads it fine.
Can anybody tell me if I am doing something wrong, or is there no actual user-based security on file downloads? I've tried going to the actual File Manager and making the directories explicitly not viewable to User B (they are secure directories too), but the problem persists. Am I missing a permissions option at the file level somewhere, or is the security designed only to stop you finding the right links to the files? I'll admit the links aren't guessable (no sequential IDs in the URL or anything silly like that), but I'm still a little uncomfortable with the security working like this...
DNN FileManager Module
Hi Chris,
Please check out the FileManager module via the link above. You are correct that the current FileManager module does not restrict access per user role. You might check Snowcovered for possible substitutes.
It turns out I was doing something wrong: I was referencing a different version of the file, one which didn't have any permissions attached to it. It also seems that I don't need multiple Documents modules, since a file without read permission is simply hidden from the list.
So, to summarise: the DNN Documents module will apply role-based security, preventing unauthorised users both from downloading a file and from seeing it in the documents view.
The Documents module secures LinkClick.aspx URLs, which are routed through ASP.NET.
If the actual files reside in the file system under the site's root folder, direct URLs to those files are served, and secured, by IIS.
To prevent unauthorized access to direct URLs you can disable anonymous authentication and set up Basic authentication with NTFS permissions, for example.
If you don't want to touch IIS and administer Windows accounts, you can't store the files directly under any publicly available IIS folder. Security at the ASP.NET application level is implemented by encrypting the files or by storing them outside the public IIS folders, for example in the database. The DNN File Manager offers both of these options: secure folders in the file system and secure folders in the database.
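To make the application-level option concrete, here is a minimal sketch of an ASP.NET HTTP handler that checks the current user before streaming a file stored outside the web root. This is not DNN's actual LinkClick.aspx implementation; the store path, role name, and query parameter are all hypothetical.

    using System.IO;
    using System.Web;

    // Minimal sketch of application-level file security, assuming the files
    // live OUTSIDE any public IIS folder. Path, role, and parameter names
    // are hypothetical placeholders.
    public class SecureFileHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Reject anonymous users and users outside the (hypothetical) role.
            if (!context.Request.IsAuthenticated ||
                !context.User.IsInRole("CompanyA"))
            {
                context.Response.StatusCode = 403;
                return;
            }

            // Take only the file name, so "..\" tricks can't escape the store.
            string name = Path.GetFileName(context.Request.QueryString["file"] ?? "");
            string path = Path.Combine(@"D:\SecureFileStore", name);

            if (name.Length == 0 || !File.Exists(path))
            {
                context.Response.StatusCode = 404;
                return;
            }

            context.Response.ContentType = "application/octet-stream";
            context.Response.AddHeader("Content-Disposition",
                "attachment; filename=\"" + name + "\"");
            context.Response.TransmitFile(path); // stream without buffering in memory
        }

        public bool IsReusable { get { return true; } }
    }

A handler like this would be registered in web.config and linked in place of the raw file URL, so IIS never serves the file directly.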
There are also 3rd party modules to manage file security and sharing, like NukeTransfer.
Related
I'm working on a site where all links (dynamic and hard-coded) to the media library are permanent links (with getmedia...), which makes it hard to locate the exact folder of the files and update them. A developer told me that permanent links are more secure because the system can check who has access to download the materials. Is that a fair statement, and why or why not? Thanks for your input!
This is not a fair or correct statement. Access is set at the media library directory level, not at the individual file level.
For example, if you have an Images media library which has no security behind it, you can access it directly with a URL of:
/site/media/images/logo.png or /getmedia/<guid>/logo.png
and the image will display without issue.
Now suppose you have another media library called "Secure_Files". If you attempt to access:
/site/media/secure_files/file1.pdf
You'll get an error or a login page, because the security is set on the /site/media/secure_files directory.
Here is the documentation on securing media libraries.
By default, Kentico does not check the See library content permission for visitors on the live site. If you wish to require users to have this permission to view media library content, you need to enable the following settings in the Content -> Media category of the Settings application:
Use permanent URLs
Check file permissions
See the note at the very bottom of this documentation page.
Permanent Link is made up of:
/getmedia/
Guid ID
Image Path
.aspx
Eg: /getmedia/C73B5-6A0-4F6-878-3C29D792014/IMG_3860.jpg.aspx
Direct Path is made up of:
/
Site Name
Media Library Folder Name
Image Path
Eg: /google/media/Blog-images-from-Kentico-Cloud/IMG_360.jpg
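To make the two formats concrete, here is a small, hypothetical C# sketch that assembles both URL styles; the GUID and names below are placeholders, not values from a real site. The practical difference is that the permanent link is routed through the getmedia handler (which can check permissions once the settings above are enabled), while the direct path is served by IIS as a plain static file.

    using System;

    class MediaUrlDemo
    {
        static void Main()
        {
            // All values below are hypothetical placeholders.
            Guid fileGuid = Guid.NewGuid();   // stands in for the media file's GUID
            string fileName = "IMG_3860.jpg";
            string siteName = "google";
            string libraryFolder = "Blog-images-from-Kentico-Cloud";

            // Permanent link: handled by getmedia, so access can be checked.
            string permanentUrl = "/getmedia/" + fileGuid + "/" + fileName + ".aspx";

            // Direct path: served as a static file straight from disk.
            string directUrl = "/" + siteName + "/media/" + libraryFolder + "/" + fileName;

            Console.WriteLine(permanentUrl);
            Console.WriteLine(directUrl);
        }
    }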
In Liferay I have:
tomcat
--webapps
----myimages
----my-portlet
Using code in my-portlet, I generate links to files in the myimages folder for a specific user. A link looks like:
http://localhost:8080/myimages/User1.jpg
Problem statement: I have to restrict an individual user (rather than a defined Liferay role) so that they cannot access any of the files in the myimages folder by hitting the direct link above.
What I have tested:
A .htaccess file will NOT be useful, since Liferay runs on Tomcat rather than an Apache server.
I created a filter class with which I can intercept any request that passes through.
OpenLDAP cannot be used, since we have a separate authentication mechanism.
So if anyone has an idea of how to deal with this security issue, please suggest it.
URLs that are resolved by an individual webapp (like myimages), and thus not by Liferay, will have no idea which user is logged in to Liferay: they are well shielded from the other (and in this case totally unrelated) web application, Liferay.
What you can do is provide these files through a portlet plugin and serve the images through resource URLs in the portlet. These properly go through the portal context (in fact, the URLs will point to Liferay, despite the implementation living in a different web application), so you'll be able to check the permissions of the current user. Then just read the file and pipe it into the ResourceResponse's output stream.
If the files are indeed static web resources, you might want to put them in myimages/WEB-INF/images, as Tomcat will refuse to directly serve anything under WEB-INF, but your portlet will still be able to read these files.
I want to secure static files (images, .txt files) from unauthenticated users. How can I implement user authentication on the website so that the static files in a specific folder are also secured? I have used simple authentication in a login.asp file, starting a session for the authenticated user, and I check the session value in the protected .asp files. But I have no idea how to secure static content on a Classic ASP website.
The website is hosted on IIS 7 with Integrated pipeline mode.
You already asked this, and I answered it, and I will give you the same answer.
You will need to use BASIC AUTHENTICATION to restrict access to static files in IIS (Classic ASP). Otherwise, you need to save the static content in another format, encrypt it, and only make it viewable to people authenticated by your program.
Please don't ask this again, the answers will not be different.
If using Basic Authentication is not your cup of tea, one possibility would be to replace your static files with an ASP file that, upon authorization, outputs the correct file. If necessary, you can set the ContentType of the Response to the appropriate type. The link http://support2.microsoft.com/kb/173308 shows you how to do that with an image stored inside a database, but of course you can take whatever you want as the source of the file. In the case of .TXT files, you can even take the file directly and simply add a small section of ASP code at the beginning to do the check.
All of this requires extra work. There is no way to simply activate some sort of session-based protection for static files without extra work.
Old question, but: most MS servers with Classic ASP installed have several default folders which cannot be accessed except via ASP. They are /bin, /app_code and /app_data, and there may be others; it depends on your hosting company. Windows 10 IIS (their cut-down dev and test suite) locks these by default. Using ASP code to retrieve and display text and HTML is very easy, but I'm not sure how to do images. If you have very low traffic, one way would be to copy the image file to an unlocked folder, give it a random name, access it normally in an IMG tag, then delete it after use. (I came here looking for a better method.)
Update: the answer to loading images via ASP is here: displaying images from SQL database with Classic ASP. See the bottom answer by "HeavenCore" and, instead of Response.BinaryWrite rs("ImageBlob"), get the binary of the image into your own variable, e.g. BinaryImageData, and do Response.BinaryWrite BinaryImageData.
To quickly summarise my question:
Is it feasible to programmatically change the name of a directory (with both files and sub-folders) in SharePoint? I expect that users will have files checked out on at least some occasions when I attempt the rename.
The background:
I am currently contracting for a company that produces web based software (ASP.NET) with a configurable document management system. The system can be configured to use different underlying systems, with the most common environment being SharePoint (WSS 3).
I have been assigned a task to extend what has until now been a fairly simple system (it simply outputs files into a fixed directory structure and occasionally reads them back). Having never worked with SharePoint before, I am doing some research on best practices and trying to work out what is viable. At this stage I do not have access to a testing environment myself, so I am limited to reading up online.
One request is to have the directory structure reflect (as one example) the name of the current client, so all documentation for a client will be in one place and can be accessed externally via SharePoint or other compatible applications. The specification states that if the name of the client changes, then the directory structure should immediately update. My concern is that this will either directly cause errors (e.g. permission denied) or indirectly cause errors (loss of work for users who have checked out files externally).
As a follow-up question, if there are concerns with the above, is there a better way to implement it? I have looked at suggesting that users access the structure through SharePoint views, but our BA is concerned that users would not be able to upload new files directly into this structure.
Thanks
The issue with folders in SharePoint is that they are not really folders in the way you would expect from a file system. All files in a site collection are stored in one big-assed table in the database (check out the AllDocs table).
I cannot categorically say it is safe to rename the folder without doing a bit of testing, but I know that the folder's "name" is not the key for accessing a document, despite what the URL you see in the browser suggests.
The best bet is to do a quick test, but I am pretty sure that your plan will not be a problem.
The potential issue is whether any Content Query Web Parts etc. rely on specific folders existing, or whether any other code or pages look up the folder by name rather than by folder ID.
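For what it's worth, here is a minimal sketch of such a rename using the WSS 3 server object model; the site URL, library, and folder names are hypothetical, and you should try this on a test site collection first.

    using Microsoft.SharePoint;

    class FolderRenameDemo
    {
        static void Main()
        {
            // Hypothetical site and folder names; run against a test site first.
            using (SPSite site = new SPSite("http://server/sites/clients"))
            using (SPWeb web = site.OpenWeb())
            {
                SPFolder folder = web.GetFolder("Shared Documents/Old Client Name");

                // MoveTo with a new leaf name effectively renames the folder;
                // the documents keep their IDs, but their URLs change.
                folder.MoveTo("Shared Documents/New Client Name");
            }
        }
    }

Note that because the URLs change, any externally saved links or mapped drives pointing at the old path will stop working, which is exactly the kind of indirect breakage the question worries about.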
Save the content of the list before you "attempt" it in production. You don't want to lose data.
Checked out documents will still work the way you expect them to.
You may however have to run a crawl again.
I was wondering if anyone knows how, or whether it is possible, to upload files to a SharePoint (v3/MOSS) document library over FTP. I know it is possible with WebDAV. If it is possible, is it even supported by Microsoft?
I don't think so. I think your options are:
HTTP (via the upload page)
WebDAV
Web Services
The object model
You can map a drive to a SharePoint document library, for example \\serveraddress.domain.com\Documents. So I would try mapping a drive on your FTP server, then making sure files that come in over FTP get sent to that drive.
Big edit: Have any of you figured out how to upload to SharePoint (WSS)? I've tried drive mapping and then using Robocopy and SyncToy to copy files, thinking a tool might offer greater control (e.g. preserving the Date Modified attribute on copy). As I understand it, the files are actually stored in SharePoint as database objects, so SharePoint views display the database (SQL) object's properties in document libraries, where a new user would expect to see the file properties. Those file properties are still alive! They just need to be uncovered by a different view. I particularly like the mapped-network-drive view of a SharePoint document library. File attributes are pretty important to my team, so we were concerned about that at the start. As an opinion note though, the default view showing attributes that appear to be incorrect is just plain annoying!
The best solution we've come up with for doing large file migrations into SharePoint is a mapped network drive plus a tool called FreeFileSync, available on SourceForge, to move your files and folders. It's great because it produces verbose error messages and gives a lot of control, especially in the instances where SharePoint tries to block a particular filename or file extension.
Direct FTP into SharePoint is not one of your options. You would need a timer job that checks your FTP directory and uploads the files into the document library.
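As a rough illustration, the upload half of such a job could look like the following WSS 3 object model sketch; the site URL, library name, and drop path are hypothetical.

    using System.IO;
    using Microsoft.SharePoint;

    class FtpDropUploader
    {
        static void Main()
        {
            // Hypothetical site, library, and FTP drop directory.
            using (SPSite site = new SPSite("http://server/sites/team"))
            using (SPWeb web = site.OpenWeb())
            {
                SPFolder library = web.GetFolder("Shared Documents");

                foreach (string path in Directory.GetFiles(@"C:\FtpDrop"))
                {
                    byte[] contents = File.ReadAllBytes(path);

                    // true = overwrite if a file with the same name exists
                    library.Files.Add(Path.GetFileName(path), contents, true);
                    File.Delete(path); // clear the drop folder once uploaded
                }
            }
        }
    }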
Yes it is possible.
The WebDAV Redirector allows you to access WebDAV resources (including SharePoint) via a UNC path, i.e. \\yourspserver\site\doclib. The IIS FTP server accepts UNC paths as backing storage for virtual directories.
On your FTP server, right-click the FTP site in IIS Manager and select "Add Virtual Directory". Give it a name and specify the SharePoint UNC path as the physical path. You'll need to set the "connect as" user to a domain user that has access to the SharePoint folder you're connecting to.
Connect to the FTP folder and you should be able to "cd" into the directory and put/get files without issue (just confirmed it myself). The only caveat is an age-old bug/feature of IIS FTP: virtual directories don't show up in an ls/dir listing. The fix is to create a physical folder that mirrors the virtual directory's location. For example, if your FTP root is c:\inetpub\ftproot, you'll need to create a directory matching the name of your virtual directory in that location. It will then show up in an ls/dir listing, but the cd command will still move into the virtual directory, not the physical one.
You can SFTP/FTP directly into your SharePoint document library using Couchdrop. It turns your SharePoint into a native SFTP/FTP server, lets you create additional users, and so on. Sing out if you need assistance; more than happy to help.
Full disclosure: I represent Couchdrop