I am trying to update a Razor page (running on IIS 10.0, version 1809) to let me share a PowerShell script. I want the user to be able to right-click the link and select "save" to download the script from a folder on the web server (wwwroot/<site name>/downloadFiles). I verified that the NTFS permissions allow "Read & Execute", "List folder contents", and "Read" for IUSR, Network Service, Users, and IIS_IUSRS.
I tested:
Right-click the link and select "Save link as..." and Chrome tells me, "Failed - No file"
Browse to https://<site url>/<site name>/downloadFiles/script.ps1 and IIS tells me: 404 - File or directory not found. The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.
Browse to https://<site url>/<site name>/downloadFiles and IIS tells me: 403 - Forbidden: Access is denied. You do not have permission to view this directory or page using the credentials that you supplied.
In _Layout.cshtml, I have the link set up as:
Script
Any idea how I can make this work?
By default the static file middleware only serves files whose extensions map to a known MIME type, which is why the .ps1 request comes back as a 404. Use the FileExtensionContentTypeProvider to map the .ps1 extension to the application/octet-stream MIME type so the file is served and the browser treats it as a download, as explained in the docs: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/static-files#fileextensioncontenttypeprovider
using Microsoft.AspNetCore.StaticFiles;

// In Program.cs (or Startup.Configure), before the Razor Pages/endpoint mappings:
var provider = new FileExtensionContentTypeProvider();
provider.Mappings[".ps1"] = "application/octet-stream";

app.UseStaticFiles(new StaticFileOptions
{
    ContentTypeProvider = provider
});
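If the downloadFiles folder does not sit under the app's wwwroot (for example it lives next to the content root instead), you also have to point the static file middleware at it with a file provider. A minimal sketch under that assumption, reusing the same provider variable from the snippet above and assuming a .NET 6+ Program.cs; the folder and request path names are taken from the question:

using Microsoft.Extensions.FileProviders;

// Serve <content root>\downloadFiles at the /downloadFiles URL,
// reusing the 'provider' (with the .ps1 mapping) declared above.
app.UseStaticFiles(new StaticFileOptions
{
    FileProvider = new PhysicalFileProvider(
        Path.Combine(builder.Environment.ContentRootPath, "downloadFiles")),
    RequestPath = "/downloadFiles",
    ContentTypeProvider = provider
});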
Related
I can successfully open files and grab the content of published SharePoint files.
Then I try to get a historized (earlier) version of a file and it fails.
This worked:
Siteaddress: https://domain.sharepoint.com/teams/Foldername
FilePath: /Freigegebene Dokumente/General/Filename
and this did not:
Siteaddress: https://domain.sharepoint.com/teams/Foldername
FilePath: /_vti_history/1024/Freigegebene Dokumente/General/Filename
Is this possible or not?
Since you are selecting the "Site Address" and "File Path" in the "Get file content using path" action and it cannot retrieve what you want, it seems the action cannot cover this requirement.
As a workaround, you could try the Microsoft Graph API to download the file from _vti_history; please refer to the Graph API documentation. You can add an "HTTP" action in your logic app to call the Graph API and download the file.
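For illustration, here is a rough C# HttpClient sketch of the request such an HTTP action (or any client) would make; the drive/item/version IDs and the access token are placeholders you would have to resolve first via the usual Graph lookups, and the URL is the documented driveItemVersion content endpoint:

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

// Placeholders - resolve these first (site -> drive -> item -> version lookups, plus an OAuth token for Graph)
string accessToken = "<access-token>";
string driveId = "<drive-id>", itemId = "<item-id>", versionId = "<version-id>";

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

// Download the content of a specific (historized) version of the file
var url = $"https://graph.microsoft.com/v1.0/drives/{driveId}/items/{itemId}/versions/{versionId}/content";
var bytes = await client.GetByteArrayAsync(url);
await File.WriteAllBytesAsync("Filename", bytes);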
I want to set up a direct download link using Microsoft IIS. We already have a web page using the IP address that points to a /web folder on our server, but I want to create a separate location on my server where I can put downloadable files such that the client can just type the link and get the download: http://IPADDR/download/filename.zip. Are there any resources on how to do this?
Right now, typing http://IPADDR brings up our simple web page which contains a link that launches an application, again this is bound to the /web folder on our server via IIS.
The FTP port is typically blocked on our clients' networks, so we have to stick with HTTP. This will be completely programmatic, so there is no need for a button or link on a page. I will be using Java with a GET request to pull files from the link. I just want the web server to make these files available to download.
FYI I'm newer to this server stuff so simpler is better! Thank you.
As far as I know, there are two ways to achieve this.
The first is to use an ASP.NET application. You write code that returns the file: for example, when the user clicks a button (or a handler receives a GET request), the corresponding server-side method runs and writes the specified file into the response.
var fileNameToShow = "xxx.zip";
var fileNameAndPath = @"C:\physical\path\to\xxx.zip"; // the physical path of the file on the server

FileInfo file = new FileInfo(fileNameAndPath);
file.Refresh();
if (file.Exists)
{
    // Send the file to the browser as a download
    Response.Clear();
    Response.ContentType = "application/octet-stream";
    Response.AddHeader("Content-Disposition",
        "attachment; filename=" + fileNameToShow + "; size=" + file.Length.ToString());
    Response.TransmitFile(fileNameAndPath);
    Response.Flush();
    Response.End();
}
else
{
    throw new Exception("File does not exist!");
}
The second is to use the FTP feature of IIS. You need to create a site and add FTP publishing to it. When you visit the site through ftp://domain, you can see all the files in the physical directory of the site on the server and download any of them by clicking on it. (Files can also be downloaded through a direct link, such as ftp://domain/filename.zip.)
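On the client side, the "programmatic GET" mentioned in the question is just a plain HTTP request against the download URL. The question plans to use Java, but for illustration here is a minimal C# sketch; the URL and file name are placeholders:

using System.IO;
using System.Net.Http;

// Plain GET against the download link; works the same against an ASP.NET handler
// like the one above or a plain static-file download folder.
var http = new HttpClient();
var bytes = await http.GetByteArrayAsync("http://IPADDR/download/filename.zip");
await File.WriteAllBytesAsync("filename.zip", bytes);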
I am trying to copy a local file to a shared folder, but an EPERM error is thrown. This is my code:
var fs = require('fs');
var is = fs.createReadStream(SOURCE_FILE);
var os = fs.createWriteStream(TARGET_FILE);

// Surface EPERM/ENOENT etc. instead of crashing with an uncaught exception
is.on('error', function (err) { console.error('read error:', err); });
os.on('error', function (err) { console.error('write error:', err); });

// 'finish' fires on the write stream once the copy has been flushed to TARGET_FILE
os.on('finish', function () {
  console.log('success');
});
is.pipe(os);
... this is the error message:
Application has thrown an uncaught exception and is terminated:
Error: EPERM, open 'TARGET_FILE'
TARGET_FILE is a path into a Windows shared folder that has full permissions for all users.
Thanks in advance
If you are running iisnode on IIS 7/8, it could be running under a special type of user that does not inherit "All Users" permissions.
If you are running iisnode from a command prompt, it is probably running as the user you are logged in as. If you are running it in a service wrapper, it is probably also running under a "user" that does not inherit "All Users" permissions. You can open Task Manager (right-click on your task bar) to see which user the process is running under.
If you are using iisnode in IIS, node.js runs under the application pool identity of the site you are running it under, typically DefaultAppPool.
To give DefaultAppPool permission (a scripted equivalent is sketched after the steps below):
right click on the folder
select properties
select the security tab
click the edit button
click the add button
enter IIS AppPool\DefaultAppPool in the box (including spaces)
click the check names button
it should now show DefaultAppPool underlined, click OK
select DefaultAppPool from the list of users and assign them the minimum required rights - resist the temptation to give DefaultAppPool excessive rights.
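If you would rather script this than click through the dialogs, here is a rough C# equivalent using System.Security.AccessControl. The folder path is a placeholder, the code must run elevated, and on .NET Core you need the System.IO.FileSystem.AccessControl package; adjust FileSystemRights down to the minimum your app actually needs:

using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

// Grant IIS AppPool\DefaultAppPool modify rights on the target folder, inherited by children.
var dir = new DirectoryInfo(@"C:\path\to\target");   // placeholder path
var security = dir.GetAccessControl();
security.AddAccessRule(new FileSystemAccessRule(
    new NTAccount(@"IIS AppPool\DefaultAppPool"),
    FileSystemRights.Modify,                          // covers creating/writing files; trim if possible
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Allow));
dir.SetAccessControl(security);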
We create all our site collections programmatically with a custom site def/template. Everything works as expected, except for the crawler, which is apparently denied access to the sites. The crawl log says:
http://server.localnetwork.lan/somesites/siteName
The object was not found. (The item was deleted because it was either not found or the crawler was denied access to it.)
And in the log files I'm getting this:
08/11/2009 14:20:34.01  OWSTIMER.EXE (0x0674)  0x1560  Search Server Common  MS Search Administration  7hmh  High  exception in SearchUpgradeProvisioner Keyword Config  System.InvalidOperationException: jobServerSearchServiceInstance is null
   at Microsoft.Office.Server.Search.Administration.SearchUpgradeProvisioner..ctor(SearchServiceInstance searchServiceInstance)
   at Microsoft.Office.Server.Search.Administration.OSSPrimaryGathererProject.ProvisionContentSources()
If I create a site collection manually the crawler is able to access it. The same users/accounts have the same access on both sites, so that shouldn't be the issue.
The code we use to actually create the site collection looks a little like this:
SPWebApplication app = SPWebApplication.Lookup(new Uri("WebApplicationUrl"));
app.FormDigestSettings.Enabled = false;
// The LCID parameter is a uint (e.g. 1033 for English), not a string
app.Sites.Add("url", "title", "description", 1033, "SiteTemplateName",
              "Owner.Username", "Owner.Fullname", "Owner.Email");
app.FormDigestSettings.Enabled = true;
The code has been slightly altered to protect the innocent... ;)
Any idea what we're doing wrong?
(Please note, I'm not sure if this is a programming error or a config/setup error, so I'm cross-posting with Serverfault)
If you receive this error whilst the crawler account (the default content access account) has read permission to all your sites then you most likely need to disable the loopback check.
http://support.microsoft.com/kb/896861
http://koenvosters.wordpress.com/2009/06/15/access-denied-when-using-hostname-search-and-site-on-moss-2007/
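If you want to script the "disable the loopback check" option from that KB (setting DisableLoopbackCheck) rather than editing the registry by hand, a minimal C# sketch; run it elevated and restart IIS or reboot afterwards:

using Microsoft.Win32;

// KB 896861: disable the loopback check entirely by setting DisableLoopbackCheck = 1.
using (var lsa = Registry.LocalMachine.OpenSubKey(@"SYSTEM\CurrentControlSet\Control\Lsa", writable: true))
{
    lsa.SetValue("DisableLoopbackCheck", 1, RegistryValueKind.DWord);
}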
I am trying to use System.Net.WebClient in a WinForms application to upload a file to an IIS 6 server that has Windows Authentication as its only 'Authentication' method.
WebClient myWebClient = new WebClient();
myWebClient.Credentials = new System.Net.NetworkCredential(@"boxname\peter", "mypassword");
byte[] responseArray = myWebClient.UploadFile("http://localhost/upload.aspx", fileName);
I get 'The remote server returned an error: (401) Unauthorized'; more specifically, it is a 401.2.
Both client and IIS are on the same Windows Server 2003 Dev machine.
When I try to open the page in Firefox and enter the same correct credentials as in the code, the page comes up.
However when using IE8, I get the same 401.2 error.
Tried Chrome and Opera and they both work.
I have 'Enable Integrated Windows Authentication' enabled in the IE Internet options.
The Security Event Log has a Failure Audit:
Logon Failure:
Reason: An error occurred during logon
User Name: peter
Domain: boxname
Logon Type: 3
Logon Process: ÈùÄ
Authentication Package: NTLM
Workstation Name: boxname
Status code: 0xC000006D
Substatus code: 0x0
Caller User Name: -
Caller Domain: -
Caller Logon ID: -
Caller Process ID: -
Transited Services: -
Source Network Address: 127.0.0.1
Source Port: 1476
I used Process Monitor and Fiddler to investigate but to no avail.
Why would this work for 3rd party browsers but not with IE or System.Net.WebClient?
I have seen a similar issue, where Integrated / NTLM security only works if you are accessing the host by machine name or localhost. In fact, this is a [poorly] documented feature in Windows that is designed to protect against "reflection attacks".
Basically, you need to create a registry key on the machine that is trying to access the server and whitelist the domain you are trying to hit. Each host name / FQDN needs to be on its own line - there are no wildcards and the name must match exactly. From the KB article:
Click Start, click Run, type regedit, and then click OK.
In Registry Editor, locate and then click the following registry key:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
Right-click MSV1_0, point to New, and then click Multi-String Value.
Type BackConnectionHostNames, and then press ENTER.
Right-click BackConnectionHostNames, and then click Modify.
In the Value data box, type the host name or the host names for the sites that are on the local computer, and then click OK.
Exit Registry Editor, and then restart the computer.
http://support.microsoft.com/kb/956158/en-us
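The same steps can be scripted; a small C# sketch (run elevated and reboot afterwards; the host name is a placeholder for whatever FQDN you use to reach the local site):

using Microsoft.Win32;

// Scripted equivalent of the KB steps above: whitelist the host names used to reach the local machine.
using (var msv10 = Registry.LocalMachine.OpenSubKey(@"SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0", writable: true))
{
    msv10.SetValue("BackConnectionHostNames",
        new[] { "mysite.example.com" },   // one host name per entry, exact match, no wildcards
        RegistryValueKind.MultiString);
}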
Have you tried ...
new NetworkCredential( "peter", "password", "boxname" );
You might also try ...
var credCache = new CredentialCache();
credCache.Add(new Uri("http://localhost/upload.aspx"),
    "Negotiate",
    new NetworkCredential("peter", "password", "boxname"));
myWebClient.Credentials = credCache;
Also, according to this it may be that IIS is configured wrong. Try replacing "Negotiate" with "Basic" in the above and checking your IIS config for the website. There's also a bunch of possible causes here.
Try going into IE's options and explicitly add the site to the Intranet Zone. Then re-run the program. You should also not run the program from an administrator login. This may trigger the Enhanced Security Configuration for Internet Explorer.
It could explain why you can hit the site with Firefox and Opera, but not with IE or WebClient.
Without knowing your IIS deployment, and assuming you have the correct authorization rules for upload set up in IIS (e.g. the right allow ACLs on the directories you are trying to upload content to), the first thing I would try is setting UseDefaultCredentials to true instead of explicitly setting Credentials. (Maybe you think you are accessing the server with the credentials you are setting, but that's not actually the case? If this works, that would confirm it.)
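A quick way to try that, reusing the WebClient and fileName from the question's snippet:

WebClient myWebClient = new WebClient();
myWebClient.UseDefaultCredentials = true;   // send the current Windows identity instead of explicit credentials
byte[] responseArray = myWebClient.UploadFile("http://localhost/upload.aspx", fileName);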
This is a very common scenario, so I would focus on the IIS authorization rules for the directory you are uploading the file into, and on the actual ACLs on that directory. For example, is your site impersonating or not? If it is, the ACLs on that directory must grant access to the impersonated user; otherwise they must grant access to whatever account the app pool is running as.