Serving a static file with ServiceStack

How would I go about serving a static file using ServiceStack?
I would like to add a route like Routes.Add("/app") and, when a client issues a GET for this path, return a Silverlight .xap file.

ServiceStack is already able to serve static files by referencing them directly.
Otherwise, if you want a service to return a file for downloading, you can do so with:
return new HttpResult(new FileInfo("~/app.xap"), asAttachment: true) {
    ContentType = "application/x-silverlight-app"
};
Note: asAttachment controls whether or not the HTTP Content-Disposition header is sent.
More info about ServiceStack's responses is in this earlier question: ServiceStack and returning a stream
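If you do want to serve it from an explicit route as in the question, a minimal sketch could look like the code below; the GetApp DTO and AppService names are assumptions, not part of the original answer:

[Route("/app")]
public class GetApp { }

public class AppService : Service
{
    public object Get(GetApp request)
    {
        // Same HttpResult as above; resolve "~/app.xap" to an absolute path if your host requires it
        return new HttpResult(new FileInfo("~/app.xap"), asAttachment: true)
        {
            ContentType = "application/x-silverlight-app"
        };
    }
}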

Related

Blazor server app cannot download .msg files

I have a Blazor Server 6.0 app where I have links to download .msg files.
I have set up IIS to serve that MIME type, trying both application/octet-stream and application/vnd.ms-outlook (and restarting IIS).
I have also tried putting the staticContent tag in web.config, as suggested here:
.msg file gives download error
And obviously in my Program.cs I have app.UseStaticFiles();
If I put the .msg files in a non-Blazor app they download fine, so I don't think it is IIS-related.
So why can't I download (or open automatically in Outlook) this type of file, while others (docx, pdf, zip, etc.) are OK?
ASP.NET Core -- on the server side -- also needs to know about the files it has to serve. You can enable serving all unknown file types (I'd rather not include the relevant code, as it is a major security risk), or you can add your own additional mappings like so:
var provider = new FileExtensionContentTypeProvider();
provider.Mappings[".msg"] = "application/vnd.ms-outlook";

// app.UseStaticFiles();
app.UseStaticFiles(new StaticFileOptions()
{
    ContentTypeProvider = provider
});
More info in the official docs: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/static-files?view=aspnetcore-7.0#fileextensioncontenttypeprovider
Additionally, Blazor Server registers custom options for serving static files (like .server.js, which is different from just .js). This isn't directly exposed as a public API to configure, but you can look at the source here for what the AddServerSideBlazor extension method actually does. The solution there relies on you calling UseStaticFiles without explicitly specifying the options, so that it can retrieve the StaticFileOptions instance from DI.
Armed with this knowledge, you can override an already configured options instance as follows:
builder.Services.PostConfigure<StaticFileOptions>(o =>
{
    ((FileExtensionContentTypeProvider)o.ContentTypeProvider).Mappings[".msg"] = "application/vnd.ms-outlook";
});
This configures the already initialized options instance registered in DI (after all other configuration has been applied to it, hence PostConfigure).
Note that if you decide, for whatever reason, to use a different IContentTypeProvider, the unsafe cast above would need to be revised as well.
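Putting it together, here is a sketch of the overall Program.cs wiring under a standard Blazor Server template; it assumes, as described above, that AddServerSideBlazor has registered a FileExtensionContentTypeProvider, and the hub/page mappings are just the template defaults:

using Microsoft.AspNetCore.StaticFiles;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRazorPages();
builder.Services.AddServerSideBlazor();   // registers Blazor's own StaticFileOptions in DI

// Add the .msg mapping to the content type provider Blazor has already configured
builder.Services.PostConfigure<StaticFileOptions>(o =>
{
    ((FileExtensionContentTypeProvider)o.ContentTypeProvider).Mappings[".msg"] =
        "application/vnd.ms-outlook";
});

var app = builder.Build();
app.UseStaticFiles();          // no explicit options, so the DI-configured instance is used
app.MapBlazorHub();
app.MapFallbackToPage("/_Host");
app.Run();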

Trying to use HttpClient.GetStreamAsync straight to the adls FileClient.UploadAsync

I have an Azure Function that will call an external API via HttpClient. The external API returns a JSON response. I want to save the response directly to an ADLS File.
My simplistic code is:
public async Task UploadFileBulk(Stream contentToUpload)
{
    await this._theClient.FileClient.UploadAsync(contentToUpload);
}
The this._theClient is a simple wrapper class around the various Azure Data Lake classes such as DataLakeServiceClient, DataLakeFileSystemClient, DataLakeDirectoryClient, DataLakeFileClient.
I'm confident this wrapper works as I expect: I spin one up, set the service, filesystem, directory and then a filename to create. I've used this wrapper class to create directories etc., so it behaves as I expect.
I am calling the above method as follows:
await dlw.UploadFileBulk(await this._httpClient.GetStreamAsync("<endpoint>"));
I see the file getting created in the Lake directory with the name I want, however if I then download the file using Storage Explorer and try to open it in, say, VS Code, it's not in a recognisable format (I can "force" VS Code to open it, but it looks like binary to me).
If I sniff the traffic with Fiddler I can see the content from the external API is JSON, the Content-Type is application/json and the body shows in Fiddler as JSON.
If I look at the calls to the ADLS endpoint I can see a PUT call followed by two PATCH calls.
The first PATCH call looks like it is the one sending the content; it has a Content-Type header of application/octet-stream and the request body is the "binary looking content".
I am using HttpClient.GetStreamAsync as I don't want my Function to have to load the entire API payload into memory (some of the external API endpoints return very large files, over 100 MB). I am thinking I can "stream the response from the external API straight into ADLS".
Is there a way to change how the ADLS FileClient.UploadAsync(Stream stream) method works so I can tell it to upload the file as a JSON file with a content type of application/json?
EDIT:
So it turns out the external API was sending back compressed content, and once I added the following AutomaticDecompression code to my Function's startup, the files were uploaded to ADLS as expected.
public override void Configure(IFunctionsHostBuilder builder)
{
    builder.Services.AddHttpClient("default", client =>
    {
        client.DefaultRequestHeaders.Add("Accept-Encoding", "gzip, deflate");
    }).ConfigurePrimaryHttpMessageHandler(() => new HttpClientHandler
    {
        AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
    });
}
@Gaurav Mantri has given me some pointers on whether the pattern of "streaming from an output to an input" is actually correct; I will research this further.
Regarding the issue, please refer to the following code:
var uploadOptions = new DataLakeFileUploadOptions();
uploadOptions.HttpHeaders = new PathHttpHeaders();
uploadOptions.HttpHeaders.ContentType = "application/json";

await fileClient.UploadAsync(stream, uploadOptions);
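To combine this with streaming straight from the external API (so the Function never buffers the whole payload), a sketch along these lines should work; httpClient, endpoint and fileClient are placeholders for your own instances:

// Ask for the stream as soon as the headers arrive instead of buffering the body
using var response = await httpClient.GetAsync(endpoint, HttpCompletionOption.ResponseHeadersRead);
response.EnsureSuccessStatusCode();

await using var contentStream = await response.Content.ReadAsStreamAsync();

var uploadOptions = new DataLakeFileUploadOptions
{
    HttpHeaders = new PathHttpHeaders { ContentType = "application/json" }
};

// Streams the response body into the file the DataLakeFileClient points at
await fileClient.UploadAsync(contentStream, uploadOptions);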

ServiceStack Plugin How to add MimeType for new file suffix, and allow the file suffix to be served?

I would like to add the file suffix ".wasm" to the AllowFileExtensions property of the AppHost, and I'd like to associate the MimeType "application/wasm" to that file suffix, so that a Windows service based on ServiceStack can serve static files with this suffix. In my plugin's Configure method, I've tried this code, but it is not working.
public void Configure(IAppHost appHost) {
    // Add the MIME type application/wasm and associate it with .wasm files
    MimeTypes.ExtensionMimeTypes["wasm"] = "application/wasm";

    // Allow static files ending in .wasm to be served
    var config = new HostConfig();
    var allowFileExtensions = config.AllowFileExtensions;
    allowFileExtensions.Add(".wasm");
}
Requests to my ServiceStack Windows service for static files ending in .wasm return a 403 error, and the Content-Type in the response headers is "text/plain".
Any suggestions on what I'm doing wrong, and how best to allow the new suffix and associate the new MimeType?
I've added wasm file extensions to ServiceStack's allowed File Extensions list in this commit. This change is available from v5.1.1 that's now available on MyGet.
For earlier versions of ServiceStack you can register an allowed File Extension by modifying IAppHost.Config, e.g:
public void Register(IAppHost appHost)
{
    appHost.Config.AllowFileExtensions.Add("wasm");
}
You don't need to register a MimeType for wasm as the default MimeType for unknown Content-Types is application/{ext} which for .wasm returns application/wasm.
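If you prefer an explicit mapping anyway (for instance, for an extension that should not follow the application/{ext} convention), a combined sketch of the plugin registration might look like this, reusing the MimeTypes.ExtensionMimeTypes dictionary from the question:

public void Register(IAppHost appHost)
{
    // Allow static files with the .wasm extension to be served
    appHost.Config.AllowFileExtensions.Add("wasm");

    // Optional: explicit MIME mapping. Not needed for wasm, since unknown
    // extensions already default to application/{ext}, i.e. application/wasm.
    MimeTypes.ExtensionMimeTypes["wasm"] = "application/wasm";
}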

Reading a file from a UNC path and setting the correct MIME type in a HTTP request

How would I go about reading a file from a UNC path, discovering the proper MIME type, and streaming that out to a browser?
It feels to me like I'm re-inventing IIS, and I'll also have to maintain my own MIME type database for each file extension. Does the above request sound reasonable, or is there a better way?
I plan on streaming this out via a browser HTTP Get request on IIS7. If it matters, I'm also running Cognos on the same server. Any framework is OK (WCF, ASPX, etc)
Using WCF it's pretty basic.
This code can be hosted under IIS / a Windows service / WAS / etc.
I never found a convenient way to handle the MIME type; you will need your own database that maps file extensions to MIME types (a simple lookup sketch follows the code below).
[ServiceContract(SessionMode = SessionMode.NotAllowed)]
public interface IMediaRetriver
{
    [OperationContract]
    [WebGet(UriTemplate = "/get?f={fileName}")]
    Stream Get(string fileName);
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class MediaRetriver : IMediaRetriver
{
    public Stream Get(string fileName)
    {
        // Pro tip: this makes the download dialog show the file name instead of "get"
        WebOperationContext.Current.OutgoingResponse.Headers.Add(
            "Content-disposition", string.Format("inline; filename={0}", fileName));
        WebOperationContext.Current.OutgoingResponse.ContentType =
            "application/octet-stream";

        // Open read-only with sharing so concurrent requests don't lock the file
        return File.Open(fileName, FileMode.Open, FileAccess.Read, FileShare.Read);
    }
}
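For the MIME type lookup mentioned above, a minimal sketch is just a dictionary keyed by file extension; the entries below are illustrative, not a complete database:

// Illustrative extension-to-MIME-type lookup; extend as needed
private static readonly Dictionary<string, string> MimeMap =
    new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    { ".pdf", "application/pdf" },
    { ".zip", "application/zip" },
    { ".txt", "text/plain" },
    { ".jpg", "image/jpeg" },
};

private static string GetMimeType(string fileName)
{
    string mime;
    return MimeMap.TryGetValue(Path.GetExtension(fileName), out mime)
        ? mime
        : "application/octet-stream";
}

The Get method above could then set ContentType from GetMimeType(fileName) instead of always returning application/octet-stream.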

Upload a file to SharePoint through the built-in web services

What is the best way to upload a file to a Document Library on a SharePoint server through the built-in web services that version WSS 3.0 exposes?
Following the two initial answers...
We definitely need to use the Web Service layer as we will be making these calls from remote client applications.
The WebDAV method would work for us, but we would prefer to be consistent with the web service integration method.
There is additionally a web service to upload files, painful but works all the time.
Are you referring to the “Copy” service?
We have been successful with this service’s CopyIntoItems method. Would this be the recommended way to upload a file to Document Libraries using only the WSS web service API?
I have posted our code as a suggested answer.
Example of using the WSS "Copy" Web service to upload a document to a library...
public static void UploadFile2007(string destinationUrl, byte[] fileData)
{
    // List of destination Urls; just one in this example.
    string[] destinationUrls = { Uri.EscapeUriString(destinationUrl) };

    // Empty field information. This can be populated, but not for this example.
    SharePoint2007CopyService.FieldInformation information =
        new SharePoint2007CopyService.FieldInformation();
    SharePoint2007CopyService.FieldInformation[] info = { information };

    // To receive the result Xml.
    SharePoint2007CopyService.CopyResult[] result;

    // Create the Copy web service instance configured from the web.config file.
    SharePoint2007CopyService.CopySoapClient CopyService2007 = new CopySoapClient("CopySoap");
    CopyService2007.ClientCredentials.Windows.ClientCredential =
        CredentialCache.DefaultNetworkCredentials;
    CopyService2007.ClientCredentials.Windows.AllowedImpersonationLevel =
        System.Security.Principal.TokenImpersonationLevel.Delegation;

    CopyService2007.CopyIntoItems(destinationUrl, destinationUrls, info, fileData, out result);

    if (result[0].ErrorCode != SharePoint2007CopyService.CopyErrorCode.Success)
    {
        // ...
    }
}
Another option is to use plain ol' HTTP PUT:
WebClient webclient = new WebClient();
webclient.Credentials = new NetworkCredential(_userName, _password, _domain);
webclient.UploadFile(remoteFileURL, "PUT", FilePath);
webclient.Dispose();
Where remoteFileURL points to your SharePoint document library...
There are a couple of things to consider:
Copy.CopyIntoItems needs the document to already be present on some server; the document is passed as a parameter of the web service call, which will limit how large the document can be. (See http://social.msdn.microsoft.com/Forums/en-AU/sharepointdevelopment/thread/e4e00092-b312-4d4c-a0d2-1cfc2beb9a6c)
The 'http put' method (i.e. WebDAV) will only put the document in the library, but not set field values.
To update field values you can call Lists.UpdateListItems after the 'http put'.
Document libraries can have directories; you can make them with 'http mkcol'.
You may want to check in files with Lists.CheckInFile (see the sketch after the example below).
You can also create a custom web service that uses the SPxxx .NET API, but that new web service will have to be installed on the server. It could save trips to the server.
public static void UploadFile(byte[] fileData) {
    var copy = new Copy {
        Url = "http://servername/sitename/_vti_bin/copy.asmx",
        UseDefaultCredentials = true
    };

    string destinationUrl = "http://servername/sitename/doclibrary/filename";
    string[] destinationUrls = { destinationUrl };

    var info1 = new FieldInformation
    {
        DisplayName = "Title",
        InternalName = "Title",
        Type = FieldType.Text,
        Value = "New Title"
    };
    FieldInformation[] info = { info1 };

    var copyResult = new CopyResult();
    CopyResult[] copyResults = { copyResult };

    copy.CopyIntoItems(
        destinationUrl, destinationUrls, info, fileData, out copyResults);
}
NOTE: Changing the 1st parameter of CopyIntoItems to the file name, Path.GetFileName(destinationUrl), makes the unlink message disappear.
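As a rough sketch of the 'http put' plus check-in flow from the considerations above (the Lists proxy type is assumed to be a classic web reference to lists.asmx; URLs and file paths are placeholders):

// Upload via WebDAV-style PUT, then check the file in through the Lists web service
string remoteFileUrl = "http://servername/sitename/doclibrary/filename.docx";

using (var webClient = new WebClient())
{
    webClient.Credentials = CredentialCache.DefaultNetworkCredentials;
    webClient.UploadFile(remoteFileUrl, "PUT", @"C:\temp\filename.docx");
}

var lists = new Lists
{
    Url = "http://servername/sitename/_vti_bin/lists.asmx",
    UseDefaultCredentials = true
};

// CheckinType: "0" = minor version, "1" = major version, "2" = overwrite
lists.CheckInFile(remoteFileUrl, "Uploaded via HTTP PUT", "1");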
I've had good luck using the DocLibHelper wrapper class described here: http://geek.hubkey.com/2007/10/upload-file-to-sharepoint-document.html
From a colleage at work:
Lazy way: use the Windows WebDAV filesystem interface. It is bad as a programmatic solution because it relies on the WebClient service running on your OS, and it also only works on websites running on port 80. Map a drive to the document library and get on with the file copying.
There is additionally a web service to upload files, painful but works all the time.
I believe you are able to upload files via the FrontPage API but I don’t know of anyone who actually uses it.
Not sure exactly which web service to use, but if you are in a position where you can use the SharePoint .NET API DLLs, then using SPList and SPLibrary.Items.Add is really easy (see the sketch below).
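For completeness, if your code can run on the SharePoint server itself, here is a rough sketch with the server-side object model; the site URL, library name and file paths are placeholders:

// Server-side object model upload; only works in code running on the SharePoint box
using (SPSite site = new SPSite("http://servername/sitename"))
using (SPWeb web = site.OpenWeb())
{
    byte[] fileData = File.ReadAllBytes(@"C:\temp\report.docx");
    SPFolder library = web.GetFolder("Shared Documents");

    // true = overwrite an existing file with the same name
    SPFile uploaded = library.Files.Add("report.docx", fileData, true);
    uploaded.Item["Title"] = "New Title";
    uploaded.Item.Update();
}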
