We are working on a website that allows animated GIF uploads. To ensure an uploaded image really is an image and contains nothing but image data (no malware, virus, backdoor, or trojan), we try to recreate the original image.
However, this process can take quite some time when the GIF contains lots of frames. Is there any other way to ensure an uploaded animated GIF file is free from the issues mentioned above?
You can never 100% guarantee that a file does not contain malware - even with your approach there is a chance that the GIF contains some code that could be malicious simply by being opened in a vulnerable viewer.
That said, the chances are low, and you can expect these sorts of bugs to be patched fairly quickly in most modern operating systems.
There are various checks you can do on uploaded files, though, that take less processing time:
Check that the file name extension is what you expect - ignore the content-type at the upload stage, though, as it can be spoofed.
Virus scan all uploaded files with a scanner that has up-to-date definitions.
Do not store the files in a location where they can be executed - e.g. do not store them in the web root (www.example.com/uploads/image.aspx).
Serve the files via a program or script that reads them from storage as data and then streams the output to the browser.
When serving the files, ensure the correct content-type is set and, if possible, that the filename extension is correct. Use Content-Disposition to set the name the browser will use (see the sketch after the header example below):
Content-Disposition: attachment; filename="fname.ext"
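As a rough illustration of the last two points, here is a minimal sketch of serving an uploaded file from outside the web root with Node.js and Express; the storage path, route, and fixed image/gif type are assumptions made for the example, not the poster's actual setup.

import express from "express";
import path from "path";
import { createReadStream } from "fs";

const app = express();
const STORAGE_DIR = "/var/uploads"; // assumed storage folder, deliberately outside the web root

app.get("/files/:name", (req, res) => {
  // path.basename strips directory parts so a request cannot escape STORAGE_DIR
  const safeName = path.basename(req.params.name);
  res.setHeader("Content-Type", "image/gif"); // set the type explicitly instead of trusting the upload
  res.setHeader("Content-Disposition", `attachment; filename="${safeName}"`);
  createReadStream(path.join(STORAGE_DIR, safeName))
    .on("error", () => res.status(404).end()) // file missing or unreadable
    .pipe(res);                               // stream the data to the browser
});

app.listen(3000);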
I have successfully uploaded some audio files via Node.js to AWS, and the file URL is returned from my function. I plan to save this URL in MongoDB Atlas as a reference to the original file, but before doing that, I tried to play the file (from the URL) in my mobile app and it won't play.
The file is in .m4a format. How do I get this to work in any audio player for mobile and web? I'm using Flutter for both. I don't want to do piping, chunking, and streaming manually, as this is just a dummy test of the system. The original files to be used in the app will be much larger.
Here's the file url https://empty-bouquet.s3.af-south-1.amazonaws.com/Dax+-Dear+God.m4a.
Thanks
.m4a audio isn't streamed natively from S3, but after a test I can verify that .mp3 files are. Most browsers will recognize that file type and render a built-in player for you.
You can convert from one format to the other with a lot of free tools; I used Audacity.
And yes, you need to make at least that file public. Or, if you're going to do this a lot, I would recommend a bucket policy that makes everything public, no matter what you throw in there.
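For reference, here is a small sketch of how the converted file could be uploaded with an explicit content type and a public-read ACL using the AWS SDK v3 for JavaScript; the bucket name, key, and local path are placeholders, and your account's public-access settings still have to allow public objects.

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "fs/promises";

async function uploadAudio(localPath: string): Promise<string> {
  const client = new S3Client({ region: "af-south-1" });
  const bucket = "my-audio-bucket"; // placeholder bucket name
  const key = "dax-dear-god.mp3";   // the converted .mp3, not the original .m4a

  await client.send(new PutObjectCommand({
    Bucket: bucket,
    Key: key,
    Body: await readFile(localPath),
    ContentType: "audio/mpeg",      // lets the browser render its built-in player
    ACL: "public-read",             // or grant public reads via a bucket policy instead
  }));

  return `https://${bucket}.s3.af-south-1.amazonaws.com/${key}`;
}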
TL;DR: Word documents are packages - is it dangerous to upload them to the server, then?
I'm developing an application in Angular 4 and Node.js that at some point allows users to upload files, which should only be images, PDFs and Word documents.
Front-end validation goes well, until Node rejects a Google Docs file exported as .docx and says the file is actually a zip (application/zip).
Okay, perfectly normal - but native Word documents have a different MIME type, so:
Should I enable the upload of zip files?
Considering that I could successfully use a .docx as a zip, add a script, and go back to using it as a .docx file, should I have any security worries?
Is it possible to include malicious software inside a docx file and somehow use/run it on the server side?
Am I worrying about nothing?
There is no danger in a Word file on your server UNLESS you open it with Word or some other tool that processes it and essentially "runs" it. That provides a vector for macro malware, which could run when the file is opened in some program that executes those macros.
Just storing it and enabling others to download it does not put your server at risk in any way. A plain file that you just store or send to others upon request is just a bucket of bits that doesn't "run" any code.
If you are providing a storage and retrieval mechanism, you may want to prevent yourself from becoming a distribution channel for malware by running some sort of scanner on all files that are uploaded to you, so that you can filter out files that might harm someone else who downloads them and does attempt to open/run them.
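One hedged way to do that from Node.js is to shell out to the ClamAV command-line scanner; this sketch assumes clamscan is installed and on the PATH and relies on its exit codes (0 for clean, 1 for a match). It is an illustration of the idea, not a complete solution.

import { execFile } from "child_process";

function scanUpload(filePath: string): Promise<boolean> {
  return new Promise((resolve, reject) => {
    execFile("clamscan", ["--no-summary", filePath], (error) => {
      if (!error) return resolve(true);                     // exit code 0: no signature matched
      if ((error as any).code === 1) return resolve(false); // exit code 1: file flagged as infected
      reject(error);                                        // exit code 2 or another failure
    });
  });
}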
OK, I have never seen anything like this before and am hoping someone else has. I just finished patching our Dev and Test servers to the Nov 2017 CU (SharePoint 2013). Since then, any solutions that use JS injection from Site Assets are not updating. I'll make a change to a file, and the library reflects that I made the change, but when I load the page that uses the JS file, the changes are not reflected. Hard refreshes and full cache cleans have no effect. If I close and reopen my editor (VS Code), my changes are gone. When I look at the version history, the current version doesn't have my changes, but the previous version does. If I try to revert to that version, it doesn't take (it still shows the previous version of the file).
Here's where it gets extra weird. I have deleted the entire file from the library, reset IIS (heck, I even rebooted the server at one point), and it somehow still loads the file. The file is no longer in the library, but the server is still serving it to the browser. I have confirmed it is not coming from another location, as the dev tools show the file located in the asset library it was deleted from. Even users who have never accessed the site before still get that file in their browser.
This isn't limited to a single site either. I have other developers in different sub sites (same site collection) that are having the same issues.
Anyone seen this before?
It looks like your web application has the BLOB cache enabled, which is causing files to be served from the cache.
There are two ways to fix this:
1) The heavy-handed way would be to flush the BLOB cache using the PowerShell commands below:
$webApp = Get-SPWebApplication "<WebApplicationURL>"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
This will flush all the files in the BLOB cache. Usually, files are cached based on the max-age attribute value, which is why your file is still being served even though you deleted it from the source.
2) The surgical-knife approach would be to append a query string, e.g. https://sitecollurl/siteassets/app.js?v=1.1, to the file references (usually in master pages, page layouts, web part references, script links, etc. - wherever the file is referenced). Appending a query string forces the browser to download the newer version of the file. I would prefer this approach, as it does not unnecessarily clear other files from the BLOB cache.
Usually an upload writes files to a temp directory first and then moves them to the desired directory. But I'm working with big data, e.g. uploading thousands of files at once, so I need to upload those files directly to the desired location, and as each one lands in that directory the user must see the change on the dashboard in real time.
I also need to show the user:
Whether any exception occurred while uploading, e.g. if a file is causing a problem in the upload process.
There should be an option to skip that file or retry the upload.
A report showing the list of files uploaded successfully vs. files that failed to upload.
If there is any network outage, the upload manager should keep retrying until the network is restored.
The user can pause the upload and restart it on their next login (if that is feasible).
This is about full control of the upload process, to give the user the best experience while uploading large sets of data.
You can use ng2-file-upload; it has most of the features you require.
You can also find a demo here.
For the rest of the features you require, you can implement them on top of this library (that's better than writing your own code from scratch).
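For a sense of what that looks like, here is a rough sketch of wiring the library into an Angular component; the upload URL is a hypothetical endpoint, and the success/error hooks shown are just one place where a report or retry option could be added.

import { Component } from "@angular/core";
import { FileUploader } from "ng2-file-upload";

// FileUploadModule from ng2-file-upload must be imported in the surrounding NgModule
// for the ng2FileSelect directive used in the template to work.
@Component({
  selector: "app-bulk-upload",
  template: `
    <input type="file" ng2FileSelect [uploader]="uploader" multiple />
    <button (click)="uploader.uploadAll()">Upload all</button>
  `,
})
export class BulkUploadComponent {
  uploader = new FileUploader({ url: "/api/upload" }); // hypothetical upload endpoint

  constructor() {
    // These callbacks could feed a success/failure report or drive a retry button.
    this.uploader.onSuccessItem = (item) => console.log("uploaded", item.file.name);
    this.uploader.onErrorItem = (item) => console.warn("failed", item.file.name);
  }
}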
I have an Azure Website/Web App that is incredibly slow at serving static JS and CSS files but seems perfectly fine serving binary files.
To test the problem I uploaded two 30MB files, one big.js and the other big.rar. The JS file downloads at around 100KB/s if I'm lucky. The RAR file downloads at around 4,000KB/s. The results are extremely consistent.
I've checked in Fiddler and gzip compression is occurring in both cases. As expected, the JS file is being sent with the MIME type application/x-javascript whereas the RAR file is being served as application/octet-stream.
I am struggling to understand this - why would IIS serve one type of static content so much slower than another?
We had this issue and were able to resolve it with the help of the Azure support team. The problem was that the slow files were being sent with Transfer-Encoding: chunked. They suggested that we force static compression to get around the issue.
We had to add the following to <system.webServer>:
<serverRuntime enabled="true" frequentHitThreshold="1" frequentHitTimePeriod="00:00:20" />
To elaborate on John Tseng's answer (from here):
As you saw earlier, IIS 7 caches the compressed versions of static files. So, if a request arrives for a static file whose compressed version is already in the cache, it doesn't need to be compressed again.
But what if there is no compressed version in the cache? Will IIS 7 then compress the file right away and put it in the cache? The answer is yes, but only if the file is being requested frequently. By not compressing files that are only requested infrequently, IIS 7 saves CPU usage and cache space.
By default, a file is considered to be requested frequently if it is requested two or more times per 10 seconds.
So the reason your users are being served an uncompressed version of the JavaScript file is that it didn't meet the default threshold for being compressed; in other words, the JavaScript file was not requested twice within 10 seconds.
To control this, there is one attribute we must change on the <serverRuntime> element, which controls compression: frequentHitThreshold. For your file to be compressed when it is requested once, change your <serverRuntime> element to look like this:
<serverRuntime enabled="true" frequentHitThreshold="1" />
This will have a slight CPU cost if you serve many JavaScript files to a steady stream of users, but if you have enough traffic for compressing these files to affect the CPU, they are most likely already compressed and cached anyway!
It looks like this was an issue on IIS 8.5 and not only Azure-specific.
Now that the App Service upgrade to Windows Server 2016 looks complete, this workaround should no longer be needed.