Upload of large company snapshot results in error "the file exceeds the maximal allowed size (1048576 KB)" - Acumatica

Trying to upload a large Acumatica company snapshot file (1.3 GB) and I am getting an error as soon as I hit the upload button.
What setting (if any) can I change in my local Acumatica site or web.config to allow the large file import?
As a workaround, I am requesting a snapshot file without file attachments, since the attachment data accounts for about 95% of the snapshot file size.
My file upload preferences are currently set to 25000 KB, if that helps. (I assume this setting is not used for snapshot imports.)
The error occurs after I select the file and click OK (before being able to click the upload button). I am using 2017 R2 Update 4.

Modifying your web.config might work, but I think Sergey Marenich's alternative is better. He wrote an excellent post on his blog on how to do this:
http://asiablog.acumatica.com/2017/12/restore-large-snapshot.html
The idea is:
Get a snapshot of your site in XML format
Extract and put the folder in C:\Program Files (x86)\Acumatica ERP\Database\Data
Use the Configuration Wizard to deploy a site and select your snapshot data, just like you would when choosing demo data.

If you're on SaaS, you may request a copy of the database and restore it to an off-site instance.
If you're on PCS/PCP, you have a couple of options. You could modify the Web.config to allow bigger files to be processed, as detailed in this blog: https://acumaticaclouderp.blogspot.com/2017/12/acumatica-snapshots-uploading-and.html (a rough sketch of the relevant settings follows below).
If your files are larger still, you can't upload them at all because of the IIS constraint. You can certainly use Sergey's method, but that only works when creating a new instance; a simpler approach is to take a SQL .bak file and restore it to a new database.
I think Acumatica should provide a mechanism to split these large files and process them as multiple uploads, but then again, very few customers are likely to face this issue.
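As a rough illustration of the kind of Web.config changes that blog post describes, the request-size limits usually involved are ASP.NET's httpRuntime maxRequestLength (in KB) and IIS request filtering's maxAllowedContentLength (in bytes). The values below are examples only, not the blog's exact numbers; adjust them to comfortably exceed your snapshot size.
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB (2097152 KB = 2 GB); executionTimeout is in seconds -->
    <httpRuntime maxRequestLength="2097152" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes (2147483648 bytes = 2 GB) -->
        <requestLimits maxAllowedContentLength="2147483648" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
If your site's Web.config already contains an httpRuntime element, change the size-related attributes on the existing element rather than adding a second one.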

I had this same problem. I tried to modify the web.config, but that gave me an error saying the file didn't exist or I didn't have permissions when I tried to import the snapshot file into Acumatica again.
It turns out I had a table with image blobs stored inside it, so it wouldn't compress. Watch out for that one.

Related

How can I completely delete a file that was uploaded to Gitlab in an issue comment?

Someone uploaded (attached) a file in a Gitlab issue comment. They did not mean to share that file publicly. I can delete the comment, but the file is still available via the original direct url. The file is at:
https://gitlab.com/<username>/<repo>/uploads/<hash>/<filename>
Is there any way to completely remove files from this uploads directory?
Short version: there is the server-side Uploads administration | GitLab documentation, but little to nothing else.
TLDR:
For the owner of a repository, there seems to be no way to get hold of these uploads directly; there doesn't even seem to be a way to list all uploads pertaining to a specific repository (or user/owner), let alone modify them.
Use cases where this would be desirable:
deletion of data that should not be exposed but erroneously has been.
down-scaling of oversized files (images, PDFs, etc.)
replacing files with updated versions
deleting space-hogs that are no longer needed.
deleting files that got uploaded accidentally by trigger-happy mice or when the result of a previous upload didn't show in time for the impatient user.
Making these files changeable would cause several issues rooted in their current/previous immutable status:
Users aware of this status will frequently re-use the URL of an already-uploaded file in other issues, or in the associated wiki (even across projects), to avoid duplication. AFAIK there is no such thing as a link count for upload items, so deleting an item might result in orphaned references, and changing an uploaded file might render other references out of context.
It would solve the serious issue of leaked information, though. The only way I have found so far to remove a file is to send a prayer to the administrator of the GitLab server and ask him/her to take care of the uploads directory on the server, as described in Uploads administration | GitLab.
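For reference, that server-side cleanup might look roughly like the sketch below, assuming an Omnibus installation with the default uploads path; the exact directory layout varies by GitLab version, so treat the paths as illustrative only.
# Run on the GitLab server (Omnibus install); paths are illustrative and version-dependent.
# Locate the offending upload by file name, then remove that one file.
sudo find /var/opt/gitlab/gitlab-rails/uploads -type f -name "<filename>"
sudo rm "<full-path-reported-by-find>"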

JS Files not updating in Site Assets

Ok, I have never seen anything like this before and hoping someone else has. I just finished patching our Dev and Test servers to Nov2017CU (SharePoint 2013). Since then, any solutions that are using JS injection from Site Assets are not updating. I'll make a change to the file, the library reflects that I made the change, but when I attempt to load the page accessing the js file, the changes are not reflected. Hard refreshes and full cache cleans are not affecting it. If I close and reopen my editor (VSCode) my changes are gone. When I look at the version history, the current version doesn't have my changes, but the previous version does. If I try to revert to that version, it doesn't take (still shows the previous version of the file).
Here's where it becomes extra weird. I have deleted the entire file from the library. Reset IIS (heck, I even rebooted the server at one time). It somehow still loads the file. The file is no longer in the library, but the server is still serving it up to the browser. I have confirmed it is not getting it from another location as the Dev tools are showing the file is located in the Asset Library the file was deleted from. Even users who have never accessed the site before are still getting that file in their browser.
This isn't limited to a single site either. I have other developers in different sub sites (same site collection) that are having the same issues.
Anyone seen this before?
Looks like your web application has the BLOB cache enabled, which is causing files to be served from the cache.
There are two ways to fix this:
1) The heavy-handed way would be to flush the BLOB cache using the PowerShell commands below:
$webApp = Get-SPWebApplication "<WebApplicationURL>"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
This will flush all the files in the BLOB cache. Usually, files are cached based on the max-age attribute value; that is why your file was still being served even after you deleted it from the source.
2) The surgical-knife approach would be to append a query string, like (https://sitecollurl/siteassets/app.js?v=1.1), to the file references (usually in the master page, page layouts, web part references, script links, etc., wherever the file is referenced). Appending a query string to the file reference forces the browser to download the newer version of the file. I would prefer this approach, as it will not unnecessarily clear other files from the BLOB cache.
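For example, a script link in the master page or a page layout might be updated from the plain file reference to the versioned one (the path here is illustrative):
<script type="text/javascript" src="/siteassets/app.js?v=1.1"></script>
Bumping the version number (v=1.2, v=1.3, ...) on each deployment keeps forcing fresh downloads.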

Kentico 9 media library files are 0KB after upload

I think this is a server setting issue, but when images are uploaded, the file length ends up being 0 KB. I don't get any errors in the event log. I can see the file written to the server, it just has no data. I don't know where to look for a fix.
The first step I would take is to make sure the IIS application pool identity has full control over your CMS folder. If this isn't set, IIS may not be able to write or modify files there. You can do this by right-clicking your CMS folder and going to Properties, Security, then adding a user and searching for "IIS APPPOOL\TheAppPoolName" on the local machine.
If you're hosted, they may have tools in their file editor to do the same.
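If you manage the server yourself and prefer to script the permission grant, a minimal sketch from an elevated command prompt is below; the folder path C:\inetpub\wwwroot\CMS and the pool name TheAppPoolName are placeholders for your own values.
rem Grant the application pool identity modify rights on the CMS folder and everything below it
icacls "C:\inetpub\wwwroot\CMS" /grant "IIS APPPOOL\TheAppPoolName":(OI)(CI)M /T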

How to develop a real time file upload with Angular 2 and Node.js?

Usually, an upload puts files in a temp directory first and then moves them to the desired directory. But I'm working with Big Data, e.g. uploading thousands of files at once, so I need to upload those files directly to the desired location, and as each of them arrives in that directory the user must see the change on the dashboard in real time.
I also need to show the user:
Whether any exception occurred while uploading, e.g. a file causing a problem in the upload process.
There should be an option to skip that file or retry upload.
A report showing the list of files uploaded successfully vs. files that failed to upload.
If there is any network outage, the upload manager should keep retrying until the network is restored.
The user can pause an upload and restart it on the next login (if feasible).
This is about full control of the upload process, to give the user the best experience while uploading large sets of data.
You can use ng2-file-upload; it has most of the features you require.
You can also find a demo here.
For the rest of the features you require, you can implement them on top of this library (better than writing your own code from scratch). A minimal sketch of wiring it up follows.
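The sketch below wires up ng2-file-upload in an Angular component; the upload endpoint /api/upload is a hypothetical Node.js route, and FileUploadModule must also be imported in your NgModule for the ng2FileSelect directive to work.
// uploader.component.ts - sketch only, based on ng2-file-upload's documented FileUploader API
import { Component } from '@angular/core';
import { FileUploader } from 'ng2-file-upload';

@Component({
  selector: 'app-uploader',
  template: `
    <input type="file" ng2FileSelect [uploader]="uploader" multiple />
    <div *ngFor="let item of uploader.queue">
      {{ item.file.name }} - {{ item.progress }}%
    </div>
    <button (click)="uploader.uploadAll()">Upload all</button>
  `
})
export class UploaderComponent {
  // '/api/upload' is a placeholder for your server-side upload route
  uploader = new FileUploader({ url: '/api/upload' });

  constructor() {
    // React as each file finishes so a dashboard can update in near real time
    this.uploader.onCompleteItem = (item, response, status) => {
      console.log('Uploaded', item.file.name, 'status', status);
    };
  }
}
Features such as pause/resume across logins and automatic retry on network outages are not built in; those would have to be implemented on top, as noted above.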

Publish website to Azure, remove additional files at destination, but ignore specific folders

I currently manually delete obsolete folders from a published Azure website. I know there is an option in Visual Studio to Remove additional files at destination. My problem is that I have an Images folder (quite large) that users upload to, which will be deleted when I publish with this option checked. My question is: is there a way to use this option with exclusions? Meaning, to delete all files that are not in the local project except the "\Images" folder?
You can most likely customize the Web Deploy usage from VS to do what you want (a rough sketch follows below), but I don't think I would recommend it, since things like that tend to get fragile.
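For completeness, that Web Deploy customization usually takes the form of skip rules in the publish profile (.pubxml). The sketch below is illustrative only: the rule name and path regex are placeholders, and skip rules that affect deletes generally require publishing via msdeploy.exe rather than the default in-process provider.
<!-- Properties\PublishProfiles\YourProfile.pubxml (sketch; names and regex are illustrative) -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <UseMsDeployExe>true</UseMsDeployExe>
  </PropertyGroup>
  <ItemGroup>
    <MsDeploySkipRules Include="SkipImagesFolder">
      <SkipAction>Delete</SkipAction>
      <ObjectName>dirPath</ObjectName>
      <AbsolutePath>\\Images$</AbsolutePath>
    </MsDeploySkipRules>
  </ItemGroup>
</Project>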
I would suggest changing your architecture to store the images in a blob container, then possibly mapping your blobs to a custom domain (https://azure.microsoft.com/en-us/documentation/articles/storage-custom-domain-name/).
Having your images in blob storage will also prevent any accidental deletion of the Images folder by someone else that doesn't know it shouldn't be touched (or you simply forgetting about it one day).
Using blob storage will also allow you to configure CDN usage if you ever find that you need it.
Another option would be to create a virtual directory in your Web App configuration and put the Images there - that way your VS deploy/publish wouldn't modify that subdirectory. This link may help with that: https://blogs.msdn.microsoft.com/tomholl/2014/09/21/deploying-multiple-virtual-directories-to-a-single-azure-website/
