Uploading Excel to Azure Storage corrupts the file and shows a security warning as well

I am uploading an Excel MemoryStream to Azure Storage as a blob. The blob saves successfully, but the file appears corrupted when opening or downloading it (tested with Excel).
For .csv files, a security warning appears every time, although the file opens normally after that.
The same MemoryStream works fine locally, as I am able to write it out as an Excel/CSV file with no errors.
Any help is appreciated!

Got the answer after some Googling.
I was uploading an Excel/CSV file to Azure Storage, and opening the file (especially .csv) produced a security warning, even though the same MemoryStream was working fine locally.
Found an interesting answer here:
"It is possible for .csv files to contain potentially malicious code, so we purposely don't include it in our list of safe-to-open files.
In a future update, we could provide a way to customize the list of files a user would consider safe to open."
The link is: https://github.com/microsoft/AzureStorageExplorer/issues/164
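For reference, here is a minimal sketch of how such an upload might look with the Azure.Storage.Blobs SDK, setting the content type explicitly. The container and blob names are illustrative, the connection string is a placeholder, and GetExcelStream() stands in for whatever builds your workbook. Forgetting to rewind the MemoryStream before uploading is a common cause of "corrupted" blobs, so the sketch does that as well.

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System.IO;

// All names here are illustrative; GetExcelStream() is a hypothetical helper
// that returns the MemoryStream containing your workbook.
string connectionString = "<storage connection string>";
MemoryStream ms = GetExcelStream();

var container = new BlobContainerClient(connectionString, "reports");
container.CreateIfNotExists();

ms.Position = 0; // rewind before uploading; a non-zero position uploads a truncated file

container.GetBlobClient("report.xlsx").Upload(ms, new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders
    {
        // Content type for .xlsx; use "text/csv" when the stream holds CSV data
        ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
    }
});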

Related

Azure Data Factory SFTP

I have copied a file from an SFTP location. It is a zip file containing a CSV, but when the file arrived in Azure Blob Storage its extension came out as .zip.txt. Can someone explain why this happens and how I can get the CSV as it is?
Have you tried using the "compression type" option on the dataset?
This will work for legacy zip. If the zip uses AES encryption or a password, you will need a custom activity and do the unzipping in an Azure Function with some code inside (a sketch is shown below).
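If you go the Azure Function route, a minimal sketch of the unzip step using System.IO.Compression might look like the following. The container name and wiring are illustrative assumptions, and note that System.IO.Compression cannot open AES-encrypted or password-protected zips (a third-party library is needed for those).

using System;
using System.IO;
using System.IO.Compression;
using Azure.Storage.Blobs;

// Illustrative sketch: extract the CSV entries from a zip stream and upload them as blobs.
void ExtractCsvsToBlob(Stream zipStream, string connectionString, string containerName)
{
    var container = new BlobContainerClient(connectionString, containerName);
    container.CreateIfNotExists();

    using var archive = new ZipArchive(zipStream, ZipArchiveMode.Read);
    foreach (var entry in archive.Entries)
    {
        if (!entry.Name.EndsWith(".csv", StringComparison.OrdinalIgnoreCase))
            continue;

        // Copy the entry into a seekable stream before uploading.
        using var ms = new MemoryStream();
        using (var entryStream = entry.Open())
            entryStream.CopyTo(ms);
        ms.Position = 0;

        container.GetBlobClient(entry.Name).Upload(ms, overwrite: true);
    }
}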

XLSX files in Azure Blob Storage get downloaded as zip files

We have some files in our Azure blob storage - they are all xlsx files.
When we download them via the Azure portal (we navigate to the storage account, then to the container, then select a file and download it), the file downloads and saves as a zip file.
If, after downloading, we change its extension to xlsx, then Excel recognizes it and opens it without issues. However, something is forcing the extension to change from xlsx (as we see it in the container) to .zip while it is downloaded.
The same happens when we access the files programmatically (via C# code) or generate a shared access signature.
What could it be and how can we fix it?
Thanks!
My workaround when accessing xlsx files programmatically with C# is to manually add the MIME type specifically for the xlsx file type, as those were the ones giving me issues (PDFs and pictures work fine). PS: I store file names in my DB along with the corresponding file data, i.e.:
if (YourModel.FileName.EndsWith("xlsx"))
{
    // Serve the stored bytes with the xlsx MIME type so the browser keeps the extension
    return File(YourModel.FileData, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
}
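If the problem is the Content-Type stored on the blob itself (an .xlsx file is internally a zip archive, so a content type of application/zip or application/octet-stream can make browsers and the portal save it with a .zip extension), one possible fix is to set the content type on the existing blob explicitly. A minimal sketch with the Azure.Storage.Blobs SDK, using illustrative names:

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Illustrative sketch: correct the stored Content-Type of an existing .xlsx blob.
string connectionString = "<storage connection string>";
var blob = new BlobClient(connectionString, "my-container", "report.xlsx");

// Note: SetHttpHeaders replaces all HTTP headers on the blob,
// so re-set any other headers (e.g. ContentEncoding) you rely on.
blob.SetHttpHeaders(new BlobHttpHeaders
{
    ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
});

The same content type can also be set at upload time, which avoids having to patch blobs afterwards.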

How to load a local file into Azure SQL DB

I have not been able to find a solution to this so will ask the experts.
A co-worker has a .txt file on his laptop that we want to load into Azure SQL DB using SSMS and BULK INSERT. We can open the local file easily enough, but we don't know how to reference this file in the FROM clause.
Assuming a file named myData.txt is saved to
c:\Users\Someone
how do we tell Azure SQL DB where that file is?
You don't. :) You have to upload the file to Azure Blob Storage, and from there you can use BULK INSERT or OPENROWSET to open the file.
https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
I've written an article that describes the steps to open a JSON file here:
https://medium.com/@mauridb/work-with-json-files-with-azure-sql-8946f066ddd4
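To make that concrete, here is a minimal sketch of the two steps: uploading the local file to blob storage from C# and then running BULK INSERT against it. The container, table, and external data source names are illustrative, and the CREATE EXTERNAL DATA SOURCE / SAS credential setup described in the linked docs is assumed to already exist.

using Azure.Storage.Blobs;
using Microsoft.Data.SqlClient;

// Illustrative sketch: stage the file in blob storage, then bulk-load it into Azure SQL DB.
string storageConnectionString = "<storage connection string>";
string azureSqlConnectionString = "<Azure SQL DB connection string>";

var container = new BlobContainerClient(storageConnectionString, "import-files");
container.CreateIfNotExists();
container.GetBlobClient("myData.txt").Upload(@"c:\Users\Someone\myData.txt", overwrite: true);

using var conn = new SqlConnection(azureSqlConnectionString);
conn.Open();

// 'MyAzureBlobStorage' is an external data source pointing at the container (see the docs above).
// FIRSTROW = 2 assumes a header row; adjust the terminators to match the file.
var bulkInsert = @"
    BULK INSERT dbo.MyTable
    FROM 'myData.txt'
    WITH (DATA_SOURCE = 'MyAzureBlobStorage',
          FIELDTERMINATOR = ',',
          ROWTERMINATOR = '\n',
          FIRSTROW = 2);";

using var cmd = new SqlCommand(bulkInsert, conn);
cmd.ExecuteNonQuery();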
I fixed this problem by loading the file into a local database and then using a linked server to my Azure DB to insert or update the records. Much easier than creating blob storage. However, if the file is very big or you have a lot of files to upload, you might not want to use my method, as a linked server is not the quickest connection.

Upload of large company snapshot results in error "the file exceeds the maximal allowed size (1048576 KB)"

Trying to upload a large Acumatica company snapshot file (1.3 GB) and I am getting an error as soon as I hit the upload button.
What setting (if any) can I change in my local Acumatica site or web.config to allow the large file import?
As a workaround, I am requesting a snapshot file without file attachments, as the attachment data is about 95% of the snapshot file size.
My file upload preferences are currently set to 25000 KB if that helps any. (I assume this setting is not used for snapshot imports.)
The error occurs after I select the file and click ok (before being able to click the upload button). I am using 2017R2 Update 4.
Modifying your web.config might work, but I think Sergey Marenich's alternative is better. He did an excellent post on his blog on how to do this.
http://asiablog.acumatica.com/2017/12/restore-large-snapshot.html
The idea is:
Get a snapshot of your site in XML format
Extract and put the folder in C:\Program Files (x86)\Acumatica ERP\Database\Data
Use the Configuration Wizard to deploy a site and select your snapshot data, just like you would when choosing demo data.
If you're on SaaS, then you may request a copy of the database and restore it to an off-site instance.
If you're on PCS/PCP, then you have a couple of options: you could modify the Web.config to allow bigger files to be processed, as detailed in this blog: https://acumaticaclouderp.blogspot.com/2017/12/acumatica-snapshots-uploading-and.html (a minimal Web.config sketch follows below).
If you have larger files then you can't do it because of an IIS constraint. You can certainly use Sergey's method, but that works only when creating a new instance; a simpler approach is to take a SQL .bak file and restore it to a new database.
I think Acumatica should provide a mechanism to split these large files and process them as multiple uploads, but then again, very few customers are likely to face this issue.
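For reference, the Web.config settings that govern upload size are the ASP.NET and IIS request limits. The values below are only a sketch (maxRequestLength is in KB, maxAllowedContentLength is in bytes, and IIS caps the latter at roughly 4 GB), so adjust them to your snapshot size:

<!-- Illustrative values only -->
<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>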
I had this same problem. I tried to modify the web.config, but that gave me an error saying the file didn't exist or I didn't have permissions when I tried to import the snapshot file into Acumatica again.
It turns out I had a table with image blobs stored inside it, so it wouldn't compress. Watch out for that one.

CloudBerry PowerShell Multipart

I've written a PowerShell script to upload from a Windows system to an Amazon S3 bucket. The script successfully uploads all files except those over 5 GB. I have the CloudBerry Explorer Pro license, which allows multipart upload for files up to 5 TB. However, there is no flag for multipart in the PowerShell snap-in documentation. CloudBerry support directed me here, as they only support the GUI, not the PowerShell snap-in. When running my script I get the error
"WARNING: Your proposed upload exceeds the maximum allowed object size (5 Gb)".
So the question is: does anyone know if there is a command-line option, or another way, to enable multipart upload to Amazon S3 using CloudBerry Explorer Pro's PowerShell snap-in?
Set-CloudOption UseChunks=true
I'm looking for the same in PowerShell.
I believe that in the GUI, the original chunking mechanism has been deprecated. I have not tested it myself, but I assume the PowerShell option UseChunks=true is still using the old mechanism? If so, files may be split into multiple parts and not automatically recombined when they arrive on S3. The new GUI Multipart Upload facility sorts this all out for you.
It's annoying that CloudBerry still advertises PowerShell as a component of Explorer (Free & Pro) but doesn't support it, even for fully paid-up Pro support customers.
We did purchase the CloudBerry Explorer Pro license for the native multipart upload capability, but we wanted to automate it. Based on their documentation, I believe the old chunk method is deprecated in favor of their new multipart functionality. We wound up testing the options listed in the PowerShell documentation. Those options are as follows:
Set-CloudOption -UseChunks -ChunkSizeKB
"Defines a size of chunk in KB; files larger than a chunk will be divided into chunks."
We verified that this successfully uploaded files beyond the 5 GB restriction to our S3 bucket. I attempted to get a response from CloudBerry as to whether this was the old chunking method or the new multipart method, but I was unable to get a straight answer. They confirmed that because we were using Pro, this PowerShell option was supported, but they failed to confirm which mechanism the PowerShell command was using.
From what I can tell, CloudBerry's legacy chunking mechanism would just break the file into individual files, which would therefore appear in S3 as multiple files. The Chunk Transparency mechanism in CloudBerry Explorer makes the multiple chunks appear as a single file in the CloudBerry Explorer GUI only. Since I can see the file as a single file on the S3 side, I'm assuming that the PowerShell option uses the new multipart functionality and not the legacy chunking functionality. Again, I was not able to confirm this through CloudBerry, so it's speculation on my part.
However, I can confirm that using the PowerShell options above will get you around the 5 GB upload limit when using PowerShell.
