AzCopy blob download throwing errors on local machine - azure

I am running the following command while learning how to use AzCopy.
azcopy /Source:https://storeaccountname.blob.core.windows.net/container /Dest:C:\container\ /SourceKey:Key /Pattern:"tdx" /S /V
Some files are downloaded, but most files result in an error like the following. I have no idea why this is happening and wondered if somebody has encountered this and knows the cause and the fix.
[2016/05/31 21:27:13][ERROR] tdx/logs/site-visit/archive/1463557944558/visit-1463557420000: Failed to open file C:\container\tdx\logs\site-visit\archive\1463557944558\visit-1463557420000: Access to the path 'C:\container\tdx\logs\site-visit\archive\1463557944558\visit-1463557420000' is denied..
My ultimate goal was to create backups of the blobs in a container of one storage account into a container of another storage account, so I am starting out with the basics, which already seem to fail.
Here is a list of folder names from an example path pulled from Azure Portal:
storeaccountname > Blob service > container > app-logs > hdfs > logs
application_1461803569410_0008
application_1461803569410_0009
application_1461803569410_0010
application_1461803569410_0011
application_1461803569410_0025
application_1461803569410_0027
application_1461803569410_0029
application_1461803569410_0031
application_1461803569410_0033
application_1461803569410_0035
application_1461803569410_0037
application_1461803569410_0039
application_1461803569410_0041
application_1461803569410_0043
application_1461803569410_0045
There is an error in the log for each one of these folders that looks like this:
[2016/05/31 21:29:18.830-05:00][VERBOSE] Transfer FAILED: app-logs/hdfs/logs/application_1461803569410_0008 => app-logs\hdfs\logs\application_1461803569410_0008.
[2016/05/31 21:29:18.834-05:00][ERROR] app-logs/hdfs/logs/application_1461803569410_0008: Failed to open file C:\container\app-logs\hdfs\logs\application_1461803569410_0008: Access to the path 'C:\container\app-logs\hdfs\logs\application_1461803569410_0008' is denied..
The folder application_1461803569410_0008 contains two files. Those two files were successfully downloaded. From the logs:
[2016/05/31 21:29:19.041-05:00][VERBOSE] Finished transfer: app-logs/hdfs/logs/application_1461803569410_0008/10.2.0.5_30050 => app-logs\hdfs\logs\application_1461803569410_0008\10.2.0.5_30050
[2016/05/31 21:29:19.084-05:00][VERBOSE] Finished transfer: app-logs/hdfs/logs/application_1461803569410_0008/10.2.0.4_30050 => app-logs\hdfs\logs\application_1461803569410_0008\10.2.0.4_30050
So it appears that the problem is related to copying folders, which are themselves blobs, but I can't be certain yet.

There are several known issues when using AzCopy, such as the one below, which will cause this error:
If there are two blobs named “a” and “a/b” under a storage container, copying the blobs under that container with /S will fail, because Windows does not allow a folder named “a” and a file named “a” to exist in the same directory.
Refer to https://blogs.msdn.microsoft.com/windowsazurestorage/2012/12/03/azcopy-uploadingdownloading-files-for-windows-azure-blobs/ and scroll down to the bottom for the details under Known Issues.
For example, in my container con2 there is a folder named abc.pdf and also a file named abc.pdf; when executing the AzCopy download command with /S, it produces an error message.
Please check whether your container has any folders with the same name as a file.
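If you want to check your own container for this before re-running the download, a minimal sketch along these lines can help (it uses the azure-storage-blob Python package; the account URL, container name and key are placeholders taken from the question). It lists every blob and flags any name that also appears as a "folder" prefix of another blob:

from azure.storage.blob import ContainerClient

# Placeholders: point this at the container from the question.
container = ContainerClient(
    account_url="https://storeaccountname.blob.core.windows.net",
    container_name="container",
    credential="<storage-account-key>",
)

names = [b.name for b in container.list_blobs()]
name_set = set(names)

for name in names:
    # Every virtual "folder" prefix of this blob, e.g. "a" and "a/b" for "a/b/c".
    parts = name.split("/")[:-1]
    for i in range(1, len(parts) + 1):
        prefix = "/".join(parts[:i])
        if prefix in name_set:
            print(f"Conflict: blob '{prefix}' also acts as a folder for '{name}'")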

Related

Azure File Copy failing on second run in build pipeline

I am using the "Azure File Copy" task in Azure Devops, which as far as I can see, uses an Az copy command to copy a file into Azure storage.
Here's my task definition for this:
[Note - this is v3 of the task]
This works fine on first run of the task within a build pipeline, and creates the file in the container as expected (shown below):
When I run the task in the pipeline subsequent times, it fails. I can see from the error it seems to be prompting for overwrite options - Yes/No/All. See below:
My question:
Does anyone know how I can give the task arguments that will tell it to force an overwrite each time? The documentation for this on the Microsoft website isn't great, and I can't find an example in the GitHub repo.
Thanks in advance for any pointers!
Full Error:
& "AzCopy\AzCopy.exe" /Source:"D:\a\1\s\TestResults\Coverage\Reports" /Dest:"https://project1.blob.core.windows.net/examplecontainer" /#:"D:\a\_temp\36c17ff3-27da-46a2-95d7-7f3a01eab368" /SetContentType:image/png /Pattern:"Example.png"
[2020/04/18 21:29:18][ERROR] D:\a\1\s\TestResults\Coverage\Reports\Example.png: No input is received when user needed to make a choice among several given options.
Overwrite https://project1.blob.core.windows.net/examplecontainer/Example.png with D:\a\1\s\TestResults\Coverage\Reports\Example.png? (Yes/No/All) [2020/04/18 21:29:18] Transfer summary:
-----------------
Total files transferred: 1
Transfer successfully: 0
Transfer skipped: 0
Transfer failed: 1
Elapsed time: 00.00:00:01
##[error]Upload to container: 'examplecontainer' in storage account: 'project1' with blob prefix: '' failed with error: 'AzCopy.exe exited with non-zero exit code while uploading files to blob storage.' For more info please refer to https://aka.ms/azurefilecopyreadme
Not so much a solution as a workaround, but I set this to version 1 of the task and it worked for me!
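For anyone who needs to stay on v3, it may be worth trying AzCopy's /Y switch (which suppresses confirmation prompts in the v7/v8 AzCopy builds this task wraps) via the task's optional-arguments field, if your task version exposes one. Failing that, a script step with the storage SDK makes the overwrite explicit. A minimal sketch, assuming the azure-storage-blob Python package and reusing the account, container and file names from the log above, with the credential as a placeholder:

from azure.storage.blob import BlobClient

blob = BlobClient(
    account_url="https://project1.blob.core.windows.net",
    container_name="examplecontainer",
    blob_name="Example.png",
    credential="<sas-token-or-account-key>",  # placeholder
)

# overwrite=True replaces any existing blob, so no Yes/No/All prompt can block the run.
with open(r"D:\a\1\s\TestResults\Coverage\Reports\Example.png", "rb") as data:
    blob.upload_blob(data, overwrite=True)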

Azure SSIS IR - working with files in the temp folder of the IR node

I have set up a custom SSIS IR; however, I'm having problems reading files from the current working directory or temp folder on the IR node:
https://learn.microsoft.com/en-us/sql/integration-services/lift-shift/ssis-azure-files-file-shares?view=sql-server-2017
The workflow of my test package is:
Load compressed file to Azure file share
Unzip file
Modify the file, saving it to the current working directory on the IR node (the path .\testfile.json)
Load file to Azure SQL DB
The last step is where I'm having issues; I receive the error message below. It looks like it might be related to security, but I have no idea how to access the SSIS IR node to check this.
Execute SQL Task:Error: Executing the query "DECLARE @request
VARCHAR(MAX) SELECT @request =..." failed with the following error:
"Cannot bulk load because the file ".\testfile.json" could not be
opened. Operating system error code (null).". Possible failure
reasons: Problems with the query, "ResultSet" property not set
correctly, parameters not set correctly, or connection not established
correctly.
How can I fix this issue?
From just the error message, it looks like you're using BULK INSERT in an Execute SQL Task to load data into Azure SQL DB. BULK INSERT into Azure SQL DB can only read from Azure Blob Storage, not from file systems or SSIS IR nodes. To load data from the current working directory of the SSIS IR nodes into Azure SQL DB, you can use a Data Flow Task with a Flat File Source and an ADO.NET Destination.
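If you would rather keep the Execute SQL Task, one pattern (a sketch only, with placeholder names throughout, assuming the azure-storage-blob Python package and an external data source already defined in the database) is to stage the locally produced file into Blob Storage first and then point BULK INSERT at the blob:

from azure.storage.blob import BlobClient

# Placeholders: a staging container that the database's external data source points at.
blob = BlobClient(
    account_url="https://<storageaccount>.blob.core.windows.net",
    container_name="staging",
    blob_name="testfile.json",
    credential="<sas-token>",
)

# Upload the file produced in the IR node's working directory.
with open(r".\testfile.json", "rb") as data:
    blob.upload_blob(data, overwrite=True)

# The Execute SQL Task can then read it with something along these lines
# (MyAzureBlobStorage is an assumed, pre-created external data source):
#
#   BULK INSERT dbo.StagingTable
#   FROM 'testfile.json'
#   WITH (DATA_SOURCE = 'MyAzureBlobStorage');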

Azure function upload failing with "A task was canceled"

I am getting the following error when using the following command to upload my function app:
func azure functionapp publish FuncAppName
I ran this from both the parent directory of the function app and the function app directory itself, and got the same error. It looks like some task in the upload times out after a minute or so:
Publish C:\Users\username\Documents\visual studio 2017\Projects\AzureFuncApp contents to an Azure Function App. Locally deleted files are not removed from destination.
Getting site publishing info...
Creating archive for current directory...
Uploading archive...
A task was canceled.
Any idea how to solve this/get more debugging info?
The function in question already exists on Portal and is running. I was previously able to upload it successfully.
Please refer to this GitHub issue:
https://github.com/Azure/azure-functions-cli/issues/147
A change has been made to address this issue and will be included in the next CLI release.

How do I get the correct path to a folder of an Azure container?

I'm trying to read files from an Azure storage account. In particular, I'd like to read all files contained in a certain folder, for example:
lines = sc.textFile('/path_to_azure_folder/*')
I am not quite sure what the path should be. I tried the blob service endpoint URL from Azure, followed by the folder path (I tried both http and https):
lines = sc.textFile('https://container_name.blob.core.windows.net/path_to_folder/*')
and it did not work:
diagnostics: Application XXXXXX failed 5 times due to AM Container for
XXXXXXXX exited with exitCode: 1 Diagnostics: Exception from
container-launch. Container id: XXXXXXXXX Exit code: 1
The URL I provided is the same one I get from the Cyberduck app when I click on 'Info'.
Your path should look like this:
lines = sc.textFile("wasb://containerName@storageAccountName.blob.core.windows.net/folder_path/*")
This should solve your issue.
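One more thing to watch for: unless the cluster was provisioned against that storage account (an HDInsight cluster attached to it already carries the key), Spark also needs the account key in its Hadoop configuration before the wasb:// path will resolve. A minimal PySpark sketch, with the account, container and key values as placeholders:

account = "storageAccountName"   # placeholder
container = "containerName"      # placeholder

# Make the account key visible to the Hadoop/WASB filesystem layer.
sc._jsc.hadoopConfiguration().set(
    "fs.azure.account.key." + account + ".blob.core.windows.net",
    "<storage-account-key>",
)

lines = sc.textFile("wasb://" + container + "@" + account + ".blob.core.windows.net/folder_path/*")
print(lines.count())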
If you are trying to read all the blobs in an Azure Storage account, you might want to look into the tools and libraries we offer for retrieving and manipulating your data. Getting started doc here.
Hope this is helpful!

Azure copy blob to another account: invalid blob type

I want to copy a 12GB page blob from one storage account to another. At the moment, both sides are set to "public container" access, but it doesn't work: HTTP/1.1 409 The blob type is invalid for this operation.
Copying it the same way but within the same storage account works without errors.
What am I missing?
Thanks!
//EDIT: This is how I'm trying to copy blob.dat from account1 to account2 (Casablanca library):
http_client client(L"https://account2.blob.core.windows.net");
http_request request(methods::PUT);
request.headers().add(L"Authorization", L"SharedKey account2:*************************************");
request.headers().add(L"x-ms-copy-source", L"http://account1.blob.core.windows.net/dir/blob.dat");
request.headers().add(L"x-ms-date", L"Sat, 23 Nov 2013 16:50:00 GMT"); // I'm keeping this updated
request.headers().add(L"x-ms-version", L"2012-02-12");
request.set_request_uri(L"/dir/blob.dat");
auto ret = client.request(request).then([](http_response response)
{
    std::wcout << response.status_code() << std::endl << response.to_string() << std::endl;
});
The storage accounts were created a few days ago, so no restrictions apply.
Also, the destination dir is empty (account2's /dir/blob.dat does not exist).
//EDIT2:
I did more testing and found out this: uploading a new page blob (a few MB) and then copying it to another storage account worked!
Then I tried renaming the 12GB page blob that I wasn't able to copy (from mydisk.vhd to test.dat), and suddenly the copy to another storage account worked as well!
But the next problem is: after renaming test.dat back to mydisk.vhd in the destination storage account, I cannot create a disk from it (an error like "not a valid vhd file"), even though the copy completed (x-ms-copy-status: success).
What could be the problem now?
(Oh I forgot: the source mydisk.vhd lease status was "unlocked" before copying)
//EDIT3:
Well, it seems that the problem has solved itself... even with the original mydisk.vhd I wasn't able to create a disk again (invalid VHD). I don't know why, as I didn't alter it, but I created it on the Xbox One launch day, when everything was quite slow, so maybe something went wrong there. Now that I have created a new VM, I can copy the .vhd over to another storage account without problems (after deleting the disk).
I would suggest using AzCopy - Cross Account Copy Blob.
Check it out here:
http://blogs.msdn.com/b/windowsazurestorage/archive/2013/04/01/azcopy-using-cross-account-copy-blob.aspx
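If you would rather not hand-craft the REST request, the same cross-account Copy Blob can also be started from the Python storage SDK. A sketch, with the account names and blob path taken from the question and the key/SAS values as placeholders (the source must be readable by the destination service, i.e. public or carrying a SAS token):

from azure.storage.blob import BlobClient

# Destination blob in account2, authenticated with account2's credentials (placeholder).
dest = BlobClient(
    account_url="https://account2.blob.core.windows.net",
    container_name="dir",
    blob_name="blob.dat",
    credential="<account2-key-or-sas>",
)

# Source URL in account1; append a SAS token if the blob is not public.
source_url = "https://account1.blob.core.windows.net/dir/blob.dat?<sas-token>"

copy = dest.start_copy_from_url(source_url)   # server-side, asynchronous copy
print(copy["copy_status"])                    # poll dest.get_blob_properties() until it reports 'success'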
