How to download an Azure File share via SAS (Shared Access Signature) onto a Linux machine - linux

I have a SAS generated for a file share (with read and list privileges, no write privileges).
My SAS looks like the following:
https://test.file.core.windows.net/testf1?[some_token_here]
I used AzCopy to download the files through the above SAS onto a Windows virtual machine; however, AzCopy is not present on Linux.
How do I download the files using the above SAS onto my Linux virtual machine (I run Ubuntu 14.04, but I prefer an answer that runs on most Linux distros)? I'd prefer a single line of code to carry out the task. I tried working with the Azure CLI but was unable to find any success.
PS: I am very new to Azure.

If you are using the latest Azure CLI, you might want to try:
azure storage file upload [options] [source] [share] [path]
azure storage file download [options] [share] [path] [destination]
Please run azure storage file for more help information.
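If you only need to pull individual files through that SAS over plain HTTPS, a minimal sketch with curl also works on most distros (the file name somefile.txt inside the share is a hypothetical placeholder; the query string is the SAS token from your URL):
# hypothetical file path inside the share; append the SAS token as the query string
curl -o somefile.txt "https://test.file.core.windows.net/testf1/somefile.txt?[some_token_here]"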

Related

How can I download a folder from Google Drive or Dropbox using a command in Linux?

I am trying to download a folder in a Linux shell using a Dropbox or Google Drive link. The download works, but it is not saved as a folder; after it is downloaded I cannot access it using the cd command. So the folder is downloaded, but when I use cd, I get the message that the file is not a directory.
How can I download a folder and access it? I am also executing this in a virtual machine.
I do not know which method you are using to download the directory. In order to download a directory, you need to either recursively download all the files in it or create a tar or zip archive of the directory.
You can consider using gdown.
Please also read the detailed explanation from the following post: wget/curl large file from google drive
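For Google Drive specifically, a minimal sketch with gdown, assuming a reasonably recent gdown release that supports --folder (the folder URL is a hypothetical placeholder):
pip install gdown
# download an entire Drive folder into the current directory
gdown --folder "https://drive.google.com/drive/folders/<folder_id>"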

Can't access /dbfs/FileStore using shell commands in Databricks runtime version 7

In databricks runtime version 6.6 I am able to successfully run a shell command like the following:
%sh ls /dbfs/FileStore/tables
However, in runtime version 7, this no longer works. Is there any way to directly access /dbfs/FileStore in runtime version 7? I need to run commands to unzip a parquet zip file in /dbfs/FileStore/tables. This used to work in version 6.6, but Databricks' new "upgrade" breaks this simple core functionality.
Not sure if this matters, but I am using the Community Edition of Databricks.
When you run %sh ls /dbfs/FileStore/tables and can't access /dbfs/FileStore using shell commands in Databricks runtime version 7, it is because, by default, the folder /dbfs/FileStore does not exist in DBFS.
Try uploading some files to /dbfs/FileStore/tables.
Now run the same command again, %sh ls /dbfs/FileStore/tables, and you will see results, because the data has been uploaded into the /dbfs/FileStore/tables folder.
The /dbfs mount doesn't work on Community Edition with DBR >= 7.x - it's a known limitation.
You can work around this limitation by working with files on the driver node and uploading or downloading them with the dbutils.fs.cp command (docs). Your code would look like the following:
# write a file to the local filesystem using Python I/O APIs
...
# upload file to DBFS
dbutils.fs.cp('file:/tmp/local-path', 'dbfs:/FileStore/tables/dbfs_file.txt')
and reading from DBFS will look like the following:
# copy file from DBFS to local file_system
dbutils.fs.cp('dbfs:/tmp/dbfs_file.txt', 'file:/tmp/local-path')
# read the file locally
...
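For the original unzip use case, a minimal %sh sketch, assuming the archive has already been copied from DBFS to the driver's local disk (the archive name tables.zip and the /tmp paths are hypothetical):
%sh
# assumes the archive was first copied to the driver, e.g. with
# dbutils.fs.cp('dbfs:/FileStore/tables/tables.zip', 'file:/tmp/tables.zip')
unzip /tmp/tables.zip -d /tmp/tables_extracted
ls /tmp/tables_extracted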
I know this question is a year old, but I wanted to share other posts that I found helpful in case someone has the same question.
I found the comments in this similar question to be helpful: How to access DBFS from shell?. The comments in the aforementioned post also reference Not able to cat dbfs file in databricks community edition cluster. FileNotFoundError: [Errno 2] No such file or directory:, which I found helpful as well.
I learned that in Community Edition, ls /dbfs/FileStore/tables is not possible because DBFS itself is not mounted on the nodes and the feature is disabled.

AzCopy (DevOps pipeline) is not recognized as the name of a cmdlet, function, script file, or operable program

I have a PowerShell script that works every time when I run it from my local machine (I have AzCopy installed):
AzCopy `
/Source:C:\myfolder `
/Dest:https://mystorageaccount.blob.core.windows.net/mystoragecontainer `
/DestKey:<storage-account-access-key> `
/Pattern:"myfile.txt"
Using an Azure pipeline (Microsoft-hosted agent), this script fails with:
"AzCopy.exe : The term 'AzCopy.exe' is not recognized as the name of a cmdlet, function, script file, or operable program."
I have tried different agents but still the same error.
Which agent must I use to run AzCopy?
Am I missing the obvious?
Is there another way of doing this, still using PowerShell?
To copy files to Azure with AzCopy you can use the built-in Azure File Copy task; you don't need to use PowerShell:
In addition, you can install the Microsoft Azure Build and Release Tasks extension, which gives you another task, "Azure Copy File Extended", with more options.
Agree with Shayki Abramczyk, the AzCopy task he provided can also be used to copy files. This is another way you can consider giving a try :-)
Back to this issue. According to the error message, I think it's because of a missing SDK on the hosted agent.
Until now, Microsoft has not installed Azure.Storage.AzCopy on every hosted agent, so the agent you used may not support this.
Seven different agents are provided for users, but only Hosted VS2017, Hosted Windows 2019 with VS2019 and Hosted Ubuntu 1604 have the SDK installed that supports AzCopy.exe.
So, try one of these three agents to execute your AzCopy command with PowerShell.
Edit:
Because the executable file (AzCopy.exe) is local: where is your AzCopy.exe located? For me, it's C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
So, in the script, you first need to run a cd command to change to the directory where AzCopy.exe is located:
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
Note: do NOT drop the double quotes here, or you will get "x86 is not recognized". If your file path is not the same as mine, just change it to yours.
Then, because you are using PowerShell, you may need PowerShell syntax. Here is a complete example, modified from your script:
$source = "C:\MyFolder"
$dest = "https://mystorageaccount.blob.core.windows.net/mystoragecontainer"
$pattern = "myfile.txt"
$destkey = "<key>"
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
$azcopy = .\AzCopy.exe /Source:$source /Dest:$dest /DestKey:$destkey /Pattern:$pattern
Please give it a try.
For people like me, landing on this thread because they hit this error when calling AzCopy in a PowerShell script: it's confirmed that AzCopy is not installed in the latest (VM2019) Windows hosted image. But according to MS, the binary is present in the image, so you don't have to install it, just use the right path.
For more information about the packages installed (or saved) on the VM, you can check this Git repo.
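Note that if the image ships AzCopy v10 rather than the classic AzCopy.exe, the syntax is positional instead of /Source:/Dest:. A minimal sketch, assuming the binary is on the PATH and the destination uses a SAS token (account, container and SAS are hypothetical placeholders):
# AzCopy v10: positional source and destination, with the SAS token appended to the destination URL
azcopy copy "C:\myfolder\myfile.txt" "https://mystorageaccount.blob.core.windows.net/mystoragecontainer/myfile.txt?<sas_token>"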

How to compress/zip files in Microsoft Azure App Service console?

I know about the Kudu service in Microsoft Azure and it works great. However, I have very large data, greater than 3 GB, and it takes a very long time to download and then upload it to my new server. Is there a way to zip the data on Azure through the command line and then use wget to download this data on the new server? I have been doing this manually until now, but it takes forever to download it to my PC first and then upload it to the server through FTP.
I am logged into the Microsoft Azure App Service console. I have tried compress, Compress-Archive and even the zip command, but nothing works. It gives that famous "internal or external command not found" message:
'compress' is not recognized as an internal or external command, operable program or batch file.
How could I compress these files in the Azure console? Quick help would be appreciated.
Or is there a way to install some compression tool on this server through the command line?
Windows and Kudu now both support a native tar command, so why not use that:
tar -cvzf my_archive.tar.gz input_dir
tar -xf my_archive.tar.gz
While the unzip utility is available, there's no zip tool. One way around that is to upload the command-line version of 7-Zip, which is a standalone .exe file.
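If you go the 7-Zip route, a minimal sketch from the Kudu console, assuming you have uploaded the standalone 7za.exe to D:\home (all paths here are hypothetical):
REM create a zip of the site content with the standalone 7-Zip executable
D:\home\7za.exe a -tzip D:\home\site_backup.zip D:\home\site\wwwroot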

How to download a file from an Ubuntu virtual machine using an Azure PowerShell script

I have created an Ubuntu VM on Azure and I want to download a file stored in one of the directories of this VM.
I want to do this using PowerShell.
If you just want to grab a couple of files, you can use pscp. You can download pscp from here: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Usage:
pscp.exe -r -pw 'password' adminuser@hostname.cloudapp.net:/path 'local-path'
If you want to do this more than once or from multiple clients, you can serve the files with a web server, e.g. Apache. Then you can just use Invoke-WebRequest to download the files via HTTP.
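A quick shell-side sketch of that idea, swapping in Python's built-in http.server for Apache and wget for Invoke-WebRequest (host name, port and file name are hypothetical):
# on the Ubuntu VM: serve the directory containing the file over HTTP
python3 -m http.server 8080
# on the client: fetch the file (Invoke-WebRequest works the same way from PowerShell)
wget http://hostname.cloudapp.net:8080/myfile.txt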
Virtual machines in Azure are completely locked down.
Do you just need to download the file once?
I am trying to understand your requirement of using PowerShell.
Here are a few options:
1. Manually FTP the file from Ubuntu to an FTP server and download it from there.
2. Use the Azure command-line tools that run on Ubuntu.
For option 2, you should install the Azure CLI:
https://github.com/Azure/azure-xplat-cli
There are instructions for Ubuntu distributions.
Once the Azure CLI has been installed, you can use the azure storage command-line options to transfer the file to an Azure storage container (a sketch follows at the end of this answer).
Once the file is in an Azure container, you can use PowerShell to download it.
You can also use the AzCopy tool to download the file from the container.
https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/
Let me know if this meets your needs.
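A minimal sketch of option 2, run on the Ubuntu VM with the classic azure-xplat-cli (the storage account, key, container and file names are hypothetical):
# create a container and upload the file from the Ubuntu VM with the xplat CLI
azure storage container create backups --account-name mystorageaccount --account-key "<storage_key>"
azure storage blob upload /home/azureuser/myfile.txt backups myfile.txt --account-name mystorageaccount --account-key "<storage_key>"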
