Download file written to DBFS through Python code from Azure Databricks [duplicate] - databricks

Normally I use the URL below to download a file from the Databricks DBFS FileStore to my local computer.
*https://<MY_DATABRICKS_INSTANCE_NAME>/fileStore/?o=<NUMBER_FROM_ORIGINAL_URL>*
However, this time the file is not downloaded and the URL leads me to the Databricks homepage instead.
Does anyone have a suggestion on how I can download a file from DBFS to my local machine, or how I should fix the URL to make it work?
Any suggestions would be greatly appreciated!
PJ

Method1: Using the Databricks portal GUI, you can download full results (max 1 million rows).
Method2: Using the Databricks CLI
To download full results, first save the file to DBFS and then copy the file to the local machine using the Databricks CLI as follows:
dbfs cp "dbfs:/FileStore/tables/my_my.csv" "A:\AzureAnalytics"
You can access DBFS objects using the DBFS CLI, DBFS API, Databricks file system utilities (dbutils.fs), Spark APIs, and local file APIs.
In a Spark cluster you access DBFS objects using Databricks file system utilities, Spark APIs, or local file APIs.
On a local computer you access DBFS objects using the Databricks CLI or DBFS API.
Reference: Azure Databricks – Access DBFS
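From a local machine you can also call the DBFS REST API directly. The sketch below uses curl with placeholders for the instance name, personal access token, and file path; note that the read endpoint returns at most 1 MB of base64-encoded data per call, so larger files need repeated calls with an increasing offset.
# Read the first 1 MB of a DBFS file as base64-encoded JSON
curl -s -H "Authorization: Bearer <PERSONAL_ACCESS_TOKEN>" \
  "https://<MY_DATABRICKS_INSTANCE_NAME>/api/2.0/dbfs/read?path=/FileStore/tables/my_my.csv&offset=0&length=1048576"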
The DBFS command-line interface (CLI) uses the DBFS API to expose an easy-to-use command-line interface to DBFS. Using this client, you can interact with DBFS using commands similar to those you use on a Unix command line. For example:
# List files in DBFS
dbfs ls
# Put local file ./apple.txt to dbfs:/apple.txt
dbfs cp ./apple.txt dbfs:/apple.txt
# Get dbfs:/apple.txt and save to local file ./apple.txt
dbfs cp dbfs:/apple.txt ./apple.txt
# Recursively put local dir ./banana to dbfs:/banana
dbfs cp -r ./banana dbfs:/banana
Reference: Installing and configuring Azure Databricks CLI
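For reference, installing and configuring the (legacy) Databricks CLI usually amounts to the two steps sketched below; the configure command prompts for your workspace URL and a personal access token.
# Install the legacy Databricks CLI, which provides the dbfs command group
pip install databricks-cli
# Store the workspace URL and personal access token (prompted interactively)
databricks configure --token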
Method3: Using a third-party tool named DBFS Explorer
DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and Azure instances of Databricks. You will need to create a bearer token in the web interface in order to connect.

Related

Unmounting and Mounting fileshare in azure

I have a storage account with a file share mounted that contains 300+ files. If I try unmounting it with the command below,
sudo umount /xyx/files
Then what is the command to mount it back? Is it
sudo mount /xyx/files ???
I initially mounted it from a Windows share to the Linux OS. Do I need to use the same command I used then, or the mount command above?
If I use the same command, will there be any loss of my files?
I tried to reproduce the same in my environment by mounting a file share in a storage account.
First, make sure your storage account is accessible from the public network.
I then mounted sample files to the file share successfully.
To unmount the Azure file share, use the command below:
sudo umount /mnt/<yourfileshare>
In the event that any files are open, or any processes are running in the working directory, the file system cannot be unmounted.
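To mount the share back on Linux, an Azure file share is normally mounted over SMB/CIFS rather than with a bare sudo mount. A rough sketch is shown below; the storage account name, share name, and account key are placeholders, and the cifs-utils package must be installed.
# Re-mount the Azure file share over SMB 3.0 (replace the placeholders with your values)
sudo mount -t cifs //<storage-account>.file.core.windows.net/<share-name> /mnt/<yourfileshare> \
  -o vers=3.0,username=<storage-account>,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino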
To unmount a file share drive on Windows, you can use the command below:
net use <drive-letter> /delete
Once the unmount completes, the mount point is removed and you can no longer access the data on the storage account through that mount point. If files have been deleted from the file share, restoring them is only possible if soft delete was already enabled; in that case you can enable the 'Show deleted shares' option on the file share and use undelete to recover them.
Reference: Mount Azure Blob storage as a file system on Linux with BlobFuse | Microsoft Learn

Azure Databricks - Export and Import DBFS filesystem

We have just created a new Azure Databricks resource in our resource group. In the same resource group there is an old instance of Azure Databricks. Starting from this old Databricks instance, I would like to copy the data stored in DBFS into the newest Databricks instance.
How could I do that? My idea is to use FS commands to copy or move data from one DBFS to another, probably by mounting the volumes, but I do not understand how to do that.
Do you have any indications?
Thanks,
Francesco
Unfortunately, there is no direct method to export and import files/folders from one workspace to another workspace.
Note: It is highly recommended not to store any production data in the default DBFS folders.
How to copy files/folders from one workspace to another workspace?
You need to manually download files/folders from one workspace and upload files/folders to another workspace.
The easiest way is to use DBFS Explorer:
Click this link to view: https://imgur.com/aUUGPXR
Download file/folder from DBFS to the local machine:
Method1: Using Databricks CLI
The DBFS command-line interface (CLI) uses the DBFS API to expose an easy-to-use command-line interface to DBFS. Using this client, you can interact with DBFS using commands similar to those you use on a Unix command line. For example:
# List files in DBFS
dbfs ls
# Put local file ./apple.txt to dbfs:/apple.txt
dbfs cp ./apple.txt dbfs:/apple.txt
# Get dbfs:/apple.txt and save to local file ./apple.txt
dbfs cp dbfs:/apple.txt ./apple.txt
# Recursively put local dir ./banana to dbfs:/banana
dbfs cp -r ./banana dbfs:/banana
Reference: Installing and configuring Azure Databricks CLI and Azure Databricks – Access DBFS
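If both workspaces are reachable from the same machine, one way to chain the download and upload is with two CLI connection profiles. The profile names old and new and the folder paths below are only placeholders for illustration.
# Configure one profile per workspace (each prompts for host and token)
databricks configure --token --profile old
databricks configure --token --profile new
# Pull a folder from the old workspace to the local machine, then push it to the new one
dbfs cp -r dbfs:/FileStore/tables ./tables_backup --profile old
dbfs cp -r ./tables_backup dbfs:/FileStore/tables --profile new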
Method2: Using a third-party tool named DBFS Explorer
DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and Azure instances of Databricks. You will need to create a bearer token in the web interface in order to connect.
Upload file/folder from the local machine to DBFS:
There are multiple ways to upload files from a local machine to the Azure Databricks DBFS folder.
Method1: Using the Azure Databricks portal.
Method2: Using Databricks CLI
The same DBFS CLI commands shown above work in the upload direction as well, for example dbfs cp ./apple.txt dbfs:/apple.txt to put a local file into DBFS and dbfs cp -r ./banana dbfs:/banana to upload a directory recursively.
Method3: Using a third-party tool named DBFS Explorer
DBFS Explorer, described above, handles uploads as well. To connect, you will need a personal access token created in the web interface.
Step1: Download and install DBFS Explorer.
Step2: Open DBFS Explorer and enter the Databricks URL and a Personal Access Token.
Step3: Select the target folder in DBFS, drag and drop the files from the local machine into it, and click Upload.

files copying from remote server to s3 does not show status on s3 bucket

I am uploading almost 7 TB of files and folders from my remote server to the S3 bucket, but I cannot see the files in the S3 bucket; only a few files that were copied successfully are visible on S3.
I have one EC2 server on which I have mounted an S3 bucket using this link.
On the remote server I am using the following script. I have also tested this script and it works fine for small files:
rsync -uvPz --recursive -e "ssh -i /tmp/key.pem" /eb_bkup/OMCS_USB/* appadmin@10.118.33.124:/tmp/tmp/s3fs-demo/source/backups/eb/ >> /tmp/log.txt &
The log file I am generating shows that files are being copied, along with the relevant information such as transfer speed and filename. But in the S3 bucket I cannot see any files after the first one is copied.
Each file is between 500 MB and 25 GB in size.
Why can I not see these files on S3?
Amazon S3 is an object storage service, not a filesystem. I recommend you use the AWS Command-Line Interface (CLI) to copy files rather than mounting S3 as a disk.
The AWS CLI includes an aws s3 sync command that is ideal for your purpose -- it will synchronize files between two locations. So, if something fails, you can re-run it and it will not copy files that have already been copied.
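A minimal sketch of that approach, run directly on the remote server (the bucket name and key prefix are placeholders):
# Copy only new or changed files from the backup directory to S3; safe to re-run after a failure
aws s3 sync /eb_bkup/OMCS_USB s3://<my-bucket>/backups/eb/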
The issue I was facing was that rsync first creates a temporary file on the target EC2 instance and only then writes it out to the S3 bucket. Multiple rsync jobs were running and the local EBS volume on the EC2 server was full, so rsync could not create its temporary files and just kept copying/writing to the socket.
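If rsync over the s3fs mount has to stay, one possible workaround for the temp-file problem is to make rsync write in place instead of staging hidden temporary files (rsync also has a --temp-dir option to point them at a larger volume); this is only a sketch, not a tested fix for this setup.
# Write directly to the destination paths instead of creating temporary copies first
rsync -uvP --inplace --recursive -e "ssh -i /tmp/key.pem" /eb_bkup/OMCS_USB/ appadmin@10.118.33.124:/tmp/tmp/s3fs-demo/source/backups/eb/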

HDFS + create symbolic link between HDFS folder and local filesystem folder

I searched on Google but did not find it:
is it possible to create a link between an HDFS folder and a local folder?
For example, we want to create a link between folder_1 in HDFS and the local folder /home/hdfs_mirror.
HDFS folder:
su hdfs
$ hdfs dfs -ls /hdfs_home/folder_1
Linux local folder:
ls /home/hdfs_mirror
I do not think it is possible.
This is because we are talking about two different file systems (HDFS and the local file system).
If you want to keep the local data directory in sync with an HDFS directory, you need to use a tool such as Apache Flume; a simpler scheduled copy is sketched below.
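If a full ingestion tool is more than you need, a crude alternative is to re-run a plain HDFS copy on a schedule (for example from cron). The sketch below reuses the paths from the question and assumes the hdfs client is on the PATH.
# Create the target HDFS directory once, then copy the local mirror into it, overwriting existing files
hdfs dfs -mkdir -p /hdfs_home/folder_1
hdfs dfs -put -f /home/hdfs_mirror/* /hdfs_home/folder_1/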

how to create an ini file on HDFS on linux

I am new to Linux. The Cloudera documentation mentions creating a sentry-provider.ini file on Cloudera CDH 5.4 as an HDFS file. I am not finding a good article on how to create an ini file on Linux.
I am trying to configure Apache Sentry on a Cloudera setup to have role-based security on Hive metadata.
How do I create an ini file as an HDFS file on Linux?
A simple way is to create this sentry-provider.ini file locally (in a Linux terminal):
vi sentry-provider.ini
Then put the content specified at this link into the file by pressing i (insert mode) and pasting the content.
After this, put the file on the HDFS file system using the command:
hdfs dfs -copyFromLocal sentry-provider.ini etc/sentry/
Remember that the relative path etc/sentry/ is resolved against your user's home directory on HDFS, which is typically /user/<username>/.
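If the relative directory does not exist yet, create it first and then verify the upload; <username> is a placeholder for your HDFS user.
# etc/sentry resolves to /user/<username>/etc/sentry on HDFS
hdfs dfs -mkdir -p etc/sentry
# Confirm the file is in place
hdfs dfs -ls etc/sentry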
