I am trying to use the Databricks CLI and invoke databricks configure - Azure

I am trying to use the Databricks CLI and invoke databricks configure. At the point where I need to enter the token, I am not able to type anything or copy-paste anything.

You just need to copy the token, paste it, and hit Enter.
Note: even after pasting, the token will show as blank (it is treated as a sensitive value).
Now you can run the databricks fs ls command to check whether you have configured the CLI successfully.
For more details, refer to this similar thread: databricks configure using cmd and R
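If pasting really isn't possible in your terminal at all, a minimal workaround sketch in Python is to write the CLI's configuration file (~/.databrickscfg) directly instead of going through the interactive prompt; the host URL and token below are placeholders you need to replace with your own values.

# Sketch: write ~/.databrickscfg directly (the INI file the Databricks CLI reads)
# instead of typing the token at the prompt. Host and token are placeholders.
import configparser
from pathlib import Path

config = configparser.ConfigParser()
config["DEFAULT"] = {
    "host": "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    "token": "dapiXXXXXXXXXXXXXXXX",  # placeholder personal access token
}

with open(Path.home() / ".databrickscfg", "w") as config_file:
    config.write(config_file)

After that, databricks fs ls should behave the same as if you had completed the prompt.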

Related

Is there a way to use cURL commands in Azure Data Factory?

So I am trying to create a Pipeline in Azure Data Factory, and part of the process involves uploading a CSV to a temporary URL generated by an earlier REST API request in the pipeline. The API's documentation says to use a cURL command or "a similar application". I have gotten the cURL command to work in my local environment but have had no luck doing it in ADF. The cURL command I am currently using is curl --upload-file "<file location>" "<api URL>" --ssl-no-revoke -v
While ADF supports web requests, it does not seem to support cURL commands, at least not directly. Currently I am trying to automate the cURL command through an Automation Account that runs a PowerShell script, and then use a webhook to continue from there within the pipeline, but I have my doubts that this will work because the temporary URL has to be passed from the pipeline to the PowerShell script.
The questions can be summed up as follows:
Is it possible to put a cURL command in a web request? I have not found any good examples of this, as most cURL commands seem to be run from PowerShell or Command Prompt.
Is there some ADF functionality I am not aware of that runs cURL commands?
What are the alternatives to cURL that I could use for this process? Are they friendlier than cURL when it comes to ADF?
Any other advice I may need to know.
I appreciate any input on this matter!
The only way to do that is by creating your own batch (*.bat) files that contain the curl commands and executing them from the server side. You can save the results to text files, which you can also read from your server side.
Read more here
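If you end up calling the API from an Automation runbook or an Azure Function instead, note that curl --upload-file issues an HTTP PUT, so a rough Python equivalent (sketched with the requests library; the upload URL and file path are placeholders for the values your pipeline produces) looks like this:

# Rough Python equivalent of: curl --upload-file "<file location>" "<api URL>"
# --upload-file sends the file body in an HTTP PUT request.
import requests

upload_url = "https://example.com/temporary-upload-url"  # placeholder: URL returned by the earlier REST call
file_path = "data.csv"  # placeholder: file produced by the pipeline

with open(file_path, "rb") as f:
    response = requests.put(upload_url, data=f)

response.raise_for_status()
print(response.status_code)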

Facing issue while installing Databricks CLI. Not able to enter token value at command prompt

I am trying to install the Databricks CLI. At the command prompt, I entered the command databricks configure --token and provided the Databricks host URL. After that, it asks for the value of the token. I have generated a token value, but it's not letting me enter it...the copy-paste option is also not working, so I am not able to set up the Databricks CLI. Can someone please help me resolve this issue?
Recent versions of the Databricks CLI don't echo the pasted text because the token is a sensitive value. So just paste the text into the terminal using the standard shortcut (as described in this article), or via the terminal menu (you should have Paste in the Edit menu), and press ENTER.
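If pasting still does not work in your terminal, another option is to avoid the interactive prompt altogether. A minimal sketch, assuming your CLI version honours the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables (recent versions do) and that the values below are replaced with your own:

# Sketch: configure the CLI through environment variables instead of the prompt,
# then verify the connection by listing the DBFS root.
import os
import subprocess

env = os.environ.copy()
env["DATABRICKS_HOST"] = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
env["DATABRICKS_TOKEN"] = "dapiXXXXXXXXXXXXXXXX"  # placeholder token

# If this prints the DBFS root contents, the credentials work.
subprocess.run(["databricks", "fs", "ls", "dbfs:/"], env=env, check=True)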

Execute Databricks magic commands from the PyCharm IDE

With databricks-connect we can successfully run code written for Databricks or in Databricks notebooks from many IDEs. Databricks has also created many magic commands to support multi-language use in each cell, with commands like %sql or %md. One issue I am currently facing when I try to execute Databricks notebooks in PyCharm is the following:
How to execute Databricks-specific magic commands from PyCharm.
E.g.
Importing a script or notebook is done in Databricks using this command:
%run
'./FILE_TO_IMPORT'
Whereas in an IDE, from FILE_TO_IMPORT import XYZ works.
Also, every time I download a Databricks notebook it comments out the magic commands, which makes it impossible to use anywhere outside the Databricks environment.
It's really inefficient to convert all the Databricks magic commands every time I want to do any development.
Is there any configuration I could set that automatically detects Databricks-specific magic commands?
Any solution to this will be helpful. Thanks in advance!
Unfortunately, as of databricks-connect version 6.2.0:
"We cannot use magic commands outside the Databricks environment directly. This would require creating custom functions, but again that will only work for Jupyter, not PyCharm."
Also, since importing .py files requires the %run magic command, this becomes a major issue. One workaround is to convert the set of files to be imported into a Python package, add it to the cluster via the Databricks UI, and then import and use it in PyCharm. But this is a very tedious process.
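One way to reduce the manual conversion work is to post-process the exported notebook. This is only an illustrative sketch, not a Databricks or databricks-connect feature: it assumes the notebook was exported as a .py source file (where magic cells appear as lines starting with "# MAGIC"), rewrites "# MAGIC %run" lines into plain import statements, and the file names and the path-to-module mapping are simplifying assumptions.

# Sketch: rewrite "# MAGIC %run './FILE_TO_IMPORT'" lines in an exported notebook
# into regular imports so the file can be used from PyCharm. Illustration only.
import re
from pathlib import Path

MAGIC_RUN = re.compile(r"^# MAGIC %run\s+['\"]?\.?/?(?P<path>[\w./]+)['\"]?\s*$")

def convert_magic_runs(source_file: str, target_file: str) -> None:
    converted = []
    for line in Path(source_file).read_text().splitlines():
        match = MAGIC_RUN.match(line)
        if match:
            path = match.group("path")
            if path.endswith(".py"):
                path = path[:-3]
            module = path.replace("/", ".")
            # %run executes the other notebook in the same namespace, so a
            # wildcard import is the closest plain-Python equivalent.
            converted.append(f"from {module} import *  # was: {line.strip()}")
        else:
            converted.append(line)
    Path(target_file).write_text("\n".join(converted) + "\n")

convert_magic_runs("exported_notebook.py", "exported_notebook_local.py")  # placeholder file names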

How to list Databricks secret scopes using Python when working with the Secrets API

I can create a scope. However, I want to create the scope only when it does not already exist, and I want to do the checking using Python. Is that doable?
What I have found out is that I can create the scope multiple times and not get an error message -- is this the right way to handle this? The documentation at https://docs.databricks.com/security/secrets/secret-scopes.html#secret-scopes points to using
databricks secrets list-scopes
to list the scopes. However, when I created a notebook cell and ran
%sh
databricks secrets list-scopes
I got an error message saying "/bin/bash: databricks: command not found".
Thanks!
This will list all the scopes.
dbutils.secrets.listScopes()
You can't run the CLI commands from your Databricks cluster (through a notebook) by default. The CLI needs to be installed and configured on your own workstation, and then you can run these commands there after configuring the connection to a Databricks workspace using the generated token.
Still, you can run Databricks CLI commands in a notebook by doing the same kind of databricks-cli setup at the cluster level and running them as bash commands. Install the Databricks CLI with pip install databricks-cli.
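If you want the create-only-if-missing check from plain Python (outside a notebook), a minimal sketch against the Databricks Secrets REST API is below; the host, token, and scope name are placeholders, and the payload assumes a Databricks-backed scope rather than an Azure Key Vault-backed one.

# Sketch: create a secret scope only if it does not already exist,
# using the Secrets REST API (scopes/list and scopes/create).
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapiXXXXXXXXXXXXXXXX"  # placeholder personal access token
scope_name = "my-scope"  # placeholder scope name
headers = {"Authorization": f"Bearer {token}"}

resp = requests.get(f"{host}/api/2.0/secrets/scopes/list", headers=headers)
resp.raise_for_status()
existing = {scope["name"] for scope in resp.json().get("scopes", [])}

if scope_name not in existing:
    create = requests.post(
        f"{host}/api/2.0/secrets/scopes/create",
        headers=headers,
        json={"scope": scope_name, "initial_manage_principal": "users"},
    )
    create.raise_for_status()

Inside a notebook, the same existence check can be done by looking at the scope names returned by dbutils.secrets.listScopes().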

How do I work locally with files in my Cloud Shell profile

Either this isn't possible, or it's so simple that I am missing the trick, or I am going about it the wrong way. This is similar to this question.
I prefer working with VS Code and basically, I want to treat the home path in cloud CLI as a local folder, exposed to VS Code.
I have installed the following VS Code extensions:
Azure Account
Azure Storage
Azure CLI Tools
If I connect to cloud shell via VS Code (F1 > Azure:Open Bash in Cloud Shell) (as explained here) or through the Portal, I have a home directory /home/john, where I can put files. It is this area I want to connect to from my PC (via VS Code).
My first thought was that this area would be exposed in Azure Storage Explorer; however, the only thing in my Cloud Shell storage account is File Shares: azclishare > .cloudconsole > acc_john.img. There is no sign of any of the files in /home/john. I'm guessing they're wrapped up in acc_john.img.
I also thought about using SCP, but I can't find any reference to this either, and I can't find any "connection strings" in the portal.
If anyone has any ideas, I'd be grateful if you could share...
P.S. I am using Windows 10.
It's always the same: post a question on SO, then find the answer!
The full answer is here: https://learn.microsoft.com/en-us/azure/cloud-shell/persisting-shell-storage
The short answer is that Cloud Shell does map to the storage account (the file share), but at /usr/john/clouddrive.
In fact, there is a symlink to clouddrive in /home/john.
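If you want to reach those files from Python on your PC, one hedged option is to read the Cloud Shell file share directly with the azure-storage-file-share package; the connection string below is a placeholder (it comes from the storage account's access keys), and only files placed under clouddrive will show up in the share, not the rest of /home/john.

# Sketch: list the contents of the Cloud Shell file share from a local machine.
# Requires: pip install azure-storage-file-share. Connection string is a placeholder.
from azure.storage.fileshare import ShareClient

connection_string = "<storage-account-connection-string>"  # placeholder: copy from the storage account's Access keys
share = ShareClient.from_connection_string(connection_string, share_name="azclishare")

for item in share.list_directories_and_files():
    print(item["name"])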
