How to uninstall pip library from Azure databricks notebook - without removing it from cluster library utility? - azure

I am trying to start a Data Factory pipeline from Databricks, but I am running into a conflict with the Azure libraries installed at the cluster level:
from azure.identity import ClientSecretCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.datafactory import DataFactoryManagementClient
azure_client_id = dbutils.secrets.get(scope="Azure_KeyVault", key="_Application_Id")
azure_client_secret = dbutils.secrets.get(scope="Azure_KeyVault", key="_Client_Secret")
azure_tenant_id = dbutils.secrets.get(scope="Azure_KeyVault", key="__Tenant_Id")
# example of trigger_object['topic']: /subscriptions/f8354c08-de3d-4a67-95ae-c7cbdb37fbf6/resourceGroups/WeS06DvBing15064/providers/Microsoft.Storage/storageAccounts/wes06dvraw15064
subscription_id = 'f4379743884938948398938493793749830'
credentials = ClientSecretCredential(client_id=azure_client_id, client_secret=azure_client_secret, tenant_id=azure_tenant_id)
dfmc = DataFactoryManagementClient(credentials, subscription_id, base_url="https://management.azure.com")
[f.id for f in dfmc.factories.list()]
Error message:
AttributeError: 'ClientSecretCredential' object has no attribute 'signed_session'
I think it could be because we have the azure package installed on this cluster through the cluster libraries utility (given that it works if I remove this library at the cluster level).
When I run this in the notebook: %pip uninstall Azure
I get:
Python interpreter will be restarted.
Found existing installation: azure 4.0.0
Not uninstalling azure at /databricks/python3/lib/python3.7/site-packages, outside environment /local_disk0/.ephemeral_nfs/envs/pythonEnv-6eab9ca4-4cd6-4bd9-843f-8e33a185c96a
Can't uninstall 'azure'. No files were found to uninstall.
Python interpreter will be restarted.
I don't quite understand this last error message. I want to uninstall the library at the notebook level, but I do not want to remove it at the cluster library level (it is used in many other notebooks).
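For reference, this is roughly how I check which copy of the package the notebook actually resolves (the printed paths will differ per cluster; the cluster-installed copy lives under /databricks/python3, while notebook-scoped installs land in the ephemeral environment shown in the error):
import azure
import azure.mgmt.datafactory

print(azure.__path__)                    # directories the 'azure' namespace package resolves from
print(azure.mgmt.datafactory.__file__)   # which installed copy of the SDK is actually imported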

Libraries can be installed at two levels in Databricks:
Workspace library
Cluster library
1. Workspace library
Go to the workspace folder containing the library.
Select the library you need to uninstall.
Select the checkbox next to the library you need to uninstall, then confirm.
After confirmation, the status changes to Uninstall pending restart.
2. Cluster library
Go to the cluster's Libraries tab.
Select the library.
Select the checkbox next to its name and select Uninstall.
After confirmation it will be in a pending state.
Restart the cluster.
With this procedure, workspace libraries and cluster libraries are handled independently of each other.
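If the goal is only to get this one notebook working against azure-identity while leaving the cluster library untouched, another option is a notebook-scoped install with %pip: notebook-scoped packages take precedence over cluster libraries for that notebook only. This is a sketch, not verified against your cluster; check the SDK changelog for azure-mgmt-datafactory releases that accept azure-identity credentials before pinning versions:
# Notebook-scoped install: shadows the cluster-level 'azure' 4.0.0 meta-package
# for this notebook only; other notebooks keep using the cluster library.
%pip install --upgrade azure-identity azure-mgmt-resource azure-mgmt-datafactory
After the interpreter restarts, re-run the imports; the notebook should then pick up a version of azure.mgmt.datafactory whose client accepts a ClientSecretCredential.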

Related

Unable to upload workspace packages and requirements.txt files on Azure Synapse Analytics Spark pool

When trying to import Python libraries at the Spark pool level by applying an uploaded requirements.txt file and custom packages, I get the following error with no other details:
CreateOrUpdateSparkComputeFailed
Error occured while processing the request
It was working perfectly fine a few days back; the last successful upload was on 12/3/2021.
Also, the SystemReservedJob-LibraryManagement application job is not getting triggered.
Environment Details:
Azure Synapse Analytics
Apache Spark pool - 3.1
We tried the following:
increased the vCore size up to 200
uploaded the same packages to a resource in a different subscription, where they work fine
increased the Spark pool size
Please suggest.
Thank you
Make sure you have the required packages in your requirements.txt.
Before that, check which packages are installed and which are not. You can get the details of every installed package by running the lines below, work out which ones are missing, and keep them in place:
import pkg_resources

for d in pkg_resources.working_set:
    print(d)
Install the missing libraries through requirements.txt.
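If it helps, here is a minimal sketch of that comparison; the required_packages set is hypothetical, so substitute the names from your own requirements.txt:
import pkg_resources

# Hypothetical package names; replace with the entries from your requirements.txt.
required_packages = {"numpy", "pandas", "azure-identity"}

installed = {d.project_name.lower() for d in pkg_resources.working_set}
missing = {p for p in required_packages if p.lower() not in installed}
print("Missing packages:", missing or "none")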
I faced a similar use case and found good information and a step-by-step procedure in the MS Docs; have a look at it for handling workspace libraries.

Execute databricks magic command from PyCharm IDE

With databricks-connect we can successfully run code written for Databricks or Databricks notebooks from many IDEs. Databricks has also created many magic commands to support multi-language use in each cell, with commands like %sql or %md. One issue I am currently facing when I try to execute Databricks notebooks in PyCharm is this:
How do I execute a Databricks-specific magic command from PyCharm?
E.g., importing a script or notebook is done in Databricks using this command:
%run
'./FILE_TO_IMPORT'
Whereas in the IDE, from FILE_TO_IMPORT import XYZ works.
Also, every time I download a Databricks notebook it comments out the magic commands, which makes the notebook impossible to use anywhere outside the Databricks environment.
It's really inefficient to convert all the Databricks magic commands every time I want to do any development.
Is there any configuration I could set which automatically detects Databricks-specific magic commands?
Any solution to this will be helpful. Thanks in advance!
Unfortunately, as of databricks-connect version 6.2.0:
"We cannot use magic commands outside the Databricks environment directly. This would require creating custom functions, but again that will only work for Jupyter, not PyCharm."
Again, since importing .py files requires the %run magic command, this also becomes a major issue. One solution is to convert the set of files to be imported into a Python package, add it to the cluster via the Databricks UI, and then import and use it in PyCharm, but this is a very tedious process.
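As a rough sketch of that packaging approach (the package name here is hypothetical), the shared .py files can be bundled with a minimal setup.py, built into a wheel, attached to the cluster, and then imported normally from both PyCharm and the notebooks:
# setup.py - minimal packaging sketch for the shared modules
from setuptools import setup, find_packages

setup(
    name="shared_databricks_utils",   # hypothetical package name
    version="0.1.0",
    packages=find_packages(),         # expects a shared_databricks_utils/ folder with __init__.py
)
Build it with python setup.py bdist_wheel, upload the wheel through the cluster's Libraries UI, and replace %run './FILE_TO_IMPORT' with from shared_databricks_utils import XYZ on both sides.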

How to list Databricks scopes using Python when working with its Secrets API

I can create a scope. However, I want to be sure to create the scope only when it does not already exist, and I want to do the checking using Python. Is that doable?
What I have found out is that I can create the scope multiple times and not get an error message -- is this the right way to handle this? The document https://docs.databricks.com/security/secrets/secret-scopes.html#secret-scopes points out using
databricks secrets list-scopes
to list the scopes. However, I created a cell and ran
%sh
databricks secrets list-scopes
I got an error message saying "/bin/bash: databricks: command not found".
Thanks!
This will list all the scopes.
dbutils.secrets.listScopes()
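Building on that, here is a minimal sketch of creating a scope only when it does not already exist: dbutils.secrets.listScopes() does the check, and the creation itself goes through the Secrets REST API, so the workspace URL and personal access token below are placeholders you would supply yourself:
import requests

scope_name = "my-scope"                  # hypothetical scope name
host = "https://<databricks-instance>"   # placeholder workspace URL
token = "<personal-access-token>"        # placeholder PAT

# listScopes() returns objects exposing the scope name; dbutils is available in notebooks.
existing = [s.name for s in dbutils.secrets.listScopes()]

if scope_name not in existing:
    resp = requests.post(
        f"{host}/api/2.0/secrets/scopes/create",
        headers={"Authorization": f"Bearer {token}"},
        json={"scope": scope_name, "initial_manage_principal": "users"},
    )
    resp.raise_for_status()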
You can't run the CLI commands from your Databricks cluster (through a notebook) out of the box. The CLI needs to be installed and configured on your own workstation; you can then run these commands from your workstation after configuring the connection to a Databricks workspace with a generated token.
Still, you can run Databricks CLI commands in a notebook by installing and configuring the CLI at the cluster level and running them as bash commands; install the CLI with pip install databricks-cli.

Using VSTS can I install extensions other than those listed in the dropdown?

I would like to install a Web App extension as part of my VSTS build/deployment. The dropdown only includes a few options (mostly Python). How can I install other extensions?
The values in the Azure App Service Manage task are static and you can only select from them, but you can refer to the task's source code (AzureAppServiceManage) to build a custom build/release task that includes the extensions you want.
You could also install the necessary extensions through the Kudu SiteExtensions API; a Python sketch follows the endpoint list below:
GET /api/extensionfeed — list all extension package infos available on the online (remote) server. Supported query string: filter (matching string).
GET /api/siteextensions — list all extension package infos currently installed. Supported query string: filter (matching string).
GET /api/extensionfeed/{id} — get the package info with {id} from the remote store.
GET /api/siteextensions/{id} — get the package info with {id} currently installed.
PUT /api/siteextensions/{id} — install or update the package on the local machine. The payload is the package info returned by the List/Get APIs above.
DELETE /api/siteextensions/{id} — uninstall the package with {id}.
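A minimal Python sketch of calling those endpoints (the app name, extension id, and publishing credentials are placeholders; Kudu accepts the app's deployment credentials via basic auth):
import requests

site = "<your-app-name>"                       # placeholder web app name
extension_id = "<extension-id>"                # an id returned by /api/extensionfeed
auth = ("<deploy-user>", "<deploy-password>")  # publishing/deployment credentials

kudu = f"https://{site}.scm.azurewebsites.net"

# List extensions available on the remote feed
feed = requests.get(f"{kudu}/api/extensionfeed", auth=auth).json()
print([p["id"] for p in feed[:5]])             # a few available extension ids

# Get the package info for one extension, then install or update it on the site
pkg = requests.get(f"{kudu}/api/extensionfeed/{extension_id}", auth=auth).json()
resp = requests.put(f"{kudu}/api/siteextensions/{extension_id}", json=pkg, auth=auth)
resp.raise_for_status()
print(resp.json())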
There is also an article about installing Azure App Service extensions with PowerShell: Install Azure App Extension With Powershell.
There is a thread about calling the Kudu API that can help you: Remove files and folders on Azure before a new deploy from VSTS.
You can use PowerShell, ARM templates, or REST calls to do that. Those are not native steps, so they will require some research.
Several examples:
http://wp.sjkp.dk/install-azure-site-extensions-with-arm-template/
https://gist.github.com/sethreidnz/aa996f91339bafdfb5ecb1d4681ba26c/
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/app-service-web/app-service-web-arm-with-msdeploy-provision.md

manageprofiles.sh can't create DEPLOYMENT MANAGER

My goal is to create a Deployment Manager profile in my WebSphere installation on Linux.
Reading tons of documentation pages gives just two methods:
1) using the X GUI application WAS_root/bin/ProfileManagement/pmt.sh
In this case, according to the manuals, I should choose the Management option, click "Next" and choose the Deployment Manager server type.
Actually, when I run pmt.sh in my WAS installation there is no option to choose Deployment Manager in the Management section, and the only profile type I am able to create is AdminAgent.
2) using the manageprofiles.sh script under the WAS_root/bin directory.
The manuals say that I can use the -serverType DEPLOYMENT_MANAGER option to create a Deployment Manager profile.
But when I run the script:
./manageprofiles.sh -create -templatePath ../profileTemplates/management/ -ServerType DEPLOYMENT_MANAGER -isDefault -profileName dmgr -adminUserName websphere -adminPassword websphere1
I get the following message:
The following validation errors were present with the command line
arguments:
serverType: The value for this parameter must be within this set of values [ADMIN_AGENT]
That means I don't have the ability to create a Deployment Manager at all.
Please advise what steps I can take to install the Deployment Manager, other than erasing my WAS installation and installing it from scratch again.
Thanks a lot.
You have the standalone (aka base) edition of WebSphere Application Server. To create a Deployment Manager you need the Network Deployment edition.
Unfortunately, you will have to install it from scratch from different installation files (for ND, not the base or developer edition).
