Can someone please let me know how I can store a JSON key file in Databricks secrets? This JSON key file holds the service account details I need to interact with Google Drive. Thanks!!
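One common approach is to store the whole JSON key file as a single secret string and parse it back inside the notebook. A minimal sketch, assuming the Databricks CLI is configured; the scope and key names (`gdrive`, `service-account-json`) are placeholders:

```python
import json

# Store the key file once, from a machine with the Databricks CLI configured
# (shell commands shown as comments; scope/key names are illustrative):
#   databricks secrets create-scope --scope gdrive
#   databricks secrets put --scope gdrive --key service-account-json \
#       --string-value "$(cat service_account.json)"

# In a Databricks notebook, read the secret back and parse it.
sa_json = dbutils.secrets.get(scope="gdrive", key="service-account-json")
sa_info = json.loads(sa_json)

# The parsed dict can then be passed to the Google client libraries, e.g.
# google.oauth2.service_account.Credentials.from_service_account_info(sa_info)
```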
Related
I have a case where I need to connect to a bank over SFTP, retrieve bank statements and send payment files.
The basic flows will be:
Get bank statements:
Fetch Encrypted Signed Data from Bank.
Decrypt Encrypted Signed Data with my private key.
Verify the Plain text signed file.
Unzip data file for further processing.
Send payment file:
Give the payment file the right filename.
Zip the data file (PKZIP format).
Sign the zipped data file with your private key and the SHA2-256 hash algorithm.
Encrypt the signed data file with AES256.
Send Encrypted Signed Data.
Checking the received requirements against the Key Vault functionality, I see that the AES256 encryption algorithm for data is not supported?
Any suggestions?
As of now, the algorithms supported for encryption or decryption using Logic Apps are RSA-OAEP, RSA1_5 and RSA-OAEP-256.
The AES algorithm is not available from Logic Apps.
Refer to this link; it may help you.
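Since Logic Apps/Key Vault only exposes the RSA operations, one workaround is to do the AES-256 step in your own code (for example inside an Azure Function) and keep only the keys in Key Vault. A minimal sketch of the sign-then-encrypt steps using the `cryptography` package; file names, key handling and the AES mode are placeholders, and the bank's exact envelope format (PKZIP, CMS/PGP, etc.) will dictate the real details:

```python
import os
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Load your RSA private key (placeholder path/password).
with open("my_private_key.pem", "rb") as f:
    private_key = serialization.load_pem_private_key(f.read(), password=None)

zipped = open("payment.zip", "rb").read()

# 1. Sign the zipped payment file with SHA2-256.
signature = private_key.sign(
    zipped,
    padding.PKCS1v15(),
    hashes.SHA256(),
)

# 2. Encrypt the signed payload with AES-256 (GCM shown as an example;
#    the bank's spec may require a different mode or envelope).
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, zipped + signature, None)
```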
I need to store a token which contains the database information in it. The data is very sensitive. Currently I'm storing it in local storage, which is not secure. The reason I'm storing these details is that the database is generated dynamically after login, and every API call requires the database details, so I put them in the token and keep the token in local storage. How can I store the data without using local storage, session storage or cookies?
How do you use Vault to essentially return a client secret into a JSON file so that it can be used by an application? I'm doing this on a remote server that many people will be SSHing on to. So ideally, when the application executes, it would trigger a Python function, fetch the client secret from Vault, return it as the value of the client_secrets key in the JSON file, and allow auth without anyone else ever seeing the key.
I'll be using Google Auth client secrets with PyDrive, if that makes a difference.
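A minimal sketch of one way to do this with the `hvac` client, assuming the secret was written to a KV v2 engine at a path like `secret/pydrive` with the client-secrets JSON stored as a string under a `client_secrets` key (all names here are placeholders, not a definitive implementation):

```python
import json
import hvac
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

# Connect to Vault; the address/token would normally come from the environment,
# not be hard-coded where other users on the box can read them.
client = hvac.Client(url="https://vault.example.com:8200")

# Fetch the client secrets from a KV v2 path (placeholder path/key names).
secret = client.secrets.kv.v2.read_secret_version(path="pydrive")
client_secrets = secret["data"]["data"]["client_secrets"]

# Write the JSON to a local file for PyDrive, use it, then remove it afterwards.
with open("client_secrets.json", "w") as f:
    json.dump(json.loads(client_secrets), f)

gauth = GoogleAuth()
gauth.LoadClientConfigFile("client_secrets.json")
drive = GoogleDrive(gauth)
```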
I have a program that pulls from Azure Key Vault. When the data comes back, it is in plain text. Is the data sent from the server to the client always in plain text and never encrypted?
I have generated a SAS using the primary access key. I want to know: if the primary and secondary access keys of the storage account are changed, will the SAS generated using the earlier primary/secondary access keys continue to work, or do I need to generate a new SAS every time the storage access keys are changed?
Once you regenerate the account keys, the SAS tokens created with the old key become invalid (you will start getting 403 errors if you use them). You would need to generate new SAS tokens with the new account keys.
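For reference, regenerating the SAS with the new key is straightforward with the storage SDK. A minimal sketch using azure-storage-blob (v12), with a placeholder account name, key, permissions and expiry:

```python
from datetime import datetime, timedelta
from azure.storage.blob import (
    generate_account_sas,
    ResourceTypes,
    AccountSasPermissions,
)

# After rotating the account keys, build a fresh SAS with the *new* key
# (account name and key below are placeholders).
new_sas = generate_account_sas(
    account_name="mystorageaccount",
    account_key="<new primary or secondary key>",
    resource_types=ResourceTypes(container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

# Old SAS tokens signed with the previous key now return 403 and must be replaced.
print(new_sas)
```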