Display image in Databricks notebook error

I am working on a Databricks notebook template with a company logo. The code below, which should display the image, throws an error.
Code:
%md
<img src ='/test/image/MyImage.jpg'>
Error:
HTTP ERROR 403: Invalid or missing CSRF token
Please guide me.

You either need to store the image somewhere and refer to it by its full URL, for example on your company site.
Another way is to upload the file to the /FileStore directory on DBFS; you can then refer to it using the /files/ URL, which is supported in both HTML and Markdown (see docs):
%md
![my_test_image](files/image.jpg)
You can upload the image using the databricks-cli or via the UI (if you have the DBFS File Browser enabled). Another option is the DBFS REST API, though it's more cumbersome.
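For completeness, here is a minimal Python sketch of that REST route, assuming a workspace URL and a personal access token (both placeholders below); the single-shot put call caps contents around 1 MB, which is fine for a logo:

import base64
import requests

# Placeholders: substitute your workspace URL and a personal access token.
HOST = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"

# The DBFS put API expects base64-encoded contents (single-shot, ~1 MB cap).
with open("MyImage.jpg", "rb") as f:
    contents = base64.b64encode(f.read()).decode()

# Upload into /FileStore so the image is reachable via the /files/ URL.
resp = requests.post(
    f"{HOST}/api/2.0/dbfs/put",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"path": "/FileStore/image.jpg", "contents": contents, "overwrite": True},
)
resp.raise_for_status()

After the upload, ![my_test_image](files/image.jpg) in a %md cell should render.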

Related

Azure CDN Rewrite URL to index.html not working

I have a static website with one $web container and, inside it, a storybook folder containing a build. Currently, to reach the storybook content I have to browse to www.xyz/storybook/index.html.
The goal is to reach the image when calling www.xyz/storybook.
To achieve this I tried to set up a Rules Engine URL rewrite, but somehow the rewrite is not working and I have no idea why.
I tried to reproduce the same setup in my environment and got the same 404 error.
To resolve the issue, first check whether you have provided the correct HTML error document path:
In the $web container, I added my error document (abc.html) as well as my image files. In the document path, try adding your image URL path.
Also, when I browse my primary endpoint URL, the image loads successfully. For this to work, make sure the container's access level is public (anonymous read access for containers and blobs); I created the Azure CDN under Security + networking.
Also check whether you have given the correct default origin path.
To access the image, take the endpoint hostname and append your image URL path, that is, append the blob URL (/$web/abc.html) while omitting the origin hostname (https://imran955.blob.core.windows.net):
i.e., https://<endpoint-hostname>/<blob-url>

How to get AmlDatastore image url from Azure

I'm trying to use AutoML to train an object detection model. I have all of my images uploaded to Azure and am trying to create the JSONL file for the tabular dataset. I'm following this document: https://learn.microsoft.com/en-us/azure/machine-learning/how-to-auto-train-image-models
This is their example of an image URL: AmlDatastore://image_data/Image_01.png
How do I get that URL for my images in my datastore? When I view an image in the datastore, it shows me an https URL; I tried using that one, but I get an error that the file is not found. I've also tried constructing an AmlDatastore URL with a path to my image, with no luck there either.
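For reference, each line of the JSONL file pairs an image_url with a label; the sketch below only mirrors the AmlDatastore:// example quoted from the doc, the label value is a hypothetical placeholder, and for object detection the label structure is richer (bounding-box fields) per the linked document:

import json

# Sketch only: the URL mirrors the doc's AmlDatastore:// example; the label
# shown here is a hypothetical placeholder (object detection labels carry
# bounding-box fields per the linked doc).
record = {
    "image_url": "AmlDatastore://image_data/Image_01.png",
    "label": "<class-name>",
}
with open("annotations.jsonl", "a") as f:
    f.write(json.dumps(record) + "\n")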

How to download via URL from DBFS in Azure Databricks

As documented here, I am supposed to be able to download a file from the Databricks File System (DBFS) via a URL like:
https://<your-region>.azuredatabricks.net?o=######/files/my-stuff/my-file.txt
But when I try to download it from the URL with my own "o=" parameter similar to this:
https://westeurope.azuredatabricks.net/?o=1234567890123456/files/my-stuff/my-file.txt
it only gives the following error:
HTTP ERROR: 500
Problem accessing /. Reason:
java.lang.NumberFormatException: For input string:
"1234567890123456/files/my-stuff/my-file.txt"
Am I using the wrong URL or is the documentation wrong?
I already found a similar question that was answered, but that one does not seem to match the Azure Databricks documentation and might apply to AWS Databricks instead:
Databricks: Download a dbfs:/FileStore File to my Local Machine?
Thanks in advance for your help
The URL should be:
https://westeurope.azuredatabricks.net/files/my-stuff/my-file.txt?o=1234567890123456
Note that the file must be in the /FileStore folder.
As a side note, I've been working on something called DBFS Explorer to help with things like this, if you would like to give it a try:
https://datathirst.net/projects/dbfs-explorer/
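If the browser URL keeps misbehaving, the DBFS REST API is a scriptable alternative. A minimal Python sketch, assuming a personal access token (placeholder below); note the read call returns at most 1 MB per request, so larger files need offset/length paging:

import base64
import requests

HOST = "https://westeurope.azuredatabricks.net"  # workspace URL from the question
TOKEN = "<personal-access-token>"  # placeholder

# Read up to 1 MB of the file; the API returns base64-encoded bytes.
resp = requests.get(
    f"{HOST}/api/2.0/dbfs/read",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"path": "/FileStore/my-stuff/my-file.txt", "offset": 0, "length": 1024 * 1024},
)
resp.raise_for_status()

with open("my-file.txt", "wb") as out:
    out.write(base64.b64decode(resp.json()["data"]))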

generateSharedAccessSignature not adding sv parameter?

I'm trying to generate a Shared Access Signature and am using the code here (http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx) for a custom API to generate the SAS.
It seems to be missing the sv=2014-02-14 parameter when calling "generateSharedAccessSignature()".
The SAS URL doesn't seem to work when I try it (I get a 400 "XML not valid" error), but a SAS generated from Azure Management Studio contains the sv parameter and works when I attempt to upload with it.
Any ideas?
Based on the Storage Service REST API documentation, the sv parameter in a Shared Access Signature was introduced in storage service version 2014-02-14. My guess is that Azure Mobile Services is using an older version of the storage service API, which is why you don't see the sv parameter in your SAS token.
You could be getting the 400 (invalid XML) error because of this: in earlier versions of the storage service API, the XML syntax for committing a block list was different from what is used currently. I have had one more user come to my blog post complaining about the same error. Please try the following XML syntax when performing the Commit Block List operation and see if the error goes away:
<?xml version="1.0" encoding="utf-8"?>
<BlockList>
<Block>[base64-encoded-block-id]</Block>
<Block>[base64-encoded-block-id]</Block>
...
<Block>[base64-encoded-block-id]</Block>
</BlockList>
Notice that we're not using the Latest node; instead we're using the Block node.
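To make that concrete, here is a rough Python sketch of the Commit Block List call using the Block-node syntax above; the SAS URL and block ID are placeholders, and the blocks are assumed to have been uploaded already with Put Block:

import base64
import requests

# Placeholders: a SAS URL for the target blob and one already-uploaded block ID.
sas_url = "https://<account>.blob.core.windows.net/<container>/<blob>?<sas-token>"
block_id = base64.b64encode(b"block-000001").decode()

# Pre-2014 syntax: plain <Block> nodes rather than <Latest>.
body = (
    '<?xml version="1.0" encoding="utf-8"?>'
    "<BlockList>"
    f"<Block>{block_id}</Block>"
    "</BlockList>"
)

resp = requests.put(sas_url + "&comp=blocklist", data=body.encode("utf-8"))
resp.raise_for_status()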
Leaving the sv parameter out and instead setting the version as part of the PUT request header worked, using:
xhr.setRequestHeader('x-ms-version','2014-02-14');
You can check out this example of an Azure file upload script: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/
...which will work with the SAS generated from the question's original blog link: http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx
Add the request header in the beforeSend like so:
beforeSend: function(xhr) {
    xhr.setRequestHeader('x-ms-version', '2014-02-14');
},

Setting Metadata in Google Cloud Storage (Export from BigQuery)

I am trying to update the metadata (programmatically, from Python) of several CSV/JSON files that are exported from BigQuery. The application that exports the data is the same one that modifies the files (thus using the same server certificate). The export goes well, until I try to use the objects.patch() method to set the metadata I want. The problem is that I keep getting the following error:
apiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/storage/v1/b/<bucket>/<file>?alt=json returned "Forbidden">
Obviously, this has something to do with bucket or file permissions, but I can't manage to get around it. How come, if the same certificate is used both to write the files and to update their metadata, I'm unable to update it? The bucket was created with the same certificate.
If that's the exact URL you're using, it's a URL problem: you're missing the /o/ between the bucket name and the object name.
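For what it's worth, letting the client library build the URL avoids this class of mistake entirely. A minimal sketch with the discovery-based Python client (bucket, object, and metadata values are hypothetical, and default application credentials are assumed):

from googleapiclient import discovery

# The objects() resource inserts the /o/ segment into the request URL for you.
service = discovery.build("storage", "v1")
request = service.objects().patch(
    bucket="my-bucket",                    # hypothetical bucket name
    object="exports/table_dump.json",      # hypothetical object name
    body={"metadata": {"source": "bigquery-export"}},
)
response = request.execute()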
