Connect Pentaho to Azure Blob Storage

I was recently thrown into dealing with databases and Microsoft Azure Blob storage.
As I am completely new to this field, I'm having some trouble:
I can't figure out how to connect to Blob storage from Pentaho, and I could not find any good information online on this topic either.
I would be glad for any information on how to set up this connection.

You can update core-site.xml as described here: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/authentication-wasb.html
This will give you access to the Azure Blob storage account.
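Per the linked doc, WASB authentication boils down to adding the storage account key to core-site.xml; the property name embeds your account name. A minimal sketch (YOUR_ACCOUNT and YOUR_ACCESS_KEY below are placeholders):

```xml
<!-- core-site.xml: grant access to one storage account.
     YOUR_ACCOUNT / YOUR_ACCESS_KEY are placeholders. -->
<property>
  <name>fs.azure.account.key.YOUR_ACCOUNT.blob.core.windows.net</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
```

Files are then addressable with wasb:// URLs of the form wasb://CONTAINER@YOUR_ACCOUNT.blob.core.windows.net/path/to/file.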

Go to the file in Azure Blob Storage, generate a SAS token and URL, and copy the URL only. In PDI, add a Hadoop file input step, double-click it, select Local for the Environment, and paste the Azure URL into the File/Folder field. That's it; you should see the file in PDI.

I figured it out eventually.
Pentaho provides an HTTP step, in which you can, among other things, specify a URL.
In Microsoft Azure Blob storage you can generate a SAS token. If you use the URL formed from the storage resource URI plus the SAS token as the URL field of the HTTP step, Pentaho can access the corresponding file in Blob storage.
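Concretely, the URL for the HTTP step is just the blob's resource URI with the SAS token appended as its query string. A minimal sketch of that join (the account, container, and token values below are placeholders):

```python
from urllib.parse import urlsplit

def blob_url_with_sas(resource_uri: str, sas_token: str) -> str:
    """Join a blob resource URI and a SAS token into one request URL."""
    token = sas_token.lstrip("?")  # the portal sometimes includes a leading '?'
    sep = "&" if urlsplit(resource_uri).query else "?"
    return f"{resource_uri}{sep}{token}"

url = blob_url_with_sas(
    "https://myaccount.blob.core.windows.net/mycontainer/data.csv",
    "?sv=2022-11-02&ss=b&srt=o&sp=r&sig=abc123",
)
print(url)
```

The same URL works anywhere a plain HTTPS URL is accepted, as long as the SAS grants read permission and has not expired.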

Related

Connect Azure Blob Storage to Grafana

I have uploaded an Excel file into my Azure container, commonly known as Azure Blob Storage. Let me know if there is an open-source connector out there.
Kind regards,
Osama
I tried the Azure plugins available on the Grafana website, but they ask for tenant details and so on, and I could not find those details in the Azure portal. Any help will be appreciated.
I assume that you tried the "Azure Data Explorer" data source in Grafana. That is used to connect to an Azure Data Explorer (Kusto) cluster, which is not the same as an Azure Storage account where blobs are stored.
As your data source here is an Excel sheet stored as a blob in an Azure Storage container, you could convert it to CSV and use plugins like Infinity or CSV, which can visualize data from CSV files. In that case, the URL supplied to the Grafana data source would be the URL of the CSV in the Azure storage container.

Is it possible to open Azure Storage Explorer using direct link with SAS token?

When I want to launch Azure Storage Explorer using a direct link, I copy it from the app and paste it into my browser. It looks something like this:
storageexplorer:// ... with subId, accountId, etc.
My question is: is it possible to produce a URI like the above using a SAS token, so that I can open a specific container from my web browser in Azure Storage Explorer?
I'm from the Microsoft for Founders Hub team. With this connection method, you'll use a SAS URI for the required storage account. You'll need a SAS URI whether you want to use a file share, table, queue, or blob container. You can get a SAS URI either from the Azure portal or from Storage Explorer.
To add a SAS connection:
Open Storage Explorer.
Connect to your Azure storage account.
Select the connection type: shared access signature (SAS) URI.
Provide a meaningful name for the connection.
When you're prompted, provide the SAS URI.
Review and verify the connection details, and then select Connect.
When you've added a connection, it appears in the resource tree as a new node. You'll find the connection node in this branch: Local & attached > Storage Accounts > Attached Container > Service.
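For reference, the SAS URI you paste in those steps is just the resource URL plus a signed query string. Below is a stdlib-only sketch of how a service SAS for a single blob is signed, following the documented string-to-sign layout for signed version 2018-11-09; in practice you would let the Azure portal or an SDK generate it, and every name and key here is a placeholder:

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def make_blob_sas(account: str, key_b64: str, container: str, blob: str,
                  permissions: str, start: str, expiry: str) -> str:
    """Sign a service SAS token for a single blob (signed version 2018-11-09)."""
    version = "2018-11-09"
    canonical = f"/blob/{account}/{container}/{blob}"
    string_to_sign = "\n".join([
        permissions,   # sp
        start,         # st
        expiry,        # se
        canonical,
        "",            # signed identifier (stored access policy)
        "",            # signed IP range
        "https",       # signed protocol
        version,       # sv
        "b",           # signed resource: a single blob
        "",            # signed snapshot time
        "", "", "", "", "",  # rscc, rscd, rsce, rscl, rsct response overrides
    ])
    signature = base64.b64encode(
        hmac.new(base64.b64decode(key_b64),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode()
    return urlencode({"sv": version, "sr": "b", "st": start, "se": expiry,
                      "sp": permissions, "spr": "https", "sig": signature})

# Placeholder key; a real account key comes from the Azure portal.
token = make_blob_sas("myaccount", base64.b64encode(b"not-a-real-key").decode(),
                      "mycontainer", "report.xlsx", "r",
                      "2024-01-01T00:00:00Z", "2024-01-02T00:00:00Z")
print("https://myaccount.blob.core.windows.net/mycontainer/report.xlsx?" + token)
```

Appending the token to the blob URL yields the SAS URI that Storage Explorer asks for in the steps above.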

How to upload a folder to blob storage using SAS URI in storage explorer

I'm trying to upload a folder to a blob container using Storage Explorer via a SAS URI, but the upload fails for both the folder and its files. How can I achieve that? When I connect to blob storage using the account name and key it works fine, but not with the SAS URI.
I've run several tests and all succeeded with a SAS URI.
I think you should check a few things:
According to your screenshot, maybe your SAS token has expired?
The URI configuration: you should concatenate the resource URL and the SAS token.
First, check the permissions granted to the SAS and its expiry date. There can be two cases here: you may not have adequate permissions to upload, or the SAS may have expired.
If this is a one-time activity, you can also use the AzCopy tool to upload files to blob storage.
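With AzCopy, a recursive folder upload looks roughly like this (account, container, local path, and SAS below are placeholders; the SAS needs at least Write permission, and the command can't run without real credentials):

```sh
# Upload a local folder and everything under it to a container via SAS.
azcopy copy "C:\local\myfolder" \
  "https://myaccount.blob.core.windows.net/mycontainer?<SAS-token>" \
  --recursive
```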

Use Azure computer vision API on images stored on BlobStore

I am trying to integrate the Azure Computer Vision APIs, and I would like to access images stored in Azure Blob storage.
The documentation mentions running the Vision APIs on remote URLs; however, I am not able to run them on URLs of images stored in Azure Blob storage.
Is it possible to run the Azure Computer Vision APIs on images stored in Blob storage?
I can reproduce this problem: if I change my blob container's access level to private (no anonymous access), the API returns a bad request. So I suppose the problem is that the image URL is not accessible.
One solution is to change the access level to Blob or Container; then your image blob URL will be publicly accessible.
Another solution, as Hong Ooi suggested, is to use a SAS URL to access the image blob. The simplest way to generate one is from the portal: click Generate SAS token and URL, and it will give you the SAS URL. If you want sample code, see: Create a service SAS for a container or blob with .NET.
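Once you have the SAS URL, the API call itself is an ordinary HTTPS POST with the image URL in the JSON body. A stdlib sketch that builds (but does not send) a Computer Vision v3.2 Analyze request; the endpoint, key, and blob URL are placeholders:

```python
import json
from urllib import request

def build_analyze_request(endpoint: str, key: str,
                          image_sas_url: str) -> request.Request:
    """Build (but do not send) a Computer Vision v3.2 Analyze request."""
    url = f"{endpoint}/vision/v3.2/analyze?visualFeatures=Description"
    body = json.dumps({"url": image_sas_url}).encode("utf-8")
    return request.Request(url, data=body, method="POST", headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    })

req = build_analyze_request(
    "https://myresource.cognitiveservices.azure.com",
    "placeholder-key",
    "https://myaccount.blob.core.windows.net/images/photo.jpg?sv=...&sig=...",
)
# request.urlopen(req) would send it; omitted here since it needs real credentials.
print(req.full_url)
```

The key point is that the `url` field must be reachable by the service, which is exactly what the SAS (or a public access level) provides.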

Azure Storage Blob with CodeIgnitor

I have a PHP (CodeIgniter) application that I want to migrate from AWS S3 to Azure Blob Storage. The application uploads all media files to an S3 bucket, and S3 generates a link, stored in the database, from which each media file can be accessed. I want to do the same with Azure Blob storage. I'm facing a technical hindrance, as I can't find the right resources (libraries/code samples) to achieve this goal. I tried the Azure PHP SDK, but it didn't work out.
Actually, there is a detailed sample for using the Azure Storage PHP SDK: https://github.com/Azure/azure-storage-php/blob/master/samples/BlobSamples.php
To run that sample, you just need to replace the following placeholders with your own values:
$connectionString = 'DefaultEndpointsProtocol=https;AccountName=<yourAccount>;AccountKey=<yourKey>';
Suggestion:
I see that you want to generate an access URL and store it in the database. I am not familiar with AWS S3, but with Azure Storage you may need to set a public access level on the container or blob.
Otherwise, you cannot access the blob directly; you would need to create a SAS token instead.
