Connect Azure Blob Storage to Grafana - Excel

I hope you are well. I have uploaded an Excel file into my Azure container, commonly known as Azure Blob Storage. Let me know if there is an open-source connector out there.
I will try to catch up with you. Thanks a lot.
Kind Regards,
Osama
I tried to use the Azure plugins available on the Grafana website, but they ask for tenant details and so on, and I could not find those details in the Azure portal. Any help will be appreciated.

I assume you tried the "Azure Data Explorer" data source in Grafana. That data source connects to an Azure Data Explorer (Kusto) cluster, which is not the same as an Azure Storage account where blobs are stored.
Since the data here is an Excel sheet stored as a blob in an Azure storage container, you could convert it to CSV and use a plugin such as Infinity or CSV, which can visualize data from CSV files. In that case, the URL supplied to the Grafana data source would be the URL of the CSV in the Azure storage container.
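As a rough sketch of that approach (assuming the pandas and azure-storage-blob Python packages; the connection string, container, and file names below are placeholders), you could convert the Excel sheet to CSV and upload it like this:

    import pandas as pd
    from azure.storage.blob import BlobServiceClient

    # Connect to the storage account (connection string is a placeholder)
    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("grafana-data")

    # Convert the Excel sheet to CSV (reading .xlsx needs the openpyxl package)
    df = pd.read_excel("metrics.xlsx")
    csv_bytes = df.to_csv(index=False).encode("utf-8")

    # Upload the CSV; this blob's URL (with a SAS token or public read access)
    # is what you would paste into the Infinity/CSV data source in Grafana
    blob_client = container.get_blob_client("metrics.csv")
    blob_client.upload_blob(csv_bytes, overwrite=True)
    print(blob_client.url)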

Related

How to load a specific directory in an Azure container into Power BI

I want to load data into Power BI from a specific directory of an Azure container in an Azure storage account. I have tried using Get Data -> Azure -> Azure Blob Storage, but it pulls in all the data in that container, and I only need the data from one specific directory.
Any help appreciated.
Yes, this capability is not supported in Power BI as of today. Here are some similar ideas on the Power BI Ideas forum that you can vote for:
Allow PowerBI to connect to subfolder in azure blob storage
Allow power bi to connect to a particular blob file from a subfolder in the Azure Blob Store

How can we call a REST API using an SSL certificate in Azure Data Factory?

I have been trying to configure/call a RESTful API through Azure Data Factory, where the response comes back in XML format.
Using the REST linked service: it does not offer the certificate authentication type, so I cannot go with this.
Using the HTTP linked service: it has certificate authentication and I was able to create it successfully, but when I try to create a dataset there is no XML format to choose.
I have also read the list of supported file formats in Azure Data Factory, which says the same.
Is there any other possibility I am missing in Azure Data Factory?
Could anyone help with this, please?
Otherwise I will go with Azure Logic Apps or Azure Databricks.
I still need to work out how to configure this in those two Azure resources, but I will try that later.
XML format is supported for the following connectors: Amazon S3, Azure Blob, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure File Storage, File System, FTP, Google Cloud Storage, HDFS, HTTP, and SFTP. It is supported as source but not sink.
Ref: XML format in Azure Data Factory
When we create the dataset, we can choose the XML format:
For example, click New --> Blob Storage --> XML:
Please check whether your source connector supports the XML format.
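If you do end up falling back to Azure Databricks (or any other Python environment) as mentioned in the question, a minimal sketch of calling a certificate-protected API and parsing its XML response could look like the following; the endpoint, certificate paths, and element names are hypothetical:

    import requests
    import xml.etree.ElementTree as ET

    # Call the certificate-protected endpoint with a client certificate + key
    response = requests.get(
        "https://api.example.com/v1/report",
        cert=("/dbfs/certs/client.pem", "/dbfs/certs/client.key"),
        timeout=30,
    )
    response.raise_for_status()

    # Parse the XML payload; adjust the element names to the real schema
    root = ET.fromstring(response.text)
    for record in root.findall(".//Record"):
        print(record.attrib)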

Export Azure Application Insights log files to Azure Data Lake storage

I am a beginner with the Azure portal. I configured Azure Application Insights on the front end (Angular 2) and the back end (ASP.NET Core).
I can track my application logs through Azure Application Insights and also export them as an .xls sheet (http://dailydotnettips.com/2015/12/04/exporting-application-insights-data-to-excel-its-just-a-single-click/), but I need to store all my log files in Azure Data Lake storage for backup and tracking purposes.
I need this to debug issues in my application when they occur. I found https://learn.microsoft.com/en-us/azure/application-insights/app-insights-code-sample-export-sql-stream-analytics and "Can I download data collected by Azure Application Insights (events list)?", which cover continuous export to SQL and Blob storage, but I don't want extra storage resources in Azure just for holding my data.
So, is there any way to connect Application Insights to Azure Data Lake through a connector or plugin? If so, could you please share a link?
Thank you.
Automatic
If you export the events to Azure Blob storage, you can do several things:
Use Azure Data Factory to copy the data from Blob storage to Azure Data Lake
Use AdlCopy to copy the data from Blob storage to Azure Data Lake
Write a U-SQL job to copy the data to Azure Data Lake
Manual
To manually place exported Application Insights data (in .xls format) into Azure Data Lake, you can upload the file through the portal.
If you need more control over the exported data, you can use Application Insights Analytics to create a query over the available data and export the result to an .xls file.
Of course, you can also write a small app to push the .xls file to Azure Data Lake if you do not want to upload it through the portal; you can use the API for that.
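A minimal sketch of that small app, assuming a service principal with access to the Data Lake Store and the azure-datalake-store Python package (all names and paths below are placeholders):

    from azure.datalake.store import core, lib, multithread

    # Authenticate with a service principal (placeholder values)
    token = lib.auth(tenant_id="<tenant-id>",
                     client_id="<app-registration-id>",
                     client_secret="<app-secret>")
    adls = core.AzureDLFileSystem(token, store_name="<datalake-store-name>")

    # Upload the exported .xls file into a backup folder in the Data Lake Store
    multithread.ADLUploader(adls,
                            lpath="appinsights-export.xls",
                            rpath="/backup/appinsights/appinsights-export.xls",
                            overwrite=True)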

Connect Pentaho to Azure Blob storage

I was recently thrown into dealing with databases and Microsoft Azure Blob storage.
As I am completely new to this field, I am having some trouble:
I can't figure out how to connect to the Blob storage from Pentaho and I could not find any good information online on this topic either.
I would be glad for any information on how to set up this connection.
You can update core-site.xml as described here: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/authentication-wasb.html
This will give you access to the Azure Blob storage account.
Go to the file in Azure Blob storage, generate a SAS token and URL, and copy the URL only. In PDI, select the Hadoop File Input step, double-click it, select Local for the Environment, and insert the Azure URL in the File/Folder field. That's it; you should see the file in PDI.
I figured it out eventually.
Pentaho provides an HTTP element in which you can, among other things, specify a URL.
In Microsoft Azure Blob storage you can generate a SAS token. If you use the URL built from the storage resource URI and the SAS token as the input for the URL field of the HTTP element, Pentaho can access the respective file in Blob storage.
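If you prefer to build that "resource URI + SAS token" URL programmatically instead of copying it from the portal, a small sketch using the azure-storage-blob Python package (account name, container, blob path, and key are placeholders) could look like this:

    from datetime import datetime, timedelta
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    account = "<storage-account>"
    container = "<container>"
    blob_name = "input/data.csv"

    # Read-only SAS token valid for 24 hours (account key is a placeholder)
    sas = generate_blob_sas(
        account_name=account,
        container_name=container,
        blob_name=blob_name,
        account_key="<storage-account-key>",
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=24),
    )

    # Resource URI + SAS token: this is what goes into the HTTP element's URL field
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas}"
    print(url)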

How to transfer CSV files from Google Cloud Storage to Azure Data Lake Store

I'd like to have our daily CSV log files transferred from GCS to Azure Data Lake Store, but I can't really figure out the easiest way to do it.
Is there a built-in solution for that?
Can I do that with Data Factory?
I'd rather avoid running a scheduled VM that does this via the APIs. The idea comes from the GCS -> (Dataflow ->) BigQuery solution.
Thanks for any ideas!
Yes, you can move data from Google Cloud Storage to Azure Data Lake Store with Azure Data Factory by developing a custom copy activity. However, inside that activity you will still be using the APIs to transfer the data. See the details in this article.
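As a rough illustration of what such a custom copy activity would do under the hood (assuming the google-cloud-storage and azure-datalake-store Python packages; bucket, store, and credential names are placeholders):

    from google.cloud import storage as gcs
    from azure.datalake.store import core, lib

    # 1. Download the daily CSV from Google Cloud Storage
    gcs_client = gcs.Client()
    blob = gcs_client.bucket("daily-log-bucket").blob("logs/2020-01-01.csv")
    data = blob.download_as_bytes()

    # 2. Write it into Azure Data Lake Store using a service principal
    token = lib.auth(tenant_id="<tenant-id>",
                     client_id="<app-id>",
                     client_secret="<app-secret>")
    adls = core.AzureDLFileSystem(token, store_name="<datalake-store-name>")
    with adls.open("/logs/2020-01-01.csv", "wb") as f:
        f.write(data)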
