Web API: Stream file from Azure Storage with range support

I have an Azure Storage account with many media files.
I have an Azure Website that streams media files directly from Azure Storage containers (public containers). Azure Storage supports range headers.
I want to create a Web API controller that shields the direct links to Azure Storage while still supporting streaming (range headers).
The Azure Storage Client Library has a method, DownloadRangeToStreamAsync(), which I think has to be used here. It takes offset and length parameters.
Does anyone know how to retrieve these values in Web API?
Thanks!
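One way to get those two values (a sketch, not an official recipe): read the request's Range header. In ASP.NET Web API it is available as Request.Headers.Range, a RangeHeaderValue whose Ranges entries expose From and To; the parsing logic itself, shown here in Python for brevity, maps a single `bytes=start-end` range onto the offset and length that DownloadRangeToStreamAsync() expects. When a range was served, the response should be 206 Partial Content with a Content-Range header.

```python
def parse_range_header(range_header, blob_length):
    """Parse a 'Range: bytes=start-end' header into (offset, length).

    Returns (0, blob_length) when no range is requested.
    Handles only a single range, which is the common streaming case.
    """
    if not range_header or not range_header.startswith("bytes="):
        return 0, blob_length
    spec = range_header[len("bytes="):].split(",")[0].strip()
    start_s, _, end_s = spec.partition("-")
    if start_s == "":                      # suffix range: bytes=-500 (last 500 bytes)
        length = int(end_s)
        return max(blob_length - length, 0), min(length, blob_length)
    start = int(start_s)
    end = int(end_s) if end_s else blob_length - 1   # open-ended range: bytes=9000-
    end = min(end, blob_length - 1)
    return start, end - start + 1

# Example: the client asks for the first kilobyte of a 10 000-byte blob.
offset, length = parse_range_header("bytes=0-1023", 10_000)
```

The same offset/length pair can then be passed straight to DownloadRangeToStreamAsync(), writing into the HTTP response stream.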

Related

How to upload an image to Azure Bucket without using a subdomain (like in AWS S3)

Currently, we host this file on AWS using S3:
https://s3.amazonaws.com/apretaste/mirrors
We would like to move gigs of files to Azure, where we have all our servers. But since we are an anti-censorship tool, having the name of the project, "apretaste" (or any other string), as a subdomain makes it an easy target.
On Azure, I can only host the file as:
https://apretaste.blob.core.windows.net/mirrors
As you can see, the subdomain "apretaste" is fully exposed on Azure, while on AWS it is hidden, encrypted as part of the HTTPS request.
Is there a way to hide the name in Azure? I could not find one. Any help is appreciated.
Is there a way to hide the name encrypted as part of the HTTPS request in Azure?
AFAIK, there is no direct way to hide or encrypt the storage account name in Azure.
Alternatively, as a workaround, you can put an Azure CDN (Content Delivery Network) endpoint hostname (https://name.azureedge.net) in front of the origin hostname (https://storageaccountname.blob.core.windows.net).
You can create the Azure CDN endpoint through the portal:
Portal -> Storage account -> Azure CDN -> create endpoints.
I tried uploading a file to Azure Blob Storage through the CDN endpoint with Postman, and it uploaded successfully.
URL:
https://name.azureedge.net/<containername>/<filename> + sas token
You can refer to the Azure CDN documentation for more detail.
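To make the upload URL concrete, here is a minimal sketch in Python (the endpoint, container, blob and SAS values are placeholders, not values from this thread). The SAS token is issued by the origin storage account; the request simply travels through the CDN hostname instead of the `*.blob.core.windows.net` one:

```python
def cdn_put_blob_url(cdn_endpoint, container, blob_name, sas_token):
    """Build the URL for uploading a blob through an Azure CDN endpoint.

    All four arguments are placeholders; the SAS token must be generated
    against the origin storage account, not the CDN.
    """
    return f"https://{cdn_endpoint}.azureedge.net/{container}/{blob_name}?{sas_token}"

# A Put Blob request against this URL also needs the header
#   x-ms-blob-type: BlockBlob
url = cdn_put_blob_url("name", "media", "clip.mp4", "sv=...&sig=...")
```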
You can also configure a custom domain and map it to the Azure Blob Storage account. Once the custom domain has been linked to your blob service endpoint, requests show your organization's domain name instead of the storage account name. To know more, kindly refer to the document below.
Map a custom domain to an Azure Blob Storage endpoint - Azure Storage | Microsoft Learn

How to display log messages from azure iot device client code

I see log messages in the azure-iot device client source code like this:
log.debug("Connection already opened by TransportClient."); or
log.info("Device client opened successfully");
My question is: where do these log messages go? How can I get these messages for debugging purposes?
Thanks
In general, Blob Storage is added as a 'logging endpoint', which encompasses a storage account, a container in the account, and blobs in the container. Blobs of type 'block blob' are used for storing text and binary data.
All logs are stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: https://<accountname>.blob.core.windows.net/$logs
To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. There are many tools, such as AzCopy, the Azure Storage Data Movement library, and the Azure Import/Export service, for importing or exporting data to and from your storage account. To view the logs, you can also use any tool that can access Azure Blob Storage, such as Visual Studio or Cerebrata Azure Management Studio.
In the case of the azure-iot-sdk, each IoT hub exposes a set of endpoints (service endpoints) for the solution's back end to communicate with the devices. An IoT hub has a default built-in endpoint (messages/events). By default, messages are routed to this built-in service-facing endpoint, which is compatible with Event Hubs. The following link covers the various methods for reading from the built-in endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin
You can also create custom endpoints to route messages to by linking other services in your subscription to the IoT hub. If custom endpoints are created, a message is routed to every endpoint whose routing query it matches. IoT Hub can route messages to two storage services: Azure Blob Storage and Azure Data Lake Storage (ADLS) Gen2 accounts. The following link covers the various methods for reading from custom endpoints: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-custom
For logs from the IoT SDK itself, messages are written to stdout or stderr depending on the type of log and the deployment environment, and they can be redirected as required. The SDK uses the debug library for detailed logs. This link gives additional details: https://github.com/Azure/azure-iot-sdk-node/wiki/Troubleshooting-Guide-Devices
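As a concrete illustration (the namespace pattern below is an assumption based on the debug library's conventions; check the troubleshooting guide above for the exact names), detailed SDK logs in a Node.js device app are typically enabled through the DEBUG environment variable:

```shell
# Enable verbose logs from the debug library before starting the app.
# 'azure-iot-device:*' is an assumed namespace pattern; adjust it per the
# troubleshooting guide. The debug output goes to stderr by default,
# so it can be redirected to a file independently of stdout.
DEBUG='azure-iot-device:*' node device_app.js 2> device.log
```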

Downloading Binary content (images) from on-premises server using Azure Data Factory "Rest" Dataset

We have a REST-based web service hosted on-premises.
We are using Azure Data Factory with an Integration Runtime configured on-premises on a server.
We are getting URLs of images (all different, but hosted on the same server).
Is it possible to download these images from these REST endpoints (hosted on an on-premises server) using ADF?
Since you have already set up a self-hosted IR on your machine, I think you could use a Copy Activity to transfer the binary content into an Azure service (e.g. Azure Blob Storage).
Based on this document, the Binary format is supported by the REST and Blob Storage connectors.
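For illustration, a Binary dataset for the Blob Storage sink side of such a Copy Activity might look like the following (the dataset name, linked service name, container, and folder path are placeholders; the exact schema should be checked against the ADF Binary format documentation):

```json
{
    "name": "BinarySinkDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "images",
                "folderPath": "downloaded"
            }
        }
    }
}
```

A matching Binary source dataset would point at the on-premises server's location and run through the self-hosted IR.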

Using Azure Media Services with on-prem assets

Is it possible to stream on-premises media files through Azure Media Services without storing them in the cloud?
This is not possible. AMS uses Blob Storage as the source for assets that are being encoded and streamed. From a performance perspective, this approach makes sense as well.
Note that the uploaded content is protected using storage protection and access policies.

Clarification regarding storage account in web applications

I have an on-premises MVC application with database calls to another server.
When I deploy this application to Windows Azure, I am curious to know what will be stored in the storage account for this cloud service.
Is it database records or something else?
Given you mentioned creating a Cloud Service (so I'm assuming a Web Role for your MVC app): the deployment needs a storage account, at a minimum, for storing diagnostic log information, as well as your cloud service package and configuration.
The storage account is mostly used for blob storage. In an Azure environment we should not store blob data (like images and doc/PDF files) in the database; the best practice is to store the blob's link in the database and the data itself in Blob Storage.
Azure Storage provides the flexibility to store and retrieve large amounts of unstructured data, such as documents and media files, with Azure Blobs; structured NoSQL data with Azure Tables; reliable messages with Azure Queues; and SMB-based Azure Files for migrating on-premises applications to the cloud.
For an overview and reference: http://azure.microsoft.com/en-in/documentation/services/storage/
