Status code: 416, description: 'The range specified is invalid for the current size of the resource.' - powerbi-desktop

Status code: 416, description: 'The range specified is invalid for the current size of the resource.'. I am getting the above error while loading data from Azure Blob Storage. Can anyone help me out?

Several Blob service GET operations support the standard HTTP Range header. Many HTTP clients, including the .NET client library, limit the size of the Range header to a 32-bit integer, which caps it at a maximum of 4 GiB. Since both block blobs and page blobs can be larger than 4 GiB, the Blob service accepts a custom range header, x-ms-range, for any operation that takes an HTTP Range header.
I would suggest looking at the Microsoft documentation on the Range header.
Are you doing this with code or with Azure Storage Explorer?
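If it's code, here is a minimal sketch of a ranged read in Python using the requests library; the account, container, blob, and SAS token in the URL are placeholders to substitute with your own:

import requests

# Placeholder URL: substitute your account, container, blob, and a SAS token.
blob_url = "https://<account>.blob.core.windows.net/<container>/<blob>?<sas_token>"

headers = {
    # Same bytes=start-end syntax as the standard Range header, but
    # x-ms-range is not subject to 32-bit client-side limits.
    "x-ms-range": "bytes=0-1048575",  # first 1 MiB of the blob
}

resp = requests.get(blob_url, headers=headers)
resp.raise_for_status()  # 416 here means the range starts past the end of the blob
print(len(resp.content), "bytes read")

A 416 like the one in the question typically means the requested range begins at or beyond the current size of the blob, so check the offsets being sent against the blob's actual length.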

Related

Azure Data Factory API call throws payload limit error

We are performing a Copy activity with a REST API URL as the data source and ADLS Gen2 as the sink. The pipeline works in most cases but sporadically throws the error below. We have a nested pipeline that loops through multiple REST API request parameters and makes the call within a ForEach activity.
Error displayed in the ADF monitor:
Error Code - 2200
Failure Type - User Configuration issue
Details - The payload including configurations on activity/dataset/linked service is too large. Please check if you have settings with very large value and try to reduce its size.
Error message: The payload including configurations on
activity/dataSet/linked service is too large. Please check if you have
settings with very large value and try to reduce its size.
Cause: The payload for each activity run includes the activity configuration, the configuration of the associated dataset(s) and linked service(s) if any, and a small portion of system properties generated per activity type. The limit on this payload size is 896 KB, as mentioned in the Azure limits documentation for Data Factory and Azure Synapse Analytics.
Recommendation: You likely hit this limit because you pass one or more large parameter values, either from an upstream activity's output or from an external source, especially if you pass actual data across activities in the control flow. Check whether you can reduce the size of the large parameter values, or tune your pipeline logic to avoid passing such values across activities and instead handle the data inside the activity.
Refer to https://learn.microsoft.com/en-us/azure/data-factory/data-factory-troubleshoot-guide#payload-is-too-large

Azure blob snapshots not getting deleted via logic app

While deleting old blobs using a Logic App by giving the container path, we ran into the error "Status code: 409, message: 'This operation is not permitted because the blob has snapshots'". This subsequently fails the Logic App run. I tried to delete the blob by providing the Id and Filename, but the error persists. Is there any way to specifically delete a blob and its corresponding snapshots using the Logic App? Approaches to solving the issue are welcome. The blob lifecycle management policy does not work for us.
You can use an Azure Function to delete your blob, including this header in your request:
x-ms-delete-snapshots: {include, only}
Required if the blob has associated snapshots. Specify one of the following two options:
include: Delete the base blob and all of its snapshots.
only: Delete only the blob's snapshots and not the blob itself.
This header should be specified only for a request against the base blob resource. If this header is specified on a request to delete an individual snapshot, the Blob service returns status code 400 (Bad Request).
If this header is not specified on the request and the blob has associated snapshots, the Blob service returns status code 409 (Conflict).
Check the Delete Blob documentation.
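If you go the Azure Function route, a minimal sketch with the Python SDK (azure-storage-blob), which sends this header for you, might look like the following; the connection string, container, and blob name are placeholders:

from azure.storage.blob import BlobServiceClient

# Placeholder connection string and names: substitute your own.
service = BlobServiceClient.from_connection_string("<connection_string>")
blob = service.get_blob_client(container="<container>", blob="<blob_name>")

# delete_snapshots="include" sends x-ms-delete-snapshots: include,
# deleting the base blob together with all of its snapshots.
# Pass "only" to delete just the snapshots and keep the base blob.
blob.delete_blob(delete_snapshots="include")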
Alternatively, you can filter and order your blobs within your Logic App so that snapshots are deleted first, before you remove the base blob.

Copy blob connector gives 400 error - invalid characters

I have a Logic App that copies a blob between two containers. When I invoke the Logic App, it fails with a Bad Request error saying "The specified name contains invalid characters."
I do not understand where the invalid characters are. Thanks.
You should not give the full URL for the Source URL and Destination URL.
The http://xxxx.blob.core.windows.net part is obtained automatically from the connection while you configure the Copy Blob task,
so you should give only the path starting from the container:
Source URL: importstage/dataentity.zip
Destination blob URL: importprocessing/abc.zip
(E.g., in the screenshot from the original answer, testing is the container name.)

Azure blob copy in cloud

In AWS, the upload-part-copy operation supports byte ranges: if I want to copy portions of two objects into a new object within the cloud, I can do so with the upload-part-copy command.
I could not find any such method or mechanism to copy portions of blobs into a new blob in Azure. I tried AzCopy, but it does not have an option to select a portion of a blob.
Can anyone please tell me if there is a method like that?
As of today, this feature is not available in Azure Blob Storage; a copy operation copies the entire source blob to the destination blob.
A workaround would be to download the byte ranges (blocks) from the source blobs to your local machine and then create a new blob by uploading those blocks.
If you were using the Blob service REST API, these are the operations you would need to perform (a sketch in Python follows the list):
1. Read Source Blob 1 by specifying the range you would like to read in the Range or x-ms-range request header. Store the fetched data somewhere in your application.
2. Repeat the same for Source Blob 2.
3. Create a new blob by uploading the data fetched from the 1st source blob using Put Block.
4. Repeat the same for the 2nd source blob.
5. Create the destination blob by committing the block list.
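Here is a minimal sketch of those five steps using the Python SDK (azure-storage-blob), which wraps the same REST operations; the connection string, container, blob names, and byte ranges are placeholders:

import uuid
from azure.storage.blob import BlobServiceClient, BlobBlock

# Placeholder connection string and names: substitute your own.
service = BlobServiceClient.from_connection_string("<connection_string>")
container = service.get_container_client("<container>")

def read_range(blob_name, offset, length):
    # Steps 1-2: a ranged read; the SDK sets the x-ms-range header for us.
    downloader = container.get_blob_client(blob_name).download_blob(
        offset=offset, length=length)
    return downloader.readall()

part1 = read_range("source-blob-1", offset=0, length=4 * 1024 * 1024)
part2 = read_range("source-blob-2", offset=0, length=4 * 1024 * 1024)

# Steps 3-4: stage each range as an uncommitted block (Put Block).
dest = container.get_blob_client("combined-blob")
block_ids = []
for data in (part1, part2):
    block_id = uuid.uuid4().hex          # IDs must be unique and of equal length
    dest.stage_block(block_id=block_id, data=data)
    block_ids.append(BlobBlock(block_id=block_id))

# Step 5: commit the block list to create the destination blob (Put Block List).
dest.commit_block_list(block_ids)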

Could not verify the copy source within the specified time. RequestId: (blank)

I am trying to copy some blob files from one storage account to another. I am using AzCopy to do this.
The process works for copying files between containers within the same storage account, but not between different storage accounts.
The command I am issuing is:
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1>/<path_to_desired_blobs> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2>/<path_to_store>/ /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /S
The error I am getting is the following:
The remote server returned an error: (400) Bad Request.
Could not verify the copy source within the specified time.
RequestId:
Time:2016-04-01T19:33:01.0527460Z
The only difference between the two storage accounts is that one is Standard, whereas the other one is Premium.
Any help will be appreciated!
From your description, you're trying to copy a block blob in the source account to a page blob in the destination account, which is not supported by the Azure Storage service or by AzCopy.
To work around this, you can first use AzCopy to download the block blobs from the source account to the local file system, and then upload them from the local file system to the destination account with the option /BlobType:Page (this option is only valid when uploading from local disk to blob storage).
Premium Storage supports only page blobs, so confirm that you are copying page blobs from the standard to the premium storage account. Also, set the BlobType parameter to Page in order to copy the data as page blobs into the destination premium storage account.
From the description, I am assuming your source blob is a block blob. Azure's asynchronous Copy Blob operation (which AzCopy uses by default) preserves the blob type; that is, you cannot convert a blob from block to page through an async copy.
Instead, can you try AzCopy again with the /SyncCopy option along with the /BlobType:Page parameter? That might change the destination blob type to Page.
(If that doesn't work, the only other solution would be to first download the blob and then upload it with /BlobType:Page.)
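Putting that suggestion together with the command from the question, the invocation might look like this (same placeholders as above; /SyncCopy routes the data through the client machine, which is what allows the blob type to change):

AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1>/<path_to_desired_blobs> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2>/<path_to_store>/ /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /S /SyncCopy /BlobType:Page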
