Azure Blob Storage: buffer cannot be null

I'm following the source code from Xamarin for Azure (FileUploader for image upload). When I run the app, I get the error "buffer cannot be null".

I suggest creating a MemoryStream object, copying into it with stream.CopyTo(), and returning a byte[] via the MemoryStream's ToArray().
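A minimal sketch of that approach (assuming stream is the readable source stream from the sample):

byte[] buffer;
using (var memoryStream = new MemoryStream())
{
    // Copy the source stream into memory instead of pre-allocating
    // a buffer from stream.Length, which can be unreliable.
    stream.CopyTo(memoryStream);
    buffer = memoryStream.ToArray(); // byte[] sized exactly to the copied data
}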
If you don't want to change the code as above (see the sketch after this list):
1. Check the value of stream.Length.
2. Add some padding to the allocation, e.g. byte[] buffer = new byte[stream.Length + 10].
3. Also check that the stream is readable via stream.CanRead.
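Those checks as a quick sketch (again assuming stream is the source stream):

if (!stream.CanRead)
    throw new InvalidOperationException("Stream is not readable.");

Console.WriteLine($"stream.Length = {stream.Length}"); // verify the reported length
byte[] buffer = new byte[stream.Length + 10];          // padded allocation, per step 2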

Related

Array getting converted to ArrayBuffer and giving different behaviour for Blob creation

This is the result object returned after saving an image file in MongoDB using Mongoose.
Case 1
The result.img.data object has a String attribute type and an Array attribute data, and that is exactly what I get on the client side if I io.emit(event, result) using Socket.IO.
Case 2
But if I io.emit(event, result.img) then on the client side the object img.data becomes an ArrayBuffer
Why is this happening?
In Case 1, if I create a Blob using new Blob(new Uint8Array(result.img.data.data)), I get a different Blob object and the image file is unviewable.
But in Case 2, if I create a Blob using new Blob([img.data]), a different Blob object is created and the file is viewable.
Why is this behaviour happening?
How can I create a viewable image Blob with the Array instead of the ArrayBuffer?

ADF: How to pass binary data to stored procedure in Azure Data Factory

I am reading binary data (a JPEG image) using an API (Web activity) and I want to store it as varbinary or base64 in Azure SQL Server.
It looks like there is no way to base64-encode binary data using Azure Data Factory. Is that correct?
So I am trying to pass it as a byte[] using a varbinary parameter. The parameter of the stored procedure looks like this:
@Photo varbinary(max) NULL
The parameter in the Stored Procedure activity in ADF is set accordingly, but this also does not seem to work: the pipeline fails with this error:
The value of the property 'Value' is invalid for the stored procedure parameter 'Photo'.
Is it possible to store the image using this approach? And if not, how can it be achieved (using ADF and a stored procedure)?
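For reference, this is the byte[]-to-varbinary(max) mapping the stored procedure expects, sketched as a direct ADO.NET call outside ADF (the connection string, file path, and procedure name dbo.SavePhoto are placeholders, not taken from the question):

using System.Data;
using System.Data.SqlClient;
using System.IO;

string connectionString = "<your-connection-string>";   // placeholder
byte[] photoBytes = File.ReadAllBytes("photo.jpg");     // placeholder binary source

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.SavePhoto", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    // Size -1 means varbinary(max); the byte[] is passed directly as the value.
    command.Parameters.Add("@Photo", SqlDbType.VarBinary, -1).Value = photoBytes;
    connection.Open();
    command.ExecuteNonQuery();
}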
Just to be safe, are you missing a '@' before the activity?
Can't see it in the picture.
Peace

I was trying to convert my Word document to PDF using Spire.Doc and always ran into the exception "A generic error occurred in GDI+"

I was trying to fetch a Word document from different sources (URL/shared link/shared drives) and save it to a memory stream. My memory stream gets the data every time, but while converting it to PDF via Spire.Doc I get the error "A generic error occurred in GDI+". This happens only in production, not on localhost. The app is hosted in the Azure cloud.
I added a few logs and found that line number 3 (SaveToStream) is causing the issue.
fileContentStream and finalStream are the memory streams.
The code which I used is:
Spire.Doc.Document spireDoc = new Spire.Doc.Document();
spireDoc.LoadFromStream(fileContentStream, Spire.Doc.FileFormat.Auto);
spireDoc.SaveToStream(finalStream, Spire.Doc.FileFormat.PDF); // line 3: throws the GDI+ error
To convert Word to PDF on the Azure cloud using Spire.Doc, you need to enable the PS conversion path, since the Azure App Service sandbox restricts the GDI+ calls the default converter relies on:
Document document = new Document();
document.LoadFromStream(fileContentStream, Spire.Doc.FileFormat.Auto);
ToPdfParameterList tpl = new ToPdfParameterList
{
    UsePSCoversion = true
};
document.SaveToStream(finalStream, tpl);

Azure Application Insights: custom attribute length restriction

I'm using Azure Application Insights as a logging tool and store log data with the following code:
private void SendTrace(LoggingEvent loggingEvent)
{
    loggingEvent.GetProperties();
    string message = "TestMessage";
    var trace = new TraceTelemetry(message)
    {
        SeverityLevel = SeverityLevel.Information
    };
    trace.Properties.Add("TestKey", "TestValue");
    var telemetryClient = new TelemetryClient();
    telemetryClient.Context.InstrumentationKey = this.InstrumentationKey;
    telemetryClient.Track(trace);
}
Everything works well: I see the logged record in App Insights as well as in App Insights Analytics (in the traces table). My custom attributes are written to a special App Insights row section, customDimensions. For example, the above code adds a new attribute with key "TestKey" and value "TestValue" to the customDimensions section.
But when I try to write some big text (for example, a JSON document with more than 15k characters), it still succeeds without any exception, yet the text is cut off after a certain length. As a result, the custom attribute value in the customDimensions section is cropped too and holds only the first part of the document.
As I understand it, there is a restriction on the maximum text length that can be written to an App Insights custom attribute.
Does someone know how I can get around this?
The message field has the highest allowed limit, 32,768 characters. Items in the properties collection have a maximum value length of 8,192 characters.
So you can try one of the following options:
1. Use the message field to the fullest by putting the big text there.
2. Split the data into multiple parts and add them to the properties collection separately, e.g.:
trace.Properties.Add("key_part1", "Bigtext1_upto8192");
trace.Properties.Add("key_part2", "Bigtext2_upto8192");
Reference: https://github.com/MicrosoftDocs/azure-docs/blob/master/includes/application-insights-limits.md

Azure Storage Copy From Blob

I am having an issue converting the console app described in "Copying an existing blob into a Media Services asset" to run on a mobile app service.
I have everything like-for-like, references-wise and code-wise, but have the following issue:
// Display the size of the source blob.
Console.WriteLine(sourceBlob.Properties.Length);
Console.WriteLine(string.Format("Copy blob '{0}' to '{1}'", sourceBlob.Uri, destinationBlob.Uri));
// The line below gives the following error:
destinationBlob.StartCopyFromBlob(new Uri(sourceBlob.Uri.AbsoluteUri + signature));
'ICloudBlob' does not contain a definition for 'StartCopyFromBlob' and no extension method 'StartCopyFromBlob' accepting a first argument of type 'ICloudBlob' could be found (are you missing a using directive or an assembly reference?)
Is this because I am using version 7 of the storage client and the method has been removed?
If so is there a new method of combination of methods that i can use to achieve a similar result?
From the release notes, you can find:
Blobs: Removed deprecated (Begin/End)StartCopyFromBlob(Async) APIs in favor of using (Begin/End)StartCopy(Async) APIs.
Therefore, please use StartCopy instead of StartCopyFromBlob.
As Zhaoxing Lu said, use StartCopy; however, 'ICloudBlob' does not contain a definition for 'StartCopy' either. Based on your code, you can find 'StartCopy' on the CloudBlockBlob class.
According to the tutorial you mentioned, you could modify the type of destinationBlob:
CloudBlockBlob destinationBlob = destinationContainer.GetBlockBlobReference(sourceBlob.Name);
Instead of:
ICloudBlob destinationBlob = destinationContainer.GetBlockBlobReference(sourceBlob.Name);
Note: CloudBlobContainer.GetBlockBlobReference returns a CloudBlockBlob object.
Then you could run the following code:
destinationBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + signature));
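Note that StartCopy only schedules a server-side copy. If you need to wait for it to finish, you can poll the destination blob's CopyState (a sketch, assuming the variables from the answer above):

destinationBlob.FetchAttributes();
while (destinationBlob.CopyState.Status == CopyStatus.Pending)
{
    System.Threading.Thread.Sleep(500);
    destinationBlob.FetchAttributes(); // refresh CopyState
}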
