Issue: Image Upload to Azure Blob using NativeScript Core - azure

I am using NativeScript 4.2.0 and trying to upload a local image to Azure Blob storage.
Most recommended approaches use the nativescript-background-http plugin. However, after including this plugin, errors start coming up about other missing npm modules. I haven't seen this reported anywhere, so I am unsure whether I am doing something wrong or whether there are other commands to run besides
tns install nativescript-background-http
The other plugin, "nativescript-azure-storage", seems to work fine. It requires us to base64 encode our images; after base64 encoding, the image gets uploaded to Azure Storage. However, since the image is now base64 encoded, it can't be used directly in an Image tag.
Code used:
const nsAzureStorage = require("nativescript-azure-storage");
const imageSourceModule = require("tns-core-modules/image-source");
const azureNSStorage = new nsAzureStorage.NativeScriptAzureStorage(config.AZURE_STORAGE_CONNECTION_STRING);
// Path of the picked image on the device
let path = selected.android;
const imageFromLocalFile = imageSourceModule.fromFile(path);
// Encode the image as a base64 string before uploading
let base64string = imageFromLocalFile.toBase64String('png');
azureNSStorage.uploadBlob(mycontainer, blobName, base64string)
    .then(() => alert(`Uploaded successfully`))
    .catch((err) => alert(`Error uploading: ${err}`));
What is the recommended way to upload images to Azure Blob Storage so that we can reference them back in a NativeScript page?
Cheers
Abhishek

It actually depends on what you need and what your backend / service provider supports. As long as Azure works for you as expected, there is nothing wrong with converting the image into a base64 string.
By the way, nativescript-background-http should work too; let us know what errors you are facing there.
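If you stay with the base64 route, the string can be turned back into something an Image tag can display once you read it back. A minimal sketch, assuming base64string holds the text you downloaded from the blob and "photo" is the id of an Image element on the page:
const imageSourceModule = require("tns-core-modules/image-source");
// Decode the base64 string back into an ImageSource
const imgSource = imageSourceModule.fromBase64(base64string);
// Assign it to an <Image> element in the page
const photoView = page.getViewById("photo");
photoView.imageSource = imgSource;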

Check this sample. Azure Blob Storage does not accept a base64 string; you need to send the stream. https://baskarrao.wordpress.com/2018/10/12/day-3-nativescript-post-series/
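If background-http is an option for you, here is a minimal sketch of streaming the file straight to a blob URL that carries a SAS token. The SAS URL and file path are placeholders, and generating the SAS token is assumed to happen elsewhere (for example on your backend):
const bghttp = require("nativescript-background-http");
const session = bghttp.session("image-upload");
// Blob URL with a SAS token that grants write access (placeholder)
const sasUrl = "https://myaccount.blob.core.windows.net/mycontainer/photo.png?sv=...";
const request = {
    url: sasUrl,
    method: "PUT",
    headers: {
        "Content-Type": "application/octet-stream",
        "x-ms-blob-type": "BlockBlob" // required by the Put Blob REST operation
    },
    description: "Uploading photo.png"
};
// filePath is the local path of the picked image, e.g. selected.android
const task = session.uploadFile(filePath, request);
task.on("complete", () => alert("Uploaded successfully"));
task.on("error", () => alert("Error uploading"));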

Related

Does Azure Blob Storage support partial content (206) by default?

I am using Azure Blob storage to store all my images and videos. I have implemented the upload and fetch functionality and it's working quite well. I am facing one issue while loading videos: when I use the URL generated after uploading a video to Azure Blob storage, the browser downloads all the content before rendering it to the user. So if the video size is 100 MB, it'll download all 100 MB, and until then the user won't be able to see the video.
I have done a lot of R&D and came to know that while rendering the video, I need to fetch partial content (status 206) rather than fetching the whole video at once. After adding the request header "Range:bytes-500", I tried to hit the blob URL, but it still downloaded the whole content. I then checked some open-source video URLs with the same "Range" request header, and they returned a 206 response status, which means they were properly giving me partial content instead of the full video.
I read some forum posts saying Azure Storage supports partial content but that it needs to be enabled from the properties. However, I have checked all the options under the Azure storage account and didn't find anything to enable this functionality.
Can anyone please help me resolve this, or point out anything in the Azure portal that I need to enable? I have been doing R&D on this for a week now. Any help would be really appreciated.
Thank you! Stay safe.
If Accept-Ranges is not enabled, then from this blog I learned that you need to set the default version of the service.
Below is sample code to implement it (this uses the classic WindowsAzure.Storage client library):
var credentials = new StorageCredentials("account name", "account key");
var account = new CloudStorageAccount(credentials, true);
var client = account.CreateCloudBlobClient();
var properties = client.GetServiceProperties();
properties.DefaultServiceVersion = "2019-07-07";
client.SetServiceProperties(properties);
Below is a return header comparison after setting the property (before and after screenshots).
Assuming the video content is MPEG-4, the issue may be that the media itself needs the moov atom moved from the end of the file to the beginning. The browser won't render the video until it finds the moov atom, so you want to make sure the atom is at the start of the file, which can be accomplished using ffmpeg with the "faststart" flag. Here's a good article with more detail: HERE
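For reference, moving the moov atom does not require re-encoding; a typical ffmpeg invocation (input and output names are placeholders) looks like:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4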
You just need to update your Azure Storage default service version. It will work automatically after the update.
Using Azure CLI
Just run:
az storage account blob-service-properties update --default-service-version 2021-08-06 -n yourStorageAccountName -g yourStorageResourceGroupName
List of available versions:
https://learn.microsoft.com/en-us/rest/api/storageservices/previous-azure-storage-service-versions
To see your current version, request a blob and inspect the x-ms-version response header.
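To check both the version and range support from code, here is a small sketch using the built-in fetch available in Node 18+ (the blob URL is a placeholder and must be readable, e.g. public or carrying a SAS token):
const blobUrl = "https://myaccount.blob.core.windows.net/videos/sample.mp4";

async function checkRangeSupport(): Promise<void> {
    const head = await fetch(blobUrl, { method: "HEAD" });
    console.log("x-ms-version:", head.headers.get("x-ms-version"));
    console.log("accept-ranges:", head.headers.get("accept-ranges"));

    // Ask for the first kilobyte only; a 206 status means partial content is honoured
    const ranged = await fetch(blobUrl, { headers: { Range: "bytes=0-1023" } });
    console.log("status:", ranged.status); // expect 206 rather than 200
}

checkRangeSupport().catch(console.error);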
The following is the SDK code I used to download the contents:
var container = new BlobContainerClient("UseDevelopmentStorage=true", "sample-container");
await container.CreateIfNotExistsAsync();
BlobClient blobClient = container.GetBlobClient(fileName);
Stream stream = new MemoryStream();
var result = await blobClient.DownloadToAsync(stream, cancellationToken: ct);
which DOES download the whole file right away! Unfortunately, the solution provided in the other answers seems to reference another SDK. For the SDK that I use, the solution is to use the OpenReadAsync method:
long kBytesToReadAtOnce = 300;
long bytesToReadAtOnce = kBytesToReadAtOnce * 1024;
// Open the blob as a stream; bufferSize controls how much is fetched per request
var result = await blobClient.OpenReadAsync(0, bufferSize: (int)bytesToReadAtOnce, cancellationToken: ct);
By default it fetches 4 MB of data, so you have to override the value with a smaller amount if you want your app to have a smaller memory footprint.
I think that internally the SDK sends the requests with the byte range already set. So all you have to do is enable the partial content support in Web API like this:
return new FileStreamResult(result, contentType)
{
EnableRangeProcessing = true,
};

How to Download a File (from URL) in Typescript

Update: This question used to ask about Google Cloud Storage, but I have since realized the issue is actually reproducible merely by trying to save the download to local disk. Thus, I am rephrasing the question to be entirely about file downloads in TypeScript and to no longer mention Google Cloud Storage.
When attempting to download and save a file in TypeScript with WebRequest (though I experienced the same issue with request and request-promise), all the code seems to execute correctly, but the resulting file is corrupted and cannot be viewed. For example, if I download an image, the file is not viewable in any application.
import * as fs from "fs";
import * as WebRequest from "web-request";

// Seems to work correctly
const download = await WebRequest.get(imageUrl);
// `Buffer.from()` also takes an `encoding` parameter, but it's unclear how to determine the encoding of a download
const imageBuffer = Buffer.from(download.content);
// I *think* this line is straightforward
const imageByteArray = new Uint8Array(imageBuffer);
// Saves a corrupted file
const file = fs.writeFileSync("/path/to/file.png", imageByteArray);
I suspect the issue lies within the Buffer.from call not correctly interpreting the downloaded content, but I'm not sure how to do it right. Any help would be greatly appreciated.
Thanks so much!
From what I saw in the examples for web-request, download.content is just a string. If you want to upload a string to Cloud Storage using the node SDK, you can use File.save, passing that string directly.
Alternatively, you could use one of the solutions seen here.
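For what it's worth, the corruption usually comes from reading the body as a text string. A minimal sketch that avoids the string round-trip, using Node 18+'s built-in fetch (the URL and destination path are placeholders):
import { writeFileSync } from "fs";

async function downloadToFile(url: string, destPath: string): Promise<void> {
    const response = await fetch(url);
    if (!response.ok) {
        throw new Error(`Download failed with status ${response.status}`);
    }
    // Read the body as raw bytes instead of a string so binary data is preserved
    const bytes = Buffer.from(await response.arrayBuffer());
    writeFileSync(destPath, bytes);
}

downloadToFile("https://example.com/image.png", "/path/to/file.png")
    .then(() => console.log("saved"))
    .catch((err) => console.error(err));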

App crashes when storage().putFile is called with assets-library uri on iOS

I am trying to upload an image from the camera roll of the phone to Firebase Storage using react-native-firebase. However, when I pass the uri of the image to the putFile method, the app crashes without an error.
For example:
const uri = 'assets-library://asset/asset.JPG?id=3DEE5FA3-9E58-479B-9AD9-A7FDBEDF0502&ext=JPG';
firebase.storage().ref('test.jpeg').putFile(uri)
.then(...)
.catch(...);
Is this the expected behavior? If yes, how can I transform the assets uri to a full file path?
This was a bug that was fixed in the RNFirebase release v4.3.0.
If anybody is getting this in the future, it may help to note that you can't upload to the root reference (e.g. .ref()).
Instead you need something like .ref('images/xyz'), etc.
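A minimal sketch of what that looks like (the 'images/test.jpg' path and the local file path are placeholders):
const localPath = '/path/to/local/photo.jpg'; // a plain file path, not an assets-library URI
firebase.storage()
  .ref('images/test.jpg') // non-root reference
  .putFile(localPath)
  .then(() => console.log('Uploaded'))
  .catch((err) => console.error('Upload failed', err));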

How to create a blob in node (server side) from a stream, a file or a base64 string?

I am trying to create a blob from a PDF I am creating with pdfmake so that I can send it to a remote API that only handles blobs.
This is how I get my PDF file:
var docDefinition = { content: 'This is a sample PDF printed with pdfMake' };
// pdfDoc is the PDFKit document produced from docDefinition (e.g. via printer.createPdfKitDocument(docDefinition))
pdfDoc.pipe(fs.createWriteStream('./pdfs/test.pdf'));
pdfDoc.end();
The above lines of code do produce a readable pdf.
Now how can I get a blob from there? I have tried many options (creating the blob from the stream with the blob-stream module, creating it from the file with fs, creating it from a base64 string with b64toBlob), but all of them at some point require the Blob constructor, for which I always get an error even if I require the blob module:
TypeError: Blob is not a constructor
After some research I found that it seems that the Blob constructor is only supported client-side.
All the npm packages that I have found and which seem to deal with this issue seem to only work client-side: blob-stream, blob, blob-util, b64toBlob, etc.
So, how can I create a blob server-side on Node?
I don't understand why almost nobody else seems to need to create a blob server-side. The only thread I could find on the subject is this one.
According to that thread, apparently:
The Solution to this problem is to create a function which can convert between Array Buffers and Node Buffers. :)
Unfortunately this does not help me much as I clearly seem to lack some important knowledge here to be able to comprehend this.
Use the node-blob npm package:
const Blob = require('node-blob');
let myBlob = new Blob(["something"], { type: 'text/plain' });
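If the remote API simply needs the raw bytes, another option is to skip the Blob idea entirely and collect the stream into a Node Buffer. A sketch, assuming pdfDoc is the PDFKit document returned by pdfmake:
const chunks = [];
pdfDoc.on('data', (chunk) => chunks.push(chunk));
pdfDoc.on('end', () => {
    const pdfBuffer = Buffer.concat(chunks);
    // pdfBuffer now holds the whole PDF and can be sent to the remote API
    // (for example as the body of an HTTP request)
});
pdfDoc.end();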

If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?

I'm working on a personal project to manage users of my club, it's hosted on the free Azure package (for now at least), partly as an experiment to try out Azure. Part of creating their records is to add a photo, so I've got a Contact Card view that lets me see who they are, when they came and a photo.
I have installed ImageResizer and it's really easy to resize the 10MP photos from my camera and save them to the file system locally, but it seems that for Azure I need to use their Blobs to Upload Pictures to Windows Azure Web Sites, and that's new to me. The documentation on ImageResizer says that I need to use AzureReader2 in order to work with Azure blobs but it isn't free. It also says in their best practices #5 to
Use dynamic resizing instead of pre-resizing your images.
Which is not what I was thinking; I was going to resize to 300x300 and 75x75 (for the thumbnail) when creating the user's record. But if I should be storing full-size images as blobs and dynamically resizing on the way out, can I just use standard means to upload a blob into a container to save it to Azure, and then, when I want to display the images, use ImageResizer and pass it each image to resize as required? That way I would not need AzureReader2, or have I misunderstood what it does / how it works?
Is there another way to consider?
I've not yet implemented cropping, but that's next to tackle once I've worked out how to actually store the images properly.
With some trepidation, I'm going to disagree with astaykov here. I believe you CAN use ImageResizer with Azure WITHOUT needing AzureReader2. Maybe I should qualify that by saying 'It works on my setup' :)
I'm using ImageResizer in an MVC 3 application. I have a standard Azure account with an images container.
Here's my test code for the view:
@using (Html.BeginForm( "UploadPhoto", "BasicProfile", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
<input type="file" name="file" />
<input type="submit" value="OK" />
}
And here's the corresponding code in the Post Action method:
// This action handles the form POST and the upload
[HttpPost]
public ActionResult UploadPhoto(HttpPostedFileBase file)
{
// Verify that the user selected a file
if (file != null && file.ContentLength > 0)
{
string newGuid = Guid.NewGuid().ToString();
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("images");
// Retrieve reference to the blob we want to create
CloudBlockBlob blockBlob = container.GetBlockBlobReference(newGuid + ".jpg");
// Populate our blob with contents from the uploaded file.
using (var ms = new MemoryStream())
{
ImageResizer.ImageJob i = new ImageResizer.ImageJob(file.InputStream,
ms, new ImageResizer.ResizeSettings("width=800;height=600;format=jpg;mode=max"));
i.Build();
blockBlob.Properties.ContentType = "image/jpeg";
ms.Seek(0, SeekOrigin.Begin);
blockBlob.UploadFromStream(ms);
}
}
// redirect back to the index action to show the form once again
return RedirectToAction("UploadPhoto");
}
This is 'rough and ready' code to test the theory and could certainly stand improvement but, it does work both locally and when deployed on Azure. I can also view the images I've uploaded, which are correctly re-sized.
Hope this helps someone.
The answer to the concrete question:
If using ImageResizer with Azure blobs do I need the AzureReader2
plugin?
is YES. As described in ImageResizer's documentation, that plugin is used to read/process/serve images out of Blob Storage. So there is no doubt: if you are going to use ImageResizer, AzureReader2 is the plugin you need to make things right. It will take care of blob uploads/serving.
Although I question the ImageResizer team's competency on Windows Azure, since they reference Azure SDK v2.0, while the most current version of the Azure SDK is 1.8. What they mean is the Azure Storage Client Library, which has versions 1.7 and 2.x; version 2.x is the recommended one to use and comes with Azure SDK 1.8. So do not search for Azure SDK 2.0; install the latest one, which is 1.8. And by the way, use the NuGet Package Manager to install the Azure Storage Library v2.0.x.
You can also upload resized versions to Azure. So you first upload the original image as a blob, say with the name /original/xxx.jpg; then you create a resized version of the image and upload that to Azure with a name like /thumbnail/xxx.jpg. If you want to create the resized versions on the fly or on a separate thread, you may need to temporarily save the original to disk.
