Capture Thumbnails from a Video on Azure Media Services

I'm trying to get an image thumbnail from a video on Azure Media Services.
I can't work out whether a thumbnail is generated automatically, and if so, what its URI is.
The documentation talks about 'Thumbnail Collections' on AssetFile, but I can't find anything further.
Any ideas?
Thanks

Here is sample code to add a thumbnail task to an encoding job:
ITask task = job.Tasks.AddNew("My thumbnail task",
processor,
"Thumbnails",
TaskOptions.None);
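For context, here is a minimal sketch of the surrounding job setup with the Media Services v2 .NET SDK; the processor name, input asset, and synchronous wait are my assumptions, not part of the original answer:
// Assumed context: _mediaContext is a CloudMediaContext and inputAsset is the uploaded video asset.
IJob job = _mediaContext.Jobs.Create("Thumbnail job");
IMediaProcessor processor = _mediaContext.MediaProcessors
    .Where(p => p.Name == "Windows Azure Media Encoder")
    .ToList()
    .OrderBy(p => new Version(p.Version))
    .Last();

ITask task = job.Tasks.AddNew("My thumbnail task", processor, "Thumbnails", TaskOptions.None);
task.InputAssets.Add(inputAsset);
task.OutputAssets.AddNew("Thumbnail output", AssetCreationOptions.None);

job.Submit();
// Wait for the job to finish before reading the output asset files.
job.GetExecutionProgressTask(CancellationToken.None).Wait();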
You can control the thumbnail task parameters by using an XML preset instead of a system-named preset. The following is pasted from the SDK GitHub repo file Jobtests.cs:
string presetXml = @"<?xml version=""1.0"" encoding=""utf-8""?>
<Thumbnail Size=""80,60"" Type=""Jpeg"" Filename=""{OriginalFilename}_{ThumbnailTime}.{DefaultExtension}"">
<Time Value=""0:0:0""/>
<Time Value=""0:0:3"" Step=""0:0:0.25"" Stop=""0:0:10""/>
</Thumbnail>";
IJob job = CreateAndSubmitOneTaskJob(_mediaContext, name, mediaProcessor, presetXml, asset, TaskOptions.None);
var task = job.Tasks.First();
var asset = task.OutputAssets.First();
var files = asset.AssetFiles.ToList();
Run the test ShouldFinishJobWithSuccessWhenPresetISUTF8(), which uses the thumbnail preset, and you will find that the job generates one output asset containing around 30 files. To download these files you can simply call Download or DownloadAsync:
files[0].Download(Path.Combine(downloadFolder, files[0].Name)); // downloadFolder: a local destination directory of your choosing
If you need to get a URL for a selected file, you can execute the following code:
var accessPolicy = mediaContext.AccessPolicies.Create("12HoursRead", TimeSpan.FromHours(12), AccessPermissions.Read);
//Creating read-only access url which will be available for 12 hours
var locator = mediaContext.Locators.CreateSasLocator(asset, accessPolicy);
//Getting url for first file in collection
UriBuilder uriBuilder = new UriBuilder(locator.BaseUri);
uriBuilder.Path += String.Concat("/", files[0].Name);
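One caveat (my addition, not from the original answer): the SAS signature is not part of BaseUri, so if the container isn't publicly readable you also need to append the locator's access token, which the v2 SDK exposes via ContentAccessComponent:
// Append the SAS token carried by the locator (assumes the SAS locator created above).
uriBuilder.Query = locator.ContentAccessComponent.TrimStart('?');
string thumbnailUrl = uriBuilder.Uri.AbsoluteUri;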
Please note that all Azure Media Services asset files are stored in Azure Storage.
If you have a high-volume website, it is better to download the thumbnails from storage and publish them through a CDN.
MSDN docs related to thumbnail preset

Related

Does Azure Blob Storage support partial content (206) by default?

I am using Azure Blob Storage to store all my images and videos. I have implemented the upload and fetch functionality and it's working quite well. I am facing one issue while loading videos: when I use the URL generated after uploading a video to Azure Blob Storage, it downloads all the content before rendering it to the user. So if the video is 100 MB, it downloads the whole 100 MB, and until then the user isn't able to see the video.
I have done a lot of R&D and learned that, to render the video progressively, I need to fetch partial content (status 206) rather than the whole video at once. After adding the request header "Range:bytes-500", I tried to hit the blob URL, but it still downloaded the whole content. I then tried some open-source video URLs with the same "Range" request header and they successfully returned a 206 status, meaning they properly gave me partial content instead of the full video.
I read some forums saying Azure Storage supports partial content and that it needs to be enabled from the properties, but I have checked all the options under the Azure storage account and didn't find anything to enable this functionality.
Can anyone please help me resolve this, or point out anything on the Azure portal that I need to enable? I have been researching this for a week now. Any help would be really appreciated.
Thank you! Stay safe.
If Accept-Ranges is not enabled, then (per this blog) you need to set the default version of the service.
Below is sample code to implement it:
var credentials = new StorageCredentials("account name", "account key");
var account = new CloudStorageAccount(credentials, true);
var client = account.CreateCloudBlobClient();
var properties = client.GetServiceProperties();
properties.DefaultServiceVersion = "2019-07-07";
client.SetServiceProperties(properties);
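If you are on the newer Azure.Storage.Blobs (v12) SDK instead of the classic client shown above, a rough equivalent sketch (the connection string is a placeholder) would be:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var serviceClient = new BlobServiceClient("<your-connection-string>");
BlobServiceProperties props = await serviceClient.GetPropertiesAsync();
props.DefaultServiceVersion = "2019-07-07"; // same value the classic sample above uses
await serviceClient.SetPropertiesAsync(props);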
Below is a comparison of the response headers before and after setting the property (the before/after screenshots are not reproduced here); after the change, the response includes the Accept-Ranges header.
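To verify the change yourself, you can issue a ranged GET and check for a 206; a small sketch (the blob URL is a placeholder) follows:
// Request only the first 1 KB of a blob and inspect the response.
using var http = new HttpClient();
using var request = new HttpRequestMessage(HttpMethod.Get,
    "https://youraccount.blob.core.windows.net/videos/sample.mp4");
request.Headers.Range = new System.Net.Http.Headers.RangeHeaderValue(0, 1023);

using var response = await http.SendAsync(request);
Console.WriteLine(response.StatusCode);                    // expect PartialContent (206)
Console.WriteLine(response.Content.Headers.ContentRange);  // e.g. bytes 0-1023/<total size>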
Assuming the video content is MPEG-4, the issue may be the media itself: the moov atom needs to be moved from the end of the file to the beginning. The browser won't render the video until it finds the moov atom, so you want to make sure the atom is at the start of the file, which can be accomplished with ffmpeg's "faststart" option. Here's a good article with more detail: HERE
You just need to update your Azure Storage version. It will work automatically after the update.
Using Azure CLI
Just run:
az storage account blob-service-properties update --default-service-version 2021-08-06 -n yourStorageAccountName -g yourStorageResourceGroupName
List of available versions:
https://learn.microsoft.com/en-us/rest/api/storageservices/previous-azure-storage-service-versions
To see your current version, open a file and inspect the x-ms-version response header.
The following is the SDK code I used to download the contents:
var container = new BlobContainerClient("UseDevelopmentStorage=true", "sample-container");
await container.CreateIfNotExistsAsync();
BlobClient blobClient = container.GetBlobClient(fileName);
Stream stream = new MemoryStream();
var result = await blobClient.DownloadToAsync(stream, cancellationToken: ct);
This DOES download the whole file right away! Unfortunately, the solution provided in the other answers seems to reference another SDK. For the SDK that I use (Azure.Storage.Blobs), the solution is to use the OpenReadAsync method:
long kBytesToReadAtOnce = 300;
long bytesToReadAtOnce = kBytesToReadAtOnce * 1024;
//int mbBytesToReadAtOnce = 1;
var result = await blobClient.OpenReadAsync(0, bufferSize: (int)bytesToReadAtOnce, cancellationToken: ct);
By default it fetches 4 MB of data, so you have to override the value with a smaller amount if you want your app to have a smaller memory footprint.
I think that internally the SDK sends the requests with the byte range already set, so all you have to do is enable partial content support in Web API like this:
return new FileStreamResult(result, contentType)
{
EnableRangeProcessing = true,
};
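Putting the two pieces together, a minimal ASP.NET Core action might look like this (the route, container name, connection string, and content type are placeholders, not from the original answer):
[HttpGet("videos/{fileName}")]
public async Task<IActionResult> GetVideo(string fileName, CancellationToken ct)
{
    var container = new BlobContainerClient("UseDevelopmentStorage=true", "sample-container");
    BlobClient blobClient = container.GetBlobClient(fileName);

    // OpenReadAsync returns a seekable stream that pulls blocks lazily,
    // so only the requested byte ranges are fetched from storage.
    Stream stream = await blobClient.OpenReadAsync(bufferSize: 300 * 1024, cancellationToken: ct);

    return new FileStreamResult(stream, "video/mp4")
    {
        EnableRangeProcessing = true, // lets ASP.NET Core answer Range requests with 206
    };
}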

SharePoint REST - Can we update file metadata while uploading the file itself?

Using the endpoint below, we can upload a file to SharePoint:
https://domain.example.com/_api/web/GetFolderByServerRelativeUrl("FolderRelativeUrl")/Files/add(url="File",overwrite=true)
Using the endpoint below, we can update the metadata for a specific file:
https://domain.example.com/_api/web/GetFileByServerRelativeUrl(URL)/ListItemAllFields
Is it possible to update the metadata while uploading the file itself?
Similarly, when retrieving, we need to fetch the metadata along with the file.
Basically, I am trying to avoid two separate calls. Does the SharePoint API support this?
SharePoint doesn't provide a REST API to achieve this.
As a workaround, we can use CSOM (C#):
public bool UploadDocument(string fileName, string filePath, List<string> metaDataList)
{
    using (ClientContext ctx = new ClientContext("http://yoursharepointURL"))
    {
        Web web = ctx.Web;
        // Read the local file and describe the upload
        FileCreationInformation newFile = new FileCreationInformation();
        newFile.Content = System.IO.File.ReadAllBytes(filePath);
        newFile.Url = "/" + fileName;
        List docs = web.Lists.GetByTitle("Shared Documents");
        Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(newFile);
        ctx.Load(uploadFile);
        ctx.ExecuteQuery();
        // Set the metadata on the list item behind the uploaded file
        ListItem item = uploadFile.ListItemAllFields;
        item["Title"] = fileName; // example metadata value
        item.Update();
        ctx.ExecuteQuery();
        return true;
    }
}
If you want to call a web service using Ajax from the UI, you can create a custom web service with CSOM (C#) and then consume that web service using Ajax.
It's frustrating, but you can upload the initial version and set metadata in a single call; you cannot, however, upload a new version and set metadata together, only as separate calls. I am transferring files from a DMS which can have multiple versions, and the version history in SharePoint will not match. To make it consistent, I also transfer the initial version and its metadata as two calls. The customer is informed, and the version history is okay; the file import just shows as an empty version.

What happens to files downloaded in WebJob

I am working with some sensitive files (mostly images) in my WebJob. The WebJob downloads the files from Azure Blob Storage (container 1), does some processing, and uploads the results to Azure Blob Storage (container 2).
Because these files are sensitive in nature, I want to be 100% sure that the WebJob deletes them once the job has finished running.
Can someone tell me what happens to files downloaded in a WebJob?
My download code looks like this:
var stream = new MemoryStream(); // download target is an in-memory stream, not a file on disk
using (StorageService storage = CreateStorageClient())
{
var bucketname = "container1";
var objectToDownload = storage.Objects.Get(bucketname, "files/img1.jpg").Execute();
var downloader = new MediaDownloader(storage);
downloader.Download(objectToDownload.MediaLink, stream);
}
Here CreateStorageClient() is my utility method which creates a StorageService object.
Solved using @lopezbertoni's comment.
Also found a relevant question which helped: Azure Webjob - accessing local file system
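For what it's worth, the code above never writes the file to disk (it downloads into a MemoryStream). If the processing step does need a file on disk, a defensive pattern (my sketch, not from the original thread) is to keep it in the temp folder and delete it in a finally block:
string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
try
{
    // stream is the MemoryStream populated by the download above.
    File.WriteAllBytes(tempPath, stream.ToArray());
    // ... process the image ...
}
finally
{
    // Make sure nothing sensitive outlives the job run.
    if (File.Exists(tempPath))
        File.Delete(tempPath);
}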

Amazon Glacier storage upload API method

I am trying to upload a zip file to a vault in Amazon Glacier cloud storage. The upload method executes successfully without any exception and returns the archive ID of the upload, but I can't see any files in the archives.
string vaultName = "test";
string archiveToUpload = @"E:\Mayur\Downloads\Zip files\search.zip";
ArchiveTransferManager manager = new ArchiveTransferManager(Amazon.RegionEndpoint.USWest2);
string archiveId = manager.Upload(vaultName, "Binari archive", archiveToUpload).ArchiveId;
I am not getting any exception; the method successfully returns an archive ID.
Please help me find out what the actual issue is.
Thanks,
Mayur
Simply because the vault inventory is updated approximately once a day.
source: http://aws.amazon.com/glacier/faqs/
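If you want to confirm the archive is there without waiting for the console to catch up, you can request a vault inventory yourself. A rough sketch with the low-level .NET client follows (region and vault name are taken from the question; the inventory job itself typically takes hours to complete):
using Amazon;
using Amazon.Glacier;
using Amazon.Glacier.Model;

var client = new AmazonGlacierClient(RegionEndpoint.USWest2);

// Kick off an inventory-retrieval job; poll DescribeJob / GetJobOutput once it completes.
var response = client.InitiateJob(new InitiateJobRequest
{
    VaultName = "test",
    JobParameters = new JobParameters { Type = "inventory-retrieval", Format = "JSON" }
});

Console.WriteLine("Inventory job id: " + response.JobId);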

If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?

I'm working on a personal project to manage the users of my club. It's hosted on the free Azure package (for now at least), partly as an experiment to try out Azure. Part of creating their records is adding a photo, so I've got a contact-card view that lets me see who they are, when they came, and a photo.
I have installed ImageResizer and it's really easy to resize the 10 MP photos from my camera and save them to the file system locally, but it seems that for Azure I need to use their Blobs to Upload Pictures to Windows Azure Web Sites, and that's new to me. The ImageResizer documentation says that I need to use AzureReader2 in order to work with Azure blobs, but it isn't free. It also says, in their best practices #5, to
Use dynamic resizing instead of pre-resizing your images.
That is not what I had in mind; I was going to resize to 300x300 and 75x75 (for the thumbnail) when creating the user's record. But if I should be storing full-size images as blobs and dynamically resizing them on the way out, can I just use standard means to Upload a blob into a container to save it to Azure, and then, when I want to display the images, pass each one to ImageResizer to resize as required? That way I wouldn't need AzureReader2, or have I misunderstood what it does / how it works?
Is there another way to consider?
I've not yet implemented cropping, but that's next to tackle once I've worked out how to actually store the images properly.
With some trepidation, I'm going to disagree with astaykov here. I believe you CAN use ImageResizer with Azure WITHOUT needing AzureReader2. Maybe I should qualify that by saying 'It works on my setup' :)
I'm using ImageResizer in an MVC 3 application. I have a standard Azure storage account with an "images" container.
Here's my test code for the view:
@using (Html.BeginForm( "UploadPhoto", "BasicProfile", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
<input type="file" name="file" />
<input type="submit" value="OK" />
}
And here's the corresponding code in the POST action method:
// This action handles the form POST and the upload
[HttpPost]
public ActionResult UploadPhoto(HttpPostedFileBase file)
{
// Verify that the user selected a file
if (file != null && file.ContentLength > 0)
{
string newGuid = Guid.NewGuid().ToString();
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("images");
// Retrieve reference to the blob we want to create
CloudBlockBlob blockBlob = container.GetBlockBlobReference(newGuid + ".jpg");
// Populate our blob with contents from the uploaded file.
using (var ms = new MemoryStream())
{
ImageResizer.ImageJob i = new ImageResizer.ImageJob(file.InputStream,
ms, new ImageResizer.ResizeSettings("width=800;height=600;format=jpg;mode=max"));
i.Build();
blockBlob.Properties.ContentType = "image/jpeg";
ms.Seek(0, SeekOrigin.Begin);
blockBlob.UploadFromStream(ms);
}
}
// redirect back to the index action to show the form once again
return RedirectToAction("UploadPhoto");
}
This is 'rough and ready' code to test the theory and could certainly stand improvement, but it does work both locally and when deployed on Azure. I can also view the images I've uploaded, which are correctly resized.
Hope this helps someone.
The answer to the concrete question:
If using ImageResizer with Azure blobs do I need the AzureReader2
plugin?
is YES. As described in ImageResizer's documentation, that plugin is used to read/process/serve images out of Blob Storage. So there is no doubt: if you are going to use ImageResizer, AzureReader2 is the plugin you need to make things work right. It takes care of blob uploads/serving.
Although I question the ImageResizer team's competency on Windows Azure, since they reference Azure SDK v2.0, while the most current version of the Azure SDK is 1.8. What they mean is the Azure Storage Client Library, which has versions 1.7 and 2.x; version 2.x is the recommended one and ships with Azure SDK 1.8. So do not search for an Azure SDK 2.0; install the latest one, which is 1.8. And by the way, use the NuGet Package Manager to install the Azure Storage Library v2.0.x.
You can also upload resized versions to Azure. First upload the original image as a blob, say with the name /original/xxx.jpg; then create a resized version of the image and upload it to Azure with a name like /thumbnail/xxx.jpg. If you want to create the resized versions on the fly or on a separate thread, you may need to temporarily save the original to disk, as sketched below.
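A rough sketch of that approach, reusing the same classic storage client and ImageResizer calls as the answer above (blob names and sizes are just examples):
string id = Guid.NewGuid().ToString();
CloudBlobContainer container = blobClient.GetContainerReference("images");

// 1. Store the original image untouched.
CloudBlockBlob original = container.GetBlockBlobReference("original/" + id + ".jpg");
file.InputStream.Seek(0, SeekOrigin.Begin);
original.UploadFromStream(file.InputStream);

// 2. Store a pre-resized 75x75 thumbnail produced by ImageResizer.
using (var ms = new MemoryStream())
{
    file.InputStream.Seek(0, SeekOrigin.Begin);
    new ImageResizer.ImageJob(file.InputStream, ms,
        new ImageResizer.ResizeSettings("width=75;height=75;format=jpg;mode=crop")).Build();
    ms.Seek(0, SeekOrigin.Begin);

    CloudBlockBlob thumb = container.GetBlockBlobReference("thumbnail/" + id + ".jpg");
    thumb.Properties.ContentType = "image/jpeg";
    thumb.UploadFromStream(ms);
}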
