Upload and encode an audio file in Windows Azure Media Services

Everywhere online I can only find explanations for video files uploaded to Azure Media Services. Based on those tutorials I wrote my own code.
After running the StoreAudio method I have:
- a new blob in storage
- a new asset in Media Services
- a new job successfully completed in Media Services

However, the created asset is not published, and when I try to read properties of the converted asset such as its ID or URI, I get an exception.
Why are ID and URI null? Why is the content "not published"?
Code:
public string StoreAudio(int ID, byte[] file)
{
    try
    {
        var blobContainerName = AudioBookContainer; //+ AudioChapterID % 1000?
        var fileName = ID + ".mp3";
        var mediaBlobContainer = blobClient.GetContainerReference(blobContainerName);
        mediaBlobContainer.CreateIfNotExists();
        using (MemoryStream ms = new MemoryStream(file))
        {
            var reference = mediaBlobContainer.GetBlockBlobReference(fileName);
            reference.UploadFromStream(ms);
        }
        IAsset asset = _context.Assets.Create(fileName, AssetCreationOptions.None);
        IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy", TimeSpan.FromMinutes(120), AccessPermissions.Write);
        ILocator destinationLocator = _context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
        Uri uploadUri = new Uri(destinationLocator.Path);
        string assetContainerName = uploadUri.Segments[1];
        CloudBlobContainer assetContainer = blobClient.GetContainerReference(assetContainerName);
        var sourceCloudBlob = mediaBlobContainer.GetBlockBlobReference(fileName);
        sourceCloudBlob.FetchAttributes();
        if (sourceCloudBlob.Properties.Length > 0)
        {
            IAssetFile assetFile = asset.AssetFiles.Create(fileName);
            var destinationBlob = assetContainer.GetBlockBlobReference(fileName);
            destinationBlob.DeleteIfExists();
            destinationBlob.StartCopyFromBlob(sourceCloudBlob);
            destinationBlob.FetchAttributes();
            if (sourceCloudBlob.Properties.Length != destinationBlob.Properties.Length)
                throw new Exception("Error copying");
        }
        destinationLocator.Delete();
        writePolicy.Delete();
        asset = _context.Assets.Where(a => a.Id == asset.Id).FirstOrDefault(); // At this point, you can create a job using your asset.
        var encodedAsset = EncodeToWMA(asset);
        return encodedAsset.Id;
        // NOTE: everything below this return statement is unreachable as written.
        //var ismAssetFiles = encodedAsset.AssetFiles.ToList().Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).ToArray();
        //if (ismAssetFiles.Count() != 1)
        //    throw new ArgumentException("The asset should have only one, .ism file");
        //ismAssetFiles.First().IsPrimary = true;
        //ismAssetFiles.First().Update();
        asset.Delete();
        return encodedAsset.Uri.AbsoluteUri;
    }
    catch (Exception exx)
    {
        return exx.Message + exx.InnerException;
    }
}
private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
{
    var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
    if (processor == null)
        throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
    return processor;
}
public static IAsset EncodeToWMA(IAsset asset)
{
    IJob job = _context.Jobs.Create("Convert MP3 to WMA");
    IMediaProcessor processor = GetLatestMediaProcessorByName("Windows Azure Media Encoder");
    ITask task = job.Tasks.AddNew("My encoding task", processor, "WMA High Quality Audio", TaskOptions.None);
    task.InputAssets.Add(asset);
    task.OutputAssets.AddNew(asset.Name.Replace(".mp3", ".wma"), AssetCreationOptions.None);
    job.Submit();
    Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
    progressJobTask.Wait();
    return task.OutputAssets.First();
}

Adding more explanation:
The reason I suggested creating a SAS locator is that we don't support audio streaming for WMA files at the moment, so getting an origin streaming locator won't work for you.
The reason the asset is "not published" is that you never published it: creating a SAS locator or an origin locator is what publishes an asset.

For an audio file, you can ask for a SAS locator to access the file; SAS locators are used for progressive download:
_context.Locators.Create(LocatorType.Sas, outputAsset, AccessPermissions.Read, TimeSpan.FromDays(30));
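A minimal sketch of the full publish step, assuming _context is your CloudMediaContext and encodedAsset is the asset returned by EncodeToWMA (the policy name, durations, and single-file assumption are illustrative):

// Publish the asset by creating a read access policy and a SAS locator.
IAccessPolicy readPolicy = _context.AccessPolicies.Create("readPolicy",
    TimeSpan.FromDays(30), AccessPermissions.Read);
ILocator sasLocator = _context.Locators.CreateLocator(LocatorType.Sas, encodedAsset, readPolicy);

// Build a progressive-download URL for the encoded file.
IAssetFile wmaFile = encodedAsset.AssetFiles.First();
var builder = new UriBuilder(sasLocator.Path); // the locator path already carries the SAS query string
builder.Path += "/" + wmaFile.Name;
string downloadUrl = builder.Uri.AbsoluteUri;

After this, the asset shows as published and downloadUrl can be handed to a client for progressive download.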

Related

Unable to bulk import devices with Private Storage account

I am working on stress testing for our IoT use case.
For the test, I need to create 100 devices, so I developed an Azure Function that uses IoT Hub's import-devices feature as per the Microsoft docs.
When I used the sample code with a public storage account for the import/output blob container SAS token, it worked as expected and created the devices on IoT Hub.
But when I use the same code with a private storage account, it sometimes throws a read error and sometimes a write error on blob storage, even though the SAS token has all the required permissions (Read, Write, Delete, Create, List, Add, etc.) and the private storage account also has its DNS configuration in place. For other purposes I am able to add/update/delete blobs on the same private storage account.
The only problem I am facing is when calling the ImportDevicesAsync method of IoT Hub's RegistryManager.
My sample code is below.
To create the devices.txt file and upload it to the proper container:
for (int i = deviceIndex; i < deviceCount + deviceIndex; i++)
{
    var deviceToAdd = new ExportImportDevice()
    {
        Id = $"{devicePrefix}{i.ToString().PadLeft(6, '0')}",
        ImportMode = importMode == "delete" ? ImportMode.Delete : ImportMode.Create,
        Status = DeviceStatus.Enabled,
        Authentication = new AuthenticationMechanism()
        {
            SymmetricKey = new SymmetricKey()
            {
                PrimaryKey = CryptoKeyGenerator.GenerateKey(32),
                SecondaryKey = CryptoKeyGenerator.GenerateKey(32)
            }
        },
        Tags = new TwinCollection(initialTags.SerializeObject())
    };
    serializedDevices.Add(deviceToAdd.SerializeObject());
}

// Write the list to the blob
StringBuilder sb = new();
serializedDevices.ForEach(serializedDevice => sb.AppendLine(serializedDevice));

Uri uri = new(assetsBlockBlobUrl + "?" + assetsBlobContainerSas);
CloudBlockBlob blob = new(uri);
await blob.DeleteIfExistsAsync();
using (CloudBlobStream stream = await blob.OpenWriteAsync())
{
    byte[] bytes = Encoding.UTF8.GetBytes(sb.ToString());
    for (var i = 0; i < bytes.Length; i += 500)
    {
        int length = Math.Min(bytes.Length - i, 500);
        await stream.WriteAsync(bytes.AsMemory(i, length));
    }
}
To import devices from the same container using the registryManager.ImportDevicesAsync method:
RegistryManager registryManager = RegistryManager.CreateFromConnectionString(Environment.GetEnvironmentVariable("iotHubConnectionString"));
JobProperties importJob = await registryManager.ImportDevicesAsync(containerSasUri, containerSasUri);

// Wait until the job is finished
while (true)
{
    importJob = await registryManager.GetJobAsync(importJob.JobId);
    _logger.LogInformation("import job " + importJob.Status);
    if (importJob.Status == JobStatus.Completed)
    {
        return Common.Utils.GetObjectResult(importMode == "delete" ? MessageConstants.SuccessDeletedAsset : MessageConstants.SuccessCreatedAsset);
    }
    else if (importJob.Status == JobStatus.Failed)
    {
        return Common.Utils.GetObjectResult(importMode == "delete" ? MessageConstants.DeleteDeviceFail : MessageConstants.CreateDeviceFail);
    }
    else if (importJob.Status == JobStatus.Cancelled)
    {
        return Common.Utils.GetObjectResult(importMode == "delete" ? MessageConstants.DeviceDeletionCancel : MessageConstants.DeviceCreationCancel);
    }
    await Task.Delay(TimeSpan.FromSeconds(5));
}

Unable to get the PublishAssetURL in Azure Media Services

I am trying to upload an mp4 file from a controller into Azure Blob Storage. Right after the upload is done, I create an asset from the same blob I just uploaded. Everything seems to work fine, but somehow I am unable to get the publish asset URL.
var manifestFile = asset.AssetFiles.Where(x => x.Name.EndsWith(".ism")).FirstOrDefault();
The issue is on this line: manifestFile comes back null.
public string CreateAssetFromExistingBlobs(CloudBlobContainer sourceBlobContainer, CloudStorageAccount _destinationStorageAccount, CloudMediaContext _context, AzureStorageMultipartFormDataStreamProvider provider)
{
    CloudBlobClient destBlobStorage = _destinationStorageAccount.CreateCloudBlobClient();

    // Create a new asset.
    IAsset asset = _context.Assets.Create("NewAsset_" + Guid.NewGuid(), AssetCreationOptions.None);
    IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy", TimeSpan.FromHours(24), AccessPermissions.Write);
    ILocator destinationLocator = _context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);

    // Get the asset container URI and blob-copy from mediaContainer to assetContainer.
    CloudBlobContainer destAssetContainer = destBlobStorage.GetContainerReference((new Uri(destinationLocator.Path)).Segments[1]);
    if (destAssetContainer.CreateIfNotExists())
    {
        destAssetContainer.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });
    }

    var blob = sourceBlobContainer.GetBlockBlobReference(provider.FileData.FirstOrDefault().LocalFileName);
    blob.FetchAttributes();
    var assetFile = asset.AssetFiles.Create(blob.Name);
    CopyBlob(blob, destAssetContainer);
    assetFile.ContentFileSize = blob.Properties.Length;
    assetFile.Update();
    asset.Update();

    destinationLocator.Delete();
    writePolicy.Delete();

    // Set the primary asset file.
    // If, for example, we copied a set of Smooth Streaming files,
    // set the .ism file to be the primary file.
    // If we, for example, copied an .mp4, then the mp4 would be the primary file.
    // (Note: despite the variable name, this code selects the first .mp4, not an .ism.)
    var ismAssetFile = asset.AssetFiles.ToList().Where(f => f.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase)).ToArray().FirstOrDefault();
    if (ismAssetFile != null)
    {
        ismAssetFile.IsPrimary = true;
        ismAssetFile.Update();
    }

    IAsset encodedAsset = EncodeToAdaptiveBitrateMP4Set(asset, _context);
    return PublishAssetGetURLs(encodedAsset, _context);
}
private void CopyBlob(ICloudBlob sourceBlob, CloudBlobContainer destinationContainer)
{
    var signature = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    var destinationBlob = destinationContainer.GetBlockBlobReference(sourceBlob.Name);
    if (destinationBlob.Exists())
    {
        Console.WriteLine(string.Format("Destination blob '{0}' already exists. Skipping.", destinationBlob.Uri));
    }
    else
    {
        // Display the size of the source blob.
        Console.WriteLine(sourceBlob.Properties.Length);
        Console.WriteLine(string.Format("Copy blob '{0}' to '{1}'", sourceBlob.Uri, destinationBlob.Uri));
        destinationBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + signature));

        while (true)
        {
            // StartCopy is an async operation, so check whether the copy has
            // completed before proceeding. To do that, call FetchAttributes
            // on the blob and check the CopyState.
            destinationBlob.FetchAttributes();
            if (destinationBlob.CopyState.Status != CopyStatus.Pending)
            {
                break;
            }
            // It's still not completed, so wait for some time.
            System.Threading.Thread.Sleep(1000);
        }

        // Display the size of the destination blob.
        Console.WriteLine(destinationBlob.Properties.Length);
    }
}
private IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset, CloudMediaContext _context)
{
    // Declare a new job.
    IJob job = _context.Jobs.Create("Media Encoder Standard Job");

    // Get a media processor reference, and pass to it the name of the
    // processor to use for the specific task.
    IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard", _context);

    // Create a task with the encoding details, using a string preset.
    // In this case the "Adaptive Streaming" preset is used.
    ITask task = job.Tasks.AddNew("My encoding task",
        processor,
        "Adaptive Streaming",
        TaskOptions.None);

    // Specify the input asset to be encoded.
    task.InputAssets.Add(asset);

    // Add an output asset to contain the results of the job.
    // This output is specified as AssetCreationOptions.None, which
    // means the output asset is not encrypted.
    task.OutputAssets.AddNew("Output asset", AssetCreationOptions.None);

    job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
    job.Submit();
    job.GetExecutionProgressTask(CancellationToken.None).Wait();
    return job.OutputMediaAssets[0];
}
public void JobStateChanged(object sender, JobStateChangedEventArgs e)
{
    //Console.WriteLine("Job state changed event:");
    //Console.WriteLine("  Previous state: " + e.PreviousState);
    //Console.WriteLine("  Current state: " + e.CurrentState);
    switch (e.CurrentState)
    {
        case JobState.Finished:
            //Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
            break;
        case JobState.Canceling:
        case JobState.Queued:
        case JobState.Scheduled:
        case JobState.Processing:
            //Console.WriteLine("Please wait...\n");
            break;
        case JobState.Canceled:
        case JobState.Error:
            // Cast sender as a job.
            IJob job = (IJob)sender;
            // Display or log error details as needed.
            break;
        default:
            break;
    }
}
private IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName, CloudMediaContext _context)
{
    var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
    if (processor == null)
        throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
    return processor;
}
private string PublishAssetGetURLs(IAsset asset, CloudMediaContext _context)
{
    // Create a 30-day read-only access policy.
    // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
    IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
        TimeSpan.FromDays(30),
        AccessPermissions.Read);

    // Create a locator to the streaming content on an origin.
    ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
        policy,
        DateTime.UtcNow.AddMinutes(-5));

    // Get a reference to the streaming manifest file from the
    // collection of files in the asset.
    var manifestFile = asset.AssetFiles.Where(x => x.Name.EndsWith(".ism")).FirstOrDefault();

    // Create a full URL to the manifest file. Use this for playback
    // in streaming media clients. The bare URL is the Smooth Streaming
    // manifest; append (format=m3u8-aapl) for HLS or (format=mpd-time-csf)
    // for MPEG-DASH.
    string urlForClientStreaming = originLocator.Path + manifestFile.Name + "/manifest";
    return urlForClientStreaming + "(format=m3u8-aapl)";
}
Checked our logs - the reason you are not getting the streaming URL is that the encoding job failed. At the end of EncodeToAdaptiveBitrateMP4Set(), you should confirm that the final job status was Finished (i.e. successful). Looking at the encoder logs, it appears that the input file was corrupt.
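A minimal sketch of that check, using the same v2 SDK types as the code above (re-querying the job by ID and the variable names are illustrative):

// After waiting for the job, re-query it and verify that it actually finished.
job.GetExecutionProgressTask(CancellationToken.None).Wait();
IJob finishedJob = _context.Jobs.Where(j => j.Id == job.Id).FirstOrDefault();
if (finishedJob.State != JobState.Finished)
{
    // Surface task-level error details so problems like a corrupt input are visible.
    foreach (ITask failedTask in finishedJob.Tasks)
    {
        foreach (ErrorDetail detail in failedTask.ErrorDetails)
        {
            Console.WriteLine("Task '{0}' failed: {1} {2}", failedTask.Name, detail.Code, detail.Message);
        }
    }
    throw new InvalidOperationException("Encoding job did not finish successfully.");
}
return finishedJob.OutputMediaAssets[0];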

Uploading an ID to the same Azure blob index that a file is being uploaded to, using the .NET SDK

I am uploading documents to Azure Blob Storage, which works perfectly, but I want to be able to link an ID to each specific uploaded document.
Below is my code for uploading the file:
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
    try
    {
        var path = Path.Combine(Server.MapPath("~/App_Data/Uploads"), file.FileName);
        string searchServiceName = ConfigurationManager.AppSettings["SearchServiceName"];
        string blobStorageKey = ConfigurationManager.AppSettings["BlobStorageKey"];
        string blobStorageName = ConfigurationManager.AppSettings["BlobStorageName"];
        string blobStorageURL = ConfigurationManager.AppSettings["BlobStorageURL"];
        file.SaveAs(path);

        var credentials = new StorageCredentials(searchServiceName, blobStorageKey);
        var client = new CloudBlobClient(new Uri(blobStorageURL), credentials);

        // Retrieve a reference to a container. (You need to create one using the management portal, or call container.CreateIfNotExists().)
        var container = client.GetContainerReference(blobStorageName);

        // Retrieve a reference to a blob named after the uploaded file.
        var blockBlob = container.GetBlockBlobReference(file.FileName);

        // Create or overwrite the blob with the contents of the local file.
        using (var fileStream = System.IO.File.OpenRead(path))
        {
            blockBlob.UploadFromStream(fileStream);
        }
        System.IO.File.Delete(path);

        return new JsonResult
        {
            JsonRequestBehavior = JsonRequestBehavior.AllowGet,
            Data = "Success"
        };
    }
    catch (Exception ex)
    {
        throw;
    }
}
I have added the ClientID field to the index (it is at the bottom), but have no idea how to attach this value to the uploaded document so it lands in the index. This is still all new to me, and I just need a little guidance if someone can help.
Thanks in advance.
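One hedged approach, assuming the index is populated by an Azure Search blob indexer: store the ID as blob metadata at upload time, since a blob indexer can map blob metadata properties onto index fields (the metadata key and the clientId variable are illustrative assumptions):

// Set the ID as metadata on the blob object before uploading; metadata set
// this way is persisted together with the upload, and a blob indexer can map
// the metadata property to the ClientID field in the search index.
blockBlob.Metadata["ClientID"] = clientId.ToString(); // clientId: hypothetical value from your app
using (var fileStream = System.IO.File.OpenRead(path))
{
    blockBlob.UploadFromStream(fileStream);
}

The indexer then picks up ClientID the next time it runs, provided the field mapping matches the metadata key.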

Append to CloudBlockBlob stream

We have a file system abstraction that allows us to easily switch between local and cloud (Azure) storage.
For reading and writing files we have the following members:
Stream OpenRead();
Stream OpenWrite();
Part of our application "bundles" documents into one file. For our local storage provider OpenWrite returns an appendable stream:
public Stream OpenWrite()
{
return new FileStream(fileInfo.FullName, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite, BufferSize, useAsync: true);
}
For Azure blob storage we do the following:
public Stream OpenWrite()
{
return blob.OpenWrite();
}
Unfortunately this overwrites the blob contents each time. Is it possible to return a writable stream that can be appended to?
Based on the documentation for OpenWrite (http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblockblob.openwrite.aspx), the OpenWrite method will overwrite an existing blob unless explicitly prevented using the accessCondition parameter.
One thing you could do is read the blob data into a stream, return that stream to your calling application, and let the application append data to it. For example, see the code below:
static void BlobStreamTest()
{
    storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
    CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference("temp");
    container.CreateIfNotExists();
    CloudBlockBlob blob = container.GetBlockBlobReference("test.txt");
    blob.UploadFromStream(new MemoryStream()); // Let's just create an empty blob for the sake of demonstration.
    for (int i = 0; i < 10; i++)
    {
        try
        {
            using (MemoryStream ms = new MemoryStream())
            {
                blob.DownloadToStream(ms); // Read the existing blob data into the stream.
                byte[] dataToWrite = Encoding.UTF8.GetBytes("This is line # " + (i + 1) + "\r\n");
                ms.Write(dataToWrite, 0, dataToWrite.Length);
                ms.Position = 0;
                blob.UploadFromStream(ms); // Upload the combined contents back.
            }
        }
        catch (StorageException excep)
        {
            if (excep.RequestInformation.HttpStatusCode != 404)
            {
                throw;
            }
        }
    }
}
There is now a CloudAppendBlob class that allows you to add content to an existing blob:
var account = CloudStorageAccount.Parse("storage account connectionstring");
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("container name");
var blob = container.GetAppendBlobReference("blob name");
In your case you want to append from a stream:
await blob.AppendFromStreamAsync(new MemoryStream());
But you can also append from text, a byte array, or a file. Check the documentation.
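A small sketch of the append flow end to end (the blob name and text are placeholders; CreateOrReplaceAsync is only needed before the first append):

var account = CloudStorageAccount.Parse("storage account connectionstring");
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("container name");
var blob = container.GetAppendBlobReference("log.txt");

// Create the append blob once, then keep appending to it.
if (!await blob.ExistsAsync())
{
    await blob.CreateOrReplaceAsync();
}
await blob.AppendTextAsync("First line\r\n");
await blob.AppendTextAsync("Second line\r\n");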

GDI+ Generic Error

When my images are being loaded from my database on my web server, I see the following error:
A generic error occurred in GDI+.
   at System.Drawing.Image.Save(Stream stream, ImageCodecInfo encoder, EncoderParameters encoderParams)
   at System.Drawing.Image.Save(Stream stream, ImageFormat format)
   at MyWeb.Helpers.ImageHandler.ProcessRequest(HttpContext context)
All my code is attempting to do is load the image. Can anybody take a look and let me know what I'm doing wrong?
Note - this works when I test it on my local machine, but not when I deploy it to my web server.
public void ProcessRequest(HttpContext context)
{
    context.Response.Clear();
    if (!String.IsNullOrEmpty(context.Request.QueryString["imageid"]))
    {
        int imageID = Convert.ToInt32(context.Request.QueryString["imageid"]);
        int isThumbnail = Convert.ToInt32(context.Request.QueryString["thumbnail"]);

        // Retrieve this image from the database
        Image image = GetImage(imageID);

        // Make it a thumbnail if requested
        if (isThumbnail == 1)
        {
            Image.GetThumbnailImageAbort myCallback = new Image.GetThumbnailImageAbort(ThumbnailCallback);
            image = image.GetThumbnailImage(200, 200, myCallback, IntPtr.Zero);
        }

        context.Response.ContentType = "image/png";
        // Save the image to the OutputStream
        image.Save(context.Response.OutputStream, ImageFormat.Png);
    }
    else
    {
        context.Response.ContentType = "text/html";
        context.Response.Write("<p>Error: Image ID is not valid - image may have been deleted from the database.</p>");
    }
}
The error occurs on the line:
image.Save(context.Response.OutputStream, ImageFormat.Png);
UPDATE
I've changed my code to this, but the issue still happens:
var db = new MyWebEntities();
var screenshotData = (from screenshots in db.screenshots
                      where screenshots.id == imageID
                      select new ImageModel
                      {
                          ID = screenshots.id,
                          Language = screenshots.language,
                          ScreenshotByte = screenshots.screen_shot,
                          ProjectID = screenshots.projects_ID
                      });
foreach (ImageModel info in screenshotData)
{
    using (MemoryStream ms = new MemoryStream(info.ScreenshotByte))
    {
        Image image = Image.FromStream(ms);

        // Make it a thumbnail if requested
        if (isThumbnail == 1)
        {
            Image.GetThumbnailImageAbort myCallback = new Image.GetThumbnailImageAbort(ThumbnailCallback);
            image = image.GetThumbnailImage(200, 200, myCallback, IntPtr.Zero);
        }

        context.Response.ContentType = "image/png";
        // Save the image to the OutputStream
        image.Save(context.Response.OutputStream, ImageFormat.Png);
    }
}
Thanks.
Probably for the same reason that this guy was having problems: for the lifetime of an Image constructed from a Stream, the stream must not be destroyed.
So if your GetImage function constructs the returned image from a stream (e.g. a MemoryStream) and then closes the stream before returning the image, the above will fail. My guess is that your GetImage looks a tad like this:
Image GetImage(int id)
{
    byte[] data = // Get data from database
    using (MemoryStream stream = new MemoryStream(data))
    {
        return Image.FromStream(stream); // the stream is disposed before the Image is ever used
    }
}
If this is the case, then try having GetImage return the MemoryStream (or possibly the byte array) directly, so that you can create the Image instance in your ProcessRequest method and dispose of the stream only when the processing of that image has completed.
This is mentioned in the documentation, but it's kind of in the small print.
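A minimal sketch of that fix, assuming the database lookup can return the raw bytes (GetImageBytes is a hypothetical variant of GetImage):

// Keep the stream alive for as long as the Image is in use.
byte[] data = GetImageBytes(imageID); // hypothetical helper returning the raw image bytes
using (var stream = new MemoryStream(data))
using (Image image = Image.FromStream(stream))
{
    context.Response.ContentType = "image/png";
    image.Save(context.Response.OutputStream, ImageFormat.Png);
} // The stream is disposed only after the Image has been saved, so GDI+ can still read from it.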
