Transferring files from a previously deleted Azure Media Services account to a new Media Services account - is it possible? - azure

Is it possible to transfer files from an old Media Services account, which has been accidentally deleted, to a new Media Services account?
Microsoft tech support has been no help.
I'm able to copy the files over to the new Media Services account, but when I test whether I can publish one of the assets in the portal, it gives me streaming URLs that appear to be successful. When I try to access them, however, I get a network error:
<serverError>
<status>404</status>
<subStatus>1000</subStatus>
<hresult>MPE_STORAGE_RESOURCE_NOT_FOUND</hresult>
<activityId>80028340-0004-F800-B63F-84710C7967BB</activityId>
<serviceId>4A42CB8E-4542-0C18-2C0D-4B460D96B604</serviceId>
</serverError>
I don't think it can find the manifest file, which is named pc124m190o_AdaptiveStreaming_manifest.xml.
The name of the metadata file could also be a potential problem:
5f7e8f45-87e9-49ce-a2ae-7bb673bf0b0f_metadata.xml
Has anyone successfully done this?
Here is the code I'm using to copy the files. Maybe the error is here?
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using Microsoft.WindowsAzure.MediaServices.Client;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
class Program
{
// Read values from the App.config file.
private static readonly string _sourceStorageAccountName =
ConfigurationManager.AppSettings["SourceStorageAccountName"];
private static readonly string _sourceStorageAccountKey =
ConfigurationManager.AppSettings["SourceStorageAccountKey"];
private static readonly string _NameOfBlobContainerYouWantToCopy =
ConfigurationManager.AppSettings["NameOfBlobContainerYouWantToCopy"];
private static readonly string _AMSAADTenantDomain =
ConfigurationManager.AppSettings["AMSAADTenantDomain"];
private static readonly string _AMSRESTAPIEndpoint =
ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
private static readonly string _AMSClientId =
ConfigurationManager.AppSettings["AMSClientId"];
private static readonly string _AMSClientSecret =
ConfigurationManager.AppSettings["AMSClientSecret"];
private static readonly string _AMSStorageAccountName =
ConfigurationManager.AppSettings["AMSStorageAccountName"];
private static readonly string _AMSStorageAccountKey =
ConfigurationManager.AppSettings["AMSStorageAccountKey"];
// Field for service context.
private static CloudMediaContext _context = null;
private static CloudStorageAccount _sourceStorageAccount = null;
private static CloudStorageAccount _destinationStorageAccount = null;
static void Main(string[] args)
{
AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AMSAADTenantDomain,
new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
// Create the context for your source Media Services account.
_context = new CloudMediaContext(new Uri(_AMSRESTAPIEndpoint), tokenProvider);
_sourceStorageAccount =
new CloudStorageAccount(new StorageCredentials(_sourceStorageAccountName,
_sourceStorageAccountKey), true);
_destinationStorageAccount =
new CloudStorageAccount(new StorageCredentials(_AMSStorageAccountName,
_AMSStorageAccountKey), true);
CloudBlobClient sourceCloudBlobClient =
_sourceStorageAccount.CreateCloudBlobClient();
// CreateAssetFromExistingBlobs(sourceContainer);
List<string> containers = GetAllContainerNames(sourceCloudBlobClient);
foreach(string item in containers)
{
CloudBlobContainer sourceContainer =
sourceCloudBlobClient.GetContainerReference(item);
CreateAssetFromExistingBlobs(sourceContainer);
Console.WriteLine("finished " + item);
}
}
static private IAsset CreateAssetFromExistingBlobs(CloudBlobContainer sourceBlobContainer)
{
CloudBlobClient destBlobStorage = _destinationStorageAccount.CreateCloudBlobClient();
// Create a new asset.
IAsset asset = _context.Assets.Create("NewAsset_" + Guid.NewGuid(), AssetCreationOptions.None);
IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy",
TimeSpan.FromHours(24), AccessPermissions.Write);
ILocator destinationLocator =
_context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
// Get the asset container URI and Blob copy from mediaContainer to assetContainer.
CloudBlobContainer destAssetContainer =
destBlobStorage.GetContainerReference((new Uri(destinationLocator.Path)).Segments[1]);
if (destAssetContainer.CreateIfNotExists())
{
destAssetContainer.SetPermissions(new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
}
var blobList = sourceBlobContainer.ListBlobs();
foreach (CloudBlockBlob sourceBlob in blobList)
{
var assetFile = asset.AssetFiles.Create((sourceBlob as ICloudBlob).Name);
ICloudBlob destinationBlob = destAssetContainer.GetBlockBlobReference(assetFile.Name);
CopyBlob(sourceBlob, destAssetContainer);
sourceBlob.FetchAttributes();
assetFile.ContentFileSize = (sourceBlob as ICloudBlob).Properties.Length;
assetFile.Update();
Console.WriteLine("File {0} is of {1} size", assetFile.Name, assetFile.ContentFileSize);
}
asset.Update();
destinationLocator.Delete();
writePolicy.Delete();
// Set the primary asset file.
// If, for example, we copied a set of Smooth Streaming files,
// set the .ism file to be the primary file.
// If we, for example, copied an .mp4, then the mp4 would be the primary file.
var ismAssetFile = asset.AssetFiles.ToList().
Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).ToArray().FirstOrDefault();
// The following code assigns the first .ism file as the primary file in the asset.
// An asset should have one .ism file.
if (ismAssetFile != null)
{
ismAssetFile.IsPrimary = true;
ismAssetFile.Update();
}
return asset;
}
}
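The two helper methods referenced above, GetAllContainerNames and CopyBlob, are not shown in the question. A minimal sketch of what they might look like with the classic WindowsAzure.Storage SDK, assuming a short-lived read SAS on the source blob is used for the cross-account copy (these would sit inside the same Program class):
static List<string> GetAllContainerNames(CloudBlobClient client)
{
    // Enumerate every container in the source storage account.
    return client.ListContainers().Select(c => c.Name).ToList();
}
static void CopyBlob(CloudBlockBlob sourceBlob, CloudBlobContainer destinationContainer)
{
    // Cross-account copy: give the destination service read access to the source blob
    // by appending a short-lived read SAS to its URI.
    string sas = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    CloudBlockBlob destinationBlob = destinationContainer.GetBlockBlobReference(sourceBlob.Name);
    destinationBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + sas));
}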
Here is what my media storage window looks like

The manifest file you mentioned is technically not required for streaming. What is missing is that the new Media Services account knows nothing about the files you copied into the new storage account. You must create new assets and copy the files into those assets for Media Services to see them. I recommend doing an import from the storage account with Azure Media Services Explorer in your new account.
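If you prefer to do the import in code rather than with Azure Media Services Explorer, the pattern is the same as in the question's code: create an asset, copy the blobs into its container through a SAS locator, and then publish it with a streaming locator. A minimal sketch of the publish step, assuming the same v2 .NET SDK and that asset refers to the asset returned by CreateAssetFromExistingBlobs:
// Publish an imported asset with an on-demand (streaming) locator.
IAccessPolicy readPolicy = _context.AccessPolicies.Create(
    "readPolicy", TimeSpan.FromDays(365), AccessPermissions.Read);
ILocator streamingLocator = _context.Locators.CreateLocator(
    LocatorType.OnDemandOrigin, asset, readPolicy);
// The smooth streaming URL is the locator path plus the .ism file name and "/manifest".
var ism = asset.AssetFiles.ToList()
    .First(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase));
Console.WriteLine(streamingLocator.Path + ism.Name + "/manifest");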

Related

How do I access blob storage sub containers in my ASP.NET Core web application?

I've created a solution that accesses my container contents within my ASP.NET Core 3.1 application and returns that content as a list to my view. At the moment the application accesses data in the root container, which is called upload; however, this container has many sub containers and I would like to list the blobs in a specific one called 1799.
So, instead of accessing upload and showing me the full contents of that container, I want to access upload/1799 and list all the blobs within that sub container.
I cannot see any way to add this sub container to my method and make this happen. Can anyone help?
Here is my code so far:
CarController.cs
using System.Collections.Generic;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
namespace MyProject.Controllers
{
public class HomeController : Controller
{
private readonly IConfiguration _configuration;
private readonly string accessKey = string.Empty;
public HomeController(IConfiguration configuration)
{
_configuration = configuration;
accessKey = _configuration.GetConnectionString("AzureStorage");
}
[HttpGet]
public IActionResult Edit(int id)
{
var car = _carService.GetVessel(id);
string strContainerName = "uploads";
string subdir = "1799";
var filelist = new List<BlobListViewModel>();
BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);
var blobs = containerClient.GetBlobs();
foreach (var item in blobs)
{
filelist.Add(new BlobListViewModel
{
FileName = item.Name
});
}
return View(filelist);
}
}
}
I've hunted through all the documentation and I can't find anything related to this.
Thanks to David Browne for his comment: there is indeed a prefix option in GetBlobs(). The parameters act like mini-filters in some ways, allowing you to define which properties are returned, the blob state, and so on. Here is my code, which passes the defaults for traits and states and the sub container name as the prefix.
[HttpGet]
public IActionResult Edit(int id)
{
var car = _carService.GetVessel(id);
string strContainerName = "uploads";
string subdir = "1799";
var filelist = new List<BlobListViewModel>();
BlobServiceClient blobServiceClient = new BlobServiceClient(accessKey);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(strContainerName);
// Traits, States, Prefix
var blobs = containerClient.GetBlobs(BlobTraits.None, BlobStates.None, subdir);
foreach (var item in blobs)
{
filelist.Add(new BlobListViewModel
{
FileName = item.Name
});
}
return View(filelist);
}

Tidier way of getting Directory Names only from Blob Container

I am trying to get only the names of the directories in a specific location within Blob Storage.
I have the helper class below
public static class BlobHelper
{
private static CloudBlobContainer _cloudBlobContainer;
private const string _containerName = "administrator";
public static void Setup(string connectionString)
{
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
_cloudBlobContainer = cloudBlobClient.GetContainerReference(_containerName);
}
public static List<string> GetDirectoryNames(string relativeAddress)
{
var result = new List<string>();
var directory = _cloudBlobContainer.GetDirectoryReference(relativeAddress);
var folders = directory.ListBlobs().OfType<CloudBlobDirectory>();
foreach (var folder in folders)
{
var name = folder.Uri.AbsolutePath;
name = name.Replace(folder.Parent.Prefix, string.Empty)
.Replace("/", string.Empty)
.Replace(_containerName, string.Empty);
result.Add(name);
}
return result;
}
}
The process to get the directory names only (i.e. not the full hierarchy) feels a bit hacky, although it does work
Is there a better way to do this?
I tried the approach below
var directory = _cloudBlobContainer.GetDirectoryReference(relativeAddress);
var blobs = directory.ListBlobs(true).OfType<CloudBlobDirectory>();
var blobNames = blobs.OfType<CloudBlockBlob>().Select(b => b.Name).ToList();
return blobNames;
The main difference with the above is the use of UseFlatBlobListing as true
However, this approach results in no folders at all being returned, whereas my other logic does at least give me the 2 folders I expect to find
Any ideas what I am doing wrong?
Cheers
Paul
I suppose your code is OK; I'm not sure what you mean by "a bit hacky". I think you want to get the directories directly.
Since there is no method that returns the directories directly, the known approaches with the v11 SDK mostly rely on parsing the blob URI.
Below is my way to do it.
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("test");
BlobContinuationToken blobContinuationToken = null;
var blobsSeg = cloudBlobContainer.ListBlobsSegmented(null, blobContinuationToken);
var directories = blobsSeg.Results.OfType<CloudBlobDirectory>().Select(b => b.Prefix).ToList();
foreach (string directory in directories) {
Console.WriteLine(directory);
}
The result it returns will look like the picture below. Hope this helps.
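For what it's worth, if you can move to the newer Azure.Storage.Blobs SDK, BlobContainerClient.GetBlobsByHierarchy returns the virtual directory prefixes directly, so no URI parsing is needed. A minimal sketch (the connection string and container name are placeholders):
using System;
using System.Linq;
using Azure.Storage.Blobs;

class DirectoryListing
{
    static void Main()
    {
        var container = new BlobContainerClient("UseDevelopmentStorage=true", "administrator");

        // With delimiter "/" the service returns a prefix item for each virtual directory
        // at this level instead of flattening everything into blobs.
        var directoryNames = container
            .GetBlobsByHierarchy(delimiter: "/")
            .Where(item => item.IsPrefix)
            .Select(item => item.Prefix.TrimEnd('/'))
            .ToList();

        directoryNames.ForEach(Console.WriteLine);
    }
}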

Can I set the access tier when I upload a blob? If yes, then how to do that?

I did not find any way to set the access tier of a blob when I upload it. I know I can set a blob's access tier after I upload it, but I want to know whether I can upload the blob and set its access tier in a single step, and whether there is any golang API to do that.
I googled it but have found nothing helpful so far.
Here is what I do now: upload the blob and then set its access tier.
// Here's how to upload a blob.
blobURL := containerURL.NewBlockBlobURL(fileName)
ctx := context.Background()
_, err = azblob.UploadBufferToBlockBlob(ctx, data, blobURL, azblob.UploadToBlockBlobOptions{})
handleErrors(err)
//set tier
_, err = blobURL.SetTier(ctx, azblob.AccessTierCool, azblob.LeaseAccessConditions{})
handleErrors(err)
But I want to upload a blob and set its tier in one step, not two steps as I do now.
The short answer is No. According to the official REST API reference, the operation you want maps to two REST APIs: Put Blob and Set Blob Tier. All SDK APIs for the different languages are implemented by wrapping the related REST APIs.
Only for page blobs can you set the x-ms-access-tier header on the upload request itself to achieve what you want.
For block blobs, the two steps are necessary and cannot be merged.
It is now possible using the new x-ms-access-tier header:
x-ms-access-tier
Here is an example that calls the REST API directly with Shared Key auth:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Mime;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;
namespace WhateverYourNameSpaceIs
{
class Program
{
private const string StorageKey = @"PutYourStorageKeyHere";
private const string StorageAccount = "PutYourStorageAccountHere";
private const string ContainerName = "PutYourContainerNameHere";
private const string Method = "PUT";
private const string ContentType = MediaTypeNames.Image.Jpeg;
private static readonly string BlobStorageTier = StorageTier.Cool;
private static readonly List<Tuple<string, string>> HttpContentHeaders = new List<Tuple<string, string>>()
{
new Tuple<string, string>("x-ms-access-tier", BlobStorageTier),
new Tuple<string, string>("x-ms-blob-type", "BlockBlob"),
new Tuple<string, string>("x-ms-date", DateTime.UtcNow.ToString("R")),
new Tuple<string, string>("x-ms-version", "2018-11-09"),
new Tuple<string, string>("Content-Type", ContentType),
};
static async Task Main()
{
await UploadBlobToAzure("DestinationFileNameWithoutPath", "LocalFileNameWithPath");
}
static async Task<int> UploadBlobToAzure(string blobName, string fileName)
{
int returnValue = (int)AzureCopyStatus.Unknown;
try
{
using var client = new HttpClient();
using var content = new ByteArrayContent(File.ReadAllBytes(fileName));
HttpContentHeaders.ForEach(x => content.Headers.Add(x.Item1, x.Item2));
var stringToSign = $"{Method}\n\n\n{content.Headers.ContentLength.Value}\n\n{ContentType}\n\n\n\n\n\n\n";
foreach (var httpContentHeader in HttpContentHeaders.Where(x => x.Item1 != "Content-Type").OrderBy(x => x.Item1))
stringToSign += $"{httpContentHeader.Item1.ToLower()}:{httpContentHeader.Item2}\n";
stringToSign += $"/{StorageAccount}/{ContainerName}/{blobName}";
HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(StorageKey));
string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("SharedKey", $"{StorageAccount}:{signature}");
var httpResponse = await client.PutAsync($"https://{StorageAccount}.blob.core.windows.net/{ContainerName}/{blobName}", content);
returnValue = (int)httpResponse.StatusCode;
}
catch (IOException ioException)
{
Console.WriteLine(ioException.ToString());
returnValue = (int)AzureCopyStatus.FileNotFound;
}
catch (Exception exception)
{
Console.WriteLine(exception.ToString());
returnValue = (int)AzureCopyStatus.Error;
}
return returnValue;
}
internal enum AzureCopyStatus
{
Unknown = -1,
Error = 0,
FileNotFound = 2
}
internal static class StorageTier
{
internal static string Cool = "Cool";
internal static string Hot = "Hot";
}
}
}
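For completeness, the newer Azure.Storage.Blobs .NET SDK sends the same x-ms-access-tier header for you when you pass an access tier in the upload options, so upload and tier become a single call. A minimal sketch (the connection string, container, blob, and file names are placeholders):
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class UploadWithTier
{
    static async Task Main()
    {
        var blob = new BlobClient("UseDevelopmentStorage=true", "mycontainer", "myblob.jpg");

        // AccessTier is sent as x-ms-access-tier on the Put Blob request,
        // so the blob lands in the Cool tier in a single round trip.
        await blob.UploadAsync("localfile.jpg", new BlobUploadOptions
        {
            AccessTier = AccessTier.Cool
        });
    }
}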

Why won't anything upload? - Uploading info blob to Azure with Xamarin

I have been trying for quite some time to get something to upload to the Azure blob container I have set up with a Xamarin.iOS project.
I am not sure why it isn't working, and I have explored many different options.
I get CS0426 whenever I try to refer to LimaBeans in my ViewController.
Stumped.
This is my View Controller
using System;
using System.IO;
using UIKit;
namespace storingbuttondataaaa
{
public partial class ViewController : UIViewController
{
protected ViewController(IntPtr handle) : base(handle)
{
// Note: this .ctor should not contain any initialization logic.
}
public override void ViewDidLoad()
{
base.ViewDidLoad();
// Perform any additional setup after loading the view, typically from a nib.
}
public override void DidReceiveMemoryWarning()
{
base.DidReceiveMemoryWarning();
// Release any cached data, images, etc that aren't in use.
}
partial void FractureBtn_TouchUpInside(UIButton sender)
{
//get text box data
var name = FractureBtn;
string call = ("call 911, especially if blood is spraying everywhere!");
string line = string.Format("{0},{1}", name, call);
//Store the Information.
var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
var filename = Path.Combine(documents, "HealthJournal.txt");
File.WriteAllText(filename, line);
}
partial void HandleAddClicked_TouchUpInside(UIButton sender)
{
activityIndicator.StartAnimating();
new BlobUpload.LimaBeans();
}
}
}
and this is my blob upload task:
using System;
using System.Diagnostics.Contracts;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
namespace storingbuttondataaaa
{
public class BlobUpload
{
public BlobUpload()
{
}
public static async Task LimaBeans(string localPath)
{
Contract.Ensures(Contract.Result<Task>() != null);
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=itsmypersonalkey");
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("submittedreqforbd");
// Create the container if it doesn't already exist.
await container.CreateIfNotExistsAsync();
// Retrieve reference to a blob named "myblob".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob");
//await blockBlob.UploadFromFileAsync(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments));
await blockBlob.UploadTextAsync("My Head Just Exploaded!");
}
}
}
What is wrong? I am on the verge of giving up!
No defining declaration found for implementing declaration of partial method ViewController.FractureBtn_TouchUpInside(UIButton)
We should add a defining declaration of this method in the other partial class, as below:
partial void FractureBtn_TouchUpInside(UIButton sender);
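As for the CS0426 error in the question: LimaBeans is a static method on BlobUpload, not a nested type, so it cannot be used with new. Call it (and observe the returned Task) instead; a sketch, where the local path argument is only illustrative since LimaBeans currently ignores it:
partial void HandleAddClicked_TouchUpInside(UIButton sender)
{
    activityIndicator.StartAnimating();
    // Invoke the static async method rather than trying to construct it with 'new'.
    var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
    var localPath = Path.Combine(documents, "HealthJournal.txt");
    _ = BlobUpload.LimaBeans(localPath);
}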
To upload a local file to Azure, we can do it as below:
/// <summary>
/// Upload file to Blob
/// </summary>
/// <param name="path">path of local file. For example: C:\Users\leel2\Desktop\333.txt</param>
/// <param name="blobName">For example: 333.txt</param>
/// <param name="containerName"></param>
public static void UploadBlob(string path, string blobName, string containerName)
{
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=xxxxxxxx");
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = cloudBlobClient.GetContainerReference(containerName);
container.CreateIfNotExists();
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
blob.UploadFromFile(path);
}
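A call site would look something like this (the local path is a placeholder; the container name is taken from the question):
UploadBlob(@"C:\Users\leel2\Desktop\333.txt", "333.txt", "submittedreqforbd");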

C# Azure storage from one blob copy file to another blob

How can I read a file to a stream from one blob and upload it to another blob? My requirement is to copy a file from one blob to another blob with a different file name, in C#.
The easiest way to achieve this is with the "Azure Storage Data Movement Library" (you can get it through its NuGet package).
This is a simple sample that performs the transfer:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.DataMovement;
using System;
namespace BlobClient
{
class Program
{
static void Main(string[] args)
{
const string storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=juanktest;AccountKey=loHQwke4lSEu1p2W3gg==";
const string container1 = "juankcontainer";
const string sourceBlobName = "test.txt";
const string destBlobName = "newTest.txt";
//Setup Account, blobclient and blobs
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference(container1);
blobContainer.CreateIfNotExists();
CloudBlockBlob sourceBlob = blobContainer.GetBlockBlobReference(sourceBlobName);
CloudBlockBlob destinationBlob = blobContainer.GetBlockBlobReference(destBlobName);
//Setup data transfer
TransferContext context = new TransferContext();
Progress<TransferProgress> progress = new Progress<TransferProgress>(
(transferProgress) => {
Console.WriteLine("Bytes uploaded: {0}", transferProgress.BytesTransferred);
});
context.ProgressHandler = progress;
// Start the transfer
try
{
// Block until the copy completes so the success message below is accurate.
TransferManager.CopyAsync(sourceBlob, destinationBlob,
false /* isServiceCopy */,
null /* options */, context).GetAwaiter().GetResult();
}
catch (Exception e)
{
Console.WriteLine("The transfer is cancelled: {0}", e.Message);
}
Console.WriteLine("CloudBlob {0} is copied to {1} ====successfully====",
sourceBlob.Uri.ToString(),
destinationBlob.Uri.ToString());
Console.ReadLine();
}
}
}
Note that "Azure Storage Data Movement Library" is very robust so you can track the transfer progress, cancel the operation or even suspend it to resume it later ;)
One of the easiest ways to copy files is with the AzCopy utility.
I would like to recommend another method, besides the ones above, to get this done: Azure Functions (a serverless compute service).
As a prerequisite, you should have some knowledge of Azure Functions and how to create and deploy them: 1. What is an Azure Function 2. Create an Azure Function app
The following code snippet is the simplest, most basic way to perform this action. (Here, when a user uploads a new file to the "demo" container, the function is triggered, reads the uploaded file from the demo container, and copies it to the "output" container.)
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
namespace Company.Function
{
public static class NamalFirstBlobTrigger
{
[FunctionName("NamalFirstBlobTrigger")]
public static void Run([BlobTrigger("demo/{name}", Connection = "AzureWebJobsStorage")]Stream myBlob,
[Blob("output/testing.cs", FileAccess.Write, Connection = "AzureWebJobsStorage")]Stream outputBlob,
string name,
ILogger log)
{
// Copy the triggering blob's contents into the fixed output blob "output/testing.cs".
myBlob.CopyTo(outputBlob);
}
}
}
