Azure Easy Tables - Max size of a row? - azure

I want to ask whether Azure Easy Tables has a row size limit.
If yes, what is the limit?
Thanks for your answers!

The back end of Easy Tables in Azure is Azure SQL. A table in Azure SQL can normally contain a maximum of 8,060 bytes per row; LOB column types can hold up to 2 GB. For better performance, I suggest you stay well below 8,060 bytes. If you have large data to store, you could store a link to it instead.
Can I save the image to Azure Blob storage and the name and description to Azure Easy Tables?
Yes, you could save the image to Azure Blob storage directly. You could try the following code:
Code in app.config(storage connection string):
<appSettings>
<add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=×××;AccountKey=×××" />
</appSettings>
// Parse the connection string and create the blob client.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Get a reference to the container and the blob to create.
CloudBlobContainer container = blobClient.GetContainerReference("container001");
CloudBlockBlob blockBlob = container.GetBlockBlobReference("apple.jpg");

// Upload the local file to the blob.
using (var fileStream = System.IO.File.OpenRead(@"D:\apple.jpg"))
{
    blockBlob.UploadFromStream(fileStream);
}
In an Easy Table, if I convert the image to a base64 string and store that, I also get an error saying the request entity is too large. So you could store the blob image link in the Easy Table instead. You can copy the link in the portal via '...' > Blob properties > Url.
// Connect to the mobile app back end and get the Easy Table.
var client = new MobileServiceClient("https://[your mobile service name].azurewebsites.net");
IMobileServiceTable<SiteTable> todoTable = client.GetTable<SiteTable>();

// SiteTable is the model class; store the blob image link in it.
SiteTable siteTable = new SiteTable();
siteTable.MyImage = "blob image link";
await todoTable.InsertAsync(siteTable);
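For reference, the SiteTable model class is not shown above. A minimal sketch of what it might look like follows; only MyImage appears in the snippet, Name and Description come from the question, and everything else is an assumption that should match your table's columns:
// Hypothetical Easy Tables model class; adjust the properties to your table.
public class SiteTable
{
    public string Id { get; set; }          // Easy Tables use a string id column
    public string Name { get; set; }        // from the question: image name
    public string Description { get; set; } // from the question: image description
    public string MyImage { get; set; }     // stores the blob image link
}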

Related

Delete unflushed file from Azure Data Lake Gen 2

To upload a file to ADL via the REST API you first need to (a rough sketch of these calls follows the list):
1. make a PUT request with the ?resource=file parameter (this creates a file on the ADL)
2. append data to the file with the ?action=append&position=<N> parameters
3. lastly, flush the data with ?action=flush&position=<FILE_SIZE>
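In C# (using System.Net.Http, System.Net.Http.Headers and System.Text) the three calls look roughly like this; the account, filesystem, path and bearer token are placeholders, token acquisition is not shown, and real requests may need extra headers such as x-ms-version:
// Rough sketch of the create/append/flush REST calls against ADLS Gen2.
// HttpClient.PatchAsync requires .NET Core 2.1+ / .NET 5+.
string fileUrl = "https://<account>.dfs.core.windows.net/<filesystem>/folder/file.txt";
var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "<access token>");

// 1. PUT ?resource=file - creates the (still empty, uncommitted) file
await http.PutAsync($"{fileUrl}?resource=file", new ByteArrayContent(Array.Empty<byte>()));

// 2. PATCH ?action=append&position=0 - stages data at the given offset
byte[] data = Encoding.UTF8.GetBytes("hello world");
await http.PatchAsync($"{fileUrl}?action=append&position=0", new ByteArrayContent(data));

// 3. PATCH ?action=flush&position=<total length> - commits the staged data
await http.PatchAsync($"{fileUrl}?action=flush&position={data.Length}", new ByteArrayContent(Array.Empty<byte>()));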
My question is:
Is there a way to tell the server how long the data should live if it is not flushed(written).
Since you need to create a file first to write data into it, there might be scenarios where the flush does not happen, and you are stuck with an empty file in the data lake.
I could not find anything on the Microsoft documentation about this.
Any info would be appreciated.
Updated 0219:
If you just call the append API but do not call the flush API, the uncommitted data is kept in Azure for up to 7 days.
The uncommitted data will be deleted automatically after 7 days and cannot be deleted from your end.
Original:
The SDK for Azure Data Lake Storage Gen2 is ready, and you can use it to operate on ADLS Gen2 more easily than with the REST API.
If you're using .NET/C#, there is an SDK for Azure Data Lake Storage Gen2: Azure.Storage.Files.DataLake.
Here is the official doc for how to use this SDK with ADLS Gen2, and the C# code below deletes a file / uploads a file on ADLS Gen2:
static void Main(string[] args)
{
    string accountName = "xxx";
    string accountKey = "xxx";
    StorageSharedKeyCredential sharedKeyCredential =
        new StorageSharedKeyCredential(accountName, accountKey);

    string dfsUri = "https://" + accountName + ".dfs.core.windows.net";
    DataLakeServiceClient dataLakeServiceClient =
        new DataLakeServiceClient(new Uri(dfsUri), sharedKeyCredential);

    DataLakeFileSystemClient fileSystemClient = dataLakeServiceClient.GetFileSystemClient("w22");
    DataLakeDirectoryClient directoryClient = fileSystemClient.GetDirectoryClient("t2");

    // use this line of code to delete a file
    //directoryClient.DeleteFile("22.txt");

    // use the code below to upload a file
    //DataLakeFileClient fileClient = directoryClient.CreateFile("22.txt");
    //FileStream fileStream = File.OpenRead("d:\\foo2.txt");
    //long fileSize = fileStream.Length;
    //fileClient.Append(fileStream, offset: 0);
    //fileClient.Flush(position: fileSize);

    Console.WriteLine("**completed**");
    Console.ReadLine();
}
For Java, refer to this doc.
For Python, refer to this doc.

empty image from blob storage

This is how I try to upload an image to Azure blob storage, but it ends up as an empty file there.
I try to upload this image here:
CloudStorageAccount storageAccount = new CloudStorageAccount(new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(Accountname, KeyValue), true);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference(ContainerValue);
CloudBlockBlob blockBlob = container.GetBlockBlobReference(filename);
using (var f = System.IO.File.Open(model.FileToUpload.FileName, FileMode.Create))
{
    await blockBlob.UploadFromStreamAsync(f);
}
But if I try to open this image, it says this.
Translated from Danish to English: "We cannot open this file."
As Marco said, FileMode.Create specifies that the operating system should create a new file. If the file already exists, it will be overwritten.
For more details, you could refer to this article.
According to the pictures you provided, your blob size is 0 B, so you will never be able to open the file.
FileMode specifies how the operating system should open a file.
So I suggest you delete that blob and use OpenRead to open the existing local file for reading. You could refer to the code below:
using (var f = System.IO.File.OpenRead(model.FileToUpload.FileName))
{
    await blockBlob.UploadFromStreamAsync(f);
}

What is the best way to upload a large number of files to azure file storage?

I want File storage specifically, not Blob storage (I think). This is code for my Azure Function, and I just have a bunch of stuff in my node_modules folder.
What I would like to do is zip up the entire app, upload that single archive, and have Azure unpack it at a given folder. Is this possible?
Right now I'm essentially iterating over all of my files and calling:
var fileStream = new stream.Readable();
fileStream.push(myFileBuffer);
fileStream.push(null);
fileService.createFileFromStream('taskshare', 'taskdirectory', 'taskfile', fileStream, myFileBuffer.length, function(error, result, response) {
    if (!error) {
        // file uploaded
    }
});
And this works, it's just too slow. So I'm wondering if there is a faster way to upload a bunch of files for use in apps.
If the Microsoft Azure Storage Data Movement Library is acceptable, please give it a try. The Data Movement Library is designed for high-performance uploading, downloading and copying of Azure Storage blobs and files, and is based on the core data movement framework that powers AzCopy.
The demo code below comes from the GitHub documentation:
string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExists();
string sourcePath = "path\\to\\test.txt";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("myblob");

// Set up the number of concurrent operations
TransferManager.Configurations.ParallelOperations = 64;

// Set up the transfer context and track the upload progress
SingleTransferContext context = new SingleTransferContext();
context.ProgressHandler = new Progress<TransferStatus>((progress) =>
{
    Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
});

// Upload a local blob
var task = TransferManager.UploadAsync(
    sourcePath, destBlob, null, context, CancellationToken.None);
task.Wait();
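Since the question is about a large number of files, the same library can also transfer a whole directory in one call. The sketch below targets an Azure file share (reusing the taskshare/taskdirectory names from the question); the local path is a placeholder, and it assumes the CloudFileDirectory overload of UploadDirectoryAsync is available in your library version:
// Sketch: upload an entire local folder (recursively) to an Azure file share.
CloudStorageAccount account = CloudStorageAccount.Parse("myStorageConnectionString");
CloudFileClient fileClient = account.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("taskshare");
share.CreateIfNotExists();
CloudFileDirectory destDir = share.GetRootDirectoryReference().GetDirectoryReference("taskdirectory");
destDir.CreateIfNotExists();

// Raise the parallelism for many small files.
TransferManager.Configurations.ParallelOperations = 64;

// Recursive: true walks sub-folders such as node_modules.
UploadDirectoryOptions options = new UploadDirectoryOptions { Recursive = true };
DirectoryTransferContext dirContext = new DirectoryTransferContext();
dirContext.ProgressHandler = new Progress<TransferStatus>((progress) =>
{
    Console.WriteLine("Files uploaded: {0}", progress.NumberOfFilesTransferred);
});

var uploadTask = TransferManager.UploadDirectoryAsync(
    "path\\to\\localFolder", destDir, options, dirContext, CancellationToken.None);
uploadTask.Wait();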

Move a File from Azure File Storage to Azure Blob Storage

I have rather foolishly uploaded a VHD to Azure File storage, thinking I could create a virtual machine from it, only to find out it really needs to be in Blob storage.
I know I can just upload it again - but it is very large and my upload speed is very slow.
My question is - can I move a file from file storage to blob storage without downloading/uploading again? I.e. is there anything in the Azure portal UI to do it, or even a PowerShell command?
You can try AzCopy:
AzCopy.exe /Source:<URL to source container> /Dest:<URL to dest container> /SourceKey:<key1> /DestKey:<key2> /S
When copying from File storage to Blob storage, the default blob type is block blob; the user can specify the option /BlobType:page to change the destination blob type (example below).
AzCopy by default copies data between two storage endpoints asynchronously. The copy operation therefore runs in the background using spare bandwidth capacity, with no SLA on how fast a blob will be copied, and AzCopy periodically checks the copy status until the copy completes or fails. The /SyncCopy option ensures the copy operation runs at a consistent speed.
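For example, a File-to-Blob copy that produces a page blob at a consistent speed might look like this (the URLs and keys are placeholders; check the syntax of your AzCopy version):
AzCopy.exe /Source:https://<account>.file.core.windows.net/<share> /Dest:https://<account>.blob.core.windows.net/<container> /SourceKey:<key1> /DestKey:<key2> /S /BlobType:page /SyncCopy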
In C#:
public static CloudFile GetFileReference(CloudFileDirectory parent, string path)
{
    var filename = Path.GetFileName(path);
    var fullPath = Path.GetDirectoryName(path);
    if (fullPath == string.Empty)
    {
        return parent.GetFileReference(filename);
    }
    var dirReference = GetDirectoryReference(parent, fullPath);
    return dirReference.GetFileReference(filename);
}
public static CloudFileDirectory GetDirectoryReference(CloudFileDirectory parent, string path)
{
    if (path.Contains(@"\"))
    {
        var paths = path.Split('\\');
        return GetDirectoryReference(parent.GetDirectoryReference(paths.First()), string.Join(@"\", paths.Skip(1)));
    }
    else
    {
        return parent.GetDirectoryReference(path);
    }
}
The code to copy:
// Source File Storage
string azureStorageAccountName = "shareName";
string azureStorageAccountKey = "XXXXX";
string name = "midrive";
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(azureStorageAccountName, azureStorageAccountKey), true);
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare fileShare = fileClient.GetShareReference(name);
CloudFileDirectory directorio = fileShare.GetRootDirectoryReference();
CloudFile cloudFile = GetFileReference(directorio, "SourceFolder\\fileName.pdf");
// Destination Blob
string destAzureStorageAccountName = "xx";
string destAzureStorageAccountKey = "xxxx";
CloudStorageAccount destStorageAccount = new CloudStorageAccount(new StorageCredentials(destAzureStorageAccountName, destAzureStorageAccountKey), true);
CloudBlobClient destClient = destStorageAccount.CreateCloudBlobClient();
CloudBlobContainer destContainer = destClient.GetContainerReference("containerName");
CloudBlockBlob destBlob = destContainer.GetBlockBlobReference("fileName.pdf");
// copy
await TransferManager.CopyAsync(cloudFile, destBlob, true);
Another option is to use Azure CLI...
az storage copy -s /path/to/file.txt -d https://[account].blob.core.windows.net/[container]/[path/to/blob]
More info here: az storage copy
Thanks to Gaurav Mantri for pointing me in the direction of AzCopy.
This does allow me to copy between file and blob storage using the command:
AzCopy.exe /Source:<URL to source container> /Dest:<URL to dest container> /SourceKey:<key1> /DestKey:<key2> /S
However, as Gaurav also rightly points out in the comment, the resulting blob will be of type Block Blob, and this is no good for me. I need one of type Page Blob in order to create a VM from it using https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-specialized-vhd
There is no way to change the blob type as far as I can see once it is up there in the cloud, so it looks like my only option is to wait for a lengthy upload again.
