Uploading image to blob container results in zero bytes - Azure

I am trying to upload an image to a blob container from ASP.NET Core, but the image is zero bytes after it is uploaded to the blob container. I searched Google to resolve the issue and set Position = 0, but the issue still exists. Can anyone tell me where I am going wrong?
Here is the code I am using:
public async Task AddFace(IFormFile file)
{
    string FileExtension = string.Empty;
    if (file.Length > 0)
    {
        FileExtension = Path.GetExtension(file.FileName);
        if (string.IsNullOrEmpty(FileExtension))
        {
            FileExtension = ".jpg";
        }

        using (Stream stream = file.OpenReadStream())
        {
            stream.Seek(0, SeekOrigin.Begin);

            // UPLOAD TO BLOB CONTAINER
            CloudStorageAccount account = CloudStorageAccount.Parse(ConnectionString);
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.GetContainerReference(configuration[ConfigKeys.BlobStorageConfiguration.BlobContainer]);
            CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(file.FileName + FileExtension);
            await blockBlob.UploadFromStreamAsync(stream);
        }
    }
}
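A common workaround worth trying (this is only a sketch, not a confirmed fix; it assumes the same ConnectionString field and configuration lookup as the code above) is to buffer the IFormFile into a MemoryStream first, so the blob client always receives a seekable stream that is fully populated and positioned at 0 before the upload starts:

public async Task AddFace(IFormFile file)
{
    if (file == null || file.Length == 0)
    {
        return;
    }

    string fileExtension = Path.GetExtension(file.FileName);
    if (string.IsNullOrEmpty(fileExtension))
    {
        fileExtension = ".jpg";
    }

    // Copy the uploaded file into memory first so the upload always
    // starts from position 0 of a seekable, fully written stream.
    using (var memoryStream = new MemoryStream())
    {
        await file.CopyToAsync(memoryStream);
        memoryStream.Position = 0;

        CloudStorageAccount account = CloudStorageAccount.Parse(ConnectionString);
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer blobContainer = blobClient.GetContainerReference(configuration[ConfigKeys.BlobStorageConfiguration.BlobContainer]);
        CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(file.FileName + fileExtension);
        await blockBlob.UploadFromStreamAsync(memoryStream);
    }
}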

Related

Download zip file from Azure Storage in C#

I am looping through the file names from my database; the same files exist in Azure Storage. I am zipping those n files, downloading them from Azure Storage, and saving the zipped file to my local storage. When I extract the archive and try to open a file, it says the file is damaged/corrupt.
public ActionResult Download(string productid, string YearActiveid)
{
    HomePageModel homepagemodel = new HomePageModel();
    homepagemodel.ProdHeaderDetail = GetProductHeaderDetail(productid, YearActiveid);
    homepagemodel.PriorYearsActive = GetPriorYearActive(productid, YearActiveid);

    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>;EndpointSuffix=core.windows.net");
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("product");
    var blobFileNames = new string[] { "file1.png", "file2.png", "file3.png", "file4.png" };

    var outputMemStream = new MemoryStream();
    var zipOutputStream = new ZipOutputStream(outputMemStream);

    foreach (var ProdHeaderDetail in homepagemodel.ProdHeaderDetail)
    {
        zipOutputStream.SetLevel(5);
        var blob = cloudBlobContainer.GetBlockBlobReference(ProdHeaderDetail.FileName);
        var entry = new ZipEntry(ProdHeaderDetail.FileName);
        zipOutputStream.PutNextEntry(entry);
        blob.DownloadToStreamAsync(zipOutputStream);
    }

    zipOutputStream.Finish();
    //zipOutputStream.Close();
    //zipOutputStream.CloseEntry();
    zipOutputStream.IsStreamOwner = false;
    outputMemStream.Position = 0;
    return File(outputMemStream, "application/zip", "filename.zip");
}
I resolved the issue by making the action async and awaiting the download:

public async Task<ActionResult> Download(string productid, string YearActiveid)

    await blob.DownloadToStreamAsync(zipOutputStream);
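For reference, this is roughly what the corrected action could look like once the signature is async and the download is awaited. It is only a sketch that reuses the helpers, container name, and placeholder connection string from the snippet above; without the await, the action can rewind and return outputMemStream before the blobs have finished streaming into the zip, which is what produces a corrupt archive:

public async Task<ActionResult> Download(string productid, string YearActiveid)
{
    HomePageModel homepagemodel = new HomePageModel();
    homepagemodel.ProdHeaderDetail = GetProductHeaderDetail(productid, YearActiveid);
    homepagemodel.PriorYearsActive = GetPriorYearActive(productid, YearActiveid);

    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>;EndpointSuffix=core.windows.net");
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("product");

    var outputMemStream = new MemoryStream();
    var zipOutputStream = new ZipOutputStream(outputMemStream);
    zipOutputStream.SetLevel(5);

    foreach (var ProdHeaderDetail in homepagemodel.ProdHeaderDetail)
    {
        var blob = cloudBlobContainer.GetBlockBlobReference(ProdHeaderDetail.FileName);
        var entry = new ZipEntry(ProdHeaderDetail.FileName);
        zipOutputStream.PutNextEntry(entry);

        // Awaiting here guarantees each blob is fully written into its
        // zip entry before the next entry is started.
        await blob.DownloadToStreamAsync(zipOutputStream);
    }

    zipOutputStream.Finish();
    zipOutputStream.IsStreamOwner = false;
    outputMemStream.Position = 0;
    return File(outputMemStream, "application/zip", "filename.zip");
}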

Zip all files in an Azure blob container

I am trying to use an Azure Function to zip all files inside a blob container using System.IO.Compression.
I can list all files inside the container using the CloudBlob code below:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConn);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("<container>");
BlobContinuationToken blobToken = null;
var blobs = await container.ListBlobsSegmentedAsync(blobToken);
var fileList = new List<string>();
var blobpath1 = @"https://<pathtocontainer>/test.zip";

foreach (var blbitem in blobs.Results)
{
    if (blbitem is CloudBlockBlob)
    {
        var blobFileName = blbitem.Uri.Segments.Last().Replace("%20", " ");
        var blobFilePath = blbitem.Uri.AbsolutePath.Replace(blbitem.Container.Uri.AbsolutePath + "/", "").Replace("%20", " ");
        var blobPath = blobFilePath.Replace("/" + blobFileName, "");
        log.LogInformation("blob path : " + blbitem.Uri.ToString());
        fileList.Add(blbitem.Uri.ToString());

        string rootpath = @"D:\home\site\wwwroot\ZipandSendFile\temp\";
        string path = rootpath + blobPath;
        log.LogInformation("saving in " + path);

        //Add to zip
        /*
        CloudBlobContainer container = cloudBlobClient.GetContainerReference("<container>");
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        using (FileStream fs = new FileStream(rootpath, FileMode.Create))
        {
            blob.DownloadToStream(fs);
        }
        */
    }
}
After getting each file's details from the blob container, I am trying to add them into a zip archive using the System.IO.Compression package. My attempt to add the downloaded files is below:
public static void AddFilesToZip(string zipPath, string[] files, ILogger log)
{
    if (files == null || files.Length == 0)
    {
        return;
    }

    log.LogInformation("Executing add files to zip");
    log.LogInformation(zipPath);

    using (var zipArchive = ZipFile.Open(zipPath, ZipArchiveMode.Update))
    {
        log.LogInformation("in Zip archive");
        foreach (var file in files)
        {
            var fileInfo = new FileInfo(file);
            log.LogInformation(fileInfo.FullName);
            zipArchive.CreateEntryFromFile(fileInfo.FullName, fileInfo.Name);
        }
    }
}
But I am getting access denied. Any pointers on this?
I resolved the issue by logging into the Kudu cmd shell, changing into the directory, and changing the file attribute with attrib +A.
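For completeness, here is a rough sketch (not from the original post) of how the two pieces above could be wired together: download each listed blob to a local temp file, then hand the local copies to AddFilesToZip. It assumes the function can write under the D:\home\site\wwwroot\ZipandSendFile\temp\ path used above, and reuses the container, blobs, and log variables from the listing snippet:

// Hypothetical glue code: download each block blob into the local temp
// folder, then add the local copies to the zip with AddFilesToZip.
string localRoot = @"D:\home\site\wwwroot\ZipandSendFile\temp\";
Directory.CreateDirectory(localRoot); // make sure the temp folder exists
string zipPath = Path.Combine(localRoot, "test.zip");
var localFiles = new List<string>();

foreach (var blbitem in blobs.Results)
{
    if (blbitem is CloudBlockBlob blockBlobItem)
    {
        // Flatten any virtual folder structure to a plain file name.
        string localPath = Path.Combine(localRoot, Path.GetFileName(blockBlobItem.Name));
        using (FileStream fs = new FileStream(localPath, FileMode.Create))
        {
            await blockBlobItem.DownloadToStreamAsync(fs); // copy blob contents to disk
        }
        localFiles.Add(localPath);
    }
}

AddFilesToZip(zipPath, localFiles.ToArray(), log);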

Download CSV from Azure Blob Storage to Browser

I can successfully upload to Azure Blob Storage, but I'm having issues downloading files (csv and pdf files).
My goal is for the file to download to the browser (since this will be a web app and I will not know the local path to download the file to).
string connString = ConfigurationManager.ConnectionStrings["MyTestStorageAccount"].ConnectionString;
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connString);
CloudBlobClient _blobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer _cloudBlobContainer = _blobClient.GetContainerReference("filestorage");
CloudBlockBlob _blockBlob = _cloudBlobContainer.GetBlockBlobReference("testfile.csv");
Response.AddHeader("Content-Disposition", "attachment; filename=" + "testfile.csv");
_blockBlob.DownloadToStream(Response.OutputStream);
I followed your code in MVC and downloaded a CSV on my site; when I opened it, the content inside was not what I had added but some HTML template. If this is your problem, you could refer to the following code:
public ActionResult Download()
{
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
    CloudBlobClient _blobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer _cloudBlobContainer = _blobClient.GetContainerReference("data");
    CloudBlockBlob _blockBlob = _cloudBlobContainer.GetBlockBlobReference("table.csv");
    Response.AddHeader("Content-Disposition", "attachment; filename=" + "table.csv");

    MemoryStream ms = new MemoryStream();
    _blockBlob.DownloadToStream(ms);
    ms.Position = 0;
    return File(ms, "application/octet-stream", "table.csv");
}
Alternatively, you could return Redirect(blobUrl). If your blob is private, you need to create a Shared Access Signature with Read permission and the Content-Disposition header set, build the blob URL from it, and use that URL. In this case, the blob contents will be streamed directly from storage to the client browser.
For more details, refer to the following code:
public ActionResult Download()
{
    CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials("accountname", "accountkey"), true);
    var blobClient = account.CreateCloudBlobClient();
    var container = blobClient.GetContainerReference("container-name");
    var blob = container.GetBlockBlobReference("file-name");
    var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10), // assuming the blob can be downloaded in 10 minutes
    }, new SharedAccessBlobHeaders()
    {
        ContentDisposition = "attachment; filename=file-name"
    });
    var blobUrl = string.Format("{0}{1}", blob.Uri, sasToken);
    return Redirect(blobUrl);
}
I have tested your code and it worked fine. You could tell us more about your problem, such as the error message and more detailed requirements.
I created a generic handler (.ashx) to test it, and here is my tested code for your reference:
<%@ WebHandler Language="C#" Class="DownloadHandler" %>

using System.Web;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=your_account;AccountKey=your_key;EndpointSuffix=core.windows.net");
        CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = cloudBlobClient.GetContainerReference("mycontainer");
        container.CreateIfNotExists();
        CloudBlockBlob blob = container.GetBlockBlobReference("4.PNG");

        context.Response.AddHeader("Content-Disposition", "attachment; filename=" + "4.PNG");
        blob.DownloadToStream(context.Response.OutputStream);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
Screenshot of the result when I access this handler via IE 11:

Move files between an Azure file share and blob storage

I have to move some files between a share and a blob on the same storage account.
After some googling I ended up with this code:
CloudFileClient fileClient = account.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("shareName");
CloudFileDirectory rootDir = share.GetRootDirectoryReference();
CloudFileDirectory videoDirectory = rootDir.GetDirectoryReference(video.StoragePath);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(video.StoragePath);
container.CreateIfNotExists();

foreach (var Files in videoDirectory.ListFilesAndDirectories())
{
    var arr = Files.Uri.ToString().Split('/');
    string strFileName = arr[arr.Length - 1];
    CloudFile sourceFile = videoDirectory.GetFileReference(strFileName);
    string fileSas = sourceFile.GetSharedAccessSignature(new SharedAccessFilePolicy()
    {
        Permissions = SharedAccessFilePermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    Uri fileSasUri = new Uri(sourceFile.StorageUri.PrimaryUri.ToString() + fileSas);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(strFileName);
    blockBlob.StartCopyAsync(fileSasUri).Wait(); // copy the file to blob storage and wait for the operation to complete
    //sourceFile.DeleteAsync(); // delete the file
}
//videoDirectory.DeleteAsync(); // delete the directory
If the delete lines are uncommented, the destination contains all the files from the source folder but with a size of 0.
Any ideas what I am doing wrong?
I also want to delete the share directory after all the files are copied. Is there a way to check whether the files have arrived in the destination container?
As far as I know, the cloud blob StartCopyAsync method tells the server side to start copying the blob from the file storage, but it does not wait for the copy operation to complete.
So if you want to delete the file only after the copy has fully completed, you should use the FetchAttributes method to get the blob's copy status.
If the status is Success, you can delete the file.
For more details, refer to the following code:
CloudStorageAccount account = CloudStorageAccount.Parse("connection string");
CloudFileClient fileClient = account.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("sharetest");
CloudFileDirectory rootDir = share.GetRootDirectoryReference();
CloudFileDirectory videoDirectory = rootDir.GetDirectoryReference("TestDirectory");
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("testdirectory");
container.CreateIfNotExists();

foreach (var Files in videoDirectory.ListFilesAndDirectories())
{
    var arr = Files.Uri.ToString().Split('/');
    string strFileName = arr[arr.Length - 1];
    CloudFile sourceFile = videoDirectory.GetFileReference(strFileName);
    string fileSas = sourceFile.GetSharedAccessSignature(new SharedAccessFilePolicy()
    {
        Permissions = SharedAccessFilePermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
    });
    Uri fileSasUri = new Uri(sourceFile.StorageUri.PrimaryUri.ToString() + fileSas);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(strFileName);
    blockBlob.StartCopyAsync(fileSasUri).Wait();

    // Poll the copy status until the server-side copy has finished.
    blockBlob.FetchAttributes();
    while (blockBlob.CopyState.Status == CopyStatus.Pending)
    {
        Thread.Sleep(50);
        blockBlob.FetchAttributes();
    }

    // Only delete the source file once the copy has succeeded.
    if (blockBlob.CopyState.Status == CopyStatus.Success)
    {
        sourceFile.DeleteAsync();
    }
}

Insert image into dynamically created PDF in Azure Blob Storage

I created a dynamic PDF in Azure Blob Storage. All the content is written perfectly. Now I want to add an image from one of the blobs in storage at the top of the PDF content. My code to create the PDF is below:
// Parse the connection string and return a reference to the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("connectionstring"));
// Create the blob client object.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Get a reference to a container to use for the sample code, and create it if it does not exist.
CloudBlobContainer container = blobClient.GetContainerReference("containername");
container.CreateIfNotExists();

MemoryStream ms1 = new MemoryStream();
ms1.Position = 0;
using (ms1)
{
    var doc1 = new iTextSharp.text.Document();
    PdfWriter writer = PdfWriter.GetInstance(doc1, ms1);
    doc1.Open();
    doc1.Add(new Paragraph("itextSharp DLL"));
    doc1.Close();

    var byteArray = ms1.ToArray();
    var blobName = "testingCREATEPDF.pdf";
    var blob = container.GetBlockBlobReference(blobName);
    blob.Properties.ContentType = "application/pdf";
    blob.UploadFromByteArray(byteArray, 0, byteArray.Length);
}
Please let me know how to add an image at the top of the PDF from an Azure blob.
Try the code below. Basically the trick is to read the blob's data as a stream and create an iTextSharp.text.Image from that stream. Once you have that object, you can insert it in your PDF document.
private static void CreatePdf()
{
    var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
    var blobClient = account.CreateCloudBlobClient();
    var container = blobClient.GetContainerReference("pdftest");
    container.CreateIfNotExists();
    var imagesContainer = blobClient.GetContainerReference("images");
    var imageName = "Storage-blob.png";

    using (MemoryStream ms = new MemoryStream())
    {
        var doc = new iTextSharp.text.Document();
        PdfWriter writer = PdfWriter.GetInstance(doc, ms);
        doc.Open();
        doc.Add(new Paragraph("Hello World"));

        var imageBlockBlob = imagesContainer.GetBlockBlobReference(imageName);
        using (var stream = new MemoryStream())
        {
            imageBlockBlob.DownloadToStream(stream); // Read the blob contents into the stream.
            stream.Position = 0;                     // Reset the stream's position to the top.
            Image img = Image.GetInstance(stream);   // Create an instance of iTextSharp.text.Image.
            doc.Add(img);                            // Add that image to the document.
        }
        doc.Close();

        var byteArray = ms.ToArray();
        var blobName = (DateTime.MaxValue.Ticks - DateTime.Now.Ticks).ToString() + ".pdf";
        var blob = container.GetBlockBlobReference(blobName);
        blob.Properties.ContentType = "application/pdf";
        blob.UploadFromByteArray(byteArray, 0, byteArray.Length);
    }
}
