Azure Library for Lucene.net Error: File Not Found After write.lock Created

When executing the following code against an empty Azure container, I get a file-not-found error (segments.gen; The specified blob does not exist.).
AzureDirectory azureDirectory = new AzureDirectory(account, "audiobookindex"); // <-- audiobookindex is the name of the blob storage container on my Azure account
// Create the index writer
IndexWriter indexWriter = new IndexWriter(azureDirectory, new StandardAnalyzer(), true);
It seems to be failing in OpenInput inside the Azure Library for Lucene.net assembly. However, I don't understand why it's even calling that method; I would think it would just try to create the file.
Also, the assembly and code IS hitting the container because it creates a write.lock file that I can see in the container.
Any suggestions?

This should solve the problem. The examples out there were developed with older APIs and older framework versions. The solution above works fine for me; no need to interfere with the debugger ;)
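For what it's worth, here is a minimal sketch of how I initialize an empty index so the first open doesn't go looking for segments.gen. This assumes Lucene.Net 3.0.3 and a recent AzureDirectory build, so the exact overloads may differ on older packages:
// Minimal sketch, assuming Lucene.Net 3.0.3 and the AzureDirectory library;
// connectionString holds your storage account connection string.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
AzureDirectory azureDirectory = new AzureDirectory(account, "audiobookindex");

// create: true tells Lucene to build a fresh index rather than open an
// existing one, so an empty container is expected at this point.
using (var indexWriter = new IndexWriter(
    azureDirectory,
    new StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30),
    true, // create a new index
    IndexWriter.MaxFieldLength.UNLIMITED))
{
    indexWriter.Commit(); // writes the initial segments files to the container
}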

Blazor WASM Azure Static Web App, Functions not working

I created a simple Blazor WASM webapp using C# .NET5. It connects to some Functions which in turn get some data from a SQL Server database.
I followed the tutorial of BlazorTrain: https://www.youtube.com/watch?v=5QctDo9MWps
Locally using Azurite to emulate the Azure stuff it all works fine.
But after deployment using GitHub Action the webapp starts but then it needs to get some data using the Functions and that fails. Running the Function in Postman results in a 503: Function host is not running.
I'm not sure what more I need to configure. I can't find the logging from the Functions: I use the injected ILog, but I can't find the log messages in the Azure Portal.
In Azure portal I see my 3 GET functions, but no option to test or see the logging.
With the help of @Aravid I found my problem.
Because I locally needed to tell my client the URL of the API I added a configuration in Client\wwwroot\appsettings.Development.json.
Of course this file doesn't get deployed.
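For context, the file held just the API base address, something like this (the ApiAddress key matches what Program.cs reads below; the localhost port is only an example from my local Functions host):
{
  "ApiAddress": "http://localhost:7071/api/"
}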
After changing my code in Program.cs to:
var apiAddress = builder.Configuration["ApiAddress"] ?? $"{builder.HostEnvironment.BaseAddress}/api/";
builder.Services.AddHttpClient("Api", options =>
{
    options.BaseAddress = new Uri(apiAddress);
});
My client works again.
I also added my SqlServer connection string in the Application Settings of my Static Web App and the functions are working as well.
I hope somebody else will benefit from this. Took me several hours to figure it out ;)

File.Exists from UNC using Azure Storage/Fileshare via IIS results in false.

Problem:
Trying to get an image out of an Azure file share for manipulation: I need to read the file as a System.Drawing.Image. I cannot create a valid FileInfo object or Image using the UNC path (which I need to do in order to use it over IIS).
Current Setup:
Attach a virtual directory called Photos in the IIS website pointing to the UNC path of the Azure file share (e.g. \\myshare.file.core.windows.net\sharename\pathtoimages)
This works as http://example.com/photos/img.jpg so I know it is not a permissions or authentication issue.
For some reason, though, I cannot get a reference to the file.
var imgpath = Path.Combine(Server.MapPath("~/Photos"), "img.jpg");
// resolves as \\myshare.file.core.windows.net\sharename\pathtoimages\img.jpg
var fi = new FileInfo(imgpath);
if (fi.Exists) // this returns false 100% of the time
{
    var img = System.Drawing.Image.FromFile(fi.FullName);
}
The problem is that the file is never found to exist, even though I can take that same path, paste it into an Explorer window, and get img.jpg back 100% of the time.
Does anyone have any idea why this would not be working?
Do I need to be using CloudFileShare object to just get a read of a file I know is there?
It turns out the issue is that I needed to wrap my code in an impersonation of the Azure file share user ID, since the virtual directory is not really in play at all at this point.
using (new impersonation("UserName","azure","azure pass"))
{
//my IO.File code
}
I used this guy's impersonation script, found here: Can you explain why DirectoryInfo.GetFiles produces this IOException?
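In case that link goes stale, here is a minimal sketch of the same idea (the class name and constants are my own; the linked script adds more error handling). LOGON32_LOGON_NEW_CREDENTIALS makes Windows use the supplied credentials for remote access only, which is what a file share needs:
using System;
using System.Runtime.InteropServices;
using System.Security.Principal;

public sealed class Impersonation : IDisposable
{
    [DllImport("advapi32.dll", SetLastError = true)]
    private static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    [DllImport("kernel32.dll")]
    private static extern bool CloseHandle(IntPtr handle);

    private const int LOGON32_LOGON_NEW_CREDENTIALS = 9; // credentials are used for remote access only
    private const int LOGON32_PROVIDER_WINNT50 = 3;

    private readonly IntPtr _token;
    private readonly WindowsImpersonationContext _context;

    public Impersonation(string user, string domain, string password)
    {
        if (!LogonUser(user, domain, password,
                LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_WINNT50, out _token))
            throw new InvalidOperationException("LogonUser failed: " + Marshal.GetLastWin32Error());
        _context = new WindowsIdentity(_token).Impersonate();
    }

    public void Dispose()
    {
        _context.Undo();
        CloseHandle(_token);
    }
}
For an Azure file share, the user name is the storage account name and the password is the storage account key.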

If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?

I'm working on a personal project to manage users of my club, it's hosted on the free Azure package (for now at least), partly as an experiment to try out Azure. Part of creating their records is to add a photo, so I've got a Contact Card view that lets me see who they are, when they came and a photo.
I have installed ImageResizer and it's really easy to resize the 10MP photos from my camera and save them to the file system locally, but it seems that for Azure I need to use their Blobs to Upload Pictures to Windows Azure Web Sites, and that's new to me. The documentation on ImageResizer says that I need to use AzureReader2 in order to work with Azure blobs but it isn't free. It also says in their best practices #5 to
Use dynamic resizing instead of pre-resizing your images.
Which is not what I was thinking: I was going to resize to 300x300 and 75x75 (for the thumbnail) when creating the user's record. But if I should be storing full-size images as blobs and dynamically resizing on the way out, can I just use standard means to upload a blob into a container to save it to Azure, and then, when I want to display the images, use ImageResizer and pass it each image to resize as required? That way I would not need AzureReader2. Or have I misunderstood what it does / how it works?
Is there another way to consider?
I've not yet implemented cropping, but that's next to tackle once I've worked out how to actually store the images properly.
With some trepidation, I'm going to disagree with astaykov here. I believe you CAN use ImageResizer with Azure WITHOUT needing AzureReader2. Maybe I should qualify that by saying 'It works on my setup' :)
I'm using ImageResizer in an MVC 3 application. I have a standard Azure account with an images container.
Here's my test code for the view:
@using (Html.BeginForm("UploadPhoto", "BasicProfile", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
<input type="file" name="file" />
<input type="submit" value="OK" />
}
And here's the corresponding code in the Post Action method:
// This action handles the form POST and the upload
[HttpPost]
public ActionResult UploadPhoto(HttpPostedFileBase file)
{
    // Verify that the user selected a file
    if (file != null && file.ContentLength > 0)
    {
        string newGuid = Guid.NewGuid().ToString();
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        // Create the blob client.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        // Retrieve reference to a previously created container.
        CloudBlobContainer container = blobClient.GetContainerReference("images");
        // Retrieve reference to the blob we want to create.
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(newGuid + ".jpg");
        // Populate our blob with contents from the uploaded file.
        using (var ms = new MemoryStream())
        {
            ImageResizer.ImageJob i = new ImageResizer.ImageJob(file.InputStream,
                ms, new ImageResizer.ResizeSettings("width=800;height=600;format=jpg;mode=max"));
            i.Build();
            blockBlob.Properties.ContentType = "image/jpeg";
            ms.Seek(0, SeekOrigin.Begin);
            blockBlob.UploadFromStream(ms);
        }
    }
    // redirect back to the index action to show the form once again
    return RedirectToAction("UploadPhoto");
}
This is 'rough and ready' code to test the theory and could certainly stand improvement but, it does work both locally and when deployed on Azure. I can also view the images I've uploaded, which are correctly re-sized.
Hope this helps someone.
The answer to the concrete question:
If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?
is YES. As described in ImageResizer's documentation, that plugin is used to read, process, and serve images out of Blob storage. So there is no doubt: if you are going to use ImageResizer, AzureReader2 is the plugin you need to make things right. It will take care of blob uploads and serving.
Although I question the ImageResizer team's competency on Windows Azure, since they reference "Azure SDK v.2" while the most current version of the Azure SDK is 1.8. What they mean is the Azure Storage Client Library, which has versions 1.7 and 2.x; version 2.x is the recommended one and comes with Azure SDK 1.8. So do not search for an Azure SDK 2.0: install the latest one, which is 1.8. And by the way, use the NuGet Package Manager to install the Azure Storage Client Library v2.0.x.
You can also upload pre-resized versions to Azure. First upload the original image as a blob, say with the name /original/xxx.jpg; then create a resized copy and upload it with a name like /thumbnail/xxx.jpg. If you want to create the resized versions on the fly or on a separate thread, you may need to temporarily save the original to disk.
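A minimal sketch of that approach, reusing the storage objects from the answer above (the blob names and thumbnail size are just examples):
// Assumes the same blobClient, file and newGuid as in the upload action above.
CloudBlobContainer container = blobClient.GetContainerReference("images");

// 1. Upload the original bytes unchanged.
CloudBlockBlob original = container.GetBlockBlobReference("original/" + newGuid + ".jpg");
file.InputStream.Seek(0, SeekOrigin.Begin);
original.UploadFromStream(file.InputStream);

// 2. Build a thumbnail with ImageResizer and upload it as a second blob.
using (var ms = new MemoryStream())
{
    file.InputStream.Seek(0, SeekOrigin.Begin);
    new ImageResizer.ImageJob(file.InputStream, ms,
        new ImageResizer.ResizeSettings("width=75;height=75;format=jpg;mode=crop")).Build();
    ms.Seek(0, SeekOrigin.Begin);
    CloudBlockBlob thumb = container.GetBlockBlobReference("thumbnail/" + newGuid + ".jpg");
    thumb.Properties.ContentType = "image/jpeg";
    thumb.UploadFromStream(ms);
}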

Azure Blob storage Download

I am learning Azure. I have successfully uploaded and listed files in my containers. When I run the code below on my home PC everything works fine, no exceptions; however, when I run it on my work PC I catch an exception that states:
Blob data corrupted. Incorrect number of bytes received '12288' / '-1'
The file seems to download to my local drive just fine; I just cannot figure out why the exact same code behaves differently on two different PCs.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("My connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blockBlob = container.GetBlockBlobReference("ARCS.TXT");
using (var fileStream = System.IO.File.OpenWrite(@"c:\a\ARCS.txt"))
{
blockBlob.DownloadToStream(fileStream);
}
Maybe your organization's firewall is blocking a specific port. I have written a blog post which discusses similar port-related issues; please check whether it applies: http://nabaruns.blogspot.in/2012/11/common-port-related-troubleshoot-in.html
Regards,
Nabarun
Your code looks correct.
That is a weird issue. It's even weirder because the file gets downloaded properly even after the error. I would recommend you try Azure Storage Explorer on both of your machines.
If Azure Storage Explorer works fine on both machines, the next step would be to check the SDK version on each machine. There is a chance of such an error with an older version of the SDK.
You may also want to try a command-line downloader to troubleshoot your issue.
Note - Azure Storage Explorer and the command-line downloader are open source. If downloading through them works fine, you can grab their code and debug through it as well.
I'd recommend trying CloudBlob.DownloadToFile or CloudBlob.DownloadToStream instead of the CloudBlockBlob methods.
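For example, a minimal sketch with the older CloudBlob API from Storage Client Library 1.x (container and file names copied from the question):
// Assumes the same storageAccount as in the question.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlob blob = blobClient.GetBlobReference("mycontainer/ARCS.TXT");
blob.DownloadToFile(@"c:\a\ARCS.txt"); // writes straight to the local path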

Azure: Downloading from Blob Storage results in permissions error?

I’ve uploaded some files to Blob storage, and now I’m using the OnStart method to retrieve those files and run them. Right now I’m working locally.
Using the following code:
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder"))
{
blob.DownloadToStream(fileStream);
}
Results in an “Access to the path 'C:\testfolder' is denied.” error.
What do you think is causing this? And - will this be an issue once the project is actually pushed up to Azure? I can change permissions locally, but I'm hoping that once it's actually in a live worker role, it won't be an issue.
Any help would be awesome :)
Scratch that - it looks like the path passed to OpenWrite should specify the file name, not just the folder. I've changed it to C:\testfolder\test.txt and it works just fine :).
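For anyone hitting the same thing, the corrected call just points at a file instead of the folder:
// File.OpenWrite needs a full file path; a directory alone throws access denied.
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder\test.txt"))
{
    blob.DownloadToStream(fileStream);
}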
