Azure Storage Emulator's queues not loading

I'm new to Azure Storage and I've been having a problem creating and working with queues locally.
So, first of all, I wasn't able to make the Azure Storage Emulator run, so I changed the config to look like this:
<StorageEmulatorConfig>
  <services>
    <service name="Blob" url="http://localhost:10000/"/>
    <service name="Queue" url="http://localhost:10001/"/>
    <service name="Table" url="http://localhost:10002/"/>
  </services>
</StorageEmulatorConfig>
(Originally it had the IP 127.0.0.1 instead of localhost.) That change made the Storage Emulator run correctly, but I still only get an endless loading icon when trying to expand the Queues node (and the Blob Containers and Tables nodes too, by the way).
So far I've tried:
1. Creating a new Azure Storage connection with the same connection string, but replacing the default IP with localhost.
2. Changing the port numbers in the config file.
3. Restarting everything.
4. Reinstalling Azure Storage Explorer and the Storage Emulator.
Have you ever encountered this type of error, or do you have any idea what I could try to fix it?
Edit: I have also deleted the .IdentityService folder from my machine. No luck.
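For reference, an explicit emulator connection string that targets localhost (using the publicly documented development storage account name and key) looks like this, shown wrapped here but entered as a single line; whether this sidesteps the loading issue will depend on your Storage Explorer version:
DefaultEndpointsProtocol=http;
AccountName=devstoreaccount1;
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;
BlobEndpoint=http://localhost:10000/devstoreaccount1;
QueueEndpoint=http://localhost:10001/devstoreaccount1;
TableEndpoint=http://localhost:10002/devstoreaccount1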

Related

Physical path in Azure WebJobs

I am trying to deploy a WebJob in Azure. It is an existing console app that copies a file using SFTP.
The app setting in the app.config file is like below:
<add key="IdentityFile" value="C:\Temp\Uploads\UploadTest.ppk"/>
I know I have to set this in the AppSettings section of the App Service, but I am not sure what the equivalent of "C:\Temp" is in Azure.
Could someone please guide me? Thank you.
In Azure App Service, you can still rely on %TMP% to find the temp folder. The folder it goes to is D:\local\Temp, so you could change your config to D:\local\Temp\Uploads\UploadTest.ppk.
Note that you can change this by setting an IdentityFile App Setting in the Azure Portal, instead of changing it in the physical app.config file.
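As a minimal sketch of resolving that path in the WebJob itself (assuming the key is read with ConfigurationManager; for .NET WebJobs, an "IdentityFile" App Setting set in the portal overrides the app.config value):
using System.Configuration;
using System.IO;

// Portal App Settings override app.config, so this picks up either source.
string identityFile = ConfigurationManager.AppSettings["IdentityFile"];
// Alternatively, build the path from the temp folder: %TMP% resolves to
// D:\local\Temp in App Service (and to the normal temp folder locally).
string tempPath = Path.Combine(Path.GetTempPath(), "Uploads", "UploadTest.ppk");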

Debugging NServiceBus ServiceControl Heartbeat plugin

I have an NServiceBus endpoint running on an Azure worker role. I installed the package ServiceControl.Plugin.Nsb5.Heartbeat. When I deploy directly from VS to the cloud service, my endpoint shows up in ServicePulse and I get my heartbeats as expected.
When I go through our automated deployment process, the endpoint isn't detected by ServicePulse, and I don't get any heartbeats. (Even if you don't have the heartbeat plugin installed, ServicePulse still detects the endpoint and tells you that the endpoint does not have the heartbeat plugin installed.)
When I log in through Remote Desktop, I can see the heartbeat assembly in the approot. My config is the same for both scenarios, but I'll add it here for reference:
In my config appsettings:
<add key="Heartbeat/Interval" value="00:00:01" />
<add key="ServiceControl/Queue" value="xxx.xxx.servicecontrol" />
Rest of my config:
<section name="MessageForwardingInCaseOfFaultConfig" type="NServiceBus.Config.MessageForwardingInCaseOfFaultConfig, NServiceBus.Core" />
<MessageForwardingInCaseOfFaultConfig ErrorQueue="error" />
My ServiceControl instance is running on my local computer and monitoring the correct service bus. The error queue name is set to error, just like in the config, and the error forwarding queue name is set to error.log.
When the worker role starts and NServiceBus is started, I can find this in the logs (which, by the way, is exactly the same as what I can find in the worker role that is sending out heartbeats):
Name: Heartbeats
Version: 2.0.0
Enabled by Default: Yes
Status: Enabled
Dependencies: None
Startup Tasks: HeartbeatStartup
I have absolutely no clue why the same code is behaving differently. It's the same code, the same config, the same setup, just deployed differently. When comparing deployed assemblies, I can't detect a difference. The heartbeat assembly is there, and it looks like NSB is picking it up as well. I'm just not receiving any heartbeats from that particular endpoint.
Any idea on what I could be missing? Or what I could try to fix this?
Thanks in advance!
It turns out that both endpoints are sending heartbeats, but ServicePulse shows them as one endpoint.
In ServicePulse I could see one endpoint: Endpoint#MachineA.
MachineA was the actual machine name of the worker role instance of my CloudService "Test". I could log in to this instance through Remote Desktop and see NServiceBus's log activating the heartbeat functionality.
When I deployed through our automated deployment to CloudService "Dev", I got no additional endpoint in ServicePulse, so I decided to delete CloudService "Test" completely.
When I checked ServicePulse, the endpoint Endpoint#MachineA was still up and receiving heartbeats every second. I couldn't figure out why, since I had just deleted CloudService "Test" with that particular instance.
I decided to rename the endpoint and deploy through our automated procedure to CloudService "Dev" (so CloudService "Test" no longer existed). At that moment I saw the endpoint Endpoint#MachineA go down and a new EndpointRenamed#MachineX come up, receiving heartbeat messages.
So it was a non-issue in the sense that both my endpoints were sending out heartbeats. The problem lies in the fact that ServicePulse somehow considered them to be the same endpoint. They did have the same name, but they were hosted in different cloud services on different machines, which should translate into separate endpoints in ServicePulse.
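For anyone working around this, a minimal sketch of giving each deployment a distinct endpoint name with the NServiceBus 5 API (the "Endpoint.Dev" name is purely illustrative):
var busConfiguration = new BusConfiguration();
// A distinct name per cloud service stops ServicePulse from merging
// two deployments into one logical endpoint.
busConfiguration.EndpointName("Endpoint.Dev");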
Hope that helps someone!

Get publication path when publishing Azure

I followed this thread to create a virtual directory in Azure Cloud Services: Windows Azure creating virtual directory to local storage.
It works fine, but I'm not able to get "localResourcePath" with the path where Azure placed the files.
Where do I have to set "MyResource"?
Thanks in advance.
"MyResource" is a LocalStorage resource you define in your CSDEF file - http://msdn.microsoft.com/en-us/library/azure/ee758711.aspx.
You would add the following to your CSDEF (changing sizeInMB as appropriate):
<LocalResources>
<LocalStorage name="MyResource" cleanOnRoleRecycle="false" sizeInMB="1000" />
</LocalResources>
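At runtime you then read the path back (a minimal sketch, assuming a reference to Microsoft.WindowsAzure.ServiceRuntime):
// Look up the local storage defined in the CSDEF and get its on-disk path.
LocalResource myResource = RoleEnvironment.GetLocalResource("MyResource");
string localResourcePath = myResource.RootPath;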

Enable Diagnostics in Windows Azure without .NET

I'm using Windows Azure to host my Python project and I'm trying to enable diagnostics, without good results.
As I'm using Python and not .NET, the only way I can actually configure it is through the config files.
Below are my config files:
ServiceDefinition.csdef
...
<Imports>
<Import moduleName="Diagnostics" />
</Imports>
...
ServiceConfiguration.Cloud.cscfg
....
<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=<my-account-name>;AccountKey=<my-account-key>"/>
....
diagnostics.wadcfg:
<DiagnosticMonitorConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
configurationChangePollInterval="PT10M"
overallQuotaInMB="1200">
<DiagnosticInfrastructureLogs bufferQuotaInMB="100"
scheduledTransferLogLevelFilter="Warning"
scheduledTransferPeriod="PT5M" />
<Logs bufferQuotaInMB="200"
scheduledTransferLogLevelFilter="Warning"
scheduledTransferPeriod="PT5M" />
<Directories bufferQuotaInMB="600"
scheduledTransferPeriod="PT5M">
<CrashDumps container="wad-crash-dumps" directoryQuotaInMB="200" />
<FailedRequestLogs container="wad-frq" directoryQuotaInMB="200" />
<IISLogs container="wad-iis" directoryQuotaInMB="200" />
</Directories>
<WindowsEventLog bufferQuotaInMB="200"
scheduledTransferLogLevelFilter="Warning"
scheduledTransferPeriod="PT5M">
<DataSource name="System!*" />
</WindowsEventLog>
</DiagnosticMonitorConfiguration>
In Diagnostics Manager, I can't actually see any data.
Thanks.
May I ask where your diagnostics.wadcfg is located? For a regular worker role, diagnostics.wadcfg must be in the role's root folder, and because you don't have a worker role module in your project, the layout of your role folder matters a great deal. Be sure to have exactly the same folder structure in your Python application as a regular worker role, and then drop diagnostics.wadcfg into the role's root folder. (Add that info back to your question so it can be verified.)
Do you see a diagnostics configuration XML created in the Windows Azure Blob storage configured in *.Diagnostics.ConnectionString? This check shows that the diagnostics component in the Azure role was able to read the provided configuration and create the configuration XML in the destination blob storage (with classic diagnostics this typically lands in the wad-control-container container; the same storage account is then used to write logs to Azure Table storage). Please verify.
Finally, your diagnostics.wadcfg needs some more work. Although this is a non-.NET worker role, you have configured IIS logging (do you really have IIS running in a worker role?), and you have the System event log set to transfer warnings only, so nothing shows up if there are no warnings. Also, the log transfer period is set to 5 minutes, which is long while testing.
Here is what I suggest to test whether diagnostics is working:
1. Remove the IIS logs if you don't have IIS running in the Azure VM.
2. Replace the event log DataSource System!* with Application!* and set the filter to the Information level (see the sketch after this list).
3. Change the log transfer period to less than a minute.
4. Run the exact same code in the Development Fabric with the diagnostics connection string pointing at an actual Azure Storage account.
5. Write custom event log entries on the machine and check whether they are transferred to Azure Table storage within the time limit and the specific tables are created.
The above should help you troubleshoot the problem.
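For illustration, a sketch of the WindowsEventLog section with suggestions 2 and 3 applied. PT1M is used here because, as far as I recall, the diagnostics agent rounds sub-minute transfer periods up to one minute; treat that as an assumption to verify:
<WindowsEventLog bufferQuotaInMB="200"
scheduledTransferLogLevelFilter="Information"
scheduledTransferPeriod="PT1M">
<DataSource name="Application!*" />
</WindowsEventLog>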

Windows Azure Local Blob Storage Access?

I don't understand why, when I call:
LocalResource ls = RoleEnvironment.GetLocalResource("mystorage");
string path = ls.RootPath;
the local folder is not created; I don't understand how this works.
As I understand it, a local path should be created, so I should have:
C:\Resources\directory\xxx.HelloAzureWeb.mystorage\
But there is no such folder.
Edit:
What I am trying to do is store some XML scene save files. In a Unity application, I need to post (save) and get (use) those files.
There is some misunderstanding here about creating local storage. The call below does not create the local storage for you; instead it returns an instance of the LocalResource object for storage that was defined when the application was set up:
LocalResource ls = RoleEnvironment.GetLocalResource("mystorage");
To define local storage in your application, you can directly add the following to your CSDEF (the settings below create a local storage named mystorage with a size of 2 GB; because cleanOnRoleRecycle is false, the content is not cleaned when the VM is rebooted or the role recycles):
<LocalResources>
<LocalStorage cleanOnRoleRecycle="false" name="mystorage" sizeInMB="2048" />
</LocalResources>
To add local storage you can also use the Visual Studio option: Role Properties > Local Storage > Add Local Storage.
When you define local storage in your Windows Azure application, a new folder is created on drive C: of the Azure VM, as below; this happens while your role is being provisioned during VM start time:
[In Azure VM]
C:\Resources\directory\[Your_deploymentID].[your_webrolename]_[Role_Instance_Count]\
[In Compute Emulator]
// Launch your application in the Compute Emulator, then open the "Compute Emulator UI" to see the local storage path in the command window related to your instance:
C:\Users\avkashc\AppData\Local\dftmp\Resources\3503509c-2112-45ea-8d63-377fe9f78491\directory\mystorage\
Once you add the local storage settings above to ServiceDefinition.csdef, your local storage will be created, and the following code will work:
LocalResource ls = RoleEnvironment.GetLocalResource("mystorage");
string path = ls.RootPath; // returns the correct local storage path
// Now you can use ls to read/write your data.
For me, the files I store to the storage emulator's blobs go into C:\Users\[username]\AppData\Local\DevelopmentStorage\LDB\BlockBlobRoot. Unfortunately, they're not very useful: it's just a bunch of GUIDs, but the file sizes look correct.
Here is how you access Local Storage in your ASP.NET Web Role:
Step 1: Create a very simple ASP.NET Web Role project.
Step 2: Include the following in ServiceDefinition.csdef:
<LocalResources>
<LocalStorage name="mystorage" cleanOnRoleRecycle="false" sizeInMB="2096" />
</LocalResources>
Step 3: Add the following code in any aspx (in this case about.aspx)
<div>
Local Storage file content: <b><asp:Label ID="fileContent" runat="server" /></b>
</div>
Step 4: Add the following code in any aspx.cs (in this case about.aspx.cs):
// Requires: using System; using System.IO; using Microsoft.WindowsAzure.ServiceRuntime;
protected void Page_Load(object sender, EventArgs e)
{
    // Resolve the local storage defined in Step 2, write a test file, read it back.
    LocalResource myStorage = RoleEnvironment.GetLocalResource("mystorage");
    string filePath = Path.Combine(myStorage.RootPath, "Lesson.txt");
    File.WriteAllText(filePath, "First Lesson");
    fileContent.Text = File.ReadAllText(filePath);
}
That's it.
I have tested this code in the Compute Emulator and in the cloud, and it does work.
When using the storage emulator, a local resource allocation is just a file directory. The root path looks a bit different from what you describe, but you should be able to navigate to that directory on your local machine, which will initially be empty. You should see it under \Users\[you]\AppData (a hidden directory).
Oh, and local resources have nothing to do with Blob storage. In Windows Azure, it's just a locally-attached disk resource with a specific size quota. It's non-durable storage, unlike Blob storage. Perfect for temporary file writes, caching, etc. and faster than Blob Storage since it's a local disk, but for anything that needs to persist, you'll want to consider Blobs.
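If those scene files need to persist, here is a minimal sketch of copying a locally-written file into Blob storage (this assumes the classic Microsoft.WindowsAzure.Storage client library; the "StorageConnectionString" setting, "scenes" container, and file names are illustrative):
// Persist a locally-written scene file to Blob storage (classic SDK).
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
var container = account.CreateCloudBlobClient().GetContainerReference("scenes");
container.CreateIfNotExists();
var blob = container.GetBlockBlobReference("scene1.xml");
string localFile = Path.Combine(
    RoleEnvironment.GetLocalResource("mystorage").RootPath, "scene1.xml");
using (var stream = File.OpenRead(localFile))
{
    blob.UploadFromStream(stream); // durable copy of the local file
}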
