How to deploy an Azure application into production without Azure - azure

I am developing an Azure application using queues, blob storage and SQL Azure. We anticipate that some clients will not be willing to have their data hosted in the cloud (for reasons of paranoia, or legal limitations on the jurisdiction in which data can be stored) and will want to run the system on a single server located within their own data centres. Using SQL Server and building an alternative to blob storage should be easy, but Azure queues are more complicated. I guess using the development fabric is undesirable because the MS documentation says it must run as administrator.
How should I go about this?

I would add a layer of abstraction over the Azure queues.
Something like:
public interface IQueueService
{
    // will create the queue if it does not exist
    IQueue GetQueue(string name);
    IQueue GetQueueIfExists(string name);
}

public interface IQueue
{
    string Name { get; set; }
    void AddMessage(SimpleMessage message);
    void DeleteMessage(SimpleMessage message);
    SimpleMessage PeekMessage();
    void Clear();
}
etc...
That should give you an idea. You can then provide two implementations: one that uses Azure queues and another that uses MSMQ (http://en.wikipedia.org/wiki/Microsoft_Message_Queuing).
You choose the implementation depending on whether you are running on Azure or not.
I have done something very similar in the past.
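As a sketch, the choice between the two implementations could be made at startup with a simple factory. The class names here are illustrative, not from any SDK:

```csharp
// Hypothetical factory: picks a queue implementation at startup.
public static class QueueServiceFactory
{
    public static IQueueService Create(bool runningOnAzure)
    {
        // AzureQueueService would wrap the Azure Storage queue API;
        // MsmqQueueService would wrap Microsoft Message Queuing.
        // Both are illustrative names implementing IQueueService above.
        return runningOnAzure
            ? (IQueueService)new AzureQueueService()
            : new MsmqQueueService();
    }
}
```

The rest of the code base only ever sees IQueueService, so the on-premises deployment differs from the cloud one only at this single call site.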

You don't need to run on the development fabric to access Azure resources. Blobs are very easy to access via the web, and I'm fairly certain you can do it with tables and queues as well, as the "http://'accountname'.queue.core.windows.net/" URLs are publicly available.
For a neat solution you should look at the Azure AppFabric Service Bus. It basically allows you to connect, or "project", an on-premises app's web service endpoints into the cloud; it's basically a relay service. (It sounds like magic, but it's actually pretty simple.) You can use the same Service Bus to give Azure worker role services public URL endpoints.
http://msdn.microsoft.com/en-us/library/ee732537.aspx
http://www.microsoft.com/windowsazure/appfabric/overview/default.aspx

Related

How to see request logs for a Service Fabric application

I want to know if there is a way to see all the requests made to my Service Fabric application while it is already published on Azure servers, or at least to capture all requests and save them somewhere. I need information such as the source and body.
Thanks in advance!
To see all the requests and responses, you first need to log them in somewhere. Here are the available approaches:
Streaming into VS Cloud Explorer
You could use ServiceEventSource to log the valuable information; you will then be able to see it by attaching to your SF cluster via Cloud Explorer in VS. You can find more info in Debug your Service Fabric application by using Visual Studio.
Windows Azure Diagnostics
The WAD extension, which you can install on your VMs, uploads logs to Azure Storage, and also has the option to send logs to Azure Application Insights or Event Hubs. Check out Event aggregation and collection using Windows Azure Diagnostics.
EventFlow
Using EventFlow allows you to have services send their logs directly to an analysis and visualization platform, and/or to storage. Other libraries (ILogger, Serilog, etc.) might be used for the same purpose, but EventFlow has the benefit of having been designed specifically for in-process log collection and to support Service Fabric services.
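As a rough illustration, a minimal eventFlowConfig.json sketch that forwards a service's EventSource output to Application Insights might look roughly like this (the provider name and key placeholder are illustrative):

```json
{
  "inputs": [
    { "type": "EventSource", "sources": [ { "providerName": "MyCompany-MyApp-MyService" } ] }
  ],
  "outputs": [
    { "type": "ApplicationInsights", "instrumentationKey": "<your-key>" }
  ],
  "schemaVersion": "2016-08-11"
}
```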
Event analysis and visualization with OMS
When OMS is configured, you will have access to a specific OMS workspace, from where data can be queried or visualized in dashboards.
After data is received by Log Analytics, OMS has several Management Solutions that are prepackaged solutions to monitor incoming data, customized to several scenarios. These include a Service Fabric Analytics solution and a Containers solution, which are the two most relevant ones to diagnostics and monitoring when using Service Fabric clusters. Find more info on Event analysis and visualization with OMS and Assess Service Fabric applications and micro-services with the Azure portal.
And there are a number of ways you could capture the source, body, or whatever else you need. Below are some:
Apply an ActionFilterAttribute to your controller class if you don't
have one, and log all the information you need within its OnActionExecuted method
Add middleware in the Startup class -
public static void ConfigureApp(IAppBuilder appBuilder)
{
    // Configure Web API for self-host.
    HttpConfiguration config = new HttpConfiguration();
    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );

    appBuilder.Use(async (IOwinContext context, Func<Task> next) =>
    {
        await next.Invoke();

        // log anything you want here
        ServiceEventSource.Current.Message($"Response status code = {context.Response.StatusCode}");
    });

    appBuilder.UseWebApi(config);
}
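For the first option above, a minimal sketch of such a filter might look like this (assuming ASP.NET Web API 2; the logging call mirrors the ServiceEventSource usage in the middleware example):

```csharp
// Hypothetical filter: logs every executed controller action.
public class RequestLoggingFilter : ActionFilterAttribute
{
    public override void OnActionExecuted(HttpActionExecutedContext context)
    {
        // Log the method, URI and response status; extend as needed,
        // e.g. read the request body from context.Request.Content.
        ServiceEventSource.Current.Message(
            $"{context.Request.Method} {context.Request.RequestUri} -> " +
            $"{context.Response?.StatusCode}");
        base.OnActionExecuted(context);
    }
}
```

Register it globally via config.Filters.Add(new RequestLoggingFilter()), or apply it as an attribute on individual controllers.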

Azure Web Job - How to connect to an Azure MS SQL Database?

I have a MVC website that gets published to Azure where it uses an Azure SQL Database.
The time has come where we need to run a scheduled task to send SMS reminders. I was under the impression that Azure Web Jobs was a good fit for this but am having some issues getting it up and running.
I have added a console app to my website solution and referenced my EF data model from the console app (which I would like to publish as a web job).
The current console app looks as follows:
class Program
{
    static void Main(string[] args)
    {
        JobHost host = new JobHost();
        host.RunAndBlock();
    }

    public static void ProcessNotifications()
    {
        var uow = new KirkleesDatabase.DAL.UnitOfWork();
        uow.CommunicationRepository.SendPALSAppointmentReminders();
    }
}
Running the console app will then throw the exception:
Additional information: User account connection string is missing. This can be set via the 'AzureJobsData' connection string or via the constructor.
This suggests that the Web Job is specifically looking for a connection string that points at a storage account. However, I would like the web job to query an Azure database rather than work with storage.
Is this doable?
Thanks,
As Victor wrote, the SDK events are triggered by blobs, queues and tables.
A simple solution for your need: write a console application with a polling approach. No SDK needed. Details at the beginning of http://blog.amitapple.com/post/73574681678/git-deploy-console-app/
while (true)
{
    if (isThereWorkToDo) // replace with your own check
    {
        var uow = new KirkleesDatabase.DAL.UnitOfWork();
        uow.CommunicationRepository.SendPALSAppointmentReminders();
    }

    Thread.Sleep(10000); // set an appropriate sleep interval
}
Unfortunately the WebJobs SDK does not support SQL databases as triggers. For triggers, it only supports Azure Storage (blobs, queues and tables).
You can access Azure SQL Databases from the web job as demonstrated in this answer.
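Once the job is running, querying the Azure SQL database is ordinary ADO.NET (or EF) work and needs no storage account. A plain sketch; the connection string name, table and column are illustrative:

```csharp
// Plain ADO.NET access from inside the WebJob.
using (var conn = new SqlConnection(
    ConfigurationManager.ConnectionStrings["KirkleesDb"].ConnectionString))
{
    conn.Open();
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM Appointments WHERE ReminderSent = 0", conn))
    {
        int pending = (int)cmd.ExecuteScalar();
        Console.WriteLine($"Appointments awaiting reminders: {pending}");
    }
}
```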
To create a web job you don't have to use the WebJobs SDK; you can use several types of executables (.exe, .cmd, .js, .php, .py, ...).
Try using this new visual studio add-on: http://visualstudiogallery.msdn.microsoft.com/f4824551-2660-4afa-aba1-1fcc1673c3d0, follow the steps there to link between your web application and your console application which will be deployed as a webjob when published to your Azure site.

Windows Azure - portability and migration?

We are looking to use Windows Azure to host our existing SaaS platform and extend our functionality and capability. We will be taking advantage of both the data storage and the application and web service functionality of Azure.
My question is as follows:
Some of our clients will not want public cloud access. Since our datastore stores sensitive client data, many of them will require our whole system to be hosted internally on their own network and servers.
If we set up a full Azure deployment of database and connected applications and processes, how difficult would it be to duplicate that system for a specific client on their own servers and network using existing Microsoft technologies?
I know it's a vague question and I also have a limited understanding of Azure, so whatever information you can provide here would be most appreciated.
Thank you
It sounds like you need the flexibility of a hybrid cloud/on-prem solution. Likely the best solution is the Windows Azure Service Bus. Essentially, you configure a WCF web service in the cloud (SOAP, REST, etc) that performs asynchronous brokered messaging between your on-premise application and your web application. This can be performed using queue messages, for example:
The web application (cloud) requests resources from the brokering service (cloud) by sending a queue message
The service handles the queue message and makes it available to the consuming (on-prem) service
On-prem service checks for new messages from the brokering service, gets the request for data, and returns desired data from DB
On-prem service sends message to brokering service with desired data
Web app (cloud) checks for new messages from the brokering service, then uses the data from on-prem service
Service bus is secure, asynchronous, fault-tolerant, and ensures that both components are decoupled.
Another method is to use Windows Azure Connect, which is a VPN solution that sets up network-level connectivity. I recommend Service Bus because it promotes a more robust and scalable architecture, and fault-tolerance is high.
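The queue-message exchange in the steps above could be sketched, much simplified, with the brokered messaging client (assuming Microsoft.ServiceBus.Messaging; queue names and the lookup helper are illustrative):

```csharp
// Cloud side: request data from the on-premises service.
var requests = QueueClient.CreateFromConnectionString(
    serviceBusConnectionString, "data-requests");
requests.Send(new BrokeredMessage("customer:42"));

// On-prem side: receive the request, look up the data locally, reply.
var receiver = QueueClient.CreateFromConnectionString(
    serviceBusConnectionString, "data-requests");
BrokeredMessage request = receiver.Receive();
string payload = LookUpInLocalDatabase(request.GetBody<string>()); // illustrative helper
var replies = QueueClient.CreateFromConnectionString(
    serviceBusConnectionString, "data-replies");
replies.Send(new BrokeredMessage(payload));
request.Complete();
```

The cloud web app would then receive from the "data-replies" queue in the same way, completing the round trip while the two sides stay fully decoupled.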

Is it possible to create a public queue in Windows Azure?

In Windows Azure it's possible to create public Blob Container. Such a container can be accessed by anonymous clients via the REST API.
Is it also possible to create a publicly accessible Queue?
The documentation for the Create Container operation explains how to specify the level of public access (with the x-ms-blob-public-access HTTP header) for a Blob Container. However, the documentation for the Create Queue operation doesn't list a similar option, leading me to believe that this isn't possible - but I'd really like to be corrected :)
At this time, Azure Queues cannot be made public.
As you have noted, this "privacy" is enforced by requiring all Storage API calls relating to queues to be authenticated with a signed request using your key. There is no "public" concept similar to public containers in the blob store.
This would follow best practice in that even in the cloud you would not want to expose the internals of your infrastructure to the outside world. If you wanted to achieve this functionality, you could expose a very thin/simple "layer" app on top of queues. A simple WCF REST app in a web role could expose the queuing operations to your consumers, but handle the signing of api requests internally so you would not need the queues to be public.
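Such a thin layer could be as small as a single Web API controller running in a web role; a sketch, assuming the classic Azure Storage SDK (the setting name is illustrative):

```csharp
// Hypothetical public endpoint; the storage key never leaves the role.
public class QueueController : ApiController
{
    public void Post(string queueName, [FromBody] string message)
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference(queueName);
        queue.CreateIfNotExists();
        queue.AddMessage(new CloudQueueMessage(message)); // request is signed internally by the SDK
    }
}
```

Consumers POST plain messages to the controller; only the role itself holds the account key, so the queue stays private while the operation is public.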
You are right, the Azure storage queues won't be publicly accessible like the blobs (URIs). However you may still be able to achieve a publicly consumable messaging infrastructure with the AppFabric Service Bus.
I think the best option would be to set up a worker role and provide access to the queue publicly in that manner, maybe with the AppFabric Service Bus for extra connectivity/interactivity with external sources.
Otherwise the scope isn't really clear; the queue itself appears to be locked away at this time. :(

Is a Windows Azure role stateful or not?

According to MSDN, an Azure service can contain any number of worker roles. To my knowledge a worker role can be recycled at any time by the Windows Azure fabric. If that is true, then either:
a worker role should be stateless, OR
a worker role should persist its state to Windows Azure storage services.
But I want to make a service which contains client data and do not want to use the Azure storage service. How can I accomplish this?
The Velocity component of AppFabric (as it was codenamed) is a distributed cache and can be used in these situations.
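A sketch of what using that cache might look like with the AppFabric caching client (the key is illustrative, and a configured dataCacheClient section is assumed):

```csharp
// Hypothetical use of the AppFabric ("Velocity") distributed cache.
// State put here survives the recycle of any single role instance.
var factory = new DataCacheFactory();      // reads the dataCacheClient config
DataCache cache = factory.GetDefaultCache();

cache.Put("client:42:state", clientState); // clientState: your serializable object
var restored = cache.Get("client:42:state");
```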
Azure's web and compute roles are stateless, meaning all their local data is volatile. If you want to maintain state you need an external resource to hold it, plus logic in your app to handle that. For simplicity you can use Azure Drive, but internally that is again blob storage.
You can write to local storage on the worker role by using the standard file IO APIs - but this will be erased upon instance shutdown.
You could also use SQL Azure, or post your data off to another storage service by HTTP (e.g. Amazon S3, or your own server).
However, this is likely to have performance implications. Depending on how much data you'll be storing, how frequently, and how big it is, you might be better off with Azure Storage!
Why don't you want to use Azure Storage?
If the data could be stored in Azure you have a good number of choices: the Azure distributed cache, SQL Azure, blobs, tables, queues, or Azure Drive. It sounds like you need persistence, but can't use any of these Azure storage mechanisms. If data security is the problem, could you encrypt or hash the data? Understanding why would be useful.
One alternative might be not to persist at all, by chaining/nesting synchronous web service calls together, thus achieving reliable messaging.
Another might be to use Azure Connect to domain-join Azure compute resources to your local data centre (if you have one), and use your on-premises storage.
