Best practices for Hangfire on Azure

I am attempting to migrate an on-premises solution to Azure. The application uses Hangfire fairly heavily; Hangfire is hosted in a Windows service and is backed by SQL Server.
I would prefer not to remove this dependency unless absolutely required (the alternatives being Functions/WebJobs).
Are there any guidelines/best practices for running Hangfire on Azure?
Barring the differences in storage characteristics, is Hangfire expected to work exactly the same on Azure as it does on-premises?
From searching and scanning GitHub issues and Stack Overflow:
People do seem to be using Hangfire on Azure.
I could not find any best-practice/migration documents around this.
Thanks,
Partho

If you use Hangfire, I highly recommend their Azure Service Bus package, which switches the queue polling from SQL Server to Azure Service Bus.
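As a rough illustration of that suggestion, a Hangfire setup using the Hangfire.Azure.ServiceBusQueue package might look like the sketch below. The connection-string placeholders and queue names are examples only; job state still lives in SQL Server, while queue polling moves to Service Bus.

```csharp
using Hangfire;

public class Startup
{
    public void Configure()
    {
        GlobalConfiguration.Configuration
            // Job state (succeeded/failed history, recurring schedules) stays in SQL Server.
            .UseSqlServerStorage("<sql-connection-string>")
            // Queue polling is handed off to Azure Service Bus queues.
            // Requires the Hangfire.Azure.ServiceBusQueue NuGet package.
            .UseServiceBusQueues("<service-bus-connection-string>", "critical", "default");
    }
}
```

This keeps the familiar SQL Server storage model while avoiding the aggressive SQL polling that can drive up DTU usage on Azure SQL Database.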

Related

Is Service Fabric Appropriate for Simple Background Jobs?

We have a bunch of Windows services. We need to be able to continue to deploy our code as Windows services on-premises, but would like to deploy to Azure where appropriate. The goal is to manage less infrastructure. I'm not keen on deploying dozens of bits and pieces as Azure Functions, but not entirely opposed to it either. Azure Batch / WebJobs are also an option. However, the long-term goal is to move all of our services over to an orchestrator like Service Fabric so that all the services can be deployed and orchestrated from one place. This is mainly a deployment consideration.
We will break the existing C# code into .NET Core class libraries and reference them from either Service Fabric hosted in Azure, on-premises Service Fabric, or an on-premises Windows service. Is Service Fabric an appropriate choice? Or is there a strong reason to run background jobs as Azure Batch / Functions / WebJobs?
The answer to the question is that we don't really need full-fledged orchestration right now, but it will become more important moving into the future. I have to balance being able to deploy all our code in one hit against the ease of ad hoc deployment that Azure Functions offers.
(Stateful) services can be an excellent way to run background jobs. They offer the RunAsync entry point, in which you can run your job and check (and store) progress. SF really shines when multiple services collaborate on tasks, offering SF Remoting as a communication channel with built-in retry support.
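A minimal sketch of that RunAsync pattern is below. The service name and the DoWorkAsync helper are made up for the example, and the usual service registration in Program.Main is omitted; the shape of the override itself follows the Reliable Services API.

```csharp
using System;
using System.Fabric;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ServiceFabric.Services.Runtime;

// Illustrative only: a long-running background job hosted in a stateful service.
internal sealed class BackgroundJobService : StatefulService
{
    public BackgroundJobService(StatefulServiceContext context)
        : base(context) { }

    // Service Fabric calls RunAsync when this replica becomes primary,
    // and signals the token when it should stop (e.g. on failover or upgrade).
    protected override async Task RunAsync(CancellationToken cancellationToken)
    {
        while (true)
        {
            cancellationToken.ThrowIfCancellationRequested();
            await DoWorkAsync();                                  // one unit of job work
            await Task.Delay(TimeSpan.FromSeconds(30), cancellationToken);
        }
    }

    private Task DoWorkAsync() => Task.CompletedTask;             // placeholder for real work
}
```

Honoring the cancellation token is the important part: it is how Service Fabric hands the job off cleanly to another replica.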
You can choose to containerize your software, which would free you from platform lock-in, but prevent you from using some platform features.
By automating delivery of services (CI/CD), you can deploy to any platform you choose. This is not something that is specific to SF.

Common function-as-a-service feature for .NET-Core as Azure AppService and On-Premises

We have the challenge of implementing services that can be deployed both in an Azure Cloud (On-Demand) as well as in a local LAN On-Premises scenario. This is fine with .NET-Core, SQL-Server, Redis etc.
What we are missing is a common feature for Functions-as-a-Service or WebJobs; both of these Azure services appear to be cloud-only. Is something like Hangfire the most viable approach, or are we missing something?
Thanks
The Azure Functions Runtime enables you to run Function Apps on-premises. Note that it's still in preview, though.
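For the Hangfire route the question mentions, the appeal is that the same host code runs unchanged in Azure or on-premises as long as a SQL Server connection is available. A hypothetical minimal host (connection string and job name are placeholders) might look like:

```csharp
using System;
using Hangfire;

public class JobHost
{
    public static void Main()
    {
        // Same storage configuration works against Azure SQL Database or an on-prem SQL Server.
        GlobalConfiguration.Configuration.UseSqlServerStorage("<connection-string>");

        // Example recurring job; the id "nightly-cleanup" is made up for this sketch.
        RecurringJob.AddOrUpdate(
            "nightly-cleanup",
            () => Console.WriteLine("cleanup"),
            Cron.Daily());

        // The server polls storage and executes jobs until disposed.
        using (var server = new BackgroundJobServer())
        {
            Console.ReadLine(); // keep the host process alive
        }
    }
}
```

The hosting shell (console app, Windows service, or Azure Web App with "Always On") is the only part that differs per environment.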

Internet of Things using MS Azure

I am starting my journey of IoT development with MS Azure and would like some insight into the Azure cloud. I am a total newbie at cloud development. Can someone point me to some good books/links on Azure that will help me understand how I can use Azure for IoT and start development?
Thanks a lot for your inputs.
This totally depends on the architecture of your application. You can use SaaS components for rapid prototyping of parts or all of your application architecture. This will give you better insight into selecting the appropriate stack of tools for your application.
If you want to deploy your own software stack, you would provision Azure Virtual Machines. Azure provides an SDK to interact with the cloud infrastructure.
Docker is a really good option to use for application deployment these days. Google provides better support for Docker containers using its Kubernetes framework.
Simple APIs or websites can be developed on Azure using Azure Web Apps. I am currently developing a Node application using Azure Websites; the actual container where the site runs is a Windows machine with IIS. If you want your server containers to be Linux-based, then you might look at AWS/Google or Red Hat OpenShift.
I have used OpenShift SaaS and found it quite easy to get on board with.
I advise you to have a look at Build and Ignite events, this week. There might be more announcements there. You can definitely have a look at the following white paper: http://download.microsoft.com/download/E/1/F/E1FFDADF-C0FF-4E72-A834-B173A079F393/Microsoft_Internet_of_Things_White_Paper.pdf
The most important services for IoT in Azure are (as of today):
Azure Event Hubs: a massive ingestion service that can take in millions of telemetry events per second.
Azure Stream Analytics: real-time complex event processing, combining multiple incoming streams of data and detecting patterns in them
PowerBI: this will allow users to build and explore interactive reports and graphs
Azure Machine Learning: Leverage prediction & machine learning models
For storage, you have DocumentDB and Azure Blob storage, among others.
HDInsight will help you work with the data (big data) and run jobs against it.
Azure Web Apps and API Apps will allow you to present and expose the data to your users in custom reports.
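To make the ingestion step above concrete, here is a rough device-side sketch using the Azure Service Bus SDK of that era (Microsoft.ServiceBus.Messaging). The connection string, hub name, and JSON payload are all placeholder examples.

```csharp
using System.Text;
using Microsoft.ServiceBus.Messaging;

public class TelemetrySender
{
    // Sends one telemetry event to an existing Event Hub.
    public static void Send(string connectionString, string hubName)
    {
        var client = EventHubClient.CreateFromConnectionString(connectionString, hubName);

        // Example payload; in practice this would come from your sensor readings.
        var payload = "{\"deviceId\":\"sensor-01\",\"temperature\":21.5}";

        client.Send(new EventData(Encoding.UTF8.GetBytes(payload)));
    }
}
```

From there, a Stream Analytics query can read the hub as an input and push aggregates to PowerBI or storage.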
Good luck

Cassette.AspNet on an Azure Cloud Service - how will it scale?

I searched and couldn't find any references to my question:
is it safe to use Cassette.AspNet when hosting a web application in Azure Cloud Services?
Currently we are using it and everything works fine, but we have only one instance.
What should we expect when we scale?
Thanks!
I don't see a reason why this shouldn't scale.
That said, we moved away from Cassette to the official Microsoft ASP.NET Web Optimization Framework:
http://remy.supertext.ch/2013/02/extending-the-asp-net-optimization-framework/
It's just often easier to work with MS stuff.
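For reference, an equivalent setup in the ASP.NET Web Optimization Framework looks roughly like this (bundle names and file paths are examples). Because bundle URLs are derived from the bundled content, every instance behind the load balancer serves the same fingerprinted URL, which is why this approach scales out cleanly.

```csharp
using System.Web.Optimization;

public class BundleConfig
{
    // Typically called from Application_Start:
    // BundleConfig.RegisterBundles(BundleTable.Bundles);
    public static void RegisterBundles(BundleCollection bundles)
    {
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/app.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));

        // Force minification and content-hashed URLs even in debug builds.
        BundleTable.EnableOptimizations = true;
    }
}
```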

Can you connect to Dynamics GP from an Azure worker role?

I'm looking into the possibility of integrating an Azure-hosted .NET solution with Dynamics GP, and I'm new to Dynamics. In general, it seems there are two approaches to connecting to GP: 1) web services and 2) eConnect. This article has some good background.
I would think as long as the web services are accessible, that option would work. I see that there are MSMQ and other requirements for eConnect which makes me think that would be a headache if it is even possible without something like Azure Connect. Has anyone has done this one way or the other?
Thanks
Yes, you can connect to Dynamics GP from a Windows Azure worker role (why a worker role and not a web role?), and it all depends on which route you want to take.
The web services method is comparatively neat and the easier way to access a Dynamics GP instance that has secure web services configured.
On the other hand, eConnect integration requires several other configurations. If you decide to use eConnect, I think you are better off using eConnect together with a BizTalk Server/Adapter combination wired into Service Bus (which talks directly to eConnect), with your Azure application talking to the BizTalk services directly. This can be made to work, but you cannot beat web services for simplicity.
