When I access a SQL table via the server scripts, is the SQL Azure retry logic implemented somewhere following the request.execute(); call?
Yes, the underlying code will retry the operation against SQL Azure several times over a 30-second window.
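If you need more control than the built-in behavior, the same idea can be layered into your own client code. Below is a minimal sketch of retrying over a fixed time window, written in C# purely as an illustration of the pattern; the connection string, query, and 30-second budget are assumptions, not the platform's internals.

    // Illustrative only: a simple retry-over-a-time-window pattern for transient
    // SQL Azure errors, similar in spirit to what the platform does internally.
    using System;
    using System.Data.SqlClient;
    using System.Threading.Tasks;

    static class TransientRetry
    {
        public static async Task ExecuteWithRetryAsync(string connectionString, string sql)
        {
            var deadline = DateTime.UtcNow.AddSeconds(30);   // overall retry window (assumed)
            var delay = TimeSpan.FromSeconds(1);             // backoff, doubled each attempt

            while (true)
            {
                try
                {
                    using (var conn = new SqlConnection(connectionString))
                    using (var cmd = new SqlCommand(sql, conn))
                    {
                        await conn.OpenAsync();
                        await cmd.ExecuteNonQueryAsync();
                        return;
                    }
                }
                catch (SqlException) when (DateTime.UtcNow + delay < deadline)
                {
                    // Transient failure: wait and try again until the window is used up.
                    await Task.Delay(delay);
                    delay = TimeSpan.FromSeconds(delay.TotalSeconds * 2);
                }
            }
        }
    }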
I have several Azure WebJobs (.NET Framework, not .NET Core) running which interact with an Azure Service Bus. Now I want a convenient way to store and analyze their log messages (including the related messages from the Service Bus). We are talking about a lot of log messages per day.
My idea is to send the logs to an Azure Event Hub and store them in an Azure SQL Database. Later I could have, for example, a web app that lets users conveniently browse and analyze the logs and view the messages.
Is this a bad idea? Should I use Application Insights instead?
Application Insights would cost more than your own implementation, so I would say this is a good idea. Just one change: I would send each log to a Logic App and do some processing there, such as routing error logs and info logs differently. Also, why are you considering SQL when the logs could be stored in Azure Table storage (non-SQL) and fetched from there?
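If you do go the Event Hub route, sending a log entry from a .NET Framework WebJob is only a few lines with the Microsoft.Azure.EventHubs package. A minimal sketch; the connection string, hub name, and log shape are assumptions you would replace with your own:

    // Minimal sketch of pushing a log entry to an Event Hub from a WebJob.
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.EventHubs;
    using Newtonsoft.Json;

    static class LogForwarder
    {
        public static async Task SendLogAsync(string connectionString, string hubName, object logEntry)
        {
            // Point the client at the specific hub (EntityPath).
            var builder = new EventHubsConnectionStringBuilder(connectionString)
            {
                EntityPath = hubName
            };
            var client = EventHubClient.CreateFromConnectionString(builder.ToString());

            // Serialize the log entry (shape is up to you) and send it as one event.
            var json = JsonConvert.SerializeObject(logEntry);
            await client.SendAsync(new EventData(Encoding.UTF8.GetBytes(json)));

            await client.CloseAsync();
        }
    }

On the consuming side, something like Stream Analytics, a Logic App, or another WebJob/Function can then drain the hub and write rows into SQL or Table storage.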
We have SSIS packages running on a server under SQL Server Agent. However, we want to move this job to a cloud solution. One option is a PowerShell script, but we have also tried replacing SSIS with Azure Data Factory.
However, as stated above, the gateway requires my computer to be online and can't be installed on a domain controller (server). Does this mean that Data Factory cannot be used to fill our database at night (when the PCs are shut down) and is therefore not a good replacement for SSIS?
Are there any other solutions for this problem?
The Data Gateway can be installed on any computer in your network that has access to the SQL Server. Obviously both the gateway and the SQL server need to be up at the time the activity runs.
I'm posting because I don't think Azure Functions can take advantage of connection pooling. Say I run one SQL query every 5 minutes in my Azure Function: the initial connection will take a long time to open because it can't benefit from connection pooling the way an always-running C# Web API can.
Would it be better to call my C# Web API to make that data call and return the results, or to connect to the database directly? If there were 10 or so DB calls I'm sure direct would be better, but with 1 or 2 I'm not sure.
This is a C# Azure Function connecting to an Azure SQL database.
This would only be applicable in the Consumption ("Serverless") Plan. Using the traditional App Service plan, your function would not be deprovisioned and would be able to make use of connection pooling since it is running on the same plan as an API App.
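As a concrete illustration, ADO.NET pools connections per process, keyed by connection string, so while a function instance stays warm the usual using pattern still benefits from the pool. A minimal sketch, assuming a SqlConnectionString app setting and a dbo.Orders table that are purely illustrative:

    // Minimal sketch: rely on ADO.NET's built-in pooling inside an Azure Function.
    // While the instance is warm, connections with the same connection string are
    // served from the pool even though each call opens and disposes its own object.
    using System.Data.SqlClient;
    using System.Threading.Tasks;

    public static class OrderCountFunction
    {
        // Connection string name is an assumption; read it from app settings in practice.
        private static readonly string ConnectionString =
            System.Environment.GetEnvironmentVariable("SqlConnectionString");

        public static async Task<int> GetOrderCountAsync()
        {
            using (var conn = new SqlConnection(ConnectionString))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM dbo.Orders", conn))
            {
                await conn.OpenAsync();
                return (int)await cmd.ExecuteScalarAsync();
            }
        }
    }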
We are using the SQL Database Sync feature in Azure and whilst it generally works without issues it does occasionally fail. The failures can be due to lost database connections or failures in which the error messages in the Azure Sync Log don't help. I see no way in the Azure portal to set up alerts to email us when a sync fails and I don't see any Azure cmdlets that will return the status of a database sync. Does anyone have any ideas how we can add monitoring to the service?
Thanks!
I did cross-post this later to the MSDN forum thread that user6133663 listed below.
The answer from Xu Ye (who I assume works for MS) was:
You are right. Currently SQL Data Sync does not support an API or failure alerts. We will keep you updated if there is any news on this.
So, to answer my original question, there is currently no way to monitor the success or failure of a SQL database sync.
I have one Microsoft Azure subscription with one cloud service and one SQL Azure instance. Now I want to create another cloud service under a different subscription (using a different Microsoft account). Can this second cloud service use the same SQL Azure instance from the first subscription? (I need to share data between the two cloud services.)
Or could there be performance issues?
Thanks in advance
Yes. An Azure SQL DB instance can be accessed from a different subscription as long as you have the connection string, username, and password for the Azure SQL instance. As long as both services are in the same region, there is no performance issue.
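In practice nothing special is needed on the second subscription's side: its code simply connects with the same connection string, provided the server firewall allows the caller. A minimal sketch; the server name, database, and credentials below are placeholders:

    // Minimal sketch: a cloud service in a different subscription connecting to the
    // same Azure SQL database purely via the connection string.
    using System.Data.SqlClient;

    class SharedDatabaseClient
    {
        // Server, database, and credentials are placeholders.
        private const string ConnectionString =
            "Server=tcp:myserver.database.windows.net,1433;" +
            "Database=SharedDb;User ID=appuser;Password=<password>;Encrypt=True;";

        public int Ping()
        {
            using (var conn = new SqlConnection(ConnectionString))
            using (var cmd = new SqlCommand("SELECT 1", conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }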
Yes, sure. From the user's perspective SQL Azure is mostly an ordinary SQL Server which you can access from anywhere in the world (given that the firewall rules allow that access): from Azure services, from VMs hosted in other services elsewhere, from your desktop, from servers in your company server room.
Network latency might kick in, and more clients on the same instance mean more load. There is also a limit on the number of concurrent connections. Other than that, no problems.
You need to make sure you are a member of each Azure subscription to be able to use the other's SQL DB.