How do I create jobs in Azure SQL Database?

I'm trying to create jobs in Azure SQL Database but I don't know how to do that. Is it possible to do this inside SQL Server Management Studio?

You need to use Azure Automation to schedule the execution of a stored procedure. For instance, you can use Azure Automation to schedule index maintenance tasks.
Below are the steps:
Provision an Automation Account if you don't have one by going to https://portal.azure.com and selecting New > Management > Automation Account.
After creating the Automation Account, open its details and click on Runbooks > Browse Gallery.
Type the word "indexes" in the search box and the runbook "Indexes tables in an Azure database if they have a high fragmentation" appears.
Note that the author of the runbook is the SC Automation Product Team at Microsoft. Click on Import.
After importing the runbook, add the database credentials to the assets. Click on Assets > Credentials and then on the "Add a credential…" button.
Set a credential name (it will be used later in the runbook), plus the database user name and password.
Now click on Runbooks again, select "Update-SQLIndexRunbook" from the list, and click on the "Edit…" button. You will be able to see the PowerShell script that will be executed.
If you want to test the script, click on the "Test Pane" button and the test window opens. Enter the required parameters and click Start to execute the index rebuild. If any error occurs, it is logged in the results window. Note that depending on the database and the other parameters, this can take a long time to complete.
Now go back to the editor and click on the "Publish" button to enable the runbook. If you click on "Start", a window appears asking for the parameters. But since we want to schedule this task, click on the "Schedule" button instead.
Click on the Schedule link to create a new schedule for the runbook. I have specified once a week, but that will depend on your workload and how quickly your indexes fragment over time. You will need to tweak the schedule based on your needs and by executing the initial queries between executions.
Now enter the parameters and run settings.
NOTE: you can experiment with different schedules using different settings, e.g. a specific schedule for a specific table.
With that, you have finished. Remember to change the Logging settings as desired. If you prefer to script the scheduling step, see the sketch below.
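For reference, this is roughly what the scheduling step looks like in PowerShell with the Az.Automation module. It is a minimal sketch: the resource group, Automation Account, server/database values, and the runbook parameter names are assumptions you must adapt to whatever parameters your imported runbook actually declares.

```powershell
# Minimal sketch of scheduling the imported runbook via Az.Automation instead of
# the portal. All resource names and runbook parameter names are assumptions.
Connect-AzAccount

$rg      = 'my-resource-group'       # assumed resource group name
$account = 'my-automation-account'   # assumed Automation Account name

# Create a weekly schedule, matching the once-a-week cadence described above.
$schedule = New-AzAutomationSchedule `
    -ResourceGroupName $rg `
    -AutomationAccountName $account `
    -Name 'WeeklyIndexMaintenance' `
    -StartTime (Get-Date).AddDays(1) `
    -DaysOfWeek Sunday `
    -WeekInterval 1

# Link the schedule to the imported runbook and supply its parameters.
Register-AzAutomationScheduledRunbook `
    -ResourceGroupName $rg `
    -AutomationAccountName $account `
    -RunbookName 'Update-SQLIndexRunbook' `
    -ScheduleName $schedule.Name `
    -Parameters @{
        # Parameter names below are illustrative assumptions.
        SqlServer         = 'myserver.database.windows.net'
        Database          = 'mydatabase'
        SQLCredentialName = 'MyDbCredential'   # the credential asset created earlier
    }
```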

You can use Microsoft Flow (https://flow.microsoft.com) to create a scheduled flow with a SQL Server connector. In the connector you then set the Azure SQL server, database name, username and password.
SQL Server connector
There are many options, but the ones you can use to run a T-SQL query daily are these:
SQL Connector options
Execute a SQL Query
Execute stored procedure
You can also edit your connection info in the Data --> Connections menu.

Related

Azure SQL DB daily growth alert creation

Hi, I'm using Azure SQL Database and I need to create a notification/alert once the daily growth of the database exceeds a pre-defined number. As an example, I need to send an email to the DB admins once the database has grown by more than 1 GB within the last 24 hours. I looked for solutions but couldn't find a straightforward way to implement this in Azure. Any help will be appreciated.
You can create alerts for SQL DB using alerts and action groups in Azure. Below are the steps you can follow to create an alert on SQL DB usage over a period of time.
Create a logic app.
In its send email action, configure the recipients' mail addresses for the alert notification.
Next, create an action group and configure the created logic app in the Actions tab. Once the logic app is selected, click on Review + create.
Now you can create an alert for the SQL DB and select the created action group.
In the Conditions tab, select the signal Data space used and configure the details as per your requirement.
In the Actions tab, select the already created action group.
Once that is done, click on Review + create.
This flow will execute whenever the data space used is more than 1 GB for the selected time period. For reference, a scripted version of the same alert is sketched below.
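This is a minimal sketch of the same alert in PowerShell (Az.Monitor and Az.Sql modules). It assumes the SQL database metric named 'storage' ("Data space used") and placeholder resource names. Note that, as in the steps above, this alerts on total data space used crossing 1 GB, not on true day-over-day growth.

```powershell
# Minimal sketch: metric alert on "Data space used" > 1 GB for an Azure SQL DB.
# Resource names and the action group name are placeholders/assumptions.
Connect-AzAccount

$rg = 'my-resource-group'   # assumed resource group name
$db = Get-AzSqlDatabase -ResourceGroupName $rg -ServerName 'myserver' -DatabaseName 'mydatabase'

# Condition: "Data space used" (metric name 'storage') greater than 1 GB.
$condition = New-AzMetricAlertRuleV2Criteria `
    -MetricName 'storage' `
    -TimeAggregation Maximum `
    -Operator GreaterThan `
    -Threshold 1GB

# Reuse the action group that triggers the logic app / email.
$actionGroup = Get-AzActionGroup -ResourceGroupName $rg -Name 'DbAdminsActionGroup'

Add-AzMetricAlertRuleV2 `
    -Name 'sqldb-data-space-used-alert' `
    -ResourceGroupName $rg `
    -TargetResourceId $db.ResourceId `
    -Condition $condition `
    -WindowSize ([TimeSpan]::FromHours(24)) `
    -Frequency ([TimeSpan]::FromHours(1)) `
    -ActionGroupId $actionGroup.Id `
    -Severity 3
```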

How to delete a pipeline trigger in Azure Synapse Analytics

How do you delete a pipeline trigger in Azure Synapse Analytics with the UI?
What's the problem?
I currently can't publish my workspace changes. I get the following error:
TestTrigger1
Trigger 'TestTrigger1' cannot be activated and contain no pipelines
This is correct. I disconnected TestTrigger1 to try a different trigger. However, now I can't publish, and I can't delete the trigger in the UI either.
In Data Factory, there is a UI option in the bottom left of the pipelines screen to delete. See this blog post.
However in ASA, there is no UI option for managing triggers and I can't find one.
How do I delete this trigger so I can republish?
In Azure Synapse, you can see all the triggers in the Integration section under the Manage tab. If you have the right permissions, you should see the delete button when you hover your mouse over the trigger you want to delete.
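If the UI route stays blocked, a fallback sketch is to remove the trigger with the Az.Synapse PowerShell module. The workspace name below is a placeholder; 'TestTrigger1' is the trigger from the question.

```powershell
# Remove a Synapse pipeline trigger by name (workspace name is an assumption).
Connect-AzAccount

Remove-AzSynapseTrigger -WorkspaceName 'my-synapse-workspace' -Name 'TestTrigger1'
```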

Save log queries across different application insights instances

In Azure Application Insights Logs, I can save custom queries into the "query explorer", but they are saved per Application Insights instance. I want to use the same query in the Logs of a different AI instance.
Note I don't want to query data across AI instances, just save and reuse the queries themselves without duplicating them.
Unfortunately, a saved query can currently only be used by the Application Insights instance it was saved in. User feedback requesting this capability has already been raised.
You can consider using a workbook. (Note that workbooks are not designed for this and have some limitations, but you can use one to save a query and then run that query against other Application Insights instances.)
Steps are as below:
1. Navigate to the Azure portal -> one of your Application Insights instances -> click Workbooks -> create an empty template.
2. Click Add -> then click "Add query".
3. On the new page, select one of your Application Insights instances from the Resource dropdown -> write your query -> click the "Run Query" button to check the results -> click the Save button to save the workbook.
4. Next time you want to re-use the query written in step 3, just open the saved workbook -> click the Edit button to enter edit mode.
5. Then click the Edit button in edit mode.
6. On the new page, click the Change button to select another Application Insights instance. The query will then run against the newly selected Application Insights instance.
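A complementary approach, sketched below, is to keep the Kusto query text in one place (e.g. source control) and run it against any instance through the Application Insights REST query API. The app ID and API key come from each instance's "API Access" blade; the values and the sample query below are placeholders.

```powershell
# Minimal sketch: run one shared Kusto query against any App Insights instance
# via the REST query API. App ID and API key are per-instance placeholders.
$appId  = '<application-id>'
$apiKey = '<api-key>'

# The query you would otherwise save per-instance in query explorer.
$query = 'requests | summarize count() by resultCode | order by count_ desc'

$encoded  = [uri]::EscapeDataString($query)
$response = Invoke-RestMethod `
    -Uri "https://api.applicationinsights.io/v1/apps/$appId/query?query=$encoded" `
    -Headers @{ 'x-api-key' = $apiKey }

# Results come back as tables of columns and rows.
$response.tables[0].rows
```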

Azure Scheduled WebJob - save diagnostics to Table store?

I can successfully save my logs to Table store for a continuous WebJob, following these instructions:
https://azure.microsoft.com/en-us/documentation/articles/web-sites-enable-diagnostic-log/
However, if I make the WebJob scheduled (runs once every 5 mins), the logs do not show up in Table store. Is this a known limitation (and if so, why?), or does anyone know a way to make it work?
Note: I can see the logs in the Azure Portal, so I know the job runs correctly -- I just want to save these to a WADLogsTable.
Thanks!
Maybe your problem is related to this: compress the Release folder of your WebJob into a .zip, then go to your Azure web app, click WebJobs and Add. Set your schedule in the menu that appears when you press Add; you can change it later in your Azure WebJob schedule menu.

How to dynamically specify database credentials with Azure Reporting Services (SSRS)

Summary
I am using SQL Server Reporting Services on Azure. I want to dynamically specify the connection string including the credentials at runtime. I run the reports by embedding a ReportViewer control in an ASPX page.
I can make this work on premise with SSRS 2012 as long as I specify an Execution Account on the SSRS. However, on Azure Reporting Services I can't specify an Execution Account so it doesn't work.
My question is - how do I make this work on Azure? Specifically, how do I specify the database connection string including credentials at runtime when using Azure Reporting Services rather than on-premise.
Details
When you do this on-premise, these are the key steps;
Set your report to use an embedded connection.
Define a parameter, say "ConnectionString"
In your embedded data source, set the Connection string to "[#ConnectionString]" (you can also use the expression builder to construct a connection string from different params but it's the same difference).
In your embedded data source set Credentials to "Do not use credentials".
Make sure you have specified an Execution Account on SSRS
In your ASPX page, do something like:
this.ReportViewer1.ServerReport.SetParameters(new ReportParameter("ConnectionString", connectionString, false));
On SSRS on Azure, you cannot specify an Execution Account so therefore the above doesn't work.
In essence, if you try to set "Do not use credentials" and you do not have an Execution Account specified, you will get this error:
The current action cannot be completed. The user data source credentials do not meet the requirements. Either the user data source credentials are not stored in the report server database, or the user data source is configured not to require credentials but the unattended execution account is not specified. Tracing ID is: XXX. Machine name is XXX. (rsInvalidDataSourceCredentialSetting)
What I have tried
I have tried quite a few different things so far, including specifying dummy credentials in the embedded data source. The only way I can make this work with Azure is if I specify valid database credentials directly in the embedded data source.
I have also seen some advice about trying to use the "SQL Server" type connection string instead of "Azure SQL", but it doesn't seem to make any difference locally and, in any case, I can't deploy to Azure unless I set it to Azure SQL. I have also experimented with this.ReportViewer1.ServerReport.SetDataSourceCredentials, but that doesn't seem to help either; when stepping through the code, I get the error the first time I try to communicate with the report, even if that communication is the call to try to set the credentials.
What I find most frustrating about this is that it seems like an entirely illogical dependency; the database credentials you specify in the embedded data source cannot be used to actually run the report (as they are just SQL credentials), and the Execution Account is a Windows account that cannot access the database. So it looks like an entirely arbitrary dependency with no practical reason whatsoever.
You can use this solution if you are able to create a general-purpose "ReportUser" Windows account on your server and give this account viewer access to your reports.
You can get around this by adding two parameters to your report.
1. #DatabaseServerName
2. #DatabaseName
Then in your report data source set the connection expression to:
="Data Source="+Parameters!DatabaseServerName.Value+";Initial Catalog="&Parameters!DatabaseName.Value
When developing the reports, you should add a TEST data source to the report that points to a valid development endpoint; then, prior to deployment to your production server, set all the datasets to point back to the data source using the expression above.
In order for this to work, you will need to create a "ReportUser" account on your production server and set the credentials of your dynamic data source to this account's username and password.
