How to dynamically specify database credentials with Azure Reporting Services (SSRS) - azure

Summary
I am using SQL Server Reporting Services on Azure. I want to dynamically specify the connection string including the credentials at runtime. I run the reports by embedding a ReportViewer control in an ASPX page.
I can make this work on premise with SSRS 2012 as long as I specify an Execution Account on the SSRS instance. However, on Azure Reporting Services I can't specify an Execution Account, so it doesn't work.
My question is - how do I make this work on Azure? Specifically, how do I specify the database connection string, including credentials, at runtime when using Azure Reporting Services rather than on-premise?
Details
When you do this on-premise, these are the key steps:
Set your report to use an embedded connection.
Define a parameter, say "ConnectionString"
In your embedded data source, set the Connection string to "[#ConnectionString]" (you can also use the expression builder to construct a connection string from different params but it's the same difference).
In your embedded data source set Credentials to "Do not use credentials".
Make sure you have specified an Execution Account on SSRS
In your ASPX page, do something like: this.ReportViewer1.ServerReport.SetParameters(new ReportParameter("ConnectionString", connectionString, false));
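Putting the steps above together, the ASPX code-behind might look roughly like this (a sketch only - the report server URL, report path, and connection string values are placeholders, and ReportViewer1 is assumed to be a Microsoft.Reporting.WebForms.ReportViewer on the page):

```csharp
using Microsoft.Reporting.WebForms;

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // Point the viewer at the report server and report (placeholder values)
        ReportViewer1.ProcessingMode = ProcessingMode.Remote;
        ReportViewer1.ServerReport.ReportServerUrl =
            new Uri("https://myserver/ReportServer");
        ReportViewer1.ServerReport.ReportPath = "/Reports/MyReport";

        // Build the connection string at runtime and pass it as a hidden
        // report parameter (third argument false = not visible to the user)
        string connectionString =
            "Data Source=mydbserver;Initial Catalog=MyDb;" +
            "User Id=reportuser;Password=<placeholder>";
        ReportViewer1.ServerReport.SetParameters(
            new ReportParameter("ConnectionString", connectionString, false));
    }
}
```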
On SSRS on Azure, you cannot specify an Execution Account so therefore the above doesn't work.
In essence, if you try to set "Do not use credentials" and you do not have an Execution Account specified, you will get this error:
The current action cannot be completed. The user data source credentials do not meet the requirements. Either the user data source credentials are not stored in the report server database, or the user data source is configured not to require credentials but the unattended execution account is not specified. Tracing ID is: XXX. Machine name is XXX. (rsInvalidDataSourceCredentialSetting)
What I have tried
I have tried quite a few different things so far, including specifying dummy credentials in the embedded data source. The only way I can make this work with Azure is if I specify valid database credentials directly in the embedded data source.
I have also seen some advice around trying to use the "SQL Server" type connection string instead of "Azure SQL" but it doesn't seem to make any difference locally and, in any case, I can't deploy to Azure unless I set it to Azure SQL. I have also experimented with this.ReportViewer1.ServerReport.SetDataSourceCredentials but that doesn't seem to help either; when stepping through the code I get the error the first time I try to communicate with the report, even if that communication is the call to set the credentials.
What I find most frustrating about this is that it seems like an entirely illogical dependency: the database credentials you specify in the embedded data source cannot be used to actually run the report (as they are just SQL credentials), and the Execution Account is a Windows account that cannot access the database. So it looks like an entirely arbitrary dependency with no practical reason whatsoever.

You can use this solution if you are able to create a general-purpose "ReportUser" Windows account on your server and give this account viewer access to your reports.
You can get around this by adding two parameters to your report.
1. #DatabaseServerName
2. #DatabaseName
Then in your report data source set the connection expression to:
="Data Source="+Parameters!DatabaseServerName.Value+";Initial Catalog="&Parameters!DatabaseName.Value
When developing the reports you should add a TEST data source to the report that points to a valid development endpoint; then, prior to deployment to your production server, set all the datasets to point back to the data source using the expression above.
In order for this to work you will need to create a "ReportUser" account on your production server and set the credentials of your dynamic data source to the username and password of this account.
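With that in place, the viewer only needs to supply the two name parameters at runtime - the credentials stay stored in the data source. A minimal sketch (server and database names are placeholders):

```csharp
using Microsoft.Reporting.WebForms;

// The data source itself stores the "ReportUser" credentials, so no
// Execution Account is needed; only the names are passed dynamically.
ReportViewer1.ServerReport.SetParameters(new[]
{
    new ReportParameter("DatabaseServerName",
        "tcp:myserver.database.windows.net", false),
    new ReportParameter("DatabaseName", "MyDatabase", false)
});
```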

Related

Parameterised datasets in Azure Data Factory

I'm wondering if anyone has any experience in calling datasets dynamically in Azure Data Factory. The situation we have is that we dynamically sweep all tables in from IaaS (on-premise SQL Server installations on an Azure VM) application systems to a data lake. We want to have one pipeline that can pass server name, database name, user name and password to the pipeline's activities. The pipelines will then sweep whatever source they've been told to read from the parameters. The source systems are currently within a separate subscription and domain within our Enterprise Agreement.
We have looked into using the AutoResolveIntegrationRuntime on a generic SQL Server dataset but, as it is Azure and the runtimes on the VMs are self-hosted, it can't resolve and we get 'cannot connect' errors. So,
i) I don't know if this problem goes away if they are in the same subscription and domain?
That leaves whether anyone can assist with:
ii) A way of getting a dynamic runtime to resolve which SQL Server runtime it should use (we have one per VM for resilience purposes, but they can all see each other's instances). We don't want to parameterise a linked service on a particular VM as it places reliance for other VMs on that single VM.
iii) Ability to parameterise a dataset to call a runtime (doesn't look possible in the UI).
iv) Ability to parameterise the source and sink connections with pipeline activities to call a dataset parameter.
Server, database and table names can be made dynamic by using parameters. The key problem here is that not all references in ADF can be parameterized, such as the linked service reference in a dataset or the integrationRuntime reference in a linked service. If you don't have too many self-hosted integrationRuntimes, maybe you can try setting up different pipelines for different networks?
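To illustrate the distinction, here is a sketch of a parameterized SQL Server linked service (all names are placeholders, and in practice the password would come from Key Vault rather than inline). The server and database flow in as parameters, but note that the connectVia integration runtime reference is a fixed reference - this is exactly the limitation described above:

```json
{
  "name": "GenericSqlServer",
  "properties": {
    "type": "SqlServer",
    "parameters": {
      "serverName": { "type": "String" },
      "databaseName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Server=@{linkedService().serverName};Database=@{linkedService().databaseName};User ID=svc_sweep;Password=<placeholder>;"
    },
    "connectVia": {
      "referenceName": "SelfHostedIR-VM1",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```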

Azure Logic Apps - Connection to Azure SQL Server "Bad Gateway" Error

I'm using a logic app to pull tweets from the native Twitter connector, score the sentiment of the tweet, and then store the result in a table within an Azure SQL Server database. The first two steps work fine, but setting up the connection to the SQL Server is giving me trouble. When I set up the connection, I give it a name and then select the database I want from the available ones shown in my Azure subscription, then provide the username and password. After hitting create it asks for a table name - I click the dropdown and it says "Loading" for a while, then shows this:
"Could not retrieve values. BadGateway"
I can't seem to find any details on this error message in the Microsoft docs, is there any way to resolve this?
Make sure your database server allows access to Azure services in the firewall: https://learn.microsoft.com/en-us/azure/sql-database/sql-database-firewall-configure
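One way to set this up (resource names are placeholders) is via the Azure CLI; the special 0.0.0.0 start/end range is what the portal's "Allow access to Azure services" toggle creates under the hood:

```shell
az sql server firewall-rule create \
  --resource-group MyResourceGroup \
  --server mydbserver \
  --name AllowAllWindowsAzureIps \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 0.0.0.0
```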

Windows Azure mobile services, server side scripts: data

I have added some tables to my database in Windows Azure via Entity Framework, however I am not able to access these tables through the server-side scripts (Mobile Services custom API) and they do not appear in the "MOBILE SERVICES: DATA" section. Do I have to add these tables and set permissions on them manually through the portal to get access to them via the scripts etc.? I am sure there is some documentation on this somewhere but have been chasing my tail trying to find it.
The only table that currently appears there is the TodoItem table created by default.
A bit of direction on this would be great.
You need to move it to the schema of your Mobile Services app and add the tables; see: http://blogs.msdn.com/b/jpsanders/archive/2013/05/24/using-an-existing-azure-sql-table-with-windows-azure-mobile-services.aspx
You only need to define the table name through the portal interface; it's pretty easy to use. This is also where you define whether that table requires authentication, and which kind of authentication you will use - also pretty well explained in the interface, so I will leave that to you. After you've done this basic layout in Azure, Entity Framework will take over and define the table details from within your code (depending on version and table type). Something like:
private IMobileServiceSyncTable<MyTable> mySyncTable = App.MobileService.GetSyncTable<MyTable>();
Your table names in Azure must exactly match the class names that you're using in the code to define the tables; this mirroring is how the server maps your data to its intended location in the cloud.
You should now have complete access to your cloud data from the MobileServices API by calling operations on mySyncTable.
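Pulling that together, a sketch of the class-to-table mirroring might look like this (all names assumed; the table "MyTable" must have been defined in the portal first):

```csharp
using Microsoft.WindowsAzure.MobileServices.Sync;

// Class name must exactly match the table name defined in the portal.
public class MyTable
{
    public string Id { get; set; }
    public string Text { get; set; }
}

public class DataAccess
{
    // Mirrors the table defined in the portal onto a local sync table.
    private IMobileServiceSyncTable<MyTable> mySyncTable =
        App.MobileService.GetSyncTable<MyTable>();

    public async Task Demo()
    {
        // Typical operations once the mapping is in place:
        await mySyncTable.InsertAsync(new MyTable { Text = "hello" });
        var items = await mySyncTable.ToListAsync();
    }
}
```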

Changing Azure .cscfg settings on Role Start

I'm trying to create a startup script or task, executed on role start, that changes the configuration settings in the .cscfg file.
I'm able to access these settings, but haven't been able to successfully change them. I'm hoping for pointers on how to change settings on Role Start, or if it's even possible.
Thanks.
EDIT: What I'm trying to accomplish
I'm trying to make a service to more easily manage configuration values on Azure applications. Right now, if I want to change a setting that is the same across 7 different environments, I have to change it in 7 different .cscfg files.
My thought is I can create a webservice, that the application will query for its configuration values. The webservice will look in a storage place, like Azure Tables, and return the correct configuration values. This way, I can edit just one value in Tables, and it will be changed in the correct environments much more quickly.
I've been able to integrate this into a deployment script pretty easily (package the app, get the settings, change the cscfg file, deploy). The problem with that is every time you want to change a setting, you have to redeploy.
Black-o, given that your desire appears to be to manage the connection settings among multiple deployments (30+), I would suggest that perhaps your need would be better met by using a separate configuration store. This could be Azure Storage (tables, or perhaps just a config file in a blob container), a relational database, or perhaps even an external configuration service.
These options require only a minimum amount of information to be placed into the cscfg file (just enough to point at and authorize against the configuration store), and allow you to maintain all the detail settings side by side.
A simple example might use a single storage account, put the configuration settings into Azure Tables, and use a "deployment" ID as the partition key. The config file for a deployment then just needs the connection info for the storage location (unless you want to get by with a shared access signature) and its deployment ID. The roles can then retrieve the configuration settings on role startup and cache them locally for performance improvements (either in a distributed memory cache or perhaps on the temp "local storage" drive for each instance).
The code to pull all this together shouldn't take more than a couple of hours. Just make sure you also account for resiliency in case your chosen configuration provider isn't available.
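The retrieval side of that pattern could be sketched roughly as below - a sketch only, assuming the classic Microsoft.WindowsAzure.Storage table API, a table named "Configuration", and a row layout where the RowKey is the setting name and a "Value" property holds the setting:

```csharp
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static class ConfigStore
{
    // Loads all settings for one deployment, partitioned by deployment ID.
    public static Dictionary<string, string> LoadSettings(
        string storageConnectionString, string deploymentId)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var table = account.CreateCloudTableClient()
                           .GetTableReference("Configuration");

        // All rows in this deployment's partition are settings.
        var query = new TableQuery<DynamicTableEntity>().Where(
            TableQuery.GenerateFilterCondition(
                "PartitionKey", QueryComparisons.Equal, deploymentId));

        var settings = new Dictionary<string, string>();
        foreach (var entity in table.ExecuteQuery(query))
            settings[entity.RowKey] = entity.Properties["Value"].StringValue;
        return settings;
    }
}
```

The result would then be cached locally on role startup, as described above.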
The only way to change the settings during runtime is via Management API - craft the new settings and execute "Update Deployment" operation. This will be rather slow because it honors update domains. So depending on your actual problem there might be a much better way to solve it.

Azure SQL Database naming ambiguity

Our application uses an Azure SQL Database.
Apart from our local dev setup, we have two environments:
Staging (for quality assurance and client testing), and
Production (live)
The Staging database and Production database are stored on two separate SQL Database servers. In both servers, the databases are named the same.
Problem:
Since the server names are automatically and uniquely generated (and are a bunch of randomly generated letters), it is very difficult to distinguish between Staging and Production. Screenshot from the Azure portal below:
This also increases the possibility of pointing to the wrong database when running change scripts, queries, etc. If it was possible to alias/rename the servers, then this wouldn't be a problem, but I know that this isn't possible.
Any suggestions? What do you do in your environment?
If you want readable database URLs you could use custom DNS names for your SQL Azure servers.
So you could CNAME your custom domains like this:
liveDB.mydomain.com to random2323LIVE32323.database.windows.net
stageDB.mydomain.com to random43435STAGE34.database.windows.net
But there is one caveat:
You still need the server name, because you need to log in as user@random2323LIVE32323.
Anyway, if you use this scenario, the worst case is a rejected login if you mix up the real server names.
For a detailed explanation see here
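To make the caveat concrete, a hypothetical connection string using the CNAME might look like this (database and user names are placeholders) - note that the User ID still embeds the real server name:

```
Server=tcp:liveDB.mydomain.com,1433;Database=MyDb;
User ID=myuser@random2323LIVE32323;Password=<placeholder>;Encrypt=True;
```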
Although it's a bit more administrative work, I typically recommend different Live IDs for Stage vs. Prod. That's because I normally want a different set of administrators for my cloud services. If you name the Live IDs PRODAppName and STGAppName, you can make yourself the co-admin on both and simply use the Filter capability of the portal to see only PROD or STG when you need to know which service is which. Hopefully this makes sense.
