Dynamic Data Source at run time based on Active Directory account - SharePoint

I have a set of SSRS reports that I want to make available to multiple customers. Each customer has their own database. There is a shared data source that all the reports access. Each customer's database has an identical schema and objects; only the data is different. Each customer has an Active Directory login.
Is there a way for the shared data source to dynamically change which database it connects to based on the user accessing the report? Is there a way to do this using SSRS integrated with SharePoint?
The only other solution I've seen is passing in the server/database as parameters, which would require an application and use of the web service.

SharePoint mode might offer some (AD) authentication and data source capabilities I don't know about, but you could combine the parameter solution you mention with the User!UserID global and build a connection string dynamically based on it.
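For example (note that expression-based connection strings only work on an embedded data source, not a shared one), and assuming each customer database happens to be named after the AD login with the backslash replaced (an assumption on my part), the data source's connection string could be an expression like:

="Data Source=MyServer;Initial Catalog=" & Replace(User!UserID, "\", "_")

In practice you would more likely need a mapping from login to database name; a small lookup table queried through a hidden parameter is a common way to do that.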
Here's an alternative solution with a single source report, deployed once per client:
create one report folder per database
use a shared data source, one per database (located in that folder)
on first deploy, configure the data source connection for each database
set OverwriteDatasources to false
set up a deployment process (using multiple configurations, or perhaps a script) that deploys the source report to all customer folders (see the sketch after this list)
update your site so each user gets to see a report from the corresponding customer folder
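If you go the script route, here is a minimal C# sketch against the ReportService2010 web service, assuming a proxy class generated from the .asmx endpoint; the server URL, folder names, and report name are placeholders:

// Deploy one source report to every customer folder via the SSRS web service.
// Assumes a ReportingService2010 proxy generated from
// http://yourserver/ReportServer/ReportService2010.asmx (placeholder URL).
using System.IO;
using System.Net;

var rs = new ReportingService2010
{
    Url = "http://yourserver/ReportServer/ReportService2010.asmx",
    Credentials = CredentialCache.DefaultCredentials
};
byte[] definition = File.ReadAllBytes("CustomerReport.rdl");
foreach (var folder in new[] { "/CustomerA", "/CustomerB" })
{
    Warning[] warnings;
    // Overwrite only the report; each folder's shared data source is left
    // untouched, so it keeps pointing at that customer's database.
    rs.CreateCatalogItem("Report", "CustomerReport", folder, true,
                         definition, null, out warnings);
}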

Related

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob storage.
To achieve this, I am creating a new pipeline in Azure Data Factory using the Azure portal. What are the possible ways to copy files from SharePoint to Azure Blob storage using Azure Data Factory pipelines?
I have looked at all the linked service types in the Azure Data Factory pipeline but couldn't find any suitable type to connect to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can directly call an Azure Function now.
You can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate, provide a username and password (or Azure Key Vault details).
Note: use a self-hosted integration runtime.
You can use a Logic App to fetch data from SharePoint and load it into Azure Blob storage, and then use Azure Data Factory to fetch the data from the blob. You can even set an event trigger so that when any file arrives in the blob container, the pipeline runs automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in a SharePoint folder
Use whichever of the suggested triggers fits your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details as per your use case
This way, all the SharePoint files are copied to the blob without even using ADF.
My previous answer was true at the time, but in the last few years Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity that copies the data with the HTTP connector as its source.
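For reference, here is a rough C# sketch of the token call that the Web activity performs, assuming a SharePoint app registration that has been granted access to the site; tenantId, tenantName, clientId, and clientSecret are placeholders you must supply:

using System;
using System.Collections.Generic;
using System.Net.Http;

var tenantId = "<tenant-guid>";          // placeholder
var tenantName = "mytenant";             // placeholder
var clientId = "<client-id>";            // placeholder
var clientSecret = "<client-secret>";    // placeholder

var http = new HttpClient();
var body = new FormUrlEncodedContent(new Dictionary<string, string>
{
    ["grant_type"] = "client_credentials",
    ["client_id"] = $"{clientId}@{tenantId}",
    ["client_secret"] = clientSecret,
    // 00000003-0000-0ff1-ce00-000000000000 is the SharePoint Online principal ID
    ["resource"] = $"00000003-0000-0ff1-ce00-000000000000/{tenantName}.sharepoint.com@{tenantId}"
});
var response = await http.PostAsync(
    $"https://accounts.accesscontrol.windows.net/{tenantId}/tokens/OAuth/2", body);
// The JSON response contains access_token, which the Copy activity then sends
// as an Authorization: Bearer header to the SharePoint REST endpoint.
Console.WriteLine(await response.Content.ReadAsStringAsync());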
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default buffer size limit of 100 MB, and the Get File Content action doesn't natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.

Azure SQL Database sandbox based on production

I have a large Azure SQL database. I need to provide a team with a sandbox that is a copy of the database but allows them to create SQL objects. The data in the sandbox needs to be kept up to date with production. I used elastic queries, but the performance is not ideal. I've looked at Data Sync, but the company requires AD authentication. Periodically restoring production as the sandbox is not ideal either, as the team does not want to lose their work. Any suggestions? I'm sure I must be overlooking something.
I would first make a copy of the production database, then create a "From the Hub" sync group.
1. Copy Database
You can easily create a copy of an Azure SQL database by going to the database blade and clicking "Copy" in the header. From there it will ask you for the new database name and target server. You can put it on the same server or create a new server; that is up to you.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-copy
Once you've done that, you now have a "sandbox" database you control which would be an exact copy of production.
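If you prefer to script the copy, a minimal T-SQL equivalent (run against the master database of the target server; the server and database names are placeholders) is:

-- Creates a transactionally consistent copy of the production database.
CREATE DATABASE SandboxDb AS COPY OF myprodserver.ProductionDb;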
2. Sync Group
After that, you can sync specific tables from production to the sandbox by creating an Azure SQL "Sync Group".
You want to initiate this from your production database since that is the source (or hub) database, so go to the database blade of your production database and choose "Sync to other databases".
Click on "New Sync Group". From there it will ask you for a sync group name which could be something like "SyncSandbox".
Select your member database(s): this will be your sandbox database, so choose "Use Existing Database" and select your sandbox database.
Choose your sync direction. This is important, since you only want to sync from production to the sandbox, select "From the Hub".
Finally, you can configure the sync group. On the Tables page, select a database from the list of sync group members and select Refresh schema. Once you're done, select Save. You can also go into the properties and set the sync frequency if you want it to run automatically.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-get-started-sql-data-sync
The one thing that worries me is that you mentioned your team wants to keep their work. I don't know how that would be possible: imagine you copy the database, your team creates a new customer with an id of, say, 31, and then the same thing happens in production; how would you resolve those conflicts? Setting that aside, I would recommend the following:
Set up database replication
Create a job (a Logic App or an Azure Function) that executes this command against that replica (see the sketch after this list):
CREATE DATABASE Database2 AS COPY OF Database1;
I am not sure, but you will probably need to run a command to make this database writable, since if you copy a replica it will be read-only.
Run a script to replace all sensitive data.
Keep in mind that you will have downtime, so it would probably be better to run this job every morning; that way, when the team starts they will have fresh data.
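Here is a rough sketch of such a job as a timer-triggered Azure Function in C# (in-process model); the schedule, names, and connection string are placeholders, and note that CREATE DATABASE ... AS COPY OF is asynchronous, so a real job would also poll sys.dm_database_copies for completion:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Logging;

public static class RefreshSandbox
{
    // Runs every day at 05:00 so the team has fresh data each morning.
    [FunctionName("RefreshSandbox")]
    public static void Run([TimerTrigger("0 0 5 * * *")] TimerInfo timer, ILogger log)
    {
        // The connection string must target the master database of the server
        // where the sandbox copy will live (placeholder app setting name).
        var cs = Environment.GetEnvironmentVariable("MasterDbConnectionString");
        using var conn = new SqlConnection(cs);
        conn.Open();
        using (var drop = new SqlCommand("DROP DATABASE IF EXISTS Database2;", conn))
            drop.ExecuteNonQuery();
        using (var copy = new SqlCommand("CREATE DATABASE Database2 AS COPY OF Database1;", conn))
            copy.ExecuteNonQuery();
        log.LogInformation("Sandbox refresh started.");
    }
}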
There are more options for how to copy in the documentation.

Initial remote data - Xamarin Forms

I have a Xamarin Forms application, and I have to get initial remote data (with images, maybe as URLs) and save that data as a cache in my app. Every time the application starts, the data has to be refreshed; if it can't be, the cached data should be used.
So far I have looked at Easy Tables, but it seems that their focus is on saving user data in the cloud, and I don't want to do that.
I only want to get the initial data for the application, cache it, and refresh it every time the app starts.
I didn't find a scenario with Easy Tables where the app administrator loads the initial data (maybe via REST calls) and then the app only consumes that data without modifying it.
Could you give some advice on how to do this using Azure?
Thanks!
So far I have looked at Easy Tables, but it seems that their focus is on saving user data in the cloud, and I don't want to do that.
Easy Tables work with a Node.js backend: you just add the table and the backend is created for you automatically. With Offline Data Sync, you can create and modify data in your local store (e.g. SQLite) when your app is offline, then when your app is online push local changes to your server or pull changes from your server into your local store. This could be an approach for you: just pull the data from the server and only read data from your local store.
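A minimal sketch of that read-only pattern, assuming the Azure Mobile Apps client packages (Microsoft.Azure.Mobile.Client and Microsoft.Azure.Mobile.Client.SQLiteStore); the Item DTO and service URL are placeholders:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;
using Microsoft.WindowsAzure.MobileServices.SQLiteStore;
using Microsoft.WindowsAzure.MobileServices.Sync;

public class Item
{
    public string Id { get; set; }
    public string Text { get; set; }
}

public class DataCache
{
    readonly MobileServiceClient client =
        new MobileServiceClient("https://yourapp.azurewebsites.net"); // placeholder
    IMobileServiceSyncTable<Item> table;

    public async Task InitAsync()
    {
        var store = new MobileServiceSQLiteStore("cache.db");
        store.DefineTable<Item>();
        await client.SyncContext.InitializeAsync(store);
        table = client.GetSyncTable<Item>();
    }

    public async Task<IEnumerable<Item>> GetItemsAsync()
    {
        try
        {
            // Pull the latest server data into the local SQLite store.
            await table.PullAsync("allItems", table.CreateQuery());
        }
        catch (Exception)
        {
            // Offline (or server unreachable): fall back to the cached rows.
        }
        return await table.ToEnumerableAsync();
    }
}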
I have a Xamarin Forms application, and I have to get initial remote data (with images, maybe as URLs) and save that data as a cache in my app.
I didn't find a scenario with Easy Tables where the app administrator loads the initial data (maybe via REST calls) and then the app only consumes that data without modifying it.
Per my understanding, if your initial data is mostly images and settings, without any sensitive data, I would assume you could just leverage Azure Blob storage for storing the data (image URLs or settings within a *.json file), or Azure Table storage, and use the related client SDK to retrieve the data and store it in your local SQLite db or in files.
I would prefer blob storage, and you can control access (anonymous access or delegated access permissions) to your blob data. For more details, refer to Managing security for blobs.
You absolutely can do that with a Sync Table.
https://learn.microsoft.com/en-us/azure/app-service-mobile/app-service-mobile-xamarin-forms-get-started-offline-data
Just do a PullAsync in the splash screen to retrieve the values. You don't need to use the POST methods, and you can even remove them (or return errors) in your Azure TableController.
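A sketch of what that read-only controller could look like, assuming the standard Azure Mobile Apps .NET server scaffold (the Item DTO and MobileServiceContext names are placeholders):

using System.Linq;
using System.Web.Http.Controllers;
using Microsoft.Azure.Mobile.Server;

public class Item : EntityData
{
    public string Text { get; set; }
}

public class ItemController : TableController<Item>
{
    protected override void Initialize(HttpControllerContext controllerContext)
    {
        base.Initialize(controllerContext);
        var context = new MobileServiceContext(); // placeholder EF context
        DomainManager = new EntityDomainManager<Item>(context, Request);
    }

    // Expose only the read side (GET tables/Item) so clients can PullAsync.
    public IQueryable<Item> GetAllItems() => Query();

    // No Post/Patch/Delete methods: the table is effectively read-only.
}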

Unable to create Easy Tables in Azure

I have created a Mobile App and a database in Microsoft Azure. Now I am trying to create Easy Tables from the Mobile App, but I am getting the error message "You need a database to use Easy Tables. Click here to create one."
Even though I have an existing database, Easy Tables doesn't list it.
I have mapped a Data Connection to the mobile app.
It would be great if anyone could help; I am new to Azure.
Your connection string must be created with the name MS_TableConnectionString.
Just because you have a database doesn't mean it is linked. Click on Data Connections, then Add, then add your existing SQL database.
Note that Easy Tables won't recognize the existing tables unless you add them through Easy Tables. There are notes around the format of Id (it needs to be a string) and other fields.
When you created your database server, did you check "Allow Azure services to access server"? That could be why you cannot see the database listed.

Business Data Connectivity Service

I am reading this article:
https://technet.microsoft.com/en-us/library/jj683108.aspx
On step 6b it says to:
In the Database area, leave the prepopulated values for Database Server, Database Name, and Database authentication, which is Windows authentication (recommended) unless you have specific design needs to change them.
However, my question is this: the prepopulated value for the database server is the SharePoint server. Why would I want that as the default if I am trying to set up an external database for external content types?
I want to make sure my understanding is correct before I make a change at this high a level of SharePoint.
The document you are referring to is about setting up the BDC service. The database mentioned in step 6 is not the database containing the external data, but the internal database of the BDC service, which holds the service's configuration data.
Setting up a new connection for a particular data source is another step (following the BDC service setup). You can definitely connect to remote servers.
