I have 140 users whose sign-ins I want to query for the past 90 days. I see nothing in the Microsoft documentation that lets me target either a) a CSV of UPNs or b) cloud-only Azure user accounts (i.e. On-premises sync enabled = "No").
Seeking a query that gets me started in the right direction...
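Something along these lines is what I'm picturing (untested sketch; it assumes the sign-in logs are streamed to a Log Analytics workspace and that the 140 UPNs sit in a one-column CSV reachable via a SAS URL; the storage URL below is a placeholder):
// Untested sketch: sign-ins over the past 90 days, limited to UPNs from a CSV.
let targetUsers = externaldata(UserPrincipalName: string) [
    h@"https://<storageaccount>.blob.core.windows.net/<container>/upns.csv?<sas-token>"
] with (format="csv");
SigninLogs
| where TimeGenerated > ago(90d)
| where UserPrincipalName in~ (targetUsers)
| project TimeGenerated, UserPrincipalName, AppDisplayName, ResultType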
thanks
I need the near real-time front end data from a web app for use in PowerBI. I need to keep this data forever.
I would like to automatically export the App customEvents and pageViews tables for this purpose.
It seems like I need to go from Azure Logs -> Azure Storage Account -> Azure SQL Server -> PowerBI
The steps I'm having trouble with are getting the logs into storage, and then getting the data that lands there into a SQL server.
To send logs to a Storage Account, Event Hubs, or Log Analytics, go to the App Service, select Diagnostic settings in the left panel, and click + Add diagnostic setting.
Choose Archive to a storage account as the destination, select the log categories you want, and click Save.
You can now use the Azure Data Factory service to copy the logs from the Azure Storage account to an Azure SQL Database.
Refer to the Microsoft tutorial Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool to implement this.
Once the data is available in the database, you can use Power BI to read it.
Open Power BI and click Get data from another source.
Select Azure -> Azure SQL Database and click Connect.
Enter the server name.
In the next step, enter the username and password for your account to get access.
You can then select data from any table and present it in the Power BI dashboard as your requirements dictate.
I wanted to check with you all just to confirm some final numbers.
I would like to migrate around 20 TB of data (around 40 million files) into Azure File Share or Blob.
I am getting a little confused with the Azure cost estimator, since it talks about Azure transaction/operation fees and storage at rest, but not about the cost of migrating data from an on-prem file server to Azure Files.
Am I wrong to assume that all data migrations to Azure Files (Hot, Cool, Transaction Optimized) and Azure Blob (Archive or Cool) are free of charge, and that costs only start after the migration is completed and users begin interacting with the data?
Please let me know what the correct answer is here.
Thank you all in advance!
There's nothing special about migration. If you copy files from on-prem to Azure, the normal billing meters will charge you for it: each upload counts as billable write transactions, and storage-at-rest charges begin as soon as the data lands. You could use something like Azure Data Box to migrate the data, but that's a separate service with its own billing.
I have a working query for my app data to be analyzed.
Currently it analyzes the last two weeks of data using ago(14d).
Now I want to use a value containing the release date of the app's current version. Since I haven't found a way to add a new table to the existing Log Analytics database that holds the log data, I created a new database in Azure and entered my data there.
Now I just don't know whether I can access that database at all from within the web query interface of Azure Log Analytics, or whether I have to use some other tool for that.
I hope that somebody can help me with this.
As always with Azure there is a lot of material to read, but nothing concrete for my issue (or at least I haven't found it yet).
And yes, I know how to insert the data into the query with a let statement, but since I want to use the same data in different queries, an external location that can be accessed from all the queries would be the solution I prefer.
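For illustration, the let-based version I use today looks roughly like this (simplified sketch; the date is a placeholder, and the table/column names follow the classic Application Insights schema my app logs to):
// Simplified sketch: the release date is hard-coded into every query today.
let currentReleaseDate = datetime(2023-05-01); // placeholder value
customEvents
| where timestamp > ago(14d)
| where timestamp >= currentReleaseDate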
Thx in advance.
Maverick
You cannot access a DB directly. You are better off using a CSV/JSON file in blob storage. In the following example I uploaded a txt file with CSV data like this:
2a6c024f-9093-434c-b3b1-000821a15b1a,"Customer 1"
28a658a8-5466-45ea-862c-003b20507dd4,"Customer 2"
c46fb949-d807-4eea-8de4-005dd4beb39a,"Customer 3"
e05b67ee-ff83-4805-b004-0064449f196c,"Customer 4"
Then I can reference this data from Log Analytics / Application Insights in a query like this, using the externaldata operator:
// Load the customer list from the CSV in blob storage via a SAS URL.
let customers = externaldata(id: string, companyName: string) [
    h@"https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx"
] with (format="csv");
// Join the external customer list onto the request telemetry.
requests
| extend CompanyId = tostring(customDimensions.CustomerId)
| join kind=leftouter (customers) on $left.CompanyId == $right.id
The URL, including the SAS token, was created with Microsoft Azure Storage Explorer: select the blob, right-click -> Get Shared Access Signature, create the SAS in the popup, and copy the URI.
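To sanity-check that the blob is readable before wiring it into the join, you can run the externaldata expression on its own (same SAS URL as above, shortened here):
// Preview the external CSV by itself first.
let customers = externaldata(id: string, companyName: string) [
    h@"https://xxx.blob.core.windows.net/myblob.txt?sv=...&sig=xxx"
] with (format="csv");
customers
| take 10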
I know Log Analytics uses Azure Data Explorer on the back end, and Azure Data Explorer has a feature for using external tables within queries, but I am not sure whether Log Analytics supports external tables.
External Tables in Azure Data Explorer
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
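For completeness, if you query an Azure Data Explorer cluster directly, an external table defined there is referenced with the external_table() function; a minimal sketch, assuming an external table named MyExternalTable has already been created on the cluster:
// Hypothetical external table already defined on the ADX cluster.
external_table("MyExternalTable")
| take 10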
I have two Azure accounts, A and B
I have a SQL Server database created under the subscription of Account A.
I want to move this database to the subscription of Account B.
You can do it all in SQL Server Management Studio (SSMS):
Connect to both Azure accounts in SSMS.
Right-click the source database and choose Tasks -> Export Data-tier Application.
Save it as a .bacpac file on your local disk.
In the target connection, go to Databases, right-click, and choose Import Data-tier Application.
Choose Import from local disk and browse to the .bacpac file you just created.
There is another way, similar to what hawbsl described, except that instead of exporting the data-tier application to local disk, you export it to Azure blob storage.
So export the database in Account A to Azure blob storage in Account B.
Then, in Account B, go to your SQL server and click Import database.
This link describes how to import a database from a blob file.
There are two things that might help:
Moving resources between subscriptions
Subscription ownership transfer (to a different billing account)
I don't know anything about the latter, so unless somebody else has tips, it might be best to contact Support, if that's what you need.
In order to move resources between subscriptions, you'll need to make sure:
Both source and target subscriptions need to be in the same directory
One user account must have access to create and delete resources in both subscriptions
You need to move all SQL databases on that server at once
There may be other requirements, but those are the main ones I suspect you'll hit.
If the subscriptions are in different directories, you can move the SQL server to a temporary trial subscription, then move that subscription to the target directory (from the old portal), and finish the move in the new portal using the target directory.
If you're moving resources between organizations that use AAD accounts, you'll likely need to grant a Microsoft Account (e.g. outlook.com) access to each subscription and perform the actual move operation with that user. Don't forget to delete the subscription and remove the temporary user account from both subscriptions and directories after you finish the move.
Hopefully, that should be it. Of course, you could always just create a backup and restore that to a new server :-P
EDIT:
Have you looked into SSIS?
https://msdn.microsoft.com/en-us/library/ms141204.aspx
Our customer has a daily 4 GB data file (.csv) to be uploaded to blob storage (Windows Azure).
After upload (by a web role) we want the CSV file to be processed into a SQL Azure database (processing/converting is done by a worker role).
After processing, it must be consumed by the Microsoft Azure Marketplace / DataMarket.
Unfortunately, the information for content providers (like us in this case) is very sparse.
My question: can any SQL Azure database be hosted in the Windows Azure Marketplace / DataMarket? Or even better: what are the requirements for content providers regarding SQL Azure DBs?
Have you looked at the Data Publishing Kit?
Strictly speaking, no: you cannot have your database hosted in the marketplace. Regarding your second question, you need to supply the DataMarket team with the valid parameters for data retrieval and the connection strings to your service.