Could you please help us answer what we believe is a rather simple question, but one that is proving really difficult to solve? We have explored options such as Databricks and Snowflake for our Azure-based data warehouse, but we keep getting stuck at the same point.
Do you have any information you could share on how you would move data from one Azure database (source) to another Azure database (destination) without using SSIS?
We would appreciate any info you would be able to share with us on this matter.
Looking forward to hearing from you
Thanks
Dom
I am trying to automate importing refund data into Google Analytics 4. I can't find good documentation on importing this data using the Analytics Management API. I came across https://www.npmjs.com/package/@google-analytics/data, which seems good for pulling reports from GA, but I couldn't find a way to do data import.
I am writing a Node.js script and was hoping someone has encountered this scenario before and could share how they accomplished it. Any help or a pointer in the right direction will be really appreciated.
The GA4 alternative to the UA Analytics Management API is the Google Analytics Admin API.
To my knowledge it doesn't support data import at this time. The API is still under development, so the feature may come in the future, but there is no way to know.
I would suggest looking at the Measurement Protocol for GA4; you may be able to use that instead. A minimal sketch follows.
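Since you mentioned a Node.js script, here is a rough sketch of sending a refund event through the GA4 Measurement Protocol. The measurement ID, API secret, and client ID are placeholders you would supply yourself (the API secret is created under the data stream's settings in the GA4 UI), and this assumes Node 18+ for the global `fetch`:

```typescript
// Minimal sketch: sending a refund event to GA4 via the Measurement Protocol.
// G-XXXXXXXXXX and YOUR_API_SECRET are placeholders.

const MEASUREMENT_ID = "G-XXXXXXXXXX"; // placeholder
const API_SECRET = "YOUR_API_SECRET";  // placeholder

async function sendRefund(clientId: string, transactionId: string, value: number) {
  const url =
    `https://www.google-analytics.com/mp/collect` +
    `?measurement_id=${MEASUREMENT_ID}&api_secret=${API_SECRET}`;

  const res = await fetch(url, {
    method: "POST",
    body: JSON.stringify({
      client_id: clientId, // should match an existing GA client ID
      events: [
        {
          name: "refund", // standard GA4 ecommerce event name
          params: {
            currency: "USD",
            transaction_id: transactionId,
            value,
          },
        },
      ],
    }),
  });
  if (!res.ok) throw new Error(`GA4 MP request failed: ${res.status}`);
}

sendRefund("123456.7654321", "T-1001", 19.99).catch(console.error);
```

Note that the collect endpoint accepts almost anything silently, so it's worth validating payloads against the debug endpoint (`/debug/mp/collect`) while developing.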
I have a Sentinel instance running in Azure, but not enough data to test the full functionality of Sentinel.
Doing some research, I came across the Azure Sentinel GitHub repository:
https://github.com/Azure/Azure-Sentinel/tree/master/Sample%20Data
This has enough sample data for testing, getting my hands dirty, and trying to understand the full power of Sentinel and how to leverage it. But I was wondering whether there is a way to import those CSV files into the Sentinel portal.
I hope my question is clear; if not, please don't hesitate to ask for more details.
Thank you so much for any help you can provide.
You can import them as custom logs:
https://learn.microsoft.com/en-us/azure/azure-monitor/agents/data-sources-custom-logs#define-a-custom-log
Additionally, check out the "Training Lab" solution in Content Hub in the Sentinel console. Installing this will populate data.
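If you want to push the CSV rows programmatically rather than collecting them with an agent, one route (assuming your workspace still supports the legacy HTTP Data Collector API that backs classic custom logs) looks roughly like this in Node.js. The workspace ID and shared key are placeholders from the workspace's Agents management page, and the `SentinelSample` log type is a hypothetical name that would surface as a `SentinelSample_CL` table:

```typescript
// Sketch: pushing parsed CSV rows into Log Analytics (and thus Sentinel) as a
// custom log via the legacy HTTP Data Collector API. Requires Node 18+.
import { createHmac } from "node:crypto";

const WORKSPACE_ID = "<workspace-id>"; // placeholder
const SHARED_KEY = "<primary-key>";    // placeholder (base64)
const LOG_TYPE = "SentinelSample";     // hypothetical; becomes SentinelSample_CL

// Authorization header: HMAC-SHA256 over a canonical string, per the API docs.
function buildSignature(body: string, date: string): string {
  const stringToSign =
    `POST\n${Buffer.byteLength(body, "utf8")}\napplication/json\n` +
    `x-ms-date:${date}\n/api/logs`;
  const hmac = createHmac("sha256", Buffer.from(SHARED_KEY, "base64"))
    .update(stringToSign, "utf8")
    .digest("base64");
  return `SharedKey ${WORKSPACE_ID}:${hmac}`;
}

async function postRows(rows: Record<string, string>[]): Promise<void> {
  const body = JSON.stringify(rows);
  const date = new Date().toUTCString(); // RFC 1123 date
  const res = await fetch(
    `https://${WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": buildSignature(body, date),
        "Log-Type": LOG_TYPE,
        "x-ms-date": date,
      },
      body,
    },
  );
  if (!res.ok) throw new Error(`Ingestion failed: ${res.status}`);
}

// Naive CSV-to-JSON conversion (assumes the sample files have no quoted commas).
function csvToRows(csv: string): Record<string, string>[] {
  const [header, ...lines] = csv.trim().split("\n");
  const cols = header.split(",");
  return lines.map(line => {
    const cells = line.split(",");
    return Object.fromEntries(cols.map((c, i) => [c.trim(), cells[i]?.trim() ?? ""]));
  });
}

// usage sketch: postRows(csvToRows(readFileSync("SampleData.csv", "utf8")))
```

Newer workspaces may steer you toward the Logs Ingestion API with data collection rules instead, but the custom-logs article linked above matches this older approach.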
We have a cube that contains 1.6 years of data, and it is taking a long time to load. Previously we got a memory error, but we have since increased the SAP memory size. Can anyone suggest ways to troubleshoot this, or any best practices we can follow?
We are currently pulling 30-35 combinations of dimensions and characteristics, and it is still taking a lot of time; we cannot afford to wait for the error to occur before acting on it.
This is an internal MDX limitation that you have to live with. To mitigate it, you will have to use filters or variables to restrict the returned volume. If you don't mind moving the data out of SAP onto Azure storage first, you will get a much better user experience by pointing Power BI at Azure SQL Data Warehouse, Azure SQL Database, or even Blob storage. Otherwise, you will be stuck with the SAP bottleneck.
Because Power BI and ADF share the same underlying engine to access SAP BW, you can check out our blog for a comparison and further explanation in the context of ADF and BW integration:
Our team has recently started using Application Insights to add telemetry to our Windows desktop application. This data is sent almost exclusively in the form of events (rather than page views, etc.). Application Insights is useful only up to a point; to answer anything other than basic questions, we are exporting to Azure storage and then using Power BI.
My question is one of data structure. We are new to analytics in general and have just been reading about star/snowflake schemas for data warehousing. These look like they might help provide the answers we need.
My question is quite simple: is this the right approach? Have we overcomplicated things? My current feeling is that a better approach would be to pull the latest data and transform it into a SQL database of facts and dimensions for Power BI to query. Does this make sense? Is this what other people are doing? We have realised that this is more work than we initially thought.
Definitely pursue Michael Milirud's answer; if your source product has suitable analytics, you might not need a data warehouse.
Traditionally, a data warehouse has three advantages: it integrates information from different data sources, both internal and external; data is cleansed and standardised across sources; and the history of change over time ensures that data is available in its historic context.
What you are describing is becoming a very common case in data warehousing, where star schemas are created for access by tools like Power BI, Qlik or Tableau. In smaller scenarios the entire warehouse might be held in the Power BI data engine, but larger data might need pass-through queries.
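As a rough illustration of the fact/dimension split being discussed, here is a minimal sketch in TypeScript of what your exported Application Insights events might map onto. Every field name here is hypothetical; adapt them to your own event payloads:

```typescript
// Illustrative star-schema shapes for exported Application Insights events.
// All field names are hypothetical examples, not a prescribed model.

interface DimDate {
  dateKey: number; // surrogate key, e.g. 20240115
  date: string;    // ISO date
  month: number;
  year: number;
}

interface DimUser {
  userKey: number; // surrogate key
  anonymousUserId: string;
  country: string;
}

interface FactEvent {
  dateKey: number;    // foreign key -> DimDate
  userKey: number;    // foreign key -> DimUser
  eventName: string;  // degenerate dimension
  durationMs: number; // example measure to aggregate
}

// Each raw exported event becomes one fact row plus dimension lookups,
// so Power BI can slice measures by date and user attributes.
function toFact(
  raw: { timestamp: string; userId: string; name: string; durationMs: number },
  dateKeyOf: (iso: string) => number,
  userKeyOf: (id: string) => number,
): FactEvent {
  return {
    dateKey: dateKeyOf(raw.timestamp),
    userKey: userKeyOf(raw.userId),
    eventName: raw.name,
    durationMs: raw.durationMs,
  };
}
```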
In your scenario, you might be interested in some tools that appear to handle at least some of the migration of Application Insights data:
https://sesitai.codeplex.com/
https://github.com/Azure/azure-content/blob/master/articles/application-insights/app-insights-code-sample-export-telemetry-sql-database.md
Our product Ajilius automates the development of star schema data warehouses, cutting development time to days or weeks. There are a number of other products doing a similar job; we maintain a complete list of industry competitors to help you choose.
I would continue with Power BI - it actually has a very sophisticated and powerful data integration and modeling engine built in. Historically I've worked with SQL Server Integration Services and Analysis Services for these tasks - Power BI Desktop is superior in many aspects. The design approaches remain consistent - star schemas etc. - but you build them in-memory within PBI. It's way more flexible and agile.
Also, are you aware that Application Insights can be connected directly to the Power BI web service? This connects to your AI data in minutes and gives you ready-to-use Power BI content (dashboards, reports, datasets). You can customize these and build new reports from the datasets.
https://powerbi.microsoft.com/en-us/documentation/powerbi-content-pack-application-insights/
What we ended up doing was not sending events from our WinForms app directly to Application Insights, but to an Azure Event Hub.
We then created a job that reads from the Event Hub and sends the data to:
- Application Insights, using the SDK
- Blob storage, for later processing
- Azure Table storage, to create Power BI reports
You can of course add more destinations.
So basically all events are sent to one destination and from there stored in several destinations, each for its own purpose. We definitely did not want to be restricted to 7 days of raw data; storage is cheap, and blob storage can be used with many Azure and Microsoft analytics solutions.
The Event Hub can be linked to Stream Analytics as well; a sketch of the fan-out consumer follows.
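For illustration, here is roughly what that fan-out job could look like with the `@azure/event-hubs` package. The connection string and hub name are placeholders, and the three destination helpers are stubs standing in for the Application Insights SDK, blob writes, and table writes:

```typescript
// Sketch of the fan-out job described above (npm i @azure/event-hubs).
import { EventHubConsumerClient } from "@azure/event-hubs";

const CONNECTION = "<event-hub-connection-string>"; // placeholder
const HUB_NAME = "telemetry";                       // placeholder

// Stub destinations - replace with the Application Insights SDK,
// @azure/storage-blob, and @azure/data-tables respectively.
async function sendToAppInsights(event: unknown): Promise<void> { /* ... */ }
async function appendToBlob(event: unknown): Promise<void> { /* ... */ }
async function writeToTable(event: unknown): Promise<void> { /* ... */ }

const consumer = new EventHubConsumerClient("$Default", CONNECTION, HUB_NAME);

consumer.subscribe({
  // Each event is fanned out to every destination; a failure in one
  // destination should not block the others.
  processEvents: async (events) => {
    for (const e of events) {
      const results = await Promise.allSettled([
        sendToAppInsights(e.body),
        appendToBlob(e.body),
        writeToTable(e.body),
      ]);
      results
        .filter((r): r is PromiseRejectedResult => r.status === "rejected")
        .forEach(r => console.error("destination failed:", r.reason));
    }
  },
  processError: async (err) => {
    console.error("event hub error:", err);
  },
});
```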
More information about Event Hubs can be found at https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/
You can start using the recently released Application Insights Analytics feature. In Application Insights you can now write any query you like to get more insight out of your data. Analytics runs your queries in seconds, lets you filter, join, and group by any property, and you can also run these queries from Power BI.
More information can be found at https://azure.microsoft.com/en-us/documentation/articles/app-insights-analytics/
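Analytics queries can also be run programmatically over the Application Insights REST API. A minimal sketch, assuming an application ID and API key created under the API Access blade in the portal (both placeholders here), with a sample query over `customEvents`:

```typescript
// Sketch: running an Analytics (Kusto) query via the Application Insights
// REST API. APP_ID and API_KEY are placeholders. Requires Node 18+.

const APP_ID = "<application-id>"; // placeholder
const API_KEY = "<api-key>";       // placeholder

async function runQuery(query: string): Promise<unknown> {
  const res = await fetch(
    `https://api.applicationinsights.io/v1/apps/${APP_ID}/query`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "x-api-key": API_KEY,
      },
      body: JSON.stringify({ query }),
    },
  );
  if (!res.ok) throw new Error(`Query failed: ${res.status}`);
  return res.json(); // shape: { tables: [{ name, columns, rows }] }
}

// Example: event counts by name over the last 7 days.
runQuery("customEvents | where timestamp > ago(7d) | summarize count() by name")
  .then(r => console.log(JSON.stringify(r, null, 2)))
  .catch(console.error);
```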
I have Table storage in Azure in which one of the tables is growing rapidly, and I need to archive any data older than 90 days. From reading online, the only solution I can find is the eventually consistent transactions pattern: https://azure.microsoft.com/en-us/documentation/articles/storage-table-design-guide/. Although the document uses an employee table as its example and can help me achieve my objective, I am posting this question to find out whether there is a better solution.
Please note I am very new to Azure, so I might be missing a very easy way to achieve this.
Regards Tarun
The guide you are referencing is a good source. Note that if you are using tables for logging data (where archiving older data is a common requirement), you might want to look at blob storage instead; see the log data pattern in the guide you reference. By the way, AzCopy can also be used to export the data either to a blob or to the local file system. See here for more information: https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/#copy-entities-in-an-azure-table-with-azcopy-preview-version-only. A sketch of the archive-and-delete loop follows.
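To make the eventually consistent transactions pattern concrete, here is a rough sketch using the `@azure/data-tables` package. The table names and connection string are placeholders; the key point is that, with no real transactions across tables, a crash between the copy and the delete can leave a duplicate behind, which is exactly why the pattern is only eventually consistent:

```typescript
// Sketch: copy entities older than 90 days to an archive table, then delete
// them from the source table (npm i @azure/data-tables).
import { TableClient, odata } from "@azure/data-tables";

const CONNECTION = "<storage-connection-string>"; // placeholder
const source = TableClient.fromConnectionString(CONNECTION, "Events");
const archive = TableClient.fromConnectionString(CONNECTION, "EventsArchive");

async function archiveOldEntities(): Promise<void> {
  // Ensure the archive table exists; ignore the error if it already does.
  await archive.createTable().catch(() => { /* already exists */ });

  const cutoff = new Date(Date.now() - 90 * 24 * 60 * 60 * 1000);
  const oldEntities = source.listEntities<Record<string, unknown>>({
    queryOptions: { filter: odata`Timestamp lt ${cutoff}` },
  });

  for await (const entity of oldEntities) {
    // Drop service metadata before re-inserting into the archive table.
    const { etag, timestamp, ...rest } = entity as Record<string, unknown>;
    await archive.createEntity(rest as { partitionKey: string; rowKey: string });
    // Delete only after the copy succeeds; a crash here leaves a harmless
    // duplicate rather than lost data.
    await source.deleteEntity(
      entity.partitionKey as string,
      entity.rowKey as string,
    );
  }
}

archiveOldEntities().catch(console.error);
```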