I am using Power BI to analyze cost data from Azure. I am making a direct connection and pulling in data via: opening Power BI | Get Data | Online Services | Microsoft Azure Consumption Insights (Beta). This works; however, I can only see two months of data, and ideally I'd like to see six. After a lot of searching, the general consensus from other users seems to be to use the Advanced Editor to tweak the query by adding "optionalParameters" and specifying the number of months. I came across a few other sites where users were experiencing the same issue, but the suggestions didn't work. I am hoping someone here can point me in the right direction.
I'm going to post the query string and below that list out the URLs containing suggestions I've already tried.
let
    enrollmentNumber = "xxxxxxx",
    optionalParameters = [ numberOfMonth = 6, dataType = "DetailCharges" ],
    Source = MicrosoftAzureConsumptionInsights.Tables(enrollmentNumber, optionalParameters),
    usagedetails = Source{[Key="usagedetails"]}[Data],
    #"Parsed JSON" = Table.TransformColumns(usagedetails, {{"Tags", Json.Document}}),
    #"Expanded Tags" = Table.ExpandRecordColumn(#"Parsed JSON", "Tags", {"environment", "application", "costCenter", "owner"}, {"Tags.environment", "Tags.application", "Tags.costCenter", "Tags.owner"})
in
    #"Expanded Tags"
https://community.powerbi.com/t5/Desktop/Azure-consumption-insights-get-more-than-two-month-usage-details/td-p/541413
https://community.powerbi.com/t5/Desktop/Power-BI-desktop-and-getting-multiple-months-in-one-row-from-the/td-p/50585
https://community.powerbi.com/t5/Desktop/Extend-the-Azure-consupmtion-data/td-p/444508
https://learn.microsoft.com/en-us/power-bi/desktop-connect-azure-consumption-insights
I opened a support case and now have a solution to this problem.
First, Power BI had to be upgraded to the latest version; the version I upgraded to is 2.75.5649.961 64-bit (November 2019).
Second, Microsoft Azure Consumption Insights is being phased out and replaced by Azure Cost Management, so pulling in more months of data won't work through the Consumption Insights connector.
I was able to increase the number of months by:
Get Data | Azure | Azure Cost Management
Fill in the required information, specifying the number of months.
(Screenshot: the Get Data dialog.)
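For reference, a sketch of what the generated query can look like (the exact AzureCostManagement.Tables arguments here are an assumption; the scope type, enrollment number and month count will differ for your tenant):

let
    // Assumed argument order: scope type, scope value, number of months to pull, optional parameters
    Source = AzureCostManagement.Tables("Enrollment Number", "xxxxxxx", 6, [])
in
    Source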
Taking the Terraform resource example:
resource "azurerm_billing_account_cost_management_export" "example" {
name = "example"
billing_account_id = "example"
recurrence_type = "Daily"
recurrence_period_start_date = "2020-08-18T00:00:00Z"
recurrence_period_end_date = "2020-09-18T00:00:00Z"
export_data_storage_location {
container_id = azurerm_storage_container.example.resource_manager_id
root_folder_path = "/root/updated"
}
export_data_options {
type = "Usage"
time_frame = "Custom"
}
}
The documentation isn't very clear about what time_frame = "Custom" does or what else needs to be added here. I would like to create an export that runs daily but only contains that day's (or perhaps the previous day's) data; month-to-date is the closest option, but I don't want all the other days' data in the export. Does setting time_frame to "Custom" allow me to do this? Will I have to set a start date and end date, and if so, could I run an update request daily from a script to shift those dates as an alternative?
I tried creating a month-to-date export, but the file becomes too large and picks up unwanted data as the end of the month approaches.
Does setting time_frame to "Custom" allow me to do this?
Yes, we can do this via the JSON (REST) API.
The Terraform provider itself does not support this option.
I have replicated this with Terraform, with no luck. Using Terraform we only have the options below; time_frame only allows the following values:
Possible values include: WeekToDate, MonthToDate, BillingMonthToDate, TheLastWeek, TheLastMonth, TheLastBillingMonth, Custom.
It seems the parameters needed for a custom range are not exposed by the provider, so in practice we can only use something like:
time_frame = "TheLastMonth"
I would suggest using the Azure Cost Management connector in Power BI Desktop, publishing the dataset to the Power BI service, and setting a daily refresh.
You can then query the datamart like a database each day.
I have an Azure Log Analytics workspace and inside it I created a custom table to ingest some of my logs.
I used these two guides for it (mainly the first one):
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-api
In my logs I have a field:
"Time": "2023-02-07 11:15:23.926060"
Using DCR, I create a field TimeGenerated like this:
source
| extend TimeGenerated = todatetime(Time)
| project-away Time
Everything works fine, I manage to ingest my data and query it with KQL.
The problem is that I can't ingest data with an older timestamp. If the timestamp is the current time or close to it, it works fine.
If my timestamp is, say, from two days ago, it gets overwritten with the current time.
Example of the log I send:
{
    "Time": "2023-02-05 11:15:23.926060",
    "Source": "VM03",
    "Status": 1
}
The log I receive:
{
    "TimeGenerated": "2023-02-07 19:35:23.926060",
    "Source": "VM03",
    "Status": 1
}
Can you tell me why this is happening, why I can't ingest logs from several days ago, and how to fix it? The guides I used don't mention anything of the sort, regrettably.
I've hit this limit once before, a long time ago. I asked a question and got a response from someone working on Application Insights: only data not older than 48 hours is ingested.
Nowadays, AFAIK, the same applies to Log Analytics. I am not sure the exact 48-hour limit still stands, but I think it is fair to assume some limit is still enforced and there is no way around it.
At the time I took my loss and worked with recent data only.
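If you want to see this in your own data, a quick check is to compare TimeGenerated with the built-in ingestion_time() (the table name below is a placeholder for your custom table):

MyCustomTable_CL
| extend IngestedAt = ingestion_time()
| project TimeGenerated, IngestedAt, Source, Status
| order by IngestedAt desc

Rows sent with a Time older than the ingestion window will show TimeGenerated pinned close to IngestedAt rather than the original value.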
I have a CosmosDB Data Explorer workbook with a set of functional predefined queries. The workbook offers real benefits by improving efficiency when our organization needs to run useful queries while debugging our data/applications. The one drawback is that it's prone to failure via user error.
As of now, users need to edit the cells, replacing query values with those that match their needs before running them. This is not optimal: an unwitting user could unintentionally break a query by saving an erroneous version, erasing the record of the working query.
One of the queries looks as follows:
%%sql --database DatabaseName --container ContainerName
SELECT c.propertyOne, c.propertyTwo, SUM(c.propertyForSummation)
FROM c
WHERE c.propertyOne = "XYZ123"
AND c.propertyTwo = 1
AND c.timestamp >= '2021-07-15'
AND c.lastUpdatedTimestamp <= '2021-07-16'
GROUP BY c.propertyOne, c.propertyTwo
I'd like to introduce parameter fields {propertyOne, propertyTwo, timestamp, lastUpdatedTimestamp} so that the query looks like the one below, letting future users of the workbook avoid the scenario described above. Is that possible?
%%sql --database DatabaseName --container ContainerName
SELECT c.propertyOne, c.propertyTwo, SUM(c.propertyForSummation)
FROM c
WHERE c.propertyOne = #propertyOne
AND c.propertyTwo = #propertyTwo
AND c.timestamp >= #timestamp
AND c.lastUpdatedTimestamp <= #lastUpdatedTimestamp
GROUP BY c.propertyOne, c.propertyTwo
I'm aware of how to do this in Azure Data Studio (ADS) workbooks, but unfortunately I cannot see the option in Data Explorer, and I'm not aware of any way to connect to Cosmos DB from ADS.
I also posted the question on learn.microsoft.com, and it has been confirmed that the feature is not supported at this time (Aug. 10th, 2021).
I am new to Clockify and to API usage (so I apologize if I don't express myself properly). My objective is to automatically load the detailed report with all time entries into Excel (via Power Query). I have tried the code below and it works, except that I only get the latest 50 time entries.
This is because, as I understand it, I should use this base endpoint: "https://api.clockify.me/api/v1" (the one I use is deprecated). Is there a way to make the code below work against the correct endpoint?
let
    Source = Web.Contents(
        "https://api.clockify.me/api/reports/{your report ID}",
        [Headers = [#"X-Api-Key" = "{your API Key}"]]
    ),
    jsonResponse = Json.Document(Source),
    workspace = jsonResponse[workspace]
in
    workspace
Thanks in advance,
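For reference, here is a sketch of what I think the equivalent call against the v1 endpoint should look like (the time-entries path, the workspace and user ID placeholders, and the page-size parameter are my assumptions from the docs, not something I have verified in Power Query):

let
    // Placeholders in braces need to be replaced with real IDs and the API key
    Source = Web.Contents(
        "https://api.clockify.me/api/v1/workspaces/{your workspace ID}/user/{your user ID}/time-entries?page=1&page-size=200",
        [Headers = [#"X-Api-Key" = "{your API Key}"]]
    ),
    jsonResponse = Json.Document(Source)
in
    jsonResponse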
We have a live tracking and delivery application. Every minute we store almost 50K spatial records and update 100K spatial records. Right now it works for a few delivery boys, but once we increase the delivery boy count to 15+, our SQL database server's DTU usage maxes out (50 DTUs used; the database is in an elastic pool), and we get performance issues. As I am new to Azure, I clicked on the support request for the Azure database from the menus, and it mentions the following issue with the database:
"Azure SQL DB database is an instance with high locking waits"
Any help or suggestions, please?
I have tried code optimization in .NET Core, and spatial indexing is also done. I also found from Application Insights that my spatial query for fetching the nearest delivery point to a tracked delivery boy is taking a long time, but it works fine when 15 or fewer users are running simultaneously. Please see the LINQ code below, which uses Entity Framework Core and NetTopologySuite for the spatial data.
// Pair each undelivered address point on an active route job with each
// breadcrumb of interest, keeping only pairs within 200 units (metres for geography data).
var deliveriesQuery =
    from a in db.AddressPointDelivery
        .Where(w => w.IsDelivered == false
                    && activeRouteJobIDList.Contains(w.RouteJobId.Value))
    from b in db.BreadCrumb
        .Where(w => breadcrumbIDList.Contains(w.Id))
    // Distance() already returns double, so the Convert.ToDouble wrapper is unnecessary,
    // and Contains translates to SQL more readily than Any(x => x == ...).
    where a.Point.Distance(b.Point) < 200
    select new
    {
        breadcrumbId = b.Id,
        AddressPointId = a.Id,
        routeJobId = a.RouteJobId,
        isUndelivered = !a.IsDelivered,
        CreatedOn = b.CapturedAt
    };
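For context, the spatial indexing mentioned above would look roughly like this in T-SQL (index, table and column names here are assumed placeholders, since the actual schema isn't shown):

-- Assumes the Point columns are geography; geometry columns would instead need
-- GEOMETRY_AUTO_GRID together with a BOUNDING_BOX.
CREATE SPATIAL INDEX IX_AddressPointDelivery_Point
    ON dbo.AddressPointDelivery ([Point])
    USING GEOGRAPHY_AUTO_GRID;

CREATE SPATIAL INDEX IX_BreadCrumb_Point
    ON dbo.BreadCrumb ([Point])
    USING GEOGRAPHY_AUTO_GRID;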
(Attached image: the error message shown when clicking the support request.)