I am trying to save Kibana dashboards. I have tried the regular save as well as the save-to-file option. In either case, I am unable to get the same dashboard to open up; the error I see at the top is as follows:
Error Alert
No time filter
Timestamped indices are configured without a failover. Waiting for time filter
Before saving the dashboard, I can see the logs correctly in Kibana. Any thoughts on troubleshooting or fixing this would be greatly appreciated.
That error happens when you try to refresh the dashboard without a time filter set. Select one, for example Last 15m, and the error should no longer appear.
I am trying to read my Shopify products and export them to an Excel file. I was able to create a custom connector and get the products, but I have an issue exporting them to Excel. I need the list of SKUs; however, the flow runs for a few minutes and then throws an error that the job has timed out. Can you please help with the Power Automate step for getting data from the GetProduct custom app and writing it into Excel? Extracting the titles worked, but when I go one level lower to get the SKU, the job times out.
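For context on why the SKU is harder to reach than the title: in Shopify's REST payload, title is a top-level product field, while sku sits one level down on each entry of the product's variants array, so every product yields a list of SKUs rather than a single value. A minimal C# sketch of that extraction, using a trimmed, hypothetical products response:

// Illustrative only: shows the nesting involved. "title" is top-level,
// but "sku" sits on each entry of the product's "variants" array.
using System;
using System.Linq;
using System.Text.Json;

class SkuShapeDemo
{
    static void Main()
    {
        // Trimmed, hypothetical example of a Shopify products response.
        string json = @"{""products"":[
            {""title"":""Shirt"",""variants"":[{""sku"":""SHIRT-S""},{""sku"":""SHIRT-M""}]},
            {""title"":""Mug"",""variants"":[{""sku"":""MUG-1""}]}]}";

        using JsonDocument doc = JsonDocument.Parse(json);
        var skus = doc.RootElement.GetProperty("products")
            .EnumerateArray()
            .SelectMany(p => p.GetProperty("variants").EnumerateArray())
            .Select(v => v.GetProperty("sku").GetString());

        foreach (var sku in skus)
            Console.WriteLine(sku); // SHIRT-S, SHIRT-M, MUG-1
    }
}

In Power Automate terms, that nesting usually means an Apply to each over variants inside the loop over products (or a Select that flattens them first), which multiplies the number of Excel row writes per product.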
When you apply a filter in the console and find your log among many others, you have what you need. But when the filter is removed, the scroll jumps to the top or bottom, so you cannot see where the log sits relative to all the other logs in the app.
I would like the console to keep the scroll position on a selected log when the filter is removed, so it stays in view among the rest of the application's logs: either because there is a console.group I'd like to expand and inspect, or because I want to see the position of my log of interest relative to the other logs.
Is this possible?
I tried to use the PowerShell sample https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api#powershell-sample without any changes.
It completes with status 200 (OK) and correctly creates a new table with LogType (MyRecordType) within the Custom Logs in the portal (Log Analytics Workspace->Logs).
However, the events that are submitted don't show up there; the query always returns "No results from the last 24 hours". Also, within the new table, none of the custom properties are created.
Has anybody observed a similar problem? (Some people seem to be using the C# code successfully.) Thanks!
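For comparison, here is a minimal, self-contained sketch of the C# route, mirroring the documented sample; the workspace ID, shared key, and record type below are placeholders:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

class DataCollectorDemo
{
    // Placeholders: substitute your workspace ID and primary key.
    const string WorkspaceId = "your-workspace-id";
    const string SharedKey = "your-base64-primary-key";
    const string LogType = "MyRecordType";

    static async Task Main()
    {
        string json = "[{\"DemoField1\":\"DemoValue1\",\"DemoField2\":\"DemoValue2\"}]";
        string date = DateTime.UtcNow.ToString("r"); // RFC1123 date for x-ms-date

        // String to sign: method, content length, content type, x-ms-date, resource.
        string stringToSign =
            $"POST\n{Encoding.UTF8.GetByteCount(json)}\napplication/json\nx-ms-date:{date}\n/api/logs";
        string signature;
        using (var hmac = new HMACSHA256(Convert.FromBase64String(SharedKey)))
        {
            signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
        }

        using var client = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Post,
            $"https://{WorkspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01");
        request.Headers.TryAddWithoutValidation(
            "Authorization", $"SharedKey {WorkspaceId}:{signature}");
        request.Headers.Add("Log-Type", LogType);
        request.Headers.Add("x-ms-date", date);
        request.Content = new StringContent(json, Encoding.UTF8);
        request.Content.Headers.ContentType = new MediaTypeHeaderValue("application/json");

        HttpResponseMessage response = await client.SendAsync(request);
        // 200 OK means the payload was accepted; ingestion into the workspace
        // can still lag by several minutes (or, as it turned out here, longer).
        Console.WriteLine((int)response.StatusCode);
    }
}

Note that the signature covers the exact content length and content type, so a mismatch there produces a 403 rather than the missing-data symptom described above.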
Crazy... on the next day, all the events had turned up in the Log Analytics workspace. Even new events that I generated turned up immediately.
It seems this was a glitch just on that day. Well, the API is in "preview"...
Where: Build -> Actions -> Upload content via Google Sheets
Problem: I have the trivia template selected. I've already created a sheet, uploaded it, and deployed my action. Then I discovered that one of my questions had a wrong answer, so I updated the sheet and went back to the Actions tab to upload content via Google Sheets.
Then I clicked on UPDATE TEST and tested my app in the simulator. The changes were reflected.
Then I clicked on UPDATE PROD and got the following message:
On further investigation, I found that the GET request is returning a 403 error, and in the Network tab this is the response I'm getting:
I'm not sure what is going on.
I have encountered the same situation.
However, there is a workaround, and it may in fact be the intended procedure when updating your Q&A sheet: use "Submit for production" in the Deploy > Release sidebar. See [attached image].
I also had the same problem. I think you changed the Google Sheet's name or the questions-per-page value, which causes the error. If you keep the same name and questions per page, the update goes through.
I am having the same issue with Power BI: the automatic refresh is failing, and I currently have to click Refresh to see new data coming in. I have configured the tumbling window, e.g. tumblingwindow(second,3), and pinned the live visual to the dashboard. Are there any other settings or factors I have to set for the automatic refresh to work? (It's a console app that selects data from a database and sends each row to Event Hubs; from Event Hubs the data goes to Stream Analytics, and the output is Power BI.) I am assuming there are time restrictions depending on throughput, but how do I actually calculate the tumbling window size I should set? I tried the equation entitycount*60*60/throughput = seconds, still with no success.
Below is the code, but the events still take time to reach Power BI even with tumblingwindow(second,3). I could stop my application and delete the dataset from Power BI, but then the dataset reappears.
// Await the send (or block on the returned Task); an un-awaited SendAsync
// is fire-and-forget, so any failure is silently dropped.
EventData data = new EventData(Encoding.UTF8.GetBytes(serializedobjects));
await eventHubClient.SendAsync(data);
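For reference, a self-contained version of that send with the await and basic error handling in place; this assumes the Microsoft.Azure.EventHubs client, and the connection string is a placeholder:

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

class Sender
{
    static async Task Main()
    {
        // Placeholder: the connection string must include EntityPath=<your hub>.
        var eventHubClient = EventHubClient.CreateFromConnectionString(
            "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>;EntityPath=<hub>");

        var data = new EventData(Encoding.UTF8.GetBytes("{\"value\":1}"));
        try
        {
            // Awaiting surfaces throttling and auth errors that a
            // fire-and-forget SendAsync silently swallows.
            await eventHubClient.SendAsync(data);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Send failed: {ex.Message}");
        }
        await eventHubClient.CloseAsync();
    }
}

A tumbling window only adds up to its own length (3 seconds here) before a result is emitted, so multi-minute delays usually point at the send or the job rather than the window size.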
The overall workflow you describe seems fine, and the tumbling window size should not cause the data to never show up. You will have to debug this issue with the following steps:
Go to the Azure portal, open the Inputs page for the Stream Analytics job, and get a "sample". Does it return any samples?
Go to the Monitoring page in the Azure portal and check whether input events and output events are greater than zero. If you see input events but no output events, you are likely hitting an error writing to the output.
Also check whether the operation logs contain any errors for this job.
The steps above should tell you if something is wrong with the event format or the output.
Are you pinning the individual live tile to the dashboard or the entire report? Pinning the entire report does not appear to work.
If you pin the single tile containing the data you want, does that refresh in real time?