No data found in Timeseries table and line graph on ThingsBoard Dashboard - display

Hi everyone,
I already made a timeseries table and a line graph on the dashboard.
On 17 August 2022 it worked properly, but now the data is suddenly no longer displayed in the line graph and timeseries table, and I don't know why. Is this a bug, or is there something I can do to fix this problem?
Please give me some advice, thanks.
*Update
The device is still online and keeps sending the latest telemetry data:
telemetry data screenshot
However, the time series table and line graph still display nothing:
time series screenshot
line graph screenshot

Without further information it's hard to tell what the issue is in your case. There could be several causes, such as device inactivity.
Please check the log file thingsboard.log for any issues while sending the data to ThingsBoard.
Alternatively, take a look at the API Usage dashboard to verify that new data is being stored in ThingsBoard.
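If it helps, here is a minimal sketch (assuming a standard ThingsBoard installation and its stock REST API) of how you could check from a script whether new samples are actually being stored for the device. The base URL, credentials, device id and telemetry key are placeholders you would need to replace.

import time
import requests

TB_URL = "http://localhost:8080"      # placeholder: your ThingsBoard base URL
DEVICE_ID = "<device-uuid>"           # placeholder: copy the id from the device details page
KEY = "temperature"                   # placeholder: one of the keys shown under "Latest telemetry"

# 1. Log in and get a JWT token (standard ThingsBoard auth endpoint).
login = requests.post(f"{TB_URL}/api/auth/login",
                      json={"username": "tenant@thingsboard.org", "password": "tenant"})
login.raise_for_status()
headers = {"X-Authorization": f"Bearer {login.json()['token']}"}

# 2. Query the time series stored during the last hour for that key.
end_ts = int(time.time() * 1000)
start_ts = end_ts - 60 * 60 * 1000
resp = requests.get(
    f"{TB_URL}/api/plugins/telemetry/DEVICE/{DEVICE_ID}/values/timeseries",
    params={"keys": KEY, "startTs": start_ts, "endTs": end_ts, "limit": 1000},
    headers=headers)
resp.raise_for_status()

samples = resp.json().get(KEY, [])
print(f"{len(samples)} stored samples for '{KEY}' in the last hour")
for s in samples[:5]:
    print(s["ts"], s["value"])

If this returns recent samples but the widgets stay empty, the problem is more likely in the dashboard's time window or the widget's data keys than in data ingestion.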

Related

How to make Copy Data work faster and have better performance (Azure Synapse)

A bit of context: my Azure Synapse pipeline makes a GET request to a REST API in order to import data into the Data Lake (ADLS Gen2) in Parquet file format.
I plan to request data from the API on an hourly basis in order to get the information of the previous hour. I have also considered setting the trigger to run every half hour to get the data of the previous 30 minutes.
The thing is: this last GET request and Copy Data debug run took a bit less than 20 minutes. The DIU setting was on "Auto", and it stays at 4 even if I set it manually to 8 in the activity settings.
I was wondering if there are any useful suggestions to make a Copy Data activity work faster, whatever the cost may be (I would also really like info about the cost, if you consider it pertinent).
Thanks in advance!
Mateo
You need to check which part is running slowly.
You can click on the glasses icon to see the Copy Data details.
If the latency is in "Time to first byte" or "Reading from source", the issue is on the REST API side.
If the latency is in "Writing to sink", the problem is likely in writing to the data lake.
If the issue is on the API side, try to contact the provider. Another option, if applicable, is to use a few Copy Data activities, each copying a part of the data, as in the sketch below.
If the issue is on the data lake side, you should check the settings on the sink side.
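As a rough, non-Synapse-specific illustration of splitting the load: if the REST API accepts time-range filters, you can divide the previous hour into smaller windows and hand each window to its own Copy activity (for example via a ForEach over a list like the one below, with a parameterized source URL). The window count and the windowStart/windowEnd parameter names are just assumptions for the sketch.

from datetime import datetime, timedelta, timezone

def hourly_windows(chunks=4):
    # End of the window = the top of the current hour; start = one hour earlier.
    end = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
    start = end - timedelta(hours=1)
    step = (end - start) / chunks
    return [{"windowStart": (start + i * step).isoformat(),
             "windowEnd": (start + (i + 1) * step).isoformat()}
            for i in range(chunks)]

# Each dict becomes one item of a ForEach loop, so four Copy activities
# can pull 15-minute slices from the API in parallel.
for w in hourly_windows():
    print(w["windowStart"], "->", w["windowEnd"])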

How to increase the number of metrics in Google Data Studio?

I have hit the limit of 20 metrics in GDS. I am using the report to do some aggregation and calculations on the data and then download it to an Excel file. To add more metrics I am using optional metrics, but every time I refresh the report I have to manually tick all the boxes before I am able to download the file. How can I deal with this?
AFAIK, Google Data Studio doesn't have a limit of 20 metrics.
Some visual components may have this limit. If that's the case, there isn't anything that can be done about the limit itself.
You can try to add multiple visuals and position them close to each other, so users will think they're the same component, like in the picture below:
Notice there are two tables (the first one is selected), but they were positioned in a way that makes users think there is only one.

How to get a daily data report using Python 3

I will try to explain my question as clearly as I can. It would be very nice if you could help me out.
(1) I am working at a tech support company on storage boxes like VNX and XtremIO. I want to fetch a report of the data usage of each storage pool available in the storage box.
(2) Since the data keeps varying daily, I want the daily report to be generated at a particular time every day, and I want to receive the report by mail (Outlook, Gmail, etc.) using Python.
Thank You!!
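There is no generic "storage pool report" API, so the array-specific part in the sketch below is only a placeholder (on VNX/XtremIO it would wrap the array's own CLI or REST interface). The sketch only shows the rest of the pattern you describe: build a plain-text report and e-mail it with Python's standard smtplib. All hosts, addresses and credentials are placeholders.

import smtplib
from email.message import EmailMessage

def fetch_pool_usage():
    # Placeholder: replace with calls to your array's CLI or REST API.
    return [{"pool": "Pool_0", "used_gb": 1420, "free_gb": 580},
            {"pool": "Pool_1", "used_gb": 910, "free_gb": 1090}]

def build_report(rows):
    lines = ["Daily storage pool usage report", ""]
    for r in rows:
        lines.append(f"{r['pool']}: used {r['used_gb']} GB, free {r['free_gb']} GB")
    return "\n".join(lines)

msg = EmailMessage()
msg["Subject"] = "Daily storage usage report"
msg["From"] = "reports@example.com"
msg["To"] = "you@example.com"
msg.set_content(build_report(fetch_pool_usage()))

# SMTP host, port and credentials are placeholders for your mail server.
with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()
    smtp.login("reports@example.com", "app-password")
    smtp.send_message(msg)

Running it at a fixed time every day can be left to the operating system's scheduler, e.g. a cron entry such as 0 7 * * * python3 /path/to/report.py for a daily 07:00 run, or Windows Task Scheduler.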

CSV Playback with Node Red

Disclaimer - I am not a software guy, so please bear with me while I learn.
I am looking to use Node-RED as a parser/translator by taking data from a CSV file and sending out the rows of data at 1 Hz. Let's say 5-10 rows of data being read and published per second.
Eventually, I will publish that data to some Modbus registers, but I'm not there yet.
I have scoured the web and tried several examples; however, as soon as I trigger the flow, Node-RED stops responding and I have to delete the source CSV (so it can't run any more) and restart Node-RED to get it back up and running.
I have many of the Big Nodes from this guy installed and have tried a variety of different methods, but I just can't seem to get it.
If I can get a single column of data from a CSV file being sent out one row at a time, I think that would keep me busy for a bit.
There is a file node that will read a file one line at a time; you can then feed this through the CSV node to parse the fields of each line into an object you can work with.
The delay node has a rate-limiting mode that can be used to limit the flow to 1 message per second, which gives you the rate you want.
All the nodes I've mentioned are in the core set that ships with Node-RED.
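Just to make the pattern concrete outside Node-RED, here is the same read-one-row, parse, emit-at-a-fixed-rate logic sketched in plain Python (the file name is a placeholder; in the actual flow the delay node does the rate limiting and the message moves on to the Modbus nodes instead of being printed):

import csv
import time

# Placeholder file: any CSV with a header row will do.
with open("data.csv", newline="") as f:
    reader = csv.DictReader(f)   # like the csv node: each line becomes a keyed object
    for row in reader:
        print(row)               # in the flow, this is where the message is passed on
        time.sleep(1)            # like the delay node in rate-limit mode: 1 message per second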

Azure Stream Analytics to Power BI - event ordering - drop other events

I'm trying to stream data from my device to Azure IoT Hub, then to Stream Analytics, then to Power BI.
Power BI implemented a new way to display streaming data. I would like to generate a line chart via the "Add tile" button on a Power BI dashboard, which takes care of the auto-refresh of my streaming data chart.
My current streaming data (which works perfectly when displayed statically in Power BI via "Create report") produces a rather weird line chart in streaming data mode:
image.
My guess is that the new data does not arrive in Power BI in chronological order. New data may be placed at the correct temporal position in the line chart, but the line connecting the values is drawn in the order of arrival. This might cause the line to "jump back" in time?!
To minimize wrong ordering, I am trying to prevent "adjusting other events" as well as accepting out-of-order events in Stream Analytics: configuration
The problem: with this configuration the Stream Analytics job creates no output.
My ASA Query looks like this:
SELECT
    Name,
    Value,
    Timecreated,
    CAST(latest AS float) AS latest,
    COUNT(*)
INTO
    [ToPowerBI]
FROM
    [Eing-CANdata] TIMESTAMP BY Timecreated
GROUP BY
    Name, Value, Timecreated, latest,
    TumblingWindow(Duration(second, 1))
The "Timecreated" is formatted this way:
2017-03-06T11:51:22.246235Z
its accepted by Azure as timestamp.
Changing the configuration to accepting "out of order events with a timestamp in" the range of 10 seconds doesn't produce any output either.
The only way to create output is changing the configuration to "adjusting other events." But the Azure information tells me that "Adjust keeps the events and changes their timestamps". This would reorder the data which is not what I want.
My goals:
get data through Stream Analytics as fast as possible
avoid adjusting the timestamps, as I need the original ones!!
ultimately get a proper (& "real-time-like") streaming data line chart in Power BI
My question(s): Why is Stream Analytics not outputting any data in "Drop other events" mode? How can I get output from Stream Analytics in this mode?
(I have an important presentation coming up and your help would be greatly appreciated!)
