Get frame-based timecode from receiver (Chromecast) - google-cast

Is it at all possible to let the sender app know the exact timecode (seconds AND frames!) of the video being played by the receiver? I could not find a native method for doing so, but maybe someone knows something or has managed to create a proprietary function for this?
Thanks!
Cheers :)

The Cast SDK does send regular media status updates to the sender, typically once per second. Frame-level or timecode accuracy isn't supported by the SDK.
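Given that limitation, the closest a sender can get is deriving an approximate frame position from the playback position (in seconds) that the status updates do report, combined with a frame rate the sender already knows from its own metadata. A minimal sketch of that conversion, assuming an integer frame rate (drop-frame rates such as 29.97 would need proper SMPTE drop-frame handling):

```python
def approx_timecode(position_seconds, fps):
    """Approximate (hh, mm, ss, ff) from a playback position in seconds.

    fps must be an integer frame rate; the result is only as accurate as
    the reported position, which the Cast SDK updates about once per second.
    """
    total_frames = int(position_seconds * fps)
    ff = total_frames % fps
    total_seconds = total_frames // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return hh, mm, ss, ff
```

Between status updates the sender would have to extrapolate from the last reported position and the playback rate, so this is an estimate, not a true frame lock.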

Related

Using Node.js to stream a GIF countdown timer to the client

I need a countdown timer in my email. The only way I know this is possible is to use an img element whose src attribute points to a server that returns a stream of, say, GIF-encoded data. I went ahead and built it out, and most of it works. However, when the timer updates, I am not seeing the changes being consumed in my stream. Please help!
My code is here: https://github.com/ElijahCannon/email-countdown.git
I think I'm not understanding something in the timer.js update() method: do I use canvasStream.on with events, or do I use pipe? I wasn't able to get as far just using pipe(), but maybe that's because I'm using the wrong stream types? My temporary stream (created on every update) is a readable stream (https://github.com/Automattic/node-canvas#canvascreatepngstream), and my main stream piped to res is a duplex stream.
Any insight you can provide would be very helpful. Streams are a very confusing topic for me.

What is DAGScheduler.messageProcessingTime?

Can someone point me to where I can find descriptions of the metrics I get from the Spark sink?
This is a timer that tracks the time taken to process messages in the DAGScheduler's event loop.
Note that this seems to include all kinds of events (JobSubmitted, JobCancelled, etc.)
It was introduced by SPARK-8344 to help troubleshoot issues and delays in the DAGScheduler's event loop - at least as far as I can understand.
I was hoping to be able to use it for payload processing time, but that does not seem to be the correct metric for that.
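For actually seeing this metric, Spark's metrics system is configured through a metrics.properties file. A fragment like the following (sink names and the output directory are illustrative; the class names follow Spark's metrics-system documentation) would dump all registered metrics, including the DAGScheduler timers, to CSV files for inspection:

```
# Example metrics.properties fragment - export all metrics via the CSV sink
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics
```

Point Spark at it with spark.metrics.conf, then look for the DAGScheduler.messageProcessingTime entries in the output.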

How to get events count from Microsoft Azure EventHub?

I want to get events count from Microsoft Azure EventHub.
I can use EventHubReceiver.Receive(maxcount), but it is slow for a large number of big events.
There is the NamespaceManager.GetEventHubPartition(..).EndSequenceNumber property that seems to do the trick, but I am not sure if it is the correct approach.
EventHub doesn't have a notion of message count, since EventHub is a high-throughput, low-latency, durable stream of events in the cloud - the correct current count at a given point in time could be wrong the very next millisecond! That's why it isn't provided :)
Hmm, we should have named EventHubs something like StreamHub - that would have made this obvious!
If what you are looking for is how far the receiver is lagging behind, then EventHubClient.GetPartitionRuntimeInformation().LastEnqueuedSequenceNumber is your best bet.
As long as no messages are sent to the partition, this value remains constant :)
On the receiver side, when a message is received, receivedEventData.SequenceNumber indicates the sequence number you are currently processing. The difference between EventHubClient.GetPartitionRuntimeInformation().LastEnqueuedSequenceNumber and EventData.SequenceNumber tells you how far the receiver of a partition is lagging behind - based on which the receiver process can scale the number of workers up or down (work-distribution logic).
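The lag computation and the scale-out heuristic described above are simple arithmetic on the two sequence numbers. A sketch in plain Python (the function names and the events-per-worker threshold are illustrative, not SDK API):

```python
def receiver_lag(last_enqueued_seq, current_seq):
    """Events still pending in the partition beyond the one being processed."""
    return last_enqueued_seq - current_seq

def workers_needed(lag, events_per_worker=1000):
    """Naive scale heuristic: one worker per batch of pending events, min 1."""
    return max(1, -(-lag // events_per_worker))  # ceiling division
```

In a real consumer you would refresh LastEnqueuedSequenceNumber periodically rather than per event, since GetPartitionRuntimeInformation() is a service call.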
more on Event Hubs...
You can use Stream Analytics, with a simple query:
SELECT
COUNT(*)
FROM
YourEventHub
GROUP BY
TUMBLINGWINDOW(DURATION(hh, <Number of hours in which the events happened>))
Of course you will need to specify a time window, but you can potentially run it from when you started collecting data until now.
You will be able to output to SQL/Blob/Service Bus et cetera.
Then you can read the message from the output in code and process it. It is quite complicated for a one-off count, but if you need it frequently and have to write some code around it anyway, it could be the solution for you.
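For intuition, what the tumbling-window query above computes can be expressed in a few lines of plain Python - bucket each event timestamp into a fixed-size, non-overlapping window and count per bucket (timestamps in seconds are assumed for illustration):

```python
from collections import Counter

def tumbling_counts(event_times, window_seconds):
    """Count events per tumbling window of window_seconds.

    Each event belongs to exactly one window, keyed by the window's
    start time - the defining property of TUMBLINGWINDOW.
    """
    counts = Counter()
    for t in event_times:
        window_start = (int(t) // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)
```

Stream Analytics does this continuously over the live stream, which is why it scales where a one-shot Receive(maxcount) loop does not.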

How to handle J2ME call events/interruptions

I have to implement code that places a call, then gets the call duration, and is also notified when the call ends.
I know it's not possible in J2ME with MIDP 2.0, but if anyone has solved this issue anyway, please post your solutions here. I need it.
I have already implemented calling with platformRequest(), but I am unable to track the call duration, detect when the call was not answered, or receive a call-ended event.
Please reply with solutions if any of these things is possible.
You cannot access this type of native functionality from the VM.
You can try to make a rough estimate, but it won't cover all circumstances - e.g. when your application gets closed or the phone's date/time is changed - though it's the best I could think of.
You could persist the amount of time taken using a RecordStore. The basic idea is to persist the time before you call platformRequest(), record the time within the different MIDlet life-cycle methods, and make the best guess you can.
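The estimation idea boils down to a wall-clock difference around the platformRequest() call. A sketch in plain Python for clarity (a real MIDlet would read the clock with System.currentTimeMillis() and persist the values in a RecordStore; the function name is illustrative):

```python
def call_duration_estimate(millis_before_call, millis_on_resume):
    """Best-guess call duration in milliseconds.

    Returns None when the clock appears to have moved backwards,
    e.g. the phone's date/time was changed while the call was active -
    one of the failure modes this approach cannot cover.
    """
    elapsed = millis_on_resume - millis_before_call
    return elapsed if elapsed >= 0 else None
```

Note the estimate includes ringing time and any pause before the user returns to the MIDlet, so it is an upper bound on the actual talk time.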

Realtime MIDI input and synchronisation with audio

I have built a standalone app version of a project that until now was just a VST/audiounit. I am providing audio support via rtaudio.
I would like to add MIDI support using rtmidi but it's not clear to me how to synchronise the audio and MIDI parts.
In VST/audiounit land, I am used to MIDI events that have a timestamp indicating their offset in samples from the start of the audio block.
rtmidi provides a delta time in seconds since the previous event, but I am not sure how I should grab those events and how I can work out their time in relation to the current sample in the audio thread.
How do plugin hosts do this?
I can understand how events can be sample accurate on playback, but it's not clear how they could be sample accurate when using realtime input.
rtaudio gives me a callback function. I will run at a low block size (32 samples). I guess I will pass a pointer to an rtmidi instance as the userdata part of the callback and then call midiin->getMessage( &message ); inside the audio callback, but I am not sure if this is sensible with respect to threading.
Many thanks for any tips you can give me
In your case, you don't need to worry about it. Your program should send the MIDI events to the plugin with a timestamp of zero as soon as they arrive. I think you have perhaps misunderstood the idea behind what it means to be "sample accurate".
As @Brad noted in his comment on your question, MIDI is indeed very slow. But that's only part of the problem... when you are working in a block-based environment, incoming MIDI events cannot be processed by the plugin until the start of a block. When computers were slower and block sizes of 512 (or god forbid, >1024) were common, this introduced a non-trivial amount of latency, which resulted in the arrangement not sounding as "tight". Therefore sequencers came up with a clever way to get around this problem. Since the MIDI events are already known ahead of time, they can be sent to the instrument one block early with an offset in sample frames. The plugin then receives these events at the start of the block and knows not to start actually processing them until N samples have passed. This is what "sample accurate" means in sequencers.
However, if you are dealing with live input from a keyboard or some sort of other MIDI device, there is no way to "schedule" these events. In fact, by the time you receive them, the clock is already ticking! Therefore these events should just be sent to the plugin at the start of the very next block with an offset of 0. Sequencers such as Ableton Live, which allow a plugin to simultaneously receive both pre-sequenced and live events, simply send any live events with an offset of 0 frames.
Since you are using a very small block size, the worst-case scenario is a latency of 0.7 ms, which isn't bad at all. In the case of rtmidi, the timestamp does not represent an offset you need to schedule around, but rather the time at which the event was captured. But since you only intend to receive live events (you aren't writing a sequencer, are you?), you can simply pass any incoming MIDI to the plugin right away.
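The two quantities discussed above are simple arithmetic. A sketch (function names are illustrative): the worst case is an event arriving just after a block starts, so it waits one full block; and if you ever did want to schedule rather than send with offset 0, an rtmidi delta time could be mapped to a sample offset within the next block.

```python
def worst_case_latency_ms(block_size, sample_rate):
    """Latency of an event that arrives just after a block begins:
    it must wait one full block before the plugin can see it."""
    return 1000.0 * block_size / sample_rate

def delta_to_sample_offset(delta_seconds, sample_rate, block_size):
    """Sample offset of an event within the next block, clamped to
    the block length (only relevant if you schedule events at all)."""
    return min(int(delta_seconds * sample_rate), block_size - 1)
```

At 32 samples and 44.1 kHz the worst case works out to roughly 0.73 ms, which is where the ~0.7 ms figure comes from.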
