Access to live stream from Nest Cam - nest-api

I'm the developer of a comprehensive Apple TV app for Nest called Feather (featherapp.co). One issue I've run into is that users expect to be able to view the live feed from their cameras.
Has anyone determined any way of accessing the live feed of the camera? I've done quite a bit of reverse engineering but I believe the stream itself is protected by some sort of DRM.
It looks like it's an RTMPS stream with a format like the one below:
rtmps://oculus387-vir.dropcam.com/nexus/[cameraid]
with some parameters:
_sessionToken, _isHD, _camera.uuid, time
I've tried a number of things but I'm never really able to establish a connection to the source. I'm a little out of my depth here, as an application developer getting into the more hardcore streaming technology. Any insight would be really appreciated!

This should now be possible with Device Access.
GenerateRtspStream
Request a token to access an RTSP live stream URL.
RTSP live stream URLs cannot be shared between clients. A stream URL can only be used by one client at a time. If multiple clients want to stream from the same camera at the same time, RTSP commands must be sent for each individual client, and each individual client must use its own stream URL.
Source: https://developers.google.com/nest/device-access/traits/device/camera-live-stream#generatertspstream
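For anyone wiring this up, here is a minimal sketch of executing that command against the Smart Device Management API (TypeScript, Node 18+ built-in fetch; PROJECT_ID, DEVICE_ID and ACCESS_TOKEN are placeholders you must supply yourself):

// Hypothetical sketch: request a tokenized RTSP stream URL via Device Access.
const PROJECT_ID = "your-device-access-project-id"; // placeholder
const DEVICE_ID = "your-camera-device-id";          // placeholder
const ACCESS_TOKEN = "your-oauth-access-token";     // placeholder

async function generateRtspStream() {
  const res = await fetch(
    `https://smartdevicemanagement.googleapis.com/v1/enterprises/${PROJECT_ID}/devices/${DEVICE_ID}:executeCommand`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        command: "sdm.devices.commands.CameraLiveStream.GenerateRtspStream",
        params: {},
      }),
    }
  );
  const { results } = await res.json();
  // results.streamUrls.rtspUrl carries the tokenized RTSP URL; per the docs
  // the token expires after a few minutes and must be renewed with ExtendRtspStream.
  console.log(results.streamUrls.rtspUrl);
}

generateRtspStream();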

There is no way to access the live stream using an API.

You can't access the live stream in any normal way, but you may be able to get one frame per second from the old Android API. I have tried this, but I think they patched it, or it doesn't work with the new cameras.
#!/bin/bash
# Grab ~300 snapshot frames from the old endpoint, then stitch them into a video.
i=1
while [ $i -le 300 ]
do
    curl 'https://home.nest.com/dropcam/api/cameras/_your camera url etc' \
        -H 'Cookie: YOUR_COOKIE ETC' \
        -H 'Accept: application/json, text/javascript, */*; q=0.01' \
        -H 'Referer: https://home.nest.com/' \
        -H 'X-Requested-With: XMLHttpRequest' \
        -H 'Connection: keep-alive' \
        --compressed -o "$(printf 'nest testing/%04d.jpg' $i)"  # zero-padded so ffmpeg's %04d pattern matches
    let i+=1
    echo $i
done
# Assemble the numbered frames into a video.
ffmpeg -r 25 -start_number 1 -f image2 -i "nest testing/%04d.jpg" -vcodec png video.avi

Related

Question for Node multithreading, media consuming and piping to HTTP response

I have an interesting problem, in short: how to share information between threads in NodeJS (12+).
The tech stack, also in short:
a remote/online streaming server that produces an MP4 live stream
a client application that only consumes the live view through RTSP over HTTP
a small NodeJS-based application that gets the MP4, transforms it, and pipes it back to the client
The modules I use:
NodeJS 12+
Request/fetch/https module
Express module
Stream module
The story:
I have an application that has a gateway/relay role between two different systems. One provides a live media stream (a simple MP4 (h264) stream) and the other is supposed to consume it as RTSP over HTTP. The weird part is that the consumer client does not behave like any other player (such as VLC or a web player): sometimes, seemingly at random, it resends the request, and sometimes it closes the current request and resends it. So a direct pipe does not really work for this use case.
I made a worker (from worker_threads) that holds a readable stream object, and when the client hits the request, I start populating the MP4 stream into the readable object in the worker, so even if the stream gets a close or a resend, it will not break the live media stream consuming process.
And whenever the client connects, I just want to pipe the readable object to it.
Originally, I thought a simple pipe from request/fetch/http.get or FFMPEG would be enough, but the client can repeat the call anywhere between 3 seconds and 2 minutes later.
So, my question is: what could be the best solution to pass the data back from the worker to the main thread and make it reachable from the HTTP routing?
Some ideas I have had:
I know I can have my own channel between the threads and pass information back and forth, but as far as I know, waiting for a message while keeping the process up blocks the app (worker.on('message', (stuff) => {});) — see the sketch after this list.
Using Socket.io to pass data back from the worker, populate the readable in the main thread, and pipe the readable at the HTTP level (basically a fake shared object)
Creating a secondary HTTP server that offers the media stream, then just relaying that into the response (e.g. gatewaying/proxying)
Looking for a proxy solution where I can simply redirect and reshape things, i.e. transform the input MP4 into an RTSP stream and pipe it to the consumer's response
Should I just "remember" the active stream, and while it is streamed by the remote server, always use the same URL, pass it to FFMPEG, and keep piping to the res?
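For what it's worth, here is a minimal sketch of the relay I am describing, assuming a hypothetical streamWorker.js that posts MP4 chunks back with parentPort.postMessage (TypeScript; backpressure is ignored for brevity):

// main.ts: relay chunks from the worker to every connected HTTP client
import http from "http";
import { Worker } from "worker_threads";
import { PassThrough } from "stream";

const worker = new Worker("./streamWorker.js"); // hypothetical: fetches the remote MP4
const live = new PassThrough();                 // shared in-memory readable

// Note: 'message' handlers do not block the event loop; they are queued
// callbacks like any other I/O event.
worker.on("message", (chunk: Uint8Array) => live.write(chunk));

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "video/mp4", Connection: "keep-alive" });
  live.pipe(res, { end: false }); // end:false keeps `live` alive across resends
  req.on("close", () => live.unpipe(res));
}).listen(8080);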
Note:
I set all the headers to keep the connection alive, but the client software seems to behave the same regardless.
By default it uses RTSP and RTP/TCP to consume the video stream, but it has an option for RTSP over HTTP.
I am probably overlooking some trivial step for serving RTSP video from a remote live MP4, but I have not found any good example or source anywhere (basically the same 3 articles re-shared everywhere).
I have not found any similar question or article anywhere (though I did check out nodejs ffmpeg play video at specific time and stream it to client).

Does Nest provide an API for my app to auto-detect/discover a new device being added

I created an app using the API instructions on the developer.nest.com website. The app can get a list of existing devices and control them the way I want.
But what if I want the app to do something special anytime I add a new device? Is this possible today? Has anyone done this successfully?
You can listen for updates in several different ways. It depends on what you're using: Firebase, REST Streaming, REST polling, etc.
Basically, you'll need to create a listener in your code and then check for triggered events. You can listen at different levels: a structure, or the whole account (root level).
If you have a REST stream, you can use this curl procedure to see what gets triggered, as JSON:
1. Open a console and the Nest developer tool.
2. Run this at the console (root level) with the access token* you have for your client:
curl -L -v "https://developer-api.nest.com?auth=" -H "Accept: text/event-stream"
3. Remove any thermostat from your virtual account.
4. Check the triggered events in the console: you'll have your JSON missing the data for the thermostat you just removed.
5. Now go one level up, to the structures: curl -L -v "https://developer-api.nest.com/structures/?auth=" -H "Accept: text/event-stream"
6. Repeat the same procedure with the virtual devices.
*To get an access token for your client, go here: https://developer.nest.com/documentation/cloud/how-to-auth
In case you want to check this out at the code level, you can find it in the Nest Sample Code. All of the samples have listeners that let you see any changes happening to the thermostats in the account/structure.
P.S. It's better to use Firebase or REST Streaming if you want those events to be triggered in real time.
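As a rough illustration (not official sample code), a Node listener for the root-level stream could look like this, assuming the eventsource npm package:

// Hypothetical sketch: watch the REST stream and react to device changes.
import EventSource from "eventsource";

const token = process.env.NEST_TOKEN ?? ""; // your client's access token
const source = new EventSource(`https://developer-api.nest.com?auth=${token}`);

// Each data change arrives as a "put" event carrying the changed JSON subtree;
// comparing successive snapshots of /devices would reveal a newly added device.
source.addEventListener("put", (event: MessageEvent) => {
  const { path, data } = JSON.parse(event.data);
  console.log("changed:", path, data);
});

source.onerror = () => console.error("stream error");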

NodeJS piping with ffmpeg

I wanted to do an HTTP live stream of a screencast using ffmpeg, nodejs and HTML5. I wanted it to be as real-time as possible. However, I find that the video received by the client is 1~2 seconds behind (on Chrome/Chromium). I am using vp8/webm as my codec.
I have eliminated the following factors:
1) Network: I have tried serving and receiving the video locally, setting the video source to 127.0.0.1:PORT or localhost:PORT
2) ffmpeg encoding speed: I have tried outputting the file locally, and the "delay" seems to be negligible.
3) Chrome's internal buffer: the buffer was measured at 0.07s~0.08s.
On the NodeJS side, I have a child process that runs the ffmpeg command and does ffmpeg.stdout.pipe(res); <-- ffmpeg is child_process.spawn(...)
So it seems that the ffmpeg.stdout.pipe(res) in nodejs is the one delaying the video stream. Am I correct in assuming so? Is there any way to reduce the delay?
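For reference, a stripped-down sketch of my setup (the capture source and flags are illustrative; x11grab assumes Linux):

// Spawn ffmpeg and pipe its stdout straight into each HTTP response.
import http from "http";
import { spawn } from "child_process";

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "video/webm" });
  const ffmpeg = spawn("ffmpeg", [
    "-f", "x11grab", "-i", ":0.0",   // screen-cast input (platform-specific)
    "-c:v", "libvpx",                // vp8 encoder
    "-deadline", "realtime",         // libvpx's low-latency mode
    "-f", "webm", "pipe:1",          // webm container written to stdout
  ]);
  ffmpeg.stdout.pipe(res);           // the pipe in question
  req.on("close", () => ffmpeg.kill("SIGKILL"));
}).listen(8000);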
Go with WebRTC; there is no need to implement anything like codecs, pipes, etc. yourself (it's already in Chrome, Opera, Firefox).
It uses:
MediaCapture API (accesses your cam and mic and converts the capture to an object URL; by default they use the vp8 codec, etc.)
RTCPeerConnection API (sends and receives media streams p2p)
RTCDataChannel API (sends and receives data p2p)
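A tiny browser-side sketch of those three APIs together (signaling is omitted; you still have to exchange the offer/answer and ICE candidates yourself):

async function start() {
  // MediaCapture: grab cam + mic (browsers encode with vp8 by default)
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  // RTCPeerConnection: send/receive media peer-to-peer
  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  // RTCDataChannel: send/receive arbitrary data peer-to-peer
  const channel = pc.createDataChannel("data");
  channel.onopen = () => channel.send("hello");
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer); // hand `offer` to your signaling layer
}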

How to take Node.js stream and send it to form endpoint (dropbox)

I have a readable stream (from S3) that I would like to pipe to Dropbox's put endpoint; however, that endpoint does not support the Transfer-Encoding: chunked header required for streaming data.
I see 2 possible solutions:
read the stream into a variable of some sort and then send it up, but memory can then become a problem
write the stream to disk, then read it back and upload, which feels dirty and will be slow
What is the best solution to this problem?
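As an illustration of the first option, here is a sketch that buffers the stream in memory so the request goes out with an explicit Content-Length instead of chunked encoding, shown against Dropbox's current v2 upload endpoint (Node 18+ fetch; the target path is made up):

import { Readable } from "stream";

// Collect the whole readable into one Buffer (memory grows with file size).
async function streamToBuffer(stream: Readable): Promise<Buffer> {
  const chunks: Buffer[] = [];
  for await (const chunk of stream) chunks.push(Buffer.from(chunk));
  return Buffer.concat(chunks);
}

async function uploadToDropbox(s3Stream: Readable, token: string) {
  const body = await streamToBuffer(s3Stream);
  await fetch("https://content.dropboxapi.com/2/files/upload", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Dropbox-API-Arg": JSON.stringify({ path: "/upload.bin" }), // made-up path
      "Content-Type": "application/octet-stream",
      // fetch derives Content-Length from the Buffer, so no chunked encoding
    },
    body,
  });
}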

How to create an RTSP streaming server

So I am trying to create an RTSP server that streams music.
I do not understand how the server plays music and how different requests get whatever is playing at that time.
So, to organize my questions:
1) How does the server play a music file?
2) What does the request to the server look like to get what's currently playing?
3) What does the response look like, so that the music plays in the client that requested it?
First: READ THIS (RTSP), and THEN READ THIS (SDP), and then READ THIS (RTP). Then you can ask more sensible questions.
It doesn't; the server streams little parts of the audio data to the client, telling it when each part is to be played.
There is no such request. If you want, you can have a URL for live streaming, and in the response to the RTSP DESCRIBE request tell the client what is currently on.
Read the first (RTSP) document; it's all there! The answer to your question is this:
RTSP/1.0 200 OK
CSeq: 3
Session: 123456
Range: npt=now-
RTP-Info: url=trackID=1;seq=987654
But to get the music playing you will have to do a lot more to initiate a streaming session.
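For context, the client-side exchange leading up to that response typically looks roughly like this (URL, ports and IDs are made up):

DESCRIBE rtsp://example.com/radio RTSP/1.0
CSeq: 1

SETUP rtsp://example.com/radio/trackID=1 RTSP/1.0
CSeq: 2
Transport: RTP/AVP;unicast;client_port=8000-8001

PLAY rtsp://example.com/radio RTSP/1.0
CSeq: 3
Session: 123456
Range: npt=now-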
You should first be clear about what RTSP and RTP are. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in communications systems to control streaming media servers, whereas most RTSP servers use the Real-time Transport Protocol (RTP) for the media stream delivery itself. RTP uses UDP to deliver the packet stream. Try to understand these concepts first.
Then have a look at this project:
http://sourceforge.net/projects/unvedu/
This is an open-source project developed by our university, which is used to stream video (MKV) and audio files over UDP.
You can also find a .NET implementation of RTP and RTSP at https://net7mma.codeplex.com/, which includes an RTSP client and server implementation and many other useful utilities, e.g. implementations of many popular digital media container formats.
The solution has a modular design and, at the time of writing, better performance than ffmpeg or libav.
