Is Instagram using x265 (i.e. H.265/HEVC) as an encoding format to further compress and reformat uploaded videos for mobile devices?
I tried running ffprobe on a downloaded video (after sniffing the HTTPS traffic and pulling the URL out of the GET request), but it turned out to be H.264. I'm not sure whether they use H.265 at all.
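For reference, this is roughly the check I ran (the filename is a placeholder):

```
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name -of default=noprint_wrappers=1 downloaded_video.mp4
```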
Any help would be appreciated in this regard - if not Instagram, then any other video streaming social sites that are doing the same?
I want to play RTSP streams from IP video cameras (MP4, H264) on my intranet web page; I use React. I have 12 cameras and an NVR.
I did not find a way to do this without an intermediate server (WebRTC is not suitable) that spends resources on transcoding the H264 stream to MJPEG.
If I set a high resolution and quality for the stream, then a lot of resources are spent on transcoding, and, most importantly, streaming the MJPEG images takes a lot of traffic.
Is there a way or solution to stream from the IP camera directly to the web page so that decoding happens on the user's web browser side?
This would free the intermediate server from the heavy load of big streams.
Playback also needs to work on mobile phones.
Thanks in advance.
There is no way to stream an RTSP camera's H264 video directly to a web browser.
But cameras support outputting still JPEG images, so you can create a webpage that displays a fresh image from the camera every 200 ms or so.
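A minimal sketch of that approach, assuming the camera exposes a JPEG snapshot endpoint (the URL here is hypothetical - check your camera's documentation):

```typescript
// Poll the camera's JPEG snapshot endpoint every 200 ms.
const SNAPSHOT_URL = "http://192.168.1.10/snapshot.jpg"; // hypothetical endpoint
const img = document.getElementById("camera") as HTMLImageElement;

setInterval(() => {
  // The cache-busting query string forces the browser to re-fetch each frame.
  img.src = `${SNAPSHOT_URL}?t=${Date.now()}`;
}, 200);
```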
If you are not happy with the above solution, you must use a media server in between, which will pull the RTSP stream from the camera and convert it to some protocol the browser understands. You are mistaken about one thing: no video transcoding is involved. I don't know why WebRTC is not an option for you, but most media servers will offer four types of output:
Low latency:
WebRTC
Websockets to MSE
High latency:
HLS
MPEG-Dash
None of these methods requires transcoding your original H264 video as encoded by the RTSP camera/NVR. Some media servers you can use: Unreal Media Server, Wowza, Janus.
Live demo: http://www.umediaserver.net/umediaserver/demos.html
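As an illustration of the high-latency HLS path, here is a hedged sketch using the open-source hls.js library (the playlist URL is a placeholder for whatever your media server exposes):

```typescript
import Hls from "hls.js";

const video = document.getElementById("player") as HTMLVideoElement;
const src = "https://mediaserver.example/live/camera1/playlist.m3u8"; // placeholder

if (Hls.isSupported()) {
  // hls.js parses the playlist and feeds the H264 fragments to the
  // browser through Media Source Extensions - no transcoding involved.
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  video.src = src; // Safari and iOS play HLS natively
}
```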
No browser has native RTSP support, so if you want decoding to happen on the end user's side, you'll have to write your own custom web player.
You can start by looking at an open-source solution like this one:
git://github.com/Streamedian/html5_rtsp_player.git
It works on PC and Android, but it didn't work on iPhone for me (you can try it yourself at https://streamedian.com/demonstration/ - maybe it's just my issue). You may find a better alternative, or fork it and make it work on all devices.
It still requires a middleman proxy server, though, because it uses WebSockets to deliver the stream; but since the proxy doesn't do any video converting or decoding, it shouldn't consume many resources at all.
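To illustrate the mechanism such players rely on, here is a rough sketch of feeding fragmented MP4 from a WebSocket proxy into Media Source Extensions (the proxy URL and codec string are assumptions for illustration; a real player must match the codec string to the camera's actual H264 profile/level):

```typescript
const video = document.getElementById("player") as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const buffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  const ws = new WebSocket("wss://proxy.example/camera1"); // hypothetical proxy
  ws.binaryType = "arraybuffer";
  const queue: ArrayBuffer[] = [];

  // Fragments must be appended one at a time, so queue them while
  // the SourceBuffer is busy with the previous append.
  ws.onmessage = (event) => {
    queue.push(event.data as ArrayBuffer);
    if (!buffer.updating) buffer.appendBuffer(queue.shift()!);
  };
  buffer.addEventListener("updateend", () => {
    if (queue.length) buffer.appendBuffer(queue.shift()!);
  });
});
```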
I have a content creation site I am building, and I'm confused about audio and video.
If a content creator's audio or video file is stored in S3 and I want to display it, will the HTML video or audio player stream the media, or will it download the file fully and then play it?
I ask because the video or audio could be significantly long - two hours, for example. I need to know how to handle that use case.
Lastly, what file type is most widely accepted for viewing on web pages? It seems like MPEG-4 is the best bet. Is that true?
Most video player clients and browsers will attempt to stream the video if they can.
For an mp4 video file hosted on a server, as long as the header (the moov atom) is at the start of the file and the server accepts range requests, the player will download the video in chunks and start playing as soon as it has enough to decode the first frames.
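A quick way to check whether a server honors range requests is to ask for just the first kilobyte; a small sketch (the URL is a placeholder):

```typescript
// A range-aware server answers 206 Partial Content; 200 means it
// only serves the full file.
const url = "https://cdn.example/videos/movie.mp4"; // placeholder

const response = await fetch(url, { headers: { Range: "bytes=0-1023" } });
console.log(response.status); // 206 if ranges are supported
console.log(response.headers.get("Content-Range")); // e.g. "bytes 0-1023/734003200"
```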
More professional streaming services will generally use an adaptive bitrate streaming protocol like DASH or HLS (see this answer: https://stackoverflow.com/a/42365034/334402); again, the video is delivered in chunks, or segments, and starts playing while it is still downloading.
To answer your last question, you need to be aware that the raw video is encoded (e.g. h.264, VP9, etc.) and the video, audio, subtitle, etc. tracks are stored in a container (e.g. mp4, WebM, etc.).
The most common combination at this time is probably h.264 encoding in an mp4 container.
The particular h.264 profile can also matter depending on the device - baseline is probably the most widely supported profile at this time. You can find examples of media support for different devices online, e.g. for Android: https://developer.android.com/guide/topics/media/media-formats
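If you want to check support programmatically, browsers expose capability queries; a short sketch (the codec strings are examples):

```typescript
// "avc1.42E01E" is h.264 Baseline; "mp4a.40.2" is AAC-LC.
const probe = document.createElement("video");

// canPlayType answers "probably", "maybe", or "" per the HTML spec.
console.log(probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"'));

// MSE-based players have a parallel static check:
console.log(MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"'));
```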
@Mick's answer is spot on. I'll just add that mp4 (with h264 encoding) will work in just about every browser out there.
The issue with mp4 files (especially a two-hour movie) isn't so much the seeking and streaming. If your creator uploads a 4K video, that's what you'll deliver to everyone, even mobile phones. HLS streaming, on the other hand, has adaptive bitrates: the video adapts to both the screen and the available network speed. You'll get better playback results with less buffering (and, if you're using AWS, a lot less data egress) with video streaming.
(There are a bunch of APIs and services that can help you do this, including api.video (where I work), Mux, and others.)
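To make "adaptive bitrate" concrete: an HLS master playlist just lists several renditions of the same video and lets the player switch between them based on screen and network conditions. A simplified, illustrative example:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=3840x2160
2160p/playlist.m3u8
```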
I am new to media streaming and have just started learning about adaptive streaming.
I have a few queries; please clarify:
1. Does MSE support only DASH streaming? I mean, if a website uses DASH and my browser supports MSE, it plays; but if a website uses HLS, my browser does not play the video content even though it has MSE. Is that because MSE does not support HLS, or because my browser's MSE does not have an implementation of HLS?
2. When I inspect a webpage playing a video stream, many sites use a video tag whose "src" attribute is a blob. Does a blob mean it is using MSE? Can we have a blob in the "src" attribute for DASH (I checked on YouTube) as well as for HLS (as on Dailymotion or Twitch)?
3. I was reading a few articles on twitch.tv - does twitch.tv only support HLS via an HTML5 player or Flash? If a browser supports neither Flash nor HLS through an HTML5 player, is there no way to play twitch.tv content in that browser?
Thanks
Media Source Extensions (MSE) supports anything you can de-mux in JavaScript and send to the browser's native codecs. Browsers don't support DASH natively. Some browsers support HLS natively, but most don't. It is possible to use both DASH and HLS in browsers that support MSE, given the correct JavaScript library for handling each.
The blob you see could be a regular Blob (an immutable chunk of binary data), but more than likely it's coming from MSE.
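That is because MSE-based players point the video tag at an object URL, which is what shows up as blob: in the inspector; a minimal sketch:

```typescript
const video = document.querySelector("video")!;
const mediaSource = new MediaSource();

// createObjectURL wraps the MediaSource in a blob: URL that the
// video tag can use as its src.
video.src = URL.createObjectURL(mediaSource);
console.log(video.src); // e.g. "blob:https://example.com/1e2f..."
```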
I can't speak to what Twitch does internally.
Your questions don't really make sense as asked, so I can't answer 1, 2, 3 directly. But I can clear up some of your confusion.

HLS and DASH are collections of technologies, not single competing technologies. Most HTTP streaming protocols are made up of a binary video format and a text-based manifest format. DASH uses an overly complex XML manifest format with a fragmented MP4 video format. HLS uses an m3u8 manifest with fragmented transport streams for the video format; as of iOS 10, HLS also supports fragmented MP4.

MSE can play fragmented MP4, but browsers don't read manifests. Hence a player application must download and parse the manifest, download the video fragments, and then give them to the browser to play. Twitch uses HLS with transport streams, but runs custom software in the browser to convert them to MP4 fragments (or FLV streams in the case of Flash).

When you see a src with a blob, that is a normal (not fragmented) MP4, and is completely different. Safari is an exception: it can play HLS natively, using an m3u8 manifest as the source.
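For example, with DASH the manifest parsing and fragment fetching happen in a JavaScript player such as dash.js; a hedged sketch (the MPD URL is a placeholder):

```typescript
import * as dashjs from "dashjs";

// dash.js downloads and parses the MPD manifest, fetches the fMP4
// fragments, and pushes them into the browser through MSE - the
// browser itself never sees the manifest.
const video = document.querySelector("video")!;
const player = dashjs.MediaPlayer().create();
player.initialize(video, "https://example.com/stream/manifest.mpd", true);
```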
I have a common use case where I want to do the following:
Upload an audio file (wav/mp3).
Transcode it to 128k or 192k mp3.
Store the audio asset.
Allow the audio asset to be streamed.
Support streaming actions such as play, pause, and seek.
The documentation for Azure Media Services suggests it might support this, but I am not too sure; it seems to focus on video content. Anyone have experience with this?
You can manage audio and encode audio-only assets with Azure Media Services.
WAV is a supported input format/container for an input asset. To see the full list of supported formats, check the following link:
https://azure.microsoft.com/en-us/documentation/articles/media-services-media-encoder-standard-formats/
Check https://github.com/Azure/azure-content/blob/master/articles/media-services/media-services-custom-mes-presets-with-dotnet.md#audio_only to see the audio-only preset options you can use when encoding an audio-only asset.
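For orientation, a custom Media Encoder Standard preset for an audio-only job is a JSON document roughly along these lines. This is a simplified sketch, not a verified preset: note it produces AAC in an MP4 container rather than the mp3 the question asked for, so check the linked reference for the exact schema and supported codecs:

```json
{
  "Version": 1.0,
  "Codecs": [
    {
      "Type": "AACAudio",
      "Profile": "AACLC",
      "Channels": 2,
      "SamplingRate": 44100,
      "Bitrate": 128
    }
  ],
  "Outputs": [
    {
      "FileName": "{Basename}_{AudioBitrate}.mp4",
      "Format": { "Type": "MP4Format" }
    }
  ]
}
```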
I have every stream type enabled on my Wowza server.
A week ago I had it working in the Wowza test players;
now the only one that works is RTSP in VLC -
every other stream just shows a black screen.
If I try to access the m3u8 via the Safari browser, I can hear the audio but get no video.
Any assistance on this would be a major help.
This is due to an incompatible video/audio codec. You need to use the right encoder, or you can use the Wowza Transcoder to transcode your stream to H.264/AAC; that will resolve your playback problem. If you are using Flash Media Live Encoder, select the H.264 encoding method, and set the audio codec to AAC or MP3.
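If you would rather fix the feed at the source with ffmpeg instead, a hedged example of pushing a re-encoded stream to Wowza (the source URL, server address, and application/stream names are placeholders):

```
ffmpeg -i rtsp://camera.example/stream \
       -c:v libx264 -preset veryfast -profile:v baseline \
       -c:a aac -b:a 128k \
       -f flv rtmp://wowza.example:1935/live/myStream
```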
Some quick troubleshooting steps:
What source encoder are you using? Can you play back the source URL in VLC as well? What codec info does VLC display for the source stream? It should be one of the codecs supported by Wowza.
What codec info does VLC display for the RTSP playback link generated by your Wowza server?
Do you see any errors or warnings in the access log when you publish and play back your HLS (m3u8) stream?
Usually this kind of error is due to an incorrect video codec, an encoding setting, or network saturation where the video packets are not coming through correctly. You can check what error or warning messages are being generated by tailing the access log in the logs/ folder of your Wowza installation directory.
Hope this helps.