Hope someone can help out here. I have been trying out IIS Smooth Streaming for many weeks, but without much success.
On-demand Smooth Streaming
- No problems with streaming on-demand clips on LAN
- No problems with streaming on-demand clips across the web
Live streaming using file source
- No problems with live streaming a file on LAN
- Cannot live stream a file across the web
- Initiated a publishing point on AWS EC2
- Connected Encoder Pro to the publishing point
- Publishing point never gets past "Starting"
Live streaming using live webcam
- Slight problem with live streaming from my webcam on LAN
- About 10 seconds of lag
- After roughly 20 seconds, the Silverlight client hangs and stops requesting chunks
- HTTP 412 - Precondition Failed
- Only way to rectify it is to refresh the browser
- Cannot live stream from the webcam across the web
- Initiated a publishing point on AWS EC2
- Connected Encoder Pro to the publishing point
- Publishing point never gets past "Starting"
Things I have tried to rectify network problems
- Connecting my laptop directly to the gateway, rather than through a router
- Shutting down the Windows firewall on the laptop
- Launching an AWS EC2 instance with no firewall
- Wireshark indicates HTTP 404 and HTTP 501 errors when "connecting" to the publishing point from the encoder
My LAN specs
- Running the Encoder and IIS Streaming Server on a Boot Camp MacBook Pro, i7, 2.2 GHz, 4 GB RAM
- Running the Silverlight client on an i5, 2.53 GHz, 4 GB RAM
- Output stream: default configuration for H.264 IIS Smooth Streaming Low Bandwidth
To test streaming you really need to use separate PCs as the streamer/encoder, the transport server, and the client, or at least start off that way. You are asking a bit too much of that MacBook Pro, especially when it comes to I/O.
Related
I am developing a framework for image processing using the deep learning network YOLOv7. I have a live data stream of 20,000 images per second, produced by a real-time machine, and I want to handle these 20,000 frames in minimal time on the current GPU architecture. Is it possible to increase the processing FPS using any open-source tools or techniques, such as multithreading or Kafka?
Any suggestions or ideas will be appreciated.
Note: I am using an NGINX server and a Docker/Kubernetes combo.
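For the Kafka route, the usual pattern is to decouple frame capture from inference: the machine produces frames to a topic, and a pool of consumer processes (one per GPU, or one per Kubernetes pod) pulls batches and runs YOLOv7 on them. Below is a minimal sketch of the consumer side in TypeScript using kafkajs; the broker address, the `frames` topic, the `yolo-workers` group, and the HTTP inference endpoint are illustrative assumptions, not part of the original setup.

```typescript
import { Kafka } from "kafkajs";

// Broker address, topic, and group id below are assumptions -- adjust to your cluster.
const kafka = new Kafka({ clientId: "frame-consumer", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "yolo-workers" });

async function run(): Promise<void> {
  await consumer.connect();
  await consumer.subscribe({ topic: "frames", fromBeginning: false });

  await consumer.run({
    // eachBatch hands the GPU a batch of frames at once instead of one image per call.
    eachBatch: async ({ batch, resolveOffset, heartbeat }) => {
      // Each message value is assumed to be one JPEG-encoded frame.
      const frames = batch.messages
        .map((m) => m.value)
        .filter((v): v is Buffer => v !== null);

      // Hypothetical YOLOv7 inference service (e.g. a GPU container behind NGINX);
      // the URL and payload format are placeholders.
      await fetch("http://localhost:8000/infer", {
        method: "POST",
        headers: { "content-type": "application/octet-stream" },
        body: Buffer.concat(frames),
      });

      // Mark the batch as processed and keep the consumer session alive.
      for (const m of batch.messages) resolveOffset(m.offset);
      await heartbeat();
    },
  });
}

run().catch(console.error);
```

Scaling out is then a matter of partition count on the topic and consumer replicas in Kubernetes; whether 20,000 FPS is actually reachable still depends on the GPU, the model, and the batch size, not on the transport.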
I'm playing around with Azure Media Services and the live streaming feature. I'm using OBS as my streaming source and just trying to stream my desktop from my laptop and view it on my desktop machine.
It all works fine, but there is a tremendous lag time (north of 30 s). That's not really a "live" stream. I tried creating my live event with "low latency" checked to see if that would improve the lag time, but it doesn't appear to have done anything.
I'm just doing a simple pass-through, so no encoding on the server or anything. Is there something else I can do to improve the lag time besides the low-latency toggle?
There's good info at https://learn.microsoft.com/en-us/azure/media-services/latest/live-event-latency that covers the low-latency feature on the live event as well as a setting you configure on the player. It also discusses the expected latencies.
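For reference, the player-side setting mentioned above is the heuristic profile. The snippet below is a minimal sketch that assumes the Azure Media Player script is already loaded on the page and that the low-latency profile is exposed as heuristicProfile: "LowLatency" in the player options; verify the exact option name and value against the linked doc and the AMP documentation.

```typescript
// Azure Media Player is loaded from its CDN <script> tag and exposed as a global,
// so it is declared here rather than imported.
declare const amp: any;

const player = amp("azuremediaplayer", {
  techOrder: ["azureHtml5JS", "html5"],
  autoplay: true,
  controls: true,
  // Assumed option/value for the low-latency heuristics profile -- check the AMP docs.
  heuristicProfile: "LowLatency",
});

player.src([
  {
    // Placeholder streaming-locator URL for the live event.
    src: "https://<streaming-endpoint>/<locator>/manifest",
    type: "application/vnd.ms-sstr+xml",
  },
]);
```

Even with pass-through and the low-latency profile, some delay is inherent to segment-based delivery; the linked doc quantifies the latencies you can reasonably expect.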
I have a video-on-demand platform hosted on a server with 1 GB bandwidth / 11 GB RAM / 4 CPU cores.
I'm now reaching 10,000 users/day and the stream is starting to get slow.
I'm using Node.js on the server side with the send module for streaming.
I want to provide the best streaming experience for my users, so what do I need to do to make the stream faster?
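For context, a typical route built on the send module looks like the sketch below; the /videos directory and the route path are assumptions. send already handles HTTP Range requests, so the usual next step at this scale is to make responses cacheable and put a CDN or an nginx reverse proxy in front of Node so the server's bandwidth stops being the bottleneck.

```typescript
import express from "express";
import send from "send";

const app = express();

// Streams files from an assumed /videos directory; send handles Range requests itself,
// so seeking and partial content work out of the box.
app.get("/stream/:file", (req, res) => {
  send(req, req.params.file, {
    root: "/videos", // assumed media directory
    maxAge: "7d",    // Cache-Control max-age so a CDN / reverse proxy can cache
  })
    .on("error", (err: any) => {
      res.statusCode = err.status || 500;
      res.end();
    })
    .pipe(res);
});

app.listen(3000);
```

With cacheable, range-friendly responses, most of the 10,000 users/day can be served from the CDN edge rather than from the Node process itself.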
I'm trying to create a media streaming server which will stream images captured from a camera to connected JavaScript/HTML clients.
Currently, I have developed a Windows service which captures images and sends them to multiple clients through continuous polling; however, performance is lagging. For example, it congests the network with too much traffic and introduces delay into the streams.
The service is running on a Hyper-V VM with 6 cores and 8 GB of memory.
Where is the lag coming from? Any suggestions?
You must implement a queue or use something like SignalR to push updates instead of polling (see the sketch below). Also look for ways to reduce the traffic sent over the network, such as zipping (compression) or caching.
Look at Hangfire.io; it's good too.
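As a concrete illustration of the push model, here is a minimal browser-side sketch using the @microsoft/signalr client. The hub URL /frameHub and the ReceiveFrame method name are placeholders that the server-side hub would define; the Windows service would then push a frame only when a new image is captured, instead of every client polling.

```typescript
import * as signalR from "@microsoft/signalr";

// Hub URL and method name are placeholders -- they must match the hub the service exposes.
const connection = new signalR.HubConnectionBuilder()
  .withUrl("/frameHub")
  .withAutomaticReconnect()
  .build();

// The server pushes each new frame once; clients no longer poll.
connection.on("ReceiveFrame", (jpegBase64: string) => {
  const img = document.getElementById("camera") as HTMLImageElement;
  img.src = `data:image/jpeg;base64,${jpegBase64}`;
});

connection.start().catch(console.error);
```

Pairing this with JPEG re-compression, or pushing a frame only when it differs from the previous one, covers the "reduce traffic" point above.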
I have created a channel on Azure Media Services, configured it correctly as an RTMP channel, and streamed live video from Android using FFmpeg libraries.
The problem is the client end-point latency.
I need a maximum latency of ~2 seconds, but right now I have about ~25 seconds!
I'm using Azure Media Player in a browser page to stream the content.
Do you know a configuration of the client/channel which can reduce the latency?
Thanks
As you pointed out, there are a few factors which affect latency.
Total delay time =
1. time to push video from the client to the server
2. server processing time
3. latency for delivering content from the server to the client
Check https://azure.microsoft.com/en-us/documentation/articles/media-services-manage-origins/#scale_streaming_endpoints to see how you can minimize #3 above by configuring a CDN and scaling the streaming units on your streaming endpoint.
Given these three components, I don't think at this stage you will be able to achieve less than 2 seconds of end-to-end delay globally, from the Android client to the browser client.
The easiest way to check latency is: ffplay -fflags nobuffer rtmp:///app/stream_name
as I did in this video: https://www.youtube.com/watch?v=Ci646LELNBY
Then, if there's no latency with ffplay, it's the player that introduces the latency.