I want to create a project which:
a) Has a list of MP3 files, each with its length.
b) Streams them.
c) Uses a timer so that a new track starts after a certain interval.
d) Is accessible at a URL like http://localhost:1024/livemusic
Like we find on live streaming websites. If the songs are accessible via a URL, I guess they will be accessible from a mobile phone too.
What would be the best way to do this?
You can use an open source or commercial streaming platform:
http://icecast.org
http://ampache.org
http://www.videolan.org
https://obsproject.com
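The platforms above handle playlisting and live streaming out of the box. Just to illustrate the /livemusic idea itself, here is a minimal sketch, assuming Node.js with Express and a few local MP3 files (the file names and the playlist array are placeholders):

```
// Minimal sketch: serve a rotating playlist of MP3s at /livemusic by piping
// the files to the response one after another.
const express = require('express');
const fs = require('fs');

const app = express();
const playlist = ['track1.mp3', 'track2.mp3', 'track3.mp3']; // hypothetical files

app.get('/livemusic', (req, res) => {
  res.set('Content-Type', 'audio/mpeg');
  let index = 0;
  const playNext = () => {
    if (index >= playlist.length) index = 0;   // loop back to the start
    const stream = fs.createReadStream(playlist[index++]);
    stream.pipe(res, { end: false });          // keep the HTTP response open
    stream.on('end', playNext);                // switch to the next track
    stream.on('error', () => res.end());
  };
  playNext();
});

app.listen(1024); // matches http://localhost:1024/livemusic from the question
```

Note that with this sketch every listener gets their own play-through from the start; a genuinely shared live stream with server-side pacing and transcoding is exactly what Icecast or VLC give you.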
Hello developer community,
I currently have a task to create a walkie-talkie app.
I am using React.js, socket.io (for real-time communication) & Express.js.
I am not able to play audio continuously as it arrives from the socket listener, specifically in iOS Safari.
I can play audio with a static URL (base64 data URL) but not with dynamic base64.
Is there any way to continuously pass & play the audio?
I am free to adopt any other framework or protocol as well; I just need some guidance for building this kind of application.
I tried the Audio() API and also set up the UX flow to capture user activity on the site so that the browser allows Audio() to play.
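One direction that may help is to decode each chunk with the Web Audio API and schedule it right after the previous one, instead of creating new Audio() elements. A minimal sketch, assuming the socket.io client script is loaded, each message carries a base64-encoded chunk in a format Safari can decode (e.g. MP3/AAC), and the 'audio-chunk' event name and #start button are placeholders:

```
const socket = io();
const AudioCtx = window.AudioContext || window.webkitAudioContext;
const ctx = new AudioCtx();
let playHead = 0; // time at which the next chunk should start

// iOS Safari only lets the AudioContext run after a user gesture.
document.querySelector('#start').addEventListener('click', () => ctx.resume());

socket.on('audio-chunk', (base64) => {
  const bytes = Uint8Array.from(atob(base64), c => c.charCodeAt(0));
  // Callback form of decodeAudioData, since older Safari lacks the promise form.
  ctx.decodeAudioData(bytes.buffer, (buffer) => {
    const source = ctx.createBufferSource();
    source.buffer = buffer;
    source.connect(ctx.destination);
    playHead = Math.max(playHead, ctx.currentTime);
    source.start(playHead);        // schedule right after the previous chunk
    playHead += buffer.duration;
  });
});
```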
I have a video and a list of timestamps. The timestamps are not defined as an asset filter in Azure Media Services itself.
Is there a way to load the video from Azure Media Services with a specific start time without defining asset filters? Maybe something similar to what you have on YouTube with a link that starts at a specific second?
Altering the URL without using filters is not specifically supported in AMS, but you could very simply do that client side. Most client-side libraries will let you seek.
Have you tried doing that in JavaScript already? AMP may not be able to do it, but I have seen video.js and Shaka Player implementations that take the t= value from the query string and just set the player's current time.
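A minimal sketch of that idea with video.js (not AMP), assuming a link like https://example.com/watch?t=90, a <video id="my-video"> element, and that t is in seconds:

```
const player = videojs('my-video');
const t = parseInt(new URLSearchParams(window.location.search).get('t'), 10);

player.ready(() => {
  if (!Number.isNaN(t)) {
    // Seeking before the duration is known can be ignored, so wait for metadata.
    player.one('loadedmetadata', () => player.currentTime(t));
  }
});
```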
I need to create a provider in my #ionic-2 app with RSS reader functionality.
I have a website with an RSS feed (XML format), but I can return the data in JSON format as well.
Now I'm thinking about which approach is best for building a mobile app with news from this website.
Below are some ideas, but I don't know whether they are feasible.
Reading news
a) when the user opens the app, the app reads the news from the website
b) using push notifications:
create a Node.js / Express app which checks the news on the website and, using some platform (1), sends notifications about the latest added news
using cron, a PHP script checks the news in the database (MySQL) and, using some platform (1), sends notifications about the latest added news
maybe there is another solution?
*(1) - some platform - I don't know which one or how - maybe someone has good experience with one?
Storing news
a) should the news be stored in a database (PouchDB)?
b) if yes, should it be removed after some period of time?
c) if no, is it efficient to fetch the news from the website every time the app opens?
d) maybe there is another solution?
This app will run on iOS, Android and Windows Phone.
If someone can help me with some of these issues I would be grateful :)
Thanks
All those ideas are implementable.
b) using push notifications: Create an Azure Mobile App. Go through the videos up to part 7, plus the final one, which will be uploaded tonight or tomorrow.
For everything else you asked, start writing code first. As I said, all of it is implementable. How? There is no single answer to that, in my view.
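For idea (b), here is a minimal sketch of the Node.js checker, assuming the site exposes its feed as JSON at a placeholder URL, each item has a stable id, and sendPush() stands in for whichever push platform you choose (e.g. FCM, OneSignal, Azure Notification Hubs):

```
const fetch = require('node-fetch');

const FEED_URL = 'https://example.com/feed.json'; // hypothetical JSON feed
const seen = new Set();

async function sendPush(title) {
  // Placeholder: call your push provider here.
  console.log('would push notification:', title);
}

async function checkFeed() {
  const items = await (await fetch(FEED_URL)).json();
  for (const item of items) {
    if (!seen.has(item.id)) {     // assumes each feed item has a stable id
      seen.add(item.id);
      await sendPush(item.title);
    }
  }
}

setInterval(checkFeed, 5 * 60 * 1000); // poll every five minutes
```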
I am new to WebRTC and want to implement a system like video conferencing, live streaming, or, you could say, Skype, using WebRTC and Node.js.
I am confused about one thing, as it is one of our client's requirements: suppose that whatever is happening on the page, say a video conference where one moderator answers many audience members one by one, there should be one video that continuously records all of this together and sends a live stream to the server to be saved in our database.
Is this kind of thing implementable or not?
Any help please.
You can capture video by grabbing JPEG images from a canvas element. You could also capture the entire page (if there are numerous videos on the same page) by grabbing the page itself through Chrome.
For audio, recording remote audio with the Audio API is still an issue, but locally grabbed audio is not a problem.
See RecordRTC and my modified version for recording streams either to a file or through websockets, respectively.
See "Capture a page to a stream" on how to record or screen-share an entire page in Chrome.
If you have multiple different videos that are not all on the same page but want to combine them all, I would suggest recording them as above and then combining and syncing them up server side (not in JavaScript, but probably in C or C++).
If you MUST record remote audio, then I would suggest that you have those particular pages send their audio data over websockets themselves so that you can sync it up with their video and with the other sessions.
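A minimal sketch of the canvas-frame approach mentioned above, assuming a <video id="remoteVideo"> element showing the stream and a WebSocket endpoint on your server (both placeholders); the server can later decode the JPEG frames and mux them into a file:

```
const video = document.querySelector('#remoteVideo');
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');
const ws = new WebSocket('ws://localhost:8080/frames'); // hypothetical endpoint

setInterval(() => {
  if (ws.readyState !== WebSocket.OPEN || !video.videoWidth) return;
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  ctx.drawImage(video, 0, 0);           // snapshot the current video frame
  ws.send(canvas.toDataURL('image/jpeg', 0.7)); // base64 JPEG frame
}, 1000 / 10); // roughly 10 frames per second
```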
Is it possible to send video to the Chromecast device from a native application? It would be nice to share any window on the system instead of only Chrome tabs. Also, is there any documentation of the communication used by Chrome to talk to the Chromecast? It is my understanding that the Chromecast essentially loads content using an embedded Chrome instance, but there appear to be more direct ways of communicating with the device, since it is able to stream content from a Chrome tab using the extension.
You need to whitelist your receiver device if you are developing a receiver application. That would be a Chrome app that runs on the receiver's Chrome instance.
You need to whitelist a sender URL if you are developing a Chrome app that will cast its contents.
Video casting works by sending a URL to the receiver device, which the device will load directly.
Tab casting works by encoding the tab contents using WebM/Opus (in the Chrome Cast extension) and streaming that to the receiver device. (This is limited to 720p, see this question.)
Chrome apps can only use video casting.
The Chrome Cast extension is going to be the only way to stream directly to the device.
So the answer to your question is no, you cannot stream video directly to the device. The receiver must load the video from the URL you provide.
There is some speculation whether the receiver can be provided with a local url or if it must already be available on the internet. This has yet to be clarified.
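For reference, a minimal sketch of what video casting looks like from the sender side with the chrome.cast API, assuming you already have a Cast session; the video URL is a placeholder and, per the speculation above, has to be reachable by the Chromecast itself:

```
function castVideo(session) {
  const mediaInfo = new chrome.cast.media.MediaInfo(
    'https://example.com/video.mp4', // placeholder; served over HTTP(S)
    'video/mp4'
  );
  const request = new chrome.cast.media.LoadRequest(mediaInfo);
  session.loadMedia(
    request,
    media => console.log('receiver started loading the URL', media),
    err => console.error('loadMedia failed', err)
  );
}
```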
From how I understand the Chromecast architecture:
You can display any URL you want on the TV (you have to whitelist your app and register the URL first). It must be a URL. This can include HTML, JS, CSS, etc. Anything that is already on the internet.
To receive data from a device (say, the URL of a video to load), you must implement logic to interpret messages from channels. These messages are encoded as JSON, which makes it difficult to send videos or pictures (binary data). It is obviously easiest to upload things like this to some website, and have the receiver display them.
People have asked, "well, then how does the tab/screen sharing work?" The JSON encoding is just what Google provides in their SDK. In their own source, they don't have this restriction.
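A minimal sketch of sending such a JSON message from the sender side, assuming an active session and a custom namespace (both placeholders); the receiver app has to listen on the same namespace to interpret it:

```
const NAMESPACE = 'urn:x-cast:com.example.news'; // hypothetical namespace

function sendVideoUrl(session, url) {
  session.sendMessage(
    NAMESPACE,
    { type: 'load', url: url },    // plain JSON, as described above
    () => console.log('message delivered'),
    err => console.error('sendMessage failed', err)
  );
}
```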
Update:
It turns out you can actually stream local videos to your TV by just opening the local file in Chrome, and then casting that to your TV.