I have installed Red5 Media Server on Windows, and the demo applications are running perfectly. My question is: I want to stream audio from Red5. How do I store audio in Red5 and play it from Red5 in my custom player (jPlayer) using PHP?
jPlayer doesn't require Red5. You can just put your media in a folder that's accessible via the web and stream from there. If you feel you must use Red5, then follow their examples for the demo app - you can even use the sample folder they stream those files from, since it doesn't sound like you need any server-side code.
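For example, something roughly like this (the container id, URL and swfPath are placeholders, and jQuery plus the jPlayer plugin are assumed to already be loaded on the page):

    // Minimal jPlayer setup pointing straight at an mp3 served over plain HTTP.
    // No Red5 or PHP involved - any web-accessible mp3 URL will do.
    $("#jquery_jplayer").jPlayer({
        ready: function () {
            $(this).jPlayer("setMedia", {
                mp3: "http://yourserver.com/audio/track.mp3" // placeholder URL
            });
        },
        supplied: "mp3",
        swfPath: "/js" // folder containing the jPlayer swf for the Flash fallback
    });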
The actual task is to play an mp3 file that should be embedded in the Electron app.
I placed the mp3 file next to my Electron app's main.js.
I wanted to play it using the sound-play package (https://github.com/nomadhoc/sound-play) through a bridge API, but it didn't work out - the sound doesn't play.
So now I just want to play it with an audio element (which seems more universal), but I don't understand what src URL I should supply to it. I also wonder how to correctly add an additional asset to the built version (I'm using electron-builder).
Working with extra/local resources in Electron apps is kind of an obscure topic for me. Advice would be appreciated.
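One direction I've been considering (not sure it's the intended way) is to ship the file through electron-builder's extraResources and resolve its path in the main process, roughly like this (assets/click.mp3 is just a placeholder name):

    // main.js - resolve the bundled mp3 both in a dev run and in the packaged
    // build. Assumes "assets/click.mp3" is listed under "extraResources" in the
    // electron-builder config so it gets copied next to the app resources.
    const { app } = require('electron');
    const path = require('path');

    const mp3Path = app.isPackaged
      ? path.join(process.resourcesPath, 'assets', 'click.mp3') // packaged app
      : path.join(__dirname, 'assets', 'click.mp3');            // dev run

    // The renderer could then play it with: new Audio('file://' + mp3Path).play();

Is that the right track, or is there a cleaner way?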
I used the npm package hls-server to set up an HLS streaming server on my dev machine. Then I took the Android sample app and tried adding another video whose URL points at my localhost.
The video plays fine inside the sample app, but I am unable to cast it to the Chromecast.
I tried changing the media type and also set up a CORS proxy using the npm package cors-anywhere.
I used ffmpeg to generate the HLS playlist and .ts files.
It would be great if someone could help out.
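For reference, the dev server setup looks roughly like this (the directory and port are placeholders):

    // Local HLS server on the dev machine using the hls-server npm package.
    const HLSServer = require('hls-server');
    const http = require('http');

    const server = http.createServer();
    const hls = new HLSServer(server, {
      path: '/streams',     // base URI the playlist/segments are served under
      dir: 'public/videos'  // folder holding the ffmpeg-generated .m3u8/.ts files
    });
    server.listen(8000);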
The problem was with my HLS stream. What I did was use the sample receiver app from Google and attach a debugger to see the error codes/exceptions.
Then I took another video and used ffmpeg to produce the HLS output again, which worked out quite fine.
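For what it's worth, the re-encode was roughly this kind of ffmpeg invocation (wrapped here in Node's child_process since the dev server is Node anyway; the exact codecs and flags are from memory and may need adjusting for your source video):

    // Regenerate the HLS playlist and .ts segments with ffmpeg.
    const { execFile } = require('child_process');

    execFile('ffmpeg', [
      '-i', 'input.mp4',           // source video (placeholder name)
      '-c:v', 'libx264',           // Chromecast-friendly video codec
      '-c:a', 'aac',               // Chromecast-friendly audio codec
      '-hls_time', '10',           // segment length in seconds
      '-hls_list_size', '0',       // keep every segment in the playlist
      'public/videos/stream.m3u8'  // playlist + segments land next to it
    ], (err) => {
      if (err) console.error('ffmpeg failed:', err);
      else console.log('HLS playlist and segments written');
    });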
I want to build an Adobe AIR app that will upload video files to a Node.js server, receive a notification when the files are done being processed, and then prompt the user to download the finished video file.
Possible?
I am not asking for a specific code solution, rather a general yes/no as to the feasibility of this approach.
Yes, it's possible. I am using Node.js on the server side and Flash/Flex/AIR on the client side, so I am confident about this.
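As a rough sketch of what the server side could look like (Express and multer are just one way to handle the upload; routes, field names and paths are placeholders):

    // Minimal Node.js endpoints the AIR client could talk to.
    const express = require('express');
    const multer = require('multer');

    const app = express();
    const upload = multer({ dest: 'uploads/' }); // temp storage for raw uploads

    app.post('/upload', upload.single('video'), (req, res) => {
      // Kick off processing here (e.g. spawn ffmpeg), then notify the client
      // when it finishes - via a polled status endpoint or a socket.
      res.json({ id: req.file.filename, status: 'processing' });
    });

    app.get('/download/:id', (req, res) => {
      // Once processing is done, the AIR app fetches the finished file here.
      res.download('processed/' + req.params.id + '.mp4');
    });

    app.listen(3000);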
I am developing a PhoneGap application for iOS and Android. On iOS the recorded sound format is .wav, and on Android it's .amr/.mp3. After recording, the audio file gets uploaded to a web server.
I need to convert these files from .amr/.wav to .mp3 after upload - what options do I have?
I've read about ffmpeg and ffmpeg-php. Will that work with shared hosting? If so, can someone provide a link on how to install it?
Is there any converter that will work on JustHost shared hosting with cPanel?
Is there any PHP class which can send my audio file to an ffmpeg server and get back an mp3 file? Just thinking.
Any other option?
I don't mind the method - I just need the required output.
I've seen a tutorial about how to play local files using a background agent in WP7 Mango, but it specifically states that it only covers local files:
http://msdn.microsoft.com/en-us/library/hh202978(v=VS.92).aspx
Does a similar tutorial exist for streaming files from the web?
I've used the same tutorial, and it works fine if you want to play something like a podcast that is stored as an mp3 file on a web server. You just set an absolute Uri on the AudioTrack.
I don't think you need to use the separate streaming (AudioStreamingAgent) project unless you are doing live streaming.