How to stream and play a Shoutcast radio server in J2ME MIDP (Java)? - java-me

I have tried String url = "38.101.195.5:9156"
and I am facing java.io.IOException: 11-Error in HTTP operation.
Code here:
connection = (HttpConnection) Connector.open(url);
dataIn = connection.openDataInputStream();

If the device supports JSR 234 - Advanced Multimedia Supplements, then you can use that API: http://jcp.org/en/jsr/detail?id=234
The package javax.microedition.amms.control.tuner contains Controls for various tuner settings. These Controls, if they are supported, can typically be fetched from a radio Player (for example, a Player created by Manager.createPlayer("capture://radio");).
See more in
public interface TunerControl
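Separately from the tuner API, one common cause of the "11-Error in HTTP operation" IOException is that many Shoutcast servers answer with an ICY status line ("ICY 200 OK") instead of a standard HTTP status line, which MIDP's HttpConnection rejects. A workaround is to open a raw socket and parse the response headers yourself. Below is a minimal sketch of that parsing in plain Java SE (the class name is illustrative; on MIDP you would read from a SocketConnection's input stream and use StringBuffer instead of StringBuilder):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;

public class IcyHeaderParser {
    // Reads the ICY/HTTP status line and the headers up to the blank line,
    // returning the fields in a map (keys lower-cased, status line under "status").
    public static Map<String, String> parse(InputStream in) throws IOException {
        Map<String, String> headers = new HashMap<String, String>();
        headers.put("status", readLine(in));
        String line;
        while ((line = readLine(in)).length() > 0) {
            int colon = line.indexOf(':');
            if (colon > 0) {
                headers.put(line.substring(0, colon).trim().toLowerCase(),
                            line.substring(colon + 1).trim());
            }
        }
        return headers;
    }

    // Reads one CRLF-terminated line byte by byte (no buffering, so the
    // stream position is left exactly at the start of the audio data).
    private static String readLine(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        int b;
        while ((b = in.read()) != -1 && b != '\n') {
            if (b != '\r') sb.append((char) b);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        String response = "ICY 200 OK\r\nicy-name: Test Radio\r\nicy-metaint: 8192\r\n\r\n";
        Map<String, String> h = parse(new ByteArrayInputStream(response.getBytes("ISO-8859-1")));
        System.out.println(h.get("status"));      // ICY 200 OK
        System.out.println(h.get("icy-metaint")); // 8192
    }
}
```

After the blank line, the remaining bytes on the stream are the MP3/AAC audio (interleaved with metadata every icy-metaint bytes if you requested it), which you can hand to a Player.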

Related

How to use Azure Media Service Live Stream with Apple AirPlay

I am using Azure Media Service to live stream an event. My source is an HD-SDI feed being captured via an AJA Kona LHi card and sent to Azure using Wirecast.
I'm using the default settings in Wirecast for Azure Media Service. I am using standard encoding and the 720p preset. I am using the Azure Media Player embedded in a private site.
Everything seems to work however iOS devices are unable to AirPlay the stream. The video plays correctly on the device but no AirPlay controls are available.
If I use the exact same setup to stream my webcam the AirPlay controls are available. Is there some specific configuration required to make this work reliably?
On iPad, Azure Media Player streams videos in DASH by default. You need to specify the techOrder to give HLS a higher priority, e.g.:
var myOptions = {
techOrder: ["html5", "azureHtml5JS", "flashSS"],
autoplay: false,
controls: true,
width: "640",
height: "400",
};
var myPlayer = amp("azuremediaplayer", myOptions);
If you don't see the AirPlay button on the video player, you can swipe down on the upper right region of the screen to bring up the "Quick controls" and access AirPlay.
If this still doesn't work, reply with the url of your webpage and I can take a look.

Windows IoT Core RaspPi Walkie Talkie

I am trying to use a few Raspberry Pi boards for a local-network audio broadcast, with one Pi as the main (broadcaster) and the rest as slaves (receivers), like a walkie-talkie.
I have looked at examples such as [WebCam App][1], but it seems like the audio is recorded first and then played back.
Is there any sample I could refer to for my application, where the audio input is captured and live-streamed to the respective slave devices?
Thanks.
There is a sample that captures the webcam and sends the live stream over HTTP. You can reference it and modify it for your use case.
HttpWebcamLiveStream
It also uses the MediaCapture API. You can set it to audio recording, for example, like this:
mediaCapture = new MediaCapture();
var settings = new Windows.Media.Capture.MediaCaptureInitializationSettings();
settings.StreamingCaptureMode = Windows.Media.Capture.StreamingCaptureMode.Audio;
settings.MediaCategory = Windows.Media.Capture.MediaCategory.Communications;
settings.AudioProcessing = Windows.Media.AudioProcessing.Raw;
await mediaCapture.InitializeAsync(settings);
This sample also shows how to implement an HTTP server so that it can send the live stream data to other clients.
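The shape of such a server (accept a GET, answer with a body of unknown length, flush each captured chunk to the client as it arrives) can be sketched in plain Java rather than UWP C#. This is an illustration of the pattern only; the endpoint path and the placeholder chunk source are made up, and a real broadcaster would block on the capture device instead:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class AudioStreamServer {
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/stream", AudioStreamServer::handleStream);
        server.start();
        return server;
    }

    private static void handleStream(HttpExchange exchange) throws IOException {
        // Response length 0 means "unknown": the body is streamed until closed.
        exchange.getResponseHeaders().set("Content-Type", "audio/wav");
        exchange.sendResponseHeaders(200, 0);
        try (OutputStream body = exchange.getResponseBody()) {
            // A real broadcaster would loop on the microphone capture buffer;
            // here we just emit a few placeholder chunks.
            for (int i = 0; i < 3; i++) {
                body.write(nextAudioChunk());
                body.flush(); // push each chunk to the client immediately
            }
        }
    }

    // Stand-in for "read the next buffer from the audio capture device".
    private static byte[] nextAudioChunk() {
        return new byte[] { 0, 1, 2, 3 };
    }
}
```

Each slave Pi would then simply open `http://<main-pi>:port/stream` and feed the bytes it receives into its audio playback pipeline.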

Switch audio output to speakers

I am working on my own specialized VoIP client for Windows 10 mobile & desktop. The basics work OK.
However, I cannot get audio output to the speakers on my old Lumia.
foreach (var item in (await DeviceInformation.FindAllAsync(DeviceClass.AudioRender)).Where(i => i.Name.Contains("Speakers")))
RendererId = item.Id;
There is a "Speakers (WaveRT)" entry in the device list, so RendererId is valid.
Later, the application tries to open the audio device (WASAPI) with the found RendererId, but the phone still plays through the receiver only.
I modified the VoIP sample app in an attempt to reproduce the issue - yes, it happens with the VoIP sample app as well.
My colleague confirms he has the same problem on his phone.
Is it possible to play audio via the speaker in a VoIP app?
Thank you!
On phone devices only, you can use the AudioRoutingManager to change the audio output.
// to get the audio manager
IsAudioRoutingSupported = ApiInformation.IsApiContractPresent(typeof(PhoneContract).FullName, 1);
if(IsAudioRoutingSupported)
{
// audio routing is supported, we register for the output change events
m_audioRoutingManager = AudioRoutingManager.GetDefault();
m_audioRoutingManager.AudioEndpointChanged += OnAudioEndpointChanged;
}
// to change the output
m_audioRoutingManager.SetAudioEndpoint(AudioRoutingEndpoint.Speakerphone);

Playing Smoothstreaming URL by providing Manifest file of smoothstreaming to Chromecast device

We want to play a Smooth Streaming URL by providing the Smooth Streaming manifest file to the Chromecast device.
We could play the following on Chromecast device,
1. .mp4 file
2. .ismv file
3. .isma file.
But if we provide a Manifest file as follows, we are not able to play it on the Chromecast device:
http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest
Please let us know how to play a Smooth Streaming URL on a Chromecast device.
Or do we need to play the .ismv files one by one by providing them in a loop?
The Chromecast has support for Smooth Streaming content through the Media Player Library: https://developers.google.com/cast/docs/player
Below is a bare-bones implementation.
Google provides a complete example on GitHub which takes advantage of the MediaManager and accounts for other streaming formats: https://github.com/googlecast/CastMediaPlayerStreamingDRM
var $mediaElement = $('<video>').attr('autoplay', ''),
mediaElement = $mediaElement[0],
mediaUrl = "http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest",
mediaHost,
mediaPlayer;
cast.receiver.CastReceiverManager.getInstance().start();
$('body').append(mediaElement);
mediaHost = new cast.player.api.Host({
mediaElement: mediaElement,
url: mediaUrl
});
var protocol = cast.player.api.CreateSmoothStreamingProtocol(mediaHost);
mediaPlayer = new cast.player.api.Player(mediaHost);
mediaPlayer.load(protocol);
Microsoft's test files (including the ISM) do not return the CORS headers required by the Chromecast. Enable CORS on your server and it will work.
I've encountered this too, and it works if I host the files myself with CORS enabled.
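Concretely, "enable CORS" means the server hosting the manifest and fragments must answer with an Access-Control-Allow-Origin header. A minimal illustration in Java (the server technology and the endpoint path are assumptions; any web server that can set response headers works the same way):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.net.InetSocketAddress;

public class CorsManifestServer {
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/Manifest", CorsManifestServer::handle);
        server.start();
        return server;
    }

    private static void handle(HttpExchange ex) throws IOException {
        // The header the Chromecast receiver needs; "*" allows any origin.
        // A stricter setup would echo only your receiver app's origin.
        ex.getResponseHeaders().set("Access-Control-Allow-Origin", "*");
        ex.getResponseHeaders().set("Content-Type", "text/xml");
        byte[] body = "<SmoothStreamingMedia/>".getBytes("UTF-8"); // placeholder manifest body
        ex.sendResponseHeaders(200, body.length);
        ex.getResponseBody().write(body);
        ex.close();
    }
}
```

The same header must be present on the fragment responses, not just the manifest, or playback will stall after the first request.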

Monotouch video streaming with MPMoviePlayerController

I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get the MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get it to play videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing the MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or in a combination of the two.
I'm pretty sure you need a streaming server to use a video file from an HTTP URL. In any case, it's a requirement for applications on the App Store to do so:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to change to handle this (it's a server-side, not client-side, concern).
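For reference, "using HTTP Live Streaming" means the server exposes a playlist plus short media segments rather than a single .mp4 file. A minimal example of such a playlist (segment names are hypothetical):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST
```

Tools such as Apple's mediafilesegmenter or ffmpeg can produce a playlist and segments like this from an existing .mp4; MPMoviePlayerController then plays the playlist URL directly.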
