Using libstreaming to get the thumbnail of a stream being published - RTSP

Hi everyone,
I am using libstreaming in my project and it works great for publishing a stream from an Android device to a Wowza server. The issue now is that I need to get a thumbnail of the stream being published to the server.
For this purpose, I guess I need to grab the first frame of the stream being published, but how do I do that?
The examples mentioned here don't show anything related to this.
Any help in this regard will be highly appreciated.
Thanks in advance.

#khurramengr,
There are two ways:
1) You can write a custom module in Wowza to record the live stream and then use an FFMPEG command to take a snapshot of the file.
Refer: http://www.wowza.com/forums/showthread.php?577-Custom-module-to-create-single-frame-snapshots-of-live-and-VOD-stream
2) Enable the recording option in Wowza Media Engine -> Live, so every stream is automatically recorded under the content folder. You can then use FFMPEG to generate a thumbnail from the recorded MP4 in the content folder.
I tried both and both work; let me know in case of doubts.
~Manikandan Chandran
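For reference, the FFMPEG step in both approaches boils down to extracting a single frame from the recorded MP4. Below is a minimal sketch, assuming ffmpeg is installed on the server and is called from a small Java helper; the ThumbnailGrabber class name, the file paths and the one-second seek are placeholders for illustration, not part of the Wowza API.

import java.io.File;
import java.io.IOException;

public class ThumbnailGrabber {
    // Extracts a single frame from recordedMp4 and writes it to thumbnailJpg.
    public static void grabThumbnail(File recordedMp4, File thumbnailJpg)
            throws IOException, InterruptedException {
        // -ss 1 seeks one second in, -vframes 1 writes exactly one frame, -y overwrites an old file.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y",
                "-ss", "1",
                "-i", recordedMp4.getAbsolutePath(),
                "-vframes", "1",
                thumbnailJpg.getAbsolutePath());
        pb.inheritIO();
        int exitCode = pb.start().waitFor();
        if (exitCode != 0) {
            throw new IOException("ffmpeg exited with code " + exitCode);
        }
    }
}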

This is very late, but I hope it helps other people who come later.
I think the solution is to use the Wowza Transcoder: https://www.wowza.com/forums/content.php?307-How-to-get-thumbnail-images-from-Wowza-Transcoder-with-an-HTTP-Provider
Have a look at the function onGrabFrame(TranscoderNativeVideoFrame videoFrame), where the image is provided:
public void onGrabFrame(TranscoderNativeVideoFrame videoFrame)
{
    BufferedImage image = TranscoderStreamUtils.nativeImageToBufferedImage(videoFrame);
    if (image != null)
    {
        getLogger().info("ModuleTestTranscoderFrameGrab#GrabResult.onGrabFrame: "+image.getWidth()+"x"+image.getHeight());
        String storageDir = appInstance.getStreamStoragePath();
        File pngFile = new File(storageDir+"/thumbnail.png");
        File jpgFile = new File(storageDir+"/thumbnail.jpg");
        try
        {
            if (pngFile.exists())
                pngFile.delete();
            ImageIO.write(image, "png", pngFile);
            getLogger().info("ModuleTestTranscoderFrameGrab#GrabResult.onGrabFrame: Save image: "+pngFile);
        }
        catch(Exception e)
        {
            getLogger().error("ModuleTestTranscoderFrameGrab.grabFrame: File write error: "+pngFile);
        }
        try
        {
            if (jpgFile.exists())
                jpgFile.delete();
            ImageIO.write(image, "jpg", jpgFile);
            getLogger().info("ModuleTestTranscoderFrameGrab#GrabResult.onGrabFrame: Save image: "+jpgFile);
        }
        catch(Exception e)
        {
            getLogger().error("ModuleTestTranscoderFrameGrab.grabFrame: File write error: "+jpgFile);
        }
    }
}
Regards,

Related

How to encode an empty audio track for Azure Media Services v3?

I have a site where users can upload videos to be encoded and viewed in Azure Media Player. Some of the uploaded videos do not have audio tracks, and Azure Media Player can't play those. How can I encode an empty audio track into these videos? I'm using v3 of the REST API.
My current code for transforms is:
private async Task<string> CreateTransformAsync(string transform)
{
    JObject body = new JObject(
        new JProperty("properties",
            new JObject(
                new JProperty("description", "Basic Transform using an Adaptive Streaming encoding preset from the library of built-in Standard Encoder presets"),
                new JProperty("outputs",
                    new JArray(
                        new JObject(
                            new JProperty("onError", "StopProcessingJob"),
                            new JProperty("relativePriority", "Normal"),
                            new JProperty("preset",
                                new JObject(
                                    new JProperty("#odata.type", "#Microsoft.Media.BuiltInStandardEncoderPreset"),
                                    new JProperty("presetName", "H264MultipleBitrate720p")
                                )
                            )
                        )
                    )
                )
            )
        )
    );
    var jsonBody = new StringContent(body.ToString(), Encoding.UTF8, "application/json");
    HttpResponseMessage responseMsg = await _httpClient.PutAsync($"subscriptions/{_config.Value.SubscriptionId}/resourceGroups/{_config.Value.ResourceGroup}/providers/Microsoft.Media/mediaServices/{_config.Value.MediaAccountName}/transforms/{transform}/?api-version={_config.Value.ApiVersion}", jsonBody);
    string responseContent = await responseMsg.Content.ReadAsStringAsync();
    var response = JObject.Parse(responseContent);
    if (response["error"] == null)
    {
        return response["name"].ToString();
    }
    else
    {
        throw new Exception(response["error"].ToString());
    }
}
UPDATE:
After scouring the documentation, I've gotten a little further with this: https://learn.microsoft.com/en-us/azure/media-services/latest/custom-preset-rest-howto#define-a-custom-preset
I now define a custom preset, read it in and send that in the body instead. The problem now is that I can't find an option similar to "condition": "InsertSilenceIfNoAudio" from v2 of the API. I've opened a GitHub issue about it here: https://github.com/MicrosoftDocs/azure-docs/issues/28133
What are your target encoding settings? Do you need a custom preset?
If not, and you just need a standard adaptive streaming profile preset, you can use the AdaptiveStreaming preset. It handles inserting silence when there is no audio.
It has not been formally announced, but as we tested for our project, Azure Media Player gained complete support for video-only content starting with version 2.3.0 (April 30, 2019).
Officially, the feature list mentions that the feature is already implemented (the "Video Only" feature with the comment "Supported in AzureHtml5JS"), and the change list of the 2.3.0 release says "Added support for video-only assets for DASH". We personally tested Smooth Streaming and HLS as well with no issues, so video-only assets play without any problems starting with version 2.3.0.
At the same time, the issue is still listed under Known Issues ("Assets that are audio or video only will not play back via the AzureHtml5JS tech."), but I guess they just didn't update the docs; alternatively they may not have tested it completely, but from our internal testing it looks like it works fully.

Best approach for playing a single tone audio file?

Can anyone tell me the best approach to playing single-tone audio (.mp3) files in a Windows Phone 8 app? Think of a piano app, where each key is a button and each button plays a different tone.
I'm looking for the most efficient way to go about this - I've got 8 different buttons that each need to play a different tone when tapped.
I tried using the MediaElement:
MediaElement me;

public MainPage()
{
    InitializeComponent();
    me = new MediaElement();
    me.AutoPlay = false;
    me.Source = new Uri("/Sounds/Sound1.mp3", UriKind.Relative);
    btnPlay.Click += btnPlay_Click;
}

private void btnPlay_Click(object sender, EventArgs e)
{
    me.Play();
}
But nothing happens, either in the emulator or on a device (testing with a Lumia 822). Am I doing something wrong here? It seems like it should be pretty simple. Or is MediaElement even the best thing to use for my scenario?
Would this fall under the Background Audio category? I've read through this example but it seems like overkill for what I want to do.
I've also read about using XNA's SoundEffect to do the job, but then I'd have to convert my .mp3 files to .wav (which isn't necessarily a problem, but I'd rather not go through that if I don't need to).
Can anyone tell me either what I'm doing wrong in my example above or guide me to a better solution for playing quick, sub-second audio tones?
I had this problem before with MediaElement not playing audio files. After many attempts I found out that it only plays if it is defined in the XAML and AutoPlay is set to true.
Try defining it in the XAML, or you can just add it to your LayoutRoot:
var me = new MediaElement();
LayoutRoot.Children.Add(me);
me.AutoPlay = true;
me.Source = new Uri("Sound/1.mp3", UriKind.Relative);
I have had good luck just using this piece of code in my app. It may not work as well in your context, but give it a whirl:
mediaElement.Source = new Uri("/Audio/" + songID.ToString() + ".mp3", UriKind.Relative);
mediaElement.Play();

Blackberry Audio Recording Sample Code

Does anyone know of a good repository to get sample code for the BlackBerry? Specifically, samples that will help me learn the mechanics of recording audio, possibly even sampling it and doing some on the fly signal processing on it?
I'd like to read incoming audio, sample by sample if need be, then process it to produce a desired result, in this case a visualizer.
The RIM API contains JSR 135 (Java Mobile Media API) for handling audio and video content.
You're correct about the mess in the BB Knowledge Base. The only way is to browse it, hoping they're not going to change the site map again.
It's Developers -> Resources -> Knowledge Base -> Java APIs & Samples -> Audio & Video
Audio Recording
Basically it's simple to record audio:
- create a Player with the correct audio encoding
- get the RecordControl
- start recording
- stop recording
Links:
RIM 4.6.0 API ref: Package javax.microedition.media
How To - Record Audio on a BlackBerry smartphone
How To - Play audio in an application
How To - Support streaming audio to the media application
How To - Specify Audio Path Routing
How To - Obtain the media playback time from a media application
What Is - Supported audio formats
What Is - Media application error codes
Audio Record Sample
A Thread with the Player, RecordControl and resources is declared:
final class VoiceNotesRecorderThread extends Thread {
    private Player _player;
    private RecordControl _rcontrol;
    private ByteArrayOutputStream _output;
    private byte _data[];

    VoiceNotesRecorderThread() {}

    private int getSize() {
        return (_output != null ? _output.size() : 0);
    }

    private byte[] getVoiceNote() {
        return _data;
    }
}
In Thread.run(), audio recording is started:
public void run() {
    try {
        // Create a Player that captures live audio.
        _player = Manager.createPlayer("capture://audio");
        _player.realize();
        // Get the RecordControl, set the record stream.
        _rcontrol = (RecordControl)_player.getControl("RecordControl");
        // Create a ByteArrayOutputStream to capture the audio stream.
        _output = new ByteArrayOutputStream();
        _rcontrol.setRecordStream(_output);
        _rcontrol.startRecord();
        _player.start();
    } catch (final Exception e) {
        UiApplication.getUiApplication().invokeAndWait(new Runnable() {
            public void run() {
                Dialog.inform(e.toString());
            }
        });
    }
}
And in the thread's stop() method, recording is stopped:
public void stop() {
    try {
        // Stop recording, capture data from the OutputStream,
        // close the OutputStream and the player.
        _rcontrol.commit();
        _data = _output.toByteArray();
        _output.close();
        _player.close();
    } catch (Exception e) {
        synchronized (UiApplication.getEventLock()) {
            Dialog.inform(e.toString());
        }
    }
}
Processing and sampling the audio stream
At the end of recording you will have an output stream filled with data in a specific audio format, so to process or sample it you will have to decode this audio stream.
On-the-fly processing is more complex: you will have to read the output stream during recording, without committing the record. That leaves several problems to solve:
- synchronized access to the output stream for the recorder and the sampler (a threading issue; see the sketch after this list)
- reading the correct amount of audio data (you will have to dig into the audio format to work out its framing rules)
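One way to handle the first point is to give the RecordControl a custom OutputStream that buffers incoming bytes behind a lock, so a sampler thread can drain raw chunks while recording continues. A minimal sketch, assuming it is passed to _rcontrol.setRecordStream(...) instead of the plain ByteArrayOutputStream above; the SampledRecordStream name and its drain() method are made up for illustration, and decoding the drained bytes is still up to you.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

final class SampledRecordStream extends OutputStream {
    private final ByteArrayOutputStream _buffer = new ByteArrayOutputStream();

    // Called by the recorder as audio data arrives.
    public synchronized void write(int b) throws IOException {
        _buffer.write(b);
    }

    public synchronized void write(byte[] data, int offset, int length) throws IOException {
        _buffer.write(data, offset, length);
    }

    // Called by the sampler thread; returns whatever bytes arrived since the last drain.
    public synchronized byte[] drain() {
        byte[] chunk = _buffer.toByteArray();
        _buffer.reset();
        return chunk;
    }
}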
Also possibly useful:
java.net: Experiments in Streaming Content in Java ME by Vikram Goyal
While not audio specific, this question does have some good "getting started" references.
Writing Blackberry Applications
I spent ages trying to figure this out too. Once you've installed the BlackBerry Component Packs (available from their website), you can find the sample code inside the component pack.
In my case, once I had installed the Component Packs into Eclipse, I found the extracted sample code in this location:
C:\Program Files\Eclipse\eclipse3.4\plugins\net.rim.eide.componentpack4.5.0_4.5.0.16\components\samples
Unfortunately, when I imported all that sample code I had a bunch of compile errors. To work around that, I just deleted the 20% of packages with compile errors.
My next problem was that launching the Simulator always launched the first sample code package (in my case activetextfieldsdemo); I couldn't get it to run just the package I was interested in. The workaround for that was to delete all the packages listed alphabetically before the one I wanted.
Other gotchas:
- Right-click on the project in Eclipse and select Activate for BlackBerry.
- Choose BlackBerry -> Build Configurations... -> Edit... and select your new project so it builds.
- Make sure you put your BlackBerry source code under a "src" folder in the Eclipse project, otherwise you might hit build issues.

Why am I getting an IllegalArgumentException with this code to create an image?

I wrote this code for my J2ME project:
try {
    Image immutableThumb = Image.createImage(temp, 0, temp.length);
} catch (Exception ex) {
    System.out.println(ex);
}
where temp is a byte array.
When I try it against localhost it works, and the image gets created.
But when I try it over the LAN it throws an IllegalArgumentException, and the image is not created.
How can I solve this problem?
The docs say
IllegalArgumentException - if imageData is incorrectly formatted or otherwise cannot be decoded
so I'd say you're getting a different byte array.
If you get the byte array from a network location, make sure it came from a supported image type; not every image format is available on MIDP. To be safe, you can use PNG.
Is it on the device or in the emulator that you are having the problem? It could be the URL string, or a problem with the connection.
Can you post all of your source code?
Here's a simple example: how to download an image from a web server.
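Along those lines, a minimal sketch of fetching the bytes over HTTP and decoding them on MIDP; the ImageDownloader class name and the error handling are illustrative only, not taken from the linked example.

import java.io.DataInputStream;
import java.io.IOException;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;
import javax.microedition.lcdui.Image;

public final class ImageDownloader {
    // Downloads the resource at url and decodes it into an immutable Image.
    public static Image download(String url) throws IOException {
        HttpConnection conn = (HttpConnection) Connector.open(url);
        try {
            if (conn.getResponseCode() != HttpConnection.HTTP_OK) {
                throw new IOException("HTTP response code: " + conn.getResponseCode());
            }
            int length = (int) conn.getLength();
            if (length <= 0) {
                throw new IOException("Unknown content length");
            }
            byte[] data = new byte[length];
            DataInputStream in = conn.openDataInputStream();
            try {
                in.readFully(data);
            } finally {
                in.close();
            }
            return Image.createImage(data, 0, data.length);
        } finally {
            conn.close();
        }
    }
}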

Upload document to specific folder in a SharePoint Document library using WebClient

I have some client-side code that uploads an Outlook email to a document library, and as long as the path points to the root of the document library it works just fine.
#"https://<server>/sites/<subweb>/<customer>/<teamweb>/<Documents>/" + docname;
is the projectUrl in this function:
public bool SaveMail(string filepath, string projectUrl)
{
    try
    {
        using (WebClient webclient = new WebClient())
        {
            webclient.UseDefaultCredentials = true;
            webclient.UploadFile(projectUrl, "PUT", filepath);
        }
    }
    catch (Exception ex)
    {
        //TO DO Write the exception to the log file
        return false;
    }
    return true;
}
but I have not been able to figure out how to upload to an existing folder, i.e. "Emails", in the same document library.
Not even Google seems to know the answer :-)
Note: I know that I could use something like the Copy web service within SharePoint to move the file to its final destination, but that is more of a workaround.
When will I learn not to work that late into the night :-(
Sorry about that question. Igalse is right, I just needed to add "emails/" to the URL. I could swear that I had tried that, but then again it sure looks like I didn't.
With your code I just added /Emails/ to the projectUrl and the upload worked just fine. Have you tried that? Maybe you have a permission problem.
