Android NDK / MediaPlayer - Custom player logging C2DColorConvert - android-ndk

I have a VR video player built as a C++ shared library using the Android NDK's media APIs (mediandk, OpenSLES, EGL, etc.). The player works as expected, but one issue I have is that logcat fills up with messages like this: C2DColorConvert: unknown format passed for luma alignment number
This is running on an LG G6 (Qualcomm). After decoding, the video frames go through some post-processing and are finally rendered onto two RGBA32 textures shown inside a Unity app.
Any suggestions on how to get rid of the error above?
The error comes from the following function in Qualcomm's C2DColorConverter code:
size_t C2DColorConverter::calcLumaAlign(ColorConvertFormat format) {
    if (!isYUVSurface(format)) return 1; // no requirement

    switch (format) {
        case NV12_2K:
            return ALIGN2K;
        default:
            ALOGE("unknown format passed for luma alignment number");
            return 1;
    }
}

Related

DirectXTK make_unique<AudioEngine>(flags) fails

I am just about ready to release my first little game with my game engine. However, while having some people test it, we found that the call to acquire the pointer to the AudioEngine interface fails for one of my testers.
The call works fine for me on both my desktop and my laptop, and it works fine on tester 2's computer. However, tester 1's computer will not succeed on the call.
By "fails" I mean it throws an exception, which I am handling with a try/catch block. The "what" of the exception just tells me "AudioEngine", so no help there. He has a heavily customized computer with two graphics cards that are not linked and handle separate tasks. He uses the same Virtual Audio Cable setup that I have on my desktop (used to separate voice sources for easier video editing when streaming/recording gameplay).
If anyone has any clue what might cause this call to fail, we would greatly appreciate the information. Please let me know if you require any additional information. The initialization code is below:
//aud engine declaration is in the header for the class
unique_ptr<AudioEngine> audEngine;

//function being called
bool AudioEngineClass::InitializeAudioEngine()
{
    //Call this to create the DXTK Audio Engine
    //Setup flags:
    AUDIO_ENGINE_FLAGS eflags = AudioEngine_Default;
    eflags = eflags | AudioEngine_EnvironmentalReverb | //Enables environmental reverb for 3D (required for 3D audio)
             AudioEngine_ReverbUseFilters |             //Enables additional features for 3D positional audio reverb
             AudioEngine_UseMasteringLimiter;           //Enables a mastering volume limiter to avoid distortion and clipping with 3D audio.

    //MessageBox(NULL, "Attempting to assign AudioEngine Pointer", "AudioEngine.InitializeAudioEngine", MB_OK);
    try
    {
        audEngine = make_unique<AudioEngine>(eflags);
    }
    catch (exception& e)
    {
        //Tester 1 falls into this
        string exceptionStr = e.what();
        string outputStr = "Failed to Initialize Audio Engine. Exception: \n";
        outputStr += exceptionStr;
        MessageBox(NULL, outputStr.c_str(), "AudioEngine.InitializeAudioEngine", MB_OK);
    }
    //MessageBox(NULL, "Got past AudioEngine Pointer Assignment", "AudioEngine.InitializeAudioEngine", MB_OK);

    if (!audEngine)
    {
        //failed to create audio engine.
        initialized = false;
    }
    else
    {
        initialized = true;
    }
    return initialized;
}
UPDATE: Been trying things all day with no luck so far.
-Had the tester download the June 2010 DirectX DLLs, install them, and restart their computer.
-Had them update all of their drivers.
-Had them check their System folder (all XAudio2_#.dll files are present).
-They are on Windows 10.
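Not an answer as such, but if it helps narrow things down: DirectX Tool Kit's AudioEngine can enumerate the audio output devices it sees, which makes it easier to tell whether the failing machine's audio setup (for example the Virtual Audio Cable endpoints) is what the engine trips over. Below is a minimal diagnostic sketch, assuming DirectXTK for Audio is already linked in; the helper name CreateEngineWithDiagnostics is just illustrative.
#include <Windows.h>
#include <Audio.h>   // DirectX Tool Kit for Audio
#include <memory>

// Hypothetical diagnostic helper: list the audio renderers DirectXTK can see,
// then try to create the engine on the default device.
std::unique_ptr<DirectX::AudioEngine> CreateEngineWithDiagnostics(DirectX::AUDIO_ENGINE_FLAGS flags)
{
    // Enumerate the audio output devices known to XAudio2/DirectXTK.
    auto renderers = DirectX::AudioEngine::GetRendererDetails();
    if (renderers.empty())
    {
        // With no audio device the default flags normally give "silent mode",
        // so an empty list plus an exception points at the device setup.
        OutputDebugStringA("No audio renderers found.\n");
    }
    for (const auto& r : renderers)
    {
        // r.description and r.deviceId are std::wstring
        OutputDebugStringW((r.description + L"\n").c_str());
    }

    try
    {
        return std::make_unique<DirectX::AudioEngine>(flags);
    }
    catch (const std::exception& e)
    {
        OutputDebugStringA(e.what());
        OutputDebugStringA("\n");
        return nullptr;
    }
}
If a specific device looks suspect, the AudioEngine constructor also accepts a device ID string, so you could try forcing a known-good endpoint on the tester's machine.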

For testing a GPS-based J2ME app, is there any emulator which has built-in GPS?

I am new to J2ME app development. I am developing a GPS-based app using Nokia Maps for Series 40 phones. I want an emulator that provides GPS (to retrieve and set the current coordinates, among other things). I searched a lot on Google but couldn't find one, and even the emulators provided by the Nokia SDKs don't have GPS capability. How can I get such an emulator?
You can download the latest Nokia IDE (which includes the Nokia Maps plugin) here:
Emulator download
The emulator associated with the SDK includes tools to simulate JSR-179 location calls (e.g. Cell-ID/GPS); you can play back coordinates from a saved file and receive them at regular intervals. Look at the emulator's Tools > Route Editor menu.
The confusion here is the difference between GPS positioning and Cell-ID positioning. There are currently no Series 40 phones (that I know of) with a GPS unit, so positioning will need to be done via Cell-ID. In that case the only way to retrieve frequent location updates is to call the getLocation() method in a repeating loop; retrieving location objects via the locationUpdated() callback only works with GPS-based location retrieval.
In summary, you can get a location from any Java ME phone supporting JSR-179; you just won't be using GPS.
To get a location, use the following:
cellIdLocator = getCellIdProvider();
cellIdLocator.getLocation(DEFAULT_TIMEOUT);
Where the cell-id provider can be held in a singleton:
private LocationProvider cellIdLocator;

public LocationProvider getCellIdProvider() throws LocationException {
    if (cellIdLocator == null) {
        int[] methods = {
            Location.MTA_ASSISTED | Location.MTE_CELLID | Location.MTY_NETWORKBASED
        };
        cellIdLocator = LocationUtil.getLocationProvider(methods, null);
    }
    return cellIdLocator;
}

IMFMediaPlayer hangs during SetSourceFromByteStream

Background: I'm coding a Metro-style app for Win8. I need to be able to play music files. Because of quality and space requirements we're using encoded audio (mp3/ogg).
I'm using XAudio2 to play sound effects (.wav files), but since I couldn't figure out a way to play encoded audio with it, I decided to play the music files with Media Foundation (the IMFMediaPlayer interface).
I downloaded the Metro app samples, and found that the Media Engine native C++ video playback sample was closest to what I needed.
Now that my app has the MediaPlayer playing music, I've run into a problem. If the device running the app is slow enough, the MediaPlayer hangs. When I run the release version of the app on my device, it's fine and I can hear the music just fine. But when I attach the debugger or run it on a slower device, it hangs when I'm setting the byte stream for the MediaPlayer to play.
Here's some code; you'll find it pretty similar to the sample:
StorageFolder^ installedLocation = Windows::ApplicationModel::Package::Current->InstalledLocation;

m_pickFileTask = Concurrency::task<StorageFile^>(installedLocation->GetFileAsync(filename), m_tcs.get_token());

auto player = this;
m_pickFileTask.then([player](StorageFile^ fileHandle)
{
    player->SetURL(fileHandle->Path);

    Concurrency::task<IRandomAccessStream^> fOpenStreamTask =
        Concurrency::task<IRandomAccessStream^>(fileHandle->OpenAsync(Windows::Storage::FileAccessMode::Read));

    fOpenStreamTask.then([player](IRandomAccessStream^ streamHandle)
    {
        MEDIA::ThrowIfFailed(
            player->m_spMediaEngine->Pause()
        );
        MEDIA::GetMediaError(player->m_spMediaEngine);

        player->SetBytestream(streamHandle);

        if (player->m_spMediaEngine)
        {
            MEDIA::ThrowIfFailed(
                player->m_spEngineEx->Play()
            );
            MEDIA::GetMediaError(player->m_spMediaEngine);
        }
    });
});
And here's the SetBytestream method:
void SetBytestream(IRandomAccessStream^ streamHandle)
{
    if (m_spMFByteStream != nullptr)
    {
        m_spMFByteStream->Close();
        m_spMFByteStream = nullptr;
    }

    MEDIA::ThrowIfFailed(
        MFCreateMFByteStreamOnStreamEx((IUnknown*)streamHandle, &m_spMFByteStream)
    );

    MEDIA::ThrowIfFailed(
        m_spEngineEx->SetSourceFromByteStream(m_spMFByteStream.Get(), m_bstrURL)
    );
    MEDIA::GetMediaError(m_spEngineEx);

    return;
}
The line where it hangs is:
m_spEngineEx->SetSourceFromByteStream(m_spMFByteStream.Get(), m_bstrURL)
When I'm debugging the app, I can press pause and see the stack. Well, not much of it, but at least I can see that it's stuck indefinitely at
ntdll.dll!77b7f4dc()
Any ideas why my app would hang in such a way?
(OPTIONAL: If you know a better way to play mp3/ogg in a C++ Metro-style app, let me know.)
I could not figure out why this is happening, but I managed to code a work-around:
IMFSourceReader can be used to decode MP3s and feed the bytes into an XAudio2 source voice.
The XAudio2 audio stream effect sample contains a good example of how to do this.
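For reference, here is a rough sketch of that work-around, assuming the MP3 is reachable via a URL/path the source reader can open (in a packaged Metro app you would more likely create the reader over an IMFByteStream) and that an IXAudio2SourceVoice with a matching PCM format already exists. Error handling is trimmed down, and names such as StreamMp3ToVoice are placeholders.
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <wrl/client.h>
#include <xaudio2.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Decode an MP3 with IMFSourceReader and hand the PCM bytes to an existing
// IXAudio2SourceVoice. Format negotiation and cleanup are omitted for brevity.
void StreamMp3ToVoice(const wchar_t* url, IXAudio2SourceVoice* sourceVoice)
{
    MFStartup(MF_VERSION);

    ComPtr<IMFSourceReader> reader;
    MFCreateSourceReaderFromURL(url, nullptr, &reader);

    // Ask the reader to decode the compressed stream to PCM for us.
    ComPtr<IMFMediaType> pcmType;
    MFCreateMediaType(&pcmType);
    pcmType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Audio);
    pcmType->SetGUID(MF_MT_SUBTYPE, MFAudioFormat_PCM);
    reader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_AUDIO_STREAM, nullptr, pcmType.Get());

    for (;;)
    {
        ComPtr<IMFSample> sample;
        DWORD streamFlags = 0;
        reader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_AUDIO_STREAM, 0, nullptr, &streamFlags, nullptr, &sample);
        if (streamFlags & MF_SOURCE_READERF_ENDOFSTREAM)
            break;
        if (!sample)        // e.g. a stream tick; nothing to submit
            continue;

        ComPtr<IMFMediaBuffer> buffer;
        sample->ConvertToContiguousBuffer(&buffer);

        BYTE* data = nullptr;
        DWORD length = 0;
        buffer->Lock(&data, nullptr, &length);

        // Copy the bytes out; XAudio2 needs the buffer to stay valid until played.
        std::vector<BYTE>* chunk = new std::vector<BYTE>(data, data + length);
        buffer->Unlock();

        XAUDIO2_BUFFER xaudioBuffer = {};
        xaudioBuffer.AudioBytes = length;
        xaudioBuffer.pAudioData = chunk->data();
        xaudioBuffer.pContext = chunk;   // free it in the voice callback's OnBufferEnd
        sourceVoice->SubmitSourceBuffer(&xaudioBuffer);
    }
}
The pContext trick is just one way to keep each decoded chunk alive until the voice callback's OnBufferEnd fires; the XAudio2 streaming sample mentioned above shows a more complete buffering scheme.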

How to get webcam video stream bytes in c++

I am targeting Windows machines. I need access to the pointer to the byte array describing the individual streamed frames from an attached USB webcam. I saw the PlayCap DirectShow sample from the Windows SDK, but I don't see how to get to the raw data; frankly, I don't understand how the video actually gets to the window. Since I don't need anything other than video capture, I would prefer not to use OpenCV.
Visual Studio 2008, C++
Insert the sample grabber filter. Connect the camera source to the sample grabber and then to the null renderer. The sample grabber is a transform filter, so you need to feed its output somewhere; if you don't need to render it, the null renderer is a good choice.
You can configure the sample grabber using ISampleGrabber. You can arrange a callback to your app for each frame, giving you either a pointer to the bits themselves or a pointer to the IMediaSample object, which will also give you the metadata.
You need to implement ISampleGrabberCB on your object, and then you need something like this (pseudo code):
IFilterInfoPtr m_pFilterInfo;
ISampleGrabberPtr m_pGrabber;

m_pGrabber = pFilter;
m_pGrabber->SetBufferSamples(false);
m_pGrabber->SetOneShot(false);

// force to 24-bit mode
AM_MEDIA_TYPE mt;
ZeroMemory(&mt, sizeof(mt));
mt.majortype = MEDIATYPE_Video;
mt.subtype = MEDIASUBTYPE_RGB24;
m_pGrabber->SetMediaType(&mt);

m_pGrabber->SetCallback(this, 0);
// SetCallback increments a refcount on ourselves,
// but we own the grabber so this is recursive
// -- must AddRef before calling SetCallback(NULL)
Release();
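For completeness, here is a rough sketch of what the ISampleGrabberCB implementation behind SetCallback(this, 0) might look like. The class name and details are illustrative, and refcounting is stubbed out to match the ownership note above.
// Rough sketch of an ISampleGrabberCB implementation (declared in qedit.h).
// SetCallback(this, 0) routes each frame to SampleCB; SetCallback(this, 1)
// would route to BufferCB instead.
class FrameGrabberCB : public ISampleGrabberCB
{
public:
    // IUnknown: the object is owned elsewhere, so refcounting is a no-op here.
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
    {
        if (riid == IID_IUnknown || riid == __uuidof(ISampleGrabberCB))
        {
            *ppv = static_cast<ISampleGrabberCB*>(this);
            return S_OK;
        }
        *ppv = nullptr;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return 2; }
    STDMETHODIMP_(ULONG) Release() { return 1; }

    // Called once per frame with the full IMediaSample (metadata included).
    STDMETHODIMP SampleCB(double sampleTime, IMediaSample* pSample)
    {
        BYTE* pData = nullptr;
        if (SUCCEEDED(pSample->GetPointer(&pData)))
        {
            long cbData = pSample->GetActualDataLength();
            // pData now points at the raw RGB24 pixels of this frame.
            // Copy them out here; the buffer is recycled after the callback returns.
        }
        return S_OK;
    }

    // Used only with SetCallback(this, 1): just the bits, no IMediaSample.
    STDMETHODIMP BufferCB(double sampleTime, BYTE* pBuffer, long bufferLen)
    {
        return S_OK;
    }
};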

Blackberry Audio Recording Sample Code

Does anyone know of a good repository of sample code for the BlackBerry? Specifically, samples that will help me learn the mechanics of recording audio, possibly even sampling it and doing some on-the-fly signal processing on it?
I'd like to read incoming audio, sample by sample if need be, then process it to produce a desired result, in this case a visualizer.
The RIM API contains the JSR 135 Java Mobile Media API for handling audio and video content.
You're correct about the mess in the BB Knowledge Base. The only way is to browse it, hoping they're not going to change the site map again.
It's Developers -> Resources -> Knowledge Base -> Java APIs & Samples -> Audio & Video
Audio Recording
Basically it's simple to record audio:
create a Player with the correct audio encoding
get a RecordControl
start recording
stop recording
Links:
RIM 4.6.0 API ref: Package javax.microedition.media
How To - Record Audio on a BlackBerry smartphone
How To - Play audio in an application
How To - Support streaming audio to the media application
How To - Specify Audio Path Routing
How To - Obtain the media playback time from a media application
What Is - Supported audio formats
What Is - Media application error codes
Audio Record Sample
A Thread holding the Player, RecordControl and resources is declared:
final class VoiceNotesRecorderThread extends Thread {
    private Player _player;
    private RecordControl _rcontrol;
    private ByteArrayOutputStream _output;
    private byte _data[];

    VoiceNotesRecorderThread() {}

    private int getSize() {
        return (_output != null ? _output.size() : 0);
    }

    private byte[] getVoiceNote() {
        return _data;
    }
}
In Thread.run(), audio recording is started:
public void run() {
    try {
        // Create a Player that captures live audio.
        _player = Manager.createPlayer("capture://audio");
        _player.realize();

        // Get the RecordControl, set the record stream.
        _rcontrol = (RecordControl) _player.getControl("RecordControl");

        // Create a ByteArrayOutputStream to capture the audio stream.
        _output = new ByteArrayOutputStream();
        _rcontrol.setRecordStream(_output);

        _rcontrol.startRecord();
        _player.start();
    } catch (final Exception e) {
        UiApplication.getUiApplication().invokeAndWait(new Runnable() {
            public void run() {
                Dialog.inform(e.toString());
            }
        });
    }
}
And in the thread's stop() method, recording is stopped:
public void stop() {
    try {
        // Stop recording, capture data from the OutputStream,
        // close the OutputStream and the player.
        _rcontrol.commit();
        _data = _output.toByteArray();
        _output.close();
        _player.close();
    } catch (Exception e) {
        synchronized (UiApplication.getEventLock()) {
            Dialog.inform(e.toString());
        }
    }
}
Processing and sampling audio stream
At the end of recording you will have an output stream filled with data in a specific audio format, so to process or sample it you will have to decode that audio stream.
On-the-fly processing will be more complex: you will have to read the output stream during recording, without committing the record. That leaves several problems to solve:
synchronized access to the output stream for the recorder and the sampler (a threading issue)
reading the correct amount of audio data (you'll have to dig into the audio format's decoding to work out its framing rules)
Also possibly useful:
java.net: Experiments in Streaming Content in Java ME by Vikram Goyal
While not audio-specific, this question does have some good "getting started" references.
Writing Blackberry Applications
I spent ages trying to figure this out too. Once you've installed the BlackBerry Component Packs (available from their website), you can find the sample code inside the component pack.
In my case, once I had installed the Component Packs into Eclipse, I found the extracted sample code in this location:
C:\Program Files\Eclipse\eclipse3.4\plugins\net.rim.eide.componentpack4.5.0_4.5.0.16\components\samples
Unfortunately, when I imported all that sample code I got a bunch of compile errors. To work around that, I just deleted the roughly 20% of packages that had compile errors.
My next problem was that launching the Simulator always launched the first sample code package (in my case activetextfieldsdemo), and I couldn't get it to run just the package I was interested in. The workaround for that was to delete all the packages listed alphabetically before the one I wanted.
Other gotchas:
-Right-click on the project in Eclipse and select Activate for BlackBerry.
-Choose BlackBerry -> Build Configurations... -> Edit... and select your new project so it builds.
-Make sure you put your BlackBerry source code under a "src" folder in the Eclipse project, otherwise you might hit build issues.
