How to obtain bitrate, sample rate and bits per sample in python-vlc - libvlc

I am trying to obtain the bitrate, sample rate, and bits per sample of an audio file using python-vlc.
I saw that we can use libvlc_media_tracks_get() to obtain the bitrate, but I am not sure how to get the others.
Even if this method can obtain all three pieces of information, I still can't manage to make it work. It takes two arguments, p_md and tracks, and I don't understand what tracks is. The docs say it requires an instance of LP_LP_MediaTrack, but I can't find what that means.

Bitrate:
# This creates a NULL pointer of type POINTER(MediaTrack); no MediaTrack is
# allocated yet -- the library will fill in the address for us.
mediaTrack_pp = ctypes.POINTER(vlc.MediaTrack)()
# libvlc_media_tracks_get() allocates the track data, stores its address
# through our pointer, and returns the number of media tracks as `n`.
n = vlc.libvlc_media_tracks_get(media, ctypes.byref(mediaTrack_pp))
# Reinterpret the pointer as a pointer to an array of n track pointers, so
# Python can index into it.
info = ctypes.cast(mediaTrack_pp, ctypes.POINTER(ctypes.POINTER(vlc.MediaTrack) * n))
# Dereference twice: first the array, then the first track pointer.
media_tracks = info.contents[0].contents
Sample rate (appending the above code):
# According to the API docs, MediaTrack has a `u` union field whose `audio`
# member points to the audio-track data.
audio = ctypes.cast(media_tracks.u.audio, ctypes.POINTER(vlc.AudioTrack))
# The AudioTrack class has a `rate` field, which is the sample rate.
sample = audio.contents.rate
The main reason I couldn't get the bitrate is that I didn't understand the data type LP_LP_MediaTrack. It is a double pointer to a MediaTrack instance.
When vlc.libvlc_media_tracks_get(media, ctypes.byref(mediaTrack_pp)) runs, I think the library simply copies the track information in media to the memory location that mediaTrack_pp points to.
We can then get the actual object in Python using the ctypes.cast() method. The same approach should apply to any Python code that uses LP_xxx data types, even for other libraries.
With that problem solved, all that was left was to crunch the API documentation.
VLC MediaTrack class documentation
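The double-pointer dance can be demonstrated with the standard library alone. In the sketch below, Track and fake_tracks_get are hypothetical stand-ins for vlc.MediaTrack and libvlc_media_tracks_get, used only to show how an LP_LP_xxx argument is built and dereferenced:

```python
import ctypes

# Hypothetical stand-in for vlc.MediaTrack, reduced to one field.
class Track(ctypes.Structure):
    _fields_ = [("bitrate", ctypes.c_uint)]

# Storage the "library" owns: one track plus an array of pointers to it.
_track = Track(320000)
_tracks = (ctypes.POINTER(Track) * 1)(ctypes.pointer(_track))

def fake_tracks_get(pp):
    # Write the address of the pointer array through the caller's double
    # pointer and return the number of tracks, as the C function would.
    pp[0] = ctypes.cast(_tracks, ctypes.POINTER(Track))
    return len(_tracks)

# Caller side -- the same shape as the python-vlc snippet above.
track_pp = ctypes.POINTER(Track)()               # NULL LP_Track
n = fake_tracks_get(ctypes.pointer(track_pp))    # pass its address: LP_LP_Track
info = ctypes.cast(track_pp, ctypes.POINTER(ctypes.POINTER(Track) * n))
first = info.contents[0].contents                # dereference twice
print(n, first.bitrate)
```

With the real library, the track array is allocated on the C side, so call vlc.libvlc_media_tracks_release(mediaTrack_pp, n) once you are done with it.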

In libVLC's .NET binding, try:
MediaPlayer.Media.Tracks[0].Bitrate;

Related

How to modify a Transform stream in Node.js whose parameters are part of the source stream

Say I have a transform stream that is created with a single parameter - e.g. createTransform(a).
It expects to read from a stream, do its work, and output a transformed stream. But I have a situation where the parameter a is the first x bytes of the source stream. x is always fixed and known.
How can I create my own transform stream that expects a stream with a as the first x bytes?
i.e. original scenario:
source.pipe(createTransform(a)).pipe(destination)
desired scenario:
sourceWithAPrepended.pipe(createTransformReadingA()).pipe(destination)
I'm afraid I've been unable to work out how to even approach this. Assume that createTransform is a black box; I want to create the function createTransformReadingA.
Many thanks for any guidance.
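The shape of a solution is language-agnostic: buffer the stream until the first x bytes have arrived, construct the inner transform with them, then pipe the remainder through it. A minimal Python sketch of the idea, using generators as streams (make_transform is a hypothetical stand-in for createTransform; in Node.js the same buffering would live inside a Transform's _transform callback):

```python
def make_transform(a):
    # Hypothetical stand-in for createTransform(a): emits a header built
    # from its parameter, then upper-cases every chunk it is fed.
    def transform(chunks):
        yield b"[" + a + b"]"
        for chunk in chunks:
            yield chunk.upper()
    return transform

def transform_reading_a(chunks, x):
    # Buffer input until the first x bytes (the parameter a) are available.
    it = iter(chunks)
    buf = b""
    while len(buf) < x:
        buf += next(it)
    a, rest = buf[:x], buf[x:]

    # Only now build the inner transform, and feed it the rest of the stream,
    # starting with any bytes read past the parameter boundary.
    def remaining():
        if rest:
            yield rest
        yield from it

    yield from make_transform(a)(remaining())

out = b"".join(transform_reading_a([b"K:", b"he", b"llo"], x=2))
print(out)
```

Note that chunk boundaries need not align with x, which is why the leftover rest must be pushed back in front of the remaining chunks.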

Python nidaqmx to read K thermocouple value

I am a Python newbie but generally get by by modifying examples to suit my limited needs.
I am trying to automate some temperature measurements using an NI 9213 in conjunction with an NI CDAQ 9714.
I watched the video from NI and was able to take measurements generically:
https://www.youtube.com/watch?v=NMMRbPvkzFs
However, I can't correctly specify the type of thermocouple.
import nidaqmx
nidaqmx.constants.ThermocoupleType(10073)
nidaqmx.constants.TemperatureUnits(10143)
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_thrmcpl_chan("cDaq1Mod1/ai0:1")
    #task.ai_channels.add_ai_thrmcpl_chan("cDaq1Mod1/ai0:1", "bob", 0.0, 100.0, units="TemperatureUnits.DEG_C: 10143", thermocouple_type="ThermocoupleType.J: 10072")
    data = task.read(1, 1.0)
    print(data[0])
From here
http://nidaqmx-python.readthedocs.io/en/latest/ai_channel_collection.html
I just cannot work out how to set the units and the type of thermocouple.
I can use commands to set these generically, but I cannot refer to them in the add-thermocouple command.
I am using Anaconda Spyder 3.6
add_ai_thrmcpl_chan(physical_channel, name_to_assign_to_channel=u'', min_val=0.0, max_val=100.0, units=, thermocouple_type=, cjc_source=, cjc_val=25.0, cjc_channel=u'')[source]
Creates channel(s) that use a thermocouple to measure temperature.
Parameters:
physical_channel (str) – Specifies the names of the physical channels to use to create virtual channels. The DAQmx physical channel constant lists all physical channels on devices and modules installed in the system.
name_to_assign_to_channel (Optional[str]) – Specifies a name to assign to the virtual channel this function creates. If you do not specify a value for this input, NI-DAQmx uses the physical channel name as the virtual channel name.
min_val (Optional[float]) – Specifies in units the minimum value you expect to measure.
max_val (Optional[float]) – Specifies in units the maximum value you expect to measure.
units (Optional[nidaqmx.constants.TemperatureUnits]) – Specifies the units to use to return temperature measurements.
thermocouple_type (Optional[nidaqmx.constants.ThermocoupleType]) – Specifies the type of thermocouple connected to the channel. Thermocouple types differ in composition and measurement range.
cjc_source (Optional[nidaqmx.constants.CJCSource]) – Specifies the source of cold-junction compensation.
cjc_val (Optional[float]) – Specifies in units the temperature of the cold junction if you set cjc_source to CONSTANT_VALUE.
cjc_channel (Optional[str]) – Specifies the channel that acquires the temperature of the thermocouple cold- junction if you set cjc_source to CHANNEL.
Any suggestions hugely appreciated. Such a simple thing, but I hit a roadblock and can't see any examples that directly relate.
Many thanks,
Gavin
I've had the same issue.
The solution was to pass the nidaqmx.constants enum members directly to add_ai_thrmcpl_chan:
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_thrmcpl_chan(
        "cDaq1Mod1/ai0:2", name_to_assign_to_channel="",
        min_val=0.0, max_val=100.0,
        units=nidaqmx.constants.TemperatureUnits.DEG_C,
        thermocouple_type=nidaqmx.constants.ThermocoupleType.K,
        cjc_source=nidaqmx.constants.CJCSource.CONSTANT_USER_VALUE,
        cjc_val=20.0, cjc_channel="")

Get duration of recorded audio

I want to get the duration of the recorded audio from microphone. Currently, I'm using the method GetSampleDuration of the Microphone class.
totalRecordTime = microphone.GetSampleDuration(stream.ToArray().Length);
This works great, but I think there are two problems:
The stream object contains the WAV header. I don't know how it affects the calculated duration; alternatively, I could subtract the WAV header length from the stream length first.
GetSampleDuration takes an int sizeInBytes. Is it possible for the sizeInBytes parameter to exceed the maximum value of an int?
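The underlying math is simple: duration is the payload size divided by the byte rate (sample rate * channels * bytes per sample), and a canonical PCM WAV header is 44 bytes (real files can carry extra chunks, so treat that constant as an assumption). A Python sketch:

```python
def wav_duration_seconds(total_bytes, sample_rate, channels, bits_per_sample,
                         header_bytes=44):
    # Strip the header first, then divide the payload by the byte rate.
    data_bytes = total_bytes - header_bytes
    byte_rate = sample_rate * channels * (bits_per_sample // 8)
    return data_bytes / byte_rate

# 10 seconds of 16-bit mono audio at 16 kHz, plus a 44-byte header.
size = 44 + 10 * 16000 * 1 * 2
duration = wav_duration_seconds(size, 16000, 1, 16)
print(duration)
```

At these rates a 44-byte header shifts the result by only about a millisecond, which is why including it often goes unnoticed. As for the second concern: a signed 32-bit int caps the byte count at 2 GiB, which at 16 kHz mono 16-bit is more than 18 hours of audio, so it is rarely a practical limit for microphone captures.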

OMXCodec::onEvent -- OMX Bad Parameter

I have been trying to use OMXCodec through Stagefright. I have implemented the code for the ICS version of Android. I have two classes: CustomDataSource, which derives from MediaSource, and CustomOmxCodec, which calls OMXCodec::Create and executes read operations to decode h264 frames. I have tested this implementation on a device with the omx.google.video.avc software decoder and it works fine. Now, when I try to run the same implementation on an Android phone with a hardware h264 decoder, the read call returns an error:
[OMX.MTK.VIDEO.DECODER.AVC] ERROR (0x80001005, 0)
0x80001005 is OMX_ErrorBadParameter, and I get error code -1103 on the read operation.
I tried various parameters but had no success.
The complete log is as below:
[OMX.MTK.VIDEO.DECODER.AVC] mVideoInputErrorRate (0.000000)
!##!>>create tid (21087) OMXCodec mOMXLivesLocally=0, mIsVideoDecoder (1), mIsVideoEncoder (0), mime(video/avc)
[OMX.MTK.VIDEO.DECODER.AVC] video dimensions are 640X480
mSupportesPartialFrames 1 err 0
[OMX.MTK.VIDEO.DECODER.AVC] allocating 10 buffers of size 65536 on input port.
[OMX.MTK.VIDEO.DECODER.AVC] mMemHeapBase = 0x00E8C288, mOutputBufferPoolMemBase=0x51F8E000, size = 9578848
[OMX.MTK.VIDEO.DECODER.AVC] ERROR (0x80001005, 0)
OMXCodec::onEvent--OMX Bad Parameter!!
Read Error : -1103
I'd be grateful for any direction on this.
From the question, the hardware codec, i.e. OMX.MTK.VIDEO.DECODER.AVC, does not support one of the parameters being passed as part of the configuration steps.
From OMXCodec::Create, configureCodec is invoked, which internally invokes a lot of other functions. Since the error comes as part of OMXCodec::onEvent, one possible scenario is that the component encountered an error while decoding the first few bytes of the first frame.
Specifically, when the component encounters the SPS and PPS (part of the codec-specific data), it would typically trigger a portSettingsChanged event. From your description, I feel that during this process there is some error, and hence onEvent has been triggered.
Please share more logs to analyze further.
The MTK H264 decoder needs the parameters csd-0 and csd-1 to initialize (you can find more information at http://developer.android.com/reference/android/media/MediaCodec.html). csd-0 and csd-1 stand for the SPS and PPS of H264. I asked an MTK engineer, and he said we can use the code below to set these two parameters:
byte[] sps = {0,0,0,1,103,100,0,40,-84,52,-59,1,-32,17,31,120,11,80,16,16,31
,0,0,3,3,-23,0,0,-22,96,-108};
byte[] pps = {0,0,0,1,104,-18,60,-128};
MediaFormat mFormat = MediaFormat.createVideoFormat("video/avc", width, height);
mFormat.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
mFormat.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
Maybe that's why we got the OMX Bad Parameter error message.
From the logs, and mapping them to the implemented code, I feel the following is happening:
[OMX.MTK.VIDEO.DECODER.AVC] allocating 10 buffers of size 65536 on input port.
This step allocates the buffers on the input port of the decoder.
From the flow of the code, after the input-port buffers are allocated, the buffers on the output port are allocated from the native window through allocateOutputBuffersFromNativeWindow.
One of the steps in this method is to increase the number of buffers on the output port by 2 and set the same on the OMX component, as shown here.
I feel your error might stem from this specific point, because nBufferSize is a read-only field of the OMX_IndexParamPortDefinition index; see the OpenMAX IL specification, section 3.1.3.12.1, page 83.
It appears that your OMX component may be a strictly OMX-compliant component, whereas in Android from ICS onwards certain deviations are expected. This could be one potential cause of your error.
P.S: If you could share more information, we could help further.

Reading from a Network Stream in C#

I have an issue with reading from a network stream in C#. Since I am more of a Java developer, I came across this problem.
In Java I can find the length of the received data using the following code:
int length = dataInputStream.read(rcvPacket);
Even though the byte array rcvPacket is larger than the number of bytes received, this lets me read only the required number of elements, so the array doesn't end up padded with zeros.
I tried to use a similar thing in C#:
long len = networkStream.Length;
but the documentation says this property is not supported. Is there a workaround?
Thank you
The Java code doesn't show you the length of a packet. It shows you the amount of data read within that read call, which could have come from multiple packets.
NetworkStream.Read returns the same information, except using 0 to indicate the end of the stream instead of -1.
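The portable pattern in both languages is therefore a read loop that trusts only the per-call return value. A Python sketch (io.BytesIO stands in for the network stream; a socket behaves the same way, producing an empty read at end of stream):

```python
import io

def read_all(stream, chunk_size=4096):
    # Read until the stream reports end-of-stream (a zero-length read),
    # accumulating only the bytes each call actually produced.
    parts = []
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:   # b"" means the peer closed / the stream ended
            break
        parts.append(chunk)
    return b"".join(parts)

data = read_all(io.BytesIO(b"hello world"), chunk_size=4)
print(len(data), data)
```

The same loop in C# would call NetworkStream.Read(buffer, 0, buffer.Length) until it returns 0, copying only the returned count of bytes each iteration.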
