GStreamer decodebin not linking to audioconvert - Rust

I'm trying to play a sound file with gstreamer in Rust (using the gstreamer crate).
Here's my code:
gst::init().unwrap();
let source = gst::ElementFactory::make("filesrc", Some("source")).expect("Could not create source");
source.set_property_from_str("location", "/home/yuutsuna/Music/m3.mp3");
let decodebin = gst::ElementFactory::make("decodebin", Some("decodebin")).expect("Could not create decodebin element");
let audioconvert = gst::ElementFactory::make("audioconvert", Some("audioconvert")).expect("Could not create audioconvert element");
let sink = gst::ElementFactory::make("pulsesink", None).expect("SINK");
sink.set_property_from_str("device", "alsa_output.pci-0000_02_02.0.analog-stereo");
let pipeline = gst::Pipeline::new(Some("music-pipeline"));
pipeline.add_many(&[&source, &decodebin, &audioconvert, &sink]).unwrap();
gst::Element::link(&source, &decodebin).expect("could not link source and decodebin");
gst::Element::link(&decodebin, &audioconvert).expect("Could not link decodebin and audioconvert");
gst::Element::link(&audioconvert, &sink).expect("Could not link audioconvert and sink");
I get the error Could not link decodebin and audioconvert.
However, the following command, which does the same thing, works:
gst-launch-1.0 filesrc location="/home/yuutsuna/Music/m3.mp3" ! decodebin ! audioconvert ! pulsesink device=alsa_output.pci-0000_02_02.0.analog-stereo
After looking into the logs and the documentation, I found that GStreamer is unable to link the pads between decodebin and audioconvert:
0:00:00.098766751 13567 0x7fd40c004600 INFO GST_ELEMENT_PADS gstutils.c:1816:gst_element_link_pads_full: trying to link element decodebin:(any) to element audioconvert:(any)
0:00:00.098808081 13567 0x7fd40c004600 INFO GST_PADS gstpad.c:4357:gst_pad_peer_query:<audioconvert:src> pad has no peer
0:00:00.098874314 13567 0x7fd40c004600 INFO structure gststructure.c:2917:gst_structure_get_valist: Expected field 'channel-mask' in structure: audio/x-raw, rate=(int)[ 1, 2147483647 ], channels=(int)[ 1, 2147483647 ];
0:00:00.099085743 13567 0x7fd40c004600 INFO GST_ELEMENT_PADS gstelement.c:1013:gst_element_get_static_pad: no such pad 'src_%u' in element "decodebin"
0:00:00.099126988 13567 0x7fd40c004600 INFO GST_ELEMENT_PADS gstutils.c:1270:gst_element_get_compatible_pad:<decodebin> Could not find a compatible pad to link to audioconvert:sink
The decodebin source pad is a dynamic pad, so it is not available when the element is created. I confirmed this by calling the request_pad method, which returned None. So I deleted the line where I linked the two elements and instead added a handler for the pad_added event, as shown in the documentation, in order to link the pads directly. However, my closure is never called. Here's the line I used:
decodebin.connect("pad_added", true, |value| { info!("new pad {:?}", value); None });
I'm not sure if I'm listening to the event correctly. The only examples I found were in C. Either that or there's another issue...
EDIT: After a good night's sleep I found an example that uses the connect_pad_added method instead. I'll test this later, but I think it's the solution.

OK, so I replaced this line:
gst::Element::link(&decodebin, &audioconvert).expect("Could not link decodebin and audioconvert");
with:
let audioconvert_weak = audioconvert.downgrade();
decodebin.connect_pad_added(move |_, src_pad| {
    println!("new pad {:?}", src_pad);
    let sink_pad = match audioconvert_weak.upgrade() {
        None => return,
        Some(s) => s.static_pad("sink").expect("cannot get sink pad from sink"),
    };
    src_pad.link(&sink_pad).expect("Cannot link the decodebin source pad to the audioconvert sink pad");
});
Instead of linking the two elements directly, I register a closure that is called whenever a pad is added to the decodebin element. The closure prints a message, gets the sink pad from my audioconvert element (passed in via a weak reference), and finally links the two pads.
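One refinement worth noting: decodebin can also expose video or subtitle pads, so it is safer to check the new pad's caps before linking. A minimal sketch of that guard, assuming a recent gstreamer crate (the caps check is my addition, not part of the original answer):

let audioconvert_weak = audioconvert.downgrade();
decodebin.connect_pad_added(move |_, src_pad| {
    // Only link pads that carry audio; ignore video or text pads.
    let is_audio = src_pad
        .current_caps()
        .and_then(|caps| caps.structure(0).map(|s| s.name().starts_with("audio/")))
        .unwrap_or(false);
    if !is_audio {
        return;
    }
    if let Some(audioconvert) = audioconvert_weak.upgrade() {
        let sink_pad = audioconvert
            .static_pad("sink")
            .expect("audioconvert should have a static sink pad");
        // A pad can only be linked once; skip if it is already connected.
        if sink_pad.is_linked() {
            return;
        }
        src_pad
            .link(&sink_pad)
            .expect("Cannot link the decodebin source pad to the audioconvert sink pad");
    }
});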

Related

How to get Bevy to run longer than 5 seconds on Windows 11 with a GeForce RTX 2060

I'm trying to go through the Bevy docs and have noticed that I absolutely cannot run a single example or basic app for longer than about 5 seconds without errors breaking execution. Is there something special needed beyond the docs' setup, or is Bevy just broken on an up-to-date Windows 11 + GeForce RTX 2060 machine?
No matter which example I run or follow along with from the docs, this always happens:
PS C:\Development\GameDev\my_bevy_game> cargo run
warning: unused manifest key: target.aarch64-apple-darwin.rustflags
warning: unused manifest key: target.x86_64-apple-darwin.rustflags
warning: unused manifest key: target.x86_64-pc-windows-msvc.linker
warning: unused manifest key: target.x86_64-pc-windows-msvc.rustflags
warning: unused manifest key: target.x86_64-unknown-linux-gnu.linker
warning: unused manifest key: target.x86_64-unknown-linux-gnu.rustflags
Compiling my_bevy_game v0.1.0 (C:\Development\GameDev\my_bevy_game)
Finished dev [unoptimized + debuginfo] target(s) in 4.01s
Running `target\debug\my_bevy_game.exe`
2022-04-18T15:56:45.590239Z ERROR wgpu_hal::vulkan::instance: GENERAL [Loader Message (0x0)]
setupLoaderTrampPhysDevs: Failed during dispatch call of 'vkEnumeratePhysicalDevices' to lower layers or loader to get count.
2022-04-18T15:56:45.591644Z ERROR wgpu_hal::vulkan::instance: objects: (type: INSTANCE, hndl: 0x207c17a7b00, name: ?)
2022-04-18T15:56:45.592432Z ERROR wgpu_hal::vulkan::instance: GENERAL [Loader Message (0x0)]
setupLoaderTrampPhysDevs: Failed during dispatch call of 'vkEnumeratePhysicalDevices' to lower layers or loader to get count.
2022-04-18T15:56:45.592561Z ERROR wgpu_hal::vulkan::instance: objects: (type: INSTANCE, hndl: 0x207c17a7b00, name: ?)
2022-04-18T15:56:45.901926Z INFO bevy_render::renderer: AdapterInfo { name: "NVIDIA GeForce RTX 2060", vendor: 4318, device: 7957, device_type: DiscreteGpu, backend: Dx12 }
hello Elaina Proctor!
hello Renzo Hume!
hello Zayna Nieves!
2022-04-18T15:56:48.506223Z ERROR present_frames: wgpu_hal::dx12::instance: ID3D12CommandQueue::Present: Resource state (0x800: D3D12_RESOURCE_STATE_COPY_SOURCE) (promoted from COMMON state) of resource (0x00000207DD7D0A70:'Unnamed ID3D12Resource Object') (subresource: 0) must be in COMMON state when transitioning to use in a different Command List type, because resource state on previous Command List type : D3D12_COMMAND_LIST_TYPE_COPY, is actually incompatible and different from that on the next Command List type : D3D12_COMMAND_LIST_TYPE_DIRECT. [ RESOURCE_MANIPULATION ERROR #990: RESOURCE_BARRIER_MISMATCHING_COMMAND_LIST_TYPE]
error: process didn't exit successfully: `target\debug\my_bevy_game.exe` (exit code: 1)
PS C:\Development\GameDev\my_bevy_game>
The Rust code I wrote from the book (note that this happens with the Bevy repo's untouched example code as well):
use bevy::prelude::*;

pub struct HelloPlugin;

struct GreetTimer(Timer);

#[derive(Component)]
struct Person;

#[derive(Component)]
struct Name(String);

impl Plugin for HelloPlugin {
    fn build(&self, app: &mut App) {
        // the reason we call from_seconds with the true flag is to make the timer repeat itself
        app.insert_resource(GreetTimer(Timer::from_seconds(2.0, true)))
            .add_startup_system(add_people)
            .add_system(greet_people);
    }
}

fn greet_people(time: Res<Time>, mut timer: ResMut<GreetTimer>, query: Query<&Name, With<Person>>) {
    // update our timer with the time elapsed since the last update;
    // if that caused the timer to finish, we say hello to everyone
    if timer.0.tick(time.delta()).just_finished() {
        for name in query.iter() {
            println!("hello {}!", name.0);
        }
    }
}

fn add_people(mut commands: Commands) {
    commands
        .spawn()
        .insert(Person)
        .insert(Name("Elaina Proctor".to_string()));
    commands
        .spawn()
        .insert(Person)
        .insert(Name("Renzo Hume".to_string()));
    commands
        .spawn()
        .insert(Person)
        .insert(Name("Zayna Nieves".to_string()));
}

fn main() {
    App::new()
        .add_plugins(DefaultPlugins)
        .add_plugin(HelloPlugin)
        .run();
}
You can see from the output that I'm trying to run the my_bevy_game example from the book, but this exact same issue with wgpu occurs on all the examples I've run thus far. What does one need to do to run anything with Bevy?
-- edit --
It would appear that this is a DX12 issue that wgpu needs to address. The proposed workarounds in the Bevy issue don't work for my machine, or for others'. For the time being, Bevy appears irreparably "broken" on setups like mine, due to its dependence on wgpu.
It looks like a bug; see Bevy issue #4461. The bug was inside wgpu.
You can try to temporarily use the latest wgpu version by patching Cargo.toml:
[patch.crates-io]
wgpu = { git = "https://github.com/gfx-rs/wgpu" }
# or
wgpu = { git = "https://github.com/gfx-rs/wgpu" , rev = "3d10678a91b78557b0dea537407eb4a4ff754872" }
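Note that, if I understand Cargo's patching correctly, a [patch.crates-io] entry only takes effect when the git copy of wgpu is semver-compatible with the version Bevy requires, so pinning a known-good rev (as in the second line) is the safer choice.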

How do you use a teensy 4 pin via the teensy4-bsp Rust crate as an input with the pull-up resistor enabled?

I am trying to figure out how to do the Rust equivalent of
pinMode(PIN_D7, INPUT_PULLUP); // Pushbutton
(from https://www.pjrc.com/teensy/td_digital.html)
I have created a project using the template https://github.com/mciantyre/teensy4-rs-template as outlined in the Getting Started section of https://github.com/mciantyre/teensy4-rs .
Unfortunately, the Arduino-style Rust code is a rabbit hole that IntelliJ IDEA cannot fully navigate (the crates use macros to generate structs and impls), so I do not get any helpful completion results that would help me figure out which methods and fields are available.
I'm not sure what to do with pins.p7 to activate the pull-up resistor, or even sample it. Chasing the docs from p7 to P7 to B1_01 to Pad leaves me still confused.
(documenting some failure here)
I did some experiments and some text searches across the crates and found the Config structure.
Unfortunately, when I used it like this, the results were not reliable.
// pull-down resistor. Switch drags to 3.3v
fn mission1(mut switch_pin: B0_10, led: &mut LED, systick: &mut SysTick) -> ! {
    let cfg = teensy4_bsp::hal::iomuxc::Config::zero().set_pullupdown(PullUpDown::Pulldown100k);
    iomuxc::configure(&mut switch_pin, cfg);
    let bacon = GPIO::new(switch_pin);
    loop {
        if bacon.is_set() {
            led.toggle()
        }
        systick.delay(300);
    }
}
It still picked up spurious button clicks. I turned things around and tried to rig it for pull-up:
// pull-up resistor. Switch drags to ground
fn mission2(mut switch_pin: B0_10, led: &mut LED, systick: &mut SysTick) -> !
{
let pull_up = match 22
{
100 => PullUpDown::Pullup100k, // unreliable
47 => PullUpDown::Pullup47k, // unreliable
_ => PullUpDown::Pullup22k,
};
let cfg = teensy4_bsp::hal::iomuxc::Config::zero().set_pullupdown(pull_up);
iomuxc::configure(&mut switch_pin, cfg);
let bacon = GPIO::new(switch_pin);
loop {
if ! bacon.is_set() {
led.toggle()
}
systick.delay(300);
}
}
None of the three pull-up options was usable. I attached a multimeter and it read about 0.67 V between the pin and ground with the switch open and the 22k pull-up option selected.
When I wire up a physical 10K resistor it behaves as I expect, and the multimeter measures 3.23 V. If I wire two 10Ks in series for pull-up, it measures 3.20 V.
I am going to say that this is not the proper technique for a Teensy 4.0.
Based on the response to https://github.com/mciantyre/teensy4-rs/issues/107 and the code at https://github.com/imxrt-rs/imxrt-hal/issues/112, I was able to create the following example that seems to work on my Teensy 4.0:
let cfg = Config::zero()
    .set_hysteresis(Hysteresis::Enabled)
    .set_pull_keep(PullKeep::Enabled)
    .set_pull_keep_select(PullKeepSelect::Pull)
    .set_pullupdown(PullUpDown::Pulldown100k);
iomuxc::configure(&mut switch_pin, cfg);
let switch_gpio = GPIO::new(switch_pin);
loop {
    if switch_gpio.is_set() {
        led.toggle()
    }
    systick.delay(LED_PERIOD_MS);
}
Full code at https://github.com/mciantyre/teensy4-rs/blob/997d92cc880185f22272d1cfd54de54732154bb5/examples/pull_down_pin.rs .
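For the original INPUT_PULLUP question, presumably the same recipe works with the pull direction flipped and the switch wired to ground instead of 3.3 V. A sketch under that assumption (untested; same imports and helpers as the snippets above):

// Untested sketch: same keeper/pull configuration as the working example,
// but selecting a pull-up so the pin idles high and a pressed switch pulls it low.
let cfg = Config::zero()
    .set_hysteresis(Hysteresis::Enabled)
    .set_pull_keep(PullKeep::Enabled)
    .set_pull_keep_select(PullKeepSelect::Pull)
    .set_pullupdown(PullUpDown::Pullup22k);
iomuxc::configure(&mut switch_pin, cfg);
let switch_gpio = GPIO::new(switch_pin);
loop {
    // With a pull-up, the input reads low while the button is pressed.
    if !switch_gpio.is_set() {
        led.toggle()
    }
    systick.delay(LED_PERIOD_MS);
}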

Working with Protobuf-encoded MQTT streams in Apache Beam

I am trying to decode and process protobuf-encoded MQTT messages (from an Eclipse Mosquitto broker) using Apache Beam. In addition to the encoded fields, I also want to process the full topic of each message for grouping and aggregations, as well as the timestamp.
What I have tried so far
I can connect to Mosquitto via
val options = PipelineOptionsFactory.create()
val pipeline = Pipeline.create(options)

val mqttReader: MqttIO.Read = MqttIO
    .read()
    .withConnectionConfiguration(
        MqttIO.ConnectionConfiguration.create(
            "tcp://localhost:1884",
            "my/topic/+"
        )
    )

val readMessages = pipeline.apply<PCollection<ByteArray>>(mqttReader)
In order to decode the messages, I have compiled the .proto schema (in my case quote.proto, containing the Quote message) via Gradle, which allows me to transform a ByteArray into Quote objects via Quote.parseFrom():
val quotes = readMessages
    .apply(
        ParDo.of(object : DoFn<ByteArray, QuoteOuterClass.Quote>() {
            @ProcessElement
            fun processElement(context: ProcessContext) {
                val protoRow = context.element()
                context.output(QuoteOuterClass.Quote.parseFrom(protoRow))
            }
        })
    )
Using this, in the next apply I can then access individual fields with a ProcessFunction and a lambda, e.g. { quote -> "${quote.volume}" } (see the sketch after this list). However, there are two problems:
1. With this pipeline I do not have access to the topic or timestamp of each message.
2. After sending the decoded messages back to the broker with plain UTF-8 encoding, I believe that they do not get decoded correctly.
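For reference, the per-field access mentioned above looks roughly like this (a sketch; volume is just the example field from my quote.proto):

// Sketch: map each decoded Quote to a string built from one of its fields.
val volumes: PCollection<String> = quotes.apply(
    MapElements.into(TypeDescriptors.strings())
        .via(ProcessFunction { quote -> "${quote.volume}" })
)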
Additional considerations
Apache Beam provides a ProtoCoder class, but I cannot figure out how to use it in conjunction with MqttIO. I suspect that the implementation has to look similar to:
val coder = ProtoCoder
.of(QuoteOuterClass.Quote::class.java)
.withExtensionsFrom(QuoteOuterClass::class.java)
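I haven't verified this, but coders are normally attached to a PCollection with setCoder, so I would expect something like:

// Unverified sketch: register the proto coder on the decoded collection so
// Beam can serialize Quote objects between transforms.
val quotesWithCoder = quotes.setCoder(
    ProtoCoder
        .of(QuoteOuterClass.Quote::class.java)
        .withExtensionsFrom(QuoteOuterClass::class.java)
)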
Instead of a PCollection<ByteArray>, the Kafka IO reader provides a PCollection<KafkaRecord<Long, String>>, which has all the relevant fields (including the topic). I am wondering if something similar can be achieved with MQTT + Protobuf.
A similar implementation to what I want to achieve can be done in Spark Structured Streaming + Apache Bahir as follows:
val df_mqttStream = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic)
  .load(brokerUrl)

val parsePayload = ProtoSQL.udf { bytes: Array[Byte] => Quote.parseFrom(bytes) }

val quotesDS = df_mqttStream.select("id", "topic", "payload")
  .withColumn("quote", parsePayload($"payload"))
  .select("id", "topic", "quote.*")
However, with Spark 2.4 (the latest supported version), accessing the message topic is broken (see the related issue and my ticket in Apache Jira).
From my understanding, the latest version of Apache Beam (2.27.0) simply does not offer a way to extract the topic of an MQTT message.
I have extended MqttIO to return MqttMessage objects that include the topic (and a timestamp) in addition to the byte-array payload. The changes currently exist as a pull request draft.
With these changes, the topic can simply be accessed as message.topic:
val readMessages = pipeline.apply<PCollection<MqttMessage>>(mqttReader)

val topicOfMessages: PCollection<String> = readMessages
    .apply(
        ParDo.of(object : DoFn<MqttMessage, String>() {
            @ProcessElement
            fun processElement(
                @Element message: MqttMessage,
                out: OutputReceiver<String>
            ) { out.output(message.topic) }
        })
    )

Java Gstreamer Gnonlin source segmentation

Using the Java GStreamer bindings, I want to read an audio file from disk and write a segment of this file back to disk. For this I cannot use the filesrc element; instead I found that I can use the gnlurisource element from the Gnonlin plugin.
I took the GStreamer Java bindings and compiled them locally, getting a jar file that I added to my project. I also installed GStreamer on Ubuntu using the following commands:
sudo apt-get install libgstreamer1.0-dev
sudo apt-get install gstreamer1.0-gnonlin
The program compiles without errors, but it just hangs and does nothing. Here is my program code:
import java.util.concurrent.TimeUnit;

import org.freedesktop.gstreamer.Element;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.State;

public class AudioSegmentation {
    public static void main(String[] args) {
        Pipeline pipe;
        Element composition;
        Element gnlsource;
        Element convert;
        Element filesink;

        Gst.init();

        pipe = new Pipeline("SimplePipeline");
        composition = ElementFactory.make("gnlcomposition", "comp");
        gnlsource = ElementFactory.make("gnlurisource", "gnlsource");
        convert = ElementFactory.make("audioconvert", "compconvert");
        filesink = ElementFactory.make("filesink", "filesink");

        gnlsource.set("uri", "file:///home/user/Desktop/file-source.wav");
        gnlsource.set("start", TimeUnit.SECONDS.toNanos(5));
        gnlsource.set("duration", TimeUnit.SECONDS.toNanos(2));
        filesink.set("location", "/home/user/Desktop/file-destination.wav");

        composition.link(gnlsource);
        pipe.addMany(composition, convert, filesink);
        Element.linkMany(composition, convert, filesink);

        pipe.setState(State.PLAYING);
        Gst.main();
        Gst.quit();
    }
}
I don't have much experience with GStreamer; can you give me a hint about what's wrong?
Thank you!
UPDATE: I managed to use GStreamer from the command line to select a segment from an audio file. The gnlurisource element has the inpoint parameter to set the segment start time and duration to specify the length of the segment.
Here is the command:
gst-launch-1.0 gnlurisource uri=file:///home/user/Desktop/file-source.wav inpoint=2000000000 duration=1500000000 ! audioconvert ! wavenc ! filesink location=/home/user/Desktop/file-destination.wav
I'm still trying to implement this pipeline in Java. I tried something like the following, but it doesn't work:
import java.util.concurrent.TimeUnit;

import org.freedesktop.gstreamer.Element;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.State;

public class AudioSegmentation {
    public static void main(String[] args) {
        Pipeline pipe;
        Element gnlurisource;
        Element audioconvert;
        Element wavenc;
        Element filesink;

        Gst.init();

        pipe = new Pipeline("SimplePipeline");
        gnlurisource = ElementFactory.make("gnlurisource", "gnlurisource");
        audioconvert = ElementFactory.make("audioconvert", "audioconvert");
        wavenc = ElementFactory.make("wavenc", "wavenc");
        filesink = ElementFactory.make("filesink", "filesink");

        gnlurisource.set("uri", "file:///home/user/Desktop/file-source.wav");
        gnlurisource.set("inpoint", TimeUnit.SECONDS.toNanos(2));
        gnlurisource.set("duration", TimeUnit.SECONDS.toNanos(3));
        filesink.set("location", "/home/user/Desktop/file-destination.wav");

        pipe.addMany(gnlurisource, audioconvert, wavenc, filesink);
        Element.linkMany(gnlurisource, audioconvert, wavenc, filesink);

        pipe.setState(State.PLAYING);
        Gst.main();
        Gst.quit();
    }
}
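In case it helps: gnlurisource, like decodebin in the first question on this page, exposes its source pad dynamically, so the static Element.linkMany call may never connect it. A sketch of linking it via the pad-added callback instead (untested; assuming the gst1-java-core signal API and an extra import of org.freedesktop.gstreamer.Pad):

// Untested sketch: link gnlurisource to audioconvert once its pad appears,
// instead of relying on Element.linkMany at construction time.
gnlurisource.connect(new Element.PAD_ADDED() {
    @Override
    public void padAdded(Element source, Pad newPad) {
        newPad.link(audioconvert.getStaticPad("sink"));
    }
});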

Vala gstreamer link failed

I have a problem playing an MJPEG stream in Vala.
I have constructed my pipeline, and it works with only two elements (videotestsrc and cluttersink), but when I add more I get an "Internal data flow error" and "streaming task paused, reason not-linked (-1)".
If I run the pipeline manually it works:
gst-launch souphttpsrc location=http://mjpeg.sanford.io/count.mjpeg ! multipartdemux ! jpegdec ! autovideosink
Here is my streaming class:
public class Stream : Clutter.Actor {
    Clutter.Texture video;
    public dynamic Gst.Element playbin;
    public Gst.Pipeline pipeline;
    public Gst.Element demux;
    public Gst.Element jpegdec;
    public Gst.Element outputsink;
    public dynamic Gst.Element src;
    public dynamic Gst.Element video_sink;

    public Stream () {
        print ("stream");
        video = new Clutter.Texture ();

        this.pipeline = new Gst.Pipeline ("videopipeline");
        this.src = Gst.ElementFactory.make ("souphttpsrc", "httpsrc");
        this.demux = Gst.ElementFactory.make ("multipartdemux", "demux");
        this.jpegdec = Gst.ElementFactory.make ("jpegdec", "jpegdec");
        this.outputsink = Gst.ElementFactory.make ("autovideosink", "output");
        this.video_sink = Gst.ElementFactory.make ("cluttersink", "source");
        this.video_sink.texture = video;
        this.src.set ("location", "http://mjpeg.sanford.io/count.mjpeg");

        this.pipeline.add_many (this.src, this.demux, this.jpegdec, this.outputsink, this.video_sink);
        this.src.link (this.demux);
        this.demux.link (this.jpegdec);
        this.jpegdec.link (this.outputsink);
        this.outputsink.link (this.video_sink);

        this.add_child (video);
        this.pipeline.set_state (Gst.State.PLAYING);
    }
}
Here is the full error log:
http://pastebin.com/b9GnA5ke
You can't have two sink elements attached to jpegdec. If you need to do that, you should use the tee element, making sure to add a queue to each branch of the tee.
There may also be a caps issue going from jpegdec to cluttersink. I'd structure it as follows:
souphttpsrc ! multipartdemux ! jpegdec ! tee name=t ! queue ! videoconvert ! autovideosink
t. ! queue ! videoconvert ! cluttersink
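A rough Vala translation of that layout, reusing the element fields from the class above (a sketch, not tested; note that multipartdemux creates its source pad dynamically, so it is linked in a pad_added handler, like decodebin in the first question):

// Sketch: decode once, split with a tee, one queue per branch.
var tee    = Gst.ElementFactory.make ("tee", "t");
var queue1 = Gst.ElementFactory.make ("queue", "q1");
var queue2 = Gst.ElementFactory.make ("queue", "q2");
var conv1  = Gst.ElementFactory.make ("videoconvert", "conv1");
var conv2  = Gst.ElementFactory.make ("videoconvert", "conv2");

this.pipeline.add_many (this.src, this.demux, this.jpegdec, tee,
                        queue1, conv1, this.outputsink,
                        queue2, conv2, this.video_sink);
this.src.link (this.demux);
// multipartdemux pads appear per stream part, so link jpegdec lazily.
this.demux.pad_added.connect ((pad) => {
    pad.link (this.jpegdec.get_static_pad ("sink"));
});
this.jpegdec.link (tee);
tee.link (queue1);
queue1.link (conv1);
conv1.link (this.outputsink);
tee.link (queue2);
queue2.link (conv2);
conv2.link (this.video_sink);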
