I have been using the rodio crate to play audio from a local file by going through the docs, but I can't figure out how to play audio from a URL.
Here is a simple example using blocking reqwest. This downloads the entire audio file into memory before it starts playing.
use std::io::Cursor;
use rodio::Source;

fn main() {
    // Remember to enable the "blocking" feature for reqwest in Cargo.toml
    let resp = reqwest::blocking::get("http://websrvr90va.audiovideoweb.com/va90web25003/companions/Foundations%20of%20Rock/13.01.mp3")
        .unwrap();
    let cursor = Cursor::new(resp.bytes().unwrap()); // The Cursor adds Read and Seek to the bytes
    let source = rodio::Decoder::new(cursor).unwrap(); // Decoder requires its source to impl both Read and Seek
    let device = rodio::default_output_device().unwrap();
    rodio::play_raw(&device, source.convert_samples()); // Plays on a different thread

    // Don't exit immediately, so we can hear the audio
    loop {
        std::thread::sleep(std::time::Duration::from_secs(1));
    }
}
If you want to implement actual streaming, where parts of the audio file are downloaded and played while more gets fetched as needed, it gets quite a bit more complicated. See this entry about partial downloads with HTTP range headers in the Rust Cookbook: https://rust-lang-nursery.github.io/rust-cookbook/web/clients/download.html#make-a-partial-download-with-http-range-headers
I believe it can also be done more easily with async reqwest, but I am still experimenting with that myself.
When I was using the hyper server, I wanted to be able to shut it down gracefully. So I was using with_graceful_shutdown() to get a Future.
The following code works fine.
let signer = Arc::new(Notify::new());
let waiter = signer.clone();
let graceful = server.with_graceful_shutdown(waiter.notified());
tokio::spawn(async move {
    time::sleep(ONE_SEC).await;
    signer.notify_one();
});
graceful.await
...
But now I want to save the `graceful` variable above into a structure so that I can easily call .await at the right time to start the server, instead of starting it directly in this function.
However, I seem to be having a hard time writing out the exact type of `graceful`. Can someone help me?
I've been trying to read serial data from a Feather M0, but for some reason I can't read the data into a buffer. This device is for sure outputting serial data, and both PlatformIO and the Arduino IDE show serial data in their respective serial monitors. However, it will timeout when I'm reading it in Rust, every single time, no matter what timeout value I have it set to. Here is my code:
// First, find the serial port
let port_info = find_receiver();

// If we didn't find a port, then we can't continue
if port_info.is_none() {
    panic!("Could not find a serial port");
}

let mut port = serialport::new(port_info.unwrap().port_name, 9600)
    .timeout(Duration::from_millis(1000))
    .open()
    .expect("Could not open serial port");

let mut serial_buf: Vec<u8> = vec![0; 8];
loop {
    port.read_exact(serial_buf.as_mut_slice()).unwrap(); // read_exact returns (), not a byte count
    println!("Buffer is {:?}", serial_buf);
}
The find_receiver() function simply scans the open ports and returns the one that I want to connect to. I'm on Windows, so in this case it's usually COM9 or COM12. I would like this to be a cross-platform application, so I'm not using the open_native() function that serialport provides.
I've tried varying the size of the buffer from 1 byte to 1000, I've tried different versions of read on the port, I've tried skipping over timeout errors, and I've tried directly outputting the read bytes to io::Stdout. Any ideas on what to do?
Apparently, the serialport crate that I was using requires you to call
port.write_data_terminal_ready(true);
in order for it to start reading data. On Linux this works perfectly fine without it. RIP the 4 hours I spent trying to change which IO reader I was using.
I'm trying to make a music player where the user can play any audio file inside a folder. To do this, I'm trying to spawn entities containing a Music component and a Sound(Handle<AudioSource>) component. In the Bevy examples, I saw this line of code that seemed to be what I wanted:
// You can load all assets in a folder like this. They will be loaded in parallel without blocking
let _scenes: Vec<HandleUntyped> = asset_server.load_folder("models/monkey").unwrap();
Here is the function I wrote:
fn load_audio(mut commands: Commands, asset_server: Res<AssetServer>, audio: Res<Audio>) {
    let music = asset_server.load_folder("music").unwrap();
    for song in music {
        commands.spawn((
            Music,
            Sound(song),
        ));
    }
}
This code gives a compilation error because song has the type HandleUntyped. My first idea was to convert the HandleUntyped into a Handle<AudioSource>. I have to imagine there is some way to do this, or else HandleUntyped would be pretty useless, but looking through the Bevy docs I can't find any way to do it. Handle::<AudioSource>::from(song) didn't work. I've also considered using the std::fs library to get all the audio files in the directory and load each of them individually with Bevy, but the existence of the load_folder method seems to imply that Bevy has a more elegant and simple way of doing this.
Of course, immediately after giving up the search and posting a question, I found the answer. HandleUntyped has a typed() method, which converts it into a typed Handle. All I needed to do was replace Sound(song) with Sound(song.typed()).
I'm trying to play some sounds with rodio.
I am creating a Source and putting it into a Sink, but how can I know when one or the other has stopped playing? For example, I want to move to the next song after the first.
let device = rodio::default_output_device().unwrap();
let sink = Sink::new(&device);
let file = File::open(path).unwrap();
let source = rodio::Decoder::new(BufReader::new(file)).unwrap();
sink.append(source);
I have found nothing in the rodio docs about a callback or anything similar. There is a Done struct, but it's not clear to me how to use it, or even if it is the thing I'm looking for.
I think you're looking for Sink::empty, which returns true once there are no more sounds left in the sink to play. There is also Sink::sleep_until_end, which blocks the current thread until everything queued in the sink has finished, which is handy for playing songs back to back.
When I pipe something like an image file through a stream, is there any way to send a meta object along with it?
My server gets sent an image from a user. The image gets pushed through a set of streams that perform various actions.
The final stream emits a data event and passes the resulting image buffer into a callback, but I lose all context for the user. I need to keep the resulting image tied to the user's id and some other metadata.
Ideal:
stream.on('data', function(img, meta) {
    ...
});
Thanks for any possible solutions!
In short, no, there's nothing built into Node.js to support including metadata with streams. You do have some other options, though, including:
You could use a closure to track the meta data separately from the stream. For example:
function handleImage(imageStream) {
    var meta = {...};
    imageStream.pipe(otherStreams).on('data', function(image) {
        // you now have `image` and `meta` variables at your disposal here.
    });
}
The downside of this is that the metadata is not available to your otherStreams.
This is a good solution if your other streams are third-party code outside of your control, or if they don't need to know about the metadata.
You could do something similar to HTTP headers, where all the data up to a certain point is metadata, and everything after it is the image. (In HTTP, the delimiter is wherever \n\n occurs first.) All of your streams in the chain have to know about this and handle it, though.
If you know your metadata will always be in one chunk and none of your streams split or merge chunks, then you could simplify this a bit and just say that the first (or last) chunk is always metadata.
Switch to an object stream like Amoli mentioned in his answer. Here you would pass {image: imgData, meta: {...}}. You would then have to update your other streams to expect this format.
The main downside of this method, though, is that you either have to pass the metadata multiple times, cache it somewhere for each stream that needs it, or pass your entire image as one chunk (which kind of defeats the entire point of streams). And, from what I've been told, Node.js can optimize text/binary streams better than object streams. So this probably isn't a good approach for your situation.
https://github.com/dominictarr/mux-demux might be helpful here. It combines multiple streams into one, so you could have separate image and meta streams. I'm not sure how well it would work for your situation though. You'd probably need to update all of your streams to be aware of it.
I know I said that all but the first option require modifying the other streams, but there is a way around that: you could create a generic "stream wrapper" that splits up the image and meta data and passes just the image data through the main stream, and has the meta data bypass it and go on to the next one down the chain. This gets ugly fast though, so probably not the best idea.
Basically, whenever you want to read or write any objects which are not strings or buffers, you'll need to put your stream into objectMode.
Example (source):
var stream = require('stream');
var util   = require('util');

function S3Lister(s3, options) {
    options || (options = {});
    stream.Readable.call(this, { objectMode: true });

    this.s3         = s3; // a knox-like client.
    this.marker     = options.start;
    this.connecting = false;
    this.ended      = false;
}
util.inherits(S3Lister, stream.Readable);
We set the stream to use objectMode as we want to return not just data but also some metadata.
For more information:
Node.js Docs stream object mode
An introduction to Node's streams
I created a module called metastream for this type of thing (it's on npm).