I'm new to Rust and I stitched together a program for playing sounds based on a sequence that will be generated by my secret-sauce algorithm.
However, I hear no sound when I use a sequence of integers as test data.
I have no clue which values are valid in the number sequence, which values are audible, and which values are loud enough to risk losing my hearing if I play them.
I'm also struggling with Rust's Result wrapper concept; in my code I'm using three different ways to unwrap, some of which are probably inappropriate.
Here is my code:
fn main() {
    use std::process::exit;

    if let Err(e) = run() {
        println!("Failed to run basic example: {}", e);
        exit(1);
    }
}

extern crate alto;
use alto::{Alto, AltoResult, Mono, Source};

fn run() -> AltoResult<()> {
    use std::sync::Arc;

    let alto = Alto::load_default()?;

    for s in alto.enumerate_outputs() {
        println!("Found device: {}", s.to_str().unwrap());
    }

    let device = alto.open(None)?; // Opens the default audio device
    let context = device.new_context(None)?; // Creates a default context

    // Configure listener
    context.set_position([1.0, 4.0, 5.0])?;
    context.set_velocity([2.5, 0.0, 0.0])?;
    context.set_orientation(([0.0, 0.0, 1.0], [0.0, 1.0, 0.0]))?;

    let mut _source = context.new_static_source()?;

    // Now you can load your samples and store them in a buffer with
    // `context.new_buffer(samples, frequency)`;
    let data: Vec<_> = (10..200u32).map(|i| i as i16).collect();
    let buffer = context.new_buffer::<Mono<i16>, _>(data, 44_100);
    let buf = Arc::new(buffer.unwrap());

    let good_result = _source.set_buffer(buf);
    assert!(good_result.is_ok() && !good_result.is_err());

    _source.play();
    Ok(())
}
Desired result: Play some sounds from a sequence of integers (not from an audio file).
I also would like to know how to append to a Vec (vector).
Looks like this is very close to working; just a couple of issues:
It appears that the call to _source.play() is non-blocking, which means the process exits almost immediately after the audio starts. You can prevent this by inserting a call to thread::sleep.
The sample data (10..200) is only 190 samples long, which at 44100 Hz lasts only 190 / 44100 seconds, or about 0.004 seconds. Also, the waveform might not be audible because there is no oscillation in it, only a single linear ramp in the position/pressure of the speaker. You can get something audible by generating a sinusoid or some other repeating waveform.
Here is a modification of your example which is working for me, generating a 220 Hz sinusoid for 2 seconds:
use alto::{Alto, AltoResult, Mono, Source};
use std::{thread, time};

fn main() {
    use std::process::exit;

    if let Err(e) = run() {
        println!("Failed to run basic example: {}", e);
        exit(1);
    }
}

fn run() -> AltoResult<()> {
    use std::sync::Arc;

    let alto = Alto::load_default()?;

    for s in alto.enumerate_outputs() {
        println!("Found device: {}", s.to_str().unwrap());
    }

    let device = alto.open(None)?; // Opens the default audio device
    let context = device.new_context(None)?; // Creates a default context

    // Configure listener
    context.set_position([1.0, 4.0, 5.0])?;
    context.set_velocity([2.5, 0.0, 0.0])?;
    context.set_orientation(([0.0, 0.0, 1.0], [0.0, 1.0, 0.0]))?;

    let mut _source = context.new_static_source()?;

    // Now you can load your samples and store them in a buffer with
    // `context.new_buffer(samples, frequency)`;
    let pi = std::f32::consts::PI;
    let data: Vec<_> = (0..88200u32)
        .map(|i| ((i16::MAX as f32) * f32::sin(2.0 * pi * (i as f32) * 220.0 / 44100.0)) as i16)
        .collect();
    let buffer = context.new_buffer::<Mono<i16>, _>(data, 44_100);
    let buf = Arc::new(buffer.unwrap());

    let good_result = _source.set_buffer(buf);
    assert!(good_result.is_ok() && !good_result.is_err());

    _source.play();

    thread::sleep(time::Duration::from_millis(2000));
    Ok(())
}
To answer your question about the range of valid values in the sequence: any i16 value is valid, so you'd want to provide values between i16::MIN and i16::MAX, i.e., between -32768 and +32767. For highest quality, you probably want your audio to be normalized so that at its loudest parts it hits (or comes close to hitting) that maximum value of 32767, and you can adjust your system volume level to prevent this from causing discomfort or hearing loss.
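To make the loudness point concrete, and since you also asked how to append to a Vec, here is a hedged variation of the sample generation above; the 0.5 and 0.25 amplitude factors are arbitrary choices of mine, not anything alto requires:
let pi = std::f32::consts::PI;
let amplitude = 0.5 * i16::MAX as f32; // half of full scale; pick whatever is comfortable

// 2 seconds of a 220 Hz tone at 44.1 kHz
let mut data: Vec<i16> = (0..88_200u32)
    .map(|i| (amplitude * f32::sin(2.0 * pi * i as f32 * 220.0 / 44_100.0)) as i16)
    .collect();

// Appending to a Vec: `push` adds one element, `extend` appends many.
data.push(0); // a single silent sample
data.extend((0..44_100u32).map(|i| {
    (0.25 * i16::MAX as f32 * f32::sin(2.0 * pi * i as f32 * 440.0 / 44_100.0)) as i16
})); // one more second of a quieter 440 Hz tone
The resulting Vec can be handed to context.new_buffer exactly as in the example above.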
Related
I'm trying to make a NIH-Plug sample player. This wouldn't be a pitch-shifting plugin; instead, each note triggers a different sample (note A1 -> A1.wav). Are there any resources about this subject? I'm also trying to cache the samples, but it's not really working.
Ultimately, I'm going to use include_dir instead of include_bytes.
Here's a PolyModSynth demo, which I'm using and trying to adapt to playing samples.
Thank you.
let C1 = include_bytes!("samples/c1.wav");
let CS1 = include_bytes!("samples/c#1.wav");

fn process(
    &mut self,
    buffer: &mut Buffer,
    _aux: &mut AuxiliaryBuffers,
    context: &mut impl ProcessContext<Self>
) -> ProcessStatus {
    let mut next_event = context.next_event();
    for (sample_id, wav_sample) in buffer.iter_samples().zip(wav_vec) {
        // Act on the next MIDI event
        while let Some(event) = next_event {
            if event.timing() > (sample_id as u32) {
                break;
            }
            // handle the MIDI event as needed
            // ...
            next_event = context.next_event();
        }
        // Set the sample value in the output buffer
        for sample in channel_samples {
            *sample = self.sample_value[sample_id];
        }
    }
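Not a full answer, but on the caching side one straightforward pattern is to decode each embedded WAV exactly once at plugin initialization and keep the decoded samples in a map keyed by note name, so the audio thread only ever reads from the cache. A rough sketch, assuming the hound crate for decoding 16-bit PCM WAVs and plain string keys (both are my choices, not anything NIH-Plug prescribes):
use std::collections::HashMap;
use std::io::Cursor;

// Decode every embedded WAV once and cache the raw samples by note name.
fn build_sample_cache() -> HashMap<&'static str, Vec<f32>> {
    let mut cache = HashMap::new();
    for (note, bytes) in [
        ("C1", include_bytes!("samples/c1.wav").as_slice()),
        ("C#1", include_bytes!("samples/c#1.wav").as_slice()),
    ] {
        let mut reader =
            hound::WavReader::new(Cursor::new(bytes)).expect("embedded WAV should be valid");
        // Convert 16-bit integer samples to f32 in [-1.0, 1.0].
        let samples: Vec<f32> = reader
            .samples::<i16>()
            .map(|s| s.expect("valid sample") as f32 / i16::MAX as f32)
            .collect();
        cache.insert(note, samples);
    }
    cache
}
The returned map could then be stored in the plugin struct when it is created, before process() ever runs, so playback only indexes into already-decoded data.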
Sorry, I'm a complete newbie to Rust. I'm trying to read the temperature from the DS18B20 sensor on a Raspberry Pi using the code provided on this site: https://github.com/fuchsnj/ds18b20
Specifically, I want to call the function get_temperature, but I have no idea how to declare the parameters, especially delay and one_wire_bus.
I was able to resolve all the 'namespaces' or name bindings (sorry, I'm coming from C++) but got stuck on the parameters. Can someone give me an example of how to call and use this function, like this:
use ds18b20::{Resolution, Ds18b20};
use embedded_hal::blocking::delay::{DelayUs, DelayMs};
use embedded_hal::digital::v2::{OutputPin, InputPin};
use one_wire_bus::{self, OneWire, OneWireResult};
use core::fmt::Debug;
use std::io::Write;

fn main() {
    let mut delay = ?????;
    let mut one_wire_bus = ?????;
    let mut tx = ?????; //&mut Vec::new();

    let temp = get_temperature(delay, tx, one_wire_bus);
    ...
    //do something with the temp
    ...
}
This is the implementation of the function from the website
fn get_temperature<P, E>(
    delay: &mut (impl DelayUs<u16> + DelayMs<u16>),
    tx: &mut impl Write,
    one_wire_bus: &mut OneWire<P>,
) -> OneWireResult<(), E>
where
    P: OutputPin<Error=E> + InputPin<Error=E>,
    E: Debug
{
    // initiate a temperature measurement for all connected devices
    ds18b20::start_simultaneous_temp_measurement(one_wire_bus, delay)?;

    // wait until the measurement is done. This depends on the resolution you specified
    // If you don't know the resolution, you can obtain it from reading the sensor data,
    // or just wait the longest time, which is the 12-bit resolution (750ms)
    Resolution::Bits12.delay_for_measurement_time(delay);

    // iterate over all the devices, and report their temperature
    let mut search_state = None;
    loop {
        if let Some((device_address, state)) = one_wire_bus.device_search(search_state.as_ref(), false, delay)? {
            search_state = Some(state);

            if device_address.family_code() != ds18b20::FAMILY_CODE {
                // skip other devices
                continue;
            }

            // You will generally create the sensor once, and save it for later
            let sensor = Ds18b20::new(device_address)?;

            // contains the read temperature, as well as config info such as the resolution used
            let sensor_data = sensor.read_data(one_wire_bus, delay)?;
            writeln!(tx, "Device at {:?} is {}°C", device_address, sensor_data.temperature);
        } else {
            break;
        }
    }
    Ok(())
}
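In case it helps, here is a rough sketch of how those parameters might be constructed on a Raspberry Pi under Linux, assuming the rppal crate with its embedded-hal support (rppal::hal::Delay for the delay, and a GPIO pin converted into an IoPin so that it implements both InputPin and OutputPin). I haven't run this on your hardware; the pin number and error handling are placeholders:
use one_wire_bus::OneWire;
use rppal::gpio::{Gpio, Mode};
use rppal::hal::Delay;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Delay provider implementing embedded-hal's DelayUs/DelayMs.
    let mut delay = Delay::new();

    // BCM pin 4 is a common choice for the 1-Wire data line; adjust to your wiring.
    let pin = Gpio::new()?.get(4)?.into_io(Mode::Output);
    let mut one_wire_bus = OneWire::new(pin).expect("failed to initialize the 1-Wire bus");

    // Anything implementing std::io::Write can serve as the output sink.
    let mut tx = std::io::stdout();

    get_temperature(&mut delay, &mut tx, &mut one_wire_bus)
        .expect("failed to read temperature");
    Ok(())
}
Note that get_temperature takes all three parameters by mutable reference, so they are passed as &mut here rather than by value as in the skeleton above.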
I would like to generate a MIDI clock signal over UART on an STM32F1 "Blue Pill" board. The signal basically just needs to send one byte (0xF8) at a maximum frequency of 128 Hz to a 31250 bps serial interface. I created a minimal example using one of the STM32F1's timers. The problem is that the received signal on my MIDI gear does not seem to be very stable: it jumps between 319 and 321 bpm, whereas it should show a stable clock of 320 bpm for a 128 Hz signal (the conversion formula: freq = bpm * 24 / 60). Do you have any idea why there is so much jitter? Is it the serial implementation that creates the jitter, could it be a hardware problem, or is it the HAL abstraction layer that introduces it?
These are the time differences between the clock signals I measured at 124 Hz:
The y axis is the time difference in microseconds and the x axis is the reading index. 8000 µs should be the correct time interval between signals, but at regular intervals there seems to be a signal fired with a time difference of only ~500 µs. What could cause that? Maybe a counter overflow?
After reducing the prescaler to 12 MHz I got this pattern:
Here is the code that generates the clock signal:
#![no_std]
#![no_main]

use cortex_m_rt::entry;
use stm32f1xx_hal::{
    pac,
    pac::{interrupt, Interrupt, TIM4},
    prelude::*,
    gpio,
    afio,
    serial::{Serial, Config},
    timer::{Event, Timer, CountDownTimer},
};
use core::mem::MaybeUninit;
use stm32f1xx_hal::pac::{USART2};

pub use embedded_hal::digital::v2::{OutputPin, InputPin};

pub type Usart2Serial = Serial<
    USART2, (gpio::gpioa::PA2<gpio::Alternate<gpio::PushPull>>,
             gpio::gpioa::PA3<gpio::Input<gpio::Floating>>)>;

// When a panic occurs, stop the microcontroller
#[allow(unused_imports)]
use panic_halt;

static mut G_TIM2: MaybeUninit<CountDownTimer<TIM4>> = MaybeUninit::uninit();
static mut G_SERIAL: MaybeUninit<Usart2Serial> = MaybeUninit::uninit();

#[entry]
fn main() -> ! {
    let dp = pac::Peripherals::take().unwrap();
    let rcc = dp.RCC.constrain();
    let mut flash = dp.FLASH.constrain();

    let clocks = rcc.cfgr
        .use_hse(8.mhz())   // set clock frequency to external 8mhz oscillator
        .sysclk(72.mhz())   // set sysclock
        .pclk1(36.mhz())    // clock for apb1 prescaler -> TIM1
        .pclk2(36.mhz())    // clock for apb2 prescaler -> TIM2,3,4
        .adcclk(12.mhz())   // clock for analog digital converters
        .freeze(&mut flash.acr);

    let mut apb1 = rcc.apb1;
    let mut apb2 = rcc.apb2;
    let mut gpioa = dp.GPIOA.split(&mut apb2);
    let mut afio = dp.AFIO.constrain(&mut apb2);

    // init serial
    let mut serial = init_usart2(dp.USART2, gpioa.pa2, gpioa.pa3, &mut gpioa.crl, &mut afio, &clocks, &mut apb1);
    unsafe { G_SERIAL.write(serial) };

    // init timer
    let bpm = 320;
    let frequency_in_hertz: u32 = (bpm as u32) * 24 / 60;
    let mut timer = Timer::tim4(dp.TIM4, &clocks, &mut apb1).start_count_down((frequency_in_hertz).hz());
    timer.listen(Event::Update);

    // write to global static var
    unsafe { G_TIM2.write(timer); }

    cortex_m::peripheral::NVIC::unpend(Interrupt::TIM4);
    unsafe {
        cortex_m::peripheral::NVIC::unmask(Interrupt::TIM4);
    }

    loop {
        // do nothing
    }
}

fn init_usart2(
    usart2: USART2,
    pa2: gpio::gpioa::PA2<gpio::Input<gpio::Floating>>,
    pa3: gpio::gpioa::PA3<gpio::Input<gpio::Floating>>,
    crl: &mut gpio::gpioa::CRL,
    afio: &mut afio::Parts,
    clocks: &stm32f1xx_hal::rcc::Clocks,
    apb1: &mut stm32f1xx_hal::rcc::APB1
) -> Usart2Serial {
    let tx = pa2.into_alternate_push_pull(crl);
    let rx = pa3;

    return Serial::usart2(
        usart2,
        (tx, rx),
        &mut afio.mapr,
        Config::default().baudrate(31250.bps()),
        *clocks,
        apb1,
    );
}

#[interrupt]
fn TIM4() {
    let serial = unsafe { &mut *G_SERIAL.as_mut_ptr() };
    serial.write(0xF8).ok();

    let tim2 = unsafe { &mut *G_TIM2.as_mut_ptr() };
    tim2.clear_update_interrupt_flag();
}
OK, I found the solution myself. It seems to be connected with the timer's counter: I guess it overflows and triggers the timer at the wrong interval.
Adding the following line to the timer interrupt function resets the counter and removes the jitter:
#[interrupt]
fn TIM4() {
    ...
    tim2.reset();
}
You need to confirm the priority of your interrupt to make sure there aren't other, higher-priority interrupts that delay the UART output.
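For illustration, with the cortex-m crate the priority can be set on the NVIC before unmasking the interrupt, roughly in place of the existing unpend/unmask lines. A minimal sketch, assuming the core peripherals have not been taken elsewhere; the value 16 is an arbitrary choice (on the STM32F1 only the upper four bits of the priority byte are implemented, and a lower number means a higher priority):
// In main(), after configuring the timer and before entering the idle loop.
let mut core = cortex_m::Peripherals::take().unwrap();
unsafe {
    // Give TIM4 a high priority so other interrupts are less likely to delay it.
    core.NVIC.set_priority(Interrupt::TIM4, 16);
    cortex_m::peripheral::NVIC::unmask(Interrupt::TIM4);
}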
I have been working for several hours trying to simply record audio from my microphone and encode it with Opus so I can ultimately stream it and play it elsewhere. Currently, I am struggling with simply getting the result to play back from a file. I based this pretty heavily on the cpal examples as well as the libopus C example. Right now it just outputs a nonsense file that VLC can't even read. However, if I print the raw bytes as they are encoded, I can definitely tell something is happening. I have also tried messing with endianness, but it did not help at all. I have also been using rubato to resample the raw output into something Opus can use.
fn main() -> Result<(), anyhow::Error> {
    let mut encoder =
        opus::Encoder::new(48000, opus::Channels::Stereo, opus::Application::Voip).unwrap();
    let mut resampler = rubato::FftFixedInOut::<f32>::new(44100, 48000, 896, 2);

    let host = cpal::default_host();
    let device = host.default_input_device().unwrap();
    println!("Input device: {}", device.name()?);

    let config = device
        .default_input_config()
        .expect("Failed to get default input config");
    println!("Default input config: {:?}", config);

    println!("Begin recording...");
    let err_fn = move |err| {
        eprintln!("an error occurred on stream: {}", err);
    };

    let sample_format = config.sample_format();

    // let socket = std::net::UdpSocket::bind("192.168.1.82:1337")?;
    const PATH: &str = concat!(env!("CARGO_MANIFEST_DIR"), "/recorded.pcm");
    let socket = std::fs::File::create(PATH).unwrap();
    let mut socket = BufWriter::new(socket);

    let stream = device
        .build_input_stream_raw(
            &config.into(),
            sample_format,
            move |data, _: &_| write_input_data_f32(data, &mut encoder, &mut resampler, &mut socket),
            err_fn,
        )
        .unwrap();

    stream.play()?;
    std::thread::sleep(std::time::Duration::from_secs(10));
    drop(stream);
    Ok(())
}

type ResamplerHandle = rubato::FftFixedInOut<f32>;
// type SocketHandle = std::net::UdpSocket;
type SocketHandle = BufWriter<std::fs::File>;

fn write_input_data_f32(
    input: &Data,
    encoder: &mut Encoder,
    resampler: &mut ResamplerHandle,
    socket: &mut SocketHandle,
) {
    let mut inp = input.as_slice::<f32>().unwrap().to_vec();
    inp.truncate(resampler.nbr_frames_needed());
    if inp.len() < resampler.nbr_frames_needed() {
        inp.append(&mut vec![0f32; resampler.nbr_frames_needed() - inp.len()]);
    }

    let mut wave_out = resampler.process(&vec![Vec::from(inp); 2]).unwrap(); //[0].to_owned();

    use itertools::interleave;
    let v1 = wave_out[0].to_owned();
    let v2 = wave_out[1].to_owned();
    let v = interleave(v1.chunks(1), v2.chunks(1))
        .flatten()
        .copied()
        .collect::<Vec<f32>>();

    let buff = encoder.encode_vec_float(v.as_slice(), 960).unwrap();

    use std::io::Write;
    socket.write(&buff);
}
It looks like you're writing raw audio frames to a file, which most likely is not what you are looking for. Most audio files aren't just raw data; they use an audio container with a header and other features. For Opus, you most likely want to use an Ogg container for simply saving audio to a file (as opposed to, e.g., streaming).
The issue, I found, was a misunderstanding of how Opus works. It works strictly in discrete chunks, not on a continuous stream of input. So I went ahead and started streaming the packets over the network, and when I fed in the exact output chunks it worked perfectly!
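For anyone doing the same over a plain byte stream (TCP or a file), one simple, ad hoc way to preserve those chunk boundaries is to length-prefix every encoded packet, since the Opus decoder must receive each packet exactly as the encoder produced it. A sketch using only the standard library:
use std::io::{self, Read, Write};

// Write one encoded Opus packet with a 2-byte big-endian length prefix,
// so the receiver can recover the exact packet boundaries.
fn write_packet(sink: &mut impl Write, packet: &[u8]) -> io::Result<()> {
    let len = u16::try_from(packet.len()).expect("packet length fits in u16");
    sink.write_all(&len.to_be_bytes())?;
    sink.write_all(packet)
}

// Read one length-prefixed packet back on the receiving side.
fn read_packet(source: &mut impl Read) -> io::Result<Vec<u8>> {
    let mut len_buf = [0u8; 2];
    source.read_exact(&mut len_buf)?;
    let mut packet = vec![0u8; u16::from_be_bytes(len_buf) as usize];
    source.read_exact(&mut packet)?;
    Ok(packet)
}
Over UDP you get packet boundaries for free by sending one encoded packet per datagram, which may be why the network version worked right away; the prefix is mainly useful for TCP or files.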
I am trying to implement streaming of UTF-8 characters from a file. This is what I've got so far; please excuse the ugly code for now.
use std::fs::File;
use std::io;
use std::io::BufRead;
use std::str;

fn main() -> io::Result<()> {
    let mut reader = io::BufReader::with_capacity(100, File::open("utf8test.txt")?);

    loop {
        let mut consumed = 0;
        {
            let buf = reader.fill_buf()?;
            println!("buf len: {}", buf.len());

            match str::from_utf8(&buf) {
                Ok(s) => {
                    println!("====\n{}", s);
                    consumed = s.len();
                }
                Err(err) => {
                    if err.valid_up_to() == 0 {
                        println!("1. utf8 decoding failed!");
                    } else {
                        match str::from_utf8(&buf[..err.valid_up_to()]) {
                            Ok(s) => {
                                println!("====\n{}", s);
                                consumed = s.len();
                            }
                            _ => println!("2. utf8 decoding failed!"),
                        }
                    }
                }
            }
        }

        if consumed == 0 {
            break;
        }

        reader.consume(consumed);
        println!("consumed {} bytes", consumed);
    }
    Ok(())
}
I have a test file with a multibyte character at offset 98 which fails to decode as it does not fit completely into my (arbitrarily-sized) 100 byte buffer. That's fine, I just ignore it and decode what is valid up to the start of that character.
The problem is that after calling consume(98) on the BufReader, the next call to fill_buf() only returns 2 bytes... it seems to have not bothered to read any more bytes into the buffer. I don't understand why. Maybe I have misinterpreted the documentation.
Here is the sample output:
buf len: 100
====
UTF-8 encoded sample plain-text file
‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾
consumed 98 bytes
buf len: 2
1. utf8 decoding failed!
It would be nice if from_utf8() would return the partially decoded string and the position of the decoding error so I don't have to call it twice whenever this happens, but there doesn't seem to be such a function in the standard library (that I am aware of).
I encourage you to learn how to produce a Minimal, Complete, and Verifiable example. This is a valuable skill that professional programmers use to better understand problems and focus attention on the important aspects of a problem. For example, you didn't provide the actual input file, so it's very difficult for anyone to reproduce your behavior using the code you provided.
After trial-and-error, I was able to reduce your problem down to this code:
use std::io::{self, BufRead};

fn main() -> io::Result<()> {
    let mut reader = io::BufReader::with_capacity(100, io::repeat(b'a'));

    let a = reader.fill_buf()?.len();
    reader.consume(98);
    let b = reader.fill_buf()?.len();

    println!("{}, {}", a, b); // 100, 2
    Ok(())
}
Unfortunately for your case, this behavior is allowed by the contract of BufRead and is in fact almost required. The point of a buffered reader is to avoid making calls to the underlying reader as much as possible. The trait does not know how many bytes you need to read, and it doesn't know that 2 bytes isn't enough and it should perform another call. Flipping it the other way, pretend you had only consumed 1 byte out of 100 — would you want all 99 of those remaining bytes to be copied in memory and then perform another underlying read? That would be slower than not using a BufRead in the first place!
The trait also doesn't have any provisions for moving the remaining bytes in the buffer to the beginning and then filling the buffer again. This is something that seems like it could be added to the concrete BufReader, so you may wish to provide a pull request to add it.
For now, I'd recommend using Read::read_exact at the end of the buffer:
use std::io::{self, BufRead, Read};

fn main() -> io::Result<()> {
    let mut reader = io::BufReader::with_capacity(100, io::repeat(b'a'));

    let a = reader.fill_buf()?.len();
    reader.consume(98);

    let mut leftover = [0u8; 4]; // a single UTF-8 character is at most 4 bytes
    // Assume we know we need 3 bytes based on domain knowledge
    reader.read_exact(&mut leftover[..3])?;

    let b = reader.fill_buf()?.len();

    println!("{}, {}", a, b); // 100, 99
    Ok(())
}
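As an aside, if you ever manage the buffer yourself instead of going through BufReader, the shift-the-remainder-to-the-front approach mentioned above is not much code. A rough sketch of the idea (my own, not drop-in for the program above):
use std::io::{self, Read};

// Repeatedly refill `buf`, keeping any undecoded trailing bytes from the
// previous round at the front of the buffer.
fn read_utf8_chunks(mut source: impl Read) -> io::Result<()> {
    let mut buf = vec![0u8; 100];
    let mut leftover = 0; // bytes carried over from the previous iteration

    loop {
        let n = source.read(&mut buf[leftover..])?;
        if n == 0 {
            // EOF: any carried-over bytes form an incomplete sequence.
            return if leftover == 0 {
                Ok(())
            } else {
                Err(io::Error::new(io::ErrorKind::InvalidData, "incomplete UTF-8 at EOF"))
            };
        }
        let filled = leftover + n;

        // Decode as much of the buffer as is valid UTF-8.
        let valid = match std::str::from_utf8(&buf[..filled]) {
            Ok(s) => {
                println!("====\n{}", s);
                filled
            }
            Err(e) => {
                let s = std::str::from_utf8(&buf[..e.valid_up_to()]).unwrap();
                println!("====\n{}", s);
                e.valid_up_to()
            }
        };
        if valid == 0 && filled == buf.len() {
            return Err(io::Error::new(io::ErrorKind::InvalidData, "invalid UTF-8"));
        }

        // Move the undecoded tail to the front for the next read.
        buf.copy_within(valid..filled, 0);
        leftover = filled - valid;
    }
}
Here the bytes of a character split across reads simply stay at the front of the buffer and are completed by the next read, so nothing needs to be decoded twice.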
See also:
What is the maximum number of bytes for a UTF-8 encoded character?