I have an iPad application that records audio from the microphone.
The audio formats it must produce are PCM, MP3, and WAV files. I get the MP3 by taking the original raw file and converting it with LAME.
Unfortunately, I have not found any example that shows how to convert a PCM file to a WAV file.
I just noticed that if I give the file a .wav extension, the application saves the raw data without problems, so I suspect there is no actual conversion from PCM to WAV.
Correct?
PS: Sorry for my english ... I use Google Translate
WAV is a kind of box, and PCM is what goes in the box. There are many container formats like this. MP4, for example, can contain audio, video, or both, and it can also contain multiple video or audio streams. Or take zip files: a zip can contain text files, but it can also contain images, PDFs, and so on. Yet you can't ask "how can I convert a zip file to the text file inside the zip".
If you want to convert PCM data to a WAV file, you shouldn't have many problems, because WAV files are quite simple. Take a look at the canonical WAVE header layout (see also the "WAVE PCM soundfile format" page).
You first write that header, and after it you can simply append all your PCM data (it goes in the data field).
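For illustration, here is a minimal C sketch of that approach: it writes the canonical 44-byte header for 16-bit PCM and then appends the raw samples. The function name and parameters are my own, and it assumes a little-endian host (which matches the WAV byte order):

#include <stdio.h>
#include <stdint.h>

/* Write a canonical 44-byte WAV header for 16-bit PCM, then append the raw samples. */
static void write_wav(FILE *out, const int16_t *pcm, uint32_t num_frames,
                      uint32_t sample_rate, uint16_t channels)
{
    uint32_t data_bytes  = num_frames * channels * sizeof(int16_t);
    uint32_t byte_rate   = sample_rate * channels * sizeof(int16_t);
    uint16_t block_align = channels * sizeof(int16_t);
    uint32_t riff_size   = 36 + data_bytes;   /* file size minus "RIFF" and this field */
    uint32_t fmt_size    = 16;
    uint16_t fmt_pcm     = 1;                 /* format tag 1 = PCM */
    uint16_t bits        = 16;

    fwrite("RIFF", 1, 4, out); fwrite(&riff_size,   4, 1, out);
    fwrite("WAVE", 1, 4, out);
    fwrite("fmt ", 1, 4, out); fwrite(&fmt_size,    4, 1, out);
    fwrite(&fmt_pcm,     2, 1, out); fwrite(&channels,  2, 1, out);
    fwrite(&sample_rate, 4, 1, out); fwrite(&byte_rate, 4, 1, out);
    fwrite(&block_align, 2, 1, out); fwrite(&bits,      2, 1, out);
    fwrite("data", 1, 4, out); fwrite(&data_bytes,  4, 1, out);
    fwrite(pcm, 1, data_bytes, out);          /* the "data" chunk: raw PCM */
}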
Converting PCM to WAV isn't too hard. Both formats contain raw PCM data; the only difference is that a WAV file has a header while a raw PCM file doesn't. So just adding a WAV header does the trick: take the PCM data and put the WAV header on top of it. To see how to combine a WAV header with PCM data, check this link.
I was working on a system that accepts only WAV files, but the audio I was receiving from Amazon Polly was PCM, so I finally did this and got my issue resolved. Hope it helps someone. This is a Node.js example.
// https://github.com/TooTallNate/node-wav
const FileWriter = require('wav').FileWriter;
const Stream = require('stream');

// res.AudioStream is the raw PCM payload returned by Amazon Polly.
let audioStream = bufferToStream(res.AudioStream);

// FileWriter prepends a WAV header matching these parameters to the piped PCM data.
var outputFileStream = new FileWriter(`${outputFileFolder}/wav/${outputFileName}.wav`, {
  sampleRate: 8000,
  channels: 1
});
audioStream.pipe(outputFileStream);

// Wrap a raw buffer in a readable stream so it can be piped.
function bufferToStream(binary) {
  const readableInstanceStream = new Stream.Readable({
    read() {
      this.push(binary);
      this.push(null);
    }
  });
  return readableInstanceStream;
}
I'm trying to receive a conversation streamed from Twilio in 8 kHz mu-law, and I want to convert it to 16 kHz PCM for some processing (which doesn't support the 8 kHz mu-law format). I tried this method, without success:
- convert the string payload to a base64 buffer;
- convert the buffer to a Uint8Array with this package: buffer-to-uint8array;
- convert the Uint8Array to an Int16Array with this package: alawmulaw;
- then use the wav library to write the results.
I am still unable to get a valid audio file following this process. Can someone tell me what I am doing wrong, or guide me to achieve this?
I've had good luck using the WaveFile lib (https://www.npmjs.com/package/wavefile):
const { WaveFile } = require('wavefile');

// payload is the base64-encoded 8 kHz mu-law audio received from Twilio.
const wav = new WaveFile();
wav.fromScratch(1, 8000, '8m', Buffer.from(payload, "base64"));
// Decode mu-law to 16-bit linear PCM.
wav.fromMuLaw();
// You can resample.
wav.toSampleRate(16000);
// You can write this straight to a file (it will have the WAV header).
const results = wav.toBuffer();
// Or you can access the samples without the WAV header.
const samples = wav.data.samples;
Hope that helps!
I am able to upload the AMR file to the SIM800C successfully.
I then play the uploaded audio file during the call using the command below:
#if CALL_RECORDED_AUDIO
  // "Command Media PLAY" -> play an audio file that was recorded on the module
  Serial1.print("AT+CMEDPLAY=1,C:\\REC\\");
#else
  // "Command Media PLAY" -> play an audio file that was uploaded
  Serial1.print("AT+CMEDPLAY=1,C:\\User\\");
#endif
The audio played from C:\User\ always has noise. However, if I record the audio during a call and save it, then play that recording during the next call, there is no noise (by defining CALL_RECORDED_AUDIO in the snippet above).
According to the SIM800 documentation, a sound played during a call must be a WAV file:
Note
. Modes 2 and 3 are not supported when playing an audio file during a call.
. The audio file cannot be played during an incoming or outgoing call.
. Only the WAV, PCM, AMR and MP3 formats are supported.
. Only the WAV format at 8 kHz, 16-bit is supported during a call.
(pages 201/202 of the SIM800 guide)
Personally I could not test this, having no SIM800.
I think the file you play during a call must be in .WAV format.
Let me know if it works.
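If it helps, a conversion along the lines of ffmpeg -i input.amr -ar 8000 -ac 1 -acodec pcm_s16le output.wav (hypothetical file names) should produce the 8 kHz, 16-bit WAV the module expects during a call.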
I'm trying to create a wav file from multiple other wav files.
I use AVAsset, AVAssetReader and AVAssetWriter.
The format setting used for the AVAssetWriterInput and AVAssetReaderAudioMixOutput is created like this:
AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44100, channels: 2, interleaved: true)
And the AVAssetWriter is created like this: AVAssetWriter(url: outputURL, fileType: .wav)
By the way, I noticed two weird things:
1) When I create an AVAsset from a WAV file, I don't get any metadata.
The asset creation is:
let url = URL(fileURLWithPath: mWaveFilePath)
let asset = AVAsset(url: url)
I can't make it any simpler, and when I look at the metadata properties of this asset I always get an empty array for WAV files...
2) Most importantly, when I write a WAV file I have the feeling that AVFoundation makes some errors in the WAV header. Maybe the mistake is mine, but I do manage to create a WAV file with audio, and even after following some tutorials I'm having a hard time finding where the error could come from.
Here is an example of a good and a bad header.
The good header, before importing the file:
We can see that the format tag is set to 1, which means PCM. That's what we want.
Now the wrong header, after the creation of my audio file:
-2... It's clearly wrong.
So did I miss something when using AVFoundation to create a WAV file? Should I do something special?
I am new to FFmpeg. I have spent more than 10 days looking for a way to mux audio and video into the MP4 format in a stream buffer rather than in a file.
What I want is to mux MP4 audio & video into a stream.
I am able to mux MP4 into a file, but I am not able to get the muxed MP4 into a stream buffer.
So far I have tried this:
avio_alloc_context(avio_ctx_buffer, avio_ctx_buffer_size, 1, &bd, NULL, &write_packet, NULL);
By calling avio_alloc_context and passing a reference to a write_packet function, I do get write_packet called. But when I write the data arriving in write_packet to a file to produce an MP4, the resulting MP4 file does not work: opening it in MediaInfo shows no video or audio information.
The header is written, then the packet loop runs, and finally the trailer is written, but the final file still does not play.
If there is a good way to mux MP4 into a stream, please tell me.
Kindly help me to do this. Thanks in advance.
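For reference, here is a minimal C sketch of the custom-AVIO wiring described above; the buffer_data struct and function names are invented, error checks and the stream/codec setup are omitted. One known wrinkle is that the mp4 muxer normally seeks back to patch the moov atom at the end, which a write-only AVIOContext cannot do, so fragmented output via the movflags option is often used instead:

#include <libavformat/avformat.h>

struct buffer_data { uint8_t *ptr; size_t size; };  /* hypothetical sink for the muxed bytes */

/* Called by the muxer with each chunk of output; append it to your stream here.
 * (On recent FFmpeg versions the buf parameter is const uint8_t *.) */
static int write_packet(void *opaque, uint8_t *buf, int buf_size)
{
    struct buffer_data *bd = opaque;
    (void)bd; (void)buf;               /* append buf[0..buf_size) to bd in a real program */
    return buf_size;
}

static int mux_to_stream(struct buffer_data *bd)
{
    const int buf_size = 4096;
    uint8_t *avio_buf = av_malloc(buf_size);
    AVIOContext *avio = avio_alloc_context(avio_buf, buf_size, 1 /* write */,
                                           bd, NULL, write_packet, NULL);
    AVFormatContext *oc = NULL;
    avformat_alloc_output_context2(&oc, NULL, "mp4", NULL);
    oc->pb = avio;
    oc->flags |= AVFMT_FLAG_CUSTOM_IO;

    /* ... add streams and codec parameters here, as in the file-based version ... */

    /* Without a seek callback the muxer cannot rewrite the header at the end,
     * so ask for fragmented MP4, which is written strictly front to back: */
    AVDictionary *opts = NULL;
    av_dict_set(&opts, "movflags", "frag_keyframe+empty_moov", 0);
    avformat_write_header(oc, &opts);
    av_dict_free(&opts);

    /* ... packet loop: av_interleaved_write_frame(oc, pkt) ... */

    av_write_trailer(oc);
    av_freep(&avio->buffer);           /* the muxer may have replaced the original buffer */
    avio_context_free(&avio);
    avformat_free_context(oc);
    return 0;
}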
I'm trying to extract each frame from an RTSP MP4 stream and convert it into a JPEG/GIF using FFmpeg. I'm taking the SDP header, from 000001b0 up to 000001b5, and adding it to a byte array, then capturing a frame starting from 000001b6 and appending it to the byte array.
When I flush it to a file (.mpg) and run FFmpeg on it, FFmpeg throws errors and does not convert it.
My header looks like 000001B008000001B58913000001000000012000C488BA98514043C1463F, and after this I'm appending a frame (starting from 000001b6).
I did something similar with FFmpeg, and it seems that the frame data you get from FFmpeg already contains the frame header, which is all you need to transcode the data. Make sure that you decode the MP4 data to a raw format (RGB24, for instance), then convert it to the pixel format the JPEG/GIF encoder expects (probably a YUV format) using libswscale, before passing the data to the encoder.
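As a rough illustration of that libswscale step, here is a sketch that converts one decoded RGB24 frame to YUV420P; the function name and the YUV420P target are assumptions (check which pixel format your encoder actually expects):

#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>

/* Convert one RGB24 frame to YUV420P; on success the caller owns dst_data[0]
 * and frees it with av_freep(). Returns 0 on success, -1 on failure. */
static int rgb24_to_yuv420p(const uint8_t *rgb, int w, int h,
                            uint8_t *dst_data[4], int dst_linesize[4])
{
    struct SwsContext *sws = sws_getContext(w, h, AV_PIX_FMT_RGB24,
                                            w, h, AV_PIX_FMT_YUV420P,
                                            SWS_BILINEAR, NULL, NULL, NULL);
    if (!sws)
        return -1;

    /* RGB24 is a single packed plane: 3 bytes per pixel. */
    const uint8_t *const src_data[4] = { rgb, NULL, NULL, NULL };
    const int src_linesize[4] = { 3 * w, 0, 0, 0 };

    if (av_image_alloc(dst_data, dst_linesize, w, h, AV_PIX_FMT_YUV420P, 1) < 0) {
        sws_freeContext(sws);
        return -1;
    }
    sws_scale(sws, src_data, src_linesize, 0, h, dst_data, dst_linesize);
    sws_freeContext(sws);
    return 0;
}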
Depending on the codec, you may not have to add anything, or you may have to add a lot.
This is referred to as de-packetization, and MPEG4-ES has no packetization model; H.264 has many, depending on the profile.
Check out the RFCs; either 3016 or 3640 should help you.
https://www.rfc-editor.org/rfc/rfc3640
https://www.rfc-editor.org/rfc/rfc3016