Node.js & Angular - Google Text-To-Speech
I want to send text from my client (Angular v.12) to the backend through REST API so I'll get the audio back, then in the client use it with new Audio(...) and be able to play the sound on user click.
My backend looks like this:
const express = require("express");
const cors = require("cors");
const textToSpeech = require('@google-cloud/text-to-speech');
const stream = require("stream");

const app = express();
app.use(cors()); // the Angular dev server runs on a different origin (4200 vs 3030)

app.get('/api/tts', async (req, res) => {
  const txt = req.query.txt;
  console.log('txt', txt);
  const client = new textToSpeech.TextToSpeechClient();
  const request = {
    input: {text: txt},
    voice: {languageCode: 'en-US', ssmlGender: 'NEUTRAL'},
    audioConfig: {audioEncoding: 'MP3'},
  };
  const [response] = await client.synthesizeSpeech(request);
  const readStream = new stream.PassThrough();
  readStream.end(response.audioContent);
  res.set("Content-Disposition", 'attachment; filename=audio.mp3');
  res.set("Content-Type", "audio/mpeg");
  readStream.pipe(res);
});

app.listen(3030);
Now in my client I just created a button to test, and on click I send an HTTP request like so:
public textToSpeech(txt: string) {
  const httpParams: HttpParams = new HttpParams()
    .set('txt', txt);
  return this.http.get('//localhost:3030/api/tts', { params: httpParams, responseType: 'text' });
}
I do get a 200 OK code and a long string as a response.
In my component:
onButtonClick() {
  this.speechService.textToSpeech('testing')
    .subscribe(res => {
      this.audio = new Audio(res);
      this.audio.play();
    });
}
but I get the following errors:
GET http://localhost:4200/��D�
Uncaught (in promise) DOMException: The media resource indicated by the src attribute or assigned media provider object was not suitable.
Okay, so I solved it with a different approach.
On the backend, I use fs to write and create an MP3 file to the public folder, and then on the frontend, I put the link to the file as the source like so:
Backend:
const fs = require("fs");
const util = require("util");

app.get('/api/tts', async (req, res) => {
  const {text} = req.query;
  const client = new textToSpeech.TextToSpeechClient();
  const request = {
    input: {text},
    voice: {languageCode: 'en-US', ssmlGender: 'FEMALE'},
    audioConfig: {audioEncoding: 'MP3'},
  };
  const [response] = await client.synthesizeSpeech(request);
  const writeFile = util.promisify(fs.writeFile);
  await writeFile(`./public/audio/${text}.mp3`, response.audioContent, 'binary');
  res.end();
});
Frontend:
onButtonClick() {
  this.speechService.textToSpeech('hello')
    .subscribe(res => {
      this.audio = new Audio(`//localhost:3030/audio/hello.mp3`);
      this.audio.play();
    });
}
It's hardcoded right now, but I'm going to make it dynamic; I just wanted to test.
I don't know if this is the best approach, but I got it to work the way I wanted.
Related
Delivering image from S3 to React client via Context API and Express server
I'm trying to download a photo from an AWS S3 bucket via an Express server to serve to a React app, but I'm not having much luck. Here are my (unsuccessful) attempts so far. The workflow is as follows:

Client requests photo after retrieving key from database via Context API
Request sent to Express server route (important so as to hide the true location from the client)
Express server route requests blob file from AWS S3 bucket
Express server parses image to base64 and serves to client
Client updates state with new image

React client:

const [profilePic, setProfilePic] = useState('');

useEffect(() => {
  await actions.getMediaSource(tempPhoto.key)
    .then(resp => {
      console.log('server resp: ', resp.data.data.newTest) // returns ����\u0000�\u0000\b\u0006\
      const url = window.URL || window.webkitURL;
      const blobUrl = url.createObjectURL(resp.data.data.newTest);
      console.log("blob ", blobUrl);
      setProfilePic({ ...profilePic, image: resp.data.data.newTest });
    })
    .catch(err => errors.push(err));
}

Context API - just axios wrapped into its own library:

getMediaContents = async (key) => {
  return await this.API.call(`http://localhost:5000/${MEDIA}/mediaitem/${key}`, "GET", null, true, this.state.accessToken, null);
}

Express server route:

router.get("/mediaitem/:key", async (req, res, next) => {
  try {
    const { key } = req.params;

    // Attempt 1 was to try with s3.getObject(downloadParams).createReadStream();
    const readStream = getFileStream(key);
    readStream.pipe(res);

    // Attempt 2 - attempt to convert response to base64 encoding
    var data = await getFileStream(key);
    var test = data.Body.toString("utf-8");
    var container = '';
    if (data.Body) {
      container = data.Body.toString("utf-8");
    } else {
      container = undefined;
    }
    var buffer = (new Buffer.from(container));
    var test = buffer.toString("base64");
    require('fs').writeFileSync('../uploads', test); // it never wrote to this directory
    console.log('conversion: ', test); // prints: 77+977+977+977+9AO+/vQAIBgYH - this doesn't look like base64 to me.
    delete buffer;
    res.status(201).json({ newTest: test });
  } catch (err) {
    next(ApiError.internal(`Unexpected error > mediaData/:id GET -> Error: ${err.message}`));
    return;
  }
});

AWS S3 library - I made my own library for using the S3 bucket as I'll need to use more functionality later:

const getFileStream = async (fileKey) => {
  const downloadParams = {
    Key: fileKey,
    Bucket: bucketName
  }
  // This was attempt 1's return without async in the parameter
  return s3.getObject(downloadParams).createReadStream();
  // Attempt 2's intention was just to wait for the promise to be fulfilled.
  return await s3.getObject(downloadParams).promise();
}
exports.getFileStream = getFileStream;

If you've gotten this far you may have realised that I've tried a couple of things from different sources and documentation, but I'm not getting any further. I would really appreciate some pointers and advice on what I'm doing wrong and what I could improve on. If any further information is needed then just let me know. Thanks in advance for your time!
Maybe this is useful for you; this is how I get an image from S3 and process it on the server.

Create a temporary directory:

createTmpDir(): Promise<string> {
  return mkdtemp(path.join(os.tmpdir(), 'tmp-'));
}

Get the file:

readStream(path: string) {
  return this.s3
    .getObject({
      Bucket: this.awsConfig.bucketName,
      Key: path,
    })
    .createReadStream();
}

How I process the file:

async MainMethod(fileName) {
  const dir = await this.createTmpDir();
  const serverPath = path.join(dir, fileName);
  await pipeline(
    this.readStream(attachment.key),
    fs.createWriteStream(serverPath + '.jpg')
  );
  const createFile = await sharp(serverPath + '.jpg')
    .jpeg()
    .resize({
      width: 640,
      fit: sharp.fit.inside,
    })
    .toFile(serverPath + '.jpeg');
  const imageBuffer = fs.readFileSync(serverPath + '.jpeg');
  // my manipulations
  fs.rmSync(dir, { recursive: true, force: true }); // delete temporary folder
}
Temporary fetch audio from URL, write id3 and send it to user
I was making a YouTube music downloader using Next and React. It returns the URL of the audio and the information of that song, but I don't really know how to temporarily fetch the audio, write the ID3 tags in the background, and send it to the user. I found a way, but I don't know if it will work, and I don't know the negative impact of using this method:

const axios = require('axios');
const NodeID3 = require('node-id3');

const tags = {
  title: 'Tomorrow',
  artist: 'Kevin Penkin',
  album: 'TVアニメ「メイドインアビス」オリジナルサウンドトラック',
  TRCK: '27'
};

const url = 'https://file-examples.com/storage/fe52cb0c4862dc676a1b341/2017/11/file_example_MP3_5MG.mp3';

const handler = async (req, res) => {
  const response = await axios(url, { responseType: 'arraybuffer' });
  const success = NodeID3.write(tags, response.data); // Returns Buffer

  res.setHeader('Content-Type', 'audio/mpeg');
  res.setHeader('Content-Disposition', 'attachment; filename=dummy.mp3');
  res.send(success);
  // await pipeline(success, res);
};

export default handler;
Express JS API req.body shows buffer data
I created the API below:

app.post("/categories", async (req, res) => {
  console.log(`req.body: ${JSON.stringify(req.body)}`)
  console.log(`req.body.title: ${JSON.stringify(req.body.title)}`)
  console.log(`req.files: ${JSON.stringify(req.files)}`)
  res.json({})
});

Where the data passed is:

{
  "title": "Video Title",
  "description": "Video Description",
  "thumbnail": [object File],
  "video": [object File]
}

The data passed is powered by VueJS and Axios:

methods: {
  async createCategory() {
    const formData = new window.FormData();
    formData.set("title", this.category.title);
    formData.set("description", this.category.description);
    formData.set("thumbnail", this.thumbnail);
    formData.set("video", this.video);
    await $this.axios.post("clothing/v1/categories/", formData, {
      headers: { "Content-Type": "multipart/form-data" },
    });
  }
}

However, the data shown in req.body is:

req.body: {"type":"Buffer","data":[45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,116,105,116,108,101,34,13,10,13,10,86,105,100,101,111,32,84,105,116,108,101,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,100,101,115,99,114,105,112,116,105,111,110,34,13,10,13,10,86,105,100,101,111,32,68,101,115,99,114,105,112,116,105,111,110,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,116,104,117,109,98,110,97,105,108,34,13,10,13,10,91,111,98,106,101,99,116,32,70,105,108,101,93,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,13,10,67,111,110,116,101,110,116,45,68,105,115,112,111,115,105,116,105,111,110,58,32,102,111,114,109,45,100,97,116,97,59,32,110,97,109,101,61,34,118,105,100,101,111,34,13,10,13,10,91,111,98,106,101,99,116,32,70,105,108,101,93,13,10,45,45,45,45,45,45,87,101,98,75,105,116,70,111,114,109,66,111,117,110,100,97,114,121,121,104,112,52,54,97,82,89,68,121,77,82,57,66,52,110,45,45,13,10]}

I am hoping that I can retrieve my passed data inside my API as something like:

req.body: {"title":"Example","description":"example"}

as I will use these data to save in Firestore and upload the files to Cloud Storage.

NOTE: I tried using multer but got the error below:

> return fn.apply(this, arguments);
>        ^
> TypeError: Cannot read properties of undefined (reading 'apply')
>     at Immediate.<anonymous> (/Users/adminadmin/Desktop/projects/dayanara-environments/dayanara-clothing-api/functions/node_modules/express/lib/router/index.js:641:15)
>     at processImmediate (node:internal/timers:468:21)
I did not mention that I was developing Node.js with Google Cloud Functions, only in local and testing development. The error below always shows whenever there is any kind of error in my code:

> return fn.apply(this, arguments);
>        ^
> TypeError: Cannot read properties of undefined (reading 'apply')
>     at Immediate.<anonymous> (/Users/adminadmin/Desktop/projects/dayanara-environments/dayanara-clothing-api/functions/node_modules/express/lib/router/index.js:641:15)
>     at processImmediate (node:internal/timers:468:21)

As for the multipart parsing, I used busboy like below:

app.post("/categories", (req, res) => {
  let writeResult;
  const storageRef = admin.storage().bucket(`gs://${storageBucket}`);
  const busboy = Busboy({headers: req.headers});
  const tmpdir = os.tmpdir();

  // This object will accumulate all the fields, keyed by their name
  const fields = {};
  // This object will accumulate all the uploaded files, keyed by their name.
  const uploads = {};

  // This code will process each non-file field in the form.
  busboy.on('field', (fieldname, val) => {
    /**
     * TODO(developer): Process submitted field values here
     */
    console.log(`Processed field ${fieldname}: ${val}.`);
    fields[fieldname] = val;
  });

  const fileWrites = [];

  // This code will process each file uploaded.
  busboy.on('file', (fieldname, file, (unknown)) => {
    // Note: os.tmpdir() points to an in-memory file system on GCF
    // Thus, any files in it must fit in the instance's memory.
    console.log(`Processed file $(unknown)`);
    const filepath = path.join(tmpdir, filename);
    uploads[fieldname] = filepath;

    const writeStream = fs.createWriteStream(filepath);
    file.pipe(writeStream);

    // File was processed by Busboy; wait for it to be written.
    // Note: GCF may not persist saved files across invocations.
    // Persistent files must be kept in other locations
    // (such as Cloud Storage buckets).
    const promise = new Promise((resolve, reject) => {
      file.on('end', () => {
        writeStream.end();
      });
      writeStream.on('finish', resolve);
      writeStream.on('error', reject);
    });
    fileWrites.push(promise);
  });

  // Triggered once all uploaded files are processed by Busboy.
  // We still need to wait for the disk writes (saves) to complete.
  busboy.on('finish', async () => {
    console.log('finished busboy')
    await Promise.all(fileWrites);

    /**
     * TODO(developer): Process saved files here
     */
    for (const file in uploads) {
      const filePath = uploads[file]
      const name = fields.name.replaceAll(' ', '-')
      const _filePath = filePath.split('/')
      const fileName = _filePath[_filePath.length - 1]
      const destFileName = `${name}/${fileName}`
      // eslint-disable-next-line no-await-in-loop
      const uploaded = await storageRef.upload(filePath, { destination: destFileName })
      const _file = uploaded[0];
      const bucketFile = "https://firebasestorage.googleapis.com/v0/b/" + storageBucket + "/o/" + encodeURIComponent(_file.name) + "?alt=media"
      fields[file] = bucketFile
    }

    writeResult = await admin
      .firestore()
      .collection(collection)
      .add({
        name: fields.name,
        description: fields.description,
        timestamp: admin.firestore.Timestamp.now(),
        thumbnail: fields.thumbnail,
        video: fields.video
      });
    const written = await writeResult.get();
    res.json(written.data());
  });
});

Then I needed to change how I pass formData from my Vue.js and Axios, where I replaced using model with refs for my file data. I only needed to use model in Django, so I thought it would be the same in Express.js:

methods: {
  async createCategory() {
    const formData = new window.FormData();
    const thumbnail = this.$refs.thumbnail;
    const video = this.$refs.video;
    formData.set("name", this.category.name);
    formData.set("description", this.category.description);
    formData.set("thumbnail", thumbnail.files[0]);
    formData.set("video", video.files[0]);
    await $this.axios.post("clothing/v1/categories/", formData, {
      headers: { "Content-Type": "multipart/form-data" },
    });
  }
}

After the changes above, I can finally send multipart/form-data properly. The resources below helped me a lot:

https://cloud.google.com/functions/docs/samples/functions-http-form-data#functions_http_form_data-nodejs
Handling multipart/form-data POST with Express in Cloud Functions
fs.writeFile is making POST request loop infinitely within my Express app
I have this current server code:

const express = require("express")
const fs = require("fs")
const router = express.Router()
const path = require("path")

const todos = JSON.parse(fs.readFileSync(path.join(__dirname, "../db", "todolist.json"), "utf8"))

router.get("/", async (req, res) => {
  res.send(todos)
})

router.post("/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, () => {})
  res.status(201).json(todoItem)
})

Client:

console.log("Hello world!")

const somedata = {
  title: "A new boy",
  description: "Recieved from the client"
}

const main = async () => {
  const response1 = await fetch("http://localhost:3000/todo", {
    method: "GET",
  })
  const data1 = await response1.json()

  const response2 = await fetch("http://localhost:3000/todo/new", {
    method: "POST",
    body: JSON.stringify(somedata),
    headers: {
      'Content-Type': 'application/json',
      "Accept": "application/json"
    }
  })
  const data2 = await response2.json()

  return { data1, data2 }
}

main().then(data => console.log(data))

When I make a POST request to create a new entity, the browser just loops the request over and over until I manually have to quit the server. This does not happen if I use Postman for some reason. Does anybody see any obvious error here with how the writeFile method is used, and why it continuously reloads the browser to keep pushing POST requests? Thanks! :)
I had the same problem! It took me about an hour to understand what the problem was: if you use the Live Server extension, the server restarts every time a file in the project folder is written, changed, or deleted. So if your Node app writes a file, Live Server restarts the page and the app writes the file again => loop.

In my case, I write a PDF file. All I had to do was tell the Live Server extension to ignore PDF files by adding this to settings.json:

"liveServer.settings.ignoreFiles": ["**/*.pdf"]
fs.writeFile is an asynchronous function, so to send a response after the file is written you must do it in the callback. And of course, don't forget about error checking. I.e.:

router.post("/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, (err) => {
    if (err) {
      throw err;
    }
    res.status(201).json(todoItem)
  })
})

Or you can use fs.writeFileSync as Muhammad mentioned earlier.
I think I found the problem. It seems the Live Server extension was messing things up when I had the client and server on separate ports, somehow making the browser refresh for every request made. I switched back to having them share a port, which makes it work. I'll have to find a good way of separating them later without this bug, but that is for another time. Thanks for your help :)
I share my working sample. The body-parser dependency is needed to get the body in a POST request. Please don't change the order in server.js. Check and let me know, and also check whether your client code is in a loop.

My server.js:

const express = require("express")
const fs = require("fs")
const router = express.Router()
const path = require("path")
const app = express();
const bodyParser = require("body-parser")

const todos = JSON.parse(fs.readFileSync(path.join(__dirname, "../db", "todolist.json"), "utf8"))

app.use(bodyParser.json());
app.use("/", router)

router.get("/todo", async (req, res) => {
  res.send(todos)
})

router.post("/todo/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, () => {})
  res.status(201).json(todoItem)
});

app.listen(3000, () => {
  console.log(`Server running in Port`);
});

todolist.json:

{
  "todos": []
}
I think you should use fs.writeFileSync() or write some code in its callback
How do I properly encode audio bytes when writing to Google Cloud Storage
I am trying to read an audio file from Twilio and save it in Google Cloud Storage (GCS). I've gotten to the point where I can read and write bytes, but what I end up with in GCS is no longer a valid .mp3 file and is almost twice the size of a manually downloaded .mp3 file (from Twilio). This is all in Node.js using a Firebase server. Here's the code:

const recordingResults = await Recording.getTwilioRecording(req.body.RecordingSid)
if (recordingResults) {
  await Recording.saveRecording(req.body.RecordingSid, RECORDING_TYPE_MESSAGE, recordingResults['data'])
}

exports.getTwilioRecording = async function(recordingSid) {
  const url = Config.TWILIO_API_BASE_URL + Config.TWILIO_ACCOUNT_SID + '/Recordings/' + recordingSid + '.mp3'
  const promise = axios({
    'method': 'get',
    'url': url,
    'headers': {
      'Content-Type': 'audio/mpeg'
    }
  })
  const data = await promise.then(mp3 => {
    return mp3
  }).catch(err => {
    console.log('Error connecting to Twilio to get recording', err.message)
    return false
  })
  return data
}

exports.saveRecording = async function(recordingSid, recordingType, data) {
  const projectId = Config.GOOGLE_PROJECT_ID
  const keyFilename = Config.GOOGLE_STORAGE_SA_JSON
  const storage = new Storage({projectId, keyFilename});
  const myBucket = storage.bucket(Config.GOOGLE_STORAGE_ROOT_DIR);
  const gscname = '/recordings/' + recordingType + '/' + recordingSid + '.mp3'
  const file = myBucket.file(gscname);
  file.save(data)
}
After numerous attempts at "getting" audio content and then "pushing" it, I finally tried the method that appears most often in the examples: piping from the GET URL directly to Google Cloud Storage's createWriteStream method. I wish I could say why the other methods don't work. Perhaps someone else can educate us.

function someFunction(recordingSid, recordingType) {
  const url = Config.TWILIO_API_BASE_URL + Config.TWILIO_ACCOUNT_SID + '/Recordings/' + recordingSid + '.mp3'
  const projectId = Config.GOOGLE_PROJECT_ID
  const keyFilename = Config.GOOGLE_STORAGE_SA_JSON
  const storage = new Storage({projectId, keyFilename});
  const myBucket = storage.bucket(Config.GOOGLE_STORAGE_ROOT_DIR);
  const gscname = '/recordings/' + recordingType + '/' + recordingSid + '.mp3'
  const file = myBucket.file(gscname);

  axios({
    method: 'get',
    url: url,
    responseType: 'stream'
  })
    .then(res => {
      res.data.pipe(file.createWriteStream({
        'contentType': 'audio/mp3',
        'resumable': false,
      }))
        .on('error', (err) => { console.log(err.message) })
        .on('success', () => { console.log('success') })
      return null
    })
    .catch(err => console.log(err));
}