How to save state of audio blob in my chrome extension?

I have the following popup.js file in which I capture audio from a tab, store it in a blob, and then set my audio tag's src URL to it. I would like to store this blob in Chrome's local storage so that the audio data remains even after closing the extension's popup.html window. How do I go about doing this? I tried to serialize the blob to JSON via an ArrayBuffer, but that didn't seem to be the correct method. If there is a better way, please let me know. Thanks.
function captureTabAudio() {
  chrome.tabCapture.capture({ audio: true, video: false }, (stream) => {
    const context = new AudioContext();
    const chunks = [];
    if (context.state === 'suspended') {
      context.resume();
    }
    // Route the captured stream to the speakers so the tab stays audible while recording.
    const newStream = context.createMediaStreamSource(stream);
    newStream.connect(context.destination);
    const recorder = new MediaRecorder(stream);
    recorder.start();
    setTimeout(() => recorder.stop(), 10000);
    recorder.ondataavailable = (e) => {
      chunks.push(e.data);
    };
    recorder.onstop = (e) => {
      const blob = new Blob(chunks, { type: "audio/ogg; codecs=opus" });
      document.querySelector("audio").src = URL.createObjectURL(blob);
    };
  });
}

I assume that when you say "Chrome's local storage" you mean the web's localStorage API. If so, this won't work because "the keys and the values stored with localStorage are always in the UTF-16 string format, which uses two bytes per character" (MDN). If you meant the extension platform's Storage API, you'll have a similar problem because it only supports JSON-serializable values. While you could potentially convert the ArrayBuffer into a Base64 string, I wouldn't recommend it, as you're more likely to run into storage limits.
For storing binary data locally, currently your best bet is to use IndexedDB. Since this API supports the structured clone algorithm, it supports a much broader set of types including ArrayBuffer. A similar question was asked here: Saving ArrayBuffer in IndexedDB.
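For example, here is a minimal sketch (untested, and the database/store names are just illustrative) of saving the blob produced in recorder.onstop into IndexedDB and reading it back when the popup is reopened; Blobs and ArrayBuffers are both supported by the structured clone algorithm, so either can be stored directly:

// Open (or create) a small database with one object store for recordings.
function openDb() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open("tab-audio", 1);
    req.onupgradeneeded = () => req.result.createObjectStore("recordings");
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Save the blob under a fixed key.
async function saveRecording(blob) {
  const db = await openDb();
  return new Promise((resolve, reject) => {
    const tx = db.transaction("recordings", "readwrite");
    tx.objectStore("recordings").put(blob, "latest");
    tx.oncomplete = resolve;
    tx.onerror = () => reject(tx.error);
  });
}

// Read it back when the popup opens.
async function loadRecording() {
  const db = await openDb();
  return new Promise((resolve, reject) => {
    const req = db
      .transaction("recordings")
      .objectStore("recordings")
      .get("latest");
    req.onsuccess = () => resolve(req.result); // undefined if nothing has been saved yet
    req.onerror = () => reject(req.error);
  });
}

// In recorder.onstop:  saveRecording(blob);
// On popup load:       loadRecording().then((blob) => {
//                        if (blob) document.querySelector("audio").src = URL.createObjectURL(blob);
//                      });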

Related

How to send audio saved as a Buffer, from my api, to my React client and play it?

I've been chasing my tail for two days figuring out how to best approach sending the <Buffer ... > object generated by Google's Text-To-Speech service from my express-api to my React app. I've come across tons of different opinionated resources that point me in different directions and only potentially "solve" isolated parts of the bigger process. At the end of all of this, while I've learned a lot more about ArrayBuffer, Buffer, binary arrays, etc., I still feel just as lost as before when it comes to implementation.
At its simplest, all I aim to do is provide one or more strings of text to tts, generate the audio files, send the audio files from my express-api to my react client, and then automatically play the audio in the background on the browser when appropriate.
I am successfully sending and triggering Google's TTS to generate the audio files. It responds with a <Buffer ...> representing the binary data of the file. It arrives at my express-api endpoint, and from there I'm not sure if I should...
convert the Buffer to a string and send it to the browser?
send it as a Buffer object to the browser?
set up a websocket using socket.io and stream it?
then once it's on the browser,
do I use an <audio /> tag?
should I convert it to something else?
I suppose the problem I'm having is that trying to find answers for this results in information overload: lots of different answers written over the past 10 years using different approaches and technologies. I really don't know where one starts and the next ends, what's a bad practice, what's a best practice, and moreover what is actually suitable for my case. I could really use some guidance here.
Synthesise function from Google
// client is the Text-to-Speech client (e.g. new TextToSpeechClient() from @google-cloud/text-to-speech)
// returns: <Buffer ff f3 44 c4 ... />
const synthesizeSentence = async (sentence) => {
  const request = {
    input: { text: sentence },
    voice: { languageCode: "en-US", ssmlGender: "NEUTRAL" },
    audioConfig: { audioEncoding: "MP3" },
  };
  const response = await client.synthesizeSpeech(request);
  return response[0].audioContent;
};
(current shape) of express-api POST endpoint
app.post("/generate-story-support", async (req, res) => {
try {
// ? generating the post here for simplicity, eventually the client
// ? would dictate the sentences to send ...
const ttsResponse: any = await axios.post("http://localhost:8060/", {
sentences: SAMPLE_SENTENCES,
});
// a resource said to send the response as a string and then convert
// it on the client to an Array buffer? -- no idea if this is a good practice
return res.status(201).send(ttsResponse.data[0].data.toString());
} catch (error) {
console.log("error", error);
return res.status(400).send(`Error: ${error}`);
}
});
react client
useEffect(() => {
  const fetchData = async () => {
    const data = await axios.post(
      "http://localhost:8000/generate-story-support"
    );
    // converting it to an ArrayBuffer per another so post
    const encoder = new TextEncoder();
    const encodedData = encoder.encode(data.data);
    setAudio(encodedData);
    return data.data;
  };
  fetchData();
}, []);
// no idea what to do from here, if this is even the right path :/
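For what it's worth, one common pattern (only a sketch, not an authoritative answer, and it assumes the serialized-Buffer shape shown in the endpoint above) is to send the raw MP3 bytes with an audio Content-Type, then ask axios for an ArrayBuffer on the client and hand it to an <audio /> element via a blob: URL:

// express: reply with the raw MP3 bytes instead of a string
app.post("/generate-story-support", async (req, res) => {
  const ttsResponse = await axios.post("http://localhost:8060/", {
    sentences: SAMPLE_SENTENCES,
  });
  // a JSON-serialized Buffer arrives as { type: 'Buffer', data: [...] }; shape assumed from the code above
  const audioBuffer = Buffer.from(ttsResponse.data[0].data);
  res.set("Content-Type", "audio/mpeg");
  return res.status(201).send(audioBuffer);
});

// react, inside an async effect or handler: request an ArrayBuffer, wrap it in a Blob, and play it
const { data } = await axios.post(
  "http://localhost:8000/generate-story-support",
  null,
  { responseType: "arraybuffer" }
);
const url = URL.createObjectURL(new Blob([data], { type: "audio/mpeg" }));
new Audio(url).play(); // or set it as the src of an <audio /> element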

View and not download Google Cloud Storage files in browser

I'm working with Node.js/Express to create a POST API endpoint that uploads files to Google Cloud Storage and returns a public URL like this:
https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]
When the upload is done, I get the URL, and when I open it, the file is downloaded directly (image, PDF, etc.).
Is there a way to view and open it in the browser?
Here is my upload function:
const uploadImage = (file) => new Promise((resolve, reject) => {
  const { originalname, buffer } = file
  const blob = bucket.file(originalname.replace(/ /g, "_"))
  const blobStream = blob.createWriteStream({
    resumable: false
  })
  blobStream.on('finish', () => {
    const publicUrl = format(
      `https://storage.googleapis.com/${bucket.name}/${blob.name}`
    )
    resolve(publicUrl)
  })
  .on('error', () => {
    reject(`Unable to upload image, something went wrong`)
  })
  .end(buffer)
})
Thanks
If the URL looks like the code above, https://storage.googleapis.com/bucketName/objectName, a browser should be able to view it directly, so long as a few conditions are in place:
The object's contentType is set appropriately. If you don't specify a type, the default is application/octet-stream, and web browsers will probably decide to just download such an object rather than displaying it in some form.
The object's metadata does not override the contentDisposition. It's possible to force objects to be downloaded as attachments by setting that property to something like attachment; filename=foo.txt.
The object is either publicly viewable or appropriate credentials are passed in the GET request. This is not the default setting. When you upload the object, you'll need to explicitly note that the ACL should allow the group allUsers read permission. Alternately, you could set the default object ACL property of the bucket to include that permission.
In your case, the object is downloading successfully, so it's not the ACL issue, and if you don't know about the contentDisposition setting, then it's probably problem #1. Make sure you specify a reasonable content type for the object.
Example:
const blobStream = blob.createWriteStream({
  resumable: false,
  contentType: "text/html"
})
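If you also need to cover conditions #2 and #3 above (don't force an attachment, and make the object publicly readable), a sketch using the same bucket/blob objects from the question might look like this; makePublic() grants allUsers read access to that one object:

const blobStream = blob.createWriteStream({
  resumable: false,
  contentType: "image/png",                  // whatever actually matches the uploaded file
  metadata: { contentDisposition: "inline" } // make sure nothing forces an attachment download
})

blobStream.on('finish', async () => {
  await blob.makePublic() // equivalent to giving allUsers read permission on this object
  resolve(`https://storage.googleapis.com/${bucket.name}/${blob.name}`)
})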

Display video from Gridfs storage in react app

I am using multer-gridfs-storage and gridfs-stream to store my video in the backend (Express/Node). When I try to retrieve the file to play on my front end (React) the player refuses to recognize the source.
I am using Video-React to display the video on download. The download is successful: I get a binary string back from the backend, which I converted to a Blob.
try {
  fileBlob = new Blob([res.data], { type: res.headers['content-type'] });
} catch (err) {
  console.log('Error converting to blob');
  console.log(err);
}
This is my Video-React player being rendered
<Player
  autoPlay
  ref="player"
>
  <source src={this.state.fileURL} />
  <ControlBar autoHide={false} />
</Player>
Then I tried two techniques
readAsDataURL
let reader = new FileReader();
reader.onload = function (event) {
  // rThis is just a reference to the parent function's this
  rThis.setState({ fileURL: reader.result }, () => {
    rThis.refs.player.load();
  });
}
try {
  reader.readAsDataURL(fileBlob);
} catch (err) {
  console.log('Error trying readDataURL');
  console.log(err);
}
src is being set correctly but the video never loads
URL.createObjectURL
let vidURL = URL.createObjectURL(fileBlob);
rThis.setState({ fileURL: vidURL }, () => {
  rThis.refs.player.load();
});
src is set to a blob: url but still nothing
Is this an issue with Video-React or should I be doing something else? Any pointers to references I could look at would also help. What am I doing wrong? A data URL works in the case of images, I checked, but not video.
So after some more reading, I finally figured out the problem. Since I'm using gridfs-stream I'm actually piping the response from the server. So I was never getting the whole file, and trying to convert res.data, which is just a chunk, was a mistake. Instead, in my res object, I found the source url within the config property.
res.config.url
This contained my source url to which my server was piping the chunks. Should have figured it out earlier, considering I picked GridFS storage for precisely this reason.
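In code, that just means pointing the player at the URL the server is piping to, rather than trying to rebuild the file from a single chunk (a sketch using the same names as above):

// res is the axios response; axios keeps the original request URL on res.config.
const vidURL = res.config.url;
rThis.setState({ fileURL: vidURL }, () => {
  rThis.refs.player.load();
});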

How to send File through Websocket along with additional info?

I'm developing a Web application to send images, videos, etc. to two monitors from an admin interface. I'm using ws in Node.js for the server side. I've implemented selecting images available on the server and external URLs and sending them to the clients, but I also wanted to be able to directly send images selected from the device with a file input. I managed to do it using base64 but I think it's pretty inefficient.
Currently I send a stringified JSON object containing the client to which the resource has to be sent, the kind of resource and the resource itself, parse it in the server and send it to the appropriate client. I know I can set the Websocket binaryType to blob and just send the File object, but then I'd have no way to tell the server which client it has to send it to. I tried using typeson and BSON to accomplish this, but it didn't work.
Are there any other ways to do it?
You can send raw binary data through the WebSocket.
It's quite easy to manage.
One option is to prepend a "magic byte" (an identifier that marks the message as non-JSON). For example, prepend binary messages with the B character.
All the server has to do is test the first character before collecting the binary data (if the magic byte isn't there, it's probably the normal JSON message).
A more serious implementation will attach a header after the magic byte (e.g., file name, total length, position of the data being sent, etc.).
This allows the upload to be resumed on disconnections (send just the parts that weren't acknowledged as received).
Your server will need to split the data into magic byte, header, and binary data before processing, but it's easy enough to accomplish.
Hope this helps someone.
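A minimal sketch of that framing (illustrative only; it assumes a ws server instance named wss and keeps the header under 256 bytes so its length fits in one byte):

// Client: magic byte 'B' + 1-byte header length + JSON header + file bytes.
async function sendFile(ws, file, targetClient) {
  const header = new TextEncoder().encode(
    JSON.stringify({ to: targetClient, name: file.name, size: file.size })
  );
  const headerLen = new Uint8Array([header.length]); // assumes the header stays under 256 bytes
  const fileBytes = new Uint8Array(await file.arrayBuffer());
  ws.send(new Blob([new Uint8Array([0x42]) /* 'B' */, headerLen, header, fileBytes]));
}

// Server (ws): anything that doesn't start with 'B' falls through to the existing JSON handling.
wss.on("connection", (socket) => {
  socket.on("message", (data, isBinary) => {
    if (isBinary && data[0] === 0x42) {
      const headerLen = data[1];
      const header = JSON.parse(data.slice(2, 2 + headerLen).toString());
      const fileBytes = data.slice(2 + headerLen);
      // ...look up the client named in header.to and forward fileBytes to it...
      return;
    }
    // otherwise: parse data.toString() as the normal stringified JSON message
  });
});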
According to the socket.io documentation, you can send either strings, Buffers, or a mix of both.
On the client side:
function uploadFile(e, socket, to) {
  let file = e.target.files[0];
  if (!file) {
    return
  }
  if (file.size > 10000000) {
    alert('File should be smaller than 10MB')
    return
  }
  var reader = new FileReader();
  var rawData = new ArrayBuffer();
  reader.onload = function (e) {
    rawData = e.target.result;
    socket.emit("send_message", {
      type: 'attachment',
      data: rawData
    }, (result) => {
      alert("Server has received file!")
    });
    alert("the File has been transferred.")
  }
  reader.readAsArrayBuffer(file);
}
On the server side:
socket.on('send_message', async (data, cb) => {
  if (data.type == 'attachment') {
    console.log('Found binary data')
    cb("Received file successfully.")
    return
  }
  // Process other business...
});
I am using plain WebSocket without socket.io, where you cannot mix content in a single message (it's either a string or binary). My working solution looks like this:
CLIENT:
import { serialize } from 'bson';
import { Buffer } from 'buffer';

const reader = new FileReader();
let rawData = new ArrayBuffer();
ws = new WebSocket(...)

reader.onload = (e) => {
  rawData = e.target.result;
  const bufferData = Buffer.from(rawData);
  const bsonData = serialize({ // whatever js Object you need
    file: bufferData,
    route: 'TRANSFER',
    action: 'FILE_UPLOAD',
  });
  ws.send(bsonData);
}
Then on the Node server side, the message is caught and parsed like this:
const dataFromClient = deserialize(wsMessage, { promoteBuffers: true }) // edited
fs.writeFile(
  path.join('../server', 'yourfiles', 'yourfile.txt'),
  dataFromClient.file, // edited
  'binary',
  (err) => {
    console.log('ERROR!!!!', err);
  }
);
The crucial part is the promoteBuffers option in the deserialize function.

How can I get my Express app to respect the Orientation EXIF data on upload?

I have an Express application that uses multer to upload images to an S3 bucket. I'm not doing anything special, just a straight upload, but when they are displayed in the browser some of the iPhone images are sideways.
I know this is technically a browser bug and Firefox now supports the image-orientation rule, but Chrome still displays the images on their side.
Is there a way I can have Express read the EXIF data and just rotate them before uploading?
Right, I figured it out. I used a combination of JavaScript Load Image and the FormData API.
First I'm using Load Image to get the orientation of the image from the EXIF data and rotate it. I'm then taking the canvas output that Load Image provides and converting that to a blob (you may also need the .toBlob() polyfill for iOS, as it does not support this yet).
That blob is then attached to the FormData object and I'm also putting it back in the DOM for a file preview.
// We need a new FormData object for submission.
var formData = new FormData();
// A FileReader for the preview (assumed; the original snippet uses `reader` without declaring it).
var reader = new FileReader();

// Load the image.
loadImage.parseMetaData(event.target.files[0], function (data) {
  var options = {};
  // Get the orientation of the image.
  if (data.exif) {
    options.orientation = data.exif.get('Orientation');
  }
  // Load the image.
  loadImage(event.target.files[0], function (canvas) {
    canvas.toBlob(function (blob) {
      // Set the blob to the image form data.
      formData.append('image', blob, 'thanksapple.jpg');
      // Read it out and stop loading.
      reader.readAsDataURL(blob);
      event.target.labels[0].innerHTML = labelContent;
    }, 'image/jpeg');
  }, options);

  reader.onload = function (loadEvent) {
    // Show a little image preview.
    $(IMAGE_PREVIEW).attr('src', loadEvent.target.result).fadeIn();
    // Now deal with the form submission.
    $(event.target.form).submit(function (event) {
      // Do it over ajax.
      uploadImage(event, formData);
      return false;
    });
  };
});
Now for the uploadImage function, which I'm using jQuery's AJAX method for. Note the processData and contentType flags; they are important.
function uploadImage(event, formData) {
  var form = event.target;
  $.ajax({
    url: form.action,
    method: form.method,
    processData: false,
    contentType: false,
    data: formData
  }).done(function (response) {
    // And we're done!
  });
  // Remove the event listener.
  $(event.target).off('submit');
}
All the info is out there but it's spread across multiple resources, hopefully this will save someone a lot of time and guessing.
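For completeness: if you would rather handle this on the Express side, as the question originally asked, one common approach (not the one used above, so treat this as a hedged sketch; the route path and field name are made up) is the sharp library, whose rotate() with no arguments auto-orients the image from its EXIF Orientation tag before you hand the buffer to S3:

const express = require("express");
const multer = require("multer");
const sharp = require("sharp");

const app = express();
const upload = multer({ storage: multer.memoryStorage() }); // keep the upload in memory as a Buffer

app.post("/upload", upload.single("image"), async (req, res) => {
  // rotate() with no arguments reads the EXIF Orientation tag and bakes the rotation into the pixels
  const oriented = await sharp(req.file.buffer).rotate().toBuffer();
  // ...upload `oriented` to S3 here instead of req.file.buffer...
  res.sendStatus(201);
});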
