I'm setting up an Express.js endpoint that decodes a base64 string sent from the client (a camera recording), which is essentially an encoded webm video.
To validate this, I am trying to create a .webm video file from the encoded base64 string on the server. Unfortunately, the resulting video file cannot be played; it fails with the error "No video with supported format and MIME type found" (Ubuntu 18.04).
I have had no issues playing other .webm files, and the base64 data URL itself is definitely correct, since I use it on the client as the source of a video element to replay what was recorded and that works fine.
I suspect the issue is something wrong with the code that generates the file in the server route:
import express from "express";
import multer from "multer";
import fs from "fs";

const router = express.Router();
// ...
router.post("/upload", multer().fields([]), (req, res) => {
  const formData = req.body;
  // formData.vid_string is the base64 encoded string
  fs.writeFileSync(
    "./test.webm",
    // keep only the data after the "data:video/webm;base64," prefix
    Buffer.from(formData.vid_string.split(",")[1], "base64")
  );
  res.sendStatus(200);
});
I have also tried changing the fs.writeFileSync call to pass "base64" as a third argument, but the result is the same.
Any input would be appreciated! Thanks!
I am pretty sure the correct way to do this is to remove the data-URL prefix from the base64 string (if applicable in your situation, that is the part declaring the "video/webm" MIME type and the base64 marker, up to and including the comma) and then call atob() on the remaining string. The result of atob() is the binary-string version of the video; simply write that string to your new file.
However, if you have any control over how the video is sent to the Node server, I would recommend converting the base64 data URL to a binary payload on the client side, since base64 encoding uses roughly a third more space than the raw bytes it represents.
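For example, here is a minimal client-side sketch of that idea (the function and variable names are placeholders, not from the question): strip the data-URL prefix, call atob() to get the binary string, pack the bytes into a Blob, and upload the raw data instead of base64 text.
// Hypothetical client-side helper: convert the recorder's data URL into a raw Blob
function dataUrlToBlob(dataUrl) {
  // "data:video/webm;base64,AAAA..." -> ["data:video/webm;base64", "AAAA..."]
  const [header, base64Data] = dataUrl.split(",");
  const mime = header.match(/data:(.*?);base64/)[1]; // e.g. "video/webm"
  const binaryString = atob(base64Data);              // binary-string version of the video
  const bytes = new Uint8Array(binaryString.length);
  for (let i = 0; i < binaryString.length; i++) {
    bytes[i] = binaryString.charCodeAt(i);
  }
  return new Blob([bytes], { type: mime });
}

// Usage sketch: send the raw bytes as multipart form data instead of a base64 field
// const formData = new FormData();
// formData.append("video", dataUrlToBlob(vidString), "recording.webm");
// fetch("/upload", { method: "POST", body: formData });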
Related
I captured a voice message from WhatsApp and saved it as a wav file using Node.js. But when I pass this wav file to the Azure Speech to Text JavaScript SDK for speech translation, it returns nothing. I also tried the file in the Azure demo app, where I got the error "Cannot Recognize Speech Error: Error occurred while processing 'audio.wav'. Invalid WAV header in file, RIFF was not found".
(Screenshot of the converted audio file information omitted.)
Encoding code
// requires assumed from the snippet: base64-stream, request, and get-stream
const base64 = require('base64-stream');
const request = require('request');
const getStream = require('get-stream');
const fs = require('fs');

// inside a generator/co-wrapped function, hence the `yield`
var encoder = new base64.Base64Encode();
var b64s = request(options).pipe(encoder);
var strBase64 = yield getStream(b64s);

const wavUrl = 'data:audio/wav;codecs=pcm;base64,' + strBase64;
const buffer = Buffer.from(wavUrl.split('base64,')[1], // only use encoded data after "base64,"
  'base64');
fs.writeFileSync('./audio.wav', buffer);
Does anyone have any idea about this?
This problem may occur because of a WAV header size limit in the JavaScript SDK.
There is a pull request to raise this limit: microsoft/cognitive-services-speech-sdk-js#328.
Or, in some cases, you can solve it by simply converting the audio sample rate from 48 kHz to 16 kHz.
Refer to the pull request linked above for more information.
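If you go the resampling route, here is a minimal sketch, assuming ffmpeg is installed and available on the PATH (the file names simply follow the snippet above):
// Resample the saved file to 16 kHz, mono, 16-bit PCM WAV
const { execFileSync } = require('child_process');

execFileSync('ffmpeg', [
  '-y',                   // overwrite the output if it exists
  '-i', './audio.wav',    // input written by the code above
  '-ar', '16000',         // resample from 48 kHz to 16 kHz
  '-ac', '1',             // single audio channel
  '-acodec', 'pcm_s16le', // 16-bit PCM with a standard RIFF/WAV header
  './audio-16k.wav'
]);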
I'm working with the Etsy API, uploading images like in this example, and it requires the images to be in binary format. Here is how I'm getting the image binary data:
async function getImageBinary(url) {
  const imageUrlData = await fetch(url);
  const buffer = await imageUrlData.buffer();
  return buffer.toString("binary");
}
However, Etsy says it is not a valid image file. How can I get the image into the correct format, or produce valid binary data?
Read this for a working example of the Etsy API:
https://github.com/etsy/open-api/issues/233#issuecomment-927191647
The Etsy API is buggy and its guide is inconsistent. You might think you need the 'binary' encoding for the buffer because the docs say the data type is string, but you actually don't; just leave the buffer as it is (the default encoding).
Also, there is currently a bug with image upload; try removing the Content-Type header. The link above covers this in more detail.
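A minimal sketch of the adjusted flow, assuming node-fetch v2 and the form-data package (the field name and filename are placeholders, not taken from the Etsy docs):
const fetch = require('node-fetch');   // v2, whose Response has .buffer()
const FormData = require('form-data');

// Fetch the image as a raw Buffer; no .toString('binary') conversion
async function getImageBuffer(url) {
  const res = await fetch(url);
  return res.buffer();
}

// Build the multipart form that gets posted to the Etsy endpoint
async function buildUploadForm(imageUrl) {
  const form = new FormData();
  form.append('image', await getImageBuffer(imageUrl), { filename: 'listing.jpg' });
  // Use `form` as the request body and spread form.getHeaders() into the
  // request headers instead of setting Content-Type by hand.
  return form;
}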
I have a server running on Node.js, and I have the following piece of code to handle a POST request:
form.on('file', function (field, file) {
  var RecordingInfo = JSON.parse(file.name);
  ...
when I tried to upload a file I got the following exception:
undefined:1
"}
SyntaxError: Unexpected end of input
at Object.parse (native)
at IncomingForm.<anonymous> (.../root.js:31:34)
...
Searching around the web, I found that this exception can be caused when the data arrives in chunks and the event fires after the first chunk arrives, before all the data is available. The thing is, after a little testing I found that from Chrome I can upload large files (I tried a 1.75 GB file) without any problem, while Firefox crashes the server with a 6 KB file.
My question is: why are they different?
A sample capture can be downloaded from here. The first POST is from Chrome, the second from Firefox.
The complete file.name string before uploading is:
// chrome
"{"subject":"flksajfd","lecturer":"אבישי וינר","path":"/גמרא","fileType":".png"}"
// firefox
"{"subject":"fdsa","lecturer":"אלקס ציקין","path":"/גמרא","fileType":".jpg"}"
(The data submitted is not the same, but I don't think it matters)
Chrome is encoding double-quotes in the JSON-encoded "filename" as %22 while Firefox is encoding them as \".
Your file upload parsing library, Formidable, explicitly truncates the filename from the last \ character. It expects double-quotes to be encoded as %22 although RFC 2616 allows backslash-escaped quotes like Firefox has implemented. You can consider this a bug in Formidable. The result is that the following JSON string:
'{"subject":"fdsa",...,"fileType":".jpg"}'
...is encoded as follows:
'{%22subject%22:%22fdsa",...,%22fileType%22:%22.jpg%22}' // Chrome
'{\"subject\":\"fdsa\",...\"fileType\":\".jpg\"}' // Firefox
...and then decoded by Formidable:
'{"subject":"fdsa",..."fileType":".jpg"}' // Chrome
'"}' // Firefox
To fix the issue you have a few choices:
Raise the issue with Formidable to correctly handle backslash-escaped quoted-value strings (or fix it yourself and submit a pull request).
Send the JSON payload in a separate part of the FormData object, e.g. using a Blob (see the sketch after the client/server snippets below).
Transliterate all double-quote characters in your JSON-format filename to a 'safe' character that will not appear elsewhere in the string (I chose ^ as an example); replace the quote client-side and reinstate it server-side as follows.
Client:
var formData = new FormData();
formData.append('file', $scope.recording, JSON.stringify(RecordingInfo).replace(/"/g, '^'));
Server
form.on('file', function (field, file) {
  var RecordingInfo = JSON.parse(file.name.replace(/\^/g, '"'));
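For option 2 above, a minimal sketch (the part and file names are placeholders) of sending the metadata as its own multipart part instead of packing it into the filename:
// Client: put the JSON into a dedicated part
var metadataBlob = new Blob([JSON.stringify(RecordingInfo)], { type: 'application/json' });
var formData = new FormData();
formData.append('metadata', metadataBlob, 'metadata.json');
formData.append('file', $scope.recording, 'recording.webm');
// Server: read the 'metadata' part and JSON.parse its contents
// instead of parsing file.name.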
I would like to store some documents in a database as base64 strings. Then when those docs are requested using HTTP, I would like ExpressJS to decode the base64 docs and return them. So something like this:
app.get('/base64', function (req, res) {
  // pdf is my base64 encoded string that represents a document
  var buffer = new Buffer(pdf, 'base64');
  res.send(buffer);
});
The code is simply to give an idea of what I'm trying to accomplish. Do I need to use a stream for this? If so, how would I do that? Or should I be writing these docs to a temp directory and then serving up the file? Would be nice to skip that step if possible. Thanks!
UPDATE: Just to be clear, I would like this to work with a typical HTTP request, so the user clicks a link in the browser and is taken to a URL that returns a file from the database. It seems like it must be possible: Microsoft SharePoint stores serialized files in a SQL database and returns those files over HTTP requests, and I don't believe it writes them all to a temp location first. I'm feeling like a Node.js stream may be the answer, but I'm not very familiar with streaming.
Before saving a file representation to the DB, you can just use the toString method with base64 encoding:
var base64pdf = pdf.toString('base64');
After you get the base64 file representation from the DB, use a Buffer as follows to convert it back to a file:
var decodedFile = new Buffer(base64pdf, 'base64');
More information on Buffer usage can be found here: NodeJS Buffer.
As for how to send a buffer from the Express server to the client, Socket.IO should solve this issue.
Using socket.emit:
Emits an event to the socket identified by the string name. Any other parameters can be included. All datastructures are supported, including Buffer. JavaScript functions can't be serialized/deserialized.
var io = require('socket.io')();
io.on('connection', function(socket){
  socket.emit('an event', { some: 'data' });
});
The relevant documentation is on the socket.io website.
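If you prefer to stay with a plain HTTP route, as the question describes, here is a minimal sketch (not part of the original answer; getPdfBase64FromDb is a hypothetical lookup that resolves to the stored base64 string):
const express = require('express');
const app = express();

app.get('/base64/:id', async function (req, res) {
  const base64pdf = await getPdfBase64FromDb(req.params.id); // hypothetical DB lookup
  const decodedFile = Buffer.from(base64pdf, 'base64');
  res.set('Content-Type', 'application/pdf'); // adjust to the stored document type
  res.send(decodedFile); // Express sends Buffer bodies directly, no temp file needed
});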
I am trying to send a parameter after converting it to base64. The definition of the geddy.js route:
router.get(routing_prefix+'/gogetthecart/:data').to('Main.gogetthecart');
On the client side, in JavaScript, I generate base64 JSON data with var jsonb64 = btoa(JSON.stringify(params)); and then I call a URL that looks something like this:
http://www.mydomain.com//gogetthecart/GVudGl...aWNo=
I get Error: 404 Not Found. But if I manually delete the = from the end of the data, it works.
Solved by the community in the repo's issues (https://github.com/geddy/geddy/issues/556), as Kieran said:
I looked into adding support for base64 encoded vars to Barista directly, but some characters in the b64 spec are reserved in the URI spec. I don't feel comfortable making that the default behaviour. However! You can simply override the behaviour to support this use case:
router
  .get( routing_prefix+'/gogetthecart/:data')
  .to('Main.gogetthecart')
  .where({
    data: /[\w\-\/+]+={0,2}/ // base64-safe regex condition
  })
and that should do the trick!
I added a test here:
https://github.com/kieran/barista/blob/master/tests/barista.test.js#L812
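For completeness, a minimal sketch (not from the quoted answer; it follows the standard Geddy controller scaffold, so treat the details as assumptions) of decoding the matched :data parameter back into JSON inside the controller:
var Main = function () {
  this.gogetthecart = function (req, resp, params) {
    // params.data is the base64 string matched by the route above
    var json = Buffer.from(params.data, 'base64').toString('utf8');
    var cart = JSON.parse(json);
    this.respond({ cart: cart }, { format: 'json' });
  };
};

exports.Main = Main;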