How to readFile() with async / execFile() in Node.js

Thanks in advance.
I'm creating an Electron-Create-React-App using electron-forge on Windows 10 Pro and am stuck with using async functions with execFile and readFile().
I want to achieve the following:-
main process - Receive a buffer of a screen capture (video) from the renderer process.
Create a temporary file and write the buffer to a .mp4 file.
Crop the video (based on x:y:width:height) using ffmpeg (installed in Electron as a binary).
Output = .mp4 file in temporary directory
Read the cropped .mp4 file using fs.readFile() (as a base64 encoded buffer)
Send the buffer to another renderer screen.
Delete temp file.
Q: I've managed to do most of it but cannot access the cropped .mp4 file in the temp directory.
I've tried the following:-
Electron main process
const fs = require('fs').promises
const path = require('path')

ipcMain.on("capture:buffer", async (video_object) => {
  const { x_pos, y_pos, window_width, window_height, buffer } = video_object
  try {
    const dir = await fs.mkdtemp(await fs.realpath(os.tmpdir()) + path.sep)
    const captured_video_file_path = path.join(dir, "screen_capture_video.mp4")
    // This works
    await fs.writeFile(captured_video_file_path, buffer, (error, stdout, stderr) => {
      if (error) {
        console.log(error)
      }
      console.log("Screen Capture File written")
    })
    // This also works
    execFile(`${ffmpeg.path}`,
      ['-i', `${captured_video_file_path}`, '-vf',
        `crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
        `${path.join(dir, 'cropped_video.mp4')}`],
      (error, stdout, stderr) => {
        if (error) {
          console.log(error.message)
        }
        if (stderr) {
          console.log(stderr)
        }
        console.log("Cropped File created")
      })
    // This code onwards doesn't work
    await fs.readFile(path.join(dir, "cropped_video.mp4"), 'base64', (error, data) => {
      if (error) {
        console.log(error)
      }
      // To renderer
      mainWindow.webContents.send("main:video_buffer", Buffer.from(data))
    })
  } catch (error) {
    console.log(error)
  } finally {
    fs.rmdir(dir, { recursive: true })
  }
})
When trying to read the file I get the following error:
[Error: ENOENT: no such file or directory, open 'C:\Users\XXXX\XXXXX\XXXXX\temp\temp_eYGMCR\cropped_video.mp4']
I've checked that the correct path exists with console.log.
I suspect it is a 'simple' issue with using async / execFile() properly but don't know exactly where I am making a silly mistake.
Any help would be appreciated.
Thanks.

Because at the time fs.readFile is called, execFile may not have finished yet.
Untested, but you may want to create a promise and wait for execFile to complete before proceeding, and see whether that works:
await new Promise(resolve => {
  execFile(`${ffmpeg.path}`,
    ['-i', `${captured_video_file_path}`, '-vf',
      `crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
      `${path.join(dir, 'cropped_video.mp4')}`],
    (error, stdout, stderr) => {
      if (error) {
        console.log(error.message)
      }
      if (stderr) {
        console.log(stderr)
      }
      console.log("Cropped File created")
      resolve() // this tells `await` it's ready to move on
    })
})
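Alternatively (also untested), Node's util.promisify can wrap execFile so that it returns a promise, which rejects if ffmpeg exits with an error; the whole sequence then reads as plain awaits, reusing the variables from the question:

const util = require('util')
const { execFile } = require('child_process')
const execFileAsync = util.promisify(execFile)

// ...inside the ipcMain handler, after screen_capture_video.mp4 has been written
const cropped_video_file_path = path.join(dir, 'cropped_video.mp4')
await execFileAsync(`${ffmpeg.path}`, [
  '-i', captured_video_file_path,
  '-vf', `crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
  cropped_video_file_path,
])
// fs is require('fs').promises as in the question; with no encoding given this resolves to a Buffer
const video_buffer = await fs.readFile(cropped_video_file_path)
mainWindow.webContents.send("main:video_buffer", video_buffer)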

Thanks for the pointers guys.
Here's the solution I found.
Another big problem with safely creating and removing temporary directories in Electron is that fs.rmdir() doesn't work when using Electron-Forge / Builder, due to an issue with ASAR files (ASAR is the archive format Electron apps are packaged in). The workaround used below is to toggle process.noAsar around the directory removal.
const fsPromises = require('fs').promises

ipcMain.on("capture:buffer", async (video_object) => {
  const { x_pos, y_pos, window_width, window_height, buffer } = video_object
  const temp_dir = await fsPromises.mkdtemp(await fsPromises.realpath(os.tmpdir()) + path.sep)
  const captured_video_file_path = path.join(temp_dir, "screen_capture_video.mp4")
  try {
    await fsPromises.writeFile(captured_video_file_path, buffer)
  } catch (error) {
    console.error(error)
  }
  // note no callback req'd as per jfriend's advice
  let child_object = execFile(`${ffmpeg.path}`,
    ['-i', `${captured_video_file_path}`, '-vf',
      `crop=${window_width}:${window_height}:${x_pos}:${y_pos}`,
      `${path.join(temp_dir, 'cropped_video.mp4')}`],
    (error, stdout, stderr) => {
      if (error) {
        console.log(error.message)
      }
      if (stderr) {
        console.log(stderr)
      }
      console.log("Cropped File created")
    })
  child_object.on("close", async () => {
    try {
      const video_buffer = await fsPromises.readFile(path.join(temp_dir, "cropped_video.mp4"))
      // To renderer
      mainWindow.webContents.send("main:video_buffer", video_buffer)
    } catch (error) {
      console.log(error)
    } finally {
      // process.noAsar disables Electron's ASAR path handling so the temp dir can be removed
      process.noAsar = true
      await fsPromises.rmdir(temp_dir, { recursive: true }).catch(error => console.log(error))
      console.log("Done !!!")
      process.noAsar = false
    }
  })
})

Related

How to store output of ffmpeg in a variable using NodeJS / execFile and ffmpeg?

I'm running an Electron app which uses the 'ffmpeg-static-electron' package to process a local video file.
I am trying to store the output as a buffer in a variable for subsequent processing.
Using IPC, I can successfully execute the following in the main.js file (running Node), which produces a cropped video file.
const { execFile } = require("child_process")
const ffmpeg = require("ffmpeg-static-electron")

ipcMain.on("crop_video", () => {
  execFile(
    `${ffmpeg.path}`,
    [
      "-i",
      `${testVideoCropPath}`,
      "-filter:v",
      `crop=600:600:100:100`,
      "-preset",
      "fast",
      "-progress",
      "pipe:1",
      `${path.join(desktopDirPath, "cropped-video.mp4")}`,
    ],
    (error, stdout, stderr) => {
      if (error) {
        console.error("stderr", stderr);
        throw error;
      }
      console.log("Success", stdout);
    }
  );
})
Here's the output:
Success frame=238
[1] fps=0.0
...
[1] dup_frames=0
[1] drop_frames=0
[1] speed=8.23x
[1] progress=end
What I wish to do is to store the cropped-video as a buffer in a variable.
To experiment, I tried:
ipcMain.on("crop_video", () => {
const test = execFile(
`${ffmpeg.path}`,
[
"-i",
`${testVideoCropPath}`,
"-filter:v",
`crop=600:600:100:100`,
"-preset",
"fast",
`${path.join(desktopDirPath, "cropped-video.mp4")}`,
"pipe:1",
],
(error, stdout, stderr) => {
if (error) {
console.error("stderr", stderr);
throw error;
}
console.log("Success", stdout);
}
);
test.on('data', (data) => {console.log(typeof data)}
})
This doesn't print anything to the console.
How can I access stdout in buffer format?
Is there a way to store the output with something like:
test.on('data', (data) => {
  let videoBuffer = Buffer.from(data)
})
Thanks.
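For what it's worth, one untested sketch of a way to do this: a ChildProcess itself never emits 'data' (its stdout/stderr streams do), which is why the test.on('data', ...) listener above never fires. If ffmpeg is told to write the result to pipe:1 (stdout) and execFile is given encoding: "buffer" plus a large enough maxBuffer, the callback receives the whole cropped video as a Buffer. Plain MP4 can't be muxed to a non-seekable pipe, hence the fragmented-MP4 flags below; all other names are reused from the question.

const { execFile } = require("child_process")
const ffmpeg = require("ffmpeg-static-electron")

ipcMain.on("crop_video", () => {
  execFile(
    ffmpeg.path,
    [
      "-i", testVideoCropPath,
      "-filter:v", "crop=600:600:100:100",
      "-preset", "fast",
      "-movflags", "frag_keyframe+empty_moov", // fragmented MP4 so it can be written to a pipe
      "-f", "mp4",
      "pipe:1", // send the encoded video to stdout instead of a file
    ],
    { encoding: "buffer", maxBuffer: 512 * 1024 * 1024 }, // keep stdout as a Buffer, allow large output
    (error, stdout, stderr) => {
      if (error) {
        console.error(stderr.toString());
        return;
      }
      // stdout is a Buffer holding the whole cropped video (ffmpeg's log goes to stderr)
      console.log("cropped video bytes:", stdout.length);
    }
  );
})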

node ffmpeg module stuck more than one file

When I read more than one file it gets stuck and also hangs my PC, so I need to restart it.
With one file, or even five, it works perfectly, but not with more than five files.
If anyone knows about this issue, please let me know.
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path
const ffmpeg = require('fluent-ffmpeg')
ffmpeg.setFfmpegPath(ffmpegPath)
const testFolder = './videos/';
const fs = require('fs');
fs.readdir(testFolder, async (err, files) => {
  try {
    for (let i = 0; i < 10; i++) {
      if (files[i] != '1 Surah Fatiha Dr Israr Ahmed Urdu - 81of81.mp4') {
        let converter = await ffmpeg(`./videos/${files[i]}`)
        await converter.setStartTime('00:00:00').setDuration('30').output(`./outputfolder/${files[i]}`).on('end', function(err) {
          if (err) {
            console.log(`err durinng conversation \n ${err}`)
          } else {
            console.log(`Done ${files[i]}`);
          }
        }).on('error', function(err) {
          console.log(`error: ${files[i]}`, err)
        }).run()
      }
    }
  } catch (error) {
    console.log(error)
  }
});
Your computer is hanging because fluent-ffmpeg's run() does not return a promise, so the await has nothing to wait on: the loop kicks off every conversion at once without waiting for the previous video to finish processing, which consumes all your processing power.
I created an asynchronous function called renderVideo that wraps each conversion in a Promise, and awaited it inside a for...of loop so each video completes before the next one starts.
The code looked like this:
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path
const ffmpeg = require('fluent-ffmpeg')
ffmpeg.setFfmpegPath(ffmpegPath)
const fs = require('fs');
const testFolder = './videos/';
fs.readdir(testFolder, async (err, videos) => {
  try {
    for (let video of videos) {
      if (video != '1 Surah Fatiha Dr Israr Ahmed Urdu - 81of81.mp4') {
        await renderVideo(video)
      }
    }
  } catch (error) {
    console.log(error)
  }

  function renderVideo(video) {
    return new Promise((resolve, reject) => {
      let converter = ffmpeg(`./videos/${video}`)
      converter
        .setStartTime('00:00:00')
        .setDuration('30')
        .output(`./outputfolder/${video}`)
        .on('end', (done) => {
          resolve(done)
        })
        .on('error', (err) => {
          reject(`The video ${video} return with error: ${err}`)
        })
        .run()
    })
  }
})
I also changed the names of some variables to make sense in the current code.
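As a variation (untested), if strictly one-at-a-time is too slow, the same renderVideo helper can be run in small batches inside the same async readdir callback, so the CPU still isn't saturated by every file at once. The batch size of 2 is just an illustrative choice:

const BATCH_SIZE = 2
for (let i = 0; i < videos.length; i += BATCH_SIZE) {
  const batch = videos
    .slice(i, i + BATCH_SIZE)
    .filter(video => video != '1 Surah Fatiha Dr Israr Ahmed Urdu - 81of81.mp4')
    .map(video => renderVideo(video))
  await Promise.all(batch) // wait for the whole batch before starting the next one
}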

How to write multiple streams for one file in Node?

While learning how to do large file manipulation with Node and streams, I'm stuck in the middle of a file change: when I pass the results down to a module, I think the file is still in memory when it reaches the next module.
I get a zip from an s3 bucket locally and unzip the contents:
try {
  const stream = fs.createReadStream(zipFile).pipe(unzipper.Extract({ path }))
  stream.on('error', err => console.error(err))
  stream.on('close', async () => {
    fs.removeSync(zipFile)
    try {
      const neededFile = await dir(path) // delete files not needed from zip, rename and return named file
      await mod1(neededFile) // review file, edit and return info
      await mod2(neededFile, data) // pass down data for further changes
      return
    } catch (err) {
      console.log('error')
    }
  })
} catch (err) {
  console.log('stream error')
}
During the initial unzip I learned that there is a difference between the stream's 'close' and 'finish' events, because I could pass the file to the first module and start the manipulation, but (I guess due to the size) the output and the file never matched. After cleaning out the files I don't need, I pass the renamed file to mod1 for changes and run a synchronous write:
mod1.js:
const fs = require('fs-extra')
module.exports = file => {
  fs.readFile(file, 'utf8', (err, data) => {
    if (err) return console.log(err)
    try {
      const result = data.replace(/: /gm, `:`).replace(/(?<=location:")foobar(?=")/gm, '')
      fs.writeFileSync(file, result)
    } catch (err) {
      console.log(err)
      return err
    }
  })
}
When I tried to do the above with:
const readStream = fs.createReadStream(file)
const writeStream = fs.createWriteStream(file)

readStream.on('data', chunk => {
  const data = chunk.toString().replace(/: /gm, `:`).replace(/(?<=location:")foobar(?=")/gm, '')
  writeStream.write(data)
})
readStream.on('end', () => {
  writeStream.close()
})
the file would always be blank. After writeFileSync I proceed with the next module to search for a line ref:
mod2.js:
const fs = require('fs-extra')

module.exports = (file, data) => {
  const parseFile = fs.readFileSync(file, 'utf8')
  parseFile.split(/\r?\n/).map((line, idx) => {
    if (line.includes(data)) console.log(idx + 1)
  })
}
but the line number returned is that of the initial unzipped file, not the file that was modded by the first module. Since I thought the sync process would operate on the file, it would appear the file being referenced is in memory? These are my search results from learning about streams:
Working with Node.js Stream API
Stream
How to use stream.pipe
Understanding Streams in Node.js
Node.js Streams: Everything you need to know
Streams, Piping, and Their Error Handling in Node.js
Writing to Files in Node.js
Error handling with node.js streams
Node.js Readable file stream not getting data
Node.js stream 'end' event not firing
NodeJS streams not awaiting async
stream-handbook
How should a file be manipulated after an unzip stream and why does the second module reference the file after it was unzipped and not when it was already manipulated? Is it possible to write multiple streams synchronously?
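A likely explanation, though untested against the full project: mod1 never returns a promise (fs.readFile only takes a callback and the function returns immediately), so await mod1(neededFile) resolves before the file has been rewritten, and mod2 then reads the original content. The streaming variant ends up blank because fs.createWriteStream(file) truncates the very file the read stream is still reading. A sketch of mod1 that can actually be awaited, using fs-extra's promise API:

// mod1.js — returns a promise, so `await mod1(neededFile)` waits for the rewrite to finish
const fs = require('fs-extra')

module.exports = async file => {
  const data = await fs.readFile(file, 'utf8')
  const result = data
    .replace(/: /gm, `:`)
    .replace(/(?<=location:")foobar(?=")/gm, '')
  await fs.writeFile(file, result)
}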

Cannot convert video file to audio file inside AWS lambda function using Node js

I cannot convert a video file into an audio file inside an AWS Lambda function using Node.js. When I run my Lambda function it executes without throwing any errors, but the audio file is still 0 MB in size. I am not able to find the bug or any issue in my code.
Here is my code:
const fs = require('fs');
const childProcess = require('child_process');
const AWS = require('aws-sdk');
const path = require('path');

AWS.config.update({
  region: 'us-east-2'
});
const s3 = new AWS.S3({ apiVersion: '2006-03-01' });

exports.handler = (event, context, callback) => {
  process.env.PATH = process.env.PATH + ':/tmp/';
  process.env['FFMPEG_PATH'] = '/tmp/ffmpeg';
  const BIN_PATH = process.env['LAMBDA_TASK_ROOT'];
  process.env['PATH'] = process.env['PATH'] + ':' + BIN_PATH;
  childProcess.exec(
    'cp /var/task/ffmpeg /tmp/.; chmod 755 /tmp/ffmpeg;',
    function (error, stdout, stderr) {
      if (error) {
        console.log('Error occured', error);
      } else {
        var ffmpeg = '/tmp/ffmpeg';
        var createStream = fs.createWriteStream("/tmp/video.mp3");
        createStream.end();
        var params = {
          Bucket: "test-bucket",
          Key: event.Records[0].s3.object.key
        };
        s3.getObject(params, function (err, data) {
          if (err) {
            console.log("Error", err);
          }
          fs.writeFile("/tmp/vid.mp4", data.Body, function (err) {
            if (err) console.log(err.code, "-", err.message);
            return callback(err);
          }, function () {
            try {
              var stats = fs.statSync("/tmp/vid.mp4");
              console.log("size of the file1 ", stats["size"]);
              try {
                console.log("Yeah");
                const inputFilename = "/tmp/vid.mp4";
                const mp3Filename = "/tmp/video.mp3";
                // Convert the FLV file to an MP3 file using ffmpeg.
                const ffmpegArgs = [
                  '-i', inputFilename,
                  '-vn', // Disable the video stream in the output.
                  '-acodec', 'libmp3lame', // Use Lame for the mp3 encoding.
                  '-ac', '2', // Set 2 audio channels.
                  '-q:a', '6', // Set the quality to be roughly 128 kb/s.
                  mp3Filename,
                ];
                try {
                  const process = childProcess.spawnSync(ffmpeg, ffmpegArgs);
                  console.log("stdout ", process.stdout);
                  console.log("stderr ", process.stderr);
                  console.log("tmp files ");
                  fs.readdir('/tmp/', (err, files) => {
                    files.forEach(file => {
                      var stats = fs.statSync(`/tmp/${file}`);
                      console.log("size of the file2 ", stats["size"]);
                      console.log(file);
                    });
                  });
                } catch (e) {
                  console.log("error while converting video to audio ", e);
                }
                // return process;
              } catch (e) {
                console.log(e);
              }
            } catch (e) {
              console.log("file is not complete", e);
            }
          }, function () {
            console.log("checking ");
            var stats = fs.statSync("/tmp/video.mp3");
            console.log("size of the file2 ", stats["size"]);
          });
          return callback(err);
        });
      }
    }
  )
}
Code workflow
First of all, I downloaded the ffmpeg binary executable and put it into my project directory. After that, I compressed my project and uploaded it to the Lambda function. This Lambda function is triggered whenever new files are uploaded to an S3 bucket. I have checked the files in /tmp/ storage and the .mp3 audio file is present, but its size is 0 MB.
Note
Also, the part of my code below is never called / never reached. When I look at the CloudWatch logs I can't see these console log messages. I don't know why this function is not being called.
function () {
  console.log("checking ");
  var stats = fs.statSync("/tmp/video.mp3");
  console.log("size of the file2 ", stats["size"]);
});
Please help me find the solution to this issue. I have spent a lot of time trying to figure it out but have not been able to find the solution. Any suggestions are welcome!
Thanks,
There are many limitations that can cause Lambda to bomb while trying to do conversions. The first thing you will need to do is compile ffmpeg for Amazon Linux; generally, you have to build it with static links instead of dynamic links.
Another approach is to use a Docker container and run it on AWS ECS Fargate. This lets you control the dependencies much more easily, removes the run-time limitation, and still outsources the management of the machines to AWS.
Transcoding Video on Lambda
https://intoli.com/blog/transcoding-on-aws-lambda/
Pre-compiled ffmpeg
https://johnvansickle.com/ffmpeg/
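Separately from the binary issue: fs.writeFile only ever invokes a single completion callback, and the extra trailing function passed to it in the code above is simply ignored, which is why the final "checking" function from the Note never runs. An untested sketch of the same handler with each step awaited in order, reusing the question's paths plus the s3 and childProcess objects already defined, and assuming the ffmpeg binary has already been copied to /tmp as in the question:

const fsp = require('fs').promises;

exports.handler = async (event) => {
  const params = { Bucket: "test-bucket", Key: event.Records[0].s3.object.key };
  const data = await s3.getObject(params).promise();   // download the uploaded video
  await fsp.writeFile("/tmp/vid.mp4", data.Body);      // wait until it is fully on disk

  const ffmpegArgs = [
    '-y',                                              // overwrite /tmp/video.mp3 if it already exists
    '-i', '/tmp/vid.mp4',
    '-vn', '-acodec', 'libmp3lame', '-ac', '2', '-q:a', '6',
    '/tmp/video.mp3',
  ];
  const result = childProcess.spawnSync('/tmp/ffmpeg', ffmpegArgs);
  console.log("ffmpeg stderr:", result.stderr && result.stderr.toString());

  const stats = await fsp.stat("/tmp/video.mp3");      // only check the output once ffmpeg has finished
  console.log("output size:", stats.size);
  return stats.size;
};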

Stream a zip file in nodejs

I am searching for a solution to stream my zip file in order to send it to Azure Blob Storage.
Currently this is what I have:
async _uploadStreamToBlob(zipFile, fileName) {
  const blobService = await this.__self.blobStorage.createBlobService(this.__self.blobStorageConnectionString);
  const containerName = this.__self.blobContainerName;
  const sourceFilePath = `${path.resolve(zipFile)}`;
  const streamSource = fs.createReadStream(sourceFilePath);

  return new Promise((resolve, reject) => {
    streamSource.pipe(blobService.createWriteStreamToBlockBlob(containerName, fileName, error => {
      if (error) {
        reject(error);
      } else {
        resolve({ message: `Upload of '${fileName}' complete` });
      }
    }));
  });
};
This clearly does not work, as I've tested: the file stream feeds zero bytes into the pipe, resulting in a "successful" upload of a 0-byte zip file into blob storage.
How do I stream the zipFile onto the Azure write stream? Or how do I get the bytes out of the zipFile (preserving the contents)?
If there is any other way of achieving this, I am all ears.
Thanks
Use createBlockBlobFromLocalFile directly:
blobService.createBlockBlobFromLocalFile(containerName, fileName, sourceFilePath, (err) => {
// Handle err
});
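If the call still needs to be awaited like the original _uploadStreamToBlob, it can be wrapped in a promise. An untested sketch reusing the same fields from the question (the method name _uploadFileToBlob is just illustrative):

async _uploadFileToBlob(zipFile, fileName) {
  const blobService = await this.__self.blobStorage.createBlobService(this.__self.blobStorageConnectionString);
  const containerName = this.__self.blobContainerName;
  const sourceFilePath = path.resolve(zipFile);

  return new Promise((resolve, reject) => {
    blobService.createBlockBlobFromLocalFile(containerName, fileName, sourceFilePath, (error) => {
      if (error) {
        reject(error);
      } else {
        resolve({ message: `Upload of '${fileName}' complete` });
      }
    });
  });
};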
