How do I get a file from a writable Node stream? - node.js

What am I doing
So I'm trying to reply to a command interaction with a QR code. I don't want to save the file to the bot's directory and then send it that way; instead, I have the QR code text, I'm using the QR Code package to convert it into a QR code, and now I'm trying to write the file to a Node stream.
What is the problem
The problem is I don't know how to access that file, since when I try to use { files: [returned.qrcode] } it doesn't send an image attachment.
More information
It's a writable stream, and I don't know how to get the file out of it; the function just returns a normal writable stream.
The code
The code for getting the QR Code and writing it to a stream
const id = uuid.v4();
const temp_secret = speakeasy.generateSecret({
    name: "VikkiVuk (2FA): " + id
});
let newUser = await new factorschema({
    id: id,
    temp_secret: temp_secret.base32,
    secret: "waiting"
}).save()
await schema.updateOne({ userid: user.id }, { uuid: id })
let data
const filestream = new Stream.Writable()
qrcode.toFileStream(filestream, temp_secret.otpauth_url)
const readableStream = new Stream.Readable()
readableStream.push(null)
readableStream.pipe(filestream)
filestream.end()
return { userid: id, temp_secret: temp_secret, qrcode: readableStream }
Command code (the part where I try and send the attachment)
await interaction.deferReply({ ephemeral: true })
const result = await handler.register2FA(interaction.user)
console.log(result) // logs a stream, its not undefined
await interaction.editReply({ content: "Scan this qr code", ephemeral: true, files: [result.qrcode]})
Logged stream:
Readable {
_readableState: ReadableState {
objectMode: false,
highWaterMark: 16384,
buffer: BufferList { head: null, tail: null, length: 0 },
length: 0,
pipes: [ [Writable] ],
flowing: true,
ended: true,
endEmitted: false,
reading: false,
constructed: true,
sync: true,
needReadable: false,
emittedReadable: true,
readableListening: false,
resumeScheduled: true,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
destroyed: false,
errored: null,
closed: false,
closeEmitted: false,
defaultEncoding: 'utf8',
awaitDrainWriters: null,
multiAwaitDrain: false,
readingMore: false,
decoder: null,
encoding: null,
[Symbol(kPaused)]: false
},
_events: [Object: null prototype] {
end: [Function: bound onceWrapper] { listener: [Function: onend] },
data: [Function: ondata]
},
_eventsCount: 2,
_maxListeners: undefined,
[Symbol(kCapture)]: false
}

Firstly, transform your stream like so:
const { Transform } = require("stream")

function createStream() {
    const stream = new Transform({
        transform(chunk, encoding, callback) {
            this.push(chunk)
            callback()
        }
    })
    qrcode.toFileStream(stream, temp_secret.otpauth_url)
    return stream
}
The MessageAttachment constructor in discord.js accepts streams (and buffers), so you can turn your stream into a .png attachment like so:
const { MessageAttachment } = require("discord.js");

const qrStream = createStream();
const attachment = new MessageAttachment(qrStream, 'QRCode.png') // then attach it like so:
await interaction.editReply({
    content: "Scan this QR code below.",
    ephemeral: true,
    files: [attachment]
})
The constructor accepts three parameters: attachment (your stream or buffer), name (the file name), and data (an APIAttachment object, which also offers options such as the width and height of the image; the specifics are listed in the documentation).

Related

Parse csv file from S3 using Lambda and Node Stream

I'm trying to code a lambda that triggers an s3 bucket and gets a CSV file when it is uploaded, and parse this file.
I'm using: Node 14.x
This is the code:
import { S3Event } from 'aws-lambda';
import { S3 } from 'aws-sdk';
import * as csv from 'fast-csv';

const s3 = new S3({ apiVersion: 'latest' });

export async function hello(event: S3Event, context, cb) {
    event.Records.forEach(async (record) => {
        const bucket = record.s3.bucket.name;
        const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
        const params: S3.GetObjectRequest = {
            Bucket: bucket,
            Key: key,
        };
        const stream = s3.getObject(params).createReadStream();
        console.log({ stream });
        csv.parseStream(stream, {
            headers: true
        }).on('data', data => { console.log(data); })
            .on('error', error => console.error(error))
            .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));
        console.log('processo 01 acabou!');
    });
}
When I execute this lambda I'm not receiving anything. In console.log(stream) I'm receiving a PassThrough object...
stream: PassThrough {
_readableState: ReadableState {
objectMode: false,
highWaterMark: 16384,
buffer: BufferList { head: null, tail: null, length: 0 },
length: 0,
pipes: [],
flowing: null,
ended: false,
endEmitted: false,
reading: false,
sync: false,
needReadable: false,
emittedReadable: false,
readableListening: false,
resumeScheduled: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
destroyed: false,
errored: null,
closed: false,
closeEmitted: false,
defaultEncoding: 'utf8',
awaitDrainWriters: null,
multiAwaitDrain: false,
readingMore: false,
dataEmitted: false,
decoder: null,
encoding: null,
[Symbol(kPaused)]: null
},
_events: [Object: null prototype] { prefinish: [Function: prefinish] },
_eventsCount: 1,
_maxListeners: undefined,
_writableState: WritableState {
objectMode: false,
highWaterMark: 16384,
finalCalled: false,
needDrain: false,
ending: false,
ended: false,
finished: false,
destroyed: false,
decodeStrings: true,
defaultEncoding: 'utf8',
length: 0,
writing: false,
corked: 0,
sync: true,
bufferProcessing: false,
onwrite: [Function: bound onwrite],
writecb: null,
writelen: 0,
afterWriteTickInfo: null,
buffered: [],
bufferedIndex: 0,
allBuffers: true,
allNoop: true,
pendingcb: 0,
prefinished: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
errored: null,
closed: false
},
allowHalfOpen: true,
[Symbol(kCapture)]: false,
[Symbol(kTransformState)]: {
afterTransform: [Function: bound afterTransform],
needTransform: false,
transforming: false,
writecb: null,
writechunk: null,
writeencoding: null
}
}
}
I have a picture from my CloudWatch
Can anyone help me, and tell me what I'm doing wrong?
The issue with your code is that it's not correctly dealing with the asynchronous nature of JavaScript. Specifically, your code is exiting before any asynchronous activity has completed.
Your Lambda function is async so it should return a promise that is ultimately settled (fulfilled or rejected) when your processing of the S3 object(s) has completed. This allows the AWS Lambda runtime environment to await completion.
For example:
exports.handler = async function(event, context) {
    const promises = event.Records.map((record) => {
        const Bucket = record.s3.bucket.name;
        const Key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
        const params = { Bucket, Key };
        const stream = s3.getObject(params).createReadStream();
        return new Promise(function(resolve, reject) {
            csv.parseStream(stream, {
                headers: true
            }).on('data', (data) => {
                console.log(data);
            }).on('error', (error) => {
                console.error(error);
                reject(error);
            }).on('end', (rows) => {
                console.log(`Parsed ${rows} rows`);
                resolve(rows);
            });
        });
    });
    return Promise.all(promises);
}
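The difference is easy to demonstrate in isolation. A small sketch (plain Node, no AWS) showing that forEach with an async callback returns before the work is done, while map plus Promise.all actually waits:

```javascript
async function withForEach(items, out) {
  // forEach ignores the promises its async callbacks return,
  // so this function resolves before any push has happened
  items.forEach(async (x) => {
    await new Promise((r) => setTimeout(r, 0));
    out.push(x);
  });
}

async function withMap(items, out) {
  // map collects the promises; Promise.all awaits them all
  await Promise.all(
    items.map(async (x) => {
      await new Promise((r) => setTimeout(r, 0));
      out.push(x);
    })
  );
}

(async () => {
  const a = [];
  await withForEach([1, 2, 3], a);
  console.log(a.length); // 0 - nothing has completed yet

  const b = [];
  await withMap([1, 2, 3], b);
  console.log(b.length); // 3 - all callbacks finished
})();
```

This is exactly why the Lambda runtime froze the original handler before the CSV parsing finished.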

How to access the file path of uploaded files in fastify

When using a form to upload some files, I can see the request in dev tools in the network inspector, specifically in the payload tab under form data, in view source.
Note that the payload below includes the file name with the path, twoItems/Screenshot...; it's this twoItems path I need to access in the API, but I can't.
Security? Err, why do I want this?
It's for a document management app; users can't be expected to create folders in the browser and then add the files one by one. They need to drag and drop nested directories of files.
------WebKitFormBoundarydJ6knkAHgNW7SIF7
Content-Disposition: form-data; name="file"; filename="twoItems/Screenshot 2022-03-11 at 08.58.24.png"
Content-Type: image/png
------WebKitFormBoundarydJ6knkAHgNW7SIF7
Content-Disposition: form-data; name="file"; filename="twoItems/Screenshot 2022-03-11 at 08.58.08.png"
Content-Type: image/png
so in the API I have a standard fastify API running
import Fastify, { FastifyInstance, RouteShorthandOptions } from "fastify";
import { Server, IncomingMessage, ServerResponse } from "http";

const fs = require("fs");
const util = require("util");
const { pipeline } = require("stream");
const pump = util.promisify(pipeline);

const fastify: FastifyInstance = Fastify({});
fastify.register(require("fastify-multipart"));
fastify.register(require("fastify-cors"), {
    methods: ["GET", "PUT", "POST"],
});

const dir = "./files";
if (!fs.existsSync(dir)) {
    fs.mkdirSync(dir);
}

fastify.post("/upload", async (req: any, reply) => {
    console.log(req);
    const parts = await req.files();
    for await (const part of parts) {
        console.log(part); //---------------- LOG BELOW
        await pump(part.file, fs.createWriteStream(`./files/${part.filename}`));
    }
    reply.send();
});

const start = async () => {
    try {
        await fastify.listen(3001);
        const address = fastify.server.address();
        const port = typeof address === "string" ? address : address?.port;
    } catch (err) {
        fastify.log.error(err);
        process.exit(1);
    }
};
start();
I can't find how to access the path of each item
when I log out part I get...
<ref *1> {
fieldname: 'file',
filename: 'Screenshot 2022-03-11 at 17.52.11.png',
encoding: '7bit',
mimetype: 'image/png',
file: FileStream {
_readableState: ReadableState {
objectMode: false,
highWaterMark: 16384,
buffer: BufferList { head: [Object], tail: [Object], length: 4 },
length: 208151,
pipes: [],
flowing: null,
ended: false,
endEmitted: false,
reading: false,
sync: false,
needReadable: false,
emittedReadable: false,
readableListening: false,
resumeScheduled: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
destroyed: false,
errored: null,
closed: false,
closeEmitted: false,
defaultEncoding: 'utf8',
awaitDrainWriters: null,
multiAwaitDrain: false,
readingMore: false,
dataEmitted: false,
decoder: null,
encoding: null,
[Symbol(kPaused)]: null
},
_events: [Object: null prototype] {
end: [Function (anonymous)],
limit: [Function (anonymous)]
},
_eventsCount: 2,
_maxListeners: undefined,
bytesRead: 208151,
truncated: false,
_read: [Function (anonymous)],
[Symbol(kCapture)]: false
},
fields: { file: [ [Object], [Object], [Object], [Circular *1] ] },
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
}
this is undefined...
console.log(part.path);
You need to set busboy's preservePath option:
fastify.register(require("fastify-multipart"), {
preservePath: true
});
You can find all the options here: https://github.com/fastify/busboy#busboy-methods

MongoDB find an element on NodeJS returning an object

I'm working with the MongoClient, and it would seem that whenever I try to find by query, I get a Cursor object back instead of the document, even though the same query works perfectly fine in the mongo terminal. This database initialization works for inputting data.
MongoDB terminal:
mongo
use player-db
db.players.find({"id":"1"})
Result: { "_id" : ObjectId("5f3ca631950b2f4b1f157e27"), "id" : "1", "name" : "test" }
And now in server.js:
const url = '{ommited}'
const dbName = 'player-db'
let db;
MongoClient.connect(url, { useNewUrlParser: true, useUnifiedTopology: true }, (err, client) => {
    if (err) return console.log(err)
    db = client.db(dbName)
    console.log(`Connected to Database: \n ${url}/${dbName}`)
})

/** Function that isn't working **/
const GetOne = (collection, id) => {
    let test = db.collection("players").find({"id" : "1"});
    console.log(test);
}
Expected output:
{ "_id" : ObjectId("5f3ca631950b2f4b1f157e27"), "id" : "1", "name" : "test" }
Actual output:
Cursor {
_readableState: ReadableState {
objectMode: true,
highWaterMark: 16,
buffer: BufferList { head: null, tail: null, length: 0 },
length: 0,
pipes: [],
flowing: null,
ended: false,
endEmitted: false,
reading: false,
sync: true,
needReadable: false,
emittedReadable: false,
readableListening: false,
resumeScheduled: false,
errorEmitted: false,
emitClose: true,
autoDestroy: true,
destroyed: false,
errored: false,
closed: false,
closeEmitted: false,
defaultEncoding: 'utf8',
awaitDrainWriters: null,
multiAwaitDrain: false,
readingMore: false,
decoder: null,
encoding: null,
[Symbol(kPaused)]: null
........................... etc.
You are currently getting and logging the cursor itself.
db.collection("players").find({"id" : "1"}).toArray(function(err, docs) {
    console.log(docs)
});
This should display your docs. toArray "converts" the cursor into an array of the found documents; it accepts a callback function that it executes when the query is complete. You could also use promises or async/await: await db.collection...
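For example, the async/await version might look like this. A sketch only: getOne is a hypothetical helper, and the db handle is passed in as a parameter for illustration; findOne is the driver method that resolves directly to a single document (or null), so there is no cursor to unwrap.

```javascript
// Fetch one player document by id; resolves to the document or null.
async function getOne(db, id) {
  return db.collection("players").findOne({ id: id });
}
```

When you expect several documents, `await db.collection("players").find(query).toArray()` works the same way.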

How to transform a LOB (Binary image) to send to Firebase Storage

I need to return some Oracle images that are recorded in LOB field to send them to Firebase Storage.
I am using the oracledb library with TypeScript to call a procedure that returns certain records. One of the fields is a LOB (images). I need to read this data and send the image to Firebase Storage, but I haven't managed to code it.
import { IConnection } from "oracledb";
import oracledb = require("oracledb");

oracledb.fetchAsString = [ oracledb.CLOB ];

export async function uploadImages(db: IConnection) {
    const query = `
        BEGIN
            mgglo.pck_wglo_binario.p_obter_binarios_filtro
            (
                retorno => :retorno,
                pfiltro => :pfiltro,
                pmod_in_codigo => :pmod_in_codigo,
                pcodigoempreendimento => :pcodigoempreendimento,
                pcodigobloco => :pcodigobloco,
                pcodigounidade => :pcodigounidade
            );
        END;`;
    const bindvars = {
        retorno : { dir: oracledb.BIND_OUT, type: oracledb.CURSOR },
        pfiltro : 0,
        pmod_in_codigo : 1,
        pcodigoempreendimento : 5689,
        pcodigobloco : 9645,
        pcodigounidade : 8966
    }
    const exec = await db.execute(query, bindvars);
    const row = await exec.outBinds["retorno"].getRow();
    console.log(row);
}
Return:
{ BIN_IN_CODIGO: 469,
CAT_IN_CODIGO: 63,
BIN_BO_ATIVO: 'S',
BIN_ST_MIME: 'image/png',
BIN_ST_NOME: 'Image 1.png',
BIN_LO_BINARIO:
Lob {
_readableState:
ReadableState {
objectMode: false,
highWaterMark: 16384,
buffer: [Object],
length: 0,
pipes: null,
pipesCount: 0,
flowing: null,
ended: false,
endEmitted: false,
reading: false,
sync: true,
needReadable: false,
emittedReadable: false,
readableListening: false,
resumeScheduled: false,
destroyed: false,
defaultEncoding: 'utf8',
awaitDrain: 0,
readingMore: false,
decoder: null,
encoding: null },
readable: true,
domain: null,
_events: { end: [Object], finish: [Object] },
_eventsCount: 2,
_maxListeners: undefined,
_writableState:
WritableState {
objectMode: false,
highWaterMark: 16384,
finalCalled: false,
needDrain: false,
ending: false,
ended: false,
finished: false,
destroyed: false,
decodeStrings: true,
defaultEncoding: 'utf8',
length: 0,
writing: false,
corked: 0,
sync: true,
bufferProcessing: false,
onwrite: [Function: bound onwrite],
writecb: null,
writelen: 0,
bufferedRequest: null,
lastBufferedRequest: null,
pendingcb: 0,
prefinished: false,
errorEmitted: false,
bufferedRequestCount: 0,
corkedRequestsFree: [Object] },
writable: true,
allowHalfOpen: true,
iLob:
ILob {
valid: true,
autoCloseLob: true,
type: 2007,
offset: 1,
pieceSize: 8060,
length: 814115,
chunkSize: 8060 },
close: [Function] },
BIN_ST_DESCRICAO: 'Teste Valmir',
BIN_DT_CRIACAO: 2019-05-28T13:32:37.000Z,
BIN_BO_LINK: 'N' }
FIELD: BIN_LO_BINARIO
The LOB is coming out as a Lob instance. That can be used for streaming large objects, but if the LOB is relatively small (compared to the amount of memory the Node.js process has access to), then you can override the default to get a String or a Buffer, depending on whether the LOB is a CLOB or a BLOB.
Here's an example that fetches a BLOB out as a Buffer from this post:
const getSql =
    `select file_name "file_name",
        dbms_lob.getlength(blob_data) "file_length",
        content_type "content_type",
        blob_data "blob_data"
    from jsao_files
    where id = :id`;

async function get(id) {
    const binds = {
        id: id
    };
    const opts = {
        fetchInfo: {
            blob_data: {
                type: oracledb.BUFFER
            }
        }
    };
    const result = await database.simpleExecute(getSql, binds, opts);
    return result.rows;
}
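Once the LOB arrives as a Buffer, it can be handed straight to Firebase Storage. A sketch of that upload step: uploadLobRow is a hypothetical helper, the row field names follow the record shown in the question, and bucket is assumed to be a @google-cloud/storage Bucket (which backs Firebase Storage):

```javascript
// Upload one fetched row's binary column to a storage bucket.
// `row` is shaped like the record logged in the question, e.g.
// { BIN_ST_NOME: 'Image 1.png', BIN_ST_MIME: 'image/png', BIN_LO_BINARIO: <Buffer> }
async function uploadLobRow(bucket, row) {
  const file = bucket.file(row.BIN_ST_NOME);
  await file.save(row.BIN_LO_BINARIO, { contentType: row.BIN_ST_MIME });
  return file.name;
}
```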

How to correctly save images to the filesystem with fs.writeFile?

I can't figure out how I can correctly save a file I got from formidable to the file system my server is running on.
I am able to console.log the files, but I do not know what to do with the information provided there.
app.post("/sendImages", (req, res) => {
    const files = req.files;
    Object.keys(files).forEach((key) => {
        console.log(files[key]);
        fs.writeFile('images/' + files[key].name, files[key], 'binary', (error) => {
            if (error) console.log(error);
            else console.log('image created');
        });
    })
});
This request handler creates files with the correct names, but when I try to open them in VS Code the only thing I see is [object Object].
An example of a console logged file:
File {
_events: [Object: null prototype] {},
_eventsCount: 0,
_maxListeners: undefined,
size: 3835864,
path:
'C:\\Users\\MY_USER_DIR\\AppData\\Local\\Temp\\upload_b099c61751b3b25772344e20df06a4d9',
name: '20190602_134136.jpg',
type: 'image/jpeg',
hash: null,
lastModifiedDate: 2019-06-30T15:03:22.060Z,
_writeStream:
WriteStream {
_writableState:
WritableState {
objectMode: false,
highWaterMark: 16384,
finalCalled: true,
needDrain: true,
ending: true,
ended: true,
finished: true,
destroyed: true,
decodeStrings: true,
defaultEncoding: 'utf8',
length: 0,
writing: false,
corked: 0,
sync: false,
bufferProcessing: false,
onwrite: [Function: bound onwrite],
writecb: null,
writelen: 0,
bufferedRequest: null,
lastBufferedRequest: null,
pendingcb: 0,
prefinished: true,
errorEmitted: false,
emitClose: false,
autoDestroy: false,
bufferedRequestCount: 0,
corkedRequestsFree: [Object] },
writable: false,
_events: [Object: null prototype] {},
_eventsCount: 0,
_maxListeners: undefined,
path:
'C:\\Users\\MY_USER_DIR\\AppData\\Local\\Temp\\upload_b099c61751b3b25772344e20df06a4d9',
fd: null,
flags: 'w',
mode: 438,
start: undefined,
autoClose: true,
pos: undefined,
bytesWritten: 3835864,
closed: false } }
I hope someone can tell me what I did wrong here; I am new to Node in general and still have some problems here and there :)
You should copy the files from the tmp folder to the images folder, like this (Node.js >= 8.5.0):
const fs = require('fs');
const util = require('util');
const path = require('path');
const copyFile = util.promisify(fs.copyFile);

app.post('/sendImages', async (req, res) => {
    const files = req.files;
    const results = Object.keys(files).map((key) => {
        const file = files[key];
        const dest = path.join('images/', file.name);
        return copyFile(file.path, dest);
    });
    await Promise.all(results);
    // ...
});
And if you don't want the files saved to the tmp folder first, check the API documentation for changing the uploadDir. For example, with express-formidable:
app.use(formidableMiddleware({
    encoding: 'utf-8',
    uploadDir: 'images/',
    multiples: true
}));
