SheetJS always returns the same output - Node.js

I'm having a problem parsing an ".xlsx" or ".xls" file with SheetJS ("xlsx" on npm). I don't know what I'm doing wrong, but I always get the same output:
[
{
"__EMPTY": "i"
},
{
"__EMPTY":"«Z.7¦§dÞZµìe°
I'm using an empty controller in LoopBack 4, in case you recognize the syntax, but the problem doesn't seem to be LoopBack itself, since I'm able to save the file on the server and open it without a problem.
It seems the xlsx module is unable to parse my files for some reason. Can anyone take a look and see if something is wrong?
Here is my code:
import { inject } from '@loopback/context';
import { ParamsDictionary, RequestHandler } from 'express-serve-static-core';
import * as multer from "multer";
import { unlinkSync } from "fs";
import * as xlsx from "xlsx";
import {
  requestBody,
  RestBindings,
  RequestBodyObject,
  post,
  Request,
  Response
} from '@loopback/rest';

const MULTIPART_FORM_DATA: RequestBodyObject = {
  description: 'multipart/form-data value.',
  required: true,
  content: {
    'multipart/form-data': {
      // Skip body parsing
      'x-parser': 'stream',
      schema: { type: 'object' },
    },
  },
}

export class TemplateActionsController {
  constructor() { }

  @post('/parse-template', {
    responses: {
      200: {
        content: {
          'multipart/form-data': {
            schema: {
              type: 'object',
            },
          },
        },
        description: '',
      },
    },
  })
  async parseTemplate(
    @requestBody(MULTIPART_FORM_DATA)
    request: Request,
    @inject(RestBindings.Http.RESPONSE) response: Response,
  ): Promise<object> {
    //const storage = multer.memoryStorage();
    const storage = multer.diskStorage({
      filename: (req: Request<ParamsDictionary>, file: Express.Multer.File, callback: (error: Error | null, filename: string) => void) => {
        callback(null, file.originalname);
      }
    });
    const upload = multer.default({ storage });
    return new Promise<object>((resolve, reject) => {
      let middleware: RequestHandler<ParamsDictionary> = upload.any();
      console.log('----------------------------------------------------------');
      //console.log(request);
      middleware(request as any, response as any, (err: any) => {
        if (err) return reject(err);
        let arrFiles: Express.Multer.File[];
        arrFiles = (request as Express.Request).files as Express.Multer.File[];
        console.log('----------------------------------------------------------');
        console.log(arrFiles[0]);
        let workbook: xlsx.WorkBook = xlsx.read(arrFiles[0].path);
        var sheet_name_list: string[] = workbook.SheetNames;
        let firstSheet: xlsx.WorkSheet = workbook.Sheets[sheet_name_list[0]];
        let strResult: any = xlsx.utils.sheet_to_json(firstSheet);
        console.log('----------------------------------------------------------');
        console.log(sheet_name_list);
        console.log('----------------------------------------------------------');
        console.log(strResult);
        try {
          unlinkSync(arrFiles[0].path);
        } catch (e) {
          //error deleting the file
        }
        resolve(strResult);
      });
    });
  }
}
The line that parses the file is this one:
let strResult: any = xlsx.utils.sheet_to_json(firstSheet);
My input Excel file (template.xlsx) only has simple data in the first sheet.
I can't find any other issue that looks like this anywhere.
If anyone can help, please tell me.
Much appreciated,
Omar

It turns out I was using:
let workbook: xlsx.WorkBook = xlsx.read(arrFiles[0].path);
instead of:
let workbook: xlsx.WorkBook = xlsx.readFile(arrFiles[0].path);
It was my mistake; now everything is working fine.
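For reference, a minimal sketch of the difference as I understand the SheetJS API: xlsx.readFile takes a file path, while xlsx.read expects the file contents themselves, so passing a path string to xlsx.read parses the characters of the path rather than the file, which explains the garbage "__EMPTY" output. The path below is hypothetical.

import * as xlsx from "xlsx";
import { readFileSync } from "fs";

// Option 1: let SheetJS open the file from disk itself
const wbFromPath: xlsx.WorkBook = xlsx.readFile("/tmp/template.xlsx");

// Option 2: read the bytes yourself and hand the buffer to xlsx.read
const buffer = readFileSync("/tmp/template.xlsx");
const wbFromBuffer: xlsx.WorkBook = xlsx.read(buffer, { type: "buffer" });

// Both approaches yield the same rows
const firstSheet = wbFromPath.Sheets[wbFromPath.SheetNames[0]];
console.log(xlsx.utils.sheet_to_json(firstSheet));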

Related

How do I propagate schema types to different files in Fastify?

I am trying out Fastify with TypeScript and I would like separation of concerns. Specifically, I want to separate my schemas from my controllers and routes. However, I can't manage to pass the schema types around easily.
My server creation is as follows:
import Fastify from 'fastify';
import { JsonSchemaToTsProvider } from '@fastify/type-provider-json-schema-to-ts';
import balanceRoute from './features/balance/route';

const createServer = () => {
  const server = Fastify({ logger: true }).withTypeProvider<JsonSchemaToTsProvider>();
  server.get('/healthz', async (request, reply) => {
    return reply.code(200).send({
      data: {
        status: 'OK'
      }
    });
  });
  server.register(balanceRoute, { prefix: '/balance' });
  return server;
}
My route is:
const route = async (server: FastifyTyped) => {
  server.get(
    '/:address',
    {
      schema: GetBalanceSchema
    },
    getBalanceController
  );
};
My controller is:
export const getBalanceController = async (req: FastifyRequest, res: FastifyReply) => {
  console.log('Within get balance handler');
  const address = req.params.address; // PROBLEM IS HERE
  const currentBalance = await getBalance('', '');
  res.send({ hello: 'hello' });
};
My schema is as follows:
import { FastifySchema } from 'fastify';

export const GetBalanceSchema: FastifySchema = {
  params: {
    address: { type: 'string' }
  },
  querystring: {
    chainID: { type: 'string' }
  },
  response: {
    200: {
      type: 'object',
      properties: {
        data: {
          type: 'string'
        }
      }
    }
  }
} as const;
In the controller code, I cannot get TypeScript to infer that req.params has an address field. Moving the controller inline into the route does not help either.
Any clue about how to get this working in an easy way?
Thank you in advance and regards.
That's because you've given your schema an explicit type annotation FastifySchema, which overrides the as const. You can try removing the explicit type annotation:
export const GetBalanceSchema = {
...
} as const;
Or not using as const:
export const GetBalanceSchema: FastifySchema = {
...
};
You could even use a utility function to enforce the type while retaining the original structure of the object:
function schema<S extends FastifySchema>(schema: S): S { return schema; }
export const GetBalanceSchema = schema({
...
});
And as of TypeScript 4.9, there is a new satisfies operator that we can use like this:
export const GetBalanceSchema = {
...
} satisfies FastifySchema;
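For the inference to actually show up in the handler, the params schema itself also needs to be a full JSON object schema, as far as I understand @fastify/type-provider-json-schema-to-ts. A minimal sketch under that assumption (the schema shape and route body here are illustrative, not taken from the original question):

import Fastify from 'fastify';
import type { FastifySchema } from 'fastify';
import { JsonSchemaToTsProvider } from '@fastify/type-provider-json-schema-to-ts';

// Keep the literal type (as const) while still checking it against FastifySchema (satisfies)
const GetBalanceSchema = {
  params: {
    type: 'object',
    properties: {
      address: { type: 'string' }
    },
    required: ['address']
  }
} as const satisfies FastifySchema;

const server = Fastify().withTypeProvider<JsonSchemaToTsProvider>();

server.get('/:address', { schema: GetBalanceSchema }, async (request, reply) => {
  // request.params is now inferred as { address: string }
  return reply.send({ address: request.params.address });
});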

I always get a problem if the image is already created

This project resizes images using a library called sharp. When an image is created for the first time there is no problem, but I always get the error below if it was already created.
In the imageProcessing.ts file:
import { Request, Response } from "express";
import fs, { promises as fsPromises } from 'fs';
import resizingTheImage from "./resizingTheImage";

const imageProcessing = async (req: Request, res: Response, next: Function): Promise<void | string> => {
  const widthValue = <string>req.query.width;
  const heightValue = <string>req.query.height;
  const nameFileValue = <string>req.query.namefile;
  const outputDir = `../../images/imageOutput/${nameFileValue}${widthValue}_${heightValue}.jpg`;

  const makeDir = async (): Promise<void> => {
    await fsPromises.mkdir('./images/imageOutput');
  };

  const imageProcessing = async (): Promise<void> => {
    try {
      await resizingTheImage(widthValue, heightValue, nameFileValue);
    } catch {
      res.send('There is a problem with the image processing');
    }
  };

  if (fs.existsSync(outputDir)) {
    return outputDir;
  } else {
    makeDir();
    await imageProcessing();
  }
  next();
}

export default imageProcessing;
In the resizingTheImage.ts file:
import sharp from "sharp";

const resizingTheImage = async (imageWidth: string, imageHeight: string, fileName: string): Promise<void> => {
  const imageDir: string = `./images/${fileName}.jpg`;
  const outputDir: string = `./images/imageOutput/${fileName}${imageWidth}_${imageHeight}.jpg`;
  if (isNaN(parseInt(imageWidth) && parseInt(imageHeight)) && fileName) {
    console.log("the inputs is Invalid");
  } else {
    await sharp(imageDir)
      .resize(parseInt(imageWidth), parseInt(imageHeight))
      .toFile(outputDir);
  }
}

export default resizingTheImage;
The error:
[Error: EEXIST: file already exists, mkdir 'E:\Image Processing API\images\imageOutput'] {
errno: -4075,
code: 'EEXIST',
syscall: 'mkdir',
path: 'E:\\Image Processing API\\images\\imageOutput'
}
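The error message itself points at the cause: makeDir calls fsPromises.mkdir on a directory that already exists, and the call is not awaited, so the rejection escapes the handler. A minimal sketch of one way to make the directory creation idempotent, assuming the same './images/imageOutput' layout as above:

import { promises as fsPromises } from 'fs';

// mkdir with { recursive: true } resolves quietly if the directory already exists,
// so repeated requests for an already-generated size no longer reject with EEXIST.
const ensureOutputDir = async (): Promise<void> => {
  await fsPromises.mkdir('./images/imageOutput', { recursive: true });
};

// ...and await it before resizing:
// await ensureOutputDir();
// await resizingTheImage(widthValue, heightValue, nameFileValue);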

Download file from internet and send it to S3 as stream

I am trying to download some PDFs from the internet into my Amazon S3 bucket. So far I download the files to my server and then upload them from my server to the S3 bucket, but I was curious whether I could upload them while downloading them, as a stream.
private async download(url: string, path: string): Promise<void> {
  const response = await fetch(url);
  const fileStream = createWriteStream(path);
  await new Promise((resolve, reject) => {
    response.body.pipe(fileStream);
    response.body.on('error', reject);
    fileStream.on('finish', resolve);
  });
}
and this is how I upload the file after I've downloaded it:
public async upload(path: string, name: string): Promise<string> {
  const url = `documents/${name}.pdf`;
  const params = {
    Body: createReadStream(path),
    Bucket: AWS_S3_BUCKET_NAME,
    Key: url
  };
  const data = await s3.putObject(params).promise().then(data => { console.log(data); return url; }, err => { console.log(err); return err; });
  return data;
}
I am looking for a way to merge these two functions into one that returns the S3 reply once it has finished, or throws if either the download or the upload fails.
I also wanted to ask whether it is possible to call this function multiple times in parallel and, if so, how many parallel calls are safe so as not to break the server.
Thank you in advance, Daniel!
Yes, you can. Here is a working example of downloading and uploading at the same time, using a multipart upload, for a Node environment:
import {
  AbortMultipartUploadCommandOutput,
  CompleteMultipartUploadCommandOutput,
  S3Client,
} from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';
import Axios, { AxiosResponse } from 'axios';
import mime from 'mime-types';
import { Logger } from 'pino';
import { PassThrough } from 'stream';

export class S3HandlerClient {
  private readonly PART_SIZE = 1024 * 1024 * 5; // 5 MB
  private readonly CONCURRENCY = 4;
  private readonly logger: Logger;
  private readonly client: S3Client;

  constructor(props: { logger: Logger; sdkClient: S3Client }) {
    this.logger = props.logger;
    this.client = props.sdkClient;
  }

  async uploadVideo(props: {
    input: {
      videoUrl: string;
    };
    output: {
      bucketName: string;
      fileNameWithoutExtension: string;
    };
  }): Promise<string> {
    try {
      const inputStream = await this.getInputStream({ videoUrl: props.input.videoUrl });
      const outputFileRelativePath = this.getFileNameWithExtension(
        props.output.fileNameWithoutExtension,
        inputStream,
      );
      await this.getOutputStream({
        inputStream,
        output: {
          ...props.output,
          key: outputFileRelativePath,
        },
      });
      return `s3://${props.output.bucketName}/${outputFileRelativePath}`;
    } catch (error) {
      this.logger.error({ error }, 'Error occurred while uploading/downloading file.');
      throw error;
    }
  }

  private getFileNameWithExtension(fileName: string, inputStream: AxiosResponse) {
    this.logger.info({ headers: inputStream.headers });
    return `${fileName}.${this.getFileExtensionFromContentType(
      inputStream.headers['content-type'],
    )}`;
  }

  private getFileExtensionFromContentType(contentType: string): string {
    const extension = mime.extension(contentType);
    if (extension) {
      return extension;
    } else {
      throw new Error(`Failed to get extension from 'Content-Type' header: ${contentType}.`);
    }
  }

  private async getInputStream(props: { videoUrl: string }): Promise<AxiosResponse> {
    this.logger.info({ videoUrl: props.videoUrl }, 'Initiating download');
    const response = await Axios({
      method: 'get',
      url: props.videoUrl,
      responseType: 'stream',
    });
    this.logger.info({ headers: response.headers }, 'Input stream HTTP headers');
    return response;
  }

  private async getOutputStream(props: {
    inputStream: AxiosResponse;
    output: {
      bucketName: string;
      key: string;
    };
  }): Promise<CompleteMultipartUploadCommandOutput | AbortMultipartUploadCommandOutput> {
    this.logger.info({ output: props.output }, 'Initiating upload');
    const output = props.output;
    const passThrough = new PassThrough();
    const upload = new Upload({
      client: this.client,
      params: { Bucket: output.bucketName, Key: output.key, Body: passThrough },
      queueSize: this.CONCURRENCY,
      partSize: this.PART_SIZE,
      leavePartsOnError: false,
    });
    props.inputStream.data.pipe(passThrough);
    if (this.logger.isLevelEnabled('debug')) {
      upload.on('httpUploadProgress', (progress) => {
        this.logger.debug({ progress }, 'Upload progress');
      });
    }
    return await upload.done();
  }
}
This is how you can initialize it:
import { S3Client } from '@aws-sdk/client-s3';
import pino from 'pino';

const sdkClient = new S3Client({ region: 'us-west-2' });
const client = new S3HandlerClient({ logger: pino(), sdkClient });
Example dependencies:
{
  ...
  "dependencies": {
    "@aws-sdk/client-s3": "^3.100.0",
    "@aws-sdk/lib-storage": "^3.100.0",
    "axios": "^0.27.2",
    "mime-types": "^2.1.35",
    "pino": "^7.11.0",
    "pino-lambda": "^4.0.0",
    "streaming-s3": "^0.4.5"
  },
  "devDependencies": {
    "@types/aws-lambda": "^8.10.97",
    "@types/mime-types": "^2.1.1",
    "@types/pino": "^7.0.5"
  }
}
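On the follow-up about parallel calls: there is no single safe number; each in-flight transfer buffers roughly queueSize * partSize in memory (about 20 MB with the settings above), so memory and bandwidth are the real limits. A minimal sketch of running transfers in small batches, with hypothetical URLs and bucket name and an arbitrary batch size of 3:

const videoUrls = ['https://example.com/a.pdf', 'https://example.com/b.pdf']; // hypothetical
const BATCH_SIZE = 3; // arbitrary; tune against available memory and bandwidth

async function uploadAll() {
  for (let i = 0; i < videoUrls.length; i += BATCH_SIZE) {
    const batch = videoUrls.slice(i, i + BATCH_SIZE);
    // Run one batch in parallel, wait for it to settle before starting the next
    const results = await Promise.allSettled(
      batch.map((videoUrl, j) =>
        client.uploadVideo({
          input: { videoUrl },
          output: { bucketName: 'my-bucket', fileNameWithoutExtension: `file-${i + j}` },
        }),
      ),
    );
    results.forEach((r) => console.log(r.status === 'fulfilled' ? r.value : r.reason));
  }
}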

SvelteKit endpoint: converting from Node/Express

I'm new to SvelteKit and am working to adapt an endpoint from a Node/Express server, making it more generic so I can take advantage of SvelteKit adapters. The endpoint downloads files stored in a database via node-postgresql.
My functional endpoint in Node/Express looks like this:
import stream from 'stream'
import db from '../utils/db'

export async function download(req, res) {
  const _id = req.params.id
  const sql = "SELECT _id, name, type, data FROM files WHERE _id = $1;"
  const { rows } = await db.query(sql, [_id])
  const file = rows[0]
  const fileContents = Buffer.from(file.data, 'base64')
  const readStream = new stream.PassThrough()
  readStream.end(fileContents)
  res.set('Content-disposition', `attachment; filename=${file.name}`)
  res.set('Content-Type', file.type)
  readStream.pipe(res)
}
Here's what I have for [filenum].json.ts in SvelteKit so far...
import stream from 'stream'
import db from '$lib/db'

export async function get({ params }): Promise<any> {
  const { filenum } = params
  const { rows } = await db.query('SELECT _id, name, type, data FROM files WHERE _id = $1;', [filenum])
  if (rows) {
    const file = rows[0]
    const fileContents = Buffer.from(file.data, 'base64')
    const readStream = new stream.PassThrough()
    readStream.end(fileContents)
    let body
    readStream.pipe(body)
    return {
      headers: {
        'Content-disposition': `attachment; filename=${file.name}`,
        'Content-type': file.type
      },
      body
    }
  }
}
What is the correct way to do this with SvelteKit without creating a dependency on Node? Per SvelteKit's Endpoint docs,
We don't interact with the req/res objects you might be familiar with from Node's http module or frameworks like Express, because they're only available on certain platforms. Instead, SvelteKit translates the returned object into whatever's required by the platform you're deploying your app to.
UPDATE: The bug was fixed in SvelteKit. This is the updated code that works:
// src/routes/api/file/_file.controller.ts
import { query } from '../_db'

type GetFileResponse = (fileNumber: string) => Promise<{
  headers: {
    'Content-Disposition': string
    'Content-Type': string
  }
  body: Uint8Array
  status?: number
} | {
  status: number
  headers?: undefined
  body?: undefined
}>

export const getFile: GetFileResponse = async (fileNumber: string) => {
  const { rows } = await query(`SELECT _id, name, type, data FROM files WHERE _id = $1;`, [fileNumber])
  if (rows) {
    const file = rows[0]
    return {
      headers: {
        'Content-Disposition': `attachment; filename="${file.name}"`,
        'Content-Type': file.type
      },
      body: new Uint8Array(file.data)
    }
  } else return {
    status: 404
  }
}
and
// src/routes/api/file/[filenum].ts
import type { RequestHandler } from '@sveltejs/kit'
import { getFile } from './_file.controller'

export const get: RequestHandler = async ({ params }) => {
  const { filenum } = params
  const fileResponse = await getFile(filenum)
  return fileResponse
}

Angular 5 - Node/Express - not able to download pdf

I am trying to download a PDF file from a local folder structured like assets/test.pdf.
server.js
app.get('/ePoint', (req, res) => {
  // some dumb code :P
});
demo.ts
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Headers } from '@angular/http';
import { Observable } from 'rxjs';

fileDownload() {
  const headers = new HttpHeaders();
  headers.append('Accept', 'application/pdf');
  this._http.get('http://localhost:3000/ePoint', { headers: headers })
    .toPromise()
    .then(response => this.saveItToClient(response));
}

private saveItToClient(response: any) {
  const contentDispositionHeader: string = response.headers.get('Content-Disposition');
  const parts: string[] = contentDispositionHeader.split(';');
  const filename = parts[1].split('=')[1];
  const blob = new Blob([response._body], { type: 'application/pdf' });
  saveAs(blob, filename);
}
I don't know where I made a mistake. In the browser's network console it shows 200 OK, but the normal browser console shows the error in the attachment below.
Note: I referred to an example here for the .ts file.
Help is much appreciated.
try this...
component.ts
downloadDocument(documentId: string) {
  this.downloadDocumentSubscription = this.getService.downloadScannedDocument(documentId).subscribe(
    data => {
      this.createImageFromBlob(data);
    },
    error => {
      console.log("no image found");
      $("#errorModal").modal('show'); // show download error modal
    });
}

createImageFromBlob(image: Blob) {
  console.log("mylog", image);
  if (window.navigator.msSaveOrOpenBlob) // IE10+
    window.navigator.msSaveOrOpenBlob(image, "download." + (image.type.substr(image.type.lastIndexOf('/') + 1)));
  else {
    var url = window.URL.createObjectURL(image);
    window.open(url);
  }
}
service.ts
downloadScannedDocument(documentId: string): Observable<any> {
  let params = new HttpParams();
  if (documentTypeParam == false)
    params = params.set('onlyActive', 'false');
  let fileResult: Observable<any> = this.http.get(`${this.apiBaseUrl}/${documentId}`, { responseType: "blob", params: params });
  return fileResult;
}
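Applying the same idea to the original fileDownload: HttpClient parses responses as JSON by default, so the PDF body has to be requested as a Blob, and observe: 'response' is needed to read the Content-Disposition header (which the server must also expose via Access-Control-Expose-Headers for cross-origin requests). A rough sketch under those assumptions, with saveAs assumed to come from file-saver:

import { HttpClient, HttpResponse } from '@angular/common/http';
import { saveAs } from 'file-saver';

fileDownload() {
  this._http
    .get('http://localhost:3000/ePoint', { responseType: 'blob', observe: 'response' })
    .subscribe((response: HttpResponse<Blob>) => {
      const contentDisposition = response.headers.get('Content-Disposition') || '';
      // Fall back to a fixed name if the header is missing or not exposed by CORS
      const filename = contentDisposition.split('filename=')[1] || 'download.pdf';
      saveAs(response.body as Blob, filename.replace(/"/g, ''));
    });
}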
