I'm trying to show multiple signed URLs from GCS to the client and I don't know how to change the console.log to something that works - node.js

I have a bucket in GCS that contains images. With this code on the server I managed to paginate them, fetching 10 per request and generating 10 signed URLs at the same time, but I still don't know how to send those URLs to the client so I can show them on my web page.
For now I can only send the object names with this code; the signed URLs only appear in the console:
import { Injectable, Options, UseFilters } from '@nestjs/common';
import { AdminService } from 'src/firebase-admin/admin/admin.service';

@Injectable()
export class FilesService {
  constructor(private adminService: AdminService) {}

  async get() {
    let options = undefined;
    options = {
      projection: 'noAcl',
      maxResults: 10,
    };
    return this.adminService.bucket.getFiles(options).then(async ([files]: any) => {
      const fileNames = files.map((file: any) => file.name);
      for (const fileName of fileNames) {
        const [signedUrl] = await this.adminService.bucket.file(fileName).getSignedUrl({
          version: 'v4',
          expires: Date.now() + 1000 * 60 * 60,
          action: 'read',
        });
        console.log(`The signed URL for ${fileName} is ${signedUrl}`);
      }
      return fileNames;
    });
  }
}
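For illustration, one possible way for the service to return the signed URLs together with the object names (a sketch based on the code above, not a definitive implementation; the controller would then simply return this array as JSON):

import { Injectable } from '@nestjs/common';
import { AdminService } from 'src/firebase-admin/admin/admin.service';

@Injectable()
export class FilesService {
  constructor(private adminService: AdminService) {}

  // Returns [{ name, url }, ...] so the controller can send it to the client as JSON.
  async get(): Promise<{ name: string; url: string }[]> {
    const [files] = await this.adminService.bucket.getFiles({
      projection: 'noAcl',
      maxResults: 10,
    });

    return Promise.all(
      files.map(async (file: any) => {
        const [url] = await file.getSignedUrl({
          version: 'v4',
          expires: Date.now() + 1000 * 60 * 60, // 1 hour
          action: 'read',
        });
        return { name: file.name, url };
      }),
    );
  }
}

On the web page, each url can then be used directly, for example as the src of an <img> tag.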

Related

how to prevent file upload when body validation fails in nestjs

I want the multipart form to be validated before the file upload in a NestJS application. The thing is that I don't want the file to be uploaded if validation of the body fails.
Here is how I wrote the code for it:
// User controller method for creating a user with an uploaded image
@Post()
@UseInterceptors(FileInterceptor('image'))
create(
  @Body() userInput: CreateUserDto,
  @UploadedFile(
    new ParseFilePipe({
      validators: [
        // some validator here
      ],
    }),
  )
  image: Express.Multer.File,
) {
  return this.userService.create({ ...userInput, image: image.path });
}
I have tried many ways to work around this issue, but didn't reach any solution.
Interceptors run before pipes do, so there's no way to prevent the file from being saved unless you manage that yourself in your service. However, another option is a custom exception filter that unlinks the file on error, so you don't have to worry about it post-upload.
This is how I created the whole filter:
import { isArray } from 'lodash';
import {
  ExceptionFilter,
  Catch,
  ArgumentsHost,
  BadRequestException,
} from '@nestjs/common';
import { Request, Response } from 'express';
import * as fs from 'fs';

@Catch(BadRequestException)
export class DeleteFileOnErrorFilter implements ExceptionFilter {
  catch(exception: BadRequestException, host: ArgumentsHost) {
    const ctx = host.switchToHttp();
    const response = ctx.getResponse<Response>();
    const request = ctx.getRequest<Request>();
    const status = exception.getStatus();

    const getFiles = (files: Express.Multer.File[] | unknown | undefined) => {
      if (!files) return [];
      if (isArray(files)) return files;
      return Object.values(files);
    };

    const filePaths = getFiles(request.files);
    for (const file of filePaths) {
      fs.unlink(file.path, (err) => {
        if (err) {
          console.error(err);
          return err;
        }
      });
    }

    response.status(status).json(exception.getResponse());
  }
}
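For completeness, the filter would then be attached to the upload route with @UseFilters, roughly like this (a sketch reusing the controller method from the question):

@Post()
@UseFilters(DeleteFileOnErrorFilter) // unlinks the stored file when a BadRequestException is thrown
@UseInterceptors(FileInterceptor('image'))
create(
  @Body() userInput: CreateUserDto,
  @UploadedFile(
    new ParseFilePipe({
      validators: [
        // some validator here
      ],
    }),
  )
  image: Express.Multer.File,
) {
  return this.userService.create({ ...userInput, image: image.path });
}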

Intercepting in Multer Mutates Request? (NestJS)

Does multer mutate any request that is given to it? I'm currently trying to intercept the request to add it to our logs.
But whenever I try to execute this code first:
const newReq = cloneDeep(request); // lodash cloneDeep
const newRes = cloneDeep(response);

const postMulterRequest: any = await new Promise((resolve, reject) => {
  const multerReponse = multer().any();
  multerReponse(request, newRes, (err) => {
    if (err) reject(err);
    resolve(request);
  });
});

files = postMulterRequest?.files;
the file provided by @UseInterceptors(FileInterceptor('file')) becomes undefined.
I have already identified the problem: it seems that multerReponse(request, newRes, err => { ... }) mutates the request. But I don't know what other approach I can take to fix this (I tried JSON serialization, Object.assign, and cloneDeep, but none of those worked).
I have tried passing newReq and newRes (the cloned objects) to multerReponse; at first it worked, but on the second request the thread just hangs and doesn't proceed to the next steps, or multerReponse(newReq, newRes, err => { ... }) doesn't return anything.
The whole code looks like this and is used globally (some parts were redacted/removed, but the main logic is the same):
import { Injectable, NestInterceptor, ExecutionContext, CallHandler } from '@nestjs/common';
import { InjectModel } from '@nestjs/mongoose'; // assuming Mongoose, based on Model<Auditing>
import { Model } from 'mongoose';
import { Observable } from 'rxjs';
import { tap } from 'rxjs/operators';
import { cloneDeep } from 'lodash';
import * as multer from 'multer';
import { Auditing } from './auditing.schema'; // assumed path; the schema was redacted

@Injectable()
export class AuditingInterceptor implements NestInterceptor {
  constructor(
    @InjectModel(Auditing.name)
    private readonly AuditingModel: Model<Auditing>,
  ) {}

  async intercept(
    context: ExecutionContext,
    next: CallHandler,
  ): Promise<Observable<any>> {
    const request = context.switchToHttp().getRequest();
    const response = context.switchToHttp().getResponse();
    const { headers, method, ip, route, query, body } = request;
    let bodyParam = Object.assign({}, body),
      files: any;

    const newReq = cloneDeep(request); // lodash cloneDeep
    const newRes = cloneDeep(response);

    const postMulterRequest: any = await new Promise((resolve, reject) => {
      const multerReponse = multer().any();
      multerReponse(newReq, newRes, (err) => {
        if (err) reject(err);
        resolve(newReq);
      });
    });
    files = postMulterRequest?.files;

    return next.handle().pipe(
      tap(() =>
        this.AuditingModel.create({
          request: {
            query,
            bodyParam,
            files,
          },
          timeAccessed: new Date().toISOString(),
        }),
      ),
    );
  }
}
To summarize, I need to intercept and log the file in our DB before it gets processed in the method/endpoint that uses @UseInterceptors(FileInterceptor('file')).
I solved this by injecting the request with
@Req() req
and creating a method that handles the file intercepted by the FileInterceptor decorator.
Code example:

// create a logs service first to handle your queries
createLogs(file, req) {
  // do what you need to do with the file and req here
  const { filename } = file;
  const { ip } = req;
  ....
}

// main service
// inject the logs service first
constructor(@Inject(LogsService) private logsService: LogsService) {}

uploadHandler(file, req) {
  this.logsService.createLogs(file, req);
  // proceed with the next steps
  ....
}

// controller
@Post('upload')
@UseInterceptors(FileInterceptor('file'))
testFunction(@UploadedFile() file: Express.Multer.File, @Req() req) {
  return this.serviceNameHere.uploadHandler(file, req);
}

Node.js throws an "ENOENT" when I try to write a file with a name that is too long

I'm writing a session library for Express.js that stores the sessions in encrypted files. The basic interface hierarchy is:
export interface Current<T = any> {
  load(): Promise<T>;
  save(value: T): Promise<void>;
}

export interface Manager {
  current<T = any>(): Current<T>;
  create(): Promise<void>;
  delete(): Promise<void>;
  rewind(): void;
}

declare global {
  namespace Express {
    export interface Request {
      session: Manager;
    }
  }
}
Every session is stored in a JSON file whose name is a unique hash id. That hash is returned to the client as a cookie. The setup looks like this:
import { sessionCrossover } from '.';
import express from 'express';

const app = express();
app.use(sessionCrossover({
  path: './data',
  expires: 1000 * 10,
  hashLength: 126
}));
To create a new session:
// The data structure of every session in this example
interface Data {
  id: number;
  value: string;
}

// Create an endpoint for creating a new session
app.get('/create', async (req, res) => {
  try {
    // Get the current session instance
    const current = req.session.current<Data>();
    if (!current) {
      // Create a new session instance
      await req.session.create();
      // Save the data in the current session. Just in this
      // case, if the session file doesn't exist, it will
      // be created. This method throws the error...
      await req.session
        .current<Data>()
        .save({
          id: ++id,
          value: new Date().toJSON()
        });
      res.json('Session created successfully');
    } else {
      req.session.rewind();
      res.json('Session rewinded...');
    }
  } catch (err) {
    console.error(err);
    res.json(err);
  }
});
Inside the method that throws the error:
import { File } from '../tool/fsys';
import { Current } from './interfaces';

export class CurrentSession<T = any> implements Current<T> {
  private _file: File;

  // The related method
  save(value: T): Promise<void> {
    if (!this._killed) {
      const text = JSON.stringify(value, null, ' ');
      const byte = Buffer.from(text, 'utf8');
      // Throws the error
      return this._file.write(byte);
    } else {
      return Promise.resolve();
    }
  }
}
And the File class:
import * as fs from 'fs';
import * as fsPromises from 'fs/promises';

export class File extends FSys {
  // The related method
  public write(byte: Buffer): Promise<void> {
    // this._path is a protected property of FSys
    return fsPromises.writeFile(this._path, byte);
  }
}
You can set the hash length in the middleware shown before. The problem is this:
When you set a hash of 126 bytes or more, Node.js throws an "ENOENT" error (path not found).
When the hash is 125 bytes or less, the session file is created normally.
My questions are:
Why does Node.js throw an "ENOENT" (path not found) error when I try to create a file with a very long filename?
Is there a way to detect a "filename too long" error on Windows?
Observations:
I tried different hash byte lengths. The paths in those cases are listed below; see the length calculation after the list:
Hash length = 8; OK
C:\Projects\Node.JS\modules\session-crossover\data\be49c866b0181718.json
Hash length = 64; OK
C:\Projects\Node.JS\modules\session-crossover\data\419410c8db26d74563e31b3c0a12e9fb12d31951abe6b280869af47db088c9acaf251de12cd6b6fc51bf3182fa07597add2b48825498d869b99e914c64d42efa.json
Hash length = 125; OK
C:\Projects\Node.JS\modules\session-crossover\data\b6f4026e893fb053e626ca3318771a70e0802ca10bfc4ea018e18f35b04aa7f9e365a9883a35eea381d9cb9ad2ca11c8961e0096aacd2802e9e0b4cd96920c073800f40a1224d99a093f7fa0b4eca8799bc84c4fa84db2b8b62df211824271c4d908d3d62defa6f1890e613e04af86bcd04379b57ab3728e0366ed42c9.json
Hash length = 126; "ENOENT"
C:\Projects\Node.JS\modules\session-crossover\data\7784ae9a697eb7e5a2ccdf7a9b27c4d182e1e637da4efc47d21a1d48f208f1058bf40f6026dccb79702ea61ea3f4ca307fdeb960a38c89187b0c1b66395934a802ee62769810bd191eb85636d6a86c900299b68fcc1ad6ccfbd83aba863fc181a522cd22d0671148b56d6e4c8051b4366439d6855597caad0eb6a4bba043.json
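For what it's worth, a quick length calculation (assuming the hash is hex-encoded, i.e. two characters per byte, plus the ".json" extension) shows that the failing case is the first one whose filename component exceeds 255 characters, the usual per-component limit on Windows/NTFS:

// Filename length (not the full path) for each tested hash length,
// assuming a hex-encoded hash: 2 characters per byte + ".json".
const extension = '.json';
for (const hashBytes of [8, 64, 125, 126]) {
  console.log(`${hashBytes} bytes -> ${hashBytes * 2 + extension.length} characters`);
}
// 8 bytes   -> 21 characters   (OK)
// 64 bytes  -> 133 characters  (OK)
// 125 bytes -> 255 characters  (OK, exactly at the limit)
// 126 bytes -> 257 characters  (ENOENT)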

Extensions not returned in GraphQL query results

I'm creating an Apollo Client like this:
var { ApolloClient } = require("apollo-boost");
var { InMemoryCache } = require('apollo-cache-inmemory');
var { createHttpLink } = require('apollo-link-http');
var { setContext } = require('apollo-link-context');

exports.createClient = (shop, accessToken) => {
  const httpLink = createHttpLink({
    uri: `https://${shop}/admin/api/2019-07/graphql.json`,
  });
  const authLink = setContext((_, { headers }) => {
    return {
      headers: {
        "X-Shopify-Access-Token": accessToken,
        "User-Agent": `shopify-app-node 1.0.0 | Shopify App CLI`,
      }
    }
  });
  return new ApolloClient({
    cache: new InMemoryCache(),
    link: authLink.concat(httpLink),
  });
};
to hit the Shopify GraphQL API, and then I run a query like this:
return client.query({
  query: gql`{
    productVariants(first: 250) {
      edges {
        node {
          price
          product {
            id
          }
        }
        cursor
      }
      pageInfo {
        hasNextPage
      }
    }
  }`
})
but the returned object only contains data and no extensions, which is a problem for figuring out the real cost of the query.
Any idea why?
Many thanks for your help
There's a bit of a hacky way to do it that we wrote up before:
You'll need to create a custom apollo link (Apollo’s equivalent of middleware) to intercept the response data as it’s returned from the server, but before it’s inserted into the cache and the components re-rendered.
Here's an example where we pull metrics data from the extensions in our API:
import { ApolloClient, InMemoryCache, HttpLink, ApolloLink } from 'apollo-boost'

const link = new HttpLink({
  uri: 'https://serve.onegraph.com/dynamic?show_metrics=true&app_id=<app_id>',
})

const metricsWatchers = {}
let id = 0

export function addMetricsWatcher(f) {
  const watcherId = (id++).toString(36)
  metricsWatchers[watcherId] = f
  return () => {
    delete metricsWatchers[watcherId]
  }
}

function runWatchers(requestMetrics) {
  for (const watcherId of Object.keys(metricsWatchers)) {
    try {
      metricsWatchers[watcherId](requestMetrics)
    } catch (e) {
      console.error('error running metrics watcher', e)
    }
  }
}

// We intercept the response, extract our extensions, mutatively store them,
// then forward the response to the next link
const trackMetrics = new ApolloLink((operation, forward) => {
  return forward(operation).map(response => {
    runWatchers(
      response
        ? response.extensions
          ? response.extensions.metrics
          : null
        : null
    )
    return response
  })
})

function create(initialState) {
  return new ApolloClient({
    link: trackMetrics.concat(link),
    cache: new InMemoryCache().restore(initialState || {}),
  })
}

const apolloClient = create(initialState);
Then to use the result in our React components:
import { addMetricsWatcher } from '../integration/apolloClient'

const Page = () => {
  const [requestMetrics, updateRequestMetrics] = useState(null)

  useEffect(() => {
    return addMetricsWatcher(requestMetrics =>
      updateRequestMetrics(requestMetrics)
    )
  })

  // Metrics from extensions are available now
  return null;
}
Then use a bit of mutable state to track each request and its result, and then use that state to render the metrics inside the app.
Depending on how you're looking to use the extensions data, this may or may not work for you. The implementation is non-deterministic, and can have some slight race conditions between the data that’s rendered and the data that you've extracted from the extensions.
In our case, we store performance metrics data in the extensions - very useful, but ancillary - so we felt the tradeoff was acceptable.
There's also an open issue on the Apollo client repo tracking this feature request
I don't have any experience with ApolloClient, but I tried to run your query in the Shopify GraphQL app and it returned results with extensions. You could also raise the question on the ApolloClient GitHub.

NestJS upload using GraphQL [closed]

Does anyone have an example of how to upload a file in NestJS using GraphQL?
I can upload using the example given via a controller
https://github.com/nestjs/nest/issues/262#issuecomment-366098589,
but I couldn't find any comprehensive documentation on how to upload using GraphQL in NestJS.
Apollo Server 2.0 should be able to do this now (packaged in nest), although I needed to install graphql-upload and import GraphQLUpload as I couldn't find the Upload type:
import { GraphQLUpload } from 'graphql-upload';

@Mutation(() => Image, { nullable: true })
async addImage(@Args({ name: 'image', type: () => GraphQLUpload }) image) {
  // Do stuff with image...
}
At the time of this answer, FileInterceptor uses multer and, by converting the ExecutionContext to HTTP, it uses the getRequest and getResponse methods to provide req and res to multer.single; both req and res are undefined in a GraphQL context.
I have tried to get the request from the context using:
const ctx = GqlExecutionContext.create(context);
and there is a req property in ctx, but I can't find a way to use multer with it (yet).
Anyway, I made some changes to FileFieldsInterceptor to use it in my project; I may make a pull request when I have time to clean it up:
import { Observable } from 'rxjs';
import {
  NestInterceptor,
  Optional,
  ExecutionContext,
  mixin,
} from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';
import { storeFile } from './storeFile';

interface IField {
  name: string;
  options?: any;
}

export function GraphqlFileFieldsInterceptor(
  uploadFields: IField[],
  localOptions?: any,
) {
  class MixinInterceptor implements NestInterceptor {
    options: any = {};

    constructor(@Optional() options: any = {}) {
      this.options = { ...options, ...localOptions };
    }

    async intercept(
      context: ExecutionContext,
      call$: Observable<any>,
    ): Promise<Observable<any>> {
      const ctx = GqlExecutionContext.create(context);
      const args = ctx.getArgs();

      let storeFilesResult = await Promise.all(
        uploadFields.map(uploadField => {
          const file = args[uploadField.name];
          return storeFile(file, {
            ...uploadField.options,
            ...this.options,
          }).then(address => {
            args[uploadField.name] = address;
            return address;
          });
        }),
      );

      return call$;
    }
  }

  const Interceptor = mixin(MixinInterceptor);
  return Interceptor;
}
and storeFile is something like this (it may not be used exactly like this):
import uuid from 'uuid/v4';
import fs from 'fs';
import path from 'path';

const dir = './files';
if (!fs.existsSync(dir)) {
  fs.mkdirSync(dir);
}

export const storeFile = async (file, options): Promise<any> => {
  // options is not doing anything right now
  const { stream } = await file;
  const filename = uuid();
  const fileAddress = path.join(dir, filename + '.jpg');
  return new Promise((resolve, reject) =>
    stream
      .on('error', error => {
        if (stream.truncated)
          // Delete the truncated file
          fs.unlinkSync(fileAddress);
        reject(error);
      })
      .pipe(fs.createWriteStream(fileAddress))
      .on('error', error => reject(error))
      .on('finish', () => resolve(fileAddress)),
  );
};
In my Cats.resolvers.ts:
...
@Mutation()
@UseInterceptors(
  GraphqlFileFieldsInterceptor([
    { name: 'catImage1' },
    { name: 'catImage2' },
    { name: 'catImage3' },
  ]),
)
async cats(
  @Args('catImage1') catImage1: string,
  @Args('catImage2') catImage2: string,
  @Args('catImage3') catImage3: string,
) {
  console.log(catImage1) // will print the catImage1 address
  ...
This implementation works perfectly with Node >= v14
package.json
Remove the fs-capacitor and graphql-upload entries from the resolutions section if you added them, and install the latest version of graphql-upload (v11.0.0 at this time) package as a dependency.
src/app.module.ts
Disable Apollo Server's built-in upload handling and add the graphqlUploadExpress middleware to your application.
import { graphqlUploadExpress } from "graphql-upload"
import { MiddlewareConsumer, Module, NestModule } from "@nestjs/common"
import { GraphQLModule } from "@nestjs/graphql"

@Module({
  imports: [
    GraphQLModule.forRoot({
      uploads: false, // disable built-in upload handling
    }),
  ],
})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer.apply(graphqlUploadExpress()).forRoutes("graphql")
  }
}
src/blog/post.resolver.ts (example resolver)
Remove the GraphQLUpload import from apollo-server-core and import from graphql-upload instead
import { FileUpload, GraphQLUpload } from "graphql-upload"

@Mutation(() => Post)
async postCreate(
  @Args("title") title: string,
  @Args("body") body: string,
  @Args("attachment", { type: () => GraphQLUpload }) attachment: Promise<FileUpload>,
) {
  const { filename, mimetype, encoding, createReadStream } = await attachment
  console.log("attachment:", filename, mimetype, encoding)
  const stream = createReadStream()
  stream.on("data", (chunk: Buffer) => { /* do stuff with data here */ })
}
Source: https://github.com/nestjs/graphql/issues/901#issuecomment-780007582
Some other links that I found helpful:
https://stephen-knutter.github.io/2020-02-07-nestjs-graphql-file-upload/
For uploading files using postman Link
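For reference, graphql-upload expects requests in the GraphQL multipart request format; below is a rough client-side sketch of such a request (the mutation name, the Upload variable, and selectedFile are placeholders, not taken from the answer above):

// Hypothetical client-side upload following the GraphQL multipart request spec.
// The form field names "operations", "map" and "0" are defined by that spec.
const form = new FormData();
form.append('operations', JSON.stringify({
  query: 'mutation ($file: Upload!) { uploadFile(file: $file) }',
  variables: { file: null },
}));
form.append('map', JSON.stringify({ '0': ['variables.file'] }));
form.append('0', selectedFile); // e.g. a File object from an <input type="file">
await fetch('/graphql', { method: 'POST', body: form });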
EDIT: As per Developia's comment below, apollo-server now implements file uploads. That should be the preferred way.
Below is the original answer, for reference.
One normally does not use GraphQL for uploads. GraphQL is a fancy "API specification", meaning that at the end of the day, low-level HTTP requests and responses are translated to/from JSON objects (if you don't have a custom transport).
One solution could be to define a special endpoint in the GraphQL schema, like:
type Mutation {
  uploadFile(base64: String): Int
}
The client would then convert the binary data to a base64 string, which would be handled accordingly on the resolver side. This way, the file becomes part of the JSON object exchanged between the GraphQL client and server.
While this might be suitable for small files and a small number of operations, it is definitely not a solution for an upload service.
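As a rough sketch of what the resolver side of this base64 approach could look like in NestJS (the resolver name, upload directory, and return value are illustrative assumptions, not part of the original answer):

import { Args, Int, Mutation, Resolver } from '@nestjs/graphql';
import { promises as fs } from 'fs';
import { join } from 'path';

@Resolver()
export class Base64UploadResolver {
  @Mutation(() => Int)
  async uploadFile(@Args('base64') base64: string): Promise<number> {
    // Decode the base64 payload back into binary data
    const buffer = Buffer.from(base64, 'base64');
    // Write it to an illustrative uploads directory (path and naming are assumptions)
    await fs.writeFile(join('./uploads', `${Date.now()}.bin`), buffer);
    // Return the number of bytes received, matching the Int in the schema sketch
    return buffer.length;
  }
}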
Try this:

import { Resolver, Mutation, Args } from '@nestjs/graphql';
import { createWriteStream } from 'fs';
import { GraphQLUpload } from 'apollo-server-express';

@Resolver('Download')
export class DownloadResolver {
  @Mutation(() => Boolean)
  async uploadFile(@Args({ name: 'file', type: () => GraphQLUpload })
  {
    createReadStream,
    filename
  }): Promise<boolean> {
    return new Promise(async (resolve, reject) =>
      createReadStream()
        .pipe(createWriteStream(`./uploads/${filename}`))
        .on('finish', () => resolve(true))
        .on('error', () => reject(false))
    );
  }
}
You could use the apollo-upload-server lib. Seems like the easiest thing to do, in my opinion. Cheers
You need to define an upload controller and add it to your app.module. This is an example of what the controller should look like (back-end):
@Controller()
export class Uploader {
  @Post('sampleName')
  @UseInterceptors(FileInterceptor('file'))
  uploadFile(@UploadedFile() file) {
    // file name selection
    const path = `desired path`;
    const writeStream = fs.createWriteStream(path);
    writeStream.write(file.buffer);
    writeStream.end();
    return {
      result: [res],
    };
  }
}
And call your controller with fetch on the front-end:

fetch('controller address', {
  method: 'POST',
  body: data,
})
  .then((response) => response.json())
  .then((success) => {
    // What to do when it succeeds
  })
  .catch((error) => console.log('Error in uploading file: ', error));
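The data variable above is not shown in the answer; presumably it is a FormData object whose field name matches the one passed to FileInterceptor('file') on the server, roughly:

// Hypothetical construction of the `data` body used in the fetch call above.
// The field name must match FileInterceptor('file') in the controller.
const data = new FormData();
data.append('file', selectedFile); // e.g. a File chosen via an <input type="file">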
