How to get rid of "TypeError: req.pipe is not a function" nestjs and fastify - node.js

I tried to upload a file with NestJS/Fastify and TypeScript.
This is main.ts:
import { NestFactory } from '@nestjs/core';
import { Logger } from '@nestjs/common';
import {
  FastifyAdapter,
  NestFastifyApplication,
} from '@nestjs/platform-fastify';
import * as fmp from 'fastify-multipart';
import { AppModule } from './app.module';

async function bootstrap() {
  // file upload with fastify
  const fastifyAdapter = new FastifyAdapter();
  fastifyAdapter.register(fmp, {
    limits: {
      fieldNameSize: 100, // Max field name size in bytes
      fieldSize: 1000000, // Max field value size in bytes
      fields: 10, // Max number of non-file fields
      fileSize: 100, // For multipart forms, the max file size
      files: 1, // Max number of file fields
      headerPairs: 2000, // Max number of header key=>value pairs
    },
  });
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    fastifyAdapter,
  );
  await app.listen(3000);
  Logger.log('application started on http://localhost:3000', 'Bootstrap');
}
bootstrap();
and this is file.controller.ts:

@Post()
@UseInterceptors(FileInterceptor('image'))
@ApiConsumes('multipart/form-data')
@ApiBody({
  description: 'logo',
  type: UploadFileDto,
})
uploadedFile(@UploadedFile() file) {
  const response = {
    originalname: file.originalname,
    filename: file.filename,
  };
  return response;
}
After uploading a file to this action, the code throws an exception like this:
TypeError: req.pipe is not a function
at multerMiddleware (D:\R.Khodabakhshi\Repository\raimun-web\node_modules\multer\lib\make-middleware.js:176:9)
at Promise (D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\platform-express\multer\interceptors\file.interceptor.js:15:81)
at new Promise ()
at MixinInterceptor.intercept (D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\platform-express\multer\interceptors\file.interceptor.js:15:19)
at D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\core\interceptors\interceptors-consumer.js:22:36
at Object.handle (D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\core\interceptors\interceptors-consumer.js:20:56)
at LoggingInterceptor.intercept (D:\R.Khodabakhshi\Repository\raimun-web\dist\shared\logging.interceptor.js:28:21)
at D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\core\interceptors\interceptors-consumer.js:22:36
at InterceptorsConsumer.intercept (D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\core\interceptors\interceptors-consumer.js:24:24)
at D:\R.Khodabakhshi\Repository\raimun-web\node_modules\@nestjs\core\router\router-execution-context.js:45:60
[Nest] 10928 - 2020-02-06 10:10:49 [ExceptionFilter] undefined undefined +587529ms
How can I fix this problem?

You cannot use the FastifyAdapter with the FileInterceptor; it says so at the beginning of the docs. If you want to use Fastify and file uploads, you'll need to create your own interceptor for it.
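For illustration, a minimal sketch of what such a custom interceptor could look like, assuming fastify-multipart is registered on the adapter; the class name and the idea of stashing the parsed file on the request are assumptions, not an official Nest API:

import {
  CallHandler,
  ExecutionContext,
  Injectable,
  NestInterceptor,
} from '@nestjs/common';
import { Observable } from 'rxjs';

@Injectable()
export class FastifyFileInterceptor implements NestInterceptor {
  async intercept(context: ExecutionContext, next: CallHandler): Promise<Observable<any>> {
    // Grab the raw Fastify request instead of letting multer touch it.
    const req = context.switchToHttp().getRequest();
    // With fastify-multipart registered, the request exposes its multipart
    // parser (req.multipart() in older versions; req.file()/req.parts() in
    // newer ones). Consume the upload here and attach the result to the
    // request (e.g. a hypothetical req.uploadedFile) for the route handler.
    return next.handle();
  }
}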

Problem solved. As Jay McDaniel mentioned, we can't use the FastifyAdapter with the FileInterceptor.
I resolved the problem with this little bit of code:
import {
  Controller,
  Logger,
  Post,
  Req,
  Res,
} from '@nestjs/common';
import { ApiTags } from '@nestjs/swagger';
import * as fs from 'fs';
import * as path from 'path';
import * as pump from 'pump';

const logger = new Logger('FileController');

@ApiTags('File')
@Controller('api/file')
export class FileController {
  @Post()
  upload(@Req() req: any, @Res() reply: any): void {
    const mp = req.multipart(
      (field: any, file: any, filename: any, encoding: any, mimeType: any) => {
        console.log('save file from request ---- ', field, filename, mimeType);
        file.on('limit', () => logger.error('SIZE_LIMITED'));
        const filePath = path.resolve('./' + filename);
        const writeStream = fs.createWriteStream(filePath);
        pump(file, writeStream);
        writeStream.on('finish', () => {
          reply.code(200).send();
        });
      },
      (error: any) => {
        if (error) {
          logger.error(error);
          reply.code(500).send();
        }
      },
    );
    mp.on('partsLimit', () => logger.error('MAXIMUM_NUMBER_OF_FORM_PARTS'));
    mp.on('filesLimit', () => logger.error('MAXIMUM_NUMBER_OF_FILES'));
    mp.on('fieldsLimit', () => logger.error('MAXIMUM_NUMBER_OF_FIELD'));
  }
}
I hope this will help you too...

Related

how to mock react-query useQuery in jest

I'm trying to mock out axios that is inside an async function that is being wrapped in useQuery:
import { useQuery, QueryKey } from 'react-query'

export const fetchWithAxios = async () => {
  ...
  const response = await someAxiosCall()
  ...
  return data
}

export const useFetchWithQuery = () => useQuery(key, fetchWithAxios, {
  refetchInterval: false,
  refetchOnReconnect: true,
  refetchOnWindowFocus: true,
  retry: 1,
})
and I want to use moxios:

moxios.stubRequest('/some-url', {
  status: 200,
  response: fakeInputData,
})
useFetchWithQuery()
moxios.wait(function () {
  done()
})
but I'm getting all sorts of issues with missing context, store, etc., which I'm interested in mocking out completely.
Don't mock useQuery, mock Axios!
The pattern you should follow in order to test your usages of useQuery should look something like this:
const fetchWithAxios = (axios, ...parameters) => {
  const data = axios.someAxiosCall(...parameters);
  return data;
}

export const useFetchWithQuery = (...parameters) => {
  const axios = useAxios();
  // Pass a function to useQuery; calling fetchWithAxios directly here would fire it immediately.
  return useQuery(key, () => fetchWithAxios(axios, ...parameters), {
    // options
  })
}
Where does useAxios come from? You need to write a context to pass an axios instance through the application.
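For example, a minimal sketch of such a context; AxiosContext, AxiosProvider, and useAxios are illustrative names, not a library API:

import React, { createContext, useContext } from 'react';
import axios, { AxiosInstance } from 'axios';

// Default to the global axios instance; tests can inject a mock instead.
const AxiosContext = createContext<AxiosInstance>(axios);

// Wrap the app (or a test tree) in this provider to supply an instance.
export const AxiosProvider = ({
  instance,
  children,
}: React.PropsWithChildren<{ instance: AxiosInstance }>) =>
  React.createElement(AxiosContext.Provider, { value: instance }, children);

// Queries read the instance from context instead of importing axios directly.
export const useAxios = (): AxiosInstance => useContext(AxiosContext);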
This will allow your tests to look something like this in the end:
const { result, waitFor, waitForNextUpdate } = renderHook(() => useFetchWithQuery(...), {
  wrapper: makeWrapper(withQueryClient, withAxios(mockedAxios)),
});
await waitFor(() => expect(result.current.isFetching).toBeFalsy());

Import module with folder and passing data to module in nodejs

I found a tutorial about Designing a clean REST API with Node.js (Express + Mongo), a project on GitHub, but I didn't get the concept of routing in one part. What I don't understand is how it is possible to pass httpRequest data to the handle method within the contact-endpoint module, because the handle method is in here:

export default function makeContactsEndpointHandler({ contactList }) {
  return async function handle(httpRequest) {
This is the index of the project:

import handleContactsRequest from "./contacts";
import adaptRequest from "./helpers/adapt-request";

app.all("/contacts", contactsController);
app.get("/contacts/:id", contactsController);

function contactsController(req, res) {
  const httpRequest = adaptRequest(req);
  handleContactsRequest(httpRequest)
    .then(({ headers, statusCode, data }) =>
      res.set(headers).status(statusCode).send(data)
    )
    .catch((e) => res.status(500).end());
}
This is adaptRequest:

export default function adaptRequest(req = {}) {
  return Object.freeze({
    path: req.path,
    method: req.method,
    pathParams: req.params,
    queryParams: req.query,
    body: req.body
  })
}
This is the handleContactsRequest module:

import makeDb from "../db";
import makeContactList from "./contact-list";
import makeContactsEndpointHandler from "./contacts-endpoint";

const database = makeDb();
const contactList = makeContactList({ database });
const contactsEndpointHandler = makeContactsEndpointHandler({ contactList });
export default contactsEndpointHandler;

This is part of the contact-endpoint module:
export default function makeContactsEndpointHandler({ contactList }) {
  return async function handle(httpRequest) {
    switch (httpRequest.method) {
      case "POST":
        return postContact(httpRequest);
      case "GET":
        return getContacts(httpRequest);
      default:
        return makeHttpError({
          statusCode: 405,
          errorMessage: `${httpRequest.method} method not allowed.`,
        });
    }
  }
makeContactsEndpointHandler is a function that returns a function (async handle(xxx)).
In handleContactsRequest, we export the result of the call makeContactsEndpointHandler({ contactList }), which is therefore the function async handle(xxx) itself.
So, in index, when we call handleContactsRequest with the constant httpRequest as argument, we're actually calling that handle(xxx) function. (I wrote xxx as parameter name to highlight the difference between the two httpRequest declarations.)
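To make that concrete, here is a condensed, illustrative version of the same pattern (the names are stand-ins, not the tutorial's code):

// A factory that closes over its dependency and returns the real handler.
function makeHandler({ contactList }) {
  return async function handle(httpRequest) {
    // contactList is still in scope here, captured by the closure.
    return { method: httpRequest.method, contacts: contactList };
  };
}

// Runs once at module load, like handleContactsRequest in the tutorial...
const handleRequest = makeHandler({ contactList: [] });

// ...so this call invokes the inner handle(httpRequest) directly.
handleRequest({ method: 'GET' }).then((result) => console.log(result));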

Jest - mocking and testing pino multi streams based on log levels

I am struggling to find the correct way of mocking and using pino in a test for a logging service.
Here is my implementation of the pino logger. It writes to different file streams based on log levels:
getChildLoggerService(fileNameString): pino.Logger {
  const streams: Streams = [
    { level: 'fatal', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-fatal.log')) },
    { level: 'error', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-error.log')) },
    { level: 'debug', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-debug.log')) },
    { level: 'info', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-info.log')) },
  ];
  return pino({
    useLevelLabels: true,
    base: {
      hostName: os.hostname(),
      platform: os.platform(),
      processId: process.pid,
      timestamp: this.appUtilService.getCurrentLocaleTimeZone(),
      // tslint:disable-next-line: object-literal-sort-keys
      fileName: this.appUtilService.getFileName(fileNameString),
    },
    level: this.appUtilService.getLogLevel(),
    messageKey: LOGGER_MSG_KEY,
    prettyPrint: this.appUtilService.checkForDevEnv(process.env.NODE_ENV),
    timestamp: () => {
      return this.appUtilService.getCurrentLocaleTimeZone();
    },
  }, multistream(streams)).child({
    connectorReqId: (process.env.REQ_APP_NAME === null ? 'local' : process.env.REQ_APP_NAME)
      + uuid.v4().toString(),
  });
}
The most important part I wanted to test is the multistream setup, where I need to write to different log files based on the log levels; so far I couldn't figure out a way to do that.
import pino, { DestinationStream } from 'pino';
const sinon = require('sinon');
import pinoms from 'pino-multi-stream';
const fs = require('fs');
const path = require('path');
const stream = require('stream');
const { PassThrough } = require('stream');
class EchoStream extends stream.Writable {
_write(chunk, enc, next) {
console.log('ssdsdsd',chunk.toString());
next();
}
}
import * as _ from 'lodash';
import { Writable } from 'stream';
import { mocked } from 'ts-jest/utils';
import { LogServiceInstance } from './log.service';
// Tried this inline mock, doesn't work:
// jest.mock('pino', () => jest.fn().mockImplementation(() => {
//   return {
//     child: jest.fn().mockReturnValue(jest.requireActual('pino').Logger)
//   }
// }));

// jest.mock('pino', () => {
//   return jest.fn().mockImplementation(() => {
//     return {
//       child: jest.fn().mockReturnValue(jest.requireActual('pino').Logger),
//       stream: jest.fn().mockImplementation(() => {
//         return [
//           {
//             level: 'info',
//             stream: fs.createWriteStream(
//               path.resolve(process.cwd(), '/test/database-connector-logs/info.log')
//             ),
//           },
//           {
//             level: 'warn',
//             stream: fs.createWriteStream(
//               path.resolve(process.cwd(), '/test/database-connector-logs/warn.log')
//             ),
//           },
//         ];
//       }),
//     };
//   });
// });
describe('Test suite for Log service', () => {
  // const mockedPino = mocked(pino, true);
  test('Test case for getLoggerInstance', () => {
    const mockedPinoMsStream = [
    const mockedPinoStream = (pino.prototype.stream = jest.fn(() => mockedPinoMsStream));
    console.dir(pino);
    const prop = Reflect.ownKeys(pino).find((s) => {
      return s === 'symbols';
    });
    // Tried this but it did not work as the actual files are written with the values
    pino[prop]['streamSym'] = jest.fn().mockImplementation(() => {
      return fs.createWriteStream(path.resolve(process.cwd(), './test/database-connector-logs/info.log'));
    });
    console.dir(pino);
    const log = LogServiceInstance.getChildLoggerService(__filename);
    console.dir(Object.getPrototypeOf(log));
    log.info('test logging');
    expect(2).toEqual(2);
  });
Could someone let me know where the mocking is wrong and how to mock it properly?
UPDATE:
I came to understand that mocking pino-multi-stream might do the trick, so I tried it this way. This was added at the very top, and all other mocks were removed (even inside the test suite):
const mockedPinoMultiStream = {
  stream: jest.fn().mockImplementation(() => {
    return { write: jest.fn().mockReturnValue(new PassThrough()) };
  }),
};

jest.mock('pino-multi-stream', () => {
  return {
    multistream: jest.fn().mockReturnValue(mockedPinoMultiStream),
  };
});
I wanted the mock to test whether the respective named files are used based on the level, but this also results in an exception:
TypeError: stream.write is not a function
at Pino.write (/XXX/node_modules/pino/lib/proto.js:161:15)
at Pino.LOG (/XXXX/node_modules/pino/lib/tools.js:39:26)
LATEST UPDATE:
So I resolved the exception by modifying the way pino-multi-stream is mocked:
const { PassThrough } = require('stream');
...
const mockedPinoMultiStream = {
  write: jest.fn().mockImplementation((data) => {
    return new PassThrough(); // was "new Passthrough()", which is not defined
  }),
};
Now there is no more exception, and the write method is properly mocked when I print "pino". But I do not understand how to test the different files based on different log levels. Could someone let me know how that is to be done?
Note: I tried setting a return value of fs.createWriteStream instead of a PassThrough, but that didn't work.
At last, I found the answer to making use of pino streams based on different log levels.
I went ahead and created a test directory to house the test log files. In reality, we do not want pino to be adulterating the actual log files, so I decided to mock the pino streams at the start of the Jest run. This file gets executed first, before any test suite is triggered. So I modified the Jest configuration in package.json like this:
"setupFiles": [
"<rootDir>/jest-setup/stream.logger.js"
],
In the stream.logger.js file, I added:
const pinoms = require('pino-multi-stream');
const fs = require('fs');
const path = require('path');
const stream = require('stream');
const Writable = require('stream').Writable;
const { PassThrough } = require('stream');
const pino = require('pino');

class MyWritable extends Writable {
  constructor(options) {
    super(options);
  }
  _write(chunk, encoding, callback) {
    const writeStream = fs.createWriteStream(path.resolve(process.cwd(), './test/logs/info.log'));
    writeStream.write(chunk, 'utf-8');
    writeStream.emit('close');
    writeStream.end();
  }
}

const mockedPinoMultiStream = {
  write: jest.fn().mockImplementation((data) => {
    const writeStream = new MyWritable();
    return writeStream._write(data);
  }),
};

jest.mock('pino-multi-stream', () => {
  return {
    multistream: jest.fn().mockReturnValue(mockedPinoMultiStream),
  };
});
Then I went ahead and created the test file, log.service.spec.ts:
import * as pino from 'pino';
const sinon = require('sinon');
import pinoms from 'pino-multi-stream';
const fs = require('fs');
const path = require('path');
const stream = require('stream');
import * as _ from 'lodash';
import { Writable } from 'stream';
import { mocked } from 'ts-jest/utils';
import { LogServiceInstance } from './log.service';
describe('Test suite for Log service', () => {
  // const mockedPino = mocked(pino, true);
  afterEach(() => {
    // delete the contents of the log files after each test suite
    fs.truncate(path.resolve(process.cwd(), './test/logs/info.log'), 0, () => {
      console.dir('Info log file deleted');
    });
    fs.truncate(path.resolve(process.cwd(), './test/logs/warn.log'), 0, () => {
      console.dir('Warn log file deleted');
    });
    fs.truncate(path.resolve(process.cwd(), './test/logs/debug.log'), 0, () => {
      console.dir('Debug log file deleted');
    });
  });

  test('Test case for getLoggerInstance', () => {
    const pinoLoggerInstance = LogServiceInstance.getChildLoggerService(__filename);
    pinoLoggerInstance.info('test logging');
    _.map(Object.getOwnPropertySymbols(pinoLoggerInstance), (mapItems: any) => {
      if (mapItems.toString().includes('Symbol')) {
        if (mapItems.toString().includes('pino.level')) {
          expect(pinoLoggerInstance[mapItems]).toEqual(20);
        }
      }
      if (mapItems.toString().includes('pino.chindings')) {
        const childInstance = pinoLoggerInstance[mapItems].toString().substr(1);
        const jsonString = '{' + childInstance + '}';
        const expectedObj = Object.create(JSON.parse(jsonString));
        expect(expectedObj.fileName).toEqual('log.service.spec');
        expect(expectedObj.appName).toEqual('AppJestTesting');
        expect(expectedObj.connectorReqId).toEqual(expect.objectContaining(new String('AppJestTesting')));
      }
    });
    // make sure the info.log file is written in this case
    const infoBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/info.log')).read(1024);
    expect(infoBuffRead).toBeDefined();
    // now write a warn log
    pinoLoggerInstance.warn('test warning log');
    const warnBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/warn.log')).read(1024);
    expect(warnBuffRead).toBeDefined();
    // now write a debug log
    pinoLoggerInstance.debug('test debug log');
    const debugBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/debug.log')).read(1024);
    expect(debugBuffRead).toBeDefined();
  });
});
I also made sure that the test log files do not get overwhelmed with data over time, by deleting their contents after each execution.
I hope this helps people trying to test pino multistreams.
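As an aside, a hedged alternative (not part of the setup above): since pino-multi-stream accepts any writable streams, one can also pass in-memory PassThrough sinks directly and assert on what each level's sink receives, without touching the filesystem:

import pino from 'pino';
import { multistream, Streams } from 'pino-multi-stream';
import { PassThrough } from 'stream';

const infoSink = new PassThrough();
const errorSink = new PassThrough();
const streams: Streams = [
  { level: 'info', stream: infoSink },
  { level: 'error', stream: errorSink },
];

const logger = pino({ level: 'debug' }, multistream(streams));

// Each chunk is one JSON log line routed to every sink whose level is at or
// below the record's level.
infoSink.on('data', (chunk) => console.log('info sink got:', chunk.toString()));
errorSink.on('data', (chunk) => console.log('error sink got:', chunk.toString()));

logger.info('lands in the info sink only');
logger.error('lands in both sinks, error level included');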

context.done called twice within handler 'graphql'

Attempting to create a project based on https://github.com/serverless/serverless-graphql/blob/master/app-backend/dynamodb/handler.js. The code works well, but for some reason I always get a log warning telling me context.done was called twice.
import { graphqlLambda, graphiqlLambda, LambdaHandler } from 'apollo-server-lambda'
import lambdaPlayground from 'graphql-playground-middleware-lambda'
import { makeExecutableSchema } from 'graphql-tools'
import { resolvers } from './resolvers'

const typeDefs = require('./schema.gql')
const schema = makeExecutableSchema({ typeDefs, resolvers, logger: console })

export const graphqlHandler: LambdaHandler = async (event, context) => {
  const handler = graphqlLambda({ schema })
  return handler(event, context, (error: Error | undefined, output: any) => {
    output.headers['Access-Control-Allow-Origin'] = '*'
    context.done(error, output)
  })
}

export const playgroundHandler = lambdaPlayground({
  endpoint: '/graphql',
})

export const graphiqlHandler: any = graphiqlLambda({
  endpointURL: '/graphql',
})
This code gives me the following result:
Serverless: POST /graphql (λ: graphql)
Serverless: [200] {"statusCode":200,"headers":{"Content-Type":"application/json","Access-Control-Allow-Origin":"*"},"body":"{\"data\":{\"getUserInfo\":\"ads\"}}"}
Serverless: Warning: context.done called twice within handler 'graphql'!
What is even stranger is that if I comment out the context.done call, I get the following output (the call stalls as expected):
Serverless: POST /graphql (λ: graphql)
Serverless: Warning: context.done called twice within handler 'graphql'!
I ran into a similar issue. Try removing async from your function if you are not going to use await. Without async, the function waits until you invoke callback or context.done. With async, it runs all the way through, and whatever the function returns at the end triggers context.done a second time. Here is a working sample:
Let me also share two useful resources about callback and context from aws lambda.
export const handler = (
  {
    headers,
    pathParameters: pathParams,
    queryStringParameters: queryParams,
    body,
  },
  context,
  callback
) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const data = JSON.parse(body);
  const params = {
    TableName: process.env.EVENT_TABLE,
    Item: {
      id: uuid.v4(),
      title: data.title,
      creationDate: new Date().getTime(),
    },
  };
  dynamoDb.put(params, (err, data) => {
    if (err) {
      // return here so callback is not invoked a second time below
      return callback(err);
    }
    callback(null, {
      statusCode: 200,
      body: JSON.stringify(data),
    });
  });
};
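For comparison, a hedged sketch of the equivalent async variant, which returns the response instead of calling callback, so the runtime resolves the invocation exactly once (dynamoDb and uuid as in the sample above):

export const asyncHandler = async (event: any) => {
  const data = JSON.parse(event.body);
  const params = {
    TableName: process.env.EVENT_TABLE,
    Item: {
      id: uuid.v4(),
      title: data.title,
      creationDate: new Date().getTime(),
    },
  };
  // The promisified SDK call replaces the callback style.
  await dynamoDb.put(params).promise();
  return {
    statusCode: 200,
    body: JSON.stringify(params.Item),
  };
};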
This appears to be already reported: https://github.com/dherault/serverless-offline/issues/405

NestJS upload using GraphQL [closed]

Does anyone have an example of how to upload a file in NestJS using GraphQL?
I can upload using the given example via a controller
https://github.com/nestjs/nest/issues/262#issuecomment-366098589,
but I couldn't find any comprehensive documentation on how to upload using GraphQL in NestJS.
Apollo Server 2.0 should be able to do this now (packaged in nest), although I needed to install graphql-upload and import GraphQLUpload as I couldn't find the Upload type:
@Mutation(() => Image, { nullable: true })
async addImage(@Args({ name: 'image', type: () => GraphQLUpload }) image) {
  // Do stuff with image...
}
At the time of this answer, FileInterceptor uses multer, and by converting the ExecutionContext to HTTP it uses the getRequest and getResponse methods to provide req and res to multer.single; both req and res are undefined in GraphQL.
I have tried to get the request from the context using:
const ctx = GqlExecutionContext.create(context);
and there is a req property in ctx, but I can't find a way to use multer (yet).
Anyway, I made some changes to FileFieldsInterceptor to use it inside my project, but I may make a pull request when I have time to clean it up:
import { Observable } from 'rxjs';
import {
  NestInterceptor,
  Optional,
  ExecutionContext,
  mixin,
} from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';
import { storeFile } from './storeFile';

interface IField {
  name: string;
  options?: any;
}

export function GraphqlFileFieldsInterceptor(
  uploadFields: IField[],
  localOptions?: any,
) {
  class MixinInterceptor implements NestInterceptor {
    options: any = {};
    constructor(@Optional() options: any = {}) {
      this.options = { ...options, ...localOptions };
    }

    async intercept(
      context: ExecutionContext,
      call$: Observable<any>,
    ): Promise<Observable<any>> {
      const ctx = GqlExecutionContext.create(context);
      const args = ctx.getArgs();
      const storeFilesResult = await Promise.all(
        uploadFields.map(uploadField => {
          const file = args[uploadField.name];
          return storeFile(file, {
            ...uploadField.options,
            ...this.options,
          }).then(address => {
            args[uploadField.name] = address;
            return address;
          });
        }),
      );
      return call$;
    }
  }

  const Interceptor = mixin(MixinInterceptor);
  return Interceptor;
}
and storeFile is something like this (it may not be used exactly like this):
import uuid from 'uuid/v4';
import fs from 'fs';
import path from 'path';

const dir = './files';
if (!fs.existsSync(dir)) {
  fs.mkdirSync(dir);
}

export const storeFile = async (file, options): Promise<any> => {
  // options is not doing anything right now
  const { stream } = await file;
  const filename = uuid();
  const fileAddress = path.join(dir, filename + '.jpg');
  return new Promise((resolve, reject) =>
    stream
      .on('error', error => {
        if (stream.truncated)
          // Delete the truncated file
          fs.unlinkSync(fileAddress);
        reject(error);
      })
      .pipe(fs.createWriteStream(fileAddress))
      .on('error', error => reject(error))
      .on('finish', () => resolve(fileAddress)),
  );
};
In my Cats.resolvers.ts:

...
@Mutation()
@UseInterceptors(
  GraphqlFileFieldsInterceptor([
    { name: 'catImage1' },
    { name: 'catImage2' },
    { name: 'catImage3' },
  ]),
)
async cats(
  @Args('catImage1') catImage1: string,
  @Args('catImage2') catImage2: string,
  @Args('catImage3') catImage3: string,
) {
  console.log(catImage1); // will print catImage1 address
  ...
This implementation works perfectly with Node >= v14
package.json
Remove the fs-capacitor and graphql-upload entries from the resolutions section if you added them, and install the latest version of the graphql-upload package (v11.0.0 at this time) as a dependency.
src/app.module.ts
Disable Apollo Server's built-in upload handling and add the graphqlUploadExpress middleware to your application.
import { graphqlUploadExpress } from "graphql-upload"
import { MiddlewareConsumer, Module, NestModule } from "@nestjs/common"
import { GraphQLModule } from "@nestjs/graphql"

@Module({
  imports: [
    GraphQLModule.forRoot({
      uploads: false, // disable built-in upload handling
    }),
  ],
})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer.apply(graphqlUploadExpress()).forRoutes("graphql")
  }
}
src/blog/post.resolver.ts (example resolver)
Remove the GraphQLUpload import from apollo-server-core and import from graphql-upload instead
import { FileUpload, GraphQLUpload } from "graphql-upload"

@Mutation(() => Post)
async postCreate(
  @Args("title") title: string,
  @Args("body") body: string,
  @Args("attachment", { type: () => GraphQLUpload }) attachment: Promise<FileUpload>,
) {
  const { filename, mimetype, encoding, createReadStream } = await attachment
  console.log("attachment:", filename, mimetype, encoding)
  const stream = createReadStream()
  stream.on("data", (chunk: Buffer) => {
    /* do stuff with data here */
  })
}
Source: https://github.com/nestjs/graphql/issues/901#issuecomment-780007582
Some other links that I found helpful:
https://stephen-knutter.github.io/2020-02-07-nestjs-graphql-file-upload/
For uploading files using postman Link
EDIT: As per Developia's comment below, apollo-server now implements file upload. That should be the preferred way.
Below is the original answer, for reference.
One normally does not use GraphQL for upload. GraphQL is a fancy "specification of an API", meaning that at the end of the day, low-level HTTP requests and responses are translated to/from JSON objects (if you don't have a custom transport).
One solution could be to define a special endpoint in the GraphQL schema, like:

type Mutation {
  uploadFile(base64: String): Int
}
Then the client would convert the binary data to a base64 string, which would be handled accordingly on the resolver side. This way, the file becomes part of the JSON object exchanged between the GraphQL client and server.
While this might be suitable for small files and a small number of operations, it is definitely not a solution for an upload service.
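For what it's worth, a hedged sketch of that base64 approach in a NestJS resolver; the resolver name, the target path, and returning the decoded size are all assumptions:

import { Args, Int, Mutation, Resolver } from '@nestjs/graphql';
import { promises as fs } from 'fs';

@Resolver()
export class UploadResolver {
  @Mutation(() => Int)
  async uploadFile(@Args('base64') base64: string): Promise<number> {
    // Decode the base64 payload back into binary.
    const buffer = Buffer.from(base64, 'base64');
    // Hypothetical destination; in practice derive a safe, unique file name.
    await fs.writeFile('./uploads/upload.bin', buffer);
    return buffer.length; // e.g. report the decoded size in bytes
  }
}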
Try this:
import { Resolver, Mutation, Args } from '@nestjs/graphql';
import { createWriteStream } from 'fs';
import { GraphQLUpload } from 'apollo-server-express';

@Resolver('Download')
export class DownloadResolver {
  @Mutation(() => Boolean)
  async uploadFile(
    @Args({ name: 'file', type: () => GraphQLUpload })
    { createReadStream, filename },
  ): Promise<boolean> {
    return new Promise(async (resolve, reject) =>
      createReadStream()
        .pipe(createWriteStream(`./uploads/${filename}`))
        .on('finish', () => resolve(true))
        .on('error', () => reject(false)),
    );
  }
}
You could use the apollo-upload-server lib. Seems like the easiest thing to do, in my opinion. Cheers
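A hedged sketch of what the wiring looked like with that library in an Express-based setup (the options shown are assumptions; check the library's README for your version):

import * as express from 'express';
import { apolloUploadExpress } from 'apollo-upload-server';

const app = express();
// Parse multipart GraphQL requests before they reach the GraphQL endpoint.
app.use('/graphql', apolloUploadExpress({ maxFileSize: 10000000, maxFiles: 10 }));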
You need to define an upload controller and add it to your app.module. This is an example of what the controller should be (back-end):
@Controller()
export class Uploader {
  @Post('sampleName')
  @UseInterceptors(FileInterceptor('file'))
  uploadFile(@UploadedFile() file) {
    // file name selection
    const path = `desired path`;
    const writeStream = fs.createWriteStream(path);
    writeStream.write(file.buffer);
    writeStream.end();
    return {
      // the original snippet returned an undefined `res` here; return
      // whatever result the client needs, e.g. the stored file's name
      result: [file.originalname],
    };
  }
}
And call your controller with fetch on the front-end:
fetch('controller address', {
  method: 'POST',
  body: data, // e.g. a FormData instance containing the file
})
  .then((response) => response.json())
  .then((success) => {
    // What to do when it succeeds
  })
  .catch((error) => console.log('Error in uploading file: ', error));
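For completeness, a hedged sketch of building that body; fileInput is an assumed <input type="file"> element, and the 'file' field name must match FileInterceptor('file') on the controller:

const fileInput = document.querySelector('input[type="file"]') as HTMLInputElement;
const data = new FormData();
// The field name here is what the server-side interceptor looks for.
data.append('file', fileInput.files![0]);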
