Trouble configuring NextAuth and tRPC's Websockets when deploying - node.js

I have built an app with tRPC v10 and NextAuth. As my app requires realtime updates, I have followed tRPC's docs on implementing subscriptions with WebSockets (see tRPC's docs on subscriptions and the tRPC example app).
From what I understand, to use websockets in tRPC, I need to create a standalone HTTP server and run it alongside my Next.js app. When I emit data through an EventEmitter, the data is proxied through this HTTP server and sent to all other subscribers. I have therefore deployed my standalone HTTP server on Railway with port 6957, and my Next.js app on Vercel.
Everything works well in development on localhost. Once deployed, however, the client fails to connect to the websocket server, and I also get a NextAuth error when logging in.
For example, my server name is "https://xxx-server.up.railway.app/" and my Next.js app is "https://xxx-client.vercel.app/".
On the client side, I'm receiving an error: WebSocket connection to 'wss://xxx-server.up.railway.app:6957/' failed. When I hit the login button, which runs the authorize function in NextAuth, the console returns the error: POST https://xxx-client.vercel.app/api/auth/callback/credentials? 401.
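As an aside, the websocket URL is hardcoded in the client code below; one way to keep it environment-dependent is a small helper like this sketch (the variable names NEXT_PUBLIC_WS_URL and WS_PORT are hypothetical, not something Next.js or tRPC defines):

```javascript
// Hypothetical helper: pick the websocket URL from an env var,
// falling back to localhost for development.
function getWsUrl(env) {
  if (env.NEXT_PUBLIC_WS_URL) return env.NEXT_PUBLIC_WS_URL;
  return `ws://localhost:${env.WS_PORT ?? 3001}`;
}

console.log(getWsUrl({ NEXT_PUBLIC_WS_URL: "wss://xxx-server.up.railway.app" }));
// → wss://xxx-server.up.railway.app
console.log(getWsUrl({}));
// → ws://localhost:3001
```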
For reference, here are the files for _app.tsx and my websocket server:
// _app.tsx
const MyApp: AppType = ({
  Component,
  pageProps: { session, ...pageProps },
}) => {
  return (
    <SessionProvider session={session} refetchOnWindowFocus={false}>
      <Component {...pageProps} />
    </SessionProvider>
  );
};

const getBaseUrl = () => {
  if (typeof window !== "undefined") {
    return "";
  }
  if (process.env.VERCEL_URL) return `https://${process.env.VERCEL_URL}`; // SSR should use vercel url
  return `http://localhost:${process.env.PORT ?? 3000}`; // dev SSR should use localhost
};

function getEndingLink() {
  if (typeof window === "undefined") {
    return httpBatchLink({
      url: `${getBaseUrl()}/api/trpc`,
    });
  }
  const client = createWSClient({
    url: "wss://xxx-server.up.railway.app:6957",
  });
  return wsLink<AppRouter>({
    client,
  });
}

export default withTRPC<AppRouter>({
  config({ ctx }) {
    /**
     * If you want to use SSR, you need to use the server's full URL
     * @link https://trpc.io/docs/ssr
     */
    const url = `${getBaseUrl()}/api/trpc`;
    return {
      url,
      transformer: superjson,
      links: [getEndingLink()],
      /**
       * @link https://react-query.tanstack.com/reference/QueryClient
       */
      // queryClientConfig: { defaultOptions: { queries: { staleTime: 60 } } },
    };
  },
  /**
   * @link https://trpc.io/docs/ssr
   */
  ssr: true,
})(MyApp);
// prodServer.ts
const port = parseInt(process.env.PORT || "3000", 10);
const dev = process.env.NODE_ENV !== "production";
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = http.createServer((req, res) => {
    const proto = req.headers["x-forwarded-proto"];
    if (proto && proto === "http") {
      // redirect to ssl
      res.writeHead(303, {
        location: `https://` + req.headers.host + (req.headers.url ?? ""),
      });
      res.end();
      return;
    }
    const parsedUrl = parse(req.url!, true);
    handle(req, res, parsedUrl);
  });

  const wss = new ws.Server({ server });
  const handler = applyWSSHandler({ wss, router: appRouter, createContext });

  process.on("SIGTERM", () => {
    console.log("SIGTERM");
    handler.broadcastReconnectNotification();
  });

  server.listen(port);
  // tslint:disable-next-line:no-console
  console.log(
    `> Server listening at http://localhost:${port} as ${
      dev ? "development" : process.env.NODE_ENV
    }`
  );
});

Related

How to properly handle errors on subscriptions with Apollo Server?

I have an Express + Apollo Server backend. I enabled subscriptions on it using ws and graphql-ws. Everything is working fine.
Now, I would like to handle resolver errors properly: hide backend details in production, change the message based on error type, add a unique ID, etc. On regular mutations, I'm able to do so using the formatResponse function.
On subscriptions, I can't find where I could do it. All I need is a function called before sending data to the client where I have access to data and errors.
How can I do that?
Here's how the WS Server is created:
// Create Web Socket Server
const wsServer = new WebSocketServer({
  server: httpServer,
  path: '/graphql'
});

const serverCleanup = graphqlWS.useServer(
  {
    schema: graphqlApp.schema,
    context: async (ctx: any) => {
      try {
        // ...Some auth checking...
        return context;
      } catch (e) {
        throw new ApolloAuthenticationError('you must be logged in');
      }
    }
  },
  wsServer
);
And an example of event sending:
import { PubSub } from 'graphql-subscriptions';
// ...
Subscription: {
  tree: {
    subscribe: withFilter(
      () => pubsub.asyncIterator('some_id'),
      (payload, variables) => {
        const canReturn = true;
        // ...Some filtering logic...
        return canReturn;
      }
    )
  }
},

PeerJS Peer.on('call') with Socket.io not Triggering

I'm trying to create a video call app using Socket.io, PeerJS, Node, Express and Angular.
The issue is that, while I can connect my own video just fine, I can't see the other user's video. In fact, the Peer.on('call') code doesn't seem to trigger at all.
I think there might also be an issue with my index.js code, because the console.log()s I've added to that file never appear either and I get the following error message:
Failed to load resource: the server responded with a status of 404
(Not Found)
My code looks like this:
// --- index.js:
const express = require("express");
const app = express();
const PORT = 3000;
const path = require('path');
app.set('src', path.join(__dirname, '../src'));
const server = require('http').Server(app);
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  console.log('backend video test 1'); // THIS NEVER TRIGGERS
  socket.on('join-room', (roomId, userId) => {
    // join the room
    console.log('backend video test 2'); // THIS NEVER TRIGGERS
    socket.join(roomId);
    socket.to(roomId).broadcast.emit('user-connected', userId);
    // leave room
    socket.on('disconnect', () => {
      console.log('backend video test 3'); // THIS NEVER TRIGGERS
      socket.to(roomId).broadcast.emit('user-diconncected', userId);
    });
  });
});

app.listen(PORT, console.log(`Your app is running on port ${PORT}`));
// --- component ts file:
import { Component, OnInit } from '@angular/core';
import { ActivatedRoute } from '@angular/router';
import { Socket } from 'ngx-socket-io';
import { Peer } from "peerjs";

interface VideoElement {
  muted: boolean;
  srcObject: MediaStream;
  userId: string;
}

@Component({
  selector: 'app-video-call-v2',
  templateUrl: './video-call-v2.component.html',
  styleUrls: ['./video-call-v2.component.css']
})
export class VideoCallV2Component implements OnInit {
  currentUserId: string = 'testUser' + Math.floor(Math.random() * 1000);
  videos: VideoElement[] = [];

  constructor(
    private route: ActivatedRoute,
    private socket: Socket,
  ) {}

  ngOnInit() {
    console.log(`Init Peer with id ${this.currentUserId}`); // this works fine.
    //------------------------------------
    // --- Access user video and audio ---
    //------------------------------------
    navigator.mediaDevices.getUserMedia({
      audio: true,
      video: true
    }).catch((err) => {
      console.log('user media error: ', err);
      return null;
    }).then((stream: any) => {
      const myPeer = new Peer(this.currentUserId, {
        host: '/',
        port: 3001,
      });
      console.log('myPeer =');
      console.log(myPeer); // this works fine.
      myPeer.on('open', (userId: any) => {
        console.log('test2'); // this works fine.
        console.log(userId); // this works fine.
        this.socket.emit('join-room', 'lessonvideo2', userId);
      });
      if (stream) {
        this.addMyVideo(stream);
        console.log(stream); // this works fine.
      } else {
        console.log('no stream found');
      }
      //-------------------------------
      // --- Receive incoming call ---
      //-------------------------------
      myPeer.on('call', call => {
        console.log(`receiving call from... ${call}`); // THIS NEVER TRIGGERS!
        call.answer(stream);
        call.on('stream', (otherUserVideoStream: MediaStream) => {
          console.log('receiving other user stream ' + otherUserVideoStream); // THIS NEVER RUNS
          this.addOtherUserVideo(call.metadata.userId, otherUserVideoStream);
        });
        call.on('error', (err: any) => {
          console.log(err);
        });
      });
      //------------------------------
      // --- Connecting other user ---
      //------------------------------
      this.socket.on('user-connected', (userId: string) => {
        console.log('receiving user-connected event', 'Calling ' + userId); // THIS NEVER RUNS
        setTimeout(() => { // Allow some time for new peers to connect
          console.log("test3"); // THIS NEVER RUNS
          const call = myPeer.call(userId, stream, {
            metadata: { userId: this.currentUserId },
          });
          call.on('stream', (otherUserVideoStream: MediaStream) => {
            console.log('receiving other stream after...'); // THIS NEVER RUNS
            this.addOtherUserVideo(userId, otherUserVideoStream);
          });
          call.on('close', () => {
            this.videos = this.videos.filter((video) => video.userId !== userId);
          });
        }, 10000);
      });
    });
    //------------------------------
    // --- Disconnect other user ---
    //------------------------------
    this.socket.on('user-disconnected', (userId: string) => {
      console.log('receiving user-disconnected event from ' + userId); // THIS NEVER RUNS
      this.videos = this.videos.filter((video) => video.userId !== userId);
    });
  }

  addMyVideo(stream: MediaStream) {
    console.log('added'); // This works fine
    this.videos.push({
      muted: true,
      srcObject: stream,
      userId: this.currentUserId,
    });
  }

  addOtherUserVideo(userId: string, stream: MediaStream) {
    console.log('second video added');
    const alreadyExisting = this.videos.some(video => video.userId === userId);
    if (alreadyExisting) {
      console.log(this.videos, userId);
      return;
    }
    this.videos.push({
      muted: false,
      srcObject: stream,
      userId,
    });
  }

  onLoadedMetadata(event: Event) {
    (event.target as HTMLVideoElement).play();
  }
}
I've also put the following script into the body of my index.html document:
<script src="http://localhost:3001/socket.io/socket.io.js"></script>
<script>
  var socket = io.connect('http://localhost:3001');
  socket.on('news', function (data) {
    console.log(data);
    socket.emit('my other event', { my: 'data' });
  });
</script>
I'm importing Socket.Io into my app.module.ts file like this:
import { SocketIoModule } from 'ngx-socket-io';
// ...
imports: [
  SocketIoModule.forRoot({
    url: 'http://localhost:3001', options: {}
  })
]
I'm running my peerjs port with the following command:
peerjs --port 3001
My backend is running on port 3000 and my frontend on 4200, and they're working just fine.
NOTE: I've seen many other Stack Overflow posts on this topic, like these ones, but I've tried everything mentioned and none of them have worked for me:
peer.on('calll') is never being called
Peerjs call.on "stream" event isn't firing but peer.on "call" is

Posting data from EJS to Node.js

I have an index.html EJS template rendered by a Koa/Node.js app, which contains a JavaScript snippet that posts data about the current page back to the same app, to an endpoint that saves it in a database.
The JavaScript code (an AJAX fetch POST) reaches the Node.js endpoint but doesn't transmit any data. I don't see any typo in the code.
[CORRECTION] there was indeed a typo with the bodyparser.
# index.js
const Koa = require("koa");
const path = require("path");
const render = require("koa-ejs");
const bodyParser = require("koa-bodyparser");
const router = require("./routes/routes.js");

const app = new Koa();

render(app, {
  root: path.join(__dirname, "/views"),
  layout: false,
  viewExt: "html",
});

app
  .use(bodyParser())
  .use(router.routes())
  .use(router.allowedMethods())
  .use(staticCache("./images", { maxAge: 600000 }))
  .listen(PORT, () => {
    console.log(`Running on port ${PORT}`);
  });
In the index.html, I have a button that triggers a POST request to an endpoint (/node/insert) of the koaRouter. The action is to save information about the current page (say, document.location.href) in a Postgres database.
# /views/index.html
[...]
<form id="newRow">
  <input type="submit" value="New row">
</form>
[...]
<script type="module" src="fetch.js" async></script>
where:
# /views/fetch.js
const data = {
  app: "Node",
  url: document.location.href,
  ...
};

document.getElementById("newRow").addEventListener("submit", (e) => {
  e.preventDefault();
  fetch("/node/insert", {
    method: "POST",
    headers: {
      "Content-Type": "application/json; charset-UTF-8",
    },
    body: JSON.stringify(data),
  })
    .then((res) => {
      if (res.ok) {
        return res.json();
      }
      return Promise.reject(res);
    })
    .then((res) => console.log("front", res))
    .catch((err) => console.warn(err));
});
Among the routes, I defined an endpoint /node/insert to respond to this action:
# routes.js
const koaRouter = require("koa-router");
const router = new koaRouter();

router.post("/node/insert", async (ctx) => {
  console.log("posted", ctx.request.body);
  // ^^^ "posted" is printed in the terminal after submit
  if (ctx.request.body) {
    return (ctx.response.status = 200);
  } else {
    return (ctx.response.status = 404); // <-- check
  }
});
The endpoint /node/insert is reached, since the console.log fires, but the body isn't passed to the endpoint: ctx.request.body = {}. I have the following error:
"SyntaxError: Unexpected token O in JSON at position 0"
detected from fetch.js (probably because the body is {}?).
I don't see what is wrong.
Note: the Node app runs in a container (pm2-runtime start index.js) and uses Nginx as a reverse proxy, static file server and load balancer.
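For what it's worth, that SyntaxError is the classic signature of calling res.json() on a body that is not JSON; if the server replies with a plain-text body such as "OK" (what Koa sends for a bare 200 status with no body set), JSON.parse fails on the very first character:

```javascript
// Simulate what res.json() effectively does on the client when the
// server responds with the plain text "OK" instead of a JSON document.
let caught;
try {
  JSON.parse("OK");
} catch (e) {
  caught = e.message;
}
console.log(caught);
// e.g. "Unexpected token O in JSON at position 0" (exact wording varies by engine)
```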
Try:
const koaJson = require("koa-json");
app.use(koaJson());
Just a typo in the bodyparser, as Evert pointed out, and bad positioning of the middleware.
curl --data "app=Node" http://localhost:8000/node responds normally.

Vaadin-Upload not working with http-proxy-middleware

I have a Node.js / Polymer 1 website. I am using http-proxy-middleware to route API calls (/api/webapi) to my backend API server.
On one of the pages I have a vaadin-upload (v2.3.0) component that sends files to the API. Everything appears to work fine when running on localhost, but when I deploy to our test servers I am experiencing issues: either the upload completes quickly and then sits "processing" for a long time, or it stalls.
Using Postman I have managed to send a file to the API both directly and through the proxy server. I have also managed to get the upload component to call the API directly. All these cases work correctly, and output from the API suggests that in every case the API receives and processes data at the same rate. From this I have narrowed it down to an interaction between vaadin-upload and http-proxy-middleware.
Does anyone have experience with this who can help me configure the proxy correctly?
proxy configuration:
const url = require('url');
var hpmproxy = require('http-proxy-middleware');
var config = require('../config');

// Adds user authorization token from passport to request
var addAuthTokenMiddleware = function (req, res, next) {
  if (req.session && req.isAuthenticated()) {
    req.headers['authorization'] = 'Bearer ' + req.user.token;
    next();
  } else {
    req.abort();
  }
};

function isLoggedIn(req, res, next) {
  // if user is authenticated in the session, carry on
  if (req.session && req.isAuthenticated())
    return next();
  res.status(403).end();
}

function restream(proxyReq, req) {
  if (isMultipartRequest(req))
    console.log('Multipart');
  if (!isEmpty(req.body)) {
    console.log("parse");
    var bodyData = JSON.stringify(req.body);
    proxyReq.setHeader('Content-Type', 'application/json');
    proxyReq.setHeader('Content-Length', Buffer.byteLength(bodyData));
    proxyReq.write(bodyData);
  }
  console.log("-->[proxyReq]----", proxyReq.path, proxyReq.getHeader('Content-Type'));
}

function handleResponse(proxyRes, req, res) {
  console.log('---[proxyRes]<---', proxyRes.req.method, proxyRes.req.path, proxyRes.statusCode);
}

function isMultipartRequest(req) {
  let contentTypeHeader = req.headers['content-type'];
  return contentTypeHeader && contentTypeHeader.indexOf('multipart') > -1;
}

function isEmpty(obj) {
  for (var prop in obj) {
    if (obj.hasOwnProperty(prop))
      return false;
  }
  return JSON.stringify(obj) === JSON.stringify({});
}

var options = {
  target: config.webApiHost,
  changeOrigin: true, // needed for virtual hosted sites
  pathRewrite: {
    '^/api/webapi/': config.webApiPath
  },
  secure: !config.selfSigned,
  onProxyRes: handleResponse,
  onProxyReq: restream
  // ,logLevel: 'debug'
};

var hpmApiProxy = hpmproxy(options);

module.exports = function (app, passport, config) {
  app.use('/api/webapi/', isLoggedIn, addAuthTokenMiddleware, hpmApiProxy);
  console.log(' WebAPI Proxy Loaded');
}
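As a side note unrelated to the stalling: the isEmpty helper above does redundant work, since the for...in loop already answers the question and the trailing JSON.stringify comparison merely repeats it. A minimal equivalent for plain objects (such as a parsed req.body) would be:

```javascript
// Emptiness check for plain objects, using Object.keys instead of a
// for...in loop plus a JSON.stringify comparison.
function isEmpty(obj) {
  return Object.keys(obj).length === 0;
}

console.log(isEmpty({}));       // → true
console.log(isEmpty({ a: 1 })); // → false
```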

Server Push with Nodejs pushStream method is not working

I am studying HTTP/2 on Node.js, but have found an issue where the pushStream method is not working (the client side does not show "Pushed/[fileName]" in the developer tools). I wonder if the reason is my Node.js version (I installed the latest, v9.8.0).
My code is the following:
server.js
'use strict';
const fs = require('fs');
const path = require('path');
const http2 = require('http2');
const utils = require('./utils');
const { HTTP2_HEADER_PATH } = http2.constants;
const PORT = process.env.PORT || 3000;

// The files are pushed to stream here
function push(stream, path) {
  const file = utils.getFile(path);
  if (!file) {
    return;
  }
  stream.pushStream({ [HTTP2_HEADER_PATH]: path }, (err, pushStream, headers) => {
    if (err) throw err;
    pushStream.respondWithFD(file.content, file.headers);
  });
}

// Request handler
function onRequest(req, res) {
  const reqPath = req.headers[':path'] === '/' ? '/index.html' : req.headers[':path'];
  const file = utils.getFile(reqPath);
  // 404 - File not found
  if (!file) {
    res.statusCode = 404;
    res.end();
    return;
  }
  // Push with index.html
  if (reqPath === '/index.html') {
    push(res.stream, '/assets/main.js');
    push(res.stream, '/assets/style.css');
  } else {
    console.log("requiring non index.html");
  }
  // Serve file
  res.stream.respondWithFD(file.content, file.headers);
}

// creating an http2 server
const server = http2.createSecureServer({
  cert: fs.readFileSync(path.join(__dirname, '/certificate.crt')),
  key: fs.readFileSync(path.join(__dirname, '/privateKey.key'))
}, onRequest);

// start listening
server.listen(PORT, (err) => {
  if (err) {
    console.error(err);
    return -1;
  }
  console.log(`Server listening to port ${PORT}`);
});
utils.js
'use strict';
const fs = require('fs');
const mime = require('mime');

module.exports = {
  getFile: function (path) {
    const filePath = `${__dirname}/public${path}`;
    try {
      const content = fs.openSync(filePath, 'r');
      const contentType = mime.getType(filePath);
      return {
        content,
        headers: {
          'content-type': contentType
        }
      };
    } catch (e) {
      return null;
    }
  }
};
Updated 2020-01-28
Resolved: the cause is a bug in the latest version of Chrome (v65) that makes the client distrust the PUSH_PROMISE frame. I rolled back to Chrome v64 and it works now.
I haven't tried to run your code, but I have noticed that Chrome does not allow HTTP/2 push with an untrusted HTTPS certificate (e.g. a self-signed one not yet added to the trust store). I have raised a bug with the Chrome team.
If you see the red insecure padlock, you could be hitting this issue too. Add the certificate to your trust store, restart Chrome and reload the site, where you should now get a green padlock.
Note that Chrome needs a certificate with a Subject Alternative Name (SAN) field matching the domain, so if you've only got the older Subject field it won't go green even after adding it to your trust store.
Another option is to look at the Chrome HTTP/2 frames by typing this into your URL bar:
chrome://net-internals/#http2
If you see the PUSH_PROMISE frames (with a promised_stream_id), followed by the headers and data on that promised_stream_id, then you know the server side is working.
