My API has stopped working. It worked fine previously and, as far as I am aware, I have changed nothing. When I tested my endpoint I received an internal server error.
Here is a link to my hosted API: https://frozen-scrubland-34339.herokuapp.com/api
I have just checked some of my other APIs and none of them are working either, with the same message. It appears my code isn't the issue but Postgres itself?
Any help on what to do would be appreciated.
When I tried npm run prod to re-push it to Heroku I received: 'Error: The server does not support SSL connections'
Again, this was never an issue previously when it worked.
I imagine I have changed something with Heroku itself by accident?
app.js
const express = require("express");
const app = express();
const apiRouter = require("./routers/api-router");
const cors = require("cors");
const {
handle404s,
handlePSQLErrors,
handleCustomError,
} = require("./controllers/errorHandling");
app.use(cors());
app.use(express.json());
app.use("/api", apiRouter);
app.use("*", handle404s);
app.use(handlePSQLErrors);
app.use(handleCustomError);
module.exports = app;
connection.js
const { DB_URL } = process.env;
const ENV = process.env.NODE_ENV || "development";
const baseConfig = {
client: "pg",
migrations: {
directory: "./db/migrations",
},
seeds: {
directory: "./db/seeds",
},
};
const customConfigs = {
development: { connection: { database: "away_days" } },
test: { connection: { database: "away_days_test" } },
production: {
connection: {
connectionString: DB_URL,
ssl: {
rejectUnauthorized: false,
},
},
},
};
module.exports = { ...baseConfig, ...customConfigs[ENV] };
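For reference, here is a minimal sketch of what this module resolves to when NODE_ENV is production (assuming DB_URL is set in the environment). The ssl block only exists in the production branch, so pointing these production settings at a Postgres server that has SSL disabled will typically produce exactly this 'The server does not support SSL connections' error from node-postgres:
// Resolved config for NODE_ENV=production (a sketch, not the actual file)
module.exports = {
  client: "pg",
  migrations: { directory: "./db/migrations" },
  seeds: { directory: "./db/seeds" },
  connection: {
    connectionString: process.env.DB_URL, // read from the environment; Heroku Postgres itself exposes DATABASE_URL
    ssl: { rejectUnauthorized: false },   // only present in the production config
  },
};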
Related
I am getting the error below when trying to establish a database connection in my Node.js application using Sequelize:
C:\Users\user123\Desktop\project\node_modules\tedious\lib\token\token-stream-parser.js:24
this.parser = _stream.Readable.from(_streamParser.default.parseTokens(message, this.debug, this.options));
^
TypeError: _stream.Readable.from is not a function
I am in the initial stage of creating an application, where I have just tried to create a database connection, for which I have created three files:
index.js
var dotenv = require("dotenv").config().parsed;
var customEnv = require("custom-env");
customEnv.env("development").env();
var express = require("express");
const helmet = require("helmet");
var cookieParser = require("cookie-parser");
const app = express();
app.use(helmet());
app.use(cookieParser());
require("./db.js");
const httpserver = require("http").createServer(app);
httpserver.timeout = 0;
httpserver.listen(3457, async () => {
connectedEmitter.on("connectedDbs", () => {
console.log(` ----- SERVER LISTENING ON PORT `);
});
});
db.js
const Sequelize = require('sequelize');
const eventEmitter = require('events');
global.connectedEmitter = new eventEmitter()
global.sequelize = new Sequelize(process.env.DB_NAME, process.env.DB_USER, process.env.DB_PASS, {
host: process.env.DB_HOST,
port: 1433,
dialect: process.env.DB_DIALECT,
ssl: false,
dialectOptions: {
ssl:false
},
logging:false,
pool: {
max: 20,
min: 0,
idle: 30000
}
});
sequelize.authenticate().then(() => {
console.log(`${process.env.DB_NAME} - Connection has been established successfully.`);
global.connectedEmitter.emit('connectedDbs')
}).catch((err) => {
console.error(' - Unable to connect to the database:', err);
});
.env (I am giving dummy credentials as I cannot provide original credentials)
# ################################## Database Credentials ##############################################
DB_NAME=mydb
DB_USER=username
DB_PASS=password
DB_HOST=hostname
DB_DIALECT=mssql
Can anyone please tell me why I am getting the error mentioned? Where have I made a mistake in setting up the database connection? Please help.
I also faced this issue. Turns out tedious had issues with node versions below 12, and my production app service was running on node 10.
GitHub link that mentions this
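If you run into the same thing, here is a minimal sketch of pinning the runtime via the engines field in package.json ("my-app" is a placeholder name; whether the field is honoured depends on your host, so also check how your production environment selects its Node version):
{
  "name": "my-app",
  "version": "1.0.0",
  "engines": {
    "node": ">=12"
  }
}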
I have a form on my site and I want to send the data from its fields to my email. I am using Nodemailer and Node.js for this. But when I submit the form I get a 404 error on the POST request.
form-component:
this.http.post('api/sendForm',{
to: environment.contactUsEmail,
from: 'zzz',
subject: 'zzz',
mailInfo: contactUsData,
}
).subscribe(() => {
this.cooperationFormGroup.reset();
});
server.ts (path: backend/server.ts; the backend folder sits next to the src folder):
const express = require('express');
const bodyParser = require('body-parser');
const nodemailer = require('nodemailer');
const PORT = process.env.PORT || 3000;
const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.post('api/sendForm', (req, res) => {
const payload = req.body;
const mailInfo = payload.mailInfo;
const transporter = nodemailer.createTransport({
service: 'gmail',
host: 'smtp.gmail.com',
secure: 'true',
port: '465',
auth: {
user: 'email',
pass: 'pass',
}
});
const text = [...];
const mailOptions = {
from: 'zz',
to: payload.to,
subject: payload.subject,
text: text.join('\n'),
};
transporter.sendMail(mailOptions, (error, info) => {
if (error) {
console.log(error);
} else {
console.log('Email sent: ' + info.response);
res.status(200).json({
message: 'successfully sent!'
})
}
});
});
app.listen(PORT, () => {
console.log(`Server is running in ${PORT}`);
});
I run server.ts from the backend folder using node server.ts and run the Angular app using npm start.
As mentioned in my comment above: you need to pass the complete URL of your backend to post(): use http://localhost:3000/api/sendForm instead of api/sendForm.
However, to manage different values during development and production, you might want to use environment.ts and environment.prod.ts:
environments/environment.ts:
export const environment = {
production: false,
urlToBackend: 'http://localhost:3000'
}
environments/environment.prod.ts:
export const environment = {
production: true,
urlToBackend: 'http://<IP>:3000'
}
When you build for production with npm run build, environment.ts is replaced by environment.prod.ts, as configured in angular.json (see the fileReplacements entries).
service.ts:
import { environment } from '../../environments/environment';
...
@Injectable()
export class AppService {
url = environment.urlToBackend;
constructor(private http: HttpClient) {
}
foo() {
return this.http.post(`${this.url}/api/sendForm`,{ ... });
}
}
My code is not exact and you will need to adapt it to your needs, but I hope you get the idea.
You need to pass the complete backend server URL as the first argument of .post().
Change 'api/sendForm' to your complete backend URL:
this.http.post('complete backend server url', ...
Since you are running the Node server on port 3000, your backend URL will be http://localhost:3000/api/sendForm.
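For example, the call from the question with the full URL filled in (a sketch; it assumes the Node server from server.ts is still listening on port 3000):
this.http.post('http://localhost:3000/api/sendForm', {
  to: environment.contactUsEmail,
  from: 'zzz',
  subject: 'zzz',
  mailInfo: contactUsData,
}).subscribe(() => {
  this.cooperationFormGroup.reset();
});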
I am trying to implement Jaeger in a Node.js project. I have deployed the project (as a Docker image) and Jaeger in k8s (kubectl create -f https://raw.githubusercontent.com/jaegertracing/jaeger-kubernetes/master/all-in-one/jaeger-all-in-one-template.yml).
Both work individually, but no traces show up for the service.
var initTracer = require('jaeger-client').initTracer;
const opentracing = require("opentracing");
const bodyParser = require('body-parser');
var config = {
'serviceName': 'user-service',
'local_agent': {
'reporting_host': 'jaeger',
'reporting_port': '6831',
},
'reporter': {
'logSpans': true
},
'sampler': {
'type': 'probabilistic',
'param': 1.0
}
};
var options = {
'tags': {
'user-service': '1.1.2'
}
};
var tracer = initTracer(config, options);
opentracing.initGlobalTracer(tracer);
console.log(tracer);
const express = require('express');
const app = express();
app.use(bodyParser.json({ type: 'application/*+json' }));
app.get('/users/:id',(req, res) => {
const span = tracer.startSpan('get user by user_id');
res.send(JSON.stringify('hello'));
span.log({'event': 'request_end'});
span.finish();
});
// Set up server
const server = app.listen(8000, () => {
let host = server.address().address;
let port = server.address().port;
console.log('Service_1 listening at http://%s:%s', host, port);
});
Have you tried looking at the logs being generated by your pods?
In my case I got the following
ERROR Failed to flush spans in reporter: error sending spans over UDP:
Error: getaddrinfo ENOTFOUND http://jaeger-agent, packet size: 984,
bytes sent: undefined
Changing the host to plain jaeger-agent (no http:// prefix) worked for me.
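In other words, pass the agent host without a scheme. A sketch of the config, assuming the agent is resolvable as jaeger-agent inside the cluster and that the Node jaeger-client's reporter.agentHost / reporter.agentPort options are used to point at it:
var initTracer = require('jaeger-client').initTracer;

var config = {
  serviceName: 'user-service',
  reporter: {
    agentHost: 'jaeger-agent', // hostname only, no http:// prefix
    agentPort: 6831,
    logSpans: true
  },
  sampler: {
    type: 'probabilistic',
    param: 1.0
  }
};

var tracer = initTracer(config, { tags: { 'user-service': '1.1.2' } });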
Also, if it helps, I have declared this under my jaeger image in docker-compose.yml:
ports:
  - "5775:5775/udp"
  - "6831:6831/udp"
  - "6832:6832/udp"
  - "5778:5778"
  - "16686:16686"
  - "14268:14268"
  - "9411:9411"
I am setting up a proxy using setupProxy for my React app, with a GraphQL backend that runs on a different port. The HTTP link proxy is working fine, but the WebSocket link proxy is giving me an error.
To solve the problem I have tried including the option ws: true, but it's not working.
The error is as follows:
SyntaxError: Failed to construct 'WebSocket': The URL '/ws' is invalid.
setupProxy.js
const proxy = require("http-proxy-middleware");
module.exports = function(app) {
app.use(proxy("/graphql", { target: "http://localhost:8001/graphql" }));
app.use(
proxy("/ws", {
target: "ws://localhost:8001",
ws: true,
})
);
};
index.js
import { WebSocketLink } from "apollo-link-ws";
import { createUploadLink } from "apollo-upload-client";
//Apollo Imports End
// Declaring constants for GraphQL
const httpLink = new createUploadLink({
uri: "/graphql"
});
const wsLink = new WebSocketLink({
uri: "/ws",
options: {
reconnect: true
}
});
I expected the call to behave the same as a normal HTTP call, but it's throwing an error.
/ws is not a valid URL for the WebSocket class.
A WebSocket expects a full URL to connect; you can try it yourself in your browser console.
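For example (a quick sketch; the second line assumes the backend from the question on port 8001):
// A relative URL throws the same SyntaxError as in the question
new WebSocket('/ws');

// An absolute ws:// URL can be constructed
new WebSocket('ws://localhost:8001');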
In this case, Apollo is using the native WebSocket class behind the scenes, so if you can make it work in the console, it will work in Apollo too.
Try using ws://localhost:8001 instead.
Just replace with this
const proxy = require("http-proxy-middleware");
module.exports = function(app) {
app.use(proxy("/graphql", { target: "http://localhost:8001/graphql" }));
app.use(proxy("ws://locahost:8001"));
};
OR
const proxy = require("http-proxy-middleware");
module.exports = function(app) {
app.use(proxy("/graphql", { target: "http://localhost:8001/graphql" }));
app.use(proxy('/ws', {
target: 'http://localhost:8001',
ws: true
})
);
};
Adding this in the setupProxy.js file worked for me:
const { createProxyMiddleware } = require("http-proxy-middleware");
module.exports = app => {
const wsProxy = createProxyMiddleware("/graphql", {
ws: true,
changeOrigin: true,
autoRewrite: true,
target: "http://localhost:8001",
});
app.use(wsProxy);
};
This is what I have in my apollo-client.ts file (imports added here for completeness; I am assuming the same apollo-link-* packages used elsewhere in this question):
import { split } from "apollo-link";
import { HttpLink } from "apollo-link-http";
import { WebSocketLink } from "apollo-link-ws";
import { getMainDefinition } from "apollo-utilities";
const httpLink = new HttpLink({
uri: "/graphql",
credentials: "include",
});
const wsLink = new WebSocketLink({
uri: `ws://${window.location.host}/graphql`,
options: {
reconnect: true,
lazy: true,
connectionParams: {
headers: {
//add your headers here
},
},
},
});
const link = split(
// split based on operation type
({ query }) => {
const definition = getMainDefinition(query);
return (
definition.kind === "OperationDefinition" &&
definition.operation === "subscription"
);
},
wsLink,
httpLink,
);
I'm new to Node and I want my website, dacio.app, to serve subdomains for my college projects using vhost.
However, it needs to be secured because of the HTTPS requirement for .app domains, so I'm using greenlock-express to automate that.
Don't be frontin', yo! TLS SNI 'giphy.dacio.app' does not match 'Host:
potatoes.dacio.app'
I've tried using the vhost example in the repo, but it doesn't look like serve-static supports Express apps.
Any tips on how to get this working? I keep hearing about reverse proxies, but I'm not sure if it's worth the effort since I don't even know whether it would help.
server.js
#!/usr/bin/env node
'use strict';
// DEPENDENCIES
const express = require('express');
const vhost = require('vhost');
const path = require('path');
const glx = require('greenlock-express');
// MIDDLEWARE
const app = express();
const giphyApp = require('../giphy-search');
const potatoesApp = require('../rotten-potatoes');
const portfolioApp = require('../dacio.app');
// ROUTES
app.use(vhost('giphy.dacio.app', giphyApp));
app.use(vhost('potatoes.dacio.app', potatoesApp));
app.use(portfolioApp);
// GREENLOCK for HTTPS
glx.create({
version: 'draft-11',
server: 'https://acme-v02.api.letsencrypt.org/directory',
email: 'dacioromero@gmail.com',
agreeTos: true,
approveDomains: [ 'dacio.app', 'giphy.dacio.app', 'potatoes.dacio.app' ],
configDir: '~/.config/acme/',
app: app,
communityMember: false
}).listen(80, 443);
I've switched to using redbird, which seems to accomplish everything I was hoping to do.
const path = require('path')
const proxy = require('redbird')({
port: 80,
letsencrypt: {
path: path.join(__dirname, '/certs'),
port: 9999
},
ssl: {
http2: true,
port: 443
}
});
proxy.register('dacio.app', 'http://localhost:8080', {
ssl: {
letsencrypt: {
email: 'dacioromero@gmail.com',
production: true,
}
}
});
proxy.register('giphy.dacio.app', 'http://localhost:8081', {
ssl: {
letsencrypt: {
email: 'dacioromero@gmail.com',
production: true
}
}
})
proxy.register('potatoes.dacio.app', 'http://localhost:8082', {
ssl: {
letsencrypt: {
email: 'dacioromero@gmail.com',
production: true
}
}
});
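With this setup each Express app only listens for plain HTTP on its own local port (8080–8082 above), while redbird listens on 80/443, terminates TLS, and obtains the Let's Encrypt certificates for each registered domain.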