Cassandra with React, net.Socket error - Node.js

I am a newbie to React and Cassandra. I am just trying to connect to the Cassandra DB using "npm i cassandra-driver".
I have a main.js file where I used the cassandra-driver code.
Main.js
import 'babel-polyfill';
import 'whatwg-fetch';
import React from 'react';
import ReactDOM from 'react-dom';
import FastClick from 'fastclick';
import { Provider } from 'react-redux';
import store from './core/store';
import router from './core/router';
import history from './core/history';
const assert = require('assert'); // needed for assert.ifError below
const cassandra = require('cassandra-driver');

const client = new cassandra.Client({ contactPoints: ['127.0.0.1'], keyspace: 'excelsior' });
console.log("client---->", client);

const query = 'SELECT * FROM playlists';
client.execute(query, [], function(err, result) {
  console.log("err----->", err);
  console.log("result----->", result);
  assert.ifError(err);
  //console.log('got user profile with email ' + result.rows[0].email);
});
let routes = require('./routes.json'); // Loaded with utils/routes-loader.js
const container = document.getElementById('container');
function renderComponent(component) {
  ReactDOM.render(<Provider store={store}>{component}</Provider>, container);
}
// Find and render a web page matching the current URL path,
// if such page is not found then render an error page (see routes.json, core/router.js)
function render(location) {
  router.resolve(routes, location)
    .then(renderComponent)
    .catch(error => router.resolve(routes, { ...location, error }).then(renderComponent));
}
continued code ......
I am getting the client logged to the console, but after that I get an error like:
connection.js:122 Uncaught TypeError: net.Socket is not a constructor(…)
Am I missing something here, or should this piece of code be written somewhere else?
Thanks

The cassandra-driver is meant to be run on a Node.js server, not in the client (i.e. browser).
So, you'll need to create a Node.js server of some kind for your client code (using React, Redux, or whatever) to talk to. For example, in a typical web application setup, your client code in the browser will:
1. make an HTTP call to the server,
2. the server will process that call and use the cassandra-driver to fetch the appropriate data for it,
3. and then send that data back to the client in the HTTP response.
This is a pretty gross simplification of how things could be set up, but this type of communication is common for many web applications, regardless of whether they're using Cassandra, Postgres, or any other database on the server. A minimal sketch of that split follows below.
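Assuming an Express server on port 3000 and the same excelsior keyspace and playlists table from the question (the endpoint name and the localDataCenter value are illustrative, not from the original post), the server side could look like this:

// server.js - runs under Node.js, NOT in the browser
const express = require('express');
const cassandra = require('cassandra-driver');

const client = new cassandra.Client({
  contactPoints: ['127.0.0.1'],
  localDataCenter: 'datacenter1', // required by driver v4+; adjust to your cluster
  keyspace: 'excelsior',
});

const app = express();

// The React client fetches this endpoint instead of talking to Cassandra directly
app.get('/api/playlists', async (req, res) => {
  try {
    const result = await client.execute('SELECT * FROM playlists');
    res.json(result.rows);
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: 'query failed' });
  }
});

app.listen(3000, () => console.log('API listening on http://localhost:3000'));

The browser code then only does an HTTP fetch:

fetch('http://localhost:3000/api/playlists')
  .then(res => res.json())
  .then(rows => console.log('playlists', rows));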

Related

Firebase getAuth() throws error getProvider of undefined but can access database

I have the following code running on a Node server.
import admin from 'firebase-admin';
import { getAuth } from 'firebase/auth';

class MyFirebase {
  constructor() {
    console.log("MyFirebase Constructor");
    this.firebaseApp = admin.initializeApp({
      credential: admin.credential.cert("PATH_TO_CERT/cert.json"),
      databaseURL: "https://DATABASE_URL",
    });
    console.log("App name=" + this.firebaseApp.name);
    this.defaultAuth = getAuth(this.firebaseApp);
    this.database = this.firebaseApp.database();
    // database ref code here...
  }
}
and it throws the following error:
return app.container.getProvider(name);
TypeError: Cannot read property 'getProvider' of undefined
If I remove "firebaseApp" from the getAuth(..) call I get this error:
No Firebase App '[DEFAULT]' has been created - call Firebase App.initializeApp() (app/no-app)
However the "console.log("App Name...")" line produces:
App name=[DEFAULT]
So clearly a DEFAULT app has been created. Additionally, if I remove the getAuth(...) call, the database calls below it that pull data from the Realtime Database work just fine, which seems to imply the authentication worked properly, because I can access data from the database.
What the heck is going on?
You are confusing the Firebase Admin SDK (Node.js) with the Firebase JavaScript SDK. The former is for the back-end, while the latter is for the front-end. I understand your confusion, because the front-end packages are installable via npm, although they are meant to be bundled with front-end code.
You can't do this:
import admin from 'firebase-admin' // back-end code
import { getAuth } from 'firebase/auth' // front-end code !!!
const adminApp = admin.initializeApp(...)
getAuth(adminApp) // TypeScript actually catches this error
/*
Argument of type 'App' is not assignable to parameter of type 'FirebaseApp'.
Property 'automaticDataCollectionEnabled' is missing in type 'App' but required in type 'FirebaseApp'.ts(2345)
app-public.d.ts(92, 5): 'automaticDataCollectionEnabled' is declared here.
const adminApp: admin.app.App
*/
If you are on the back-end, just use adminApp.auth() to get the Auth instance. If on the front-end, you need to call getAuth with the front-end Firebase App instance:
import { initializeApp } from 'firebase/app'
import { getAuth } from 'firebase/auth'
const app = initializeApp(...)
const auth = getAuth(app)
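For completeness, here is a minimal sketch of the back-end path (adminApp.auth()) mentioned above, using the namespaced Admin SDK API; the cert path is a placeholder:

import admin from 'firebase-admin'

const adminApp = admin.initializeApp({
  credential: admin.credential.cert("PATH_TO_CERT/cert.json"),
})

// The Auth instance comes from the Admin SDK itself; no 'firebase/auth' import involved
const adminAuth = adminApp.auth()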
The new modular APIs have a slightly different syntax. The following should still work if you wrap it in a class, but as long as you only do this once at the top of your (e.g. Express) server, you shouldn't need a class.
Also, I'm using the require syntax, but imports should work too, depending on your setup.
// Import each function from the correct module.
const { initializeApp, applicationDefault } = require("firebase-admin/app");
const { getAuth } = require("firebase-admin/auth");
const { getDatabase } = require("firebase-admin/database");

const app = initializeApp({
  credential: applicationDefault(), // Don't forget to export your configuration JSON: https://firebase.google.com/docs/admin/setup
  databaseURL: "https://DATABASE_URL",
});

const auth = getAuth(app);
const database = getDatabase(app);
It's not super well documented, but you can find hints in the Admin SDK reference: https://firebase.google.com/docs/reference/admin/node/firebase-admin.auth
One tip: in VS Code you should see a description of each function when you hover over it, if you have the import path formatted correctly.
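As a quick usage sketch of the handles obtained above (the ID token and the database path are placeholders):

// quick usage sketch; auth and database come from the snippet above
async function demo(idTokenFromClient) {
  // Verify an ID token sent by a front-end client
  const decoded = await auth.verifyIdToken(idTokenFromClient);
  console.log("Authenticated uid:", decoded.uid);

  // Read from the Realtime Database
  const snapshot = await database.ref("posts").once("value");
  console.log(snapshot.val());
}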

How to use mysql2 library with Sapper?

I am creating an application in Svelte Sapper. I have a routes/account/login.js API route where I am trying to use mysql2. The route itself works (I checked with Postman), but as soon as I import mysql, the server crashes and an error appears:
[rollup-plugin-svelte] The following packages did not export their `package.json` file so we could not check the "svelte" field. If you had difficulties importing svelte components from a package, then please contact the author and ask them to export the package.json file.
- mysql2
import mysql from "mysql2/promise";

export async function post(req, res) {
  // route test
  const { login, password } = req.body;
  res.end(`${login}, ${password}`);
}
What can I do to make this import work?
The Sapper documentation doesn't say anything about whether you additionally need to change something in the configuration: https://sapper.svelte.dev/docs#Server_routes
I found a solution. I had to create a #lib folder inside src/node_modules and put a file there, e.g. db.js. In that file you need to use require() instead of import, and then export the function that connects to the database. You can then import that function in the route:
// src/node_modules/#lib/db.js
const mysql = require("mysql2");

export async function connectToDatabase() {
  return mysql.createConnection({
    host: "localhost",
    ....
  });
}

// routes/account/login.js
import { connectToDatabase } from "#lib/db";

export async function post(req, res) {
  ...
}
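To make the flow concrete, a hedged sketch of what the elided route body might look like with that helper (the users table and its columns are made up for illustration):

// routes/account/login.js
import { connectToDatabase } from "#lib/db";

export async function post(req, res) {
  const { login, password } = req.body;
  const connection = await connectToDatabase();

  // mysql2 callback API; with mysql2/promise you would await connection.query(...) instead
  connection.query(
    "SELECT id FROM users WHERE login = ?",
    [login],
    (err, rows) => {
      if (err) {
        res.statusCode = 500;
        return res.end("database error");
      }
      res.end(JSON.stringify(rows));
    }
  );
}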

Keep fetching data up to date

I just have a question and want to ask if anybody has an idea about it.
I'm building a full-stack application backed by Node.js and using TypeScript for it. In my Node.js app I'm fetching from an API that I will later serve to the user, but I have one small issue: I'm using node-fetch for now, and the fetched data changes all the time (e.g. now I have 10 entries, after 5 seconds I have 30 entries). So is there a way or mechanism to keep the data in my Node.js app up to date by fetching in the background?
Thanks in advance!
The easiest solution to implement, and a good one for making your web app realtime, is Pusher: https://pusher.com/
This is how you can handle Pusher within your Node.js app:
import Pusher from 'pusher'

// Below are the keys that you will get from Pusher in the Getting Started
// section of your dashboard
const pusher = new Pusher({
  appId: "<Your app id provided by pusher>",
  key: "<Key id provided by pusher>",
  secret: "<Secret key given by pusher>",
  cluster: "<cluster given by pusher>",
  useTLS: true
});
Now you want to set up a change stream for your collection in MongoDB:

import mongoose from 'mongoose'

const db = mongoose.connection;
db.once('open', () => {
  // Adjust this to the collection you want to watch; make sure the name is accurate
  const postCollection = db.collection('posts');
  const changeStream = postCollection.watch();
  changeStream.on('change', (change) => {
    // The change event carries the content that changed in the DB collection
    const post = change.fullDocument;
    if (change.operationType === 'insert') {
      pusher.trigger('<channel you chose for your pusher>', '<event, in this case insert>', {
        newPost: post
      });
    }
  });
});
With that set up, your Pusher and backend are working; now it's time to set up the frontend.
If you're using vanilla JS, the Pusher Getting Started guide has code for you.
If you're using React, here is the code:
import { useEffect } from 'react'
import Pusher from 'pusher-js'

useEffect(() => {
  Pusher.logToConsole = true;

  const pusher = new Pusher('<Key received from pusher>', {
    cluster: '<cluster received from pusher>'
  });

  const channel = pusher.subscribe('<channel name that you wrote in the server>');
  channel.bind('<event that you wrote in the server>', (data) => {
    // These are the data entries coming in as soon as they hit the DB; update your
    // state here, using spread operators to keep what you have and add the new content
    alert(JSON.stringify(data));
  });

  // Very important to have a clean-up function
  return () => {
    channel.unbind();
    pusher.unsubscribe('<channel name that you wrote in the server>');
  };
}, []); // empty dependency array so this subscribes only once
And like this, you have everything working in realtime.

Is it safe to store socket.io sockets in a server object?

This might be a dumb question, but I am trying to explain it as simply as I can.
So I have been using axios to retrieve data from my server to my React app (Redux).
Since I use socket.io in my app a lot, I thought I could make the authentication use it as well, so I don't have to use axios and socket.io concurrently. I have been using PassportJS for authentication, but I can't wrap my head around how the socket.io passport packages I found on npm work.
I am using this redux-socket.io middleware to fire reducers on socket.io events. With it, I can dispatch actions that get sent to the server, and from the server I can fire the reducers I want in order to change the state.
Here is my store.js
import { createStore, applyMiddleware, compose } from "redux";
import thunk from "redux-thunk";
import rootReducer from "./reducers";
import { composeWithDevTools } from 'redux-devtools-extension';
import createSocketIoMiddleware from 'redux-socket.io';
import io from 'socket.io-client';
const socket = io('http://localhost:5000');
const socketIoMiddleware = createSocketIoMiddleware(socket, "io/");
const initialState = {};
const middleware = [thunk, socketIoMiddleware];
const store = createStore(
  rootReducer,
  initialState,
  composeWithDevTools(
    applyMiddleware(...middleware)
  )
);
// store.dispatch({type:'server/hello', data:'Hello!'});
export default store
Using this method, a socket is created for every client upon connecting to the website. I created a Store class on the server side as well, with login/logout methods like so (these keep track of the clients and the users associated with them):
class Store {
  constructor() {
    this.loggedInUsers = [];
  }

  loginUser(userID, socketID) {
    console.log('login', userID, socketID);
    // findIndex (not find), so we get the position of the matching entry
    const index = this.loggedInUsers.findIndex(u => u.socketID !== socketID && u.userID === userID);
    if (index >= 0) {
      this.logoutUser(this.loggedInUsers[index].socketID);
    }
    this.loggedInUsers = [
      ...this.loggedInUsers,
      {
        userID: userID,
        socketID: socketID,
      }
    ];
  }

  logoutUser(socketID) {
    console.log('logout', socketID);
    const index = this.loggedInUsers.findIndex(u => u.socketID === socketID);
    if (index >= 0) {
      this.loggedInUsers = [...this.loggedInUsers.slice(0, index), ...this.loggedInUsers.slice(index + 1)];
    }
  }
}

module.exports.store = new Store();
On login, a socket.io event is emitted to the server; on a successful login the socket gets stored in a loggedInUsers object that holds the user IDs and the sockets associated with them, and a reducer then fires to store the authenticated state in my store. On logout, the user with the given socket gets removed from the object.
Now I am wondering whether this is even safe to do or not.
I have done the exact same thing in a previous project that I worked on: saved the Mongo and socket IDs of each user in an array.
Think of it this way: how much memory will one of these objects take, maybe 50 or 60 bytes? So technically you can have ~10 million of these in memory, assuming a maximum usage of 500 MB. The Node memory limit goes from roughly 700 MB to 1.4 GB between 32- and 64-bit systems, so with 10 million of these entries (which are not the same as 10 million simultaneous users, since a user can have more than one tab open, and one socket is created for each tab/instance) you should be safe, given that Node itself is assumed not to be used for intensive operations.
Now these are the limits, and in the end you can still stretch beyond them by putting Redis or Memcached in between memory and storage speed. Since I don't know your scenario, I assume you are pretty safe in terms of leaks as long as you correctly clean up the logged-out users, for example on disconnect, as in the sketch below.
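A minimal sketch of that cleanup, assuming the socket.io server instance is called io and the Store instance from the question is imported as store (names are illustrative):

const { store } = require('./store');

io.on('connection', (socket) => {
  // ... on successful login: store.loginUser(userID, socket.id) ...

  // Remove the entry when the client disconnects, so the array cannot leak
  socket.on('disconnect', () => {
    store.logoutUser(socket.id);
  });
});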

Can you keep a PostgreSQL connection alive from within a Next.js API?

I'm using Next.js for my side project. I have a PostgreSQL database hosted on ElephantSQL. Inside the Next.js project, I have a GraphQL API set up, using the apollo-server-micro package.
Inside the file where the GraphQL API is set up (/api/graphql), I import a database helper-module. Inside that, I set up a pool connection and export a function which uses a client from the pool to execute a query and return the result. This looks something like this:
// import node-postgres module
import { Pool } from 'pg'

// set up a pool using environment variables, with a maximum of three active clients at a time
const pool = new Pool({ max: 3 })

// query function which checks out the next available client to execute a single query
// and returns the rows on success
export async function queryPool(query) {
  let payload
  try {
    // try executing the query
    const res = await pool.query(query)
    payload = res.rows
  } catch (e) {
    console.error(e)
  }
  return payload
}
The problem I'm running into, is that it appears as though the Next.js API doesn't (always) keep the connection alive but rather opens up a new one (either for every connected user or maybe even for every API query), which results in the database quickly running out of connections.
I believe that what I'm trying to achieve is possible for example in AWS Lambda (by setting context.callbackWaitsForEmptyEventLoop to false).
It is very possible that I don't have a proper understanding of how serverless functions work and this might not be possible at all but maybe someone can suggest me a solution.
I have found a package called serverless-postgres and I wonder if that might be able to solve it but I'd prefer to use the node-postgres package instead as it has much better documentation. Another option would probably be to move away from the integrated API functionality entirely and build a dedicated backend-server, which maintains the database connection but obviously this would be a last resort.
I haven't stress-tested this yet, but it appears that the MongoDB Next.js example solves this problem by attaching the database connection to global in a helper function. The important bit in their example is here.
Since the pg connection is a bit more abstract than mongodb's, it appears this approach takes just a few lines for us pg enthusiasts:
// eg, lib/db.js
const { Pool } = require("pg");

if (!global.db) {
  global.db = { pool: null };
}

export function connectToDatabase() {
  if (!global.db.pool) {
    console.log("No pool available, creating new pool.");
    global.db.pool = new Pool();
  }
  return global.db;
}
then in, eg, our API route, we can just:
// eg, pages/api/now
import { connectToDatabase } from "../../lib/db"; // adjust the path to wherever lib/db.js lives

export default async (req, res) => {
  const { pool } = connectToDatabase();
  try {
    const time = (await pool.query("SELECT NOW()")).rows[0].now;
    res.end(`time: ${time}`);
  } catch (e) {
    console.error(e);
    res.status(500).end("Error");
  }
};
