Error: Module name "cassandra-driver" has not been loaded yet for context: _. Use require([]) - node.js

I am using the DataStax cassandra-driver to work with a database.
This is connect-database.mjs:
import { require } from "./requirejs.mjs";

export async function run() {
  const { Client } = require("cassandra-driver");
  const client1 = new Client({
    cloud: {
      get secureConnectBundle() {
        return "secure-connect-amazonfeud.zip";
      }
    },
    credentials: {
      get username() {
        return "<my username>";
      },
      get password() {
        return "<my password>";
      }
    },
  });
  await client1.connect();
  const rs = await client1.execute("SELECT * FROM feud.users");
  const results = await client1.execute("update feud.users set score=250 where id=1");
  console.log(rs['rows'][0]);
  console.log(`Your cluster returned ${rs.rowLength} row(s)`);
  await client1.shutdown();
}
This is main.js:
import { run } from "./connect-database.mjs";
run()
When I run connect-database.mjs directly, it works, but when I run main.js it gives me the error "Uncaught Error Error: Module name "cassandra-driver" has not been loaded yet for context: _. Use require([])
https://requirejs.org/docs/errors.html#notloaded"
When I change the format to require([]), it says "Uncaught TypeError TypeError: Client is not a constructor".
Please help.

If you're using a custom require in order to load cassandra-driver, you don't need to do that. The Client class is exposed through module.exports in cassandra-driver, so you can use a plain import.
An example that worked for me:
cassandraDriverTest.mjs
import { Client } from 'cassandra-driver';
import { inspect } from 'util';

const client = new Client({
  contactPoints: ['cp'],
  localDataCenter: 'dc1',
  keyspace: 'ks'
});

const query = '<query>';
client.execute(query)
  .then(result => console.log('User with email %s', result.rows[0].some_data));

console.log(inspect(client));
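Applied to the question's connect-database.mjs, the same approach would look roughly like this (a sketch; the bundle path and credentials are the question's own placeholders):

// connect-database.mjs - sketch using a plain ESM import instead of requirejs
import { Client } from "cassandra-driver";

export async function run() {
  const client = new Client({
    cloud: { secureConnectBundle: "secure-connect-amazonfeud.zip" },
    credentials: { username: "<my username>", password: "<my password>" },
  });
  await client.connect();
  const rs = await client.execute("SELECT * FROM feud.users");
  await client.execute("update feud.users set score=250 where id=1");
  console.log(rs.rows[0]);
  console.log(`Your cluster returned ${rs.rowLength} row(s)`);
  await client.shutdown();
}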

Related

How to mock @google-cloud/kms using jest

I'm trying to write unit test cases for decrypt. I have my own implementation of decrypting an encrypted file. While trying to import decrypt.mjs I'm facing the following error:
Must use import to load ES Module: /node_modules/bignumber.js/bignumber.mjs
My application is a React frontend and a Node.js backend. I've used ES6 modules for the Node.js side. Here is my decrypt.mjs file:
import { readFile } from 'fs/promises';
import path from 'path';
import { KeyManagementServiceClient } from '@google-cloud/kms';

const decrypt = async (APP_MODE, __dirname) => {
  if (APP_MODE === 'LOCALHOST') {
    const keys = await readFile(
      new URL(`./stagingfile.json`, import.meta.url)
    ).then((data) => JSON.parse(data));
    return keys;
  }
  const { projectId, locationId, keyRingId, cryptoKeyId, fileName } =
    getKMSDefaults(APP_MODE);
  const ciphertext = await readFile(
    path.join(__dirname, `/${fileName}`)
  );
  // create the client before using it to build the key name
  const client = new KeyManagementServiceClient();
  const formattedName = client.cryptoKeyPath(
    projectId,
    locationId,
    keyRingId,
    cryptoKeyId
  );
  const request = {
    name: formattedName,
    ciphertext,
  };
  const [result] = await client.decrypt(request);
  return JSON.parse(result.plaintext.toString('utf8'));
};

const getKMSDefaults = (APP_MODE) => {
  // Based on APP_MODE the following object contains different values
  return {
    projectId: PROJECT_ID,
    locationId: LOCATION_ID,
    keyRingId: KEY_RING_ID,
    cryptoKeyId: CRYPTO_KEY_ID,
    fileName: FILE_NAME,
  };
};

export default decrypt;
I tried to mock @google-cloud/kms using a manual mock (jest), but it didn't work. I tried multiple mocking approaches, but nothing worked and it always ended with the "Must use import to load ES Module" error.
I've successfully used jest to mock @google-cloud/kms with TypeScript, so hopefully the same approach works for ES modules.
Example working code:
// jest will "hoist" jest.mock to the top of the file on its own anyway
jest.mock("@google-cloud/kms", () => {
  return {
    KeyManagementServiceClient: jest.fn().mockImplementation(() => {
      return {
        encrypt: kmsEncryptMock,
        decrypt: kmsDecryptMock,
        cryptoKeyPath: () => kmsKeyPath,
      };
    }),
  };
});

// give names to mocked functions for easier access in tests
const kmsEncryptMock = jest.fn();
const kmsDecryptMock = jest.fn();
const kmsKeyPath = `project/location/keyring/keyname`;

// import of SUT must be after the variables used in jest.mock() are defined, not before.
import { encrypt } from "../../src/crypto/google-kms";

describe("Google KMS encryption service wrapper", () => {
  const plaintext = "some text to encrypt";
  const plaintextCrc32 = 1897295827;

  it("sends the correct request to kms service and raise error on empty response", async () => {
    // encrypt function is async that throws a "new Error(...)"
    await expect(encrypt(plaintext)).rejects.toMatchObject({
      message: "Encrypt: no response from KMS",
    });
    expect(kmsEncryptMock).toHaveBeenNthCalledWith(1, {
      name: kmsKeyPath,
      plaintext: Buffer.from(plaintext),
      plaintextCrc32c: { value: plaintextCrc32 },
    });
  });
});
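Since the question uses native ES modules rather than ts-jest, the mock may need to be registered through Jest's ESM support instead of jest.mock. A minimal sketch, assuming Jest is run with --experimental-vm-modules (the key path and the returned plaintext are made up for illustration):

import { jest, test, expect } from '@jest/globals';

// register the mock before anything imports '@google-cloud/kms'
const kmsDecryptMock = jest.fn().mockResolvedValue([{ plaintext: Buffer.from('{"key":"value"}') }]);

jest.unstable_mockModule('@google-cloud/kms', () => ({
  KeyManagementServiceClient: jest.fn().mockImplementation(() => ({
    decrypt: kmsDecryptMock,
    cryptoKeyPath: () => 'project/location/keyring/keyname',
  })),
}));

test('the KMS client resolved after mocking is the mock', async () => {
  // dynamic import AFTER the mock is registered, so the module under test would see the mock too
  const { KeyManagementServiceClient } = await import('@google-cloud/kms');
  const client = new KeyManagementServiceClient();
  const [result] = await client.decrypt({ name: 'some/key/path', ciphertext: Buffer.alloc(0) });
  expect(result.plaintext.toString('utf8')).toBe('{"key":"value"}');
  expect(kmsDecryptMock).toHaveBeenCalledTimes(1);
});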

TypeORM and MongoDB and Repositories: Cannot read property 'prototype' of undefined

I'm trying to implement TypeORM with MongoDB using repositories. However, when I try to make use of repositories to manage the database, using the same structure as in this repository, things go a bit sideways. I'm getting the following error:
UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'prototype' of undefined
I have tried the following code:
import { Request, Response } from 'express';
import { getMongoRepository } from "typeorm";
import Task from "../models/Task";

export default class TasksController {
  async listAll(request: Request, response: Response): Promise<Response> {
    const tasksRepository = getMongoRepository(Task);
    try {
      const tasks = await tasksRepository.find();
      return response.status(200).json({ "items": tasks });
    } catch (err) {
      return response.status(400).json({
        message: err.message,
      });
    }
  }
}
I know the error refers to the .find() method. I have even managed to fetch the data by following a suggestion from this post and replacing:
const tasks = await tasksRepository.find();
with
const tasks = await tasksRepository.createCursor(tasksRepository.find()).toArray();
but I still get the above mentioned error.
Does anyone understand what's going on?
I have also managed to save data directly to the database through the use of the following script:
server.ts
import express from 'express';
import { createConnection } from 'typeorm';
import Task from './models/Task'; // import path assumed

const app = express();
const port = 3333;
const connection = createConnection();

app.use(express.json());

app.post('/tasks', (async (request, response) => {
  const { item } = request.body;
  const task = new Task();
  task.item = item;
  (await connection).mongoManager.save(task);
  return response.send(task);
}));

app.listen(port, () =>
  console.log(`Server running on port ${port}`)
);
TypeORM does not support the MongoDB driver v4:
https://github.com/nestjs/nest/issues/7798
You can use mongodb 3.7.0 instead.
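For example, pinning the driver version (command assumed; adjust to your package manager):

npm install mongodb@3.7.0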
I submitted a pull request to resolve this: https://github.com/typeorm/typeorm/pull/8412, if anyone is looking for a workaround in the meantime.

Next.js - using BigQuery client library gives an error: Module not found: Can't resolve 'child_process'

I am trying to query a BigQuery dataset from a Next.js project.
I have installed @google-cloud/bigquery and followed the steps from here.
I have also tried the Next.js-related solutions from this link but am still getting the error below.
It looks like next.config.js needs to be configured to allow this API call, but I am not sure what needs to be changed.
Could someone please help me resolve this issue?
Here is my code:
const { BigQuery } = require("@google-cloud/bigquery");

const bigquery = new BigQuery();

useEffect(() => {
  async function queryBigQuery() {
    const query = `
      SELECT fieldname
      FROM \`db.dataset.tablename\` WHERE columnname = 50
      LIMIT 10`;
    const options = {
      query: query,
    };
    // Run the query
    const [rows] = await bigquery.query(options);
    console.log("Query Results:");
    rows.forEach((row) => {
      const url = row["url"];
      const viewCount = row["view_count"];
      console.log(`url: ${url}, ${viewCount} views`);
    });
  }
  queryBigQuery();
}, []);
wait - compiling...
error - ./node_modules/google-auth-library/build/src/auth/googleauth.js:17:0
Module not found: Can't resolve 'child_process'
UPDATED:
I think I am now able to load the BigQuery library on the client side, but it is giving me a new error.
Here is my latest next.config.js file:
module.exports = {
  webpack: (config, { isServer, webpack }) => {
    if (!isServer) {
      config.node = {
        dgram: "empty",
        fs: "empty",
        net: "empty",
        tls: "empty",
        child_process: "empty",
      };
    }
    return config;
  },
  env: {
    // project variables
  },
};
New Error:
@google-cloud/bigquery is meant to run in a Node.js environment; it won't work in the browser.
You'll need to move your code to a data-fetching method like getStaticProps/getServerSideProps or to an API route, as they all run server-side.
Here's an example using an API route, as it seems to fit your use case best.
// pages/api/bigquery
const { BigQuery } = require("@google-cloud/bigquery");

const bigquery = new BigQuery();

export default async function handler(req, res) {
  const query = `
    SELECT fieldname
    FROM \`db.dataset.tablename\` WHERE columnname = 50
    LIMIT 10
  `;
  const options = {
    query: query,
  };
  // Run your query/logic here
  const [rows] = await bigquery.query(options);
  res.json(rows); // Return your JSON data after logic has been applied
}
Then, in your React component's useEffect:
const queryBigQuery = async () => {
  const res = await fetch('api/bigquery');
  const data = await res.json(); // Returns JSON data from API route
  console.log(data);
}

useEffect(() => {
  queryBigQuery();
}, []);
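If the data is needed at render time instead of on the client, the same query can also run in getServerSideProps (a sketch under the same assumptions; the query is the one from the question, and non-JSON-serializable row values may need converting before being returned as props):

// pages/index.js (sketch) - run the BigQuery query server-side via getServerSideProps
const { BigQuery } = require("@google-cloud/bigquery");

export async function getServerSideProps() {
  const bigquery = new BigQuery();
  const [rows] = await bigquery.query({
    query: `
      SELECT fieldname
      FROM \`db.dataset.tablename\` WHERE columnname = 50
      LIMIT 10`,
  });
  // Note: convert any non-JSON-serializable values (e.g. timestamps) before returning them as props
  return { props: { rows } };
}

export default function Page({ rows }) {
  return <pre>{JSON.stringify(rows, null, 2)}</pre>;
}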

Using a variable hangs build process when accessed via import

GitHub discussion w/ solution: https://github.com/prisma/prisma/discussions/4038#discussioncomment-111664
I'm building a full-stack app using Prisma and Web3. As part of Prisma's build process I use yarn generate to compile the schema. I've run into a strange issue where my program hangs on a return statement.
// ./modules/Wallet.ts
import Web3 from 'web3'

// create a Web3 connection
export const web3 = new Web3(
  process.env.NODE_ENV === 'production'
    ? 'wss://mainnet.infura.io/ws/v3/redacted'
    : 'wss://ropsten.infura.io/ws/v3/redacted'
)

// load an Ethereum wallet from a private key (env var)
const envWallet = () => {
  if (!process.env.ETH_PRIVATE_KEY) {
    require('dotenv').config()
  }
  const _wallet = web3.eth.accounts.wallet.add(process.env.ETH_PRIVATE_KEY!)
  // successfully logs the loaded wallet on `yarn generate`
  console.log(_wallet)
  // Issue: `yarn generate` hangs on this line
  return _wallet
}

// initialize the wallet
const wallet = envWallet()

// get the address of the wallet
export const getAddress = () => {
  return wallet.address
}
// ./types/Query.ts
// If I omit this file from the schema, it will compile successfully
import { queryType } from '@nexus/schema'
import { getAddress } from '../modules/Wallet'

console.log('Test 1') // logged after the console.log(_wallet)

const Query = queryType({
  definition(t) {
    // simply a GraphQL query that returns the address of the loaded wallet
    t.string('getAddress', {
      nullable: true,
      resolve: (root, args, ctx) => {
        console.log('Test 2') // never hits
        const address = getAddress()
        console.log('Test 3') // never hits
        return address
      },
    })
  },
})

export default Query
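One common workaround, assuming the hang comes from the WebSocket provider being opened at import time and keeping the generate process alive, is to initialize the connection lazily so nothing connects until a resolver actually needs it (a sketch, not the exact fix from the linked discussion):

// ./modules/Wallet.ts - sketch: create the Web3 connection and wallet lazily
import Web3 from 'web3'

let wallet: any // cached after first use; typed loosely for the sketch

const getWallet = () => {
  if (!wallet) {
    if (!process.env.ETH_PRIVATE_KEY) {
      require('dotenv').config()
    }
    // the websocket provider is only opened here, not at import time
    const web3 = new Web3(
      process.env.NODE_ENV === 'production'
        ? 'wss://mainnet.infura.io/ws/v3/redacted'
        : 'wss://ropsten.infura.io/ws/v3/redacted'
    )
    wallet = web3.eth.accounts.wallet.add(process.env.ETH_PRIVATE_KEY!)
  }
  return wallet
}

export const getAddress = () => getWallet().address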

Need to find the error with connecting subscription with schema stitching

I am using apollo-server-express for my GraphQL back end. I am going to process only mutations there, but I want to redirect queries and subscriptions to Hasura by means of schema stitching with introspection. Queries through apollo-server to Hasura are working fine and returning the expected data.
But subscriptions are not working and I am getting this error: "Expected Iterable, but did not find one for field subscription_root.users".
Besides, the Hasura server is receiving the events, but apollo-server rejects the answer from Hasura. I have been struggling with this for days and I cannot understand what the problem is.
In the Hasura editor, subscriptions work.
Link to full code
If you need any additional info, I will gladly provide it to you.
import {
  introspectSchema,
  makeExecutableSchema,
  makeRemoteExecutableSchema,
  mergeSchemas,
  transformSchema,
  FilterRootFields
} from 'graphql-tools';
import { HttpLink } from 'apollo-link-http';
import nodeFetch from 'node-fetch';
import { resolvers } from './resolvers';
import { hasRoleResolver } from './directives';
import { typeDefs } from './types';
import { WebSocketLink } from 'apollo-link-ws';
import { split } from 'apollo-link';
import { getMainDefinition } from 'apollo-utilities';
import { SubscriptionClient } from 'subscriptions-transport-ws';
import * as ws from 'ws';
import { OperationTypeNode } from 'graphql';

interface IDefinitionsParams {
  operation?: OperationTypeNode,
  kind: 'OperationDefinition' | 'FragmentDefinition'
}

const wsurl = 'ws://graphql-engine:8080/v1alpha1/graphql';

const getWsClient = function (wsurl: string) {
  const client = new SubscriptionClient(wsurl, {
    reconnect: true,
    lazy: true
  }, ws);
  return client;
};

const wsLink = new WebSocketLink(getWsClient(wsurl));

const createRemoteSchema = async () => {
  const httpLink = new HttpLink({
    uri: 'http://graphql-engine:8080/v1alpha1/graphql',
    fetch: (nodeFetch as any)
  });
  const link = split(
    ({ query }) => {
      const { kind, operation }: IDefinitionsParams = getMainDefinition(query);
      console.log('kind = ', kind, 'operation = ', operation);
      return kind === 'OperationDefinition' && operation === 'subscription';
    },
    wsLink,
    httpLink,
  );
  const remoteSchema = await introspectSchema(link);
  const remoteExecutableSchema = makeRemoteExecutableSchema({
    link,
    schema: remoteSchema
  });
  const renamedSchema = transformSchema(
    remoteExecutableSchema,
    [
      new FilterRootFields((operation, fieldName) => {
        return (operation === 'Mutation') ? false : true; // && fieldName === 'password'
      })
    ]
  );
  return renamedSchema;
};

export const createNewSchema = async () => {
  const hasuraExecutableSchema = await createRemoteSchema();
  const apolloSchema = makeExecutableSchema({
    typeDefs,
    resolvers,
    directiveResolvers: {
      hasRole: hasRoleResolver
    }
  });
  return mergeSchemas({
    schemas: [
      hasuraExecutableSchema,
      apolloSchema
    ]
  });
};
Fixed by installing graphql-tools version 4. It turns out the editor did not even notice that I did not have this dependency declared and simply picked up the copy in node_modules that had been installed by some other package. The problem was with version 3.x; the bug was fixed in a pull request.
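To declare the dependency explicitly (command assumed; adjust to your package manager):

npm install graphql-tools@^4.0.0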
I had the same problem, different cause and solution.
My subscription was working well until I introduced the 'resolve' key in my subscription resolver.
Here is the 'Subscription' part of my resolver:
Subscription: {
  mySubName: {
    resolve: (payload) => {
      console.log('In mySubName resolver, payload:', payload)
      return payload;
    },
    subscribe: () => pubSub.asyncIterator(['requestsIncomplete']),
  },
},
The console.log proved the resolve() function was being called with a well-structured payload (shaped the same as my schema definition - specifically, an object with a key named after the GraphQL subscription, pointing to an array, which is an iterable):
In mySubName resolver, payload: { mySubName:
  [ { id: 41,
      ...,
    },
    {...},
    {...}
    ...
    ...
  ]
Even though I was returning that same unadulterated object, it caused the error "Expected Iterable, but did not find one for field Subscription.mySubName".
When I commented out that resolve function altogether, the subscription worked, which is further evidence that my payload was well structured, with the right key pointing to an iterable.
I must be misusing the resolve field. From https://www.apollographql.com/docs/graphql-subscriptions/subscriptions-to-schema/:
When using subscribe field, it's also possible to manipulate the event
payload before running it through the GraphQL execution engine.
Add resolve method near your subscribe and change the payload as you wish
so I am not sure how to use that function properly (specifically, what shape of object to return from it), but using it as above breaks the subscription in the same way you describe in your question.
I was already using graphql-tools 4.0.0; I upgraded to 4.0.8 but it made no difference.
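For reference, the shape that ended up working in the answer above is the subscribe-only form, without a resolve key (same hypothetical names as the snippet above):

Subscription: {
  mySubName: {
    // no resolve key: the payload published via pubSub is passed straight through
    subscribe: () => pubSub.asyncIterator(['requestsIncomplete']),
  },
},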
