How to extract data using async and await in Node.js?

I tried to implement a Redis cache in Node.js with MongoDB. I set the data in the cache, but I can't get it back out. How do I solve this issue?
cache.js
const redis = require('redis');
const client = redis.createClient();

async function Get_Value() {
  // awaiting client.get directly does not resolve to the cached value
  let response = await client.get('products');
  console.log("_______________");
  console.log(response);
}
Output I got: true
Expected output: the JSON data
How do I get the JSON data back with the cache get method?

The redis client for Node.js does not ship a full async/await API, so the usual workaround is to promisify the library.
const { promisify } = require('util');
const getAsync = promisify(client.get).bind(client);

async function getValue() {
  let response = await getAsync("products");
  return response;
}
Another approach is to promisify the entire redis library with bluebird:
const bluebird = require('bluebird');
const redis = require('redis');
bluebird.promisifyAll(redis);
Now you will also be able to call the client's methods with async/await; promisifyAll adds an Async-suffixed variant of every method.
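For example, the original Get_Value could then look like this (a minimal sketch assuming the cached value was stored as a JSON string):

const bluebird = require('bluebird');
const redis = require('redis');
bluebird.promisifyAll(redis);

const client = redis.createClient();

async function getProducts() {
  // getAsync is the promisified variant added by promisifyAll
  const raw = await client.getAsync('products');
  return JSON.parse(raw); // assumes the value was stored with JSON.stringify
}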

Related

Is it possible to reuse a nock object?

I am using nock to test HTTP endpoints, and I don't want to define the headers and base URL multiple times.
Is it problematic to do something like this?
const serviceNock = nock('https://my-service.local')
  .matchHeader('Api-Key', 'my-api-key')
  .matchHeader('Content-Type', 'application/json')
Use serviceNock in test 1:
const serviceNock1 = serviceNock
  .patch('/resources')
  .reply(200)
Use serviceNock in test 2:
const serviceNock2 = serviceNock
  .patch('/resources')
  .reply(500)
I have no experience with nock (I don't know whether chaining creates a new object or mutates the existing one), but why not do something like the following:
const getBaseNockService = () => nock('https://my-service.local')
  .matchHeader('Api-Key', 'my-api-key')
  .matchHeader('Content-Type', 'application/json')

const serviceNock1 = getBaseNockService()
  .patch('/resources')
  .reply(200)

const serviceNock2 = getBaseNockService()
  .patch('/resources')
  .reply(500)
Here we define a factory function, getBaseNockService; every call builds a fresh scope with the shared headers, which gives you the reuse you asked for.
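In a test suite you would typically call the factory inside each test and reset interceptors between tests (a sketch assuming Jest-style hooks; the route and statuses are from the question):

afterEach(() => nock.cleanAll()); // drop any unconsumed interceptors

test('PATCH /resources succeeds', async () => {
  const scope = getBaseNockService()
    .patch('/resources')
    .reply(200);
  // ...exercise the code under test here...
  scope.done(); // throws if the interceptor was never hit
});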

Shopify API Node/Express: Cannot read properties of undefined (reading 'Rest')

Just starting off with Shopify, and trying to get an order. Following the Shopify API documentation, here is my code:
const Shopify = require('@shopify/shopify-api');
const client = new Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);

module.exports.getShopifyOrderById = async (orderId) => {
  return await client.get({
    path: `orders/${orderId}`,
  });
}
I get the following error when I execute this code:
TypeError: Cannot read properties of undefined (reading 'Rest')
Can't seem to figure out what the issue is.
You need to use object destructuring to get the Shopify object, or use the default export, as shown below.
const { Shopify } = require('@shopify/shopify-api');
const client = new Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);
OR
const Shopify = require('@shopify/shopify-api').default;
const client = new Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);
OR
const ShopifyLib = require('@shopify/shopify-api');
const client = new ShopifyLib.Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);
This has to do with how ES6 modules are emulated in CommonJS and with how you import the module.
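Applying the destructured import to the original helper gives roughly the following (the response shape is an assumption; in the library versions I have seen, the REST client resolves to an object with a body property):

const { Shopify } = require('@shopify/shopify-api');

const client = new Shopify.Clients.Rest(
  'my-store.myshopify.com',
  process.env.SHOPIFY_KEY
);

module.exports.getShopifyOrderById = async (orderId) => {
  const response = await client.get({
    path: `orders/${orderId}`,
  });
  return response.body; // assumption: the payload lives under body
};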

How to share a single promise based RabbitMQ connection across files or controllers in Node js instead of creating a new Connection each time?

The amqplib library lets you create a RabbitMQ connection, and that object is the gateway to everything else, such as creating channels.
Suppose I'm going for a producer/consumer pattern: each time a user hits a specific route, a job is produced and sent to the RabbitMQ server, where it's processed by certain consumers (workers).
app.post("/routethatdelegatesheavywork", async (req, res) => {
  const amqpServerLink = "link-to-cloudmq";
  const connection = await amqp.connect(amqpServerLink);
  const channel = await connection.createChannel();
  // do other stuff with channel
})
While this "works", I don't want to re-create that connection every time the controller is invoked, since it makes the producer very slow and it's really not how it's supposed to be done.
Here is my problem:
How do I initialize one connection and re-use it every time I need it?
I have tried to create a connection outside the controllers and use it when necessary, but that doesn't work directly, since the connection is promise-based and await has to be inside an async function; it doesn't work at the entry point of a CommonJS module.
Although it is possible to use top-level await with ESM (ES modules), I don't want to do that, since I have written the whole application in CommonJS (require("package")); changing that would mean going through a lot of files and converting every import/export to ESM.
So, is there any other way to create one promise-based connection and re-use it without migrating to ESM syntax?
Yes. Remember that modules loaded with require in Node.js are singletons (the module is cached after its first load). Make a new amqpServerInterface module, and do:
const amqp = require('amqplib');

const amqpServerLink = "link-to-cloudmq";
const connection = amqp.connect(amqpServerLink);

function connect() {
  return connection;
}

module.exports = {
  connect
};
Then in your controllers:
const amqpServerInterface = require('./amqpServerInterface');

app.post("/routethatdelegatesheavywork", async (req, res) => {
  const connection = await amqpServerInterface.connect();
  const channel = await connection.createChannel();
  // do other stuff with channel
})
This will always return the same connection promise, and it will resolve to the same connection.
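You can push the same idea one step further and memoize a channel as well, so controllers never touch the connection directly (a sketch assuming amqplib's promise API; the module and link names follow the answer above):

// amqpServerInterface.js
const amqp = require('amqplib');

const connection = amqp.connect("link-to-cloudmq"); // created once per process
let channelPromise;

function getChannel() {
  // create the channel lazily on first use, then hand back the same promise
  if (!channelPromise) {
    channelPromise = connection.then((conn) => conn.createChannel());
  }
  return channelPromise;
}

module.exports = { getChannel };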

Error passing empty credentials to Firestore emulator

I am trying to seed some sample data into my local Firestore emulator database. I adapted the example from this GitHub issue.
My code looks like this:
const {Firestore} = require('@google-cloud/firestore');
const {credentials} = require('grpc');

const db = new Firestore({
  projectId: 'my-project-id',
  servicePath: 'localhost',
  port: 8100,
  sslCreds: credentials.createInsecure(),
  customHeaders: {
    "Authorization": "Bearer owner"
  }
});

async function load_data() {
  await db.collection("mycollection").doc("myid").set({ foo: "test" });
}

load_data();
But I receive the error:
this.credentials._getCallCredentials is not a function
Tested on Node 10 and 12 with the same error.
Library versions:
@google-cloud/firestore 3.5.1
grpc 1.24.2
Is there a better approach to writing to local emulated firestore? Or is there something wrong with my code?
The problem here is that you're trying to use two different implementations of gRPC together. Internally, Firestore uses @grpc/grpc-js, so that is what you should be using. You should only need to change the second line to const {credentials} = require('@grpc/grpc-js'); and switch the dependency to that library.
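Applied to the snippet above, only the import changes (everything else stays exactly as in the question):

const {Firestore} = require('@google-cloud/firestore');
const {credentials} = require('@grpc/grpc-js'); // same gRPC implementation Firestore uses internally

const db = new Firestore({
  projectId: 'my-project-id',
  servicePath: 'localhost',
  port: 8100,
  sslCreds: credentials.createInsecure(),
  customHeaders: {
    "Authorization": "Bearer owner"
  }
});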

Do not see the reason I am getting an ENOENT returned when I can see the file at the exact spot I am calling for it to be

I know this is very similar to other questions that have been asked about the same error. In the cases I have seen, though, the file name had been left off the URL. In my case (as far as I know) the URL is specified as it should be, and I can see the file on my localhost using other tools.
I need a Node.js app to perform I/O on JSON files without the benefit of Express routing. This is an API that has only one route (processor.js). It is accessed by a menu selection in the GUI, by selecting 'Process'. From that point on, everything happens within that route, including multiple GETs/PUTs to JSON (for IDs to data, then using the IDs to get the data) and the building of SQL rows for populating SQL Server tables from the parsed JSON data. That, at least, is the concept I am testing now. It is the hand I have been dealt, so I don't have other options.
I am using fs-extra rather than request or axios etc., because they all seem to expect Express routes to accomplish the I/O. I appear to be able to read and write the JSON directly with fs-extra. I am using sequelize (or will be) for the SQL side.
That's the background.
Here is my processor.js (at this point I am merely validating that I can in fact get idsList returned to me):
'use strict';

// node_modules
const express = require('express');
const router = express.Router();
const fse = require('fs-extra');

// local modules
const idsList = require('../functions/getIds');

router.get('/', (req, res) => {
  console.log(idsList);
});

module.exports = router;
Here is my getIds function:
'use strict';

// library modules
const express = require('express');
const router = express.Router();
const fse = require('fs-extra');
const uri = require('../uri');

// initialize general variables
let baseURL = `http://localhost:5000${uri}/`;
let idsID = 'ids.json';

const getIds = async () => {
  let url = `${baseURL}${idsID}`;
  try {
    const idsList = await fse.readJson(url);
    console.log('fse.readJson', idsList);
  } catch (err) {
    console.error(err);
  }
};

module.exports = getIds();
And here is my error, output to the console (reformatted slightly):
Listening on port 5000...
{ [Error: ENOENT: no such file or directory, open
    'http://localhost:5000/Users/doug5solas/sandbox/libertyMutual/playground/api/ids.json']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: 'http://localhost:5000/Users/doug5solas/sandbox/libertyMutual/playground/api/ids.json' }
What am I missing?
fs-extra can only manipulate files and directories on your local file system; it cannot open an http:// URL.
If you want to read a file hosted on another machine over HTTP, use an HTTP client such as axios.
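For example, getIds could fetch the file over HTTP like this (a sketch reusing baseURL and idsID from the question):

const axios = require('axios');

const getIds = async () => {
  let url = `${baseURL}${idsID}`;
  try {
    const { data: idsList } = await axios.get(url); // axios parses JSON bodies automatically
    return idsList;
  } catch (err) {
    console.error(err);
  }
};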
I moved away from fs-extra to fs.readFileSync and solved the problem. It is not my preference, but it does work: the file is small and is read only once.
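For reference, the synchronous version is just the following (a sketch assuming ids.json sits next to the module, since the file is local after all):

const fs = require('fs');
const path = require('path');

// a small file read once at startup, so a synchronous read is acceptable here
const idsList = JSON.parse(
  fs.readFileSync(path.join(__dirname, 'ids.json'), 'utf8')
);

module.exports = idsList;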
