I am using this npm package:
https://www.npmjs.com/package/uuid
I want to generate a v5 UUID.
I can generate a v4 no problem by requiring the module:
const { v4: uuidv4 } = require('uuid');
and then running:
console.log(`uuidv4: ${uuidv4()}`);
So then I try to generate a v5:
const { v5: uuidV5 } = require('uuid');
const MY_NAMESPACE = 'f709b20b-3353-4c32-8df9-66bc48e91ea9';
var v5uuid = uuidV5('hello', MY_NAMESPACE);
console.log(`userUUID: ${v5uuid}`);
However, the app gets to the line var v5uuid = uuidV5('hello', MY_NAMESPACE); and then goes straight to the catch block. In the variables, the error says:
'uuidV5 is not a function'
Running npm ls uuid gives:
├─┬ nodemon@1.3.3
│ └─┬ update-notifier@0.1.10
│   └─┬ configstore@0.3.2
│     └── uuid@2.0.3
├─┬ request@2.88.2
│ └── uuid@3.3.2 deduped
├─┬ sequelize@6.14.1
│ └── uuid@8.3.2
└── uuid@3.3.2
What am I doing wrong?
The solution below worked for me!
If you're using package.json, add the following to it:
{
  "type": "module"
  ...
}
Now you can use import with Node.js:
// index.js
import { v5 as uuidv5 } from "uuid";
const MY_NAMESPACE = "1b671a64-40d5-491e-99b0-da01ff1f3341";
uuidv5("Hello World", MY_NAMESPACE); // ⇨ 'a572fa0f-9bfa-5103-9882-16394770ad11'
Check your output using
node index.js
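If you would rather stay with CommonJS, the same named export also works with require, provided the installed uuid version actually exposes v5 from its main entry (7.x and later do; the top-level uuid@3.3.2 in the npm ls output above does not, which is why the destructured v5 comes back undefined). A minimal sketch of that variant:

// CommonJS variant (assumes uuid >= 7 is installed at the top level)
const { v5: uuidv5 } = require('uuid');

const MY_NAMESPACE = '1b671a64-40d5-491e-99b0-da01ff1f3341';
console.log(uuidv5('Hello World', MY_NAMESPACE)); // ⇨ 'a572fa0f-9bfa-5103-9882-16394770ad11'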
I'm building a REST API with versioning support. Here is my directory structure.
.
├── src
│   ├── api
│   │   ├── v1
│   │   │   ├── modules ─ ...
│   │   │   └── routers
│   │   │       ├── auth.router.js
│   │   │       └── posts.router.js
│   │   └── v2
│   │       ├── modules ─ ...
│   │       └── routers ─ ...
└── app.js
I want the router files imported to app.js. I've looked for the solution for hours but all I found is how to import each file manually through app.use(). This is doable but as the version numbers and router files keep increasing, this can lead to redundant work. I need a way to import these files with the least manual lines of code possible.
It is not possible to do this directly with Express; generally people manage modules manually in Node.js, as it doesn't take a lot of work at all. In terms of version numbers, you could specify a version setting or constant somewhere and import depending on that number.
For instance:
// routes.js
const apiVersion = "v2";

module.exports = {
  auth: require(`./${apiVersion}/auth.route`),
};
If this is not ideal, one hacky way to manage this is to grab all of the route files with the fs module and import them automatically. It is quite a hacky approach, but I came up with something like this:
// router.js
const fs = require("fs");
const path = require("path");
const { Router } = require("express");

const router = Router();
const apiVersion = "v2";

// grab all the route files for the current API version from the routers directory
// (adjust the path to where your versioned routers live)
const routersDir = path.join(__dirname, "api", apiVersion, "routers");

fs.readdirSync(routersDir)
  .filter((file) => file.endsWith(".router.js"))
  // use require to load each one from the source files and mount it
  .forEach((file) => {
    router.use(require(path.join(routersDir, file)));
  });

module.exports = router;
// app.js
const router = require("./path/to/router");
// ...boilerplate
app.use(router);
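Note that each *.router.js file needs to export an Express Router (or another middleware function) so that router.use(route) can mount it.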
Here is the structure of the src directory of my project:
.
├── config.ts
├── protos
│   ├── index.proto
│   ├── index.ts
│   ├── share
│   │   ├── topic.proto
│   │   ├── topic_pb.d.ts
│   │   ├── user.proto
│   │   └── user_pb.d.ts
│   ├── topic
│   │   ├── service.proto
│   │   ├── service_grpc_pb.d.ts
│   │   ├── service_pb.d.ts
│   │   ├── topic.integration.test.ts
│   │   ├── topic.proto
│   │   ├── topicServiceImpl.ts
│   │   ├── topicServiceImplDynamic.ts
│   │   └── topic_pb.d.ts
│   └── user
│       ├── service.proto
│       ├── service_grpc_pb.d.ts
│       ├── service_pb.d.ts
│       ├── user.proto
│       ├── userServiceImpl.ts
│       └── user_pb.d.ts
└── server.ts
share/user.proto:
syntax = "proto3";
package share;
message UserBase {
  string loginname = 1;
  string avatar_url = 2;
}
topic/topic.proto:
syntax = "proto3";
package topic;
import "share/user.proto";
enum Tab {
  share = 0;
  ask = 1;
  good = 2;
  job = 3;
}
message Topic {
  string id = 1;
  string author_id = 2;
  Tab tab = 3;
  string title = 4;
  string content = 5;
  share.UserBase author = 6;
  bool good = 7;
  bool top = 8;
  int32 reply_count = 9;
  int32 visit_count = 10;
  string create_at = 11;
  string last_reply_at = 12;
}
As you can see, I try to import the share package and use the UserBase message type inside the Topic message type. When I try to start the server, I get this error:
no such Type or Enum 'share.UserBase' in Type .topic.Topic
But when I change the package import path to a relative path, import "../share/user.proto";, it works fine and the server logs: Server is listening on http://localhost:3000.
The above uses dynamic codegen.
Now I switch to static codegen; here is the shell script for generating the code:
protoc \
  --plugin=protoc-gen-ts=./node_modules/.bin/protoc-gen-ts \
  --ts_out=./src/protos \
  -I ./src/protos \
  ./src/protos/**/*.proto
It seems the protocol buffer compiler doesn't support relative paths; I get this error:
../share/user.proto: Backslashes, consecutive slashes, ".", or ".." are not allowed in the virtual path
So I changed the package import path back to import "share/user.proto";. It generates the code correctly, but when I try to start my server, I get the same error:
no such Type or Enum 'share.UserBase' in Type .topic.Topic
It's weird.
Package versions:
"grpc-tools": "^1.6.6",
"grpc_tools_node_protoc_ts": "^4.1.3",
protoc --version
libprotoc 3.10.0
UPDATE:
repo: https://github.com/mrdulin/nodejs-grpc/tree/master/src
Your dynamic codegen is failing because you are not specifying the paths to search for imported .proto files. You can do this using the includeDirs option when calling protoLoader.loadSync, which works in a very similar way to the -I option you pass to protoc. In this case, you are loading the proto files from the src/protos directory, so it should be sufficient to pass the option includeDirs: [__dirname]. Then the import paths in your .proto files should be relative to that directory, just like when you use protoc.
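For example, a minimal sketch of what that load call could look like, assuming it runs from a file inside src/protos (such as index.ts); the service file name is taken from the directory tree above, and the options other than includeDirs are just common @grpc/proto-loader defaults:

// sketch: load topic/service.proto and let the loader resolve `import "share/user.proto";`
const path = require('path');
const grpc = require('grpc'); // or @grpc/grpc-js, whichever the project uses
const protoLoader = require('@grpc/proto-loader');

const packageDefinition = protoLoader.loadSync(
  path.join(__dirname, 'topic', 'service.proto'),
  {
    includeDirs: [__dirname], // search src/protos for imported .proto files
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
  }
);

// the resulting descriptor contains the topic and share packages
const protoDescriptor = grpc.loadPackageDefinition(packageDefinition);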
You are probably seeing the same error when you try to use the static code generation because it is actually the dynamic codegen error; you don't appear to be removing the dynamic codegen code when trying to use the statically generated code.
However, the main problem you will face with the statically generated code is that you are only generating the TypeScript type definition files. You also need to generate JavaScript files to actually run it. The official Node gRPC plugin for protoc is distributed in the grpc-tools package. It comes with a binary called grpc_tools_node_protoc, which should be used in place of protoc and automatically includes the plugin. You will still need to pass a --js_out flag to generate that code.
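A rough sketch of what that command could look like, mirroring the output directories from the protoc script above (the exact flag syntax, in particular --grpc_out, may need adjusting for your grpc-tools version):

./node_modules/.bin/grpc_tools_node_protoc \
  --plugin=protoc-gen-ts=./node_modules/.bin/protoc-gen-ts \
  --js_out=import_style=commonjs,binary:./src/protos \
  --grpc_out=./src/protos \
  --ts_out=./src/protos \
  -I ./src/protos \
  ./src/protos/**/*.proto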
Situation
I have a monorepo created with lerna with 40-50 projects. Each has a package.json like this.
{
  "name": "@base-repo/add-class-methods",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT"
}
The folder structure is like this:
packages
├── absolute-url
│   ├── index.js
│   └── package.json
├── add-class-methods
│   ├── index.js
│   └── package.json
├── check-set-relative
│   ├── index.js
│   └── package.json
├── crypto
│   ├── index.js
│   └── package.json
If I push it to GitHub, it will have a single GitHub URL. However, I saw that Babel has 142 packages, each with a custom repository field in its package.json:
"repository": "https://github.com/babel/babel/tree/master/packages/babel-types"
I hope they are not setting this value manually for 142 packages. Same with my 40 small packages.
I understand I could set them manually in 3-4 minutes as I write this question. However, this gets overwhelming with a 150-package monorepo, or whenever the package count grows in the future.
Problem
How can I set/update the repository field without opening each package.json file manually for 40 packages?
What I tried
I started setting each one manually, but that quickly got boring and repetitive, considering I am a programmer. Then I googled for a solution for around an hour. Finally, I wrote the following script:
const glob = require('glob');
const fs = require('fs');
const path = require('path');

const gitUrl = 'https://github.com/user';
const author = `Mr. Github User <user@example.com> (${gitUrl})`;
const basePath = '/utility-scripts/tree/master';
const baseRepo = gitUrl + basePath;

glob('packages/*/package.json', (err, files) => {
  if (err) throw err;
  for (const filePath of files) {
    const [parent, pkg] = filePath.split('/');
    const newData = {
      author,
      license: 'MIT',
      repository: `${baseRepo}/${parent}/${pkg}`,
    };
    const data = Object.assign(
      {},
      JSON.parse(fs.readFileSync(path.resolve(filePath), 'utf-8')),
      newData,
    );
    fs.writeFileSync(path.resolve(filePath), JSON.stringify(data, null, 2));
  }
});
Is there an easy way to deal with this? With any kind of shell, git, yarn or npm command?
Some context here: It's not that I cannot use Webpack, it's that I do not want to use Webpack. I would like to keep everything as "vanilla" as possible.
Currently, when creating modules in a project, you have to require them using either a relative or absolute path. For example, in the following directory:
project/
├── index.js
├── lib/
│   ├── network/
│   │   ├── request.js
│   │   └── response.js
├── pages/
│   ├── foo.js
Considering we're in index.js we would import request via
var networkRequest = require('./lib/network/request.js')
and if we're in foo.js we would import request via
var networkRequest = require('../lib/network/request.js')
What I'm wondering is if there's any way to set a local alias in package.json, or anywhere else, like so:
localPackages = [
  { name: 'network-request', path: './lib/network/request.js' }
];
In which you could just do
var networkRequest = require('network-request')
From any file and it will provide the correct path.
Yep, that's what npm link is for. Native and out of the box.
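For example, a rough sketch of how that could look here, assuming lib/network gets its own package.json whose name field is network-request (that name is hypothetical):

cd lib/network
npm link                  # registers this package globally via a symlink
cd ../..
npm link network-request  # symlinks it into this project's node_modules

After that, require('network-request') resolves from any file in the project.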
You can also set local paths in package.json
{
  "name": "baz",
  "dependencies": {
    "bar": "file:../foo/bar"
  }
}
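After editing package.json this way, run npm install; npm 5 and later install file: dependencies as symlinks, so require('bar') then works from any file in the project without a relative path.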
I am trying to clear the Cloudflare cache for single URLs programmatically after PUT requests to a Node.js API. I am using the https://github.com/cloudflare/node-cloudflare library, but I can't figure out how to log a callback from Cloudflare. According to the test file in the same repo, the syntax should be something like this:
// client declaration:
t.context.cf = new CF({
  key: 'deadbeef',
  email: 'cloudflare@example.com',
  h2: false
});

// invoke clearCache:
t.context.cf.deleteCache('1', {
  files: [
    'https://example.com/purge_url'
  ]
})
How can I read out the callback from this request?
I have tried the following in my own code:
client.deleteCache(process.env.CLOUDFLARE_ZONE, { "files": [url] }, function (data) {
  console.log(`Cloudflare cache purged for: ${url}`);
  console.log(`Callback:${data}`);
})
and:
client.deleteCache('1', {
  files: [
    'https://example.com/purge_url'
  ]
}).then(function (a, b) {
  console.log('helllllllooooooooo');
})
to no avail. :(
Purging the Cloudflare cache by URL:
var Cloudflare = require('cloudflare');
const { CF_EMAIL, CF_KEY, CF_ZONE } = process.env;
if (!CF_ZONE || !CF_EMAIL || !CF_KEY) {
  throw new Error('you must provide env. variables: [CF_ZONE, CF_EMAIL, CF_KEY]');
}

const client = new Cloudflare({ email: CF_EMAIL, key: CF_KEY });
const targetUrl = `https://example.com/purge_url`;

client.zones.purgeCache(CF_ZONE, { "files": [targetUrl] }).then(function (data) {
  console.log(`Cloudflare cache purged for: ${targetUrl}`);
  console.log(`Callback:`, data);
}, function (error) {
  console.error(error);
});
You can look up the Cloudflare zone this way:
client.zones.browse().then(function (zones) {
  console.log(zones);
})
Don't forget to install the current client version:
npm i cloudflare@^2.4.1 --save-dev
I wrote a Node.js module to purge the cache for an entire website. It scans your "public" folder, builds the full URL for each file, and purges it on Cloudflare.
You can run it using npx:
npm install -g npx
npx purge-cloudflare-cache your@email.com your_cloudflare_key the_domain_zone https://your.website.com your/public/folder
But you can also install it and run it using npm:
npm install -g purge-cloudflare-cache
purge your@email.com your_cloudflare_key the_domain_zone https://your.website.com your/public/folder
For a public/folder tree like:
├── assets
│   ├── fonts
│   │   ├── roboto-regular.ttf
│   │   └── roboto.scss
│   ├── icon
│   │   └── favicon.ico
│   └── imgs
│       └── logo.png
├── build
│   ├── main.css
│   └── main.js
└── index.html
It will purge cache for files:
https://your.website.com/index.html
https://your.website.com/build/main.css
https://your.website.com/build/main.js
https://your.website.com/assets/imgs/logo.png
https://your.website.com/assets/icon/favicon.ico
https://your.website.com/assets/fonts/roboto.css
https://your.website.com/assets/fonts/roboto-regular.ttf
This is probably happening because my mocha tests don't wait for the callback to return.
https://github.com/mochajs/mocha/issues/362
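If that is the case, a minimal sketch of a fix (assuming a mocha test wrapped around the purge call from the answer above) is to return the promise so mocha waits for it to settle before finishing:

// hypothetical mocha test: returning the promise makes mocha wait for it
it('purges the cloudflare cache for a url', function () {
  return client.zones.purgeCache(process.env.CLOUDFLARE_ZONE, {
    files: ['https://example.com/purge_url']
  }).then(function (data) {
    console.log('Cloudflare purge response:', data);
  });
});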