Clearing Cloudflare cache programmatically - Node.js

I am trying to clear the Cloudflare cache for single URLs programmatically after PUT requests to a Node.js API. I am using the https://github.com/cloudflare/node-cloudflare library, but I can't figure out how to log a callback from Cloudflare. According to the test file in the same repo, the syntax should be something like this:
// client declaration:
t.context.cf = new CF({
  key: 'deadbeef',
  email: 'cloudflare@example.com',
  h2: false
});

// invoke clearCache:
t.context.cf.deleteCache('1', {
  files: [
    'https://example.com/purge_url'
  ]
});
How can I read out the callback from this request?
I have tried the following in my own code:
client.deleteCache(process.env.CLOUDFLARE_ZONE, { files: [url] }, function (data) {
  console.log(`Cloudflare cache purged for: ${url}`);
  console.log(`Callback: ${data}`);
});
and:
client.deleteCache('1', {
  files: [
    'https://example.com/purge_url'
  ]
}).then(function (a, b) {
  console.log('helllllllooooooooo');
});
to no avail. :(

Purging the Cloudflare cache by URL:
var Cloudflare = require('cloudflare');

const { CF_EMAIL, CF_KEY, CF_ZONE } = process.env;
if (!CF_ZONE || !CF_EMAIL || !CF_KEY) {
  throw new Error('you must provide env. variables: [CF_ZONE, CF_EMAIL, CF_KEY]');
}

const client = new Cloudflare({ email: CF_EMAIL, key: CF_KEY });
const targetUrl = `https://example.com/purge_url`;

client.zones.purgeCache(CF_ZONE, { files: [targetUrl] }).then(function (data) {
  console.log(`Cloudflare cache purged for: ${targetUrl}`);
  console.log(`Callback:`, data);
}, function (error) {
  console.error(error);
});
You can look up your Cloudflare zone this way:
client.zones.browse().then(function (zones) {
  console.log(zones);
});
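If you need the zone id for a particular domain, you can pick it out of the browse result; a sketch, assuming the response carries the Cloudflare API's usual result array (example.com is a placeholder):
client.zones.browse().then(function (res) {
  // each zone in the result array has an id and a domain name
  const zone = res.result.find(z => z.name === 'example.com');
  console.log(zone && zone.id);
});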
Don't forget to install the current client version:
npm i cloudflare@^2.4.1 --save-dev

I wrote a Node.js module to purge the cache for an entire website. It scans your "public" folder, builds the full URL and purges it on Cloudflare:
You can run it using npx:
npm install -g npx
npx purge-cloudflare-cache your@email.com your_cloudflare_key the_domain_zone https://your.website.com your/public/folder
But, you can install it and run using npm too:
npm install -g purge-cloudflare-cache
purge your@email.com your_cloudflare_key the_domain_zone https://your.website.com your/public/folder
For a public/folder tree like:
├── assets
│   ├── fonts
│   │   ├── roboto-regular.ttf
│   │   └── roboto.scss
│   ├── icon
│   │   └── favicon.ico
│   └── imgs
│       └── logo.png
├── build
│   ├── main.css
│   └── main.js
└── index.html
It will purge the cache for these files:
https://your.website.com/index.html
https://your.website.com/build/main.css
https://your.website.com/build/main.js
https://your.website.com/assets/imgs/logo.png
https://your.website.com/assets/icon/favicon.ico
https://your.website.com/assets/fonts/roboto.scss
https://your.website.com/assets/fonts/roboto-regular.ttf

This is probably happening because my mocha tests don't wait for the callback to return.
https://github.com/mochajs/mocha/issues/362
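If that's the case, returning the promise from the test is usually enough to make mocha wait for it to settle; a minimal sketch, reusing the purgeCache call from the answer above:
it('purges the Cloudflare cache', function () {
  // mocha waits for a returned promise before finishing the test
  return client.zones.purgeCache(process.env.CF_ZONE, { files: [targetUrl] })
    .then(function (data) {
      console.log('Callback:', data);
    });
});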

Generating uuid v5 in node.js errors out with no information?

I am using this npm package:
https://www.npmjs.com/package/uuid
I want to generate a v5 UUID.
I can generate a v4 no problem by requiring the module:
const { v4: uuidv4 } = require('uuid');
and then running:
console.log(`uuidv4: ${uuidv4()}`);
So then I try to generate a v5:
const { v5: uuidV5 } = require('uuid');
const MY_NAMESPACE = 'f709b20b-3353-4c32-8df9-66bc48e91ea9';
var v5uuid = uuidV5('hello', MY_NAMESPACE);
console.log(`userUUID: ${v5uuid}`);
However, the app gets to the line var v5uuid = uuidV5('hello', MY_NAMESPACE); and then goes straight to the catch block. The error variable says:
'uuidV5 is not a function'
running npm ls uuid:
├─┬ nodemon@1.3.3
│ └─┬ update-notifier@0.1.10
│   └─┬ configstore@0.3.2
│     └── uuid@2.0.3
├─┬ request@2.88.2
│ └── uuid@3.3.2 deduped
├─┬ sequelize@6.14.1
│ └── uuid@8.3.2
└── uuid@3.3.2
What am I doing wrong?
The solution below worked for me!
If you're using package.json, add the following to package.json:
{
  "type": "module"
  ...
}
Now you can use import with Node.js:
// index.js
import { v5 as uuidv5 } from "uuid";
const MY_NAMESPACE = "1b671a64-40d5-491e-99b0-da01ff1f3341";
uuidv5("Hello World", MY_NAMESPACE); // ⇨ 'a572fa0f-9bfa-5103-9882-16394770ad11'
Check your output using
node index.js
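Worth noting: the npm ls output above shows the top-level uuid at 3.3.2, and the named v5 export only exists in uuid 7 and later, so upgrading the package should also make the original CommonJS form work (a sketch, assuming uuid >= 7 is installed):
// named CommonJS exports such as v5 were introduced in uuid 7
const { v5: uuidV5 } = require('uuid');

const MY_NAMESPACE = 'f709b20b-3353-4c32-8df9-66bc48e91ea9';
console.log(`userUUID: ${uuidV5('hello', MY_NAMESPACE)}`);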

How to import all express router files from multiple directories in nodejs?

I'm building a REST API with versioning support. Here is my directory structure.
.
├── src
│   └── api
│       ├── v1
│       │   ├── modules ─ ...
│       │   └── routers
│       │       ├── auth.router.js
│       │       └── posts.router.js
│       └── v2
│           ├── modules ─ ...
│           └── routers ─ ...
└── app.js
I want the router files imported into app.js. I've looked for a solution for hours, but all I found is how to import each file manually through app.use(). This is doable, but as the version numbers and router files keep increasing, it leads to redundant work. I need a way to import these files with as few manual lines of code as possible.
It is not possible to do this directly with Express; generally people manage modules manually with NodeJS, as it doesn't take a lot of work at all. In terms of version numbers, you could specify a version setting or constant somewhere and import depending on that number.
For instance:
// routes.js
const apiVersion = "v2";

module.exports = {
  auth: require(`./${apiVersion}/auth.route`),
};
If this is not ideal, a hackier way to manage this is to grab all of the route files with the fs module and import them automatically. I came up with something like this:
// router.js (assumes this file sits next to the version folders)
const fs = require("fs");
const path = require("path");
const { Router } = require("express");

const router = Router();
const apiVersion = "v2";
const routesDir = path.join(__dirname, apiVersion, "routers");

// grab all the route files from the directory using fs,
// then require and mount each one on the router
fs.readdirSync(routesDir)
  .filter((file) => file.endsWith(".router.js"))
  .forEach((file) => {
    router.use(require(path.join(routesDir, file)));
  });

module.exports = router;
// app.js
const router = require("./path/to/router");
// ...boilerplate
app.use(router);
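To pick up every version directory at once, the same idea extends to a loop over src/api; a sketch under the question's directory layout (the /api URL prefix is an assumption):
// app.js: mount each API version under its own URL prefix
const fs = require("fs");
const path = require("path");
const express = require("express");

const app = express();
const apiDir = path.join(__dirname, "src", "api");

fs.readdirSync(apiDir).forEach((version) => { // e.g. "v1", "v2"
  const routersDir = path.join(apiDir, version, "routers");
  fs.readdirSync(routersDir).forEach((file) => {
    // each router file is expected to export an express Router
    app.use(`/api/${version}`, require(path.join(routersDir, file)));
  });
});

app.listen(3000);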

Configure the repository field in package.json in a monorepo

Situation
I have a monorepo created with lerna with 40-50 projects. Each has a package.json like this.
{
  "name": "@base-repo/add-class-methods",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT"
}
The folder structure is like this:
packages
├── absolute-url
│   ├── index.js
│   └── package.json
├── add-class-methods
│   ├── index.js
│   └── package.json
├── check-set-relative
│   ├── index.js
│   └── package.json
├── crypto
│   ├── index.js
│   └── package.json
If I push it to GitHub, it will have a single GitHub URL; however, I saw that Babel has 142 packages, each with a custom repository field in its package.json.
"repository": "https://github.com/babel/babel/tree/master/packages/babel-types"
I hope they are not setting this value manually for 142 packages. Same with my 40 small packages.
I understand I could set them manually in the 3-4 minutes it took to write this question. However, this will get overwhelming when I try to do the same with a 150-package monorepo in the future.
Problem
How can I set/update the repository field without opening each package.json file manually for 40 packages?
What I tried
I set each one manually for as long as I could, but things quickly got boring and repetitive, considering I am a programmer. Then I googled for a solution for around an hour. Finally I wrote the following script:
const glob = require('glob');
const fs = require('fs');
const path = require('path');

const gitUrl = 'https://github.com/user';
const author = `Mr. Github User <user@example.com> (${gitUrl})`;
const basePath = '/utility-scripts/tree/master';
const baseRepo = gitUrl + basePath;

glob('packages/*/package.json', (err, files) => {
  for (const filePath of files) {
    const [parent, pkg] = filePath.split('/');
    const newData = {
      author,
      license: 'MIT',
      repository: `${baseRepo}/${parent}/${pkg}`,
    };
    const data = Object.assign(
      {},
      JSON.parse(fs.readFileSync(path.resolve(filePath), 'utf-8')),
      newData,
    );
    // note: JSON.stringify's second argument is a replacer, so pass null, not true
    fs.writeFileSync(path.resolve(filePath), JSON.stringify(data, null, 2));
  }
});
Is there an easy way to deal with this? With any kind of shell, git, yarn or npm command?
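For what it's worth, npm also understands an object form of the repository field with a directory key, which is how many monorepos publish their packages; a script like the one above could emit that shape instead (the URL and directory below are placeholders):
{
  "repository": {
    "type": "git",
    "url": "https://github.com/user/utility-scripts.git",
    "directory": "packages/add-class-methods"
  }
}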

Warning: error TS18002: The 'files' list in config file is empty

I'm using TypeScript 2.1.5.0. I've configured the grunt-typescript-using-tsconfig plugin as shown below, but I get the error in the subject line when I execute the task.
The problem is the tsconfig.json property "files": []. I didn't encounter this error when using gulp-typescript. Do you recommend that I configure something differently, either my Gruntfile.js for this plugin or tsconfig.json? Or can you recommend a different grunt plugin that will successfully hook into tsconfig.json and process the TypeScript task as expected?
typescriptUsingTsConfig: {
  basic: {
    options: {
      rootDir: "./tsScripts"
    }
  }
}
Or can you recommend a different grunt plugin that will successfully hook into tsconfig.json and process the typescript task as expected?
gulp-typescript supports tsconfig: https://github.com/ivogabe/gulp-typescript/#using-tsconfigjson
var tsProject = ts.createProject('tsconfig.json');

gulp.task('scripts', function () {
  var tsResult = gulp.src(tsProject.src())
    .pipe(tsProject());

  return tsResult.js.pipe(gulp.dest('release'));
});
Try setting your Gruntfile.js configuration as shown in the following gist:
Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    typescriptUsingTsConfig: {
      basic: {
        options: {
          rootDir: './'
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-typescript-using-tsconfig');

  grunt.registerTask('default', [
    'typescriptUsingTsConfig'
  ]);
};
Note the value for rootDir is set to ./ (i.e. the same folder as the Gruntfile.js).
tsconfig.json
Then ensure you have your tsconfig.json configured to include a list of all .ts files to be compiled to .js. For example:
{
  "compilerOptions": {
    "outDir": "./dist"
  },
  "files": [
    "./tsScripts/a.ts",
    "./tsScripts/b.ts",
    "./tsScripts/c.ts"
  ]
}
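If maintaining an explicit file list gets tedious, tsconfig.json also accepts an include array of glob patterns (available since TypeScript 2.0; whether grunt-typescript-using-tsconfig honors it is an assumption worth verifying):
{
  "compilerOptions": {
    "outDir": "./dist"
  },
  "include": [
    "./tsScripts/**/*.ts"
  ]
}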
There are of course other compiler options you can set in tsconfig.json.
Directory structure
The configurations above assumes a directory structured as follows, however you can just adapt the code examples as required:
foo
├── Gruntfile.js
├── tsconfig.json
├── tsScripts
│   ├── a.ts
│   ├── b.ts
│   └── c.ts
└── node_modules
    └── ...
Running grunt
cd to the project folder (in these examples, the one named foo) and run:
$ grunt
Output
Running grunt will create a folder named dist and output all .js files to it. For example:
foo
├── dist
│   ├── a.js
│   ├── b.js
│   └── c.js
└── ...
If you want the resultant .js files to be output to the same folder as the source .ts files (i.e. not to the dist folder), just exclude the "outDir": "./dist" part from your tsconfig.json.

Aliasing modules using NodeJS

Some context here: it's not that I cannot use Webpack; it's that I do not want to use Webpack. I would like to keep everything as "vanilla" as possible.
Currently, when creating modules in a project, you have to require them using either a relative or absolute path. For example, in the following directory:
project/
├── index.js
├── lib/
│   └── network/
│       ├── request.js
│       └── response.js
└── pages/
    └── foo.js
Considering we're in index.js, we would import request via
var networkRequest = require('./lib/network/request.js')
and if we're in foo.js, we would import request via
var networkRequest = require('../lib/network/request.js')
What I'm wondering is if there's any way to, perhaps, set a local alias in package.json or anywhere else, like so:
localPackages = [
  { name: 'network-request', path: './lib/network/request.js' }
];
so that you could just do
var networkRequest = require('network-request')
from any file, and it will resolve to the correct path.
Yep, that's what npm link is for. Native and out of the box.
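A minimal sketch of that flow, assuming lib/network is given its own package.json whose name field is network-request:
# inside the package you want to alias
cd project/lib/network
npm link

# back in the consuming project root
cd ../..
npm link network-request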
You can also set local paths in package.json
{
  "name": "baz",
  "dependencies": {
    "bar": "file:../foo/bar"
  }
}
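Applied to the question's layout, the root package.json could look something like this (again assuming lib/network is made its own small package named network-request):
{
  "name": "project",
  "dependencies": {
    "network-request": "file:./lib/network"
  }
}
After npm install, var networkRequest = require('network-request') resolves from any file in the project.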
