What is the correct way to create a cryptographically secure ID in BOTH node.js and javascript without libraries? - node.js

I'm trying to write an ES6 module that detects which environment it is running in and creates a unique hex ID, whether that is Node or the browser. I do not want to use browser libraries or a bundler.
In node.js:
const crypto = require('crypto')
// randomBytes returns a Buffer when called without a callback;
// the callback form returns undefined, so its result cannot be assigned
const id = crypto.randomBytes(48).toString('hex')
console.log(id)
In both browser and node without a bundler:
???? I'm unsure, based on my research, how to accomplish the same thing.

Node 15+ and modern browsers support the Web Crypto API
export async function setupHexRandom() {
  // Use the global Web Crypto API where it exists (browsers, recent Node);
  // otherwise fall back to Node's webcrypto implementation
  const mycrypto = (typeof crypto !== 'undefined')
    ? crypto
    : (await import('node:crypto')).webcrypto
  // A Math.random fallback would not be cryptographically secure, so fail instead
  if (!mycrypto) throw new Error('No crypto available')
  return function hexRandom() {
    const bytes = mycrypto.getRandomValues(new Uint8Array(48))
    return Array.from(bytes)
      .map(b => b.toString(16).padStart(2, '0'))
      .join('')
  }
}

const hexRandom = await setupHexRandom()
console.log(hexRandom())
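Each byte maps to two hex characters, so 48 random bytes yield a 96-character ID; change the Uint8Array size if you need a different length. Note that the top-level await at the end requires an ES module context (a .mjs file or "type": "module" in package.json).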

Related

How does one secure API keys in SvelteKit 1.0?

I am using Ghost. I made an integration and I would like to hide the API key from the front end. I do not believe I can set restrictions in the Ghost CMS (that would also work). I also believe +page.js files are run in the browser, so I'm a little confused about how to achieve this.
The internal SvelteKit module $env/static/private (docs) is how you use secret API keys. SvelteKit will not allow you to import this module into client code, so it provides an extra layer of safety. Vite automatically loads your environment variables from .env files and process.env at build time and injects your key into your server-side bundle.
import { API_KEY } from '$env/static/private';
// Use your secret
SvelteKit has 4 modules for accessing environment variables:
$env/static/private (covered above; see the sketch after this list)
$env/static/public: accessible by server and client, injected at build time (docs)
$env/dynamic/private: provided by your runtime adapter; only includes variables that do not start with your public prefix (which defaults to PUBLIC_) and can only be imported by server files (docs)
$env/dynamic/public: provided by your runtime adapter; only includes variables that do start with your public prefix (which defaults to PUBLIC_) (docs)
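For illustration, a minimal sketch of using a private static variable in a server-only endpoint; the route path, upstream URL, and API_KEY variable name here are assumptions, not from the question:
// src/routes/api/data/+server.js — server-only, so the private import is allowed
import { API_KEY } from '$env/static/private';
import { json } from '@sveltejs/kit';

export async function GET() {
  // The key is resolved at build time and never shipped to the client bundle
  const res = await fetch(`https://api.example.com/v1/items?key=${API_KEY}`);
  return json(await res.json());
}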
You don't need to hide the key.
Ghost Content API Docs:
These keys are safe for use in browsers and other insecure environments, as they only ever provide access to public data.
One common way to hide your third-party API key(s) from public view is to set up proxy API routes.
The general idea is to have your client (browser) query a proxy API route that you provide/host, have that proxy route query the third-party API using your credentials (API key), and pass on the results from the third-party API back to the client.
Because the query to the third-party API takes place exclusively on the back-end, your credentials are never exposed to the client (browser) and thus not visible to the public.
In your use case, you would have to create 3 dynamic endpoint routes to replicate the structure of Ghost's API:
src/routes/api/[resource]/+server.js to match /posts/, /authors/, /tags/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource } = params;
  const queryString = url.searchParams.toString();
  return fetch(`${GHOST_URL}/${resource}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
src/routes/api/[resource]/[id]/+server.js to match /posts/{id}/, /authors/{id}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, id } = params;
  const queryString = url.searchParams.toString();
  return fetch(`${GHOST_URL}/${resource}/${id}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
src/routes/api/[resource]/slug/[slug]/+server.js to match /posts/slug/{slug}/, /authors/slug/{slug}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, slug } = params;
  const queryString = url.searchParams.toString();
  return fetch(`${GHOST_URL}/${resource}/slug/${slug}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
Then all you have to do is call your proxy routes in place of your original third-party API routes in your app:
<!-- very barebones example -->
<script>
  let uri;
  let data;

  async function get() {
    const res = await fetch(`/api/${uri}`);
    data = await res.json();
  }
</script>

<input name="uri" bind:value={uri} />
<button on:click={get}>GET</button>

{data}
Note that using proxy API routes will also have the additional benefit of sidestepping potential CORS issues.

How to modify HTTPS traffic in Google Chrome using Node.js?

This is just a question; of course I'm not doing this for anything illegal.
For example, I have a link: https://example.com/inj.php
The result I get for example is:
<h1>Hello world</h1>
How can I change the response to the following using only Node.js code?
<h1>Hello world</h1>
<h2>inject</h2>
I think you need to create a proxy, and the device needs to install and trust your self-signed CA.
I wrote a library for personal use; it works pretty well:
npm i pms-proxy
For your question above, it can be written as:
// Imports assumed from the pms-proxy package, plus Node built-ins
import { PPCa, PPCaFileOptions, PPServerProxy, PPPassThroughHttpHandler } from 'pms-proxy';
import * as path from 'path';
import * as child_process from 'child_process';

// Generate a self-signed CA and its SPKI fingerprint so Chrome can be told to trust it
const https = await PPCa.generateCACertificate();
const spki = PPCa.generateSPKIFingerprint((<PPCaFileOptions>https).cert);
const userData = path.join('C:/test-chrome');

const server = new PPServerProxy({https});
const pass = new PPPassThroughHttpHandler();

// Append the injected markup to the original response body
pass.injectBuffer((req, buffer) => {
  return {
    data: buffer.toString() + "<h2>inject</h2>"
  };
});

server.addRule().url('https://example.com/inj.php').then(pass);
await server.listen(1234);

// Launch Chrome through the proxy, trusting the generated certificate via its SPKI fingerprint
child_process.exec(
  `start chrome --proxy-server="http://127.0.0.1:1234" --ignore-certificate-errors-spki-list="${spki}" --user-data-dir="${userData}"`
);
If you don't want to use the SPKI fingerprint, you can install and trust the self-signed CA on the device instead; follow the README in the package:
https://www.npmjs.com/package/pms-proxy

Shopify API Node/Express Cannot read properties of undefined (reading 'Rest')

Just starting off with Shopify, and trying to get an order. Following the Shopify API documentation, here is my code:
const Shopify = require('@shopify/shopify-api');

const client = new Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);

module.exports.getShopifyOrderById = async (orderId) => {
  return await client.get({
    path: `orders/${orderId}`,
  });
}
I get the following error when I execute this code:
TypeError: Cannot read properties of undefined (reading 'Rest')
Can't seem to figure out what the issue is.
You need to use object destructuring to get the Shopify object, or use the default export, as below.
const { Shopify } = require('@shopify/shopify-api');
const client = new Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);

OR

const Shopify = require('@shopify/shopify-api').default;
const client = new Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);

OR

const ShopifyLib = require('@shopify/shopify-api');
const client = new ShopifyLib.Shopify.Clients.Rest('my-store.myshopify.com',
  process.env.SHOPIFY_KEY);
This has to do with how ES6 modules are emulated in CommonJS and how you import the module. You can read about that here.
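As a quick sketch of what that interop looks like (the exact export shape depends on the package version, so treat this as illustrative):
// Requiring a CommonJS build of an ES module returns a wrapper object;
// the named export lives on a property rather than being the object itself.
const wrapper = require('@shopify/shopify-api');
console.log(wrapper.Clients); // undefined — this is why the original code failed
const { Shopify } = wrapper;  // equivalent to the destructuring form above
console.log(Shopify.Clients); // the actual Clients namespace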

confused about node-localstorage

So I'm making a site with Node.js, and I need to use localStorage, so I'm using the node-localstorage library. Basically, in one file I add data to it, and in another file I want to retrieve it. I'm not 100% sure how to retrieve it. I know I need to use localStorage.getItem, but do I need to include localStorage = new LocalStorage('./scratch');? I was wondering what localStorage = new LocalStorage('./scratch'); does. Here is my code for adding data:
const ls = require('node-localstorage');
const express = require("express");
const router = express.Router();

router.route("/").post((req, res, next) => {
  var localStorage = new ls.LocalStorage('./scratch');
  if (req.body.name != undefined) {
    localStorage.setItem("user", req.body.name);
    res.redirect('/');
  } else {
    console.log("undefined");
  }
});

module.exports = router;
If my question is confusing, I just want to know what var localStorage = new ls.LocalStorage('./scratch'); does.
A drop-in substitute for the browser native localStorage API that runs on node.js.
It creates an instance of the LocalStorage class, which this library provides. The constructor expects the location of the directory in which the script stores the key/value pairs.
Opinion: This looks pointless to me - I guess it fits your use case.
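To answer the retrieval part: a minimal sketch, assuming the second file points at the same ./scratch directory (both instances then share the same backing files):
// another file
const ls = require('node-localstorage');
const localStorage = new ls.LocalStorage('./scratch'); // same directory as the writer

const user = localStorage.getItem('user'); // returns null if the key was never set
console.log(user);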

How to extract data using async and await function in node js?

I tried to implement a Redis cache in Node.js using MongoDB. I set the data in the cache, but I can't get the data from the cache. How do I solve this issue?
cache.js
async function Get_Value() {
  let response = await client.get('products')
  console.log("_______________")
  console.log(response)
}
I got output: true
Expected output: JSON data
How do I get the JSON data using the cache get method?
Redis does not provide a full async/await adapter for Node.js, so as a workaround people usually promisify the library.
const { promisify } = require('util');
const getAsync = promisify(client.get).bind(client);

async function getValue() {
  let response = await getAsync("products");
}
Another approach is to promisify the entire redis library:
const bluebird = require('bluebird'); // this require was missing from the original snippet
const redis = require('redis');
bluebird.promisifyAll(redis);
Now you will also be able to use the methods with async/await.
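For illustration, a usage sketch under that setup: bluebird's promisifyAll adds Async-suffixed variants of each method, and this assumes the value was stored with JSON.stringify:
const bluebird = require('bluebird');
const redis = require('redis');
bluebird.promisifyAll(redis);

const client = redis.createClient();

async function getProducts() {
  // getAsync is the promisified variant of client.get
  const response = await client.getAsync('products');
  return JSON.parse(response); // Redis stores strings, so parse back to JSON
}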
