How to modify HTTPS traffic in Google Chrome using Node.js?

As the question says, of course I'm not doing this for anything illegal.
For example, I have a link: https://example.com/inj.php
The response I get is, for example:
<h1>Hello world</h1>
How can I change the response to the following, using only Node.js code?
<h1>Hello world</h1>
<h2>inject</h2>

I think you need to create a proxy, and the device needs to install and trust your self-signed CA.
I wrote a library for personal use, and it works pretty well:
npm i pms-proxy
For the example in your question, it can be written as:
import * as path from 'path';
import * as child_process from 'child_process';
import { PPCa, PPCaFileOptions, PPServerProxy, PPPassThroughHttpHandler } from 'pms-proxy';

// Generate a self-signed CA and take its SPKI fingerprint so Chrome will accept it.
const https = await PPCa.generateCACertificate();
const spki = PPCa.generateSPKIFingerprint((<PPCaFileOptions>https).cert);
const userData = path.join('C:/test-chrome');

const server = new PPServerProxy({ https });

// Pass the response through and append the injected HTML.
const pass = new PPPassThroughHttpHandler();
pass.injectBuffer((req, buffer) => {
    return {
        data: buffer.toString() + "<h2>inject</h2>"
    };
});

server.addRule().url('https://example.com/inj.php').then(pass);
await server.listen(1234);

// Launch Chrome through the proxy (child_process is a Node core module).
child_process.exec(
    `start chrome --proxy-server="http://127.0.0.1:1234" --ignore-certificate-errors-spki-list="${spki}" --user-data-dir="${userData}"`
);
If you don't want to use an SPKI fingerprint, you can create a self-signed CA and trust it on the device instead; follow the README in the package:
https://www.npmjs.com/package/pms-proxy
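If you go that route, a minimal sketch (not from the package docs) of writing the generated CA certificate to a file you can then import into the OS/Chrome trust store; it assumes the generated object exposes the same cert field used above:
const fs = require('fs');

// Sketch: export the generated CA certificate (the PEM string referenced above)
// so it can be imported into the system / Chrome certificate store. The file name is arbitrary.
fs.writeFileSync('pms-proxy-ca.pem', https.cert);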

Related

How does one secure API keys in SvelteKit 1.0?

I am using Ghost. I made an integration and I would like to hide the API key from the front end. I do not believe I can set restrictions on the Ghost CMS (that would also work). And I believe +page.js files also run in the browser, so I'm a little confused about how to achieve this.
The internal SvelteKit module $env/static/private (docs) is how you use secure API keys. SvelteKit will not allow you to import this module into client code, so it provides an extra layer of safety. Vite automatically loads your environment variables from .env files and process.env at build time and injects your key into your server-side bundle.
import { API_KEY } from '$env/static/private';
// Use your secret
SvelteKit has four modules for accessing environment variables:
$env/static/private (covered above)
$env/static/public: accessible by both server and client, injected at build time (docs)
$env/dynamic/private: provided by your runtime adapter; only includes variables that do not start with your public prefix (defaults to PUBLIC_) and can only be imported by server files (docs); see the sketch below
$env/dynamic/public: provided by your runtime adapter; only includes variables that do start with your public prefix (defaults to PUBLIC_) (docs)
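For instance, a minimal sketch of reading a runtime-only secret in a server endpoint with the dynamic module; MY_SECRET and the route path are assumed names, not part of the original answer:
// src/routes/health/+server.js (hypothetical route); MY_SECRET is an assumed env variable
import { env } from '$env/dynamic/private';

export function GET() {
    // env is populated at runtime by the adapter, not baked in at build time
    return new Response(env.MY_SECRET ? 'secret is set' : 'secret is missing');
}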
You don't need to hide the key.
Ghost Content API Docs:
These keys are safe for use in browsers and other insecure environments, as they only ever provide access to public data.
One common way to hide your third-party API key(s) from public view is to set up proxy API routes.
The general idea is to have your client (browser) query a proxy API route that you provide/host, have that proxy route query the third-party API using your credentials (API key), and pass on the results from the third-party API back to the client.
Because the query to the third-party API takes place exclusively on the back-end, your credentials are never exposed to the client (browser) and thus not visible to the public.
In your use case, you would have to create 3 dynamic endpoint routes to replicate the structure of Ghost's API:
src/routes/api/[resource]/+server.js to match /posts/, /authors/, /tags/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
    const { resource } = params;
    const queryString = url.searchParams.toString();
    return fetch(`${GHOST_URL}/${resource}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
        headers: {
            'Accept-Version': '5.0' // Ghost API Version setting
        }
    });
}
src/routes/api/[resource]/[id]/+server.js to match /posts/{id}/, /authors/{id}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
    const { resource, id } = params;
    const queryString = url.searchParams.toString();
    return fetch(`${GHOST_URL}/${resource}/${id}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
        headers: {
            'Accept-Version': '5.0' // Ghost API Version setting
        }
    });
}
src/routes/api/[resource]/slug/[slug]/+server.js to match /posts/slug/{slug}/, /authors/slug/{slug}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
    const { resource, slug } = params;
    const queryString = url.searchParams.toString();
    return fetch(`${GHOST_URL}/${resource}/slug/${slug}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
        headers: {
            'Accept-Version': '5.0' // Ghost API Version setting
        }
    });
}
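Since the three handlers differ only in the path they build, you could also factor the shared fetch into a small helper (a sketch only; the file path and function name are assumptions):
// src/lib/server/ghost.js (hypothetical helper module)
import { API_KEY } from '$env/static/private'; // assumes the key lives in .env as API_KEY
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function ghostGet(path, url) {
    const queryString = url.searchParams.toString();
    return fetch(`${GHOST_URL}/${path}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
        headers: { 'Accept-Version': '5.0' }
    });
}
Each +server.js then reduces to something like export const GET = ({ params, url }) => ghostGet(params.resource, url); (the id and slug routes would pass `${params.resource}/${params.id}` and `${params.resource}/slug/${params.slug}` respectively).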
Then all you have to do is call your proxy routes in place of your original third-party API routes in your app:
// very barebones example
<script>
    let uri;
    let data;

    async function get() {
        const res = await fetch(`/api/${uri}`);
        data = await res.json();
    }
</script>

<input name="uri" bind:value={uri} />
<button on:click={get}>GET</button>

{data}
Note that using proxy API routes will also have the additional benefit of sidestepping potential CORS issues.

What is the correct way to create a cryptographically secure ID in BOTH node.js and javascript without libraries?

I'm trying to write an ES6 module that detects which environment it is running in and creates a unique hex ID, whether that is Node or the browser. I do not want to use browser libraries or a bundler.
In node.js:
const crypto = require('crypto')
// synchronous form; the async callback's return value cannot be assigned to id
let id = crypto.randomBytes(48).toString('hex')
console.log(id)
In both browser and node without a bundler:
???? I'm unsure, from my research, how to accomplish the same thing.
Node 15+ and modern browsers support the Web Crypto API
export async function setupHexRandom() {
    // Use the global Web Crypto API if present (browsers, recent Node),
    // otherwise fall back to Node's webcrypto implementation.
    const mycrypto = (typeof crypto !== 'undefined')
        ? crypto
        : (await import('node:crypto')).webcrypto
    if (!mycrypto) throw new Error('No crypto available or fallback to a Math.random implementation')

    return function hexRandom() {
        // 48 random bytes, hex-encoded
        const bytes = mycrypto.getRandomValues(new Uint8Array(48))
        return Array.from(bytes)
            .map(b => b.toString(16).padStart(2, '0'))
            .join('')
    }
}

const hexRandom = await setupHexRandom()
console.log(hexRandom())

How to make Nodejs connect to Metamask account?

I have a backend written with Node.js that can connect to contracts and perform functions in those contracts, but the problem is that I want MetaMask to pass only the account address to the backend. Is there any solution for this?
There are some third-party packages like node-metamask that you can use.
Add this code snippet to your views (hbs template engine) / HTML file.
When the page is loaded and connect() is called, the script will get the MetaMask wallet address if MetaMask is installed.
<script>
    async function connect() {
        if (window.ethereum) {
            // eth_requestAccounts resolves with the array of authorized accounts
            const accounts = await window.ethereum.request({ method: "eth_requestAccounts" });
            const walletAddress = accounts[0];
            // optional: keep a web3 instance around for further interaction
            window.web3 = new Web3(window.ethereum);
            console.log(`Wallet: ${walletAddress}`);
            // pass the address to the backend via a query parameter
            window.location.href = `/setup?wallet=${walletAddress}`;
        } else {
            alert("MetaMask is not installed");
        }
    }
</script>
After that, it will send the wallet address to a specific backend route in Express.
From there, you can perform actions in Node.js with web3.
app.get('/setup', (req, res) => {
    const address = req.query.wallet
    app.locals.address = address
    console.log(app.locals.address)
    res.send(`Connected wallet: ${address}`) // respond so the request does not hang
})
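As for "performing actions with web3" on the backend, here is a minimal, hedged sketch of reading the connected wallet's balance server-side; the RPC endpoint URL and the /balance route are assumptions, not part of the original answer:
// Hypothetical sketch: query the connected wallet's balance with web3.js.
// '<your-rpc-endpoint>' is a placeholder for an RPC URL you provide (e.g. Infura/Alchemy).
const Web3 = require('web3');
const web3 = new Web3('https://<your-rpc-endpoint>');

app.get('/balance', async (req, res) => {
    const address = app.locals.address;
    if (!address) return res.status(400).send('No wallet connected yet');
    const wei = await web3.eth.getBalance(address);
    res.send(`Balance: ${web3.utils.fromWei(wei, 'ether')} ETH`);
});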
Here's a complete code example: https://github.com/billypentester/web3-dapp

Nodejs Fetch API: unable to verify the first certificate

I'm using the fetch API module in my Philips Hue project, and when I make a call to the local IP address (my hub) it produces the error in the title.
const fetch = require('node-fetch');

const gateway = "192.168.0.12";
const username = "username";

let getLights = function() {
    fetch(`https://${gateway}/api/${username}/lights`, {
        method: 'GET'
    }).then((res) => {
        return res.json();
    }).then((json) => {
        console.log(json);
    });
}

module.exports = { getLights };
Any SECURE fix? This will eventually go onto the public internet so I can access my lights from anywhere.
To skip TLS certificate verification entirely (not a secure fix), you can use this:
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = 0;
It seems like you tried to access it using HTTPS. Most likely on your local network it is going to be HTTP
So changing https://${gateway}/api/${username}/lights to http://${gateway}/api/${username}/lights should work.
If you're trying to keep it HTTPS, then you will have to install an SSL certificate authority onto your network.
These may be useful sources if you're trying to get that done:
https://www.freecodecamp.org/news/how-to-get-https-working-on-your-local-development-environment-in-5-minutes-7af615770eec/
https://letsencrypt.org/docs/certificates-for-localhost/
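Another option that avoids disabling TLS checks globally is to trust just the bridge's certificate (or your own CA) for these requests. A minimal sketch with node-fetch, assuming you have exported the certificate to a file named hue-bridge.pem (a hypothetical filename):
const fs = require('fs');
const https = require('https');
const fetch = require('node-fetch');

// Trust only this certificate for these requests instead of disabling verification globally.
// 'hue-bridge.pem' is a hypothetical filename for the exported certificate.
const agent = new https.Agent({
    ca: fs.readFileSync('hue-bridge.pem')
});

fetch(`https://${gateway}/api/${username}/lights`, { agent })
    .then((res) => res.json())
    .then((json) => console.log(json));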

How can I use the unpkg url in my nodejs project?

Is it possible to use an unpkg.com url in my nodejs project?
I have a project set up to use Node.js https to get the URL, and it works, but I am not sure how, or if it's possible, to use that in my Node.js project like we do with a regularly installed npm package.
const https = require('https');
const url = 'https://unpkg.com/@tensorflow-models/speech-commands@0.3.3/dist/speech-commands.min.js';
Then I need to use it here:
async function app() {
    https.get(url, (res) => {
        const statusCode = res.statusCode;
        if (statusCode != 200) console.error(`Error ${statusCode}: ${res.statusMessage} ${url}.`);
        else console.log('Success');
    });

    recognizer = speechCommands.create('BROWSER_FFT');
    await recognizer.ensureModelLoaded();
    predictWord();
};
The response comes back with success, but I ultimately need to use the package via that URL. I want to be able to substitute the speechCommands instance with the res from unpkg.com. Is this possible? The reason I am trying this is that when I use the npm package I get a fetch undefined error, and since this package is rather new, there isn't an issues section set up in their repo to ask.
