Nodejs Fetch API: unable to verify the first certificate - node.js

I'm using the fetch API module (node-fetch) in my Philips Hue project, and when I make a call to the local IP address (my hub) it produces the error in the title.
const fetch = require('node-fetch');

const gateway = "192.168.0.12";
const username = "username";

let getLights = function () {
  fetch(`https://${gateway}/api/${username}/lights`, {
    method: 'GET'
  }).then((res) => {
    return res.json();
  }).then((json) => {
    console.log(json);
  });
}

module.exports = { getLights };
Is there any SECURE fix? This will eventually go onto the public internet so I can access my lights from anywhere.

To skip the SSL certificate checks, you can use this (note that it disables TLS verification for the whole process, so it is a workaround rather than a secure fix):
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = 0;
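If you only want to relax verification for this one local call instead of process-wide, a minimal sketch (assuming node-fetch 2.x, which accepts an agent option, and reusing the gateway/username variables from the question) would be:

const https = require('https');
const fetch = require('node-fetch');

// Disables certificate verification for this request only, not globally.
// Still a workaround rather than a secure fix.
const insecureAgent = new https.Agent({ rejectUnauthorized: false });

fetch(`https://${gateway}/api/${username}/lights`, { agent: insecureAgent })
  .then((res) => res.json())
  .then((json) => console.log(json));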

It seems like you tried to access it using HTTPS. On your local network it is most likely served over plain HTTP,
so changing https://${gateway}/api/${username}/lights to http://${gateway}/api/${username}/lights should work.
If you're trying to keep it HTTPS, then you will have to install an SSL certificate authority onto your network and have Node trust it (see the sketch after the links below).
These may be useful sources if you're trying to get that done:
https://www.freecodecamp.org/news/how-to-get-https-working-on-your-local-development-environment-in-5-minutes-7af615770eec/
https://letsencrypt.org/docs/certificates-for-localhost/
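Once the bridge has a certificate issued by a CA you control (for example a locally generated CA as described in the links above), a minimal sketch of a call that stays on HTTPS without disabling verification could look like this; the file name hue-local-ca.pem is an assumption:

const fs = require('fs');
const https = require('https');
const fetch = require('node-fetch');

// Assumption: hue-local-ca.pem is the local CA certificate that signed the
// bridge's certificate. Passing it as `ca` makes Node trust that chain
// while keeping full certificate verification enabled.
const agent = new https.Agent({
  ca: fs.readFileSync('hue-local-ca.pem')
});

fetch(`https://${gateway}/api/${username}/lights`, { agent })
  .then((res) => res.json())
  .then((json) => console.log(json));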

Related

How does one secure api keys on sveltekit 1.0

I am using Ghost. I made an integration and I would like to hide the API key from the front-end. I do not believe I can set restrictions on the Ghost CMS (that would also work). And I believe +page.js files also run in the browser, so I'm a little confused about how to achieve this.
The internal SvelteKit module $env/static/private (docs) is how you use secret API keys. SvelteKit will not allow you to import this module into client code, so it provides an extra layer of safety. Vite automatically loads your environment variables from .env files and process.env at build time and injects your key into your server-side bundle.
import { API_KEY } from '$env/static/private';
// Use your secret
SvelteKit has 4 modules for accessing environment variables (a short server-side usage sketch follows this list):
$env/static/private (covered above)
$env/static/public: accessible by both server and client, injected at build time (docs)
$env/dynamic/private: provided by your runtime adapter; only includes variables that do not start with your public prefix (which defaults to PUBLIC_) and can only be imported by server files (docs)
$env/dynamic/public: provided by your runtime adapter; only includes variables that do start with your public prefix (which defaults to PUBLIC_) (docs)
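As a minimal sketch of server-only usage (assuming a key named GHOST_API_KEY in your .env and a hypothetical src/routes/posts/+page.server.js route; both names are illustrative, not from the question):

// src/routes/posts/+page.server.js (hypothetical route)
import { GHOST_API_KEY } from '$env/static/private'; // never bundled into client code

export async function load({ fetch }) {
  // The key only ever leaves the server in this server-side request
  const res = await fetch(
    `https://your-ghost-domain/ghost/api/content/posts/?key=${GHOST_API_KEY}`
  );
  return { posts: await res.json() };
}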
You don't need to hide the key.
Ghost Content API Docs:
These keys are safe for use in browsers and other insecure environments, as they only ever provide access to public data.
One common way to hide your third-party API key(s) from public view is to set up proxy API routes.
The general idea is to have your client (browser) query a proxy API route that you provide/host, have that proxy route query the third-party API using your credentials (API key), and pass on the results from the third-party API back to the client.
Because the query to the third-party API takes place exclusively on the back-end, your credentials are never exposed to the client (browser) and thus not visible to the public.
In your use case, you would have to create 3 dynamic endpoint routes to replicate the structure of Ghost's API:
src/routes/api/[resource]/+server.js to match /posts/, /authors/, /tags/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
src/routes/api/[resource]/[id]/+server.js to match /posts/{id}/, /authors/{id}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, id } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/${id}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
src/routes/api/[resource]/slug/[slug]/+server.js to match /posts/slug/{slug}/, /authors/slug/{slug}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, slug } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/slug/${slug}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
Then all you have to do is call your proxy routes in place of your original third-party API routes in your app:
<!-- very barebones example -->
<script>
  let uri;
  let data;

  async function get() {
    const res = await fetch(`/api/${uri}`);
    data = await res.json();
  }
</script>

<input name="uri" bind:value={uri} />
<button on:click={get}>GET</button>

{data}
Note that using proxy API routes will also have the additional benefit of sidestepping potential CORS issues.

How to make Nodejs connect to Metamask account?

I have a backend written with Node.js that can connect to contracts and call functions in those contracts, but the problem is I want MetaMask to pass only the account address to the backend. Is there any solution for this?
There are some third-party packages, like node-metamask, that you can use.
Alternatively, add this code snippet to your view (hbs template engine) / HTML file.
When you load the page, the script will execute and get the MetaMask wallet address if MetaMask is installed.
<script>
  // Assumes web3.js is already loaded on the page (via a bundler or a script tag)
  async function connect() {
    if (window.ethereum) {
      // MetaMask resolves this request with the accounts the user approved
      const accounts = await window.ethereum.request({ method: "eth_requestAccounts" });
      window.web3 = new Web3(window.ethereum);
      const walletAddress = accounts[0];
      console.log(`Wallet: ${walletAddress}`);
      window.location.href = `/setup?wallet=${walletAddress}`;
    } else {
      alert("MetaMask is not installed");
    }
  }
</script>
After that, it will send the wallet address to a specific backend route in Express.
From there you can perform actions in Node.js with the web3 integration, as sketched after the route handler below.
app.get('/setup', (req, res) => {
  const address = req.query.wallet;
  app.locals.address = address;
  console.log(app.locals.address);
  res.sendStatus(200); // respond so the browser request doesn't hang
});
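For example, a minimal hedged sketch of using the stored address with web3 on the backend; the RPC URL and the /balance route are placeholders, not from the original question:

const Web3 = require('web3'); // web3 1.x-style import (v4 uses: const { Web3 } = require('web3'))

// Placeholder RPC endpoint; swap in your own provider (Infura, Alchemy, a local node, ...)
const web3 = new Web3('https://rpc.example.org');

app.get('/balance', async (req, res) => {
  const address = app.locals.address; // wallet address stored by the /setup route
  const wei = await web3.eth.getBalance(address);
  res.json({ address, eth: web3.utils.fromWei(wei, 'ether') });
});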
here's complete code example: https://github.com/billypentester/web3-dapp

How to determine http vs https in nodejs / nextjs api handler

In order to properly build the URLs in my XML sitemaps and RSS feeds, I want to determine whether the page is currently served over http or https, so that it also works locally in development.
export default function handler(req, res) {
  const host = req.headers.host;
  const proto = req.connection.encrypted ? "https" : "http";

  // construct url for xml sitemaps
}
With the above code, however, even on Vercel it still shows as being served over http. I would expect it to run as https. Is there a better way to figure out http vs https?
Because Next.js API routes run behind a proxy that terminates TLS, the protocol seen by the handler is http.
By changing the code to the following I was able to first check at what protocol the proxy runs.
const proto = req.headers["x-forwarded-proto"];
However, this breaks in development, where you are not running behind a proxy, or with a different deployment that also doesn't involve a proxy. To support both use cases I eventually ended up with the following code.
// prefer the proxy header; otherwise fall back to the socket's encryption state
const proto =
  req.headers["x-forwarded-proto"] ||
  (req.connection.encrypted ? "https" : "http");
Whenever the x-forwarded-proto header is not present (undefined), we fall back to req.connection.encrypted to determine whether we are serving over http or https.
Now it works on localhost as well as on a Vercel deployment. A combined sketch follows.
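Putting it together, a minimal sketch of a handler that builds an absolute URL for the sitemap (the /sitemap.xml path is just an illustration) might look like:

export default function handler(req, res) {
  const host = req.headers.host;
  const proto =
    req.headers["x-forwarded-proto"] ||
    (req.connection.encrypted ? "https" : "http");

  // Example absolute URL; adjust the path to whatever your sitemap route is
  const sitemapUrl = `${proto}://${host}/sitemap.xml`;

  res.status(200).json({ sitemapUrl });
}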
My solution:
export const getServerSideProps: GetServerSideProps = async (context: any) => {
  // Derive the protocol from the referring request's URL
  const reqUrl = context.req.headers["referer"];
  const url = new URL(reqUrl);

  console.log(url.protocol); // "http:" or "https:"

  // const res = await fetch(`${url.origin}/api/projets`)
  // const data = await res.json()

  // Pass data to the page via props
  return { props: { protocol: url.protocol } };
}

How can two internal Cloud Run node.js microservices successfully communicate via gRPC?

I'm new to Google Cloud Run and trying to have two Node.js microservices communicate internally via gRPC.
The client interface:
constructor(address: string, credentials: grpc.ChannelCredentials, options?: object);
The client code:
const client: MyClient = new MyClient('my-service-abcdefgh3a-ew.a.run.app:443', grpc.credentials.createSsl());
The server code:
const server = new grpc.Server();
server.addService<IMyServer>(MyService, new MyServer());
server.bind(`0.0.0.0:${process.env.PORT}`, grpc.ServerCredentials.createInsecure());
server.start();
The server is set to listen on 443.
The above seems to work when the service is open to public requests but doesn't work when you have the server set as internal. Any ideas?
You have to add the credentials in the request metadata. Here's an example:
...
// Create a client for the protobuf spec
const client = new protoObj.Greeter(HOST, grpc.credentials.createInsecure());
// Build gRPC request
const metadata = new grpc.Metadata();
metadata.add('authorization', `Bearer ${JWT_AUTH_TOKEN}`);
// Execute gRPC request
client.sayHello({name: GREETEE}, metadata, (err, response) => {...
The second question is how to get the JWT_AUTH_TOKEN. The Cloud Run documentation covers how to do this; in short, get the token and use it in the request metadata:
...
request(tokenRequestOptions)
.then((token) => {
// add the token to the metadata
});
// Make the call
...
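As a concrete hedged sketch, when running on Cloud Run the identity token can be fetched from the metadata server; the audience should be the URL of the receiving service (the constants below are placeholders based on the question):

const fetch = require('node-fetch');

// The URL of the internal service you are calling (the token's audience)
const AUDIENCE = 'https://my-service-abcdefgh3a-ew.a.run.app';

async function getIdToken() {
  const res = await fetch(
    `http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=${AUDIENCE}`,
    { headers: { 'Metadata-Flavor': 'Google' } }
  );
  return res.text(); // a signed identity token (JWT) for that audience
}

// Then attach it to the gRPC call's metadata:
// const metadata = new grpc.Metadata();
// metadata.add('authorization', `Bearer ${await getIdToken()}`);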

How to View NodeJS SSL Negotiation Results

I'm creating a Node.js server using HTTPS, similar to this:
var https = require('https');

function listener(req, res) {
  // for example, I wish this worked...
  console.log(req.chosen_cipher)
}

var httpsd = https.createServer(SslOptions, listener);
httpsd.listen(8081, opts.ip);
How can I find the SSL negotiation results (in particular the selected cipher), for example ECDHE-RSA-AES128-GCM-SHA256?
I've serialized the req and res objects, but there don't seem to be any likely candidates.
Thanks!
The socket for the current connection is req.client (a tls.TLSSocket).
So, to get the cipher and protocol, call tlsSocket.getCipher() on it:
function listener(req, res) {
  console.log(req.client.getCipher());
  // Possible result:
  // { name: 'ECDHE-RSA-AES128-GCM-SHA256', version: 'TLSv1/SSLv3' }
}
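If you also want the negotiated protocol version on its own, the same socket exposes getProtocol(); a minimal sketch:

function listener(req, res) {
  const socket = req.socket; // the same TLSSocket; req.client is a legacy alias
  console.log(socket.getCipher());   // e.g. { name: 'ECDHE-RSA-AES128-GCM-SHA256', ... }
  console.log(socket.getProtocol()); // e.g. 'TLSv1.2' or 'TLSv1.3'
  res.end('ok');
}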
