How to integrate OIDC Provider in Node.js - node.js

I tried to integrate oidc-provider in Node.js and I have a sample code. When I run this sample code it throws an error (unrecognized route or not allowed method (GET on /api/v1/.well-known/openid-configuration)). The problem is the issuer: with https://localhost:3000 as the issuer it works fine, but when I change the issuer to https://localhost:3000/api/v1/ it does not work. How do I fix this issue? I am also facing another issue when I implement oidc-provider in Node.js: the routes get overridden. How do I fix that?
Sample.js
const { Provider } = require('oidc-provider');

const configuration = {
  // ... see available options /docs
  clients: [{
    client_id: 'foo',
    client_secret: 'bar',
    redirect_uris: ['http://localhost:3000/api/v1/'],
    true_provider: "pcc"
    // + other client properties
  }],
};

const oidc = new Provider('http://localhost:3000/api/v1/', configuration);

// express/nodejs style application callback (req, res, next) for use with express apps, see /examples/express.js
oidc.callback()

// or just expose a server standalone, see /examples/standalone.js
const server = oidc.listen(3000, () => {
  console.log('oidc-provider listening on port 3000, check http://localhost:3000/api/v1/.well-known/openid-configuration');
});

Defining Issuer Identifier with a path component does not affect anything route-wise.
You have two options, either mount the provider to a path (see docs), or define the actual paths you want for each endpoint to be prefixed (see docs).
I think you're looking for a way to mount, so the first one.
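For reference, a minimal sketch of the mounting option, assuming an Express app (the /api/v1 prefix and port come from the question; note that recent oidc-provider versions expose callback() as a method, older ones as a property):
const express = require('express');
const { Provider } = require('oidc-provider');

const app = express();
const oidc = new Provider('http://localhost:3000/api/v1', configuration);

// Mounting the provider under /api/v1 is what actually prefixes the routes,
// so discovery becomes reachable at /api/v1/.well-known/openid-configuration
app.use('/api/v1', oidc.callback());

app.listen(3000);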

Related

How does one secure api keys on sveltekit 1.0

I am using Ghost. I made an integration and I would like to hide the API key from the front-end. I do not believe I can set restrictions on the Ghost CMS (that would also work). And I do believe +page.js files are also run in the browser, so I'm a little confused about how to achieve this.
The internal SvelteKit module $env/static/private (docs) is how you use secure API keys. SvelteKit will not allow you to import this module into client code, so it provides an extra layer of safety. Vite automatically loads your environment variables from .env files and process.env at build time and injects your key into your server-side bundle.
import { API_KEY } from '$env/static/private';
// Use your secret
SvelteKit has 4 modules for accessing environment variables (a small sketch follows the list):
$env/static/private (covered)
$env/static/public accessible by server and client and injected at build time (docs)
$env/dynamic/private provided by your runtime adapter; only includes variables that do not start with your public prefix (which defaults to PUBLIC_) and can only be imported by server files (docs)
$env/dynamic/public provided by your runtime adapter; only includes variables that do start with your public prefix (which defaults to PUBLIC_) (docs)
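For instance, a private variable can only be read in server files such as a +server.js endpoint. A minimal sketch, assuming a hypothetical route and an API_KEY variable that are not part of the original answer:
// src/routes/api/secret-check/+server.js — hypothetical route, for illustration only
import { env } from '$env/dynamic/private';
import { json } from '@sveltejs/kit';

export function GET() {
  // env.API_KEY is resolved at runtime on the server and never shipped to the browser;
  // importing '$env/dynamic/private' from client code is a build error in SvelteKit.
  return json({ configured: Boolean(env.API_KEY) });
}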
You don't need to hide the key.
Ghost Content API Docs:
These keys are safe for use in browsers and other insecure environments, as they only ever provide access to public data.
One common way to hide your third-party API key(s) from public view is to set up proxy API routes.
The general idea is to have your client (browser) query a proxy API route that you provide/host, have that proxy route query the third-party API using your credentials (API key), and pass on the results from the third-party API back to the client.
Because the query to the third-party API takes place exclusively on the back-end, your credentials are never exposed to the client (browser) and thus not visible to the public.
In your use case, you would have to create 3 dynamic endpoint routes to replicate the structure of Ghost's API:
src/routes/api/[resource]/+server.js to match /posts/, /authors/, /tags/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API Version setting
    }
  });
}
src/routes/api/[resource]/[id]/+server.js to match /posts/{id}/, /authors/{id}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, id } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/${id}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API Version setting
    }
  });
}
src/routes/api/[resource]/slug/[slug]/+server.js to match /posts/slug/{slug}/, /authors/slug/{slug}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, slug } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/slug/${slug}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API Version setting
    }
  });
}
Then all you have to do is call your proxy routes in place of your original third-party API routes in your app:
// very barebones example
<script>
  let uri;
  let data;

  async function get() {
    const res = await fetch(`/api/${uri}`);
    data = await res.json();
  }
</script>

<input name="uri" bind:value={uri} />
<button on:click={get}>GET</button>

{data}
Note that using proxy API routes will also have the additional benefit of sidestepping potential CORS issues.

"Google Auth Library: Node.js" expects localhost:3000 as redirect uri

I am using @google-cloud/local-auth to auth an application on a Node back end. I have set up an OAuth 2.0 Client ID with the type of Web application on the Google console and pointed the Authorized redirect URIs to http://localhost:3001 because that is the port I am using for dev. However, I can see that the source code at @google-cloud/local-auth/build/src/index.js:56:15 clearly expects 3000:
if (redirectUri.length === 0 ||
    parts.port !== '3000' ||
    parts.hostname !== 'localhost' ||
    parts.pathname !== '/oauth2callback') {
  throw new Error(invalidRedirectUri);
}
Why would this be hardcoded, and what happens when you use a different port?
I am adding my code for context
import gsc from '@googleapis/searchconsole'
import { authenticate } from '@google-cloud/local-auth'
import { resolve } from 'path'

export default async () => {
  // Obtain user credentials to use for the request
  const auth = await authenticate({
    keyfilePath: resolve('requests/client_secret.json'),
    scopes: [
      'https://www.googleapis.com/auth/webmasters',
      'https://www.googleapis.com/auth/webmasters.readonly'
    ]
  })
  google.options({ auth })
  const res = await gsc.sites.list({})
  console.log(res.data)
}
This library is meant to demonstrate authentication for sample purposes; it should be treated as a starting point for building an application and is not a general-purpose solution. Read it here
That port number is expected since this library is for demonstration purposes. If you are developing a production-ready application, you need to use something like https://github.com/googleapis/google-auth-library-nodejs
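If it helps, here is a rough sketch of that with google-auth-library, where the redirect URI (and therefore the port) is entirely up to you; the client ID/secret and port 3001 below are placeholders, not values from the question:
const { OAuth2Client } = require('google-auth-library');

// Credentials and redirect URI are placeholders; the redirect URI just has to match
// one of the Authorized redirect URIs configured in the Google Cloud console.
const client = new OAuth2Client(
  'YOUR_CLIENT_ID',
  'YOUR_CLIENT_SECRET',
  'http://localhost:3001/oauth2callback'
);

// Generate the consent URL for the scopes you need, send the user there,
// then exchange the code returned to /oauth2callback for tokens with client.getToken(code).
const authorizeUrl = client.generateAuthUrl({
  access_type: 'offline',
  scope: ['https://www.googleapis.com/auth/webmasters.readonly']
});
console.log('Authorize this app by visiting:', authorizeUrl);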

How to make Passport.js work in Adonis Framework

I wanted to know if Passport.js can ONLY be used in an Express framework and not in any other? The docs don't completely answer my question. I'm in the middle of migrating my project from Express to Adonis.js and I can't seem to make Passport work. Here is a sample of my code:
const passport = use('passport')
const bearer = use('./bearer')
passport.use('bearer', bearer)
module.exports = passport
and here is how I register it:
const namedMiddleware = {
  auth: 'Adonis/Middleware/Auth',
  guest: 'Adonis/Middleware/AllowGuestOnly',
  bearer: passport.authenticate(['bearer'], { session: false }),
}
this is the usage (I provided a bearer token):
Route.post('/', ({ response }) => {
  response.json('Hello world')
}).middleware(['bearer'])
It does not work. An error saying res.setHeader is not a function shows up. Maybe because the response and HTTP structure are different in Adonis?
I know that Adonis has its own authentication library but my INITIAL goal is to get what I have now in Express to work in an Adonis environment before making any library changes to avoid any complications.
I recently migrated from knex to Adonis.js as well. Integrating Passport.js was initially painful, but I got it to work with macros.
For your error: Adonis' Response object has no setHeader. You will need to create a macro on Response for that function. Something like this:
function setHeader (name, value) {
  this.header(name, value)
}

Response.macro('setHeader', setHeader)
Add that to a provider or hooks and you should be all set.
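For example, in an AdonisJS 4 app the registration could live in start/hooks.js, roughly like this (the exact file and hook are assumptions based on a default Adonis 4 setup, not something from the original answer):
// start/hooks.js
const { hooks } = require('@adonisjs/ignitor')

hooks.after.providersBooted(() => {
  const Response = use('Adonis/Src/Response')

  // Give Adonis' Response the setHeader shape that Passport/Express-style middleware expects
  Response.macro('setHeader', function setHeader (name, value) {
    this.header(name, value)
  })
})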

How to remove authentication for introspection query in Graphql

So maybe this is a very basic question, so please bear with me. Let me explain what I am doing and what I really need.
EXPLANATION
I have created a graphql server by using ApolloGraphql (apollo-server-express npm module).
Here is the code snippet to give you an idea.
api.js
import express from 'express'
import rootSchema from './root-schema'
.... // some extra code

const app = express.Router()
app.use(jwtaAuthenticator) // --> this code authenticates Authorization header
.... // some more middlewares added

const graphQLServer = new ApolloServer({
  schema: rootSchema, // --> this is root schema object
  context: context => context,
  introspection: true,
})

graphQLServer.applyMiddleware({ app, path: '/graphql' })
server.js
import http from 'http'
import express from 'express'
import apiRouter from './api' // --> the above file

const app = express()
app.use([some middlewares])
app.use('/', apiRouter)
....
....

export async function init () {
  try {
    const httpServer = http.createServer(app)
    httpServer
      .listen(PORT)
      .on('error', (err) => { setTimeout(() => process.exit(1), 5000) })
  } catch (err) {
    setTimeout(() => process.exit(1), 5000)
  }
  console.log('Server started --- ', PORT)
}

export default app
index.js
require('babel-core')
require('babel-polyfill')
require = require('esm')(module/* , options */)
const server = require('./server.js') // --> the above file
server.init()
PROBLEM STATEMENT
I am using node index.js to start the app. The app expects the Authorization header (JWT token) to be present at all times, even for the introspection query. But this is not what I want: I want the introspection query to be resolvable even without the token, so that anyone can see the documentation.
Please shed some light and guide me on the best approach to do so. Happy coding :)
.startsWith('query Introspection') is insecure because any query can be named Introspection.
The better approach is to check the whole query.
First import graphql and prepare introspection query string:
const { parse, print, getIntrospectionQuery } = require('graphql');
// format introspection query same way as apollo tooling do
const introspectionQuery = print(parse(getIntrospectionQuery()));
Then check the query in the Apollo Server configuration:
context: ({ req }) => {
  // allow introspection query
  if (req.body.query === introspectionQuery) {
    return {};
  }
  // continue
}
There's a ton of different ways to handle authorization in GraphQL, as illustrated in the docs:
Adding middleware for express (or some other framework like hapi or koa)
Checking for authorization inside individual resolvers
Checking for authorization inside your data models
Utilizing custom directives
Adding express middleware is great for preventing unauthorized access to your entire schema. If you want to allow unauthenticated access to some fields but not others, it's generally recommended you move your authorization logic from the framework layer to the GraphQL or data model layer using one of the methods above.
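As a rough sketch of that last point (the schema and resolver names here are hypothetical, not from the question), a resolver-level check keeps introspection open while still protecting individual fields:
// Hypothetical resolver map: introspection is answered by the server itself,
// so only the fields you guard explicitly require a logged-in user.
const resolvers = {
  Query: {
    publicPosts: () => [], // anyone can query this
    me: (parent, args, context) => {
      // context.user would be set by your own context function after verifying the JWT
      if (!context.user) {
        throw new Error('Not authenticated')
      }
      return context.user
    },
  },
}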
So finally I found the solution, and here is what I did.
Let me first tell you that there were 2 middlewares added on the base path, like this:
app //--> this is express.Router()
  .use(jwtMw) // ---> these are middlewares
  .use(otherMw)
The jwtMw is the one that checks the authentication of the user, and since even the introspection query goes through this middleware, it used to authenticate that as well. So, after some research I found this solution:
jwtMw.js
function addJWTMeta (req, res, next) {
  // we can check for null OR undefined and all, then check for query Introspection, with a better condition like ignore case
  if (req.body.query.trim().startsWith('query Introspection')) {
    req.isIntrospection = true
    return next()
  }
  ...
  ...
  // ---> extra code to do authentication of the USER based on the Authorization header
}

export default addJWTMeta
otherMw.js
function otherMw (req, res, next) {
  if (req.isIntrospection) return next()
  ...
  ...
  // ---> extra code to do some other context creation
}

export default otherMw
So here in jwtMw.js we check whether the query is an introspection query; if it is, we just add a flag to the req object and move on. Any middleware after jwtMw.js that cares about introspection just checks for that flag (isIntrospection, in this case) and, if it is present and true, moves on as well. We can scale this to every middleware: if req.isIntrospection is set, just carry on, otherwise do the actual processing.
Happy coding :)

Feathers Authentication via POST works but socket doesn't

Here is a repo showing my latest progress, and here is my configuration. As it stands that repo now doesn't even authenticate with REST - although I think something is wrong with socket auth that needs to be looked at.
I configured feathers, was able to create a user REST-fully with Postman, and even get an auth token (I can post to /authenticate to get a token, and then verify that token - yay postman! yay REST api!).
But in the browser the story ain't so happy. I can use find to get data back, but authenticate just gives me errors.
In my googling I found this post and updated my client javascript to be this. I have also tried doing jwt auth with the token from postman, but that gives the same Missing Credentials error. Halp!
Code incoming...
app.js (only the configuration part to show order)
app.configure(configuration(path.join(__dirname, '..')))
  .use(cors())
  .use(helmet()) // best security practices
  .use(compress())
  .use(favicon(path.join(app.get('public'), 'favicon.ico')))
  .use('/', feathers.static(app.get('public')))
  .configure(socketio())
  .configure(rest())
  .configure(hooks())
  .use(bodyParser.json())
  .use(bodyParser.urlencoded({ extended: true }))
  .configure(services) // pull in all services from services/index.js
  .configure(middleware) // middleware from middleware/index.js
  .hooks(appHooks)
Within services, I first add authentication, which is in its own file and that looks like this
authentication.js
const authentication = require('feathers-authentication');
const jwt = require('feathers-authentication-jwt');
const local = require('feathers-authentication-local');
const authManagement = require('feathers-authentication-management');

module.exports = function () {
  const app = this;
  const config = app.get('authentication');

  // Set up authentication with the secret
  app.configure(authentication(config));
  app.configure(authManagement(config));
  app.configure(jwt());
  app.configure(local(config.local));

  // The `authentication` service is used to create a JWT.
  // The before `create` hook registers strategies that can be used
  // to create a new valid JWT (e.g. local or oauth2)
  app.service('authentication').hooks({
    before: {
      create: [
        authentication.hooks.authenticate(config.strategies)
      ],
      remove: [
        authentication.hooks.authenticate('jwt')
      ]
    }
  });
};
index.html (mostly stripped, just showing relevant script)
let url = location.protocol + '//' + location.hostname +
  (location.port
    ? ':' + location.port
    : '');

const socket = io(url);

const feathersClient = feathers()
  .configure(feathers.socketio(socket))
  .configure(feathers.hooks())
  .configure(feathers.authentication({ storage: window.localStorage }));
Here's a screen shot showing some requests in chrome debugger and postman.
When default.json is set to use 'username' as the usernameField, it outputs my Windows username, 'Matt'. This is because feathers-configuration checks whether a value is part of the OS environment.
https://github.com/feathersjs/feathers-configuration/blob/master/src/index.js#L26
Workaround
Manually set this configuration in authentication.js, for example:
// Set up authentication with the secret
const localConfig = {
  'entity': 'users',
  'service': 'users',
  'usernameField': 'username',
  'passwordField': 'password'
};
app.configure(authentication(config));
app.configure(local(localConfig));
