GitHub API with Octokit: app authentication for creating a repository dispatch event gives 404 Not Found - github-api

I'm trying to authenticate through a GitHub App to fire a repository dispatch event as per these docs. The app has the metadata:read and contents:read&write permissions mentioned in the docs.
With the code below, authentication succeeds, but I get a 404 Not Found error.
// Assuming @octokit/core and @octokit/auth-app; the pattern is the same with @octokit/rest
import { Octokit } from "@octokit/core";
import { createAppAuth } from "@octokit/auth-app";

// App credentials (secrets and privateKey are loaded elsewhere)
const authDetails = {
  appId: secrets.appId,
  privateKey: privateKey,
  clientId: secrets.clientId,
  clientSecret: secrets.clientSecret,
};
const auth = createAppAuth(authDetails);

(async () => {
  // Authenticate as the app itself (JWT)
  const appAuthentication = await auth({ type: "app" });
  // Pass the authentication result straight to Octokit
  const appOctokit = new Octokit(appAuthentication);

  appOctokit.request('POST /repos/org_name/repo_name/dispatches', {
    event_type: 'my_action'
  }).then(res => {
    console.log(res);
  }).catch(err => {
    console.log(err);
  });
})();
If I authenticate using a personal access token (code below), the request succeeds, and the action my_action is fired as expected.
const patOctokit = new Octokit({ auth: `***` });

(async () => {
  patOctokit.request('POST /repos/org_name/repo_name/dispatches', {
    event_type: 'my_action'
  }).then(res => {
    console.log(res);
  }).catch(err => {
    console.log(err);
  });
})();
Comparing the responses, I can see that the PAT method (204) includes the header 'x-oauth-scopes': 'repo', whereas the app method (404) does not.
The repo in question is also owned by an organisation; I'm not sure whether that could alter the required permissions.
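For what it's worth (this is not from the original post): a GitHub App JWT only authenticates as the app itself, and GitHub answers 404 rather than 403 when a token cannot see a repository, so repository endpoints such as dispatches generally need an installation token rather than an app token. A minimal sketch of installation-based authentication, assuming the app is installed on the organisation and the installation id (the 12345678 below is hypothetical) is known:
// Sketch only: authenticate as an installation instead of as the app.
// The installation id is hypothetical; installations can be listed via GET /app/installations.
import { Octokit } from "@octokit/core";
import { createAppAuth } from "@octokit/auth-app";

const installationOctokit = new Octokit({
  authStrategy: createAppAuth,
  auth: {
    appId: secrets.appId,
    privateKey: privateKey,
    installationId: 12345678, // hypothetical: the app's installation on the organisation
  },
});

(async () => {
  await installationOctokit.request('POST /repos/org_name/repo_name/dispatches', {
    event_type: 'my_action'
  });
})();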

Related

How to authenticate through Google's IAP in Cypress?

I have an application that sits behind Google's Identity-Aware Proxy (IAP), so IAP has to be passed before the rest of the page is visible. I want to test this page, but IAP is standing in my way. I'm trying to wade through the docs, but I don't understand which method of authentication I need to use.
I tried Application Default Credentials, since I have the JSON auth file and an env variable pointing at it, but after going to the site I get a 403 saying I can't access it. Maybe I'm doing something wrong in general?
cypress.config.ts
import { defineConfig } from 'cypress';
import { GoogleAuth } from 'google-auth-library';

const GOOGLE_PROJECT_ID = 1234567;

export default defineConfig({
  e2e: {
    async setupNodeEvents(on, config): Promise<Cypress.PluginConfigOptions> {
      // ... other functions, working with async/await
      on('task', {
        getIAPToken: async () => {
          console.log('🔥 Initialized GoogleAuth');
          // Uses Application Default Credentials (GOOGLE_APPLICATION_CREDENTIALS)
          const auth = new GoogleAuth({
            scopes: ['https://www.googleapis.com/auth/cloud-platform'],
          });
          const client = await auth.getClient();
          const projectId = GOOGLE_PROJECT_ID;
          const iapUrl = `https://iap.googleapis.com/projects/${projectId}/iap_web/auth`;
          console.log('🔥 Requesting IAP');
          const response = await client.request({ url: iapUrl });
          return response;
        },
      });
      return config;
    },
  },
});
test.cy.ts
describe('Smoke tests', () => {
  it('content is not visible', () => {
    cy.visit("https://some-page-behind-iap.com");
    cy.contains("Content!").should(content => expect(content).to.not.be.visible);
  });

  it('content is visible', () => {
    cy.visit("https://some-page-behind-iap.com");
    cy.task('getIAPToken').then((data) => {
      console.log("loaded:", data);
    });
    cy.contains("Content!").should(content => expect(content).to.be.visible);
  });
});
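Not part of the original question, but for context: IAP expects an OIDC ID token whose audience is the IAP OAuth client ID, not a cloud-platform access token. A rough sketch of a task body that mints such a token via Application Default Credentials, and a test that sends it as a header (the env var and URL are placeholders):
// cypress.config.ts (sketch): drop-in body for the getIAPToken task above.
// IAP_CLIENT_ID is a hypothetical env var holding the IAP OAuth client ID (the audience).
import { GoogleAuth } from 'google-auth-library';

const getIAPToken = async (): Promise<string> => {
  const audience = process.env.IAP_CLIENT_ID as string;
  const auth = new GoogleAuth(); // Application Default Credentials
  const client = await auth.getIdTokenClient(audience);
  return client.idTokenProvider.fetchIdToken(audience);
};

// test.cy.ts (sketch): send the token on the initial request.
it('content is visible', () => {
  cy.task('getIAPToken').then((token) => {
    cy.visit('https://some-page-behind-iap.com', {
      headers: { Authorization: `Bearer ${token}` },
    });
    cy.contains('Content!').should('be.visible');
  });
});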

How to Authenticate a Service Account for the Gmail API in Node.js?

So I'm using the Node.js Gmail library (@googleapis/gmail) to send an email to another user. I was thinking of using a service account to do just that. I've followed their documentation and passed the keyFile property, but when I run the code I get a 401 error, Login Required.
Here's what I got so far:
const { gmail } = require("@googleapis/gmail");

function createMessage(from, to, subject, message) {
  // logic that returns base64 email
  const encodedMail = [...];
  return encodedMail;
}

export default function handler(req, res) {
  const auth = gmail({
    version: "v1",
    keyFile: './google_service.json',
    scopes: ["https://www.googleapis.com/auth/gmail.send"],
  });
  const raw = createMessage(
    process.env.SERVICE_EMAIL,
    "someone@gmail.com",
    "Subject",
    "This is a test",
  );
  const post = auth.users.messages.send({
    userId: "me",
    requestBody: {
      raw,
    },
  });
  post
    .then((result) => {
      console.log(result.data);
    })
    .catch((err) => {
      console.log(err);
    });
}
I've already got my service account credential JSON file and placed it at the root of my project. Is there something I'm doing wrong?
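Not from the original question, but a note on the likely cause: the options object passed to gmail() is not by itself a working credential, and a service account can only send mail as a Google Workspace user through domain-wide delegation. A rough sketch of that setup, assuming delegation is configured and user@yourdomain.com is a placeholder for the mailbox to impersonate:
// Sketch: service-account auth with domain-wide delegation.
const { gmail } = require("@googleapis/gmail");
const { JWT } = require("google-auth-library");

const auth = new JWT({
  keyFile: "./google_service.json",
  scopes: ["https://www.googleapis.com/auth/gmail.send"],
  subject: "user@yourdomain.com", // placeholder: the Workspace user to impersonate
});

const client = gmail({ version: "v1", auth });

client.users.messages.send({
  userId: "me",
  requestBody: { raw }, // raw: the base64url message from createMessage() above
}).then((result) => console.log(result.data));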

Secure a GraphQL API with passport + JWT's or sessions? (with example)

To give a bit of context: I am writing an API that serves an internal CMS in React (which requires Google login) and a React Native app that should support SMS, email and Apple login, and I'm stuck on which way of authenticating would be best. I currently have an example auth flow below. A team member signs in using Google, a refresh token gets sent in an httpOnly cookie and is stored in a variable on the client, and that token can then be exchanged for an accessToken. The refresh token in the cookie also carries a tokenVersion, which is checked before an accessToken is sent; this adds some extra load on the database, but it can be incremented if somebody's account gets stolen. Before any GraphQL queries / mutations are allowed, the user's token is decoded and added to the GraphQL context, so I can check roles with graphql-shield and access the user for db operations in my queries / mutations when needed.
Because I'm still hitting the database, even if it's only once per page / app load, I wonder whether this is a good approach or whether I'd be better off using sessions instead.
// index.ts
import "./passport"

const main = () => {
  const server = fastify({ logger })
  const prisma = new PrismaClient()
  const apolloServer = new ApolloServer({
    schema: applyMiddleware(schema, permissions),
    context: (request: Omit<Context, "prisma">) => ({ ...request, prisma }),
    tracing: __DEV__,
  })

  server.register(fastifyCookie)
  server.register(apolloServer.createHandler())
  server.register(fastifyPassport.initialize())

  server.get(
    "/auth/google",
    {
      preValidation: fastifyPassport.authenticate("google", {
        scope: ["profile", "email"],
        session: false,
      }),
    },
    // eslint-disable-next-line @typescript-eslint/no-empty-function
    async () => {}
  )

  server.get(
    "/auth/google/callback",
    {
      preValidation: fastifyPassport.authorize("google", { session: false }),
    },
    async (request, reply) => {
      // Store user in database
      // const user = existingOrCreatedUser
      // sendRefreshToken(user, reply) < send httpOnly cookie to client
      // const accessToken = createAccessToken(user)
      // reply.send({ accessToken, user }) < send accessToken
    }
  )

  server.get("/refresh_token", async (request, reply) => {
    const token = request.cookies.fid
    if (!token) {
      return reply.send({ accessToken: "" })
    }
    let payload
    try {
      payload = verify(token, secret)
    } catch {
      return reply.send({ accessToken: "" })
    }
    const user = await prisma.user.findUnique({
      where: { id: payload.userId },
    })
    if (!user) {
      return reply.send({ accessToken: "" })
    }
    // Check live tokenVersion against user's one in case it was incremented
    if (user.tokenVersion !== payload.tokenVersion) {
      return reply.send({ accessToken: "" })
    }
    sendRefreshToken(user, reply)
    return reply.send({ accessToken: createAccessToken(user) })
  })

  server.listen(port)
}
// passport.ts
import fastifyPassport from "fastify-passport"
import { OAuth2Strategy } from "passport-google-oauth"

fastifyPassport.registerUserSerializer(async (user) => user)
fastifyPassport.registerUserDeserializer(async (user) => user)

fastifyPassport.use(
  new OAuth2Strategy(
    {
      clientID: process.env.GOOGLE_CLIENT_ID,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET,
      callbackURL: "http://localhost:4000/auth/google/callback",
    },
    (_accessToken, _refreshToken, profile, done) => done(undefined, profile)
  )
)
// permissions/index.ts
import { shield } from "graphql-shield"
import { rules } from "./rules"

export const permissions = shield({
  Mutation: {
    createOneShopLocation: rules.isAuthenticatedUser,
  },
})

// permissions/rules.ts
import { rule } from "graphql-shield"
import { Context } from "../context"

export const rules = {
  isAuthenticatedUser: rule()(async (_parent, _args, ctx: Context) => {
    const authorization = ctx.request.headers.authorization
    if (!authorization) {
      return false
    }
    try {
      // Strip the "Bearer " prefix (including the trailing space) before verifying
      const token = authorization.replace("Bearer ", "")
      const payload = verify(token, secret)
      // mutative
      ctx.payload = payload
      return true
    } catch {
      return false
    }
  }),
}
To answer your question directly: you want to be using JWTs for access, and that's it. Those JWTs should be created tied to a user session, but you don't want to have to manage them yourself; you want a user identity aggregator to do it.
You are better off removing most of the code that handles user login/refresh and using a user identity aggregator instead. You are running into the common complexity problems of hand-rolling the user auth flow, which is exactly why these services exist.
The most common is Auth0, but the price and complexity may not match your expectations. I would suggest going through the list and picking the one that best supports your use cases:
Auth0
Okta
Firebase
Cognito
Authress
Or you can check out this article, which suggests a number of alternatives as well as what each of them focuses on.
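To make the suggestion concrete (my own sketch, not part of the original answer): with a hosted identity provider the server's job shrinks to verifying the provider-issued JWT on each request, typically against the provider's JWKS endpoint, and putting the claims on the GraphQL context. Something along these lines, where the jwksUri, audience and issuer are placeholders:
// Sketch: verify a provider-issued access token against its JWKS endpoint.
import { verify, decode } from "jsonwebtoken"
import jwksClient from "jwks-rsa"

const client = jwksClient({ jwksUri: "https://YOUR_TENANT/.well-known/jwks.json" })

export const verifyAccessToken = async (token: string) => {
  const decoded = decode(token, { complete: true })
  if (!decoded || typeof decoded === "string") throw new Error("malformed token")
  // Look up the signing key by the token's key id, then verify signature and claims
  const key = await client.getSigningKey(decoded.header.kid)
  return verify(token, key.getPublicKey(), {
    algorithms: ["RS256"],
    audience: "your-api-identifier",
    issuer: "https://YOUR_TENANT/",
  })
}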

OAuth2 idToken is verified, but Login Required error persists

So cutting right to the chase...
I'm integrating an Angular front-end with an express back-end.
The login happens using angularx-social-login in the front-end (trying to avoid redirects) and the idToken is sent to the back-end for verification. The scopes are added during the login at the front-end.
After using google-auth-library to verify the token, everything checks out.
But as soon as service.people.connections.list() is called to get Google contacts, a Login Required error comes back. I've tried using the access token I get at the front-end, the payload I get from the verification and all that... I'm sure I'm missing a single step, but I have no clue which.
req.headers.authorization here is the idToken.
const { OAuth2Client } = require('google-auth-library');
const { google } = require('googleapis');

const client = new OAuth2Client(CLIENT_ID);

// Called from inside an Express route handler, so req is in scope
async function verify() {
  const ticket = await client.verifyIdToken({
    idToken: req.headers.authorization,
    audience: CLIENT_ID
  });
  const payload = ticket.getPayload();
  console.log(payload);
  const userid = payload['sub'];

  const service = google.people({ version: 'v1' });
  service.people.connections.list({
    resourceName: 'people/me',
    pageSize: 10,
    personFields: 'names,emailAddresses',
  }, (err, res) => {
    if (err) return console.error('The API returned an error: ' + err);
    const connections = res.data.connections;
    if (connections) {
      console.log('Connections:');
      connections.forEach((person) => {
        if (person.names && person.names.length > 0) {
          console.log(person.names[0].displayName);
        } else {
          console.log('No display name found for connection.');
        }
      });
    } else {
      console.log('No connections found.');
    }
  });
}
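A note that isn't in the original post: verifyIdToken only proves who the user is; the People API call itself carries no credentials here, which is what produces Login Required. Calling it needs an OAuth access token with the contacts scope (the one obtained on the front-end), roughly like this, where accessToken is assumed to be sent alongside the idToken:
// Sketch: authorize the People API call with the user's access token.
// accessToken is hypothetical, e.g. passed from the front-end in another header.
const { OAuth2Client } = require('google-auth-library');
const { google } = require('googleapis');

const authClient = new OAuth2Client(CLIENT_ID);
authClient.setCredentials({ access_token: accessToken });

const service = google.people({ version: 'v1', auth: authClient });
const res = await service.people.connections.list({
  resourceName: 'people/me',
  pageSize: 10,
  personFields: 'names,emailAddresses',
});
console.log(res.data.connections);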

MSAL js, AAD B2C Multi Factor Authentication, 400 Bad Request, Request header too long

MSAL js Version: v0.2.4;
Chrome Version: 79.0.3945.88 (Official Build) (64-bit)
From various posts I understand that the '400 Bad Request - Request header too long' error is caused by cookies piling up, but it is not happening in all of my developer environments.
I would like to know why it does not happen in local environments (running from VS Code) but does in deployed environments (Azure App Service).
I can update the MSAL package to the latest version, but it previously worked fine in the deployed environments and no longer does, so why?
Is there any connection between the scope error message (AADB2C90055) and 'Bad Request - Request header too long'?
AADB2C90055: The scope 'openid profile' must specify resource
Any sort of information will be useful to me or other folks; thanks in advance.
Here is the code being used in my app:
// MSAL.js v0.2.x API
import * as Msal from 'msal';

// Shape inferred from the destructuring below; defined elsewhere in the app
interface B2CConfig {
  clientId: string;
  tenantId: string;
  myb2cSigninPolicy: string;
  myb2cPasswordResetPolicy: string;
}

let userAgentApplication: Msal.UserAgentApplication;

const createAuthorityUrl = (tenantId: string, policy: string) => {
  return `https://${tenantId}.b2clogin.com/tfp/${tenantId}.onmicrosoft.com/${policy}`;
};

export const b2cLogin = (config: B2CConfig) => {
  const msalAppConfig = {
    cacheLocation: 'localStorage',
    redirectUri: `${location.protocol}//${location.host}`,
    navigateToLoginRequestUrl: false,
    storeAuthStateInCookie: true,
    validateAuthority: false,
  };
  const { clientId, tenantId, myb2cSigninPolicy, myb2cPasswordResetPolicy } = config;

  return new Promise(resolve => {
    let handlingPasswordReset = false;
    const app = new Msal.UserAgentApplication(
      clientId,
      createAuthorityUrl(tenantId, myb2cSigninPolicy),
      (errorDesc: string, token: string) => {
        if (errorDesc && errorDesc.indexOf('AADB2C90118') > -1) {
          // user forgot password
          // https://github.com/Azure-Samples/active-directory-b2c-javascript-msal-singlepageapp/issues/9#issuecomment-347556074
          handlingPasswordReset = true;
          new Msal.UserAgentApplication(
            clientId,
            createAuthorityUrl(tenantId, myb2cPasswordResetPolicy),
            () => null,
            msalAppConfig,
          ).loginRedirect();
        }
        return resolve(token);
      },
      msalAppConfig,
    );
    if (!handlingPasswordReset) {
      userAgentApplication = app;
    }
    // Seems that MSAL's acquireTokenSilent() won't resolve if run within an iframe
    if (window.parent !== window) {
      return resolve('');
    }
    if (!userAgentApplication.isCallback(location.hash)) resolve(getAccessToken());
  });
};

export const getAccessToken = async (): Promise<string> => {
  if (!userAgentApplication) {
    throw new Error('getAccessToken attempted before authentication initialized');
  }
  try {
    return await userAgentApplication.acquireTokenSilent(['openid']);
  } catch (error) {
    console.log(error);
    return '';
  }
};
The error HTTP 400: Size of header request is too long generally happens because there are too many cookies or the cookies are too big.
reference:
Azure Portal: Bad Request - Request Too Long
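One hedged observation beyond the linked answer: the config above sets storeAuthStateInCookie: true, which makes MSAL persist auth state in cookies on every login attempt; that setting mainly works around IE/Edge redirect issues, so if it isn't needed, turning it off reduces the cookie volume sent with each request. Sketch of the same msalAppConfig without it:
// Sketch only: same msalAppConfig as above, but without cookie-based auth state.
const msalAppConfig = {
  cacheLocation: 'localStorage',
  redirectUri: `${location.protocol}//${location.host}`,
  navigateToLoginRequestUrl: false,
  storeAuthStateInCookie: false, // avoid piling auth state into cookies unless IE/Edge support is required
  validateAuthority: false,
};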
