Access HashiCorp Vault KV secret using node-vault - node.js

I'm trying to access HashiCorp Vault KV with "node-vault" but keep getting "statusCode: 404"
I'm following the example of node-vault
https://github.com/kr1sp1n/node-vault
1) I'm running vault_1.1.3_windows_amd64 on Windows 10 with "vault server -dev" in one PowerShell.
2) Then, in another PowerShell, I run the following:
$env:VAULT_ADDR="http://127.0.0.1:8200"
vault secrets enable -version=1 kv
vault status
Key             Value
---             -----
Seal Type       shamir
Initialized     true
Sealed          false
Total Shares    1
Threshold       1
Version         1.1.3
Cluster Name    vault-cluster-28a041c6
Cluster ID      0ec85d70-8e87-dff6-347f-b1959fad8b44
HA Enabled      false
3) Then I run the following Node.js code with the root token and unseal key from the dev server output:
const rootKey = '...';   // whatever (root token from the dev server output)
const unsealKey = '...'; // whatever (unseal key from the dev server output)

const options = {
  apiVersion: 'v1',
  endpoint: 'http://127.0.0.1:8200',
  token: rootKey
};

const vault = require('node-vault')(options);

vault.unseal({ key: unsealKey })
  .then(() => {
    vault.write('secret/hello', { value: 'world' })
      .then((res) => console.log(res))
      .catch((err) => console.error(err));
  });

vault.write('secret/hello', { value: 'world', lease: '1s' })
  .then(() => vault.read('secret/hello'))
  .then(() => vault.delete('secret/hello'))
  .catch(console.error);
This returns status 404. What needs to be done additionally to avoid the 404?
{ Error: Status 404
at handleVaultResponse (XX\TestCodes\Node-VaultTest\node_modules\node-vault\src\index.js:49:21)
at process._tickCallback (internal/process/next_tick.js:68:7)
response:
{ statusCode: 404,
body:
{ request_id: '2992e6c2-5146-6569-1f48-55f75da88993',
lease_id: '',
renewable: false,
lease_duration: 0,
data: null,
wrap_info: null,
warnings: [Array],
auth: null } } }
{ Error: Status 404
at handleVaultResponse (XX\TestCodes\Node-VaultTest\node_modules\node-vault\src\index.js:49:21)
at process._tickCallback (internal/process/next_tick.js:68:7)
response:
{ statusCode: 404,
body:
{ request_id: '2f280fa4-6596-c06f-2168-091246e0a2a1',
lease_id: '',
renewable: false,
lease_duration: 0,
data: null,
wrap_info: null,
warnings: [Array],
auth: null } } }

You mounted the kv store as version 1. The path node-vault uses to read a secret from a version 2 kv store is different and not compatible with Vault's v1 kv store.
Mount your kv store with -version=2; if unspecified, it defaults to v1.

Also bear in mind that you need to add "data" to your path to get it working.
For instance, my secret lived at kv/mySecret, since "mySecret" was in the root "folder", yet I had to address it as kv/data/mySecret. Then it worked for me!
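As a minimal sketch of what that looks like with node-vault (assuming the dev server's default secret/ engine, which is mounted as KV v2, and the dev root token, so no unseal call is needed): the path gains a data/ segment and the write payload is wrapped in a data object.

// Hedged sketch against a KV v2 mount: note the extra data/ path segment and the
// { data: ... } wrapper on write; read results come back under res.data.data.
const rootKey = process.env.VAULT_TOKEN; // assumption: the dev root token is exported in the shell

const vault = require('node-vault')({
  apiVersion: 'v1',
  endpoint: 'http://127.0.0.1:8200',
  token: rootKey,
});

vault.write('secret/data/hello', { data: { value: 'world' } })
  .then(() => vault.read('secret/data/hello'))
  .then((res) => console.log(res.data.data))        // { value: 'world' }
  .then(() => vault.delete('secret/data/hello'))    // deletes the latest version only
  .catch(console.error);

Against a v1 mount (for example the kv/ engine enabled above with -version=1), the plain paths without data/ work as-is.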

Related

How to properly send a GraphQL update request using Axios?

I created an application using AWS Amplify, and I want to create an endpoint in an API function that will mutate a field in GraphQL. Since it's a Node.js API function, I cannot use the recommended libraries from the AWS documentation, which use ES6, since the functions can only use ES5. Therefore I need to use Axios.
I created a graphql query:
const query = /* GraphQL */ `mutation updatePublication($id: ID!, $keywords: String) {
  updatePublication(id: $id, keywords: $keywords) {
    id
    keywords
  }
}`;
Next, I created the Axios request based on this StackOverflow question.
const data = await axios.post(
  process.env.API_APPNAME_GRAPHQLAPIENDPOINTOUTPUT,
  {
    query: query,
    variables: {
      id: variableWithID,
      keywords: "updated keywords!"
    }
  },
  {
    headers: {
      'Content-Type': 'application/json',
      'x-api-key': process.env.API_APPNAME_GRAPHQLAPIKEYOUTPUT
    }
  }
);
When I run the query, I get a status 200 from the server with the following errors:
{
  data: null,
  errors: [
    {
      path: null,
      locations: [Array],
      message: "Validation error of type MissingFieldArgument: Missing field argument input @ 'updatePublication'"
    },
    {
      path: null,
      locations: [Array],
      message: "Validation error of type UnknownArgument: Unknown field argument id @ 'updatePublication'"
    },
    {
      path: null,
      locations: [Array],
      message: "Validation error of type UnknownArgument: Unknown field argument keywords @ 'updatePublication'"
    }
  ]
}
Can anyone advise on what this means?
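The validation errors say the deployed updatePublication field has no id or keywords arguments and is missing an argument called input, which is how Amplify/AppSync-generated mutations are usually shaped. A hedged rewrite under that assumption (the input type name UpdatePublicationInput is assumed, not confirmed by the question) passes one input variable instead:

// Hedged sketch: send a single `input` argument, matching the validation errors above.
// `UpdatePublicationInput` is an assumed type name; `variableWithID` is from the original code.
const query = /* GraphQL */ `mutation updatePublication($input: UpdatePublicationInput!) {
  updatePublication(input: $input) {
    id
    keywords
  }
}`;

const variables = {
  input: {
    id: variableWithID,
    keywords: "updated keywords!"
  }
};

// The axios.post call stays the same, with { query, variables } as the request body.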

Google Secret Manager INVALID_ARGUMENT Error

I'm using Google Secret Manager to access/store some secret parameters in an API. I have no problem saving secrets and accessing versions.
But when I send a request to list these secrets, I keep getting this error.
Error: 3 INVALID_ARGUMENT: Invalid resource field value in the request.
code: 3,
details: 'Invalid resource field value in the request.',
metadata: Metadata {
internalRepr: Map(3) {
'grpc-server-stats-bin' => [Array],
'google.rpc.errorinfo-bin' => [Array],
'grpc-status-details-bin' => [Array]
},
options: {}
},
statusDetails: [
ErrorInfo {
metadata: [Object],
reason: 'RESOURCE_PROJECT_INVALID',
domain: 'googleapis.com'
}
],
reason: 'RESOURCE_PROJECT_INVALID',
domain: 'googleapis.com',
errorInfoMetadata: {
method: 'google.cloud.secretmanager.v1.SecretManagerService.ListSecrets',
service: 'secretmanager.googleapis.com'
}
}
I've also checked the docs and tried different queries like in here but no dice...
This is the part of the code I'm running:
import { SecretManagerServiceClient } from "@google-cloud/secret-manager";

const secretClient = new SecretManagerServiceClient({
  keyFile: "foo/bar/google_credentials.json"
});

const [secrets] = await secretClient.listSecrets({
  filter: `labels.environment=development`
});
Version of "@google-cloud/secret-manager": "^4.1.2".
Okay, I found the issue. I had to add the parent param to the request body.
So it should look like this:
const [secrets] = await SecretManager.secretClient.listSecrets({
  parent: "projects/**", // <=========== this is the key
  filter: `
    labels.environment:development AND
    labels.scope:some-scope AND
    labels.customer_id:*`
});
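For reference, parent is the project's resource name (projects/PROJECT_ID). A hedged sketch that derives it from the client's own credentials instead of hard-coding it (getProjectId() is exposed by the generated @google-cloud clients via google-gax; worth verifying for your version):

// Hedged sketch: build `parent` from the project id resolved from the credentials file.
const projectId = await secretClient.getProjectId(); // resolved from google_credentials.json
const [secrets] = await secretClient.listSecrets({
  parent: `projects/${projectId}`,
  filter: `labels.environment=development`,
});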

Error thrown when trigger method called in pusher, "expected string"

I'm trying to configure Pusher in my Node.js app.
const Pusher = require("pusher");

const pusher = new Pusher({
  appId: "id",
  key: "key",
  secret: "secret",
  cluster: "ap2",
  useTLS: true,
});

pusher.trigger('my-channel', 'my-event', {
  message: "Hello, world."
}).then(console.log).catch(e => console.log(e));
When I put the above code in my index file and run it, I get the following error message.
{
name: 'PusherRequestError',
message: 'Unexpected status code 400',
url: 'a URL with all secret and stuff',
error: undefined,
status: 400,
body: 'expected string\n'
}
I double-checked the keys, secrets, etc and they are all correct. Any help is appreciated. Thanks!

"EACCESS: permission denied" problem when downloading xlsx with POST method from server in ubuntu with Angular 8 and Nodejs

My Stack:
Angular 8, Node 12, NestJs framework, Ubuntu server
Some context:
I am trying to download a generated xlsx file from the server. I already have a working Excel download, but it only works when I execute it with a GET method; now I need to send some parameters, so I am using a POST request.
This code works perfectly on localhost (on Windows 10).
I get EACCES: permission denied when the server tries to create the file in the temp directory.
Angular:
site-list.component.ts
onGetSitesXLSX(): void {
  const ids = [];
  this.dataSource.getSubject().subscribe(
    sites => {
      sites.forEach(s => ids.push(s._id));
      this.siteService.downloadListF(ids)
        .subscribe((res: Blob) => {
          const blob = new Blob([res], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' });
          const today = moment().format('MM-DD-YYYY hh-mm-ss');
          importedSaveAs(blob, `List of sites ${today}.xlsx`);
        }, error => {
          if (error && error.error && error.error.message) {
            this.downloadListError = SiteError.getErrorMessage(error.error.message);
          }
        });
    }
  );
}
site.service.ts
generateHeader(): { 'Content-Type': string, 'Authorization'?: string } {
  return !!this.token ? {
    'Content-Type': 'application/json',
    Authorization: this.token
  } : {
    'Content-Type': 'application/json'
  };
}
downloadListF(sites: string[]): Observable<Blob> {
  const options = {
    responseType: 'blob' as 'json',
    headers: this.generateHeader(),
  };
  return this.http.post<Blob>(
    this.apiURL + 'sites/filtered/xslx',
    sites,
    options
  ).pipe(
    catchError(error => {
      if (!!error && !!error.error && !!error.error.message && error.error.message === SiteErrorCode.unauthorized) {
        this.coreService.newError(SiteError.getErrorMessage(SiteErrorCode.unauthorized));
      }
      return throwError(error);
    }),
  );
}
Backend:
site.controller.ts
@UseGuards(AuthGuard('jwt'), RolesGuard)
@Role(UserRole.guest)
@Post('filtered/xslx')
@ApiResponse({
  status: 200,
  type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
})
@HttpCode(HttpStatus.OK)
@Header('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
@Header('Content-Disposition', 'attachment; filename=message.xlsx')
async exportSitesFiltered(@Res() response, @Body() body) {
  return response.sendFile(await this.exportSiteService.exportSiteListFiltered(body));
}
async exportSiteListFiltered(sitesIds): Promise<string> {
  const sites = [];
  let partial;
  for (const id of sitesIds) {
    partial = await this.siteService.getSiteById(id); // get populated data from the DB
    sites.push(partial);
  }
  const headers = [
    'Name',
    'Type',
    'Latitude',
    'Longitude',
    'Region',
    'Data',
  ];
  const data = [];
  for (const site of sites) {
    data.push(ExportSiteService.generateSingleSheetRow(site)); // formats the object into an array row
  }
  return await this.xlsxService.generateXlsx('sites', headers, data);
}
xlsx.service.ts
async generateXlsx(fileName: string, headers: string[], data): Promise<string> {
  const sheetPath = join(__dirname, '..', '..', '..', 'temp', `${fileName}.xlsx`);
  closeSync(openSync(sheetPath, 'w')); // creates/truncates the file; this is where EACCES is thrown
  await xlsxPopulate
    .fromBlankAsync()
    .then(workbook => {
      const rangeContent = [];
      rangeContent.push(headers);
      for (const row of data) {
        rangeContent.push(row);
      }
      workbook.sheet(0).name(fileName);
      workbook.sheet(0).cell('A1').value(rangeContent);
      workbook.sheet(0).row(1).style({
        bold: true,
        italic: true,
      });
      return workbook.toFileAsync(sheetPath);
    })
    .catch(error => {
      Logger.error(error);
      throw new InternalServerErrorException(XlsxError.writeXlsx);
    });
  return sheetPath;
}
}
Errors on Server:
0|nec-dev | (node:20697) [DEP0066] DeprecationWarning: OutgoingMessage.prototype._headers is deprecated
0|nec-dev | [Nest] 20697 - 04/16/2020, 2:15:58 PM [ExceptionsHandler] EACCES: permission denied, open '/var/www/my-project/server/temp/sites.xlsx' +60115ms
0|nec-dev | Error: EACCES: permission denied, open '/var/www/my-project/server/temp/sites.xlsx'
0|nec-dev | at Object.openSync (fs.js:454:3)
0|nec-dev | at XlsxService.generateXlsx (/var/www/my-project/server/src/_utils/xlsx/xlsx.service.ts:17:15)
0|nec-dev | at ExportSiteService.exportSiteListFiltered (/var/www/my-project/server/src/site/services/export-site.service.ts:91:35)
0|nec-dev | at processTicksAndRejections (internal/process/task_queues.js:89:5)
0|nec-dev | at SiteController.exportSitesFiltered (/var/www/my-project/server/src/site/site.controller.ts:93:30)
0|nec-dev | at /var/www/my-project/server/node_modules/@nestjs/core/router/router-execution-context.js:45:28
0|nec-dev | at /var/www/my-project/server/node_modules/@nestjs/core/router/router-proxy.js:8:17
Errors in the browser console:
POST https://xxxxx/api/v1/sites/filtered/xslx 500 (Internal Server Error)
{headers: h, status: 500, statusText: "Internal Server Error", url: "https://xxxx/api/v1/sites/filtered/xslx", ok: false, …}
headers: h {normalizedNames: Map(0), lazyUpdate: null, lazyInit: ƒ}
status: 500
statusText: "Internal Server Error"
url: "https://xxxxx/api/v1/sites/filtered/xslx"
ok: false
name: "HttpErrorResponse"
message: "Http failure response for https://xxxx/api/v1/sites/filtered/xslx: 500 Internal Server Error"
error: Blob {size: 52, type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"}
__proto__: O
I've tried to execute npm with sudo, and also:
npm cache clean --force
sudo chown -R $(whoami) ~/.npm
But with no luck.
How can I fix this?
If you own /var/www/, that might fix this issue:
sudo chown -R $(whoami) /var/www/
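As an alternative (a hedged sketch, not the fix above): write the workbook to the OS temp directory via os.tmpdir(), which the Node process can normally write to regardless of who owns the project directory, instead of the hard-coded project-relative temp folder:

// Hedged alternative sketch: target the OS temp dir instead of ./temp under /var/www.
// Assumes xlsx-populate is the library behind `xlsxPopulate` in the service above.
const { tmpdir } = require('os');
const { join } = require('path');
const xlsxPopulate = require('xlsx-populate');

async function generateXlsx(fileName, headers, data) {
  const sheetPath = join(tmpdir(), `${fileName}.xlsx`); // e.g. /tmp/sites.xlsx
  const workbook = await xlsxPopulate.fromBlankAsync();
  workbook.sheet(0).name(fileName);
  workbook.sheet(0).cell('A1').value([headers, ...data]); // header row followed by the data rows
  workbook.sheet(0).row(1).style({ bold: true, italic: true });
  await workbook.toFileAsync(sheetPath);
  return sheetPath;
}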

Serverless: Lambda starts working only after a change from console/ Lambda intermittent behavior

I am using the Serverless Framework for Lambda deployment and I am facing a really weird problem. I use Sequelize to connect to an RDS Aurora MySQL DB. The deployment is successful, but when I invoke the APIs I see SequelizeConnectionError: connect ETIMEDOUT. The APIs work fine when I run offline, but the deployed APIs don't. They start working as soon as I make any small change in the console and save it, like changing the timeout from 30 to 31. But when I redeploy I face the same problem, and I just can't figure out what it is.
Error:
SequelizeConnectionError: connect ETIMEDOUT
Edits:
Yes, this is Aurora Serverless with the Data API enabled. The Lambda function runs in the same VPC as the DB. The security group and subnets are also the same. Here is my DB config snippet:
DB_PORT: 3306
DB_POOL_MAX: 10
DB_POOL_MIN: 2
DB_POOL_ACQUIRE: 30000
DB_POOL_IDLE: 10000
This is my db.js:
const sequelize = new Sequelize(process.env.DB_NAME, process.env.DB_USER, process.env.DB_PASSWORD, {
  host: process.env.DB_HOST,
  port: process.env.DB_PORT,
  dialect: 'mysql',
  pool: {
    max: process.env.DB_POOL_MAX,
    min: process.env.DB_POOL_MIN,
    acquire: process.env.DB_POOL_ACQUIRE,
    idle: process.env.DB_POOL_IDLE
  }
});
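One detail worth checking, offered as an observation rather than a confirmed cause of the timeouts: everything read from process.env is a string, while Sequelize documents the pool options (max, min, acquire, idle) and port as numbers. A hedged variant coerces them explicitly:

// Hedged sketch: coerce the env vars (always strings) to numbers before handing them to Sequelize.
const Sequelize = require('sequelize');

const sequelize = new Sequelize(process.env.DB_NAME, process.env.DB_USER, process.env.DB_PASSWORD, {
  host: process.env.DB_HOST,
  port: Number(process.env.DB_PORT),
  dialect: 'mysql',
  pool: {
    max: Number(process.env.DB_POOL_MAX),
    min: Number(process.env.DB_POOL_MIN),
    acquire: Number(process.env.DB_POOL_ACQUIRE),
    idle: Number(process.env.DB_POOL_IDLE)
  }
});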
My handler file is really long with over 33 APIs. Below is one of them:
context.callbackWaitsForEmptyEventLoop = false;

if (event.source === 'serverless-plugin-warmup') {
  try {
    const { Datalogger } = await connectToDatabase();
  } catch (err) { }
  return {
    statusCode: 200,
    headers: { 'Access-Control-Allow-Origin': '*' },
    body: 'warm-up'
  };
}

try {
  const { Datalogger } = await connectToDatabase();
  let datalogger;
  if (!event.headers.dsn && !event.headers.sftp)
    throw new HTTPError(404, `Datalogger serial number and SFTP root directory is mandatory`);
  datalogger = await Datalogger.findOne({
    where: {
      dsn: event.headers.dsn,
      sftpRootDir: event.headers.sftp,
    }
  });
  if (!datalogger)
    throw new HTTPError(404, `Datalogger with this input was not found`);
  console.log('datalogger', datalogger);
  await datalogger.destroy();
  return {
    statusCode: 200,
    headers: { 'Access-Control-Allow-Origin': '*' },
    body: JSON.stringify(datalogger)
  };
} catch (err) {
  console.log('Error in destroy: ', JSON.stringify(err));
  return {
    statusCode: err.statusCode || 500,
    headers: { 'Content-Type': 'text/plain', 'Access-Control-Allow-Origin': '*' },
    body: err.message || 'Could not destroy the datalogger.'
  };
}
I reached out to AWS support for this issue, and they suspect the error occurred due to a Node.js version upgrade. I was previously using 8 and it was working. After upgrading to 10 it results in intermittent timeouts: a few calls are successful and then one fails. Now even if I go back to version 8, it's the same issue.
