How to get past errors using putParameter with aws-sdk for nodejs in Lambda?

I'm trying to set a parameter using putParameter in the AWS SDK for JavaScript in Node.js. In particular, I'd like to take advantage of the "Advanced" Tier, with an Expiration policy and Tags if possible. When I execute my code, I keep getting errors like:
There were 2 validation errors:
* UnexpectedParameter: Unexpected key 'Policies' found in params
* UnexpectedParameter: Unexpected key 'Tier' found in params
I suspected the issue was around the aws-sdk version I was using, so I've tried running the code locally using SAM local, and from Lambda functions using the nodejs8.10 and nodejs10.x environments. The errors do not go away.
const AWS = require('aws-sdk');
AWS.config.update({region: 'us-east-1'});
const ssm = new AWS.SSM({apiVersion: '2014-11-06'});

exports.lambdaHandler = async () => {
  const tokenExpiration = new Date();
  tokenExpiration.setSeconds(tokenExpiration.getSeconds() + 60);

  await ssm.putParameter({
    Name: 'SECRET_TOKEN',
    Type: 'SecureString',
    Value: '12345',
    Policies: JSON.stringify([
      {
        "Type": "Expiration",
        "Version": "1.0",
        "Attributes": {
          "Timestamp": tokenExpiration.toISOString()
        }
      }
    ]),
    Overwrite: true,
    Tier: 'Advanced'
  }).promise();
};
I would expect this code to work and set a parameter with the expiration. However, it appears that the SDK doesn't recognize the "Policies" and "Tier" parameters, even though they are available according to the documentation. I don't know whether it's a matter of waiting for a newer AWS SDK for JavaScript, but the runtimes page suggests that nodejs10.x is running AWS SDK for JavaScript 2.437.0.
It might be helpful to know that the code runs correctly without the parameters in question (i.e., with just the "Name", "Type", and "Value" parameters).

Unfortunately, Tier and Policies weren't added to the SDK until v2.442.0 (see diff).
This means that to use these features you'll have to deploy your function with the version of the aws-sdk you're developing against.
It should be noted that either developing/testing against the built-in version, or deploying with the aws-sdk version you actually use, is often cited as good practice. If you're deploying your own version, you can use explicit client imports (e.g. const SSM = require('aws-sdk/clients/ssm')) to keep the deployment size down. This is even more effective if you develop against the preview AWS SDK version 3.
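A minimal sketch of that approach, assuming an aws-sdk of at least 2.442.0 is packaged alongside the function (the parameter values are the ones from the question and purely illustrative):

// Explicit client import: bundles only the SSM client rather than the whole aws-sdk.
const SSM = require('aws-sdk/clients/ssm');
const ssm = new SSM({apiVersion: '2014-11-06', region: 'us-east-1'});

exports.lambdaHandler = async () => {
  const tokenExpiration = new Date();
  tokenExpiration.setSeconds(tokenExpiration.getSeconds() + 60);

  // Tier and Policies are only recognized once the bundled aws-sdk is >= 2.442.0.
  await ssm.putParameter({
    Name: 'SECRET_TOKEN',
    Type: 'SecureString',
    Value: '12345',
    Overwrite: true,
    Tier: 'Advanced',
    Policies: JSON.stringify([
      {Type: 'Expiration', Version: '1.0', Attributes: {Timestamp: tokenExpiration.toISOString()}}
    ])
  }).promise();
};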

Related

Problems using airtable.js module in AWS lambda

I'm trying to access Airtable from an AWS lambda function.
First off, to test, I installed airtable.js (npm install -s airtable) in a local project, wrote a short test script, and executed with node test.js. All works fine.
Using exactly the same core test code, inside an appropriate node.js function wrapper, I've tried running the same test in an AWS lambda function, and I get an error in my CloudWatch logs:
Error: Cannot find module '/var/task/node_modules/abort-controller/dist/abort-controller'. Please verify that the package.json has a valid "main" entry
I've tried both zipping the npm packages up with the function code in a deployment package, and creating a Lambda layer from the airtable package. Both produce the same error. Note that the package is being picked up - with the layer approach, if I remove the layer itself, it can't find airtable at all. So this seems to be something specific to how the airtable package is trying to access abort-controller.
For what it's worth, here's the [redacted] test code I'm using in my Lambda function. (The return values are shaped for an API Gateway call, but that's not part of the issue; the same error occurs whether I test inside the Lambda console or call through the API.)
var AWS = require("aws-sdk");
AWS.config.update({ region: "eu-west-1", });
const Airtable = require('airtable');
const base = new Airtable({ apiKey: "xxxxx" }).base('yyyyy');
exports.handler = async(event) => {
try {
console.info('trying create');
let records = await base('Base1').create([{
"fields": {
"Status": "Scored",
"C1": "testing",
"C2": "airtable",
"C3": "api",
"C4": "from",
"C5": "node",
}
}, ]);
console.info('completed create');
let records_list = records.map(r => r.getId()).join(',');
console.info(records_list);
return ({statusCode: 200, body: JSON.stringify(records_list)});
} catch (e) {
console.error(e);
return ({statusCode: 401, body: JSON.stringify(e)});
}
}
I've built many lambdas previously, both with layers and with embedded packages, and so I've made most of the common mistakes - and I don't think I'm repeating them here. Is there anything special about airtable.js which means there's a different approach needed here?
Turns out the problem was in the zip of the deployment package - whether used as a layer or baked into the Lambda, the zip file seems to have been missing something. I was building it as part of my Terraform configuration/deployment, and what's perplexing is that it appears to be exactly the same structure and setup I've used successfully for over 20 functions and 5 layers in a different project, yet here it fails.
So the solution, for the moment at least, is to manually zip the layer package, upload it to S3, and then have Terraform pick it up from there.

How to create a Kubernetes deployment using the Node.js SDK

I am working on building a project using Node.js which will require me to have an application that can deploy to Kubernetes. The service I am working on will take some Kubernetes manifests, add some ENV variables into them, and then would deploy those resources.
I have some code that can create and destroy a namespace for me using the SDK's createNamespace and deleteNamespace. This part works how I want it to, i.e. without needing a Kubernetes YAML file. I would like to use the SDK for creating a deployment as well, but I can't seem to get it to work. I found a code example of createNamespacedDeployment, however using version 0.13.2 of the SDK I am unable to get it working. I get this error message when I run the example code I found:
k8sApi.createNamespacedDeployment is not a function
I have tried looking through the git repo for the SDK, though it is massive, and I've yet to find anything in it that would allow me to define a deployment in my Node.js code. The closest I have found is creating a pod, but that won't work for me; I need a deployment.
How can I create a deployment via Node.js and have it apply to my Kubernetes cluster?
Management of deployments is handled by the AppsV1Api class:
const k8s = require('@kubernetes/client-node');
const fs = require('fs');

const kc = new k8s.KubeConfig();
kc.loadFromDefault();

// CoreV1Api handles core objects (namespaces, pods, services, ...);
// Deployments belong to the apps/v1 group, which AppsV1Api covers.
const k8sApi = kc.makeApiClient(k8s.CoreV1Api);
const appsApi = kc.makeApiClient(k8s.AppsV1Api);

const deploymentYamlString = fs.readFileSync('./deployment.yaml', { encoding: 'utf8' });
const deployment = k8s.loadYaml(deploymentYamlString);

const res = await appsApi.createNamespacedDeployment('default', deployment);
Generally, you can find the relevant API class for managing a Kubernetes object from its apiVersion, e.g. Deployment -> apiVersion: apps/v1 -> AppsV1Api, CronJob -> apiVersion: batch/v1 -> BatchV1Api.
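Since the aim is to inject ENV variables rather than maintain YAML files, the deployment body can also be built as a plain object and passed straight to createNamespacedDeployment. A minimal sketch, where the name, image and env values are placeholders:

const k8s = require('@kubernetes/client-node');

const kc = new k8s.KubeConfig();
kc.loadFromDefault();
const appsApi = kc.makeApiClient(k8s.AppsV1Api);

// Deployment manifest expressed directly as a JS object; the env entries can be
// filled in from process.env (or any other source) before the API call.
const deployment = {
  apiVersion: 'apps/v1',
  kind: 'Deployment',
  metadata: { name: 'my-app' },                        // placeholder name
  spec: {
    replicas: 1,
    selector: { matchLabels: { app: 'my-app' } },
    template: {
      metadata: { labels: { app: 'my-app' } },
      spec: {
        containers: [{
          name: 'my-app',
          image: 'my-registry/my-app:latest',          // placeholder image
          env: [{ name: 'MY_ENV_VAR', value: process.env.MY_ENV_VAR }]
        }]
      }
    }
  }
};

const res = await appsApi.createNamespacedDeployment('default', deployment);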
You can use the @c6o/kubeclient kubernetes client. It's a little simpler:
import { Cluster } from '@c6o/kubeclient'

const cluster = new Cluster({}) // Assumes process.env.KUBECONFIG is set
const result = await cluster.upsert({ kind: 'Deployment', apiVersion: 'apps/v1', /* metadata, spec, ... */ })
if (result.error) ...
You can also use the fluent API if you have multiple steps:
await cluster
  .begin(`Provision Apps`)
  .upsertFile('../../k8s/marina.yaml', options)
  .upsertFile('../../k8s/store.yaml', options)
  .upsertFile('../../k8s/harbourmaster.yaml', options)
  .upsertFile('../../k8s/lifeboat.yaml', options)
  .upsertFile('../../k8s/navstation.yaml', options)
  .upsertFile('../../k8s/apps.yaml', options)
  .upsertFile('../../k8s/istio.yaml', options)
  .end()
We're working on the documentation but have lots of provisioners using this client here: https://github.com/c6o/provisioners

don't include all aws-sdk for node lambdas

I have a bunch of aws lambdas written in node 12. I've discovered that require("aws-sdk") takes ages - like 3 seconds all by itself. I also discovered that if I just want to hit dynamo, I can just load a tiny little bit of it, by going:
const DynamoDB = require('aws-sdk/clients/dynamodb')
which ends up being heaps faster. However, I now need to call a Lambda function, i.e. I'm doing
const aws = require('aws-sdk');
const lambda = new aws.Lambda();
but I couldn't find any way of requiring just Lambda - e.g. I'd like something like this to work:
const Lambda = require("aws-sdk/Lambda");
const lambda = new Lambda();
but it doesn't. Is there any way of just including lambda functionality, without the whole aws sdk?
You should check the JS AWS SDK v3, quoting from their repo:
The AWS SDK for JavaScript v3 gamma is a rewrite of V2 with some great new features. As with version 2, it enables you to easily work with Amazon Web Services, but has been written in TypeScript and adds several frequently requested features, like modularized packages.
With it you can do things like:
const { Lambda } = require("@aws-sdk/client-lambda");
As the version suggests, it's still a pre-release so depending on your requirements and use case you might want to hold off until it's more stable.
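As a rough sketch of where that leads (assuming the v3 modular client; the function name and region are placeholders), note that v3 operations return promises directly, with no .promise() call:

const { Lambda } = require("@aws-sdk/client-lambda");
const lambda = new Lambda({ region: "us-east-1" });      // placeholder region

exports.handler = async () => {
  // Only the Lambda client is bundled; v3 operations return a promise directly.
  const response = await lambda.invoke({
    FunctionName: "my-target-function",                  // placeholder name
    Payload: Buffer.from(JSON.stringify({ hello: "world" }))
  });
  return response.StatusCode;
};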
So - turns out the solution was quite simple. Andre's answer made me look a lot closer, and I came up with this:
const Lambda = require("aws-sdk/clients/lambda");
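For completeness, a minimal sketch of how that explicit v2 client import gets used (the function name and region are placeholders); unlike v3, the v2 client still needs .promise():

const Lambda = require("aws-sdk/clients/lambda");
const lambda = new Lambda({ region: "us-east-1" });      // placeholder region

exports.handler = async () => {
  // Invoke another function without pulling in the full aws-sdk.
  const result = await lambda.invoke({
    FunctionName: "my-target-function",                  // placeholder name
    Payload: JSON.stringify({ hello: "world" })
  }).promise();
  return JSON.parse(result.Payload);
};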

SageMaker NodeJS's SDK is not locking the API Version

I am running some code in AWS Lambda that dynamically creates SageMaker models.
I am locking Sagemaker's API version like so:
const sagemaker = new AWS.SageMaker({apiVersion: '2017-07-24'});
And here's the code to create the model:
await sagemaker.createModel({
  ExecutionRoleArn: 'xxxxxx',
  ModelName: sageMakerConfigId,
  Containers: [{
    Image: ecrUrl
  }]
}).promise()
This code runs just fine locally with aws-sdk on 2.418.0.
However, when this code is deployed to Lambda, it doesn't work due to some validation errors upon creating the model:
MissingRequiredParameter: Missing required key 'PrimaryContainer' in params
UnexpectedParameter: Unexpected key 'Containers' found in params
Is anyone aware of existing bugs in the aws-sdk for Node.js as provided by AWS in the Lambda runtime? I believed the SDK available inside AWS Lambda was more up-to-date than 2.418.0, but apparently there are compatibility issues.
As you've noticed, the 'embedded' Lambda version of the aws-sdk lags behind. It's actually on 2.290.0 (you can see the full details of the environment here: https://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html)
You can see here: https://github.com/aws/aws-sdk-js/blame/master/clients/sagemaker.d.ts that it is not until 2.366.0 that the params for this method included Containers and did not require PrimaryContainer.
As you've noted, the workaround is to deploy your lambda with the aws-sdk version that you're using. This is sometimes noted as a best practice, as it pins the aws-sdk on the functionality you've built and tested against.
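If you do bundle an aws-sdk of 2.366.0 or later with the function, an explicit client import (as mentioned in the putParameter answer above) keeps the package size down. A minimal sketch, with the role ARN, model name and image as placeholders:

// Explicit client import so only the SageMaker client is bundled,
// assuming an aws-sdk >= 2.366.0 is deployed with the function.
const SageMaker = require('aws-sdk/clients/sagemaker');
const sagemaker = new SageMaker({ apiVersion: '2017-07-24' });

exports.handler = async () => {
  await sagemaker.createModel({
    ExecutionRoleArn: 'arn:aws:iam::123456789012:role/placeholder-role',          // placeholder
    ModelName: 'placeholder-model-name',                                          // placeholder
    Containers: [{
      Image: '123456789012.dkr.ecr.us-east-1.amazonaws.com/placeholder:latest'    // placeholder
    }]
  }).promise();
};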

AWS Cognito ListUsers InvalidParameterException using AttributesToGet on custom attributes

Using the following AWS Lambda-based app client, I'm trying to list all users from my Cognito user pool.
let AWS = require('aws-sdk')
const COGNITO_CLIENT = new AWS.CognitoIdentityServiceProvider()

COGNITO_CLIENT.listUsers({
  UserPoolId: 'MyUserPoolId',
  AttributesToGet: ['default_attribute', 'custom:my_attribute']
}, callback)
Everything works fine when querying for all attributes by default (AttributesToGet: [] // or excluding this field altogether). However, when targeting custom attributes, an InvalidParameterException is raised. This is using the AWS SDK for Node.js.
Targeting default attributes is allowed, though:
AttributesToGet: ['email', 'name', /* other non-custom */]
Please remove "AttributesToGet" and try again.
Your code is correct. However, I am on the Cognito team and we don't support search on custom attributes at this point.
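In practice that means omitting AttributesToGet and, if you only need specific fields, picking them out of the response yourself. A minimal sketch (the pool ID and attribute name are the placeholders from the question):

const AWS = require('aws-sdk');
const cognito = new AWS.CognitoIdentityServiceProvider();

exports.handler = async () => {
  // Omit AttributesToGet so custom attributes come back with each user.
  const { Users } = await cognito.listUsers({ UserPoolId: 'MyUserPoolId' }).promise();

  // Pull the custom attribute out of each user's attribute list client-side.
  return Users.map(user => ({
    username: user.Username,
    myAttribute: (user.Attributes.find(a => a.Name === 'custom:my_attribute') || {}).Value
  }));
};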
