AWS Transcribe client does not provide an export named 'transcribeClient' - node.js

I'm trying to integrate AWS Transcribe into my Node.js application. AWS S3 and Polly work fine, but AWS Transcribe does not. I'm using the AWS example code.
When I try to start a transcription job with the AWS example code, I receive the following error: The requested module './libs/transcribeClient.js' does not provide an export named 'transcribeClient'
That was also the only file where I received the error that require is not defined. I wonder why this only happens with AWS Transcribe and not with the other services. I'm also able to start a transcription job via the AWS CLI.
This AWS Transcribe code does not work - transcribeClient.js:
const AWS_BUCKET_NAME="X"
const AWS_REGION="eu-central-1"
const AWS_ACCESS_KEY="XXX"
const AWS_SECRET_KEY="XXX"
// snippet-start:[transcribe.JavaScript.createclientv3]
const { TranscribeClient } = require('@aws-sdk/client-transcribe');
// Create an Amazon Transcribe service client object.
const transcribeClient = new TranscribeClient({ AWS_REGION, AWS_ACCESS_KEY, AWS_SECRET_KEY });
module.exports = { transcribeClient };
This AWS Polly code works - pollyClient.js:
const AWS_BUCKET_NAME="X"
const AWS_REGION="eu-central-1"
const AWS_ACCESS_KEY="XXX"
const AWS_SECRET_KEY="XXX"
// snippet-start:[polly.JavaScript.createclientv3]
const { PollyClient } = require("@aws-sdk/client-polly");
// Create an Amazon Polly service client object.
const pollyClient = new PollyClient({ AWS_REGION, AWS_ACCESS_KEY, AWS_SECRET_KEY});
module.exports = { pollyClient };
I'm looking forward to hearing from you! Thanks!

I solved it. Now it's working with my Node.js 12 environment.
package.json
I changed "type": "modules" to "type": "commonjs".
transcribeClient.js needs to look like this:
Here I changed export to module.exports.
const { TranscribeClient } = require("@aws-sdk/client-transcribe");
const transcribeClient = new TranscribeClient({ AWS_REGION, AWS_ACCESS_KEY, AWS_SECRET_KEY});
module.exports = { transcribeClient };
transcribe_create_job.js needs to look like this:
Here I changed the import statement to require.
const { StartTranscriptionJobCommand } = require("@aws-sdk/client-transcribe");
const { transcribeClient } = require("./libs/transcribeClient.js")
// Set the parameters
const params = {
  TranscriptionJobName: "test123",
  LanguageCode: "en-GB", // For example, 'en-US'
  MediaFormat: "webm", // For example, 'wav'
  Media: {
    MediaFileUri: "https://x.s3.eu-central-1.amazonaws.com/dlpasiddi.webm",
  },
};
const run = async () => {
  try {
    const data = await transcribeClient.send(
      new StartTranscriptionJobCommand(params)
    );
    console.log("Success - put", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
run();
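Alternatively, if you prefer to keep "type": "module" and stay with ES modules, the fix is to make the export/import syntax consistent instead. A minimal ESM sketch of transcribeClient.js (the region value is a placeholder, and credentials are assumed to come from the default provider chain, e.g. environment variables):
import { TranscribeClient } from "@aws-sdk/client-transcribe";
// Region is a placeholder; credentials are resolved from the default provider chain.
const REGION = "eu-central-1";
export const transcribeClient = new TranscribeClient({ region: REGION });
The consuming file would then use import { transcribeClient } from "./libs/transcribeClient.js" rather than require.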

Related

How to upload an image file from an s3 bucket to cloudinary (nodejs)

I have image files saved in an S3 bucket. I want to update the images using Cloudinary. What's the best way to get images out of S3 and into Cloudinary?
I can get a readable stream using the aws-sdk in Node.js:
// Create service client module using ES6 syntax.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
// Set the AWS Region.
const REGION = "eu-west-2";
// Create an Amazon S3 service client object.
const s3Client = new S3Client({ region: REGION });
// Set the parameters.
export const bucketParams = {
  Bucket: "mybucketname",
};
// Get an object from S3 bucket
export async function getS3Object(inputParams: { Key: string }) {
  try {
    const data = await s3Client.send(
      new GetObjectCommand({
        ...bucketParams,
        Key: `public/dalle/${inputParams.Key}`,
      })
    );
    return data; // data.Body is a readable stream
  } catch (err) {
    console.log("Error", err);
  }
}
Uploading to Cloudinary can be done by passing an image URL:
import { v2 } from "cloudinary";
const cloudinary = v2;
// Return "https" URLs by setting secure: true
cloudinary.config({
  secure: true,
  cloud_name: myCloudName,
  api_key: myApiKey,
  api_secret: myApiSecret,
});
export async function uploadImage(fileLocation: string) {
  const newUploadUrl = await cloudinary.uploader.upload(fileLocation, {});
  console.log({
    newUploadUrl,
  });
}
Is there a way for Cloudinary to accept a readable stream for upload? Or alternatively, is there a way to get a public image URL from S3? (Or is there a better way to do this entirely?)
Rather than downloading the image from S3, we can create a temporary URL to send to Cloudinary:
// Create service client module using ES6 syntax.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
// Set the AWS Region.
const REGION = "eu-west-2";
// Create an Amazon S3 service client object.
const s3Client = new S3Client({ region: REGION });
// Set the parameters.
export const bucketParams = {
  Bucket: myBucketName,
};
export async function getTempSignedUrl(inputParams: { Key: string }) {
  try {
    const command = new GetObjectCommand({
      ...bucketParams,
      Key: inputParams.Key,
    });
    const data = await getSignedUrl(s3Client, command, {});
    console.log({
      data,
    });
    return data; // Can be passed to cloudinary as per the upload function in the question
  } catch (err) {
    console.log("Error", err);
  }
}
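For example, the signed URL can be handed straight to the uploadImage function from the question. A rough usage sketch (copyToCloudinary is a hypothetical helper name, and error handling is omitted):
// Presign the S3 object, then let Cloudinary fetch it by URL.
export async function copyToCloudinary(key) {
  const signedUrl = await getTempSignedUrl({ Key: key });
  if (signedUrl) {
    await uploadImage(signedUrl); // uploadImage from the question
  }
}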
Source https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/s3-example-creating-buckets.html#s3-create-presigendurl
Please make sure there is either public access to the S3 URLs, or assign read permissions to your AWS user. For more information:
http://support.cloudinary.com/hc/en-us/articles/203276521-How-do-I-allow-Cloudinary-to-read-from-my-private-S3-bucket-

How to fetch aws ecs container ip address in nodejs project

I have deployed my API project (a Node.js project) to an AWS ECS container, and the project includes Swagger documentation. In Swagger I want to show the current host IP address that the API runs on, but I cannot find the right code to fetch it. Is there a solution for this? I have managed to implement it in a .NET Core API.
Thanks in advance.
You can make use of the ECS container agent metadata endpoint http://172.17.0.1:51678/v1/metadata from an ECS task to fetch details about the container instance. The details fetched can then be used to get the private/public IP address of the instance. Example:
import http from 'http';
import util from 'util';
import AWS from 'aws-sdk';
export const getIPAddresses = async () => {
  try {
    let options: any = {
      hostname: '172.17.0.1',
      port: 51678,
      path: '/v1/metadata',
      method: 'GET'
    }
    let containerInstanceDetails: any = await httpGet(options);
    containerInstanceDetails = JSON.parse(containerInstanceDetails);
    const cluster = containerInstanceDetails["Cluster"];
    const containerInstanceArn = containerInstanceDetails["ContainerInstanceArn"];
    const containerInstanceUUID = containerInstanceArn.split('/')[2];
    let params: any = {
      cluster: cluster,
      containerInstances: [containerInstanceUUID]
    }
    if (!AWS.config.region) {
      AWS.config.update({
        region: <your_aws_region>
      });
    }
    const ecs = new AWS.ECS({ 'region': <your_aws_region> });
    const ec2 = new AWS.EC2({ 'region': <your_aws_region> });
    const describeContainerInstancesAsync = util.promisify(ecs.describeContainerInstances).bind(ecs);
    const describeInstancesAsync = util.promisify(ec2.describeInstances).bind(ec2);
    let data = await describeContainerInstancesAsync(params);
    const ec2InstanceId = data.containerInstances[0].ec2InstanceId;
    params = {
      InstanceIds: [
        ec2InstanceId
      ]
    }
    data = await describeInstancesAsync(params);
    return [data.Reservations[0].Instances[0].PrivateIpAddress, data.Reservations[0].Instances[0].PublicIpAddress];
  }
  catch(err) {
    console.log(err);
  }
}
async function httpGet(options) {
  return new Promise((resolve, reject) => {
    http.get(options, response => {
      let body = '';
      response.setEncoding('utf8');
      // Accumulate chunks and resolve once the whole response has been received
      response.on('data', chunk => {
        body += chunk;
      });
      response.on('end', () => {
        resolve(body);
      });
    }).on('error', error => {
      reject(error.message);
    });
  });
}
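A minimal usage sketch of the function above, logging the resolved addresses (they could equally be wired into the Swagger host field):
// Call at startup; addresses is undefined if the lookup failed.
getIPAddresses().then((addresses) => {
  if (addresses) {
    const [privateIp, publicIp] = addresses;
    console.log({ privateIp, publicIp });
  }
});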

NodeJS Amazon AWS SDK S3 client stops working intermittently

I have a Node.js Express web server that serves files from AWS S3. Most of the time this exact code works correctly and serves files for a wide variety of applications with large numbers of requests in production. The Node.js web server runs across multiple nodes on a Docker Swarm cluster.
After about 2-3 weeks this stops working. There is no response from the S3Client GetObjectCommand, and no error is returned or anything. This starts working again only after restarting the Node.js Docker container.
I read the S3 SDK docs, which indicate that the SDK will retry automatically:
Each AWS SDK implements automatic retry logic.
Questions:
How can we make this code more resilient and not need a restart?
Is the error handling correct? I'm wondering why there is seemingly no response or error returned at all in this situation.
Is it necessary to configure the retry settings?
Node.js version: node:lts-alpine
Module: @aws-sdk/client-s3
Controllers
AWS Controller
const consoleLogger = require('../logger/logger.js').console;
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const config = {
  "credentials": {
    "accessKeyId": "example",
    "secretAccessKey": "example"
  },
  "endpoint": "example",
  "sslEnabled": true,
  "forcePathStyle": true
}
const s3client = new S3Client(config);
const awsCtrl = {};
awsCtrl.getObject = async (key) => {
  // Get object from Amazon S3 bucket
  let data;
  try {
    // Data is returned as a ReadableStream
    data = await s3client.send(new GetObjectCommand({ Bucket: "example", Key: key }));
    console.log("Success", data);
  } catch (e) {
    consoleLogger.error("AWS S3 error: ", e);
    const awsS3Error = {
      name: e.name || null,
      status: (e.$metadata && e.$metadata.httpStatusCode) || 500
    };
    throw awsS3Error;
  }
  return data;
}
module.exports = awsCtrl;
Files Controller
const queryString = require('query-string');
const consoleLogger = require('../logger/logger.js').console;
const httpCtrl = require('./http.ctrl');
const jwtCtrl = require('./jwt.ctrl');
const awsCtrl = require('./aws.ctrl');
const filesCtrl = {};
filesCtrl.deliverFile = async (req, res) => {
  /* Get object from AWS S3 */
  let fileObjectStream;
  try {
    fileObjectStream = await awsCtrl.getObject(filePath);
  } catch (e) {
    consoleLogger.error(`Unable to get object from AWS S3`, e);
    if (e.status && e.status === 404) {
      result.error = `Not found`;
      result.status = 404;
      return res.status(result.status).json(result);
    }
    return res.status(e.status || 500).json(result);
  }
  const filename = lookupResponse.data.filename;
  // Set response header: Content-Disposition
  res.attachment(filename);
  // API response object stream download to client
  return fileObjectStream.Body.pipe(res);
}
API
const express = require('express');
const router = express.Router();
const filesCtrl = require('../../controllers/files.ctrl');
const filesValidation = require('../validation/files');
router.get('/:fileId', [filesValidation.getFile], (req, res, next) => {
  return filesCtrl.deliverFile(req, res);
});
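One way to make the retry and timeout behaviour explicit (relevant to the third question above) is to configure it on the S3 client itself, so a hung socket eventually surfaces as an error instead of silence. A minimal sketch with assumed values; in newer SDK releases the handler package is @smithy/node-http-handler and socketTimeout is named requestTimeout:
const { S3Client } = require('@aws-sdk/client-s3');
const { NodeHttpHandler } = require('@aws-sdk/node-http-handler');

const s3client = new S3Client({
  maxAttempts: 3, // number of attempts per request (the SDK default is also 3)
  requestHandler: new NodeHttpHandler({
    connectionTimeout: 5000, // ms allowed to establish the connection (assumed value)
    socketTimeout: 30000     // ms of socket inactivity before the request is aborted (assumed value)
  })
});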

How to get env variable and save to parameter Store?

I'm using the Serverless Framework (AWS).
When CloudFormation creates the database, it writes the endpoint of the database to an environment variable.
Part of my serverless.yml file:
environment:
  DATABASE_HOST:
    "Fn::GetAtt": [ServerlessRDS, Endpoint.Address]
This variable is available at the Lambda level, as it is already deployed on AWS. But I want to have access to this variable locally. I came across the idea of writing this variable to the Parameter Store (AWS Systems Manager).
So I attached a script to my serverless.yml file (using serverless-scriptable-plugin).
The scriptHooks part of my serverless.yml file:
scriptHooks:
  after:aws:deploy:finalize:cleanup:
    - scripts/update-dbEndopint.js
Here's the script. Nothing special: it writes the environment variable process.env.DATABASE_HOST to the Parameter Store.
const aws = require('aws-sdk');
const ssm = new aws.SSM();

(async () => {
  try {
    const params = {
      Name: `${process.env.AWS_SERVICE_NAME}-DATABASE_HOST-${process.env.AWS_STAGE}`,
      Value: `${process.env.DATABASE_HOST}`,
      Type: 'String',
      Overwrite: true,
    };
    await ssm.putParameter(params).promise();
    console.log(`[DATABASE_HOST]: ${process.env.DATABASE_HOST} `);
    console.log('Task done.');
  } catch (e) {
    throw e;
  }
})();
But after deploying, the variable is undefined.
This is because the variable's value only becomes available later.
Do you know how I can get the database endpoint into the Parameter Store?
Your serverless.yml will set the environment variable for the function, but not for the process.env of the scripts run by serverless-scriptable-plugin.
You'll need to save it as an output of your stack using something similar to this:
Resources:
  ServerlessRDS:
    ....
Outputs:
  ServerlessRDSEndpointAddress:
    Value:
      "Fn::GetAtt": [ServerlessRDS, Endpoint.Address]
Then, in your script, extract that value from the stack with something like this:
const fs = require('fs');
const yaml = require('js-yaml');
const aws = require('aws-sdk');
const ssm = new aws.SSM();

const getStackName = (stage) => {
  const content = fs.readFileSync('serverless.yml');
  return `${yaml.safeLoad(content).service}-${stage}`;
};

const getStackOutputs = async (provider, stackName, stage, region) => {
  const result = await provider.request(
    'CloudFormation',
    'describeStacks',
    { StackName: stackName },
    stage,
    region,
  );
  const outputsArray = result.Stacks[0].Outputs;
  let outputs = {};
  for (let i = 0; i < outputsArray.length; i++) {
    outputs[outputsArray[i].OutputKey] = outputsArray[i].OutputValue;
  }
  return outputs;
};

(async () => {
  try {
    const provider = serverless.getProvider('aws');
    const { stage, region } = options;
    const { ServerlessRDSEndpointAddress } = await getStackOutputs(provider, getStackName(stage), stage, region);
    const params = {
      Name: `${process.env.AWS_SERVICE_NAME}-DATABASE_HOST-${process.env.AWS_STAGE}`,
      Value: `${ServerlessRDSEndpointAddress}`,
      Type: 'String',
      Overwrite: true,
    };
    await ssm.putParameter(params).promise();
    console.log(`[DATABASE_HOST]: ${ServerlessRDSEndpointAddress} `);
    console.log('Task done.');
  } catch (e) {
    throw e;
  }
})();
I'm not sure how saving the value in the Parameter Store will allow you to access it locally, though.
If you want to invoke the function locally you can use:
serverless invoke local -f functionName -e DATABASE_HOST=<DATABASE_HOST>
Or use dotenv for any other JavaScript code:
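A minimal dotenv sketch, assuming DATABASE_HOST has been copied into a local .env file:
// Loads key=value pairs from .env into process.env before anything else reads them.
require('dotenv').config();
console.log(process.env.DATABASE_HOST);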

NodeJs Microsoft BotFramework "BotFrameworkAdapter is not a constructor" error

I am trying to create a bot using the Microsoft Bot Framework, to run serverless on AWS Lambda.
But I get this error message: "BotFrameworkAdapter is not a constructor" from this Lambda code:
export async function main(event, context, callback) {
  var _status = null;
  var _body = null;
  var _respond = function (status, body) {
    callback(null, {
      statusCode: status || 200,
      body: body || ''
    });
  };
  var req = {
    body: JSON.parse(event.body),
    headers: event.headers
  };
  console.log(req);
  var res = {
    send: function (status, body) {
      _respond(status, body);
    },
    status: function (status) {
      _status = status;
    },
    write: function (body) {
      _body = body;
    },
    end: function () {
      _respond(_status, _body);
    }
  };
  //res.send(200,'{"Test": "Hallo"}');
  const path = require('path');
  // Import required bot services.
  // See https://aka.ms/bot-services to learn more about the different parts of a bot.
  const { BotFrameworkAdapter, MemoryStorage, ConversationState } = require('botbuilder');
  // Import required bot configuration.
  const { BotConfiguration } = require('botframework-config');
  // This bot's main dialog.
  const { MyBot } = require('./bot');
  // Read botFilePath and botFileSecret from .env file
  // Note: Ensure you have a .env file and include botFilePath and botFileSecret.
  //const ENV_FILE = path.join(__dirname, '.env');
  //const env = require('dotenv').config({path: ENV_FILE});
  // bot endpoint name as defined in .bot file
  // See https://aka.ms/about-bot-file to learn more about .bot file its use and bot configuration .
  const DEV_ENVIRONMENT = 'development';
  // Create adapter.
  // See https://aka.ms/about-bot-adapter to learn more about .bot file its use and bot configuration .
  const adapter = new BotFrameworkAdapter({
    appId: process.env.microsoftAppID,
    appPassword: process.env.microsoftAppPassword
  });
}
The first part of the code adapts the Lambda request and response format to work with the Bot Framework.
The rest of the code is mostly from the sample provided by Microsoft.
The environment variables are set correctly.
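For reference, once the adapter is constructed it would typically be wired to the shimmed req/res like this (a sketch, assuming MyBot extends ActivityHandler as in the Microsoft sample):
// Hand the incoming activity to the bot through the adapter.
const myBot = new MyBot();
await adapter.processActivity(req, res, async (turnContext) => {
  await myBot.run(turnContext);
});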
I created a new Serverless project with the Node.js template and now it is working.
