An Azure Function in JavaScript that connects to Oracle DB and runs on Docker - node.js

I have an Azure Function written with Visual Studio Code; it is a Node.js application in JavaScript.
The application connects to Oracle DB to run an Oracle script.
The application also runs in a Docker image.
I added the npm packages for the Oracle connection:
npm i express
npm i oracledb
Below are some key parts of my code.
Dockerfile
FROM mcr.microsoft.com/azure-functions/node:3.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true
COPY . /home/site/wwwroot
RUN cd /home/site/wwwroot && \
    npm install
index.js
module.exports = async function (context, req) {
    let responseMessage = "";
    let connection;
    try {
        const oracledb = require('oracledb');
        connection = await oracledb.getConnection({
            user: "xx",
            password: "xx",
            connectString: req.body
        });
        let query = 'select * from xx where rownum=1';
        // Declare result locally instead of leaking an implicit global
        let result = await connection.execute(query);
        responseMessage = result;
    } catch (err) {
        responseMessage = err.message;
    } finally {
        if (connection) {
            try {
                // Always close connections
                await connection.close();
            } catch (err) {
                responseMessage = err.message;
            }
        }
    }
    context.res = {
        body: responseMessage
    };
}
My project folder structure is shown in a screenshot (omitted here).
CASE 1: When I run the project with "func start", the application works properly and returns the result.
CASE 2: When I run it in my local Docker with the steps below, the HTTP response returns an error.
Run "docker build ."
Run "docker run -d -p 99:80 myimage"
It is listed in the "docker ps" list.
When I call the endpoint "http://localhost:99/api/HttpExample", I get an error:
DPI-1047: Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory". See https://oracle.github.io/node-oracledb/INSTALL.html for help
Node-oracledb installation instructions: https://oracle.github.io/node-oracledb/INSTALL.html
You must have 64-bit Oracle client libraries in LD_LIBRARY_PATH, or configured with ldconfig.
If you do not have Oracle Database on this computer, then install the Instant Client Basic or Basic Light package from
http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html
I searched the documentation but couldn't find a solution specifically for an Azure Functions project, because my Dockerfile has to be based on "FROM mcr.microsoft.com/azure-functions/node:3.0".
What should be the Dockerfile for this project?
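The DPI-1047 error means node-oracledb cannot find the Oracle Instant Client shared libraries inside the container, so they have to be installed in the image itself. A sketch of one way to do that, still based on the required Azure Functions image; the Instant Client permalink and the Debian package names are assumptions to verify against Oracle's install instructions:

```dockerfile
FROM mcr.microsoft.com/azure-functions/node:3.0

# Install the Oracle Instant Client Basic Light package so that
# node-oracledb can locate libclntsh.so at runtime (fixes DPI-1047).
# The download URL below is assumed from Oracle's permalink page - verify it.
RUN apt-get update && \
    apt-get install -y --no-install-recommends libaio1 unzip curl && \
    curl -o /tmp/instantclient.zip \
        https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip && \
    unzip /tmp/instantclient.zip -d /opt/oracle && \
    rm /tmp/instantclient.zip && \
    echo /opt/oracle/instantclient* > /etc/ld.so.conf.d/oracle-instantclient.conf && \
    ldconfig

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY . /home/site/wwwroot
RUN cd /home/site/wwwroot && \
    npm install
```

Because ldconfig registers the library directory, no LD_LIBRARY_PATH variable is needed; after rebuilding with "docker build ." and rerunning the container, the DPI-1047 error should no longer appear.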

Related

Can connect to mongo db via batch/powershell but not programmatically (nodejs)

Using the external mongo.exe I can connect to the databases of our environments via:
mongo.exe "mongodb://aaa.unix.abc:27018,bbb.unix.abc:27018,ccc.unix.abc:27018/mydb?replicaSet=myreplicaset" --authenticationMechanism=GSSAPI --authenticationDatabase=$external --username "user@NONPROD@ABC.COM" --password "password" --ssl --sslCAFile C:\mymongostuff\ca.pem
So I have no problem whatsoever connecting via batch and PowerShell scripts, but the problem comes when I try to connect from an application (whether Java or JavaScript) running on my local machine.
Below is the test script I'm trying to run (node v14.16.0, npm v6.14.11, mongodb npm library v4.13.0, on a Windows PC):
const { MongoClient } = require('mongodb');
const path = require('path');
const capem = path.join(__dirname, '.\\ca.pem');
async function main() {
    const uri = "mongodb://aaa.unix.abc:27018,bbb.unix.abc:27018,ccc.unix.abc:27018/mydb?replicaSet=myreplicaset";
    var mongoOpt = { sslValidate: true, sslCert: capem };
    const client = new MongoClient(uri, mongoOpt);
    try {
        await client.connect();
        await doSomething(client);
    }
Running the above will run for many seconds without doing anything before giving a MongoServerSelectionError:
Reason: TopologyDescription {
    type: 'ReplicaSetNoPrimary',
    servers...
My suspicion is that the URI is correct but that I somehow need to specify the --authenticationMechanism=GSSAPI --authenticationDatabase=$external --username "user@NONPROD@ABC.COM" --password "password" --ssl --sslCAFile C:\mymongostuff\ca.pem options outside the URI for it to be equivalent to my working batch/PowerShell scripts.

Is there a redis package for "Azure cache for Redis" in dart 2.9.2 (for flutter app) which takes hostname, port and its key in the connection string?

I want to connect to my Azure Cache for Redis in a Flutter app. So far I've tried two Dart packages for Redis: redis 1.3.0 and dartis 0.5.0.
Example:
import 'package:redis/redis.dart';
...
RedisConnection conn = new RedisConnection();
conn
    .connect('localhost', 6379)
    .then((Command command) {
        print("yo2");
        command.send_object(["SET", "key1", "value1"]).then((var response) {
            print(response);
        });
    });
Instead of "localhost" I put "SampleName.redis.cache.windows.net". This is the error I get:
E/flutter ( 4861): [ERROR:flutter/shell/common/shell.cc(209)] Dart Error: Unhandled exception:
E/flutter ( 4861): RedisError(NOAUTH Authentication required.)
(old package) This is the package starred on the Redis website, but it's incompatible with Dart versions >2.
Okay so I found the solution. Use your key in the following way:
For redis 1.3.0:
...
RedisConnection conn = new RedisConnection();
conn
    .connect('SampleName.redis.cache.windows.net', 6379)
    .then((Command command) {
        print("yo2");
        command.send_object([
            "AUTH",
            "<YourKey>"
        ]).then((var response) {
            print(response);
        });
        command.send_object(["SET", "key1", "value1"]);
    });
...
And for dartis 0.5.0:
...
final client = await redis.Client.connect(
'redis://SampleName.redis.cache.windows.net:6379');
// Runs some commands.
final commands = client.asCommands<String, String>();
await commands.auth("<YourKey>");
// SET key value
await commands.set('yo', 'yovalue');
...

How to setup Google Translate API for Node.js?

I want to use Google's Cloud Translation API in my Node.js application; however, I'm getting a The request is missing a valid API key. error.
I have followed the Quickstart guide provided by Google.
I have created a GCP project, downloaded the private key as a JSON file, and set up the environment variable in PowerShell (screenshot omitted).
After that I installed the library with
yarn add @google-cloud/translate
The code I'm running in my translate.js file comes from the Quickstart guide with additional try-catch blocks:
async function quickstart(
    projectId = process.env.PROJECT_ID // Project Id from JSON file
) {
    try {
        // Imports the Google Cloud client library
        const { Translate } = require('@google-cloud/translate');
        // Instantiates a client
        const translate = new Translate({ projectId });
        // The text to translate
        const text = 'Hello, world!';
        // The target language
        const target = 'ru';
        // Translates some text into Russian
        const [translation] = await translate.translate(text, target);
        console.log('Text:', text);
        console.log('Translation:', translation);
    } catch (error) {
        console.error(error);
    }
}
quickstart();
When I then run node translate.js, I'll get an Error:
{ Error: The request is missing a valid API key.
...
code: 403,
errors:
[ { message: 'The request is missing a valid API key.',
domain: 'global',
reason: 'forbidden' } ],
response: undefined,
message: 'The request is missing a valid API key.' }
I am on Windows 10, Node v10.13.0.
I believe you missed defining this environment variable before starting the Node service.
Replace [PATH] with the file path of the JSON file that contains your service account key, and [FILE_NAME] with the filename.
With PowerShell:
$env:GOOGLE_APPLICATION_CREDENTIALS="[PATH]"
For example:
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\username\Downloads\[FILE_NAME].json"

simple-git: fatal: unable to auto-detect email address

I'm trying to commit using the package "simple-git".
The idea is to authenticate using the token provided by the GitHub API, create the remote repository, create the .gitignore, and then set up the files.
Here is my code to run the script:
const run = async () => {
    try {
        // Retrieve & Set Authentication Token
        const token = await getGithubToken();
        github.githubAuth(token);
        // Create remote repository
        const url = await repo.createRemoteRepo();
        // Create .gitignore file
        await repo.createGitignore();
        // Setup local repository and push to remote
        const done = await repo.setupRepo(url);
        if (done) {
            console.log(chalk.green('All done!'));
        }
    } catch (err) {
        if (err) {
            switch (err.code) {
                case 401:
                    console.log(chalk.red('Couldn\'t log you in. Please provide correct credentials/token.'));
                    break;
                case 422:
                    console.log(chalk.red('There already exists a remote repository with the same name'));
                    break;
                default:
                    console.log(err);
            }
        }
    }
}
Everything is okay until the execution of repo.setupRepo(url).
Here is the code for setupRepo(url):
setupRepo: async (url) => {
    const status = new Spinner('Initializing local repository and pushing to remote...')
    status.start()
    try {
        await git
            .init()
            .add('.gitignore')
            .add('./*')
            .commit('initial commit')
            .addRemote('origin', url)
            .push('origin', 'master')
        return true
    }
    catch (err) {
        console.log(err)
    }
    finally {
        status.stop()
    }
}
I got this message:
Git#then is deprecated after version 1.72 and will be removed in version 2.x
Please switch to using Git#exec to run arbitrary functions as part of the
command chain.
/ Initializing local repository and pushing to remote...
*** Please tell me who you are.
Run
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
to set your account's default identity.
Omit --global to set the identity only in this repository.
fatal: unable to auto-detect email address (got 'Anes@DESKTOP-E9U575A.
(none)')
| Initializing local repository and pushing to remote...
When I open my GitHub account, I find that the repository has been created, but it's empty.
Can anyone help me?
I had this problem too. To solve it, see the code below: change ssh_url to clone_url.
createRemoteRepo: async () => {
    const github = gh.getInstance();
    const answers = await inquirer.askRepoDetails();
    const data = {
        name: answers.name,
        description: answers.description,
        private: (answers.visibility === 'private')
    };
    const status = new Spinner('Creating remote repository...');
    status.start();
    try {
        const response = await github.repos.create(data);
        // return response.data.ssh_url;
        return response.data.clone_url;
    } catch (err) {
        throw err;
    } finally {
        status.stop();
    }
}
For details of the response, see the GitHub "create repo" API reference.
Why? Because of the difference between SSH and HTTPS remote URLs; see "Which remote URL should I use?" in the GitHub docs.
Another solution, which I didn't try myself:
ssh-keyscan -t rsa github.com >> ~/.ssh/known_hosts

Bluemix Nodejs FileTransferStep, documentation

I am a newbie to Bluemix. I downloaded the client libraries, but I don't see API docs for JavaScript. Where do I find those? How do I go about calling the several JavaScript functions that are neither in the Node.js client libs nor documented online?
For the Workload Scheduler service call, you have to edit your package.json file to add a dependency on the iws-light module using an https link, as follows:
"dependencies": {
    "iws-light": "https://start.wa.ibmserviceengage.com/bluemix/iws-light.tgz"
}
Then open your shell, go to the root of your app, and run:
npm install
After this you can require the Workload Scheduler service in your application:
var ws = require("iws-light");
and create a connection to Bluemix:
//retrieve service URL from Bluemix VCAP_SERVICES...
var wsConn;
if (process.env.VCAP_SERVICES) {
    wsConn = ws.createConnection();
} else {
    //...or set it on your own (if you're working locally)
    var url = "your workload scheduler url";
    wsConn = ws.createConnection(url);
}
//retrieve cloud agent
var agentName;
wsConn.getCloudAgent(function (data) {
    agentName = data;
});
//set your timezone
wsConn.setTimezone({ timezone: "Europe/Rome" }, function (err, data) {
    if (err) {
        console.log(err);
    }
});
Now you're ready to use the lib, create a process, and add a FileTransferStep to it:
//create a process
var process = new ws.WAProcess("ProcessName", "This process transfer a file every day from local to remote server");
//supported operations are ws.steps.FileTransferStep.OperationDownload or ws.steps.FileTransferStep.OperationUpload
var operation = ws.steps.FileTransferStep.OperationUpload;
//create FileTransferStep
var ftStep = new ws.steps.FileTransferStep(agentName, operation);
//supported protocols are AUTO, FTP, FTPS, SSH, WINDOWS;
ftStep.setProtocol(ws.steps.FileTransferStep.ProtocolAuto);
//set local file
var local = {
path: "local file path",
user: "local username",
password: "local password"
};
ftStep.setLocalFile(local.path, local.user, local.password);
//set remote file
var remote = {
path: "remote file path",
user: "remote username",
password: "remote password",
server: "remote server"
};
ftStep.setRemoteFile(remote.server, remote.path, remote.user, remote.password);
//the binary mode flag: true if it uses FTP binary mode
var binaryMode = true;
//the passive mode flag: true if it uses FTP passive mode
var passiveMode = true;
//set timeout
var timeout = 5;
ftStep.setMode(binaryMode, passiveMode , timeout);
//add FileTransferStep to the process
process.addStep(ftStep);
//create a trigger
var trigger = new ws.TriggerFactory.everyDayAt(1, 7, 30);
//add Trigger to the process
process.addTrigger(trigger);
process.tasklibraryid = "your task library id";
//create and enable process
wsConn.createAndEnableProcess(process, function (err, data) {
    if (err) {
        console.log(err);
    } else {
        console.log("process created and enabled");
    }
});
The code above creates a process using a file transfer step from node.js code, however I'm not sure if this is what you actually need.
If you can explain the scenario you are trying to implement, I can be more precise about which is the best way to implement this scenario using Workload Scheduler service.
Regards,
Gabriele
