We have our Couchbase server set up with three EC2 instances: the first instance runs only the data service, the second runs the index service, and the third runs the query service.
The index and query servers were added to the data server through the Couchbase web console, which has an "Add Servers" option under "Server Nodes", as described in this article.
Now, for example, if I connect to a bucket on the server using the Node.js SDK and Ottoman and try to create a new user, it connects to the bucket but fails to save the document and exits with a "segmentation fault (core dumped)" error.
Please let us know whether we need to make any changes to the way the servers are set up, or how we should proceed with the example above so that we can create a user.
Software Versions:
Couchbase : 4.5
Couchbase Nodejs SDK : 2.2
Ottoman : 1.0.3
This function is run from AWS Lambda using Node.js v4.3.
The error I am getting is "Segmentation Fault(core dumped)".
Below is the AWS Lambda function that I have tried:
var couchbase = require('couchbase');
var ottoman = require('ottoman');
var config = require("./config");
var myCluster = new couchbase.Cluster(config.couchbase.server); // here I tried connecting to either the data / index / query server
ottoman.bucket = myCluster.openBucket(config.couchbase.bucket);
require('./models/users');
ottoman.ensureIndices(function(err) {
  if (err) {
    console.log('failed to create necessary indices', err);
    return;
  }
  console.log('ottoman indices are ready for use!');
});
var user = require('./models/users');
exports.handler = function(event, context) {
  user.computeHash(event.password, function(err, salt, hash) {
    if (err) {
      context.fail('Error in hash: ' + err);
    } else {
      user.createAndSave("userDetails details sent to the user creation function", function (error, done) {
        if (error) {
          context.fail(error.toString());
          return;
        }
        context.succeed({
          success: true,
          data: done
        });
      });
    }
  });
};
Running the function locally (using node-lambda) produces the same "Segmentation fault (core dumped)" error, and when it is uploaded to Lambda and tested it returns the following error:
{
"errorMessage": "Process exited before completing request"
}
Thanks in advance
This is a known issue related to the MDS scenario you are using (https://issues.couchbase.com/browse/JSCBC-316). It will be resolved in our next release at the beginning of August.
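Independent of the SDK fix, one thing worth tightening in the question's Lambda is that ottoman.ensureIndices runs asynchronously at module load, so the handler can fire before the indices exist. A minimal sketch of one way to sequence that, using plain promises (initOttoman and handler are illustrative names, not SDK APIs, and the timeout stands in for openBucket + ensureIndices):

```javascript
// Sketch of the "initialize once, await in the handler" Lambda pattern:
// capture module-scope setup as a promise, defer the handler body until it
// resolves. The setTimeout is a stand-in for the real index setup.
function initOttoman() {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve('indices ready'); }, 10);
  });
}

// Runs once per container, at module load time.
var ready = initOttoman();

function handler(event, callback) {
  // Defer the real work until initialization has finished.
  ready.then(function (status) {
    callback(null, { success: true, init: status });
  });
}
```

The promise is created once per container, so warm invocations pay no extra cost; only the first request waits for the indices.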
I'm new to GCP, Cloud Functions, and the Node.js ecosystem. Any pointers would be very helpful.
I want to write a GCP Cloud Function that does following:
Read contents of file (sample.txt) saved in Google Cloud Storage.
Copy it to local file system (or just console.log() it)
Run this code using functions-emulator locally for testing
Result: a 500 INTERNAL error with the message 'function crashed'. The function logs give the following message:
2019-01-21T20:24:45.647Z - info: User function triggered, starting execution
2019-01-21T20:24:46.066Z - info: Execution took 861 ms, finished with status: 'crash'
Below is my code, picked mostly from GCP NodeJS sample code and documentation.
exports.list_files = (req, res) => {
  const fs = require('fs');
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('curl-tests');
  bucket.setUserProject("cf-nodejs");
  const file = bucket.file('sample.txt'); // file has a couple of lines of text
  const localFilename = '/Users/<username>/sample_copy.txt';
  file.createReadStream()
    .on('error', function (err) { })
    .on('response', function (response) {
      // Server connected and responded with the specified status and headers.
    })
    .on('end', function () {
      // The file is fully downloaded.
    })
    .pipe(fs.createWriteStream(localFilename));
};
I run like this:
functions call list_files --trigger-http
ExecutionId: 4a722196-d94d-43c8-9151-498a9bb26997
Error: { error:
{ code: 500,
status: 'INTERNAL',
message: 'function crashed',
errors: [ 'socket hang up' ] } }
Eventually, I want to have certificates and keys saved in Storage buckets and use them to authenticate with a service outside of GCP. This is the bigger problem I'm trying to solve. But for now, focusing on resolving the crash.
Start your development and debugging on your desktop using node and not an emulator. Once you have your code working without warnings and errors, then start working with the emulator and then finally with Cloud Functions.
Let's take your code and fix parts of it.
bucket.setUserProject("cf-nodejs");
I doubt that your project is cf-nodejs. Enter the correct project ID.
const localFilename = '/Users/<username>/sample_copy.txt';
This won't work. You do not have the directory /Users/<username> in cloud functions. The only directory that you can write to is /tmp. For testing purposes change this line to:
const localFilename = '/tmp/sample_copy.txt';
You are not doing anything with errors:
.on('error', function (err) { })
Change this line to at least print something:
.on('error', function (err) { console.log(err); })
You will then be able to view the output in Google Cloud Console -> Stackdriver -> Logs. Stackdriver lets you select "Cloud Functions" -> "Your function name" so that you can see your debug output.
Last tip: wrap your code in a try/catch block and console.log the error message in the catch block. That way you will at least have a log entry when your program crashes in the cloud.
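That advice can be sketched as follows; safeHandler is an illustrative name, and the res.status(...).send(...) calls assume an Express-style response object like the one Cloud Functions HTTP triggers provide:

```javascript
// Wrap the whole handler body so any synchronous throw is logged and turned
// into an HTTP response instead of crashing the function.
function safeHandler(req, res) {
  try {
    if (!req.file) {
      throw new Error('missing file parameter'); // illustrative failure
    }
    res.status(200).send('ok');
  } catch (err) {
    console.log(err);                  // visible in Stackdriver logs
    res.status(500).send(String(err)); // always answer, so callers don't hang
  }
}
```

Sending a response in every branch also matters here: a handler that never calls res.send is what produces the "socket hang up" style of failure.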
I am on web3 version 1.0.0-beta.27, and I am running a private Ethereum blockchain for testing purposes. The blockchain is mining and has two users; now I would like to subscribe to events on the blockchain and perform some actions. The code is below:
var Web3 = require("web3");
var ether_port = 'http://localhost:8545'
var web3 = new Web3(new Web3.providers.HttpProvider(ether_port));
web3.eth.subscribe("pendingTransactions"
, function(err, result){
if (err){ console.log(err) }
else { console.log("result: ", result) }
});
I get something like:
Error: The current provider doesn't support subscriptions: HttpProvider
at Subscription.subscribe
In some sense this is not surprising, since when I inspect web3.eth.subscribe in the Node.js console I get:
{ [Function] call: undefined }
Even though the documentation for web3-1.0.0 states the function can be used: https://web3js.readthedocs.io/en/1.0/web3-eth-subscribe.html.
So is this just a matter of the documentation being out of sync with the actual implementation? Am I using it wrong?
If it is not implemented, what is the best way to listen for changes in the chain? For example, if I want a real-time update of a user's account balance? That is, aside from the naive implementation of a function that polls the chain every fraction of a second.
As the error suggests, pub/sub is not available over HTTP. However, you can use it over WS. So, the documentation you referenced is not 100% wrong, it just omits the provider portion of the code.
Try starting your nodes using web socket connections (geth --ws --wsport 8545 ..., assuming you're using geth), and change to a WebsocketProvider.
var Web3 = require("web3");
var ether_port = 'ws://localhost:8545'
var web3 = new Web3(new Web3.providers.WebsocketProvider(ether_port));
web3.eth.subscribe("pendingTransactions"
, function(err, result){
if (err){ console.log(err) }
else { console.log("result: ", result) }
});
See the 4th comment on this discussion ticket.
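If you do have to stay on an HttpProvider, the polling fallback the question calls naive can at least be kept small and cancellable. A sketch with a stand-in getter (in practice the getter would be something like web3.eth.getBalance(...); pollForChange is an illustrative name):

```javascript
// Call getValue every intervalMs and fire onChange only when the value
// differs from the last one seen; returns a function that stops the polling.
function pollForChange(getValue, onChange, intervalMs) {
  var last;
  var timer = setInterval(function () {
    // Promise.resolve lets getValue be either synchronous or promise-based.
    Promise.resolve(getValue()).then(function (value) {
      if (value !== last) {
        last = value;
        onChange(value);
      }
    });
  }, intervalMs);
  return function stop() { clearInterval(timer); };
}
```

The change detection here is a plain !== comparison, so for big-number balance objects you would compare their string forms instead.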
A better way is to enable it from an attached JS console, which you can open with geth attach 'ipc path' (in my case /home/dev/.ethereum/geth.ipc).
After that you are connected to the running geth node and can use the management APIs. Now you can run:
admin.startWS("localhost", 'port number')
and when you want to close the connection, use:
admin.stopWS()
Regards
Dev
I am creating an API where I need to connect to different databases on the fly using their credentials. I need functionality similar to MySQL Workbench's "Test Connection". Currently I have to deal with MySQL and MS SQL Server, and I have to check every permutation and combination of wrong credentials, i.e. if I pass correct credentials (correct host, username, password, and port) but the wrong connector, MSSQL instead of MySQL, an exception is thrown.
var db = {
  host: data.hostName,
  port: data.port,
  database: data.database,
  username: data.userName,
  password: data.password,
  connector: response[0].node_js_connector
};
var dataSource = new DataSource(db.connector, db);
dataSource.on('connected', function (er) {
  if (er) {
    console.log("reject");
    reject(er);
  }
  else {
    console.log("resolve");
    resolve('Work With Database');
  }
});
dataSource.on('error', function (er) {
  if (er) {
    console.log("reject1");
    reject(er);
  }
  else {
    console.log("reject1");
    reject('Not Connected to Database');
  }
});
I have also put the code in a try/catch block to handle the exception; however, I am not able to catch it. Currently, I am getting the following error:
throw new RangeError('Index out of range');
RangeError: Index out of range
at checkOffset (buffer.js:968:11)
at Buffer.readUInt8 (buffer.js:1006:5)
It would be a great help, if someone can assist me in solving this issue.
Thanks in advance.
I used the process.on() method to handle uncaughtException. It handles every exception thrown while the file executes, so the 'Index out of range' exception is caught there, and I reject my dataSource connection in the handler.
process.on('uncaughtException', function (err) {
  console.log('UNCAUGHT EXCEPTION - keeping process alive:', err);
  reject('Connection Failed');
});
I am trying to create a search for my MongoDB database. A good choice, I thought, was to use Elasticsearch, so I started a cluster on AWS Elasticsearch Service. Because this cluster is for development purposes, I have set the access policy to allow open access to the domain.
this.es_connection = new elasticsearch.Client("elasticsearch endpoint as given on the AWS ES domain page");
this.es_connection.ping(
  {
    requestTimeout: 30000,
    hello: 'elasticsearch'
  },
  function(error) {
    if (error) {
      console.error('elasticsearch cluster is down!' + JSON.stringify(error));
    } else {
      logger.info('All is well in elasticsearch');
    }
  }
);
To check the connection I am trying to ping the cluster using the elasticsearch package on npm, but I get a "No Living Connections" error. The Node server is running on localhost. When I visit the endpoint URL from my own browser, I get a success message.
How do I use the AWS ES service with mongoosastic? I keep getting the "No Living Connections" error. If AWS ES is a REST API, how can I use it with mongoosastic?
Make sure you specify your AWS credentials during connection. I'd recommend using this library https://www.npmjs.com/package/http-aws-es.
This worked for me:
var client = require('elasticsearch').Client({
  hosts: 'Your host',
  connectionClass: require('http-aws-es'),
  amazonES: {
    region: 'region',
    accessKey: 'key',
    secretKey: 'secretKey'
  }
});
Then also make sure you send well-formed queries, otherwise they will result in a 400 error.
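On the well-formed-queries point, the body of a search request is just a plain object; a sketch of building a minimal match query (the field and index names here are illustrative, not from the question):

```javascript
// Build the body for a basic full-text match query; field is the document
// field to search and text is the phrase to match against it.
function buildMatchQuery(field, text) {
  var match = {};
  match[field] = text;
  return { query: { match: match }, size: 10 };
}
```

It would then be passed as, e.g., client.search({ index: 'posts', body: buildMatchQuery('title', 'hello') }); a malformed body object is the usual source of the 400.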
Apologies for my English.
I have a Node.js script that has to send AMQP messages to a device using IoT Hub. I took this script from the Azure IoT GitHub repository; here is the sample.
Here is my script, based on that one:
console.log("creating the client");
var Client = require('azure-iothub').Client;
console.log("client has been created");
var Message = require('azure-iot-common').Message;
console.log("message has been created");
var connectionString = "HostName=id**.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=***";
console.log(connectionString);
var targetDevice = 'devicesergey';
var client = Client.fromConnectionString(connectionString);
client.open(function (err) {
  if (err) {
    console.error('Could not connect: ' + err.message);
  }
  else {
    console.log('Client connected');
    var data = JSON.stringify({ text : 'foo' });
    var message = new Message(data);
    console.log("json message is created");
    console.log('Sending message: ' + message.getData());
    client.send(targetDevice, message, printResultFor('send'));
    console.log("message has been sent");
  }
});
function printResultFor(op) {
  return function printResult(err, res) {
    if (err) {
      console.log(op + ' error: ' + err.toString());
    } else {
      console.log(op + ' status: ' + res.constructor.name);
    }
  };
}
That works fine locally, and I see messages in my device emulator. But when I put it into an Azure Mobile Services API and try to run it, I see this message in the logs:
An unhandled exception occurred. Error: One of your scripts caused the service to become unresponsive and the service was restarted. This is commonly caused by a script executing an infinite loop or a long, blocking operation. The service was restarted after the script continuously executed for longer than 5000 milliseconds. at process.Server._registerUncaughtExceptionListenerAndCreateHttpServer._onUncaughtException (D:\home\site\wwwroot\node_modules\azure-mobile-services\runtime\server.js:218:17) at process.EventEmitter.emit (events.js:126:20)
And sometimes I see this IIS error
I know that the problem occurs in this call: client.open(function ...).
I've even tried to leave only client.open() and send the messages outside of this function, but in that case I see "client is not connected".
I asked about this on GitHub, and they advised me to ask here. Maybe someone knows how to solve this issue (with the script or with Azure). I would be very grateful!
Thank you!
A Mobile Services custom API is a script that exposes the functionality of the express.js library; please see the section "Overview of custom APIs" of the official document "Work with a JavaScript backend mobile service".
I reproduced the issue successfully. I guess your script was not wrapped in the code below as the body block, and did not send a response to the client (such as a browser).
exports.get = function(request, response) {
  // The body block
  // ....
  response.send(200, "<response-body>");
};
For more details of Mobile Service Custom API, please see https://msdn.microsoft.com/library/azure/dn280974.aspx.
Update:
I changed your code as below.
And in order to facilitate the test, I changed the permission for the API as below; then I can access the API link https://<mobile-service-name>.azure-mobile.net/api/test with a browser.
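Beyond the exports.get wrapper, the key sequencing point is to send the HTTP response only after the asynchronous send reports back. A generic sketch of that pattern (handleGet is an illustrative name, sendMessage stands in for client.open plus client.send, and the response object mimics the Mobile Services response.send(status, body) API):

```javascript
// Defer response.send until the async operation reports back, so the runtime
// never sees a request that finished before the work did.
function handleGet(request, response, sendMessage) {
  sendMessage(function (err) {
    if (err) {
      response.send(500, 'Could not send: ' + err.message);
    } else {
      response.send(200, 'Message queued');
    }
  });
}
```

This also surfaces send failures to the caller instead of leaving them only in the server logs.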
I've just tried to execute my script on a new Azure Mobile Service, and it was unsuccessful.
I will write out my step-by-step actions; maybe you can see something wrong, because I'm not so good at Node.js.
Add a new Azure MS with new SQL Database
Add a new API "dev". Access: everyone for all operations. Here is the source code:
exports.get = function(request, response) {
  console.log("creating the client");
  var Client = require('azure-iothub').Client;
  console.log("client has been created");
  var Message = require('azure-iot-common').Message;
  console.log("message has been created");
  var connectionString = "HostName=i***.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey***";
  console.log(connectionString);
  var targetDevice = 'devicesergey';
  var client = Client.fromConnectionString(connectionString);
  client.open(function (err) {
    if (err) {
      console.error('Could not connect: ' + err.message);
    }
    else {
      console.log('Client connected');
      var data = JSON.stringify({ text : 'foo' });
      var message = new Message(data);
      console.log("json message is created");
      console.log('Sending message: ' + message.getData());
      client.send(targetDevice, message, printResultFor('send'));
      console.log("message has been sent");
    }
  });
  response.send(200, "Hello, world!");
};

function printResultFor(op) {
  return function printResult(err, res) {
    if (err) {
      console.log(op + ' error: ' + err.toString());
    } else {
      console.log(op + ' status: ' + res.constructor.name);
    }
  };
}
If I try to execute this, it fails with "no azure-iothub" and "no azure-iot-common", so I need to use git to add these npm modules.
I cloned the repository to my local directory using the git access for my Azure Mobile Service, https://id.scm.azure-mobile.net/id.git
Enter the "API" folder and add the npm modules:
Then I perform "Rescan", "Save changes", "Commit", and "Push".
After these actions I execute my script at "http://id**.mobile-services.net/api/dev" and either see nothing or see the error "500.1013" and these messages in the logs (the id varies):
An unhandled exception occurred. Error: One of your scripts caused the
service to become unresponsive and the service was restarted. This is
commonly caused by a script executing an infinite loop or a long,
blocking operation. The service was restarted after the script
continuously executed for longer than 5000 milliseconds. at
process.Server._registerUncaughtExceptionListenerAndCreateHttpServer._onUncaughtException
(D:\home\site\wwwroot\node_modules\azure-mobile-services\runtime\server.js:218:17)
at process.EventEmitter.emit (events.js:126:20)
I can't figure out what I'm doing wrong.
UPDATE:
I've tried to use the Kudu console for installing the npm modules, and it returns many errors. If I understood correctly, I need to update my Node.js and npm, but I don't know how to do this and I couldn't find a solution.
Here are the logs:
I don't have enough reputation, so I am not allowed to paste the logs.
I've tried to do these actions, but it doesn't help:
At the root of the repo, you'll find a .deployment file that has:
command = ..\ZumoDeploy.cmd
Change it to:
command = deploy.cmd
And create a deploy.cmd next to it containing:
set NPM_JS_PATH=%ProgramFiles(x86)%\npm\1.4.9\node_modules\npm\bin\npm-cli.js
..\ZumoDeploy.cmd
Commit both files and push.
I'm confused. How is this possible? Azure Mobile Services doesn't permit installing the azure-iothub npm module. What can I do about this issue?
UPDATE2:
Peter Pan - MSFT, you advised me to use the Kudu debug console to install the necessary npm modules, but when I try to do so I see errors.
I've raised this issue with the npm team on GitHub, and they say that the npm version Azure is using is completely unsupported.
https://github.com/npm/npm/issues/12210#event-615573997
UPDATE3 (04/12/2016)
I've solved this issue a different way: I created my own Node.js script that listens on a port, reads GET params (deviceId and message), and sends D2C messages.
Unfortunately, I still can't get past the Azure issue.
UPDATE4
Peter Pan gave me advice on how to use other versions of Node.js and npm, and now I've successfully installed the necessary npm modules. But now the Azure Mobile Service script APIs don't work; they show me {"code":404,"error":"Error: Not Found"} for any script I try to open in my browser.
Maybe I deleted something when I was trying all these steps.