How to use Tropo with the Node.js SocketStream framework?

In my application I'm using Node.js with the SocketStream framework. I need to use the Tropo module to send an SMS, receive an SMS, make a call, and answer an incoming call. I installed tropo-webapi via npm and added the Node code below on the server side. When I run it I get no output, and I don't know how to link this file on the Tropo website or how to receive the SMS and phone call.
exports.actions = function(req, res, ss) {
  return {
    sendMessage: function() {
      var tropoAPI = require('tropo-webapi');
      var tropo = new tropoAPI.TropoWebAPI();
      // "SMS" as the network argument makes this an outbound text message
      tropo.call("+18197924547", null, null, null, null, null, "SMS", null, null, null);
      tropo.say("HI How Are You!!");
    },
    makeCall: function() {
      var tropoAPI = require('tropo-webapi');
      var tropo = new tropoAPI.TropoWebAPI();
      // no network argument, so this places a voice call
      tropo.call("+18197924547");
      tropo.say("Hi,how ary yoo!");
    }
  };
};

You did not specify which file this snippet is from, but I get the feeling it is one under server/rpc/<yourfilename>, where the Remote Procedure Calls live.
!!! NOTE THE <yourfilename>
The methods you define there are generated for the client so that it can trigger them through the socket interface. Let me write a snippet; I think you will understand:
- let's write into this file => client/code/app/app.js
ss.rpc('<yourfilename>.sendMessage', 'here come the params, in this case a string');
To get a sense of what this ss object is, look into your client/code/app/entry.js; you will see it there.
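To make this concrete, here is a minimal sketch, assuming the question's snippet lives in server/rpc/tropo.js (the file name and the res() response are my assumptions, and TropoJSON is the serializer that tropo-webapi's samples use; note that Tropo only delivers SMS or calls once the generated JSON is served over HTTP from a URL registered in your Tropo application settings):

// server/rpc/tropo.js -- file name is illustrative
var tropoAPI = require('tropo-webapi');

exports.actions = function(req, res, ss) {
  return {
    sendMessage: function() {
      var tropo = new tropoAPI.TropoWebAPI();
      tropo.call("+18197924547", null, null, null, null, null, "SMS", null, null, null);
      tropo.say("HI How Are You!!");
      // send the generated Tropo JSON back to the caller
      res(tropoAPI.TropoJSON(tropo));
    }
  };
};

// client/code/app/app.js -- triggered over the socket
ss.rpc('tropo.sendMessage');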

Related

Using the response from bidirectional streaming callback in Node.js gRPC client

I have a gRPC server that is written in Go and runs on a Unix domain socket. I'm writing a Node.js client for it. Since grpc-node doesn't support Unix sockets, I had to use @grpc/grpc-js.
Now the issue I'm facing is that I need to perform some operation on the response I get back from the server. I currently have this and it works, but is there a cleaner/nicer way of accomplishing it?
const protoLoader = require('@grpc/proto-loader');
const grpc = require('@grpc/grpc-js');

let appEncryptionDef = protoLoader.loadSync(__dirname + '../../../../protos/appencryption.proto', {
  keepCase: true,
  defaults: true,
  oneofs: true
});
let appEncryptionProto = grpc.loadPackageDefinition(appEncryptionDef);
let client = new appEncryptionProto.asherah.apps.server.AppEncryption(`unix://${socket}`, grpc.credentials.createInsecure());

let call = client.session();
call.on('data', function (sessionResponse) {
  switch (sessionResponse.response) {
    case 'x':
      data = parse_server_response;
      call.write(data);
      call.end();
      break;
    case 'error_response':
      console.log('error received: ' + sessionResponse.error_response.message);
      break;
  }
});
Is there a better way of doing this? I've looked at the grpc-caller library, but using that with @grpc/grpc-js gives me a "Channel's second argument must be a ChannelCredentials" error.
Here's the client I've written so far: https://github.com/godaddy/asherah/blob/servicelayer_node/server/samples/clients/node/appencryption_client.js
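One common cleanup, not from the original thread: wrap the duplex call in a Promise so a single round trip reads top to bottom (roundTrip and the request argument are illustrative names, reusing the client.session() stream from above):

function roundTrip(call, request) {
  return new Promise(function (resolve, reject) {
    call.on('data', function (sessionResponse) {
      if (sessionResponse.response === 'error_response') {
        reject(new Error(sessionResponse.error_response.message));
      } else {
        resolve(sessionResponse);
      }
    });
    call.on('error', reject);
    call.write(request);
  });
}

// usage: roundTrip(client.session(), someRequest).then(function (resp) { /* ... */ });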

postUrl runs only once

I am calling postUrl in an action JavaScript file, following the Bixby sample at https://github.com/bixbydevelopers/capsule-samples-collection/tree/master/http-api-calls
var http = require('http')
var console = require('console')
var config = require('config')

module.exports.function = function adjustVolume (volume) {
  var o = {};
  var options = {
    passAsJson: true,
    returnHeaders: true,
    format: 'json'
  };
  var response = http.postUrl(config.get('remote.url') + '/api/gvm/control/volume/' + volume, o, options);
  return "ok";
}
By the way, postUrl reaches my remote service only once; all later postUrl calls never arrive at it, and I need to restart Bixby Developer Studio to get postUrl through to my remote service again.
With getUrl, there is no such symptom.
Did I miss some constraint on using postUrl?
Thanks in advance.
It looks like the Bixby platform caches the response from your remote server and keeps re-serving it to your capsule code. The solution is to set cacheTime to 0 in the options, which makes the Bixby platform call your remote server on every invocation. Substitute the following for your options above (adding cacheTime on its own line):
var options = {
  passAsJson: true,
  returnHeaders: true,
  format: 'json',
  cacheTime: 0 // <--- this is the new line to add
};
I found this out when I wrote a tutorial capsule using remote storage. I was using http.postUrl to access my remote server and had to update the options for the postUrl call at this place in the code, or it wouldn't call the remote server more than once. The solution was setting the cacheTime to 0, as I mentioned above.
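Putting the two snippets together, the asker's action with the cache disabled would look like this (endpoint path and config key taken unchanged from the question):

var http = require('http')
var config = require('config')

module.exports.function = function adjustVolume (volume) {
  var options = {
    passAsJson: true,
    returnHeaders: true,
    format: 'json',
    cacheTime: 0 // force a fresh HTTP request on every call
  };
  var response = http.postUrl(config.get('remote.url') + '/api/gvm/control/volume/' + volume, {}, options);
  return "ok";
}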

Connecting to Accumulo from NodeJS

I have been trying to connect to Accumulo from NodeJS through the Thrift proxy, but have been unsuccessful.
var thrift = require("thrift");
var AccumuloClient = require("./AccumuloProxy");

var transport = thrift.TFramedTransport;
var protocol = thrift.TBinaryProtocol;
var connection = thrift.createConnection("localhost", 42424, {
  transport: transport,
  protocol: protocol
});
var client = thrift.createClient(AccumuloClient, connection);
client.login("root", {'password': "password"});
When I try to log in, I get
org.apache.thrift.protocol.TProtocolException: Expected protocol id ffffff82 but got ffffff80
Is anyone able to help me out and give me an idea of what I'm doing wrong here?
UPDATE:
I modified the protocolFactory line in the proxy.properties file located in Accumulo and restarted the proxy:
protocolFactory=org.apache.thrift.protocol.TBinaryProtocol$Factory
I performed the same steps as above, but added a callback to the createClient call.
var login;
var client = thrift.createClient(AccumuloClient, connection,
  function(err, success) { login = success; });
This populates the login variable. I then try to use that login variable to execute other functions:
client.listTables(login, function(a) { console.log(a) })
results in
{ name: 'TApplicationException',
  type: 6,
  message: 'Internal error processing listTables' }
Trying to create a table
client.createTable(login, "testTable", 1, "", function(a) { console.log(a)})
results in
{ name: 'AccumuloSecurityException',
  msg: 'org.apache.accumulo.core.client.AccumuloSecurityException: Error SERIALIZATION_ERROR for user unknown - Unknown security exception' }
See answer below.
It turns out that the problem was in the handling of the response coming back from Accumulo. In the AccumuloProxy.js file, when the login result is received and read in AccumuloProxy_login_result.prototype.read, it sets the success value with this.success = input.readString().
The readString() function takes the Buffer and calls toString() on it with UTF-8 encoding, which caused characters to be decoded incorrectly.
I modified the AccumuloProxy_login_result.prototype.read function to set success with this.success = input.readBinary() so that a raw Buffer is returned. That Buffer can be passed into the other function calls and gets a correct result back from Accumulo instead of an exception.
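For illustration, the change amounts to one line inside the generated AccumuloProxy.js (surrounding code omitted):

// inside AccumuloProxy_login_result.prototype.read:
// before -- decodes the login token as UTF-8 and corrupts it:
this.success = input.readString();
// after -- returns the raw Buffer, which the other calls accept:
this.success = input.readBinary();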
This was put in as an issue with Thrift here and has apparently been fixed in the master branch.
It seems that Accumulo uses the compact protocol, not the binary protocol. It also seems that there is currently no compact protocol support available for Node.js.
Please have a look at this SO question as well. It deals with C#, but nevertheless it can be helpful. There are also some solutions out there utilizing RabbitMQ or other message brokers, see here.

Node.js server side connection to Socket.io

I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; the call to the frontend triggers a list update so that all clients receive the correct list data.
The problem is on the backend side: when I press the button, I perform an AJAX call, and that AJAX call runs the following code (some operations trimmed out):
Lists.findOne({_id: active_settings.active_id}, function(error, lists_result) {
  var song_list = new Array();
  for (i = 0; i < lists_result.songs.length; i++) {
    song_list.push(lists_result.songs[i].ref);
  }

  Song.find({
    '_id': {$in: song_list}
  }, function(error, songs) {
    // DO STUFF WITH THE SONGS
    // UPDATE SETTINGS (code trimmed)
    active_settings.save(function(error, updated_settings) {
      list = {
        settings: updated_settings,
      };

      var io = require('socket.io-client');
      var socket = io.connect(config.app_url);
      socket.on('connect', function () {
        socket.emit('update_list', {key: config.socket_key});
      });

      response.json({
        status: true,
        list: list
      });
      response.end();
    });
  });
});
However, response.end never seems to work and the call keeps hanging; furthermore, the list doesn't always get refreshed, so there is an issue with the socket.emit code. The socket connection also stays open, I assume because the response isn't ended?
I have never done this server-side before, so any help would be much appreciated (the active_settings etc. exist).
I see some issues that might or might not be causing your problems:
- list isn't properly scoped, since you don't prefix it with var; essentially, you're creating a global variable which might get overwritten when multiple requests are being handled;
- response.json() calls .end() itself; it doesn't hurt to call response.end() again yourself, but it's not necessary;
- since you're not closing the socket(.io) connection anywhere, it will probably stay open forever;
- it would be more appropriate not to set up a new socket.io connection for each request, but to connect once at app startup and re-use that connection, as in the sketch below.
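A minimal sketch of that last suggestion, assuming a small module that owns the single connection (file and variable names are illustrative):

// socket.js -- connected once at app startup
var io = require('socket.io-client');
var config = require('./config');

var socket = io.connect(config.app_url);
module.exports = socket;

// in the request handler, re-use it instead of reconnecting:
var socket = require('./socket');
socket.emit('update_list', {key: config.socket_key});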

Turn on Server-side encryption and Get Object Version in Amazon S3 with knox and nodejs

So far I've been able to successfully use node.js, express, and knox to add/update/delete/retrieve objects in Amazon S3. Trying to move things to the next level, I'm trying to figure out how to use knox (if it's possible) to do two things:
1) Set the object to use server-side encryption when adding/updating the object.
2) Get a particular version of an object or get a list of versions of the object.
I know this is an old question, but it is possible to upload a file with knox using server-side encryption by specifying a header:
client.putFile('test.txt', '/test.txt', {"x-amz-server-side-encryption": "AES256"}, function(err, res) {
  // Do something here
});
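The same header should also work with knox's lower-level put() for in-memory data; a minimal sketch, assuming a string body (not from the original answer):

var string = JSON.stringify({foo: "bar"});
var req = client.put('/test.json', {
  'Content-Length': Buffer.byteLength(string),
  'Content-Type': 'application/json',
  'x-amz-server-side-encryption': 'AES256'
});
req.on('response', function(res) {
  if (res.statusCode === 200) console.log('saved to %s', req.url);
});
req.end(string);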
Andy (who wrote AwsSum) here.
Using AwsSum, when you put an object, just set the 'ServerSideEncryption' to the value you want (currently S3 only supports 'AES256'). Easy! :)
e.g.
var body = ...; // a buffer, a string, a stream
var options = {
  BucketName : 'chilts',
  ObjectName : 'my-object.ext',
  ContentLength : Buffer.byteLength(body),
  Body : body,
  ServerSideEncryption : 'AES256'
};
s3.PutObject(options, function(err, data) {
  console.log("\nputting an object to pie-18 - expecting success");
  console.log(err, 'Error');
  console.log(data, 'Data');
});
