I am doing a school project to teach us the basics of a BitTorrent client. I decided to use node-torrent, found here: https://www.npmjs.com/package/node-torrent. It is made for Node, and I am using Angular 2.
I want to know whether it is possible to use this in an Angular 2 web application, and if not, how I would otherwise use it. I am pretty new to Angular. I have attempted to just follow the tutorial, which uses this bit of code:
var fs = require('fs');
var Client = require('node-torrent');

var client = new Client({logLevel: 'DEBUG'});
var torrent = client.addTorrent('a.torrent');

// when the torrent completes, move its files to another area
torrent.on('complete', function() {
    console.log('complete!');
    torrent.files.forEach(function(file) {
        var newPath = '/new/path/' + file.path;
        fs.rename(file.path, newPath, function(err) {
            if (err) throw err;
        });
        // while still seeding, make sure file.path points to the right place
        file.path = newPath;
    });
});
But the problem is that I have no idea where to implement this. Also, usually there is a way to add a package to my package.json in Angular 2 with an npm install, but there is no indication of how to do that here. I am assuming I have to download it?
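For what it's worth, I would normally expect to install and save it with the standard npm workflow (the --save flag is what writes the entry into package.json), roughly:
npm install node-torrent --save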
I have set up a simple Node.js app as a proof of concept, where I want peers on a local network to sync a database using gun.
I am new to gun, so I am not sure if I am doing this correctly, but here is my code:
var Gun = require('gun')
const address = require('network-address')
const hashToPort = require('hash-to-port')

// get username from arg, e.g. node index myname
const username = process.argv[2]

// create GUN server on its own port
var server = require('http').createServer().listen(hashToPort(username))
var gun = Gun({web: server})

// listen for input from the console and write it to the 'hello' record
process.stdin.on('data', (data) => {
    gun.get('hello').put({ word: data.toString(), user: username })
});

// log every update to the 'hello' record
gun.get('hello').on(function(data, key) {
    console.log(data.user + ' said: ' + data.word.toString())
})
The idea is that peers can drop out and reconnect and sync to the latest version of the database.
I run the app on 2 different local network machines and it works well. The database is syncing.
If I close one app, then update the database on the open app, and then restart the 2nd app, the 2nd app does not sync with the already open app.
Is there a way to sync with the updated db when a new peer connects?
I hope that all makes sense. Please suggest if this is the wrong way to go about it.
@CUGreen I'm glad that the local-area multicast sync is working!
If I understand your question right, it is that you want OLD data to be synced?
gun.get('hello').put(data) and .on(cb) update the same object. So technically, you are syncing the whole database, you're just always getting the latest state. (Unless there is some other bug? Please let me know).
What you probably want to do is .set(data) instead of .put(data); this will add a NEW record to a table on hello, which you can then query in full (old records and future live inserts) with gun.get('hello').map().on(cb).
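A rough sketch of what that change looks like in your client code (reusing the record name and field names from your snippet):
// write: append a NEW record under 'hello' instead of overwriting one object
process.stdin.on('data', (data) => {
    gun.get('hello').set({ word: data.toString(), user: username })
});

// read: .map() iterates every record under 'hello', so the callback fires
// for existing records when a peer (re)connects and for new ones as they arrive
gun.get('hello').map().on(function(data, key) {
    console.log(data.user + ' said: ' + data.word.toString())
})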
I do not know if this is relevant, but you may find https://gun.eco/docs/Graph-Guide a nice introduction to put/set, etc.
And of course, if you need any assistance, there is a super friendly and active community at the http://chat.gun.eco chat room!
If there is a bug, please report it at https://github.com/amark/gun/issues
I am working with a Node.js Express server which uses socket.io to communicate with an iOS client, and am having a little trouble trying to test how many clients can connect and exchange data at any one time.
My goal is to be able to run a script which connects to socket.io with thousands of different sessions, as well as send and receive data to understand our system's scale. Currently we are using a single dyno on Heroku but will likely be considering other options on AWS soon.
I have found code which should do what I am trying to do for earlier versions of socket.io, such as this, but have had issues since it seems v1.x has a very different handshake protocol. I tried out the socket.io-client package, but trying to connect multiple times only simulates the use of one session; I need to simulate many independent users.
I have been picking apart the socket.io-client code, but have only gotten so far as creating a connection - I am stuck on the sending data part. If anyone has any knowledge or could point to some written resources on how data is sent between a client and a socket.io server, it would help me out a lot.
Here's what I have so far:
var needle = require('needle'),
    WebSocket = require('ws'),
    BASE_URL = 'url-to-socket-host:5002';

var connectionNo = 0;

needle.get('http://' + BASE_URL + '/socket.io/?EIO=3&transport=polling&t=1416506501335-0', function (err, resp) {
    // parse the sid out of the handshake response
    var handshake = JSON.parse(resp.body.toString().substring(5, resp.body.toString().length));

    // use the sid to connect using websockets
    var url = 'ws://' + BASE_URL + '/socket.io/?EIO=3&transport=websocket&sid=' + handshake.sid;
    console.log(connectionNo + ' with sid: ' + handshake.sid);

    var socket = new WebSocket(url, void(0), {
        agent: false
    });

    socket.on('open', function () {
        console.log('Websocket connected: ' + connectionNo);
        // I don't understand how to send data to the server here,
        // from looking at the source code it should use some kind
        // of binary encoding, any ideas?
        socket.on('message', function (msg) {
            console.log(msg);
        });
    });
});
I will continue deconstructing the socket.io-client code, but if anyone has any clues or resources that may help, let me know. Thanks.
I ended up settling for the socket.io-client npm package, which has the ability to connect with a new session on every connection. I found an example benchmark in this issue.
There is no longer much need for me to manually connect to socket.io using pure WebSockets and HTTP, but thanks to Yannik for pointing out the parser in use. The spec of the inner workings of v1.x can be found here.
Thanks!
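For anyone landing here, this is roughly the shape of what I ended up with (a sketch only, assuming socket.io-client ~1.x; the 'ping'/'pong' event names and payload are made up, substitute your own):
var io = require('socket.io-client');

var BASE_URL = 'http://url-to-socket-host:5002';
var NUM_CLIENTS = 1000;

for (var i = 0; i < NUM_CLIENTS; i++) {
    (function (id) {
        // forceNew makes each call open its own connection/session
        // instead of reusing the multiplexed manager
        var socket = io(BASE_URL, { forceNew: true, transports: ['websocket'] });

        socket.on('connect', function () {
            console.log('client ' + id + ' connected');
            // push some data through a hypothetical event
            socket.emit('ping', { client: id, sentAt: Date.now() });
        });

        socket.on('pong', function (msg) {
            console.log('client ' + id + ' got reply', msg);
        });
    })(i);
}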
The problem may reside in the fact that you are not using socket.io in your client code. You have imported ('ws'), which is a different module, whose docs are here: https://www.npmjs.org/package/ws.
You probably want ws.send('something');. When you receive a message in ws, it also comes with an object with a property indicating whether it is binary data or not. If it is, you will need to concatenate the chunks incrementally. There is a canonical way to do this which you can find via Google, but it looks a little like this:
var message = '';
socketConnection.on('data', function (chunk) { message += chunk; });
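That said, since the server in question is socket.io 1.x, a raw ws client also has to speak the engine.io/socket.io framing. This is only a sketch of my understanding of that protocol (the packet prefixes and ping interval are assumptions, double-check them against the socket.io-protocol spec):
var WebSocket = require('ws');

// connect straight over websocket (no polling handshake first)
var socket = new WebSocket('ws://url-to-socket-host:5002/socket.io/?EIO=3&transport=websocket');

socket.on('open', function () {
    // "4" = engine.io message, "2" = socket.io EVENT,
    // followed by a JSON array of [eventName, ...args]
    socket.send('42["chat","hello from a raw websocket"]');

    // engine.io 3 expects the client to send ping frames ("2") periodically
    // so the server does not drop the connection
    setInterval(function () { socket.send('2'); }, 25000);
});

socket.on('message', function (data) {
    // the first frames will be the engine.io open packet ("0{...}")
    // and the socket.io connect packet ("40")
    console.log('frame:', data);
});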
I'm playing around with node.js and now.js. Everything works fine, but I would like to make a simple client that I can run from the command line (so without the browser).
http://nowjs.com/doc/example
In the example an HTML page gets served, and that page includes the now.js file, which creates the magic 'now' object. But on the command line there is no such thing.
For the server I have helloworld_server.js running.
And for the client (helloworld_client.js) I have:
// client.js
var nowjs = require("now");

// now I need to connect to the server (127.0.0.1:8080)
// so do I need a server object?
server = ????

var everyone = nowjs.initialize(server);
everyone.now.distributeMessage('hi!');
So how do I obtain the 'now' object?
OK, got it. Once you install now
npm install now
it creates a node_modules folder; inside, you see a folder for each installed package. Deeper, you find:
./node_modules/now/examples
and there is the nodeclient_example folder
./node_modules/now/examples/nodeclient_example
It's pretty clear from there, but for those curious, this is the magic you need:
var nowjs = require('../../lib/nodeclient/now.js');
var now = nowjs.nowInitialize('http://localhost:8080');
and there it is: the 'magic' now object.
Make sure you also install:
npm install socket.io-client
otherwise it didn't work for me!
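Putting it together, the command-line client ends up looking roughly like this (a sketch; I'm assuming the node client exposes the same ready() hook as the browser client, and distributeMessage is the server-side function from the question):
// helloworld_client.js (run from the project root, so the path differs from the example above)
var nowjs = require('./node_modules/now/lib/nodeclient/now.js');

// connect to the now.js server on localhost:8080
var now = nowjs.nowInitialize('http://localhost:8080');

// wait until the shared 'now' namespace has synced before calling remote functions
now.ready(function () {
    now.distributeMessage('hi from the command line!');
});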
Note: Please read the edited portion of this post before answering; it might save you time, and it answers one of my questions.
The problem I'm having is pretty simple, but I'm new to this overall and I'm having trouble figuring out how to implement a MongoDB database connection properly in a Node/Express app.
I'm using express 3.x and am basing my app layout around this project supplied by the author of Express:
https://github.com/expressjs/express/tree/d8caf209e38a214cb90b11ed59fd15b717b3f9bc/examples/blog (now removed from repo)
I have no interest in making a blog; however, the way the app is structured appears to be quite nice. The routes are separated and everything is organized nicely.
My problem is that I might have 5-6 different route JS files, and each route file might have anywhere between 1 and 15 routes; of those routes, 1 or 15 might want to access the db.
So it seems like a really terrible idea to do a db.open(...) every single time I want to query the db. I should mention at this point that I'm using the native MongoDB driver (npm install mongodb).
I would also need to include a file like this:
http://pastebin.com/VzFsPyax
...in all of those route files and all of my model files. Then I'm also dealing with dozens upon dozens of open connections.
Is there a way I can structure my app so that I only make one connection and it stays open for the duration of the session (having a new one made on every request would be bad too)?
If so, how can I do this? If you know the answer, please post a code sample using tj's blog app structure (the one linked earlier in this post) as a base guide. Basically, have a way where the routes and models can use the db freely while living in separate files from the db open code.
Thanks.
EDIT
I made some progress on solving one of my issues. If you look at tj's blog example he initializes his routes in the app.js like so:
require('./routes/site')(app);
require('./routes/post')(app);
And in the routes js file it starts like this:
module.exports = function(app){
I stumbled on a project earlier today where I saw someone pass 2 variables in the module.exports call -> function(app, db). I then figured, wow, could it be that easy; do I need to just adjust my routes to take (app, db) too? Yeah, it seems so.
So now part 1 of the problem is solved. I don't have to require a mongo.js file with the connection boilerplate in every route file. At the same time, it's flexible enough that I can pick and choose which route files get a db reference. This is standard and has no downside, right?
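To be concrete, here is the shape I mean (a sketch only; mongo.js, the 'posts' collection, and the route path are placeholders I made up):
// app.js
var express = require('express');
var app = express();

var db = require('./mongo'); // whatever module exports the opened db handle

// hand the same db reference to each route module that needs it
require('./routes/site')(app, db);
require('./routes/post')(app, db);

// routes/site.js
module.exports = function (app, db) {
    app.get('/posts', function (req, res) {
        db.collection('posts').find().toArray(function (err, posts) {
            if (err) return res.send(500);
            res.send(posts);
        });
    });
};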
Part 2 of the problem (the important one unfortunately) still exists though.
How can I avoid having to wrap every query I make in a db.open(...), and ideally make only one connection per session?
Another solution is to pass the database to the routes via the request, like this:
app.js
var db = openDatabase();
var app = express();
app.all('*', function(request, response, next)
{
    request.database = db;
    next();
});
app.get('/api/user/:id', Users.getByID);
users.js
var Users =
{
    getByID: function(request, response)
    {
        // query elided in the original; in practice you would send the
        // result from inside the findOne callback
        request.database.collection('users').findOne(...)
        response.send(user);
    }
};
module.exports = Users;
I made a very simple module, hub, for this case; it replaces the use of the global space.
In app.js you can create db connection once:
var hub = require('hub');
var mongodb = require('mongodb');

hub.db = new mongodb.Db('foobar', new mongodb.Server('10.0.2.15', 27017, {}), {native_parser: false});
And use it from any other file:
var hub = require('hub');
// hub.db now points to the same db connection
This method uses a feature of require: a module is only loaded the first time it is required, and all other calls get a reference to the already loaded instance.
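To illustrate the idea, a shared module like this can be as small as an empty exported object (a simplified sketch, not necessarily the exact contents of hub):
// hub.js
// every require of this module returns this exact same object,
// so anything you attach to it is visible everywhere it is required
module.exports = {};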
UPDATE
Here is what I mean:
In the main file, e.g. app.js, we create the Db connection, open it, and store it in hub:
app.js:
var hub = require('hub');
hub.mongodb = require('mongodb');
hub.mongodbClient = new hub.mongodb.Db('foobar', new hub.mongodb.Server('10.0.2.15', 27017, {}), {native_parser: false});
hub.mongodbClient.open(function(error) {
    console.log('opened');
});
Now in any other file (message.js, for example) we have access to the opened connection and can simply use it:
message.js:
var hub = require('hub');
var collection = new hub.mongodb.Collection(hub.mongodbClient, 'message');
module.exports.count = function(cb) {
    collection.count({}, function(err, count) {
        cb(err, count);
    });
};
Really silly. In the documentation it seems like db.open needs to be wrapped around whatever is using it, but in reality you can use it without a callback.
So the answer is to just do a db.open() in your database connection module, app.js file, or wherever you decide to set up your db server/connection.
As long as you pass a reference to the db into the files using it, you'll have access to an "opened" db connection ready to be queried.
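Roughly, the pattern looks like this (just a sketch, reusing the pre-2.x native driver API seen elsewhere in this thread; the db name, host, and port are placeholders):
// db.js - create and open the connection exactly once
var mongodb = require('mongodb');

var db = new mongodb.Db('myapp',
    new mongodb.Server('127.0.0.1', 27017, {}),
    {native_parser: false});

// open once at startup; anything holding this reference can query later
db.open(function (err) {
    if (err) throw err;
    console.log('mongodb connection open');
});

module.exports = db;

// app.js - every route module gets the same reference
// var db = require('./db');
// require('./routes/site')(app, db);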
I have just started using node.js. My major problem is the lack of documentation, but I'm getting through and I really like it.
Now I'm trying to use push-it, which sits on top of socket.io. The docs mention serving the static client JS file, but I don't know how to do that. I have already tried different paths. Socket.io works out of the box, but I can't figure out how to do it for push-it.
I installed push-it using npm.
Thanks for any tips,
Miguel
You can use connect or express to serve static files, exactly as the dnode docs suggest.
__dirname is the directory the current script file lives in; it's common to use __dirname + '/public' and place your files in there.
var connect = require('connect');
var server = connect.createServer();
server.use(connect.staticProvider(__dirname));

var dnode = require('dnode');
dnode(function (client) {
    this.cat = function (cb) {
        cb('meow');
    };
}).listen(server);

// the HTTP server still needs to be started on a port of your choosing
server.listen(8080);
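If you'd rather use express than bare connect, roughly the same thing looks like this (a sketch; the '/public' folder and the port are assumptions, and push-it/socket.io would still need to be attached to the same server):
var express = require('express');
var app = express();

// serve everything under ./public; copy the push-it client JS file in there
app.use(express.static(__dirname + '/public'));

app.listen(8080);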