Watch for file changes on remote server - node.js

I want to create a Node.js app to watch for changes on a remote server, e.g. //10.9.8.7/files_to_watch/*.
All I need is to watch whether new files/folders appear. I have found an npm package named chokidar and I managed to watch local files, but I don't know how to watch a remote disk (which is password-protected).
How should I do that? Should I first connect to that server (with an FTP Node package, say) and then run the watcher? And what about performance? What if there are a lot of files and changes? Is the approach I'm thinking of a good one?
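One possible approach (a sketch, not something the question settles): mount the password-protected share on the machine that runs Node (e.g. via SMB/CIFS) and point chokidar at the mount point. Filesystem events usually don't propagate across network mounts, so polling has to be switched on; the mount path /mnt/files_to_watch below is an assumption.

// Minimal sketch: watch a mounted network share with chokidar.
// Assumes //10.9.8.7/files_to_watch is already mounted at /mnt/files_to_watch
// (e.g. with mount.cifs, which is also where the password is supplied).
const chokidar = require('chokidar');

const watcher = chokidar.watch('/mnt/files_to_watch', {
  usePolling: true,    // fs events rarely work over SMB/NFS, so poll instead
  interval: 5000,      // poll every 5 s; raise this if there are many files
  ignoreInitial: true, // don't fire 'add' for files that already exist
});

watcher
  .on('add', path => console.log(`new file: ${path}`))
  .on('addDir', path => console.log(`new folder: ${path}`));

Polling the whole tree is the main cost, so with many files a longer interval (or watching only the subfolders you care about) matters most. The alternative is to skip the mount and poll the server yourself over FTP/SFTP, diffing directory listings, but the mounted share lets you keep the chokidar code you already have.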

Related

How can I clone a git repository on a server automatically to another folder on every commit

I have a simple Express.js-powered API running on an Ubuntu server with pm2. The server.js and the other files are in /var/www/node/api/.
On this server, at /srv/git-repos/api.git/, is my git repository, to which I push new changes from my local machine.
The question is: is it possible that, every time I push new changes to the server, it recognizes this, clones my repository to, let's say, /var/www/node/api-dev/ (which would then be available at dev.example.com; this can be done with nginx, so that is no problem) and restarts my pm2 instance with pm2 restart api?
Then I could test whether my changes work on the server, and when they do, I can just copy the content of /var/www/node/api-dev to /var/www/node/api manually.
Or is there another, better workflow? It's just a small API on which I will make many changes because I want to develop it for the needs of my frontend.
Thanks for your answers and suggestions, I hope it's understandable what I want to achieve.
#!/bin/sh
# commit, deploy the current state to ../api-dev and restart the API
git commit
git clone <repo> ../api-dev   # note: clone fails if ../api-dev already exists
pm2 restart api
Save it as c.sh, make it executable and run it with ./c.sh, or add it to your .bashrc.
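If you want the deployment to happen automatically on every push rather than by running a script by hand, the usual git mechanism is a post-receive hook in the bare repository (/srv/git-repos/api.git/hooks/post-receive). The sketch below is just one way to write such a hook in Node; the paths and the pm2 process name are taken from the question, everything else is an assumption.

#!/usr/bin/env node
// /srv/git-repos/api.git/hooks/post-receive -- must be executable (chmod +x).
// Sketch: after every push, check the pushed code out into the dev folder
// and restart the pm2 process. /var/www/node/api-dev must already exist.
const { execSync } = require('child_process');

const gitDir = '/srv/git-repos/api.git';
const workTree = '/var/www/node/api-dev';

// --git-dir/--work-tree check the latest commit out into a plain folder,
// so there is no need to clone the repository again on every push.
execSync(`git --git-dir=${gitDir} --work-tree=${workTree} checkout -f`, { stdio: 'inherit' });
execSync('pm2 restart api', { stdio: 'inherit' });

With that in place, a plain git push from the local machine should update /var/www/node/api-dev and restart the api process without any manual step.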

App can find view on local but not once uploaded to server

So I have this app:
https://github.com/jfny/SocialGrowth.xyz
If you git clone it and run npm install/npm start locally, it should work fine.
However, when I upload the files to my server and try to run it, it tells me it is unable to look up one of the views, even though the files haven't changed at all.
Why is this?

How can I use (Node) Livereload on a development server in my network

Background: my PHP projects (CakePHP, WordPress) run on an Ubuntu server in my network. I access them through a development TLD (.dev, for example) set up through a local DNS server, and I edit the files through a Samba share.
I would like to utilize Livereload for my development, preferably have it running on the server itself. I have basic Node/Gulp knowledge, but haven't been able to get this running.
Livereload (or a middleware server) should proxy the 'real' URLs, making sure all websites run as they normally would, and Livereload should be available over the network (so not just on localhost, because it runs on the development server).
Desired result:
Livereload runs on my dev server (IP: 10.0.0.1), my project is called helloworld.dev, I browse to 10.0.0.1:3000 on my machine and see helloworld.dev proxied through Livereload. I now edit a CSS file over the Samba share and the CSS is reloaded without a refresh.
I've tried using a few npm packages (gulp-livereload, livereload, node-livereload) with the examples that come with them, but haven't been able to get the desired result. They all expect you to run them locally, don't support access to the Livereload URL over the network, cannot proxy the 'real' URLs, or require static content.
Can anyone provide an example or 'proof of concept' code of my wish, so I can see where to start?
I found the answer: http://nitoyon.github.io/livereloadx/
This does EXACTLY what I need.
I can run
livereloadx -y http://helloworld.dev -l
then open http://serverip:35729 and I'm ready to roll.
The -y option creates the proxy to the 'real' URL and the -l makes it serve files from local filesystem instead of through its proxy.

Pulling remote 'client uploaded assets' into a local website development environment

This is an issue we come up against again and again: how to get live website assets uploaded by a client into the local development environment. The options are:
Download all the things (can be a lengthy process and has to be repeated often)
Write some insane script that automates #1
Some uber clever thing which maps a local folder to a remote URL
I would really love to know how to achieve #3: some kind of alias/folder in my local environment which is ignored by Git but means that when testing changes locally I will see client-uploaded assets where they should be rather than broken images (and/or other things).
I do wonder if this might be possible using Panic Transmit and the 'Transmit disk' feature.
Update
OK, thus far I have managed to get a remote server folder mapped to my local machine as a drive (of sorts) using the 'Web Disk' option in cPanel and then the 'Connect to server' option in OS X.
However, although I can then browse the folder contents in a safe, read-only fashion on my local machine, when I alias that drive to a folder in /sites, Apache just prompts me to download the alias file rather than following it as it would a symlink... :/
KISS: I'd go with #2.
I usually put a small script like update_assets.sh in the project's folder, which uses rsync to download the files:
rsync --recursive --stats --progress -az -e ssh user@site.net:~/path/to/remote/files local/path
I wouldn't call that insane :) I prefer to have all the files locally so that I can work with them when I'm offline or on a slow connection.
rsync is quite fast, and you may also want to check out the --delete flag to delete local files when they have been removed from the remote.

db.* files in /home from Perforce?

I see several db.* files in my /home directory, and it seems they come from Perforce. For example, some of the files are db.archmap, db.bodtext, db.change, db.changex.
Are these files useful? Can I delete them? They are making my /home directory messy.
You have started a server using your home directory as the Perforce server's P4ROOT folder. Those files are generated when the server starts and cannot be deleted unless you want to hose your server installation. It's not clear to me how you've started the server instance, so I'll try to cover multiple bases with my answer.
If you want to start up the server under your own account, you should set the P4ROOT environment variable and point it to where you want the server to store its files. Alternatively, when you start the server, you can specify the root folder on the command line using the -r option:
p4d -r /home/mark/p4server
which would put the server's files into a directory called 'p4server' under my home directory.
Typically it is best to run the Perforce server as a user dedicated to running Perforce. I use a user called 'perforce' and set P4ROOT (and other variables) in that user's environment. If you cannot use a separate user, it might be easier to use the -r command line option that I mentioned above.
Those files are only server files, not client files. So it is safe to delete them, but if you start the server back up it will recreate them. So you might want to uninstall the server.
Unless you are running a beta version, they have p4sandbox coming soon (maybe in the beta, I forget) which MAY create those files. I don't have a beta version, so I can't verify what new files the client may or may not create.
You can check the Perforce documentation to see what these files do and are for.

Resources