How can I execute git commands (pull, push) on my home computer from a remote computer without installing software on the client? - node.js

Essentially, I'm trying to build a web server on my home desktop.
However, I find that I am out and about quite often, so it would be nice to be able to use my laptop to edit the code on GitHub, push it to the cloned repository on my home computer, and restart the server with the changed code.
I'm building a node.js server, so all I have to do to run it is type git pull and then node app.js into the bash terminal.
I'm basically wondering if it is possible to use a different computer to execute those commands remotely.
One constraint is that although my desktop can install any software required, my laptop cannot due to a lack of administrator permissions (it's a school-issued laptop).
Also, I run Windows 10 Home, so enabling RDP is nearly impossible, or laggy at best.
If possible, it would be nice to do the pushing to my desktop via the GitHub site. Is it possible?

It would be nice to do the pushing to my desktop via the GitHub site.
All you can do "via the GitHub site" is declare a webhook on the push event.
That webhook can send a push payload to any IP/URL you want.
If your desktop can expose a public IP (or has a secure tunnel to localhost) with a listener (such as an npm webhook listener), it can be notified and pull from the GitHub repo whenever there is a push.
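To give a concrete idea, here is a minimal sketch of such a listener running on the home desktop, assuming Node.js is installed there, port 3000 is reachable from the internet (or via a tunnel), and the clone lives at /home/user/myapp; the port, path and URL are placeholders, not details from the question:

// webhook-listener.js - minimal sketch, run on the desktop that hosts the clone
const http = require('http');
const { execSync } = require('child_process');

const REPO_DIR = '/home/user/myapp'; // placeholder path to the cloned repository

http.createServer((req, res) => {
  // GitHub POSTs a JSON payload to the webhook URL on every push
  if (req.method === 'POST' && req.url === '/webhook') {
    let body = '';
    req.on('data', chunk => { body += chunk; }); // collect the payload (not inspected in this sketch)
    req.on('end', () => {
      try {
        execSync('git pull', { cwd: REPO_DIR, stdio: 'inherit' });
        // restarting "node app.js" (e.g. with pm2 or another process manager) is left out here
        res.writeHead(200);
        res.end('pulled');
      } catch (err) {
        res.writeHead(500);
        res.end('pull failed');
      }
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);

In a real setup you would also validate the X-Hub-Signature-256 header against the webhook secret before running anything.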

Related

How does Galaxy Meteor hosting for Windows work?

I have a node.js application I have adopted from a more senior developer. I want to deploy it, and I know it will work because he already deployed it several times. I am reading these instructions:
https://galaxy-guide.meteor.com/deploy-quickstart.html
I use Windows, as did he.
How does deployment work?
Take these instructions:
Windows: If you are using Windows, the commands to deploy are slightly different. You need to set the environment variable first, then run the deployment command second (the syntax is the same as everything you'd put for meteor deploy).
In the case of US East, the commands would be:
$ SET DEPLOY_HOSTNAME=galaxy.meteor.com
$ meteor deploy [hostname] --settings path-to-settings.json
Am I just supposed to go to the source directory on my laptop and run these commands? What then happens? Is the source uploaded to their server from my laptop and then their magic takes care of the rest?
What about when I want to make a change to the code? Do I just do the same thing, pointing to an existing container and, again, they do the magic?
Am I just supposed to go to the source directory on my laptop and run these commands? What then happens? Is the source uploaded to their server from my laptop and then their magic takes care of the rest?
It is not magic. You basically go to your dev root and enter these commands. Under the hood it builds your app for production (including minification and prod flags for optimization) and, once complete, opens a connection to the AWS infrastructure and pushes the build bundle.
See: https://github.com/meteor/meteor/blob/devel/tools/meteor-services/deploy.js
On the server there will be some install and post-install scripts that set up the whole environment for you and, if there are no errors in the process, start your app.
These scripts have of course some automation, depending on your account settings and the commands you have entered.
What about when I want to make a change to the code? Do I just do the same thing, pointing to an existing container and, again, they do the magic?
You will have to rebuild (using the given deploy command) again but Galaxy will take care of the rest.
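If retyping the two Windows commands on every redeploy gets tedious, they can be wrapped in a small Node.js helper; the hostname and settings path below are placeholders (this is just a convenience sketch, not part of Galaxy's tooling):

// redeploy.js - convenience sketch; run with "node redeploy.js" from the project root
const { execSync } = require('child_process');

execSync('meteor deploy myapp.meteorapp.com --settings settings.json', {
  stdio: 'inherit',
  // same effect as running SET DEPLOY_HOSTNAME=galaxy.meteor.com first
  env: { ...process.env, DEPLOY_HOSTNAME: 'galaxy.meteor.com' },
});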

How to set up a development environment for React when IT won't allow you to install anything on your Windows workstation

I am working for a client that does not allow setting up anything on the native Windows workstation.
I am, however, allowed to set up a virtual machine on which I can install anything I want.
So, I've set up a Linux VM and installed the React environment.
However, I would like to be able to use the native Windows tools that are allowed for development, since installing and using them on the VM is painfully slow.
I'm currently modifying the code with a native Windows IDE, then pushing the changes to a Git repository, then pulling the changes down to the Linux VM to see them work. However, for debugging, where changes are added, removed, modified, etc... this is also painfully slow.
I tried to set up a shared folder to work on the code locally and having it update on the Linux VM dynamically, but that doesn't work because "npx create-react-app" does a bunch of things, like set up symlinks, that either don't work on a shared folder or aren't allowed by IT. I'm guessing it's the shared Windows folder that's limiting this. I also tried to set up a Samba share of the Linux folder, but I think this is blocked by IT, because I just can't see it from my Windows machine, and network discovery is turned on.
So, now that you know my pain, what would be the best way to set up a React development environment in this situation? Help...
I understand almost nothing about Linux and VMs, but here is something you can do.
When you create a React application with create-react-app and run npm start, your application is served at localhost:3000.
So to do what you want, you need to set up the environment in the VM (e.g. create-react-app) and then configure your VM (this is the part I don't know how to do) so that you can access the VM's localhost and your project files from Windows.
This way you can edit the files on the VM and also see the app changing in the Windows browser.
How to share the VM's folder with the host
How to access the VM's localhost
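One concrete way to do the "access the VM's localhost" part is to bind the dev server to all interfaces instead of 127.0.0.1. A minimal sketch, assuming create-react-app's dev server (which honours the HOST and PORT environment variables) and a bridged or host-only network between Windows and the VM; the file name is hypothetical:

// start-lan.js - run inside the Linux VM instead of a plain "npm start"
const { spawn } = require('child_process');

spawn('npm', ['start'], {
  stdio: 'inherit',
  // bind the dev server to all interfaces so the Windows host can reach it
  env: { ...process.env, HOST: '0.0.0.0', PORT: '3000' },
});

You would then open http://<vm-ip>:3000 in the Windows browser, where <vm-ip> is the VM's address on the shared network.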

How do I update a live Node.js Heroku App remotely?

I have a Node.js app that is live on Heroku
The Node.js folders/files that I uploaded to Heroku also reside on my computer
Whenever I update my Node.js folders/files on my computer, I want these updates to also be applied to the folders/files that are live on Heroku.
I want to be able to do that without having to stop, update and restart my Heroku app every time.
What I'm describing is basically a setup equivalent to that of the standard ftp connection that we all use whenever we make a local to remote update of static files of some standard website.
The git support that Heroku apparently offers doesn't do that. It requires the app to be stopped (by running the appropriate commands on the terminal), then I need to make a git push (using the terminal) that updates all of the files (which takes forever) and not just the ones that need to be updated, and then the app needs to be restarted (again using the terminal). This is extremely frustrating for an app that is still in development, requires constant updates and cannot be tested locally (for a number of reasons).
Whenever a Node.js app is tested locally, the app can be started by calling supervisor app.js instead of node app.js.
What this does is it allows for the app to be updated and as soon as that happens (i.e. as soon as I hit "save") supervisor automatically restarts the app locally.
I'm looking for something similar to the above, i.e. linking my local app folder to my remote app folder and starting my remote app (on Heroku) using some supervisor mode so that as soon as my local folder is updated, my remote folder is also changed and the app automatically restarted.
It's extremely frustrating trying to test a Heroku app (that obviously needs constant updates) currently.
Testing it locally and then publishing it on Heroku (for good) will not do because some apps simply cannot be tested on localhost.
Any help would be much appreciated!
The git support that Heroku apparently offers doesn't do that. It requires the app to be stopped (by running the appropriate commands on the terminal), then I need to make a git push (using the terminal) that updates all of the files (which takes forever) and not just the ones that need to be updated, and then the app needs to be restarted (again using the terminal). This is extremely frustrating for an app that is still in development, requires constant updates and cannot be tested locally (for a number of reasons).
First, you don't need to stop your app before running git push heroku master. Just push, and the platform will build, and then restart your app with the new code, automatically. Second, git uses a diffing algorithm, so you aren't pushing all of the files - you're in fact just pushing the differences (assuming you're using git correctly). Third, you don't need to do that final, manual restart - the platform has already done this for you on push. Finally, I would advise that if your app is impossible to test locally, you might want to reconsider the architecture of that app. It sounds very un-portable. Perhaps refer to 12factor.net for a good architecture checklist.
Testing it locally and then publishing it on Heroku (for good) will not do because some apps simply cannot be tested on localhost.
What type of app are you building that would be impossible to test outside of a production environment?
In any case, the closest thing I'm aware of to what you're looking for is Dropbox Sync:
https://devcenter.heroku.com/articles/dropbox-sync

Automatically sync to a web server?

I have a web server with websites on it, and I was wondering if there is any way for me to develop the websites on my Linux (Ubuntu) desktop PC so that whenever I hit save, the changes are uploaded to the web server?
I hope you understand what I am trying to do.
Thanks
Yes, you can do this using an advanced IDE.
For example, a free and powerful one is NetBeans.
Essentially, you need an FTP server on your host machine. Then, in NetBeans, create a new project from external sources, set up your FTP connection, and that's about it: now every time you hit Ctrl+S, the changes will be saved on your server as well.
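If you'd rather not rely on an IDE, the same save-then-upload behaviour can be scripted. A rough Node.js sketch, assuming the chokidar and basic-ftp npm packages and placeholder paths/credentials (none of these details come from the question):

// watch-upload.js - upload a file to the web server whenever it changes locally
const path = require('path');
const chokidar = require('chokidar');  // file watcher (npm package, assumed installed)
const ftp = require('basic-ftp');      // FTP client (npm package, assumed installed)

const LOCAL_ROOT = '/home/me/sites/mysite';  // placeholder local project folder
const REMOTE_ROOT = '/var/www/mysite';       // placeholder document root on the server

chokidar.watch(LOCAL_ROOT, { ignoreInitial: true }).on('change', async (file) => {
  const client = new ftp.Client();
  try {
    await client.access({ host: 'example.com', user: 'deploy', password: 'secret' });
    const remote = path.posix.join(REMOTE_ROOT, path.relative(LOCAL_ROOT, file));
    await client.uploadFrom(file, remote);
    console.log('uploaded', file);
  } catch (err) {
    console.error('upload failed', err);
  } finally {
    client.close();
  }
});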
What you're asking about is called continuous delivery. Jenkins is a popular tool that can automatically test and deploy anything you save (or commit to svn or git).

How to create a Mercurial repository on a remote IIS web server

I have a Windows Server 2003 running Mercurial's hgwebdir.cgi to serve repositories. Push/Pull etc is working as expected for existing repositories.
Currently I'm using Remote Desktop if I need a new repository on the server.
Is there a better way to do it? Command line, web interface, cgi?
Mercurial by itself only allows repositories to be created locally or over SSH. For HTTP you need to either log in to the server via the command line and run hg init, or do essentially the same via RDP.
It is, however, very easy to create a small CGI script that will create new remote repositories over HTTP. Here's one I built that works on Unix and is likely easily adapted to Windows:
http://ry4an.org/unblog/UnBlog/2009-09-17
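The linked script targets Unix; purely as an illustration of the same idea, here is a rough Node.js sketch of an HTTP endpoint that runs hg init on the server (hypothetical paths, no authentication, and it assumes hg is on the server's PATH):

// create-repo.js - rough sketch of an HTTP endpoint that creates a new Mercurial repository
const http = require('http');
const path = require('path');
const { execFileSync } = require('child_process');

const REPO_ROOT = 'C:\\repos'; // placeholder; the directory hgwebdir.cgi serves from

http.createServer((req, res) => {
  const name = new URL(req.url, 'http://localhost').searchParams.get('name');
  // only allow simple names so a request cannot escape the repo root
  if (!name || !/^[A-Za-z0-9_-]+$/.test(name)) {
    res.writeHead(400);
    return res.end('bad repository name');
  }
  try {
    execFileSync('hg', ['init', path.join(REPO_ROOT, name)]);
    res.writeHead(200);
    res.end('created ' + name);
  } catch (err) {
    res.writeHead(500);
    res.end('hg init failed');
  }
}).listen(8080);

A request like http://server:8080/?name=newrepo would then create the repository; you would obviously want to put some authentication in front of this before exposing it.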
Currently, running hg init where you want the repository is the way to do it; any other way would require hgwebdir to implement some kind of security that is better left to other, better, more OS-specific tools. It's not much of a leap to imagine that the Mercurial devs would rather focus on versioning files than reinvent the wheel with security, at least right now.
