Node tmp folder and path? - node.js

In Node.js, when running
fs.readdirSync('/tmp');
I get a result like:
['launchd-493.Je0U5v','npm-898-26dc6432']
Where is this /tmp folder? What does its path look like? (I'm using OS X.)
The reason I'm asking is that I'm building a Node app on a web host where the app folder is on a read-only file system, and I need to save some temporary files which are then uploaded to a backend like Parse.
Thanks!

Most servers should use the system temp folder; /tmp on OS X is a symlink to /private/tmp, which has the default permissions lrwxr-xr-x 1 root wheel.
Unless your server is abnormally locked down, you should be able to use it.
You may also want to consider a module like tmp, which is tailored for temporary storage.
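For instance, here is a rough sketch that skips the tmp package and just uses Node's built-in os module to find the system temp directory (the file name and data are only examples):
const os = require('os');
const path = require('path');
const fs = require('fs');

// os.tmpdir() returns the platform's temp directory: /tmp on most Linux hosts, a per-user folder on OS X
const tmpFile = path.join(os.tmpdir(), 'upload-scratch.json'); // example file name
fs.writeFileSync(tmpFile, JSON.stringify({ some: 'data' }));
// ...upload tmpFile to Parse (or another backend), then remove it
fs.unlinkSync(tmpFile);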

Related

Serving Node.js files from an LXC Turnkey container - Apache configuration needed?

I hope that I will not waste everyone's time, nor embarrass myself, but please hear/read my problem. I am new to this, so please bear with me.
Someone at work wrote some crude code in Node.js, and I can see the .html files by entering localhost:8080 as the URL in the browser while Visual Studio starts the app with the npm start command. Am I explaining this clearly enough?
The webpages are displayed and all, but now comes the hurdle.
How can I have those pages served from a Linux server?
By analogy: if I put some .html page inside /var/www/ on an Apache server, I can view it by pointing to the server's IP/somepage.html. What needs to be set up on a similar Node.js server?
Where do I have to put those files, inside what directory, and what configuration is needed?
I thought of creating a small LXC container and saving those files and services as a template, but first I need to set this up correctly. Can Apache serve those files, or do I have to do some other configuration first?
I have those files served from a Windows machine on localhost, and put the same files in a /node, /opt/www directory on a Linux machine, but no dice.
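In other words, the Node.js equivalent of /var/www is not a directory that the server watches; a Node process has to serve the files itself. A very rough sketch using only the built-in http module (the /opt/www path and the port are just examples; a framework like Express with express.static is the more common choice):
const http = require('http');
const fs = require('fs');
const path = require('path');

const root = '/opt/www'; // example location for the .html/.css/.js files

http.createServer((req, res) => {
  const file = path.join(root, req.url === '/' ? 'index.html' : req.url);
  fs.readFile(file, (err, data) => {
    if (err) { res.statusCode = 404; res.end('Not found'); return; }
    res.end(data);
  });
}).listen(8080); // then browse to http://<server-ip>:8080/
Apache is not required for this, although Apache or nginx can optionally sit in front of the Node process as a reverse proxy.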

Folder structure to deploy app on EC2 instance

I am setting up a new React app on an EC2 instance (Ubuntu). I have installed Node.js and npm, and I am able to build my app successfully.
The issue is that my code is in the /var/www/html folder and my site example.com points to this folder.
When I run
npm run build
it builds a folder under /html, i.e. /html/build, so my app now runs at example.com/build. The pages request resources from example.com/static/style.css etc., but those files actually reside under example.com/build/static.
I can edit asset-manifest.json and change the paths, but that's not an appropriate solution, as I need to get rid of the /build folder for production.
I am not super familiar with deployments to EC2, but it looks like you just need to either copy the entire contents of your built app into /var/www/html, or tell Apache or nginx to look at the right folder (in this case /build).
For example, with Apache you probably have a file inside /etc/apache2/sites-enabled/ that points to /var/www/html; you could change that to /var/www/html/build and restart Apache.
You can check this gist for examples of how to write these configurations: https://gist.github.com/rambabusaravanan/578df6d2486a32c3e7dc50a4201adca4
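As a rough sketch only (the vhost file name and ServerName are illustrative), the relevant part of such a configuration might look like:
<VirtualHost *:80>
    ServerName example.com
    # serve the React build output instead of the repository root
    DocumentRoot /var/www/html/build
    <Directory /var/www/html/build>
        Require all granted
    </Directory>
</VirtualHost>
followed by something like sudo systemctl restart apache2 (or service apache2 restart) to pick up the change.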

How should I place files on apache2 HTTP server

I am on Kali Linux, which comes with the Apache2 HTTP server installed. Under /var/www/ there is an index.html file, which is the default index page that shows up on localhost. I have a folder containing all my .html, .css, .js files and some pictures that I want to put on the Apache2 server. Should I just copy/paste the folder under /var/www?
That's the traditional way to do it.
If you have virtual hosts or something a bit more complex, then you might consider something else, but typically people just drop everything under /var/www (or the equivalent for a given distro or OS).
Yes, though you may need to adjust the access rights, and if you want you can use the Apache config, or mount --bind, or git clone/pull. Start with the simple option, then look into the other options to see if they offer you any benefit.
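For example, the simple option might look like this (a sketch; the mysite folder name and the www-data user are assumptions and vary by distro):
sudo cp -r ~/mysite /var/www/mysite
sudo chown -R www-data:www-data /var/www/mysite   # Apache's user on Debian-based systems such as Kali
sudo chmod -R a+rX /var/www/mysite                # readable files, traversable directories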

Pulling remote 'client uploaded assets' into a local website development environment

This is an issue we come up against again and again: how to get live website assets uploaded by a client into the local development environment. The options are:
Download all the things (can be a lengthy process and has to be repeated often)
Write some insane script that automates #1
Some uber clever thing which maps a local folder to a remote URL
I would really love to know how to achieve #3: to have some kind of alias/folder in my local environment which is ignored by Git but means that when testing changes locally I will see client-uploaded assets where they should be, rather than broken images (and/or other things).
I do wonder if this might be possible using Panic Transmit and the 'Transmit disk' feature.
Update
OK, thus far I have managed to get a remote server folder mapped to my local machine as a drive (of sorts) using the 'Web Disk' option in cPanel and then the 'Connect to Server' option in OS X.
However, although I can then browse the folder contents in a safe, read-only fashion on my local machine, when I alias that drive to a folder in /sites, Apache just prompts me to download the alias file rather than following it as it would a symlink... :/
KISS: I'd go with #2.
I usually put a small script like update_assets.sh in the project's folder which uses rsync to download the files:
rsync --recursive --stats --progress -az -e ssh user@site.net:~/path/to/remote/files local/path
I wouldn't call that insane :) I prefer to have all the files locally so that I can work with them when I'm offline or on a slow connection.
rsync is quite fast, and you may also want to check out the --delete flag to delete local files when they have been removed from the remote.
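A sketch of what such an update_assets.sh could look like (the host and both paths are placeholders):
#!/bin/sh
# pull client-uploaded assets from the live site into the local checkout
rsync --stats --progress -az -e ssh --delete \
    user@site.net:~/path/to/remote/files/ ./public/assets/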

db.* files in /home from Perforce?

I see several db.* files in my /home directory, and it seems they come from Perforce. For example, some of the files are db.archmap, db.bodtext, db.change, db.changex.
Are these files useful? Can I delete them? They are making my /home directory messy.
You have started a server using your home directory as the Perforce server's P4ROOT folder. Those files are generated when the server starts, and they cannot be deleted unless you want to hose your server installation. It's not clear to me how you've started the server instance, so I'll try to cover multiple bases with my answer.
If you want to start up the server under your own account, you should set the P4ROOT environment variable and point it to where you want the server to store its files. Alternatively, when you start the server, you can specify the root folder on the command line using the -r option:
p4d -r /home/mark/p4server
which would put the server's files into a directory called 'p4server' under my home directory.
Typically it is best to run the Perforce server as a user that is dedicated to running Perforce. I use a user called 'perforce'. I set P4ROOT (and other variables) in that user's environment. If you cannot use a separate user, it might be easier to use the -r command-line option that I mentioned above.
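For example (a sketch only; the path and port are illustrative), that dedicated user's environment might contain something like:
# in the 'perforce' user's ~/.profile
export P4ROOT=/home/perforce/p4root   # where the db.* files will live
export P4PORT=1666                    # port p4d listens on
p4d -d                                # start the server as a background daemon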
Those files are only server files, not client files, so it is safe to delete them; but if you start the server back up, it will recreate them, so you might want to uninstall the server.
Unless you are running a beta version, they have P4Sandbox coming soon (maybe it's in the beta, I forget), which MAY create those files. I don't have a beta version, so I can't verify what new files the client may or may not create.
You can check the Perforce documentation to see what these files do and what they are for.
