node.js createWriteStream doesn't create new file on Heroku

I have the following code that works fine on my localhost running node.js 0.12.0. The code creates a new file and copies data into it from a readable stream, but it doesn't create the new file on Heroku.
var fs = require('fs');

var output = fs.createWriteStream('public/images/test/testfile.png');
readable.pipe(output); // readable is an existing readable stream
I thought it had something to do with permissions, but whenever I change the permissions on the folder (using heroku run bash and then chmod -R 777 images/), Heroku resets them back to the original drwx------.
So maybe the problem is something else?
Please note that it fails silently: no exception, nothing in the log.
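One diagnostic worth adding, since pipe() does not forward 'error' events between streams: attach a listener to the write stream to log exactly what, if anything, goes wrong. A minimal sketch using the output stream from the snippet above:

output.on('error', function (err) {
  // An error from createWriteStream (e.g. ENOENT if public/images/test
  // is missing) is emitted here, not on the readable side of the pipe.
  console.error('write stream error:', err);
});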

On Heroku, a dyno's local file storage is not persistent (apart from the files in the git repo, obviously), so if you write a local file and the dyno restarts, the file will be gone; and if you start another dyno, it won't be able to "see" the file.
heroku run bash starts a new "one-off" dyno (you can read about it here: https://devcenter.heroku.com/articles/one-off-dynos), so the file will not be accessible that way.
If you want your data to persist, you're better off using a database or a persistent storage add-on.
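For example, a minimal sketch of streaming the same data to S3 with the aws-sdk package instead of the local disk (the package, bucket name, and credentials are assumptions here; they'd come from your own setup):

var AWS = require('aws-sdk'); // assumes aws-sdk is installed and AWS credentials/region are set in env vars
var s3 = new AWS.S3();

s3.upload({
  Bucket: process.env.S3_BUCKET,   // hypothetical bucket name from your config
  Key: 'images/test/testfile.png',
  Body: readable                   // the same readable stream from the question
}, function (err, data) {
  if (err) return console.error('upload failed:', err);
  console.log('stored at', data.Location);
});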

Related

Error: ENOENT: no such file or directory - Heroku

I'm new to web development and just started working with NodeJS.
I have deployed an application on Heroku. Saving my image files worked great on my local machine, but after deploying to the Heroku server my images are not saved and I get this error:
Error ENOENT: no such file or directory, open '../public/uploads/1623462977307.png'
My FILES_PATH is ../public/uploads, set in Heroku's env variables.
I have tried everything, absolute paths and relative paths, but the issue is not resolved.
That error means the ../public/uploads folder doesn't exist. You can fix it by making sure the folder is there: create it, put an empty file inside it called .gitkeep, and commit the change. This keeps the otherwise-empty folder in your repository when you push to Heroku.
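An alternative that doesn't depend on what's committed to the repo is to create the folder when the app boots. A minimal sketch, assuming the FILES_PATH env variable from the question (mkdirSync's recursive option needs Node 10+):

const fs = require('fs');
const path = require('path');

// Resolve the upload folder against the app's working directory and
// create it (plus any missing parents) at startup if it doesn't exist.
const uploadDir = path.resolve(process.env.FILES_PATH || 'public/uploads');
fs.mkdirSync(uploadDir, { recursive: true });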
Keep in mind that Heroku's filesystem is ephemeral (short-lived): any images you save on disk will disappear when you restart or redeploy your app. You can't rely on Heroku's filesystem to store images that need to persist. Read more here: https://help.heroku.com/K1PPS2WM/why-are-my-file-uploads-missing-deleted
You're better off saving images elsewhere. Have a look at cloud storage solutions such as Cloudinary or AWS S3.
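For instance, a rough sketch with Cloudinary's Node SDK, assuming the cloudinary package is installed and CLOUDINARY_URL is set in the environment (the folder name and the incomingFile stream are stand-ins for your own upload handling):

const cloudinary = require('cloudinary').v2;

// upload_stream returns a writable stream; pipe the incoming file into it
// instead of writing to Heroku's ephemeral disk.
const upload = cloudinary.uploader.upload_stream(
  { folder: 'uploads' },           // hypothetical target folder
  (err, result) => {
    if (err) return console.error('upload failed:', err);
    console.log('stored at', result.secure_url);
  }
);
incomingFile.pipe(upload);         // incomingFile: a readable stream, e.g. from busboy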

App can find view locally but not once uploaded to server

So I have this app:
https://github.com/jfny/SocialGrowth.xyz
If you git clone it and run npm install / npm start locally, it should work fine.
However, when I upload the files to my server and try to run it, it tells me it is unable to look up one of the views, even though the files haven't changed at all.
Why is this?

pywatchdog and pyinotify not detecting changes to files inside FTP-created directories

I have an application monitoring files sent to an FTP server (proftpd 1.3.5a). I am using pywatchdog to monitor file creation in the FTP server's root (the app runs locally), but under one very specific circumstance it does not issue a notification: when I create a new directory through FTP and then create a file under that directory. The file creation/modification events are not caught!
To reproduce it in a simple way I've used pyinotify (0.9.6) directly, and it looks like the problem comes from there. A simple way to reproduce the problem:
Install proftpd and pyinotify (python3) on the server with default settings
On the server, run the following command to monitor the FTP root (recursive and autoadd turned on; assuming user "user"):
python3 -m pyinotify -v -r -a /home/user
On the client, create a sample.txt file, connect to the FTP server and issue the following commands, in this order:
mkdir dir_a
cd dir_a
put sample.txt
There will be no events related to sample.txt - neither create nor modify!
I've tried to take the FTP factor out of the issue by manually creating and moving directories inside the watched target and creating files inside these directories, but the issue does not happen there - it all works smoothly.
Any help will be appreciated!

How to make a file executable on an OpenShift server after pushing it via git

The original post can be found here.
I want to ensure my index.cgi is set to 755, even after I push files to git.
This is not happening; based on the umask, as I understand it, the permission is getting set to 700.
I am unable to create the post-update script on the server, which is supposed to live in the openshift/hooks location, due to the set permissions.
So I tried using action hooks to do the job.
I created a file named stop in my local action hooks folder.
Following this I pushed my index file to the server.
My index file still shows its permission as 700.
How can I resolve this?
Try updating the permissions in git. Note that git only tracks the executable bit (not full modes such as 755), so the flag is +x or -x:
git update-index --chmod=+x <your_file>
Commit and push after running it so the updated mode reaches the server.

db.* files in /home from Perforce?

I see several db.* files in my /home directory, and it seems they come from Perforce. For example: db.archmap, db.bodtext, db.change, db.changex.
Are these files useful? Can I delete them? They are making my /home directory messy.
You have started a server using your home directory as the Perforce server's P4ROOT folder. Those files are generated when the server starts and cannot be deleted unless you want to hose your server installation. It's not clear to me how you've started the server instance, so I'll try to cover multiple bases with my answer.
If you want to start up the server under your own account, you should set the P4ROOT environment variable and point it to where you want the server to store its files. Alternatively, when you start the server, you can specify the root folder on the command line using the -r option:
p4d -r /home/mark/p4server
which would put the server's files into a directory called 'p4server' under my home directory.
Typically it is best to run the Perforce server as a user dedicated to running Perforce; I use a user called 'perforce' and set P4ROOT (and other variables) in that user's environment. If you cannot use a separate user, it might be easier to use the -r command line option mentioned above.
Those files are server files only, not client files, so it is safe to delete them; but if you start the server back up, it will recreate them. You might want to uninstall the server.
Unless you are running a beta version: they have p4sandbox coming soon (maybe it's in the beta, I forget), which MAY create those files. I don't have a beta version, so I can't verify what new files the client may or may not create.
You can check the documentation here to see what these files are for.
