GCP Filestore error modifying shared folder contents with nodejs script under non-root user - node.js

I want to write a program that writes log files into a shared folder on a GCP Compute Engine instance. I used GCP Filestore to mount an NFS folder in an Ubuntu VM. After creating the folder, I noticed that I couldn't use cp to copy a file into it unless I used sudo. When I ran the Node.js script, it also returned a permission-denied error. However, I don't want to run the Node.js script as root. Is there a way to modify the permissions so that I can write to the shared folder as the default, non-root user?
I changed the permissions of the shared folder to 777, but it didn't work; I still cannot write to the folder.
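A common fix is to change ownership of the share once, as root, after mounting; by default a Filestore export does not squash root, so sudo chown from the client VM works. A sketch, assuming the share is mounted at /mnt/filestore and the default VM user is ubuntu (the IP and share name are placeholders):

```shell
# Mount the Filestore share (IP and share name are placeholders)
sudo mkdir -p /mnt/filestore
sudo mount -t nfs 10.0.0.2:/vol1 /mnt/filestore

# The export root is owned by root; hand it to the non-root user once.
sudo chown ubuntu:ubuntu /mnt/filestore

# Now a Node.js process running as 'ubuntu' can write log files:
touch /mnt/filestore/app.log   # no sudo needed
```

If chmod 777 appeared not to stick, make sure it was applied after mounting: permissions set on the empty mount-point directory before the mount are hidden once the NFS share is mounted over it.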

Related

unable to copy file from local machine to ec2 instance in ansible playbook

So I am running scp -i ~/Downloads/ansible-benchmark.pem ~/Documents/cis-playbook/section-1.yaml ubuntu@ec2-18-170-77-90.eu-west-2.compute.amazonaws.com:~/etc/ansible/playbooks/
to transfer an Ansible playbook I created with VS Code (the section-1.yaml file),
but I am getting the error scp: /home/ubuntu/etc/ansible/playbooks/: No such file or directory.
The directory definitely exists on the EC2 instance, and I did install Ansible, but for some reason it isn't recognising the directory.
First, check whether the path
/home/ubuntu/etc/ansible/playbooks/
is actually available on the target. If it is not available on the target/source, create the folder on the target first, then do the copy.
You may also find this question useful: Ansible: find file and loop over paths
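The check-then-create flow above can be sketched from the local machine, using the same key and host as the question; note that scp only copies into directories that already exist on the remote side:

```shell
# Create the target directory on the instance first (paths are the
# question's own; adjust to your layout). mkdir -p is a no-op if it
# already exists.
ssh -i ~/Downloads/ansible-benchmark.pem \
    ubuntu@ec2-18-170-77-90.eu-west-2.compute.amazonaws.com \
    'mkdir -p ~/etc/ansible/playbooks'

# Then the original copy succeeds:
scp -i ~/Downloads/ansible-benchmark.pem \
    ~/Documents/cis-playbook/section-1.yaml \
    ubuntu@ec2-18-170-77-90.eu-west-2.compute.amazonaws.com:~/etc/ansible/playbooks/
```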

How to create a .deb package such that it has read/write permission in /opt/myprogram

I'm developing an application that installs into the /opt folder. My application fetches a .zip file and tries to write to /opt/myprogram/, but it fails due to permission issues.
If I run it as root, it works.
So how can I create the .deb package such that the user has full read and write permission for /opt/myprogram?
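A common way to do this is a postinst maintainer script that creates the directory and hands ownership to the intended runtime user. A minimal sketch, assuming a hypothetical user myuser should own /opt/myprogram:

```shell
#!/bin/sh
# debian/postinst -- runs as root during 'dpkg -i' / 'apt install'
set -e

# Create the data directory and give it to the runtime user, so the
# application can unpack its .zip there without running as root.
install -d -o myuser -g myuser -m 0775 /opt/myprogram

#DEBHELPER#
exit 0
```

An alternative design is to keep /opt/myprogram read-only and have the application write its downloads under the invoking user's home directory or /var instead.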

Laravel folder permission for not-yet made cache folders

I'm having an issue with directory permissions with Laravel when it comes to caching. Whenever it tries to upload a cache file to /var/www/laravel/storage/framework/cache/data/ it tells me that file_put_contents has no permissions.
To fix this I always run something like chmod -R 755 /var/www/laravel/storage/framework/cache/, but the problem is that when a new directory is created inside cache, it does not inherit these chmod settings, so I get a permission denied error again.
How can this be fixed permanently?
Edit:
Been thinking about letting it run as a cronjob regularly, but I'm not so sure that's a good way to deal with it.
You need to run chmod command with -R:
sudo chmod -R 755 storage
After installing Laravel, you may need to configure some permissions. Directories within the storage and the bootstrap/cache directories should be writable by your web server or Laravel will not run. If you are using the Homestead virtual machine, these permissions should already be set.
https://laravel.com/docs/5.5#installation
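Because chmod -R only affects directories that already exist, freshly created cache subdirectories keep missing the right bits. One way to make the fix permanent (a technique not mentioned in the answer above: default POSIX ACLs, which new subdirectories inherit) is, assuming the web server runs as www-data:

```shell
# Give the web server ownership of the writable Laravel directories
sudo chown -R www-data:www-data storage bootstrap/cache
sudo chmod -R 775 storage bootstrap/cache

# Default ACLs are inherited by directories created later, so
# framework/cache/data/* stays writable without a recurring cron job.
sudo setfacl -R  -m u:www-data:rwX storage
sudo setfacl -dR -m u:www-data:rwX storage
```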

"Unable to create home directory" error when changing JENKINS_HOME

Jenkins was running all fine on a RedHat Linux machine (a clean EC2 machine on AWS), until I decided to change the JENKINS_HOME. I simply moved the Jenkins directory from /var/lib/jenkins to /home/ec2-user/jenkins and then created a symlink. (I followed the first answer to this question: Change JENKINS_HOME on Red Hat Linux?).
However when I restart Jenkins I get the error:
Unable to create the home directory ‘/var/lib/jenkins’. This is most
likely a permission problem. To change the home directory, use
JENKINS_HOME environment variable or set the JENKINS_HOME system
property.
I tried changing JENKINS_HOME in /etc/sysconfig/jenkins, setting it to the new folder (which I suppose defeats the point of a symlink?), and I still get the same error:
Unable to create the home directory ‘/home/ec2-user/jenkins’.
It is for backup purposes, so that I have all Jenkins data in a mounted external data storage (AWS Elastic File System).
I've figured it out. The error persisted because the jenkins/ folder needs to be accessible to the user 'jenkins' to run processes, but it couldn't access this folder because it belonged to the particular logged-in user. I changed the mount point to /var/, which jenkins can access as a global process, and that solved the problem.
I ran into the same problem, so sharing my solution here:
The user jenkins does not have access to the folder /home/ec2-user/jenkins. You can fix the access rights of that folder by making the user jenkins its owner:
sudo chown jenkins /home/ec2-user/jenkins
sudo chmod u+w /home/ec2-user/jenkins
To verify the new ownership, you can do:
ls -ld /home/ec2-user/jenkins
The error seems pretty obvious: "This is most likely a permission problem."
I assume /home/jenkins does not exist, and the user jenkins does not have write permissions in /home. If you moved the Jenkins home, then you probably did it as root and just forgot to update owner permissions.
You would need to create the home, something like this:
sudo service jenkins stop
# make the changes in /etc/sysconfig/jenkins
sudo mkdir --parents /home/jenkins # or mv, in your case
sudo chown --recursive jenkins /home/jenkins
sudo service jenkins start

webdav files without all permissions go to lost+found

I'm trying to connect the liferay 6.0.6 document library (installed in linux and running on tomcat) to an external mounted folder shared through webdav.
I'm using mount.davfs to mount the folders in Linux, but whenever I create a file in a mounted folder, or add one via SFTP, it doesn't get the 777 permissions of its parent folder, and 95% of the time it simply ends up in the lost+found folder, leaving an empty file there. So I can see files uploaded from the portal, but I can't upload files from my Linux machine.
But if I change the permissions of the file to 777 and then I edit again or I upload it again from sftp replacing it, the file is there and can be seen/downloaded from the portal too!
Any idea why this is happening, and how can I get this share working with read/write access from both sides?
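davfs2 applies its own ownership and permission model on the client side, so files can appear with modes you did not set. One way to control this is through mount options; a sketch where the URL, mount point, and uid/gid are placeholders that should match the user running Liferay/Tomcat:

```shell
# Mount with explicit modes so new files come up group-writable
# instead of falling back to davfs2's restrictive defaults.
sudo mount -t davfs https://webdav.example.com/share /mnt/dav \
    -o uid=tomcat,gid=tomcat,file_mode=0664,dir_mode=0775
```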