OS X permission denied for /usr/local/lib - linux

I'm looking for any advice/intuition/clues/answers on a permission issue that has been plaguing me ever since I switched over to a new MacBook Pro. Here's the dilemma: certain programs copy libraries under /usr/local/lib during install, and upon running these programs I get a crash which I believe is related to permission restrictions on files in this folder.
I've had errors (can't access files from this path) trying to install plugins for Audacity, and then tried doing an "ls" under this folder. I immediately get "permission denied" unless I prefix the command with sudo. I've tried taking ownership of the /usr/local/lib/audacity folder with my user account, and even then I still get permission errors on these files. It's important to note that the problem is not exclusive to Audacity: I've seen the same problem with Polycom video conference software, and I've also been unable to run Parallels on this machine. (I haven't traced Parallels to the same issue, but I'm betting it's related.)
I vaguely recall some weird Linux command magic I used to use back in the day that would not only grant permission to a user but also tweak some low-level bits allowing/disabling certain things like execution, and I seem to recall the permission thing ran deeper than execution, but it's been years. I can't recall the details, and I'm wondering if there's something similar on OS X that I'm possibly overlooking. Is there something special about that location and the files therein? Could I have somehow altered my file system in a way that the files appear different?
For what it's worth, I seem to be able to use at least one of the programs if I log in as root. I haven't tried with the other programs as I've only just discovered that ability. Please help.

It sounds like the folder isn't world executable. Try:
sudo chmod 755 /usr/local/lib
and then you should be able to use ls or anything else in the folder. (It still won't let you write there, but your user account shouldn't be doing that anyway.)
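If you want to verify that the execute (search) bit really is what's missing, something like this should show it (a small sketch; the exact mode string and ownership on your machine will differ):
# -d lists the directory entry itself instead of its contents
ls -ld /usr/local/lib
# a mode like "drwxr-x---" means "others" have no read/execute bits, which is
# exactly what makes a plain user's ls fail with "permission denied"
sudo chmod 755 /usr/local/lib
ls -ld /usr/local/lib
# should now read "drwxr-xr-x"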

Found the answer from a coworker buddy. The folder needed to be marked executable.
sudo chmod 755 /usr/local/lib
fixes everything!

Related

Dynamically get username in Postinst script of .deb package

I wrote a postinst script to change the owner and file permissions:
chown -R $(whoami) ~/Desktop/my_file.desktop
chmod 777 ~/Desktop/my_file.desktop
but after installation it does nothing.
I'm really not getting what part of the script is wrong. Please tell me how to get the username dynamically in a postinst script?
Package installation runs as root, unconditionally. There is no concept of an invoking user; indeed, the package installation may happen e.g. before any user accounts exist on the box.
It's extremely unclear what you actually hope to achieve, but it looks like perhaps your package should simply install a script which then performs the task when the user runs it. This will also conveniently create a file which is already owned by the current user, without any chown trickery.
Even if a user exists, there is no guarantee that they have a Desktop directory in their home directory, or that they are currently, or ever, logged in using a GUI.
Finally, whatever you are attempting to do, chmod 777 is wrong and dangerous. You should absolutely not assign write access for everyone, to anything, ever.
(Okay, so there are two or three obscure scenarios related to system administration where this is actually required and useful; otherwise it should probably be technically impossible in the first place.)
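As a rough illustration of the "install a script instead" suggestion above, the package could ship something like this (a minimal sketch; the script name, the source path under /usr/share, and the install location are made up for the example):
#!/bin/sh
# Hypothetical /usr/bin/install-my-desktop-shortcut, shipped by the package.
# The user runs it themselves, so it runs as that user and the copied file
# ends up owned by them - no chown or 777 needed.
set -e
mkdir -p "$HOME/Desktop"
cp /usr/share/my-package/my_file.desktop "$HOME/Desktop/"
chmod 755 "$HOME/Desktop/my_file.desktop"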

Spring Tool Suite 3.8.2 - Installation on Ubuntu

I managed to install STS 3.8.2 on Ubuntu 16.04 - with a lot of hacking experiments. I have it working, but I am not happy with my solution.
Here is what I had to do:
Extracted the tar file into /opt/sts-bundle.
If you put it anywhere else, like /opt/sts, the TC server fails to start from STS.
With files in /opt/sts-bundle, TC Server still fails to start from STS - permission errors. To get it to work you need to futz around with the permissions of the pivotal-tc-server subdirectories; essentially you need to open them up to your group (the same one running STS) (security hole?).
A local install in your own ~/sts-bundle fails with "files not found" while attempting to back up all the conf files. It still looks in /opt/sts-bundle for all these config files (just to copy them to /backup). You can change the top directory of the server in the STS server properties, but it still looks in /opt/sts-bundle. It seems hard-coded - I don't know where. So you have to create all the config files in the conf directory of the tree rooted at /opt/sts-bundle ("touch" works - creating empty files). TC Server still fails to start, with a "failed to clean" error and no clue in the detailed message about which files are being "cleaned".
I tried creating a non-privileged user "tcserver" per the suggestion in the Pivotal TC Server docs. I installed to /opt/sts-bundle while logged in as tcserver (with sudo privileges). That fails when I am using STS as a regular developer who is not "tcserver". I could not figure out how to tell TC Server to run under a different user than the one that started STS.
The solution I have working, and am not happy with, starts by extracting the tar.gz file into /opt/sts-bundle, as it wants, then changing the owner and group of sts-bundle to my id and my group (the same ones used in the STS UI). I am not happy with that; it seems wrong to put things in /opt that are owned by a single developer.
I am new to Linux, and I still have some Windows habits that need to be unlearned.
The question is: how do I get the clean solution (installing using a "tcserver" user in the global /opt directory) to work for developers who are not "tcserver"? How should the tcserver user be related to the developers (same group?).
Am I making this problem harder than it should be? What am I missing?
I'm not sure this is what you want, but I don't install the STS bundles in some kind of shared directory as a special user at all. I just install it in my user.home dir, as myself, and launch it from there.
It is very unsophisticated. I just download the tar.gz file, unpack it in my home dir and then launch it from a trivial bash script which looks something like this:
#!/bin/bash
/home/kdvolder/Applications/sts-bundle/sts-*/STS
That script is on my PATH. So I can just type 'STS' in a terminal and STS will start.
I don't have to do anything else and it works.
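If it helps, the way I'd wire that up is roughly this (a sketch; the ~/bin location and the script name are just an example, and recent Ubuntu versions already add ~/bin to PATH from the default ~/.profile if it exists):
# Save the launcher, make it executable, and put its directory on PATH
mkdir -p ~/bin
cat > ~/bin/STS <<'EOF'
#!/bin/bash
/home/kdvolder/Applications/sts-bundle/sts-*/STS
EOF
chmod +x ~/bin/STS
# Only needed if your shell doesn't already pick up ~/bin
echo 'export PATH="$HOME/bin:$PATH"' >> ~/.bashrc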
If you are trying to somehow install this so that several different users can run a shared installation then this isn't a good setup. But I think for your own personal laptop or desktop which only you are using, this simple setup is perfectly fine.
For a shared-user environment, unfortunately, I don't know how to help you. It could be complicated to sort out all the permission issues etc. because Eclipse is a complicated beast w.r.t. installation of plugins.

Installing software on Ubuntu as non-root

I've been stuck on a problem for two days now where the software I'm trying to install will not proceed unless I make a separate user which is non-root.
Keep in mind I'm a big linux noob and not very experienced with the OS.
I made a user called "smrtanalysis" in a group called "smrtanalysis".
I put him in the sudoers file.
I made a folder called smrtanalysis in my /home/nick/ directory.
I downloaded the software from the PacBio website and put the .run files into the directory noted above.
I used chmod 777 and chown (to user smrtanalysis) on that directory and the .run file.
I logged into the smrtanalysis user with su smrtanalysis, entered the password, and typed
./smrtanalyis-2.2.0.133377.run
the file runs, but then aborts with the following error message:
We recommend running this script as a designated SMRT Analysis user
(e.g. smrtanalysis) who will own all smrtpipe jobs and daemon
processes.
Current user is 'root' (primary group: root)
Installing as 'root' is currently not supported Switch to the desired
user and restart the install. Aborting installation...
Here is the install documentation:
https://github.com/PacificBiosciences/SMRT-Analysis/wiki/SMRT-Analysis-Software-Installation-v2.2.0
It seems pretty straightforward but I can't seem to get it working. If you guys look at the install docs, you'll probably be able to tell me what I'm doing wrong.
Thanks for any help!
Regards,
Nick
Change
SMRT_ROOT=/opt/smrtanalysis
to
SMRT_ROOT=/home/nick/smrtanalysis
and the rest should be easy.
Be very careful installing binaries from the internet; make sure you're confident in the source.
Just don't use root for that - judging by the error message, you ran the script as root by accident.
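A rough sketch of what that could look like in a terminal (assuming the .run file sits in /home/nick/smrtanalysis, which you already chowned to the smrtanalysis user; SMRT_ROOT is the variable from the PacBio wiki, and you should follow the wiki's exact installer invocation for your version):
# switch to the dedicated non-root user (this starts a new shell as smrtanalysis)
su - smrtanalysis
# point the install at a directory that user owns, per the answer above
export SMRT_ROOT=/home/nick/smrtanalysis
cd /home/nick/smrtanalysis
./smrtanalysis-2.2.0.133377.run   # use the exact filename you downloaded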
(update: pacbio team can help from the github page at https://github.com/PacificBiosciences/SMRT-Analysis/issues as well.)

Custom InstallAnywhere location for .com.zerog.registry.xml file on linux

I'm running into an issue where I do not have write access to the /var directory on a UNIX environment, and InstallAnywhere doesn't provide me the option of writing the .com.zerog.registry.xml to any other location for a product installation. Is there a parameter out there that allows for this file to be written to a different directory?
According to the IA docs:
If logged in as root, the global registry is located in /var.
If logged in as a user, it is located in the user’s home directory.
So, if you're running as root and can't write to /var, it sounds like a permissions problem with the /var directory, independent of IA. Check the permissions on /var.
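A quick way to look (a sketch; the output shown is just an illustration of typical defaults):
# -d shows the directory entry itself rather than its contents
ls -ld /var
# typically something like: drwxr-xr-x 12 root root 4096 ... /var
# root needs the write bit ("w") in the owner triplet to drop the registry file there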
If you're running as a non-root user, then the registry shouldn't be going to /var, but to $HOME/.com.zerog.registry.xml (FWIW, I just checked one of our test Linux boxes and found .com.zerog.registry.xml under both /var and under test-user $HOME directories. The docs appear to be correct).
I've also seen some very strange behavior if IA is low on space in $TMP. Make sure you have plenty of space there.
Also, have you considered running the installer with sudo, or the graphical equivalents kdesudo (KDE) and gksu (Gnome)? Those might get you where you want to go.

Fixing permissions after FTPing ASP.NET code to a Linux system

First off, I'm running Mono to run ASP.NET on Linux, but that's not the question.
It appears that, every time I clear out my application directory and upload, I have to go back in and fix the permissions. What I'm doing is
chmod -R -c 755 /var/www/*
...and there are two questions.
What's the deal with having to do this every time I FTP? Feels flaky.
Is there a better permissions set than 755? Do I want different permissions for the /bin directory? Or can I fix this all with one fell swoop of chown?
It could depend on your FTP server and configuration. I always used this and it worked:
chmod 777 /path/...
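If 777 is broader than you want, a common alternative (a sketch, not specific to Mono or any particular FTP server) is to set directories and files separately after each upload, since directories need the execute (search) bit but plain files usually don't:
find /var/www -type d -exec chmod 755 {} +
find /var/www -type f -exec chmod 644 {} +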

Resources