Azure: Regain sudo/su Access

While trying to install a GUI application today I found myself having to disable waagent. Turns out the NetworkManager packages are not compatible with waagent. A few, obviously outdated, posts on the Microsoft forums had me run the following process to get the packages to install:
# yum remove WALinuxAgent
# yum install NetworkManager
... do desktop installs ...
# yum remove NetworkManager
# yum install WALinuxAgent
# /usr/sbin/waagent --install
So I did that.
Now I can no longer access root or any sudo commands from my default login.
Without higher level privs I cannot perform any of the "fixes" that are noted in multiple forum posts.
Is there any way to find out what the default root password is on my Azure CentOS 6.4 image? Or to restore sudo access to my default login when I can no longer run any sudo commands?
Is this image hosed? It is running but without elevated privs it is kind of useless as I cannot maintain the system.
Suggestions?

After multiple discussions, at length, with a senior Azure engineer at Microsoft, the bottom line is "the image is hosed". Because of the way the CentOS (and other Linux) images are built, if you remove waagent without deprovisioning it first, you obliterate all access to any elevated-privilege account.
Microsoft's Azure team has bumped the request to allow for console level access to the Linux images, but there is no ETA or even a confirmation that this feature will ever be considered.
For now the only answer is "rebuild the system on a new image".
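For context, "deprovisioning" here refers to the agent's own deprovision operation, which is normally run only when a VM is about to be captured as a generalized image (the +user form also deletes the provisioned account). A minimal illustration, assuming the classic waagent options; exact flags vary by agent version:
/usr/sbin/waagent -deprovision -force
/usr/sbin/waagent -deprovision+user -force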

Related

R package management on Linux

I have two accounts on a linux server, one with sudo power, one without.
When I install packages using the account with sudo power, it all works fine.
But when I log in using the other account, the one without sudo power, it tells me the library is not found.
Is there a way to solve this, like changing the permissions of the library, or installing it globally?
I have to use that account since all the apps are running on it.
After I checked my R package locations, I found that all the new packages had been installed under my personal directory. Once I moved them all to /usr/share/R/library, the problem was solved.
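A quick way to check the library paths and install into the site library from the shell (the package name is just an illustration, and /usr/share/R/library is assumed to be the site library, as in the answer above):
Rscript -e 'print(.libPaths())'
sudo Rscript -e 'install.packages("data.table", lib = "/usr/share/R/library", repos = "https://cloud.r-project.org")'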

Best Approach to installing Node.js/npm without sudo

I've been looking around for the best/most appropriate way to install Node.js/npm so that commands like npm install -g bower do not require sudo, since using sudo for such commands can cause issues later on. Initially I followed this answer: Installing with nvm, but that installs it into the user's home directory, which I read may not be a good idea in production. So I followed an expansion of the above tutorial: Installing with NVM (DigitalOcean). However, that still left me requiring sudo.
On a side note, on my MacBook I installed Node with Homebrew. Is this a good idea, or is there a more standard approach?
Thanks for all your help, feel free to ask for clarifications.
I forgot to say: the machine I am planning to install this on is running Xubuntu 14.04 (I also have my MacBook running Mavericks, but that is just an aside).
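Regarding the side note: Homebrew typically installs Node under a prefix the invoking user owns, so global npm installs usually work without sudo there. A quick check, with bower used purely as an example package:
brew install node
npm config get prefix
npm install -g bower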
Sudo gives you permission to change/add/remove files not owned by your user. As a rule, that is everything except /home/YOU (on macOS: /Users/YOU).
Your desire is to have Node installed in the appropriate place (system-wide, rather than in your home directory), and that is good. As you guessed, you need sudo to install it on a system path initially.
But then you wish to have modules installed without sudo, meaning you want the modules to live in a directory your user has write access to. That would be available by default if Node were installed in your home directory.
To get the same thing on a system path, you will need to give yourself write permission to the folders where modules are located, that is, change the write permissions or ownership of:
/usr/local/share/npm/lib/node_modules, so that modules can be saved on your disk.
/usr/local/share/npm/bin, to allow modules executables be reachable.
You might have to alter a few other folders as well.
That answers your question, but I strongly recommend not doing so. Instead, stick to the default methodology: it is perfectly safe to use sudo when installing modules globally, and it is even safer not to have write permission to the global infrastructure of your install without superuser privileges.
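A sketch of the ownership change described above (the paths are the answer's assumptions; check yours with npm config get prefix):
sudo chown -R "$USER" /usr/local/share/npm/lib/node_modules
sudo chown -R "$USER" /usr/local/share/npm/bin
A commonly used alternative, not from this answer, is to point npm's global prefix at a user-owned directory instead:
npm config set prefix "$HOME/.npm-global"
export PATH="$HOME/.npm-global/bin:$PATH"
npm install -g bower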

Installing software without sudo in CentOS for a single user only?

I am using CentOS 6, for which I do not have sudo access. I have a user account and have full access to that account. Is there a way to install packages/software for a particular user in CentOS?
Just copy the executables into your home directory. You may also want to add that directory to your PATH variable. Many people have a ~/bin directory for this kind of thing.
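A minimal sketch of that suggestion (the copied binary is just a placeholder for illustration, and the PATH line assumes bash):
mkdir -p "$HOME/bin"
cp /path/to/some-tool "$HOME/bin/"
echo 'export PATH="$HOME/bin:$PATH"' >> "$HOME/.bashrc"
source "$HOME/.bashrc"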

XAMPP or any other service tool in /opt? Security

I am developing with XAMPP for Linux and Tomcat (similar to XAMPP on Windows). Many programs, such as IDEA, Tomcat and XAMPP, are recommended to be installed under /opt. Now I have heard that it is not recommended to run services as root, but on Ubuntu (which I am using) unpacking anything into /opt implies that it belongs to the root owner and root group. This may be specific to XAMPP, as per the instructions on their Linux page:
Step 2: Installation
After downloading, simply type in the following commands:
Go to a Linux shell and login as the system administrator root:
su
Extract the downloaded archive file to /opt:
tar xvfz xampp-linux-1.8.1.tar.gz -C /opt
Warning: Please use only this command to install XAMPP. DON'T use any Microsoft Windows tools to extract the archive, it won't work.
Warning 2: already installed XAMPP versions get overwritten by this command.
That's all. XAMPP is now installed below the /opt/lampp directory.
Step 3: Start
To start XAMPP simply call this command:
/opt/lampp/lampp start
Placing it here implies that Apache must be run as root, as one is only able to run it with sudo on Ubuntu.
This may be an issue specific to Ubuntu. Is it? Because XAMPP is a development tool, I posted this here, as I am more likely to find an appropriate answer from developers who use it on Ubuntu (and other Linux systems). I would appreciate any information on whether the same problem occurs on other systems. I notice my production environment has Tomcat installed in /opt too, but it belongs to tomcat:tomcat.
The question here is how to get around this for all tools under /opt, because even though XAMPP may not be the tool for my needs, I still want to place Tomcat under /opt to replicate my production environment, and the same thing will surely happen, unless this is just an Ubuntu issue?
Ubuntu and some other distributions differ from the general Linux approach in that the account you create when installing the OS is added to specific groups, which can be viewed with the following command:
groups username
You will notice that root is not amongst these. It is also not possible to log in or su to the root account directly. Instead, your account has most likely been granted the right to use sudo via /etc/sudoers (on Ubuntu, through membership of the sudo or admin group); the sudo binary itself is owned by root:root and is setuid root.
Thus launching services from /opt does not run them as root.
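A sketch of one common way to keep a service in /opt from running as root (the /opt/tomcat path and the tomcat account are illustrative assumptions, mirroring the tomcat:tomcat ownership mentioned in the question):
groups "$USER"
sudo useradd -r -s /usr/sbin/nologin tomcat
sudo chown -R tomcat:tomcat /opt/tomcat
sudo -u tomcat /opt/tomcat/bin/startup.sh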

RCP P2 updates in multi-user environment from read-only installation

I have created an Ubuntu package to install my RCP app. The installed files are owned by root. Is it possible for a user to subsequently install updates through P2? Documentation about Eclipse multi-user installs suggests that it is possible, along with the answer to this question.
However, when I start up the application, it does not automatically check for updates as usual, and the Update Site that I had specified in p2.inf is not listed in the "Install New Software..." dialog.
Using the -configuration or -data runtime options did not help.
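For clarity, the invocation being described looks roughly like this (the executable name and the exact paths are illustrative):
/opt/<my_app_installation>/my_app -configuration "$HOME/.my_app_files/configuration" -data "$HOME/.my_app_files/workspace"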
I can make it work with a hack by running sudo chown -R <my_username> /opt/<my_app_installation>. When I subsequently launch the application, it does properly check for updates on startup, and my update site is properly listed in the "Install New Software..." dialog. Certainly I would prefer that whatever data it is writing to that installation directory be instead written to the user's home directory.
Supplementary info:
Here is a list of files and folders that showed up in my installation directory only after the directory was given ownership by <my_username> and the program was run.
/opt/<my_app_installation>/configuration/org.eclipse.core.runtime
/opt/<my_app_installation>/configuration/org.eclipse.equinox.app
/opt/<my_app_installation>/configuration/org.eclipse.osgi
/opt/<my_app_installation>/p2/org.eclipse.equinox.p2.engine/profileRegistry/profile.profile/1339896994308.profile.gz
/opt/<my_app_installation>/p2/org.eclipse.equinox.p2.engine/profileRegistry/profile.profile/.data/.settings/org.eclipse.equinox.p2.ui.sdk.prefs
/opt/<my_app_installation>/p2/org.eclipse.equinox.p2.engine/profileRegistry/profile.profile/.data/.settings/org.eclipse.equinox.p2.ui.sdk.scheduler.prefs
/opt/<my_app_installation>/p2/org.eclipse.equinox.p2.repository
More experimental results:
Even with a writable (chown'd) installation directory, no files are placed there when the -configuration $HOME/.my_app_files runtime option is supplied.
There are a lot of limitations in p2 itself for shared installs. AFAIK there is no significant improvement in the latest release, Juno, either.
But a developer from Red Hat is working on p2 installs via RPM packages; you can read about his progress in his blog post. The work and ideas could carry over to Debian packaging as well.
