Issue Installing Composer on Hosted Web Server - linux

I’m having a heck of a time trying to install composer (to install Laravel) on my server. I’m accessing my web server via the built-in terminal in Coda 2. These are the commands I’ve been trying to run:
curl -sS https://getcomposer.org/installer | php
mv composer.phar /usr/local/bin/composer
The curl command executes fine, but when I try the move, I get the following error:
mv: inter-device move failed: `composer.phar' to `/usr/local/bin/composer'; unable to remove target: Read-only file system
I tried to run the move with sudo per the Composer website, but that results in these errors:
sudo: unable to stat /etc/sudoers: No such file or directory
sudo: no valid sudoers sources found, quitting
I’ve been trying to Google around to figure it out, but I haven’t had much success. I’m not too savvy with server issues like this, so it has been hard for me to figure out what is going on.
Thanks in advance.
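(One workaround that often applies on shared hosts, where /usr/local is read-only and sudo is unavailable, is a per-user install. The sketch below assumes that ~/bin exists, is writable, and is on your PATH; none of that is confirmed for this particular host.)
mkdir -p ~/bin
curl -sS https://getcomposer.org/installer | php -- --install-dir=$HOME/bin --filename=composer
composer --version   # should resolve to ~/bin/composer once PATH picks it up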

Related

Error: Could not connect to Redis at 127.0.0.1:6379: Connection refused

A detailed guide to installing Redis on Mac
Hello Everyone,
I recently stumbled upon a YouTube video, the Redis Crash Course by Brad on the Traversy Media channel (https://www.youtube.com/channel/UC29ju8bIPH5as8OGnQzwJyA). Below are the two problems I got stuck on while installing Redis.
1. I was unable to download Redis through the CLI, i.e. wget https://download.redis.io/releases/redis-6.2.6.tar.gz (note that I used curl, as wget was not functional).
2. I was unable to start redis-cli, and it kept failing with the error: Could not connect to Redis at 127.0.0.1:6379: Connection refused / not connected>
Below are the steps that I followed to install and run Redis successfully.
[Solution] Problem statement 1:
Instead of downloading through the CLI, I downloaded the tar.gz file directly. I downloaded the stable version 6.2.6 and then followed the CLI commands below.
$ tar xzf redis-6.2.6.tar.gz
$ cd redis-6.2.6
$ make
This made it easy to build the binary. After that, I followed the Redis documentation to run redis-server, and it worked fine.
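(For reference, after make the freshly built binaries sit in the src/ directory of the source tree; the two commands below are a sketch of the standard ways to run or install them as described in the Redis docs, not something specific to this setup.)
$ src/redis-server     # run the server straight from the source tree
$ sudo make install    # or copy redis-server, redis-cli, etc. into /usr/local/bin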
[Solution] Problem statement 2:
As I said, I was unable to run redis-cli even though I was able to run redis-server successfully. I went through several websites and Stack Overflow posts to understand the concept behind the error. That's when I realized that redis-server and redis-cli are two separate executables/processes, so for the client to work, the server has to be running already, either in the background or in another terminal.
Note: if you're starting redis-server in the same terminal, make sure to run it in the background using the command below.
redis-server --daemonize yes
This should solve the problem; now try using redis-cli and it should work perfectly.
You should now see the redis-cli prompt on port 6379 at the localhost IP; run a test PING and confirm it is connected.
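(A quick sanity check, assuming the default port and a local install:)
$ redis-server --daemonize yes
$ redis-cli ping       # should print PONG
$ redis-cli            # opens the interactive 127.0.0.1:6379> prompt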

ubuntu backup-manager Permission denied

I've recently installed Backup Manager onto my Ubuntu machine to get automated backups going. The problem is that when I go to set up the automation using this code -
it comes up saying this: "bash: /etc/backup-manager.sh: Permission denied"
I do not understand this error. I've tried changing the user who reads/writes the file to someone other than root, and that didn't work. I tried changing the chmod value from 770 to 700, and that still didn't work.
Any info on this is welcome. Thank you to those who help :)
For those wondering, I am using this tutorial given to me by the host: https://documentation.online.net/en/dedicated-server/tutorials/backup/configure-backup/start
I'm using the desktop version of Ubuntu 16, in case that is needed.
The sudo doesn't do what you want in this case. What happens is that the shell evaluates the redirection and attempts to open /etc/backup-manager.sh for you before the sudo cat even gets started. That fails because the shell is still running as your unprivileged user. You have to say sudo -i to open a new root shell, execute the commands there, and exit again.
Alternatively, you could try sudo nano /etc/backup-manager.sh and paste the contents there. This works because the editor is run as root and opens the file itself when you save.
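(To make the difference concrete, here is a hypothetical sketch; the actual contents written to /etc/backup-manager.sh come from the tutorial and are not shown here, so the echo text below is only a placeholder. The sudo tee variant is a common alternative not mentioned above: tee itself runs as root, so it can open the target file.)
sudo echo "placeholder" > /etc/backup-manager.sh                   # fails: your own shell opens the file
sudo -i                                                            # works: everything below runs in a root shell
echo "placeholder" > /etc/backup-manager.sh
exit
echo "placeholder" | sudo tee /etc/backup-manager.sh > /dev/null   # works: tee opens the file as root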

RStudio cannot reach .Rhistory on Ubuntu

I was working with a markdown file in RStudio. I have Ubuntu 14.04 on my laptop. I produce HTML files using knitr. I decided to clean my environment and added rm and gc commands at the end.
Now here is a message in my console window:
Error attempting to read history from ~/.Rhistory: permission denied (is the .Rhistory file owned by root?)
What does it mean? Is it bad for my code?
You are right - the first time you ran it, you were in sudo mode, and the .Rhistory file was created with root as the owner. Running RStudio as root would remove the symptom, but is not ideal. To be able to run it as a regular user, simply change the owner of the .Rhistory file:
sudo chown -c <user_name> .Rhistory
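(A quick way to confirm the diagnosis before changing the owner, assuming .Rhistory sits in your home directory:)
ls -l ~/.Rhistory                # the owner column shows root if R was ever started with sudo
sudo chown $USER ~/.Rhistory     # hand the file back to your regular user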
In the best traditions of Stack Overflow, I reply to my own question! The problem occurred because when I first started R, I did it as superuser:
sudo R
so I could install a lot of useful libraries into /usr/lib/R/site-library rather than into my account. As a result, .Rhistory became a root-owned file. It is still possible for RStudio to see it if RStudio is started as
sudo rstudio
and then all is fine.

PostgreSQL "cannot access the server configuration file (...) No such file or directory" after clean install

I just installed postgresql according to the official documentation:
But for some reason it doesn't work. It did install using sudo apt-get postgres... etc., but starting the server doesn't seem to work.
I tried starting the server according to their documentation, but Mr. Computer throws the following error at me when I enter this command:
Command:
user#user-noobcomputer:/usr/lib/postgresql/9.4$ bin/postgres -D /usr/local/pgsql/data/
Error:
postgres cannot access the server configuration file "/usr/local/pgsql/data/postgresql.conf": No such file or directory
I have no clue why this file doesn't exist. Can anyone help me find out how to get past this error message and get my postgres server up and running?
First, make sure the file is actually installed.
As the superuser, update the mlocate database and then run a locate query for the file:
$ updatedb
$ locate postgresql.conf
This will return any file with that name on your system.
If the file is sitting in the wrong folder, it will show up in that query.
If not, you can try the PostgreSQL docs; they have a basic example of what this file should look like.
Try making a copy of that sample file with your settings.
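(A hedged aside: when PostgreSQL comes from the Debian/Ubuntu packages, the configuration and data normally do not live under /usr/local/pgsql at all. The paths below assume the standard 9.4 packaging layout and may differ on your system.)
$ sudo -u postgres /usr/lib/postgresql/9.4/bin/postgres -D /var/lib/postgresql/9.4/main -c config_file=/etc/postgresql/9.4/main/postgresql.conf
$ sudo service postgresql start      # or simply let the packaged service wrapper pick the right paths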

Cloud9 on Raspberry Pi, Unable to save files

I'm trying to get the Cloud9 local server working on my Raspberry Pi (512 MB Model B, running Raspbian).
I followed this installation guide:
http://www.slickstreamer.info/2013/02/how-to-install-cloud-9-on-raspberry-pi.html
After this installation everything appeared to be working properly, I can start the server with the following command:
~/cloud9/bin/cloud9.sh -l 0.0.0.0 -w ~/Documents/www/workspace/
When I start the server, all the files in the workspace are displayed properly, and I can view, duplicate, delete, and create files remotely with no problem. But when I edit an existing file and try to save it remotely, a little spinning wheel pops up on the tab of the file I'm saving, and it continues to spin endlessly.
When I start the server, a warning also pops up saying: 'path.existsSync is now called fs.existsSync.' I'm not sure if that is relevant or not.
I found another thread somewhere saying that I should go through cloud9/configs/default.js and replace any instance of localhost with 0.0.0.0. I tried that, but it didn't fix my problem.
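(For anyone repeating that step, the replacement can be done in one pass with sed; this assumes the config really lives at ~/cloud9/configs/default.js as in the guide above, and keeping a backup of the file first is a good idea.)
cp ~/cloud9/configs/default.js ~/cloud9/configs/default.js.bak
sed -i 's/localhost/0.0.0.0/g' ~/cloud9/configs/default.js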
Does anyone have any suggestions about how to get cloud9 saving files properly?
Thanks in advance for your help.
There were several complaints about IDE file-saving hangs on Cloud9 support. At the bottom of the page there is a solution you can try.
I fully removed Cloud9 and Node (I followed these directions to remove Node: Uninstall Node.JS using Linux command line?), and then did a clean install following these directions: http://www.raspberrypi.org/phpBB3/viewtopic.php?f=63&t=30265. In addition to those commands, I also had to run the following:
sudo npm install formidable
sudo npm install gnu-tools
sudo npm install xmldom
After that I was able to start the Cloud9 server without issue, and now I'm able to save files.
Thanks for trying to help.
