how to load schema file into Cassandra with cqlsh - cassandra

I have a schema file for Cassandra. I'm using a Windows 7 machine (Cassandra is on this machine as well - 1 node). I want to load the schema with cqlsh. So far I have not been able to find out how. I was hoping to be able to pass the file to cqlsh: cqlsh mySchemaFile. However, since I run on Windows, to start cqlsh I do the following:
python "C:\Program Files (x86)\DataStax Community\apache-cassandra\bin\cqlsh" localhost 9160
Even though I have cqlsh in my path, when called like this from python it needs the full path.
I tried to add the file name in there, but no luck so far.
Is this even possible?

cqlsh takes a file to execute via the -f or --file option, not as a positional argument (like the host and port), so the correct form would be:
python "C:\Program Files (x86)\DataStax Community\apache-cassandra\bin\cqlsh" localhost 9160 -f mySchemaFile
Note: the flag is -f on Windows as well; cqlsh is a Python script, so its option parsing does not change between platforms.
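For reference, a minimal schema file (assuming a CQL3-capable cqlsh; the keyspace and table names here are only illustrative) might contain:
CREATE KEYSPACE demo WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};
USE demo;
CREATE TABLE users (user_id uuid PRIMARY KEY, name text, email text);
Saved as mySchemaFile and passed with -f, cqlsh executes the statements in order and then exits.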

Related

Move db2 from Windows to Linux

I'm trying to move a DB2 database from a Windows to a Linux server. When I move the data to the Linux db with this command:
db2move DBNAME load -lo REPLACE -u userID -p password > load_remote.txt
I had this error:
SQLCODE: -3126 - SQLSTATE:
SQL3126N Remote client requires absolute path for files and directories.
Thanks.
Do you mean to use the 'load client' syntax (instead of just load)?
See the details in the documentation.
The LOAD command requires that the files to be loaded are already on the Db2-target-server.
The LOAD CLIENT alternative allows the files to be on a remotely connected Db2-client (or on your Windows Db2-server if that is the source machine).
You can also just copy the IXF files to the Linux Db2-server, and open an SSH session to that Linux environment and run the LOAD command there. Your choice.
As with the LOAD command, LOAD CLIENT operates on one file at a time (in your case, one file per table) unless you use the lobsinsepfiles option, or other special cases.
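As a rough sketch, a LOAD CLIENT run from the Windows client against the remote Linux database could look like this (the schema, table, and file path are placeholders, not taken from the question):
db2 connect to DBNAME user userID using password
db2 load client from C:\export\tab1.ixf of ixf replace into MYSCHEMA.TAB1
You would repeat the LOAD for each IXF file produced by db2move export, or simply copy the whole export directory to the Linux server and run db2move DBNAME load there.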

How can we set nodetool and cqlsh to be run from anywhere and by any user on linux server

I am trying to set up environment variables so that any user on a particular server can run commands like nodetool or cqlsh from anywhere in the Linux file system, saving the effort of traversing to the bin directory every time.
How can we achieve this? My DSE 4.8 is a tarball install.
Nodetool is usually available to any user that has execute privileges on your Linux boxes.
For cqlsh, you can set any configuration inside the cqlshrc file (usually found in $HOME/.cassandra/cqlshrc); we have used it to enable client-node encryption, but it has more configurable options.
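As a rough illustration (the host, port, and certificate path are placeholders, and the available options vary by Cassandra/DSE version), a cqlshrc enabling client-to-node encryption might contain:
[connection]
hostname = 127.0.0.1
port = 9042
[ssl]
certfile = /etc/dse/certs/node.cer
validate = true
You would then start cqlsh with the --ssl flag.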
To set up the environment variables, follow these steps as the root user:
# vi /etc/profile.d/cassandra.sh
Add the following lines to the cassandra.sh file:
export CASSANDRA_HOME=/opt/apache-cassandra-3.0.8
export CASSANDRA_CONF_DIR=/opt/apache-cassandra-3.0.8/conf
Here /opt/ is the directory where I've extracted my apache-cassandra-3.0.8-bin.tar.gz tarball.
After adding those lines to cassandra.sh, save and exit. Then:
# source /etc/profile.d/cassandra.sh
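Note that CASSANDRA_HOME and CASSANDRA_CONF_DIR by themselves don't put the binaries on the search path; to actually run nodetool and cqlsh from any directory, you would also add the bin directory to PATH in the same cassandra.sh, for example:
export PATH=$PATH:$CASSANDRA_HOME/bin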

Could not connect to cassandra with cqlsh

I want to connect to Cassandra but I get this error:
$ bin/cqlsh
Connection error: ('Unable to connect to any servers', {'192.168.1.200': error(10061, "Tried connecting to [('192.168.1.200', 9042)]. Last error: No connection could be made because the target machine actively refused it")})
Pretty simple.
The machine is actively refusing it because your system does not have Cassandra running on it. Follow these steps to get rid of this trouble:
Install Cassandra from DataStax (Datastax-DDC; Cassandra version 3).
Go to ~\installation\path\DataStax-DDC\apache-cassandra\bin.
Open up cmd there. (Use Alt+F+P to open it if you are on Windows 8 or later.)
Type cassandra -f. This will generate a lot of output in the window, and the last line should be something like INFO 11:32:31 Created default superuser role 'cassandra'.
Now open another cmd window in the same folder.
Type cqlsh
This should give you a prompt, without any error.
I also discovered that this error doesn't pop up if I use Cassandra v2.x, found here: Archived version of Cassandra. I don't know why :( (if you find out, please comment).
So, if the above steps do not work, you can always go back to Cassandra v2.x.
Cheers.
Check if you have started the Cassandra server, then provide the host and port as the arguments.
$ bin/cqlsh 127.0.0.1 9042
I ran into the same problem. This worked for me.
Go to any directory, for example E:\ (it doesn't have to be on the same disk as the Cassandra installation).
Create the following directories:
E:\cassandra\storage\commitlogs
E:\cassandra\storage\data
E:\cassandra\storage\savedcaches
Then go to your Cassandra installation's conf path. In my case:
D:\DataStax-DDC\apache-cassandra\conf
Open cassandra.yaml. Edit the lines containing data_file_directories, commitlog_directory, and saved_caches_directory to look like the code below (change the paths according to where you created the folders):
data_file_directories:
- E:\cassandra\storage\data
commitlog_directory: E:\cassandra\storage\commitlogs
saved_caches_directory: E:\cassandra\storage\savedcaches
Then open cmd (I did it as administrator, but didn't check whether that is necessary) in your Cassandra installation's bin path. In my case:
D:\DataStax-DDC\apache-cassandra\bin
run cassandra -f
Lots of stuff will be logged to your screen.
You should now be able to run cqlsh and all other stuff without problems.
Edit: The operating system was Windows 10, 64-bit.
Edit 2: If it stops working after a while, check whether the service is still running using nodetool status. If it isn't, follow this instruction.
I also faced the same problem on a 32-bit Windows 7 machine.
Check if you have Java installed correctly and the JAVA_HOME variable set.
Once you have checked the Java installation and set JAVA_HOME, uninstall Cassandra and install it again.
Hopefully this will solve the problem. Mine was solved after applying the above two steps.
You need to provide the host, user, and password for the cqlsh connection. The default Cassandra cqlsh user is cassandra and the password is cassandra.
$ bin/cqlsh <host> -u cassandra -p cassandra
I also had the same problem. I tried many methods found on Google and YouTube, but none of them worked in my case. Finally, I applied the following 3 steps and it worked:
Create a folder without any space in C or D, whichever is your system drive, e.g. C:\cassandra
Install Cassandra in this folder instead of installing it in "Program Files".
After installation, it will be like this- C:\cassandra\apache-cassandra-3.11.6
Copy Python 2.7 into the bin folder, i.e., C:\cassandra\apache-cassandra-3.11.6\bin.
Now your program is ready for work.
There is no special method for connecting with cqlsh; it is as simple as below:
$ bin/cqlsh 127.0.0.1 9042 (replace 127.0.0.1 with the host IP, and use port 9160 for older versions of Cassandra)
Don't forget to check port connectivity if you are connecting cqlsh to a remote host. You can also use a username/password if you have enabled authentication; by default it is disabled.
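A quick way to check connectivity to a remote node, assuming the native transport port 9042 (the host below is a placeholder):
$ nc -zv 10.0.0.5 9042
or, from Windows PowerShell:
Test-NetConnection 10.0.0.5 -Port 9042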

MySQL backup with CronJOb

I am trying to export MySQL data with a cron job in cPanel, and I am adding the below code into the command line:
/usr/bin/mysqldump --opt -u krystald_fred -p'tY$645=&nm' max_joomla > /home/max/db-backup.sql
After the cron job runs, when I check db-backup.sql I get a blank file with no data inside.
What's wrong with this command line? Can anyone guide me to fix this?
Thanks.
Not enough info there. There could be many issues. Just run it from the command line with the -v verbose flag and see. You might also need to connect via port 3306 and specify the hostname. It all depends on your installation.
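As a sketch, running the dump interactively with verbosity and an explicit host and port (localhost here is a placeholder for your MySQL host) would look something like:
/usr/bin/mysqldump -v -h localhost -P 3306 --opt -u krystald_fred -p'tY$645=&nm' max_joomla > /home/max/db-backup.sql 2> /home/max/db-backup.log
Any errors (bad credentials, unknown database, connection refused) then end up in db-backup.log instead of disappearing silently under cron.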

PDI hadoop file browser no list

I have a Hadoop single-instance cluster configured to run with some IP address (instead of localhost) on CentOS Linux. I was able to execute an example MapReduce job correctly, which tells me that the Hadoop setup appears to be fine.
I have also added a couple of data files to Hadoop under the "/data" folder, and they are visible through the "dfs" command:
bin/hadoop dfs -ls /data
I am trying to connect to this HDFS system from PDI/Kettle. In the HDFS file browser, if I put the HDFS connection parameters in incorrectly, e.g. the wrong port, it says it cannot connect to the HDFS server. If I put in all the parameters correctly (server, port, user, password) and click 'connect', it does not give the error, meaning it is able to connect. But the file list shows only "/".
It doesn't show the data folder. What could be going wrong?
I've already tried this :
tried chmod 777 on the data files using "bin/hadoop dfs -chmod -R 777 /data"
tried using root and also the hdfs Linux user in the PDI file browser
tried adding the data files in some other location
tried re-formatting HDFS several times and adding the data files again
tried copying the hadoop-core jar file from the Hadoop installation to PDI's extlib
but it still does not list the files in the PDI browser. I cannot see anything in the PDI log either... Need quick help... thanks!!!
-abhay
I got past this issue. On Windows, PDI was not logging anything in the log file. I tried the same thing on Linux, where the log showed that it was missing a library from Apache, commons-configuration. I downloaded the latest version, put it under the extlib/pentaho folder, and boom! It worked!
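In shell terms the fix amounts to something like this (the jar version and PDI install path below are placeholders; use whatever matches your download and installation):
cp commons-configuration-1.10.jar /opt/pdi/extlib/pentaho/
Then restart Spoon/PDI so the new jar is picked up.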
