Cassandra ccm not installed as application on windows 10 - cassandra

I have installed the Cassandra ccm tool on my Windows 10 machine, and it was installed in the directory 'C:\Python27\Scripts'.
I have also added that path to the PATH environment variable, but running ccm commands does not work:
ccm create -h
'ccm' is not recognized as an internal or external
command, operable program or batch file.
Is the installation of ccm wrong? I see it's not installed as an application.

The same question was asked on https://community.datastax.com/questions/11860/ so I'm reposting my answer here.
The most common causes of the reported error above are:
PYTHON_HOME is not set
PYTHON_HOME is not included in the PATH environment variable
As a side note, ccm has quite a few prerequisites to run on Windows, so I recommend running Cassandra in Docker or, better yet, using K8ssandra.io. Cheers!
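If you do want to try to get ccm working, a minimal sketch of setting the two variables from an elevated cmd prompt follows. The paths are illustrative, assuming the C:\Python27 install from the question; adjust them to your own layout.

```shell
:: Windows cmd — persist the variables for the current user.
:: C:\Python27 is taken from the question; adjust to your install.
setx PYTHON_HOME "C:\Python27"
setx PATH "%PATH%;%PYTHON_HOME%;%PYTHON_HOME%\Scripts"
:: Open a NEW terminal afterwards so the updated values are picked up.
```

Note that `setx` only affects new terminals, not the one you ran it in.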

I had to add .PY to the PATHEXT system variable.
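PATHEXT is the list of extensions that cmd treats as executable, so without .PY in it you would have to type ccm.py in full. A sketch of persisting that change from a cmd prompt (open a new terminal afterwards for it to take effect):

```shell
:: Append .PY to the user-level list of executable extensions,
:: so "ccm" resolves to the ccm.py script on PATH.
setx PATHEXT "%PATHEXT%;.PY"
```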

Related

.Net Core Linux - Docker - Local debugging with DB2

.Net Core (3.1) Web API using Docker (Linux container).
I have a Db2 connection via nuget package IBM.Data.DB2.Core-lnx (3.1.0.300).
This Db2 connection works fine when I build and run my Dockerfile independently of VisualStudio 2019.
However, when attempting to debug via VS (fast mode), I run into this exception:
Unable to load shared library 'libdb2.so' or one of its dependencies. In order to help diagnose loading problems, consider setting the LD_DEBUG environment variable: liblibdb2.so: cannot open shared object file: No such file or directory
My guess is this failure to locate resources is because of the way fast mode debugging works with Docker containers. With the app output copied to a mount, my IBM ENV variables aren't able to find the NuGet clidriver folder. These typically look like this:
ENV DB2_CLI_DRIVER_INSTALL_PATH="/app/clidriver" \
LD_LIBRARY_PATH="/app/clidriver/lib" \
LIBPATH="/app/clidriver/lib" \
PATH=$PATH:"/app/clidriver/bin:/app/clidriver/lib:/app/clidriver/adm"
How can I update these to point to the mounted app contents? Assuming that is the problem...
*Note that I am installing package libxml2-dev in the base of my Dockerfile.
If anyone has a successful strategy for debugging Db2 connections in a Linux container, I would love to hear what you've done. Much thanks in advance.
Running a shell on the debugging container allowed me to see the mounted contents and get the clidriver path. Setting this in the Db2 environment variables fixed the issue:
ENV DB2_CLI_DRIVER_INSTALL_PATH="/app/bin/Debug/netcoreapp3.1/clidriver" \
LD_LIBRARY_PATH="/app/bin/Debug/netcoreapp3.1/clidriver/lib" \
LIBPATH="/app/bin/Debug/netcoreapp3.1/clidriver/lib" \
PATH=$PATH:"/app/bin/Debug/netcoreapp3.1/clidriver/bin:/app/bin/Debug/netcoreapp3.1/clidriver/lib:/app/bin/Debug/netcoreapp3.1/clidriver/adm"
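The "run a shell on the container" step above can be sketched with plain shell commands. The snippet below simulates the mounted layout locally under /tmp so it is self-contained; inside a real container you would run the same `find` from /app (e.g. after `docker exec -it <container> /bin/sh`), and the actual path depends on your target framework and build configuration.

```shell
# Simulate the build-output layout that fast-mode debugging mounts
# into the container (netcoreapp3.1 is taken from the answer above).
mkdir -p /tmp/app/bin/Debug/netcoreapp3.1/clidriver/lib

# Locate the clidriver folder the same way you would inside the container.
find /tmp/app -type d -name clidriver
```

Whatever path this prints is what the DB2 environment variables need to point at.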
I was facing this issue for almost a week; glad I could resolve it.
Solution:
LD_LIBRARY_PATH=<app release folder path>/netcoreapp3.1/clidriver/lib
When you include the LD_LIBRARY_PATH variable, its value gets appended to the existing values along with "../root/usr/lib64".
Please note: if you don't include LD_LIBRARY_PATH, the pod will error out.

DataStax Bulk Loader for Apache Cassandra isn't installing on Windows

I'm trying to install DataStax Bulk Loader on my Windows machine in order to import a JSON file into a Cassandra database. I just followed the installation instructions from the official website; it's just unpacking the folder. Running dsbulk from any directory in cmd prints the following result: "dsbulk" is not recognized as an internal or external command, operable program or batch file. However, I added C:\DSBulk\dsbulk-1.7.0\bin to the PATH variable. Anyone who faced this problem, what did you do? Thanks :D
Change into the bin/ directory where you unzipped the package. For example:
C:> cd C:\DSBulk\dsbulk-1.7.0\bin
Then run the dsbulk.cmd from there.
NOTE: Make sure you have both the classpath and JAVA_HOME set in your environment. Cheers!
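The underlying failure mode generalizes beyond Windows: a bare command name only resolves when the directory containing it is on PATH. A minimal POSIX-shell illustration (the launcher script and version string below are made up for the demo, not the real dsbulk):

```shell
# Create a fake "dsbulk" launcher in a directory that is not yet on PATH.
mkdir -p /tmp/dsbulk-demo/bin
printf '#!/bin/sh\necho dsbulk 1.7.0\n' > /tmp/dsbulk-demo/bin/dsbulk
chmod +x /tmp/dsbulk-demo/bin/dsbulk

# Once the directory is appended to PATH, the bare command resolves.
PATH="$PATH:/tmp/dsbulk-demo/bin" dsbulk
```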

Can't get spark to start

I have successfully installed and run Apache Spark in the past on my machine. Today I returned to it and tried to run it using bin/spark-shell in the Spark directory (the bin folder exists in this directory), but I am getting:
bin is not recognized as an internal or external command,
operable program or batch file.
It's running in the Windows 10 cmd shell, in case this is helpful. What could cause this?
I believe we need more info to be able to answer your question.
Using './' specifies a path starting in the root of your working directory (Bash or PowerShell).
Are you running this in the cmd shell/PowerShell/Bash shell?
What directory are you working in when trying to execute your command?
Is there a bin folder in your current directory? (ls command or dir command)
JAVA_HOME was outdated... I had updated Java without updating the path! That was the problem.
Check version of java installed and location where environment variable JAVA_HOME is pointing to.
In my case JAVA_HOME = C:\Program Files\Java\jdk1.7.0_79 (this is old version)
The cause of this issue was that I installed a new version of JDK and removed the previous installation but JAVA_HOME was pointing to the old environment which was missing.
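A quick way to catch a stale JAVA_HOME like the one described above is to check that it still points at a directory containing an executable bin/java. A POSIX-shell sketch; the path below is deliberately a nonexistent stale value (modeled on the jdk1.7.0_79 install in the answer) to show the failure case:

```shell
# Simulate the stale situation: JAVA_HOME points at a removed JDK.
JAVA_HOME="/opt/jdk1.7.0_79"   # illustrative stale value

if [ -x "$JAVA_HOME/bin/java" ]; then
  echo "JAVA_HOME looks valid: $JAVA_HOME"
else
  echo "JAVA_HOME is stale or wrong: $JAVA_HOME"
fi
```

On Windows the equivalent check is `echo %JAVA_HOME%` followed by `"%JAVA_HOME%\bin\java" -version` in cmd.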

Cassandra Nodetool can't find NodeCmd from Git Bash

I'm running Cassandra 2.0.9 with Java 1.7.0 on Windows. I can run nodetool normally from the Windows command line, but I'm not able to run it from Git Bash (directly from the terminal or through an sh script) or from Cygwin when running .sh files that call nodetool (though Cygwin can otherwise run it).
The exact error I get is:
Error: Could not find or load main class org.apache.cassandra.tools.NodeCmd
I haven't done any extra configuration outside of the changes DataStax recommends, and I haven't had any other issues with Cassandra. I don't think I have any environment screw-ups (but who knows what could be wrong). Has anyone else run into this issue before? Thanks!

Running Azure node.js Tools on Ubuntu

I have followed these instructions.
And as far as I can tell I have successfully installed node.js azure tools. No error - nothing to suggest it failed.
However, I cannot simply run "azure" as the documentation says...
Maybe there is something I am missing with node.js?
There are a few problems you may be experiencing.
First of all, I would ensure you are running Node.js v0.6.20. You can do this by opening the command prompt and running:
node -v
You should have v0.6.20 echoed back.
If this doesn't work, you may be missing a path variable to Node.js or the NPM cache. Verify the Environment variables exist by running [in the command prompt]:
path
you should see two paths:
%appdata%\npm
[x64 Machine]
%programfiles(x86)%\nodejs\
[x86 Machine]
%programfiles%\nodejs\
If this doesn't work, I would check to ensure that the azure module was loaded into the %appdata%\npm\node_modules directory.
It could be a PATH issue. In my case, the azure program is located at ~/.npm-global/bin.
Run "export PATH=$PATH:~/.npm-global/bin", or add that line to your shell startup file (e.g. ~/.bashrc).
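The export-and-run flow above can be demonstrated end to end. The snippet uses a stand-in script under /tmp rather than the real azure CLI, so the program name and directory are placeholders for the demo:

```shell
# Stand-in for a globally installed npm binary (the real one would
# live under ~/.npm-global/bin as described in the answer).
mkdir -p /tmp/npm-global/bin
printf '#!/bin/sh\necho azure cli ok\n' > /tmp/npm-global/bin/azure-demo
chmod +x /tmp/npm-global/bin/azure-demo

# Export the directory onto PATH; the bare command now resolves.
export PATH="$PATH:/tmp/npm-global/bin"
azure-demo

# To make this permanent, append the export line to your startup file:
#   echo 'export PATH=$PATH:~/.npm-global/bin' >> ~/.bashrc
```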
