Run jupyter notebook in screen in a conda environment - Linux

I am trying to run jupyter notebook from a server through an SSH tunnel in a screen window with
screen jupyter notebook --no-browser --port=8888
It works, but I am not able to let it run in a conda environment I have created.
I tried creating a screen where I activate the environment and then call the notebook:
source activate env37    (meaning Python 3.7)
(env37) user@server:~$ jupyter notebook --no-browser --port 8889
[I 07:24:11.610 NotebookApp] [nb_conda_kernels] enabled, 4 kernels found
[I 07:24:12.242 NotebookApp] [nb_anacondacloud] enabled
[I 07:24:12.297 NotebookApp] ✓ nbpresent HTML export ENABLED
But it didn't work, since in the notebook I see this:
3.5.4 |Anaconda custom (64-bit)| (default, Nov 20 2017, 18:44:38)
--> Python is 3.5.
Other ideas?
Thanks

I suppose you should change the IPython/notebook kernel rather than changing the environment; check this answer. I hope that solves your problem!
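For example, a minimal sketch of registering env37 as its own kernel (assumes ipykernel is available in that environment; the display name is just an example):

source activate env37
conda install ipykernel                       # only needed if ipykernel is not already installed in env37
python -m ipykernel install --user --name env37 --display-name "Python 3.7 (env37)"

After that, the notebook started from screen can stay in the base environment; you just pick "Python 3.7 (env37)" from Kernel -> Change kernel.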

Related

Fixing jupyter notebook on AWS

I'm trying to set up jupyter notebook to run on AWS. I followed this guide - https://medium.com/@alexjsanchez/python-3-notebooks-on-aws-ec2-in-15-mostly-easy-steps-2ec5e662c6c6.
It worked, but I now realise that every time I launch jupyter on my local machine it asks for a password, and the saved password no longer works, effectively locking me out of jupyter.
I presume it's because I set up the SSH connection to port 8888, which is what jupyter uses by default. However, I've removed the SSH config file from step 10 and nothing changed.
I'm stumped as to why jupyter no longer launches correctly. Does anyone have any ideas on how to correct this?
I've also noticed that a token is given when launching jupyter from an EC2 instance, but in my case none is given:
# Launching jupyter from the EC2 environment
(base) [ec2-user@ip-172-31-59-151 ~]$ jupyter lab --no-browser
[I 14:49:49.917 LabApp] JupyterLab extension loaded from /home/ec2-user/anaconda3/lib/python3.8/site-packages/jupyterlab
[I 14:49:49.917 LabApp] JupyterLab application directory is /home/ec2-user/anaconda3/share/jupyter/lab
[I 14:49:49.919 LabApp] Serving notebooks from local directory: /home/ec2-user
[I 14:49:49.919 LabApp] Jupyter Notebook 6.1.4 is running at:
[I 14:49:49.919 LabApp] http://localhost:8888/
[I 14:49:49.920 LabApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
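(A quick way to see what the server on port 8888 is actually expecting is sketched below. Both commands are part of the standard notebook CLI; whether they apply to this particular setup is an assumption. If a password has been configured, the startup log typically will not print a token, which may be why none shows up here.)

# on the EC2 instance: list running servers together with their tokens/URLs
jupyter notebook list
# if the saved password no longer works, set a fresh one
jupyter notebook password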

jupyter notebook can't detect conda kernels, but only when started on boot

I'm trying to automatically start a jupyter server on boot (EC2, Amazon Linux).
I got the jupyter server working and correctly loading all the conda kernels,
and then I tried to run jupyter on boot (I tried rc.local, systemd and crontab, and all of them fail to load the conda kernels, loading only the basic python2 and python3 kernels).
I've traced it down to nb_conda_kernels not being able to call 'conda' on boot, as below:
Oct 30 01:07:38 ip-172-31-17-102 rc.local: [E 01:07:38.816 NotebookApp] [nb_conda_kernels] couldn't call conda:
Oct 30 01:07:38 ip-172-31-17-102 rc.local: [Errno 2] No such file or directory: 'conda': 'conda'
Oct 30 01:07:38 ip-172-31-17-102 rc.local: [I 01:07:38.816 NotebookApp] [nb_conda_kernels] enabled, 0 kernels found
Oct 30 01:07:39 ip-172-31-17-102 rc.local: [I 01:07:39.645 NotebookApp] Loading IPython parallel extension
Oct 30 01:07:39 ip-172-31-17-102 rc.local: [I 01:07:39.701 NotebookApp] JupyterLab extension loaded from /home/ec2-user/anaconda3/lib/python3.7/site-packages/jupyterlab
Oct 30 01:07:39 ip-172-31-17-102 rc.local: [I 01:07:39.702 NotebookApp] JupyterLab application directory is /home/ec2-user/anaconda3/share/jupyter/lab
Oct 30 01:07:40 ip-172-31-17-102 rc.local: [I 01:07:40.061 NotebookApp] [nb_conda] enabled
Oct 30 01:07:40 ip-172-31-17-102 rc.local: [I 01:07:40.061 NotebookApp] Serving notebooks from local directory: /home/ec2-user/browse
Oct 30 01:07:40 ip-172-31-17-102 rc.local: [I 01:07:40.061 NotebookApp] The Jupyter Notebook is running at:
Oct 30 01:07:40 ip-172-31-17-102 rc.local: [I 01:07:40.061 NotebookApp] http://172.31.17.102:8888/
Oct 30 01:07:40 ip-172-31-17-102 rc.local: [I 01:07:40.061 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
The weirdest thing is this: when I kill the running server with "sudo ss --tulpn | grep 8888" (to find the pid of the jupyter notebook listening on port 8888) and "sudo kill -9 {pid}" (checking that the process was actually killed), and then restart the server with the same command used at boot, jupyter successfully loads the conda kernels, as below:
[I 01:12:51.369 NotebookApp] [nb_conda_kernels] enabled, 22 kernels found
[I 01:12:51.612 NotebookApp] Loading IPython parallel extension
[I 01:12:51.641 NotebookApp] JupyterLab extension loaded from /home/ec2-user/anaconda3/lib/python3.7/site-packages/jupyterlab
[I 01:12:51.641 NotebookApp] JupyterLab application directory is /home/ec2-user/anaconda3/share/jupyter/lab
[I 01:12:51.791 NotebookApp] [nb_conda] enabled
[I 01:12:51.792 NotebookApp] Serving notebooks from local directory: /home/ec2-user/browse
[I 01:12:51.792 NotebookApp] The Jupyter Notebook is running at:
[I 01:12:51.792 NotebookApp] http://172.31.17.102:8888/
[I 01:12:51.792 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
I've seen one particular question on Stack Overflow about this issue (Jupyter notebook can't find kernel when run through /etc/rc.local),
but the issue here is that when I type "jupyter kernelspec list", I only get python3 and python2, even when jupyter has already loaded the conda kernels successfully.
python3 /home/ec2-user/anaconda3/share/jupyter/kernels/python3
python2 /usr/share/jupyter/kernels/python2
Also, I've tried changing the environment spec manager in the jupyter config file to see if it solves the issue, but it didn't (same error log, "couldn't call conda").
What is the problem here?
conda has not been configured yet when rc.local is executed. This is usually done through the .bashrc file. But you can try to just add it to the rc.local script (before launching jupyter):
source /home/ec2-user/anaconda3/etc/profile.d/conda.sh
This should set up the use of conda.
Thank you, FlyterTeller. I used the code below in rc.local instead of the one you showed.
if [ -f "/home/ec2-user/anaconda3/etc/profile.d/conda.sh" ]; then
    . "/home/ec2-user/anaconda3/etc/profile.d/conda.sh"
    CONDA_CHANGEPS1=false /home/ec2-user/anaconda3/bin/jupyter notebook --config /home/ec2-user/.jupyter/jupyter_notebook_config.py --allow-root --no-browser
fi
But you were right about conda not being configured when rc.local is executed. Thanks for nudging me in the right direction, so I'm upvoting and accepting your answer!
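For what it's worth, since systemd was also tried in the question: the same idea (make conda reachable before jupyter starts) can be expressed in a unit file. This is only a rough sketch with the paths from this question assumed, not a tested configuration:

# /etc/systemd/system/jupyter.service (illustrative paths)
[Unit]
Description=Jupyter Notebook
After=network.target

[Service]
User=ec2-user
WorkingDirectory=/home/ec2-user
# putting anaconda3/bin on PATH lets nb_conda_kernels find the 'conda' executable at boot
Environment=PATH=/home/ec2-user/anaconda3/bin:/usr/local/bin:/usr/bin:/bin
ExecStart=/home/ec2-user/anaconda3/bin/jupyter notebook --config=/home/ec2-user/.jupyter/jupyter_notebook_config.py --no-browser
Restart=on-failure

[Install]
WantedBy=multi-user.target

It would then be enabled with sudo systemctl enable --now jupyter.service.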

ModuleNotFoundError when accessing python Flask within jupyter notebook in a virtual env

I have created a virtual env gcloudenv on my Nvidia Nano running Ubuntu. I was able to successfully install flask and the required libraries and to deploy my App Engine app to GCP from this virtual env. All my work is in Python and I was using nano as my editor to get my code up and running. No issues so far.
My virtual env gcloudenv already has all the required packages for flask, Jinja etc., and I can see them when I run pip freeze.
Then I tried to work in a Jupyter notebook, as my code was getting a little complicated and I didn't want to write the full code and then run it.
I already had jupyter notebook installed before creating the virtual env. I also installed jupyter within the virtual env as well.
So I followed the instructions to create a new kernel by running the following commands:
(gcloudenv) sunny@my-nano:~gcloudenv/MyApp/mainfolder$ pip install ipykernel
(gcloudenv) sunny@my-nano:~gcloudenv/MyApp/mainfolder$ ipython kernel install --user --name=gcloudenv
Now, I ran the notebook as:
(gcloudenv) sunny@my-nano:~gcloudenv/MyApp/mainfolder$ /home/gcloudenv/bin/jupyter notebook
When trying to import the flask I get the following error:
ModuleNotFoundError: No module named 'flask'
Not sure what is going on; I'm drawing a blank here.
Add
!pip install flask
at the beginning of your Jupyter notebook.
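One caveat worth adding (it is also the point of the blog post linked in the answer below): a bare !pip can resolve to a different Python than the one the kernel is running, so a safer sketch is to pin the install to the kernel's own interpreter:

import sys
!{sys.executable} -m pip install flask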
Finally I managed to solve my problem. Thanks to a wonderful post https://jakevdp.github.io/blog/2017/12/05/installing-python-packages-from-jupyter/.
In essence, I had two problems:
1. I did not have a jupyter-notebook within my virtual env. Originally I thought I had it installed, but that was incorrect, so whenever I tried to launch one it was picking up the first jupyter notebook it could find on the PATH.
A good way to find out which one it is pointing to is to run the which command:
(gcloudenv) sunny#my-nano:~/gcloudenv$ which jupyter-notebook
For me, that was at:
/home/sunny/archiconda3/bin/jupyter-notebook
I had in fact 3 copies of jupyter-notebook on my system. One was probably installed using sudo pip and therefore went into the root folder. Probably not a good thing to do.
So I installed a fresh jupyter-notebook with the following command:
(gcloudenv) $ pip install jupyter notebook
2. Next, check the list of Jupyter kernels available by running the following from the Jupyter notebook (or from the command line):
!jupyter kernelspec list    (or: (gcloudenv) $ jupyter kernelspec list)
My jupyter notebook was not able to import the flask libraries because it was pointing to the wrong kernel config, outside of my virtualenv gcloudenv.
Available kernels:
gcloudenv    /home/sunny/.local/share/jupyter/kernels/gcloudenv    (the correct one)
python3 /home/sunny/gcloudenv/share/jupyter/kernels/python3
You can determine which python version it is picking by running 'more' on the kernel file:
(gcloudenv) $ more /home/sunny/.local/share/jupyter/kernels/gcloudenv/kernel.json
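For illustration only (the paths are hypothetical), a kernel.json registered from inside the virtualenv would normally point at the env's interpreter in its argv entry, which is what tells you which Python the kernel will actually use:

{
  "argv": ["/home/sunny/gcloudenv/bin/python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "gcloudenv",
  "language": "python"
}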
Once I changed my kernel to point to python3 from within the notebook, it picked the correct path and all the relevant libraries I needed.
In summary, when you hit the problem mentioned above, do the following:
Check the path of python (whereis python or which python).
Check whether you are running the 'right' notebook. This is determined by the path and by whether you have activated your virtualenv.
Install jupyter notebook using pip from within your virtualenv (do not use sudo).
Check the Jupyter kernel. This may be particularly relevant if you have a shared jupyter notebook and you want to work with multiple virtualenvs.

PyCharm 2019.1 CE: No option to create/edit Jupyter Notebook (*.ipynb) files?

I updated PyCharm CE from 2018.3 to 2019.1 and lost the ability to create or open an existing Jupyter notebook file. The existing .ipynb file opens as a text file rather than as a Jupyter notebook with cells. I could open and view it fine in 2018.3.
I am new to PyCharm, and indeed to Jupyter notebooks as well, so please let me know if I need to include more information.
This is what I already tried on the 2019.1 version:
Created a new project and installed the jupyter package for the interpreter.
Started jupyter notebook server with the command (jupyter notebook) in the PyCharm terminal, with the working directory set to cwd.
Un-installed and re-installed PyCharm 2019.1
So as per the support staff from JetBrains:
Starting with 2019.1, the PyCharm Community Edition (CE) does not support Jupyter Notebook. This functionality has been moved to the Professional edition only.
You can type in Terminal:
pip install jupyterlab
pip install notebook
jupyter notebook
and the notebook will open in your browser.

Using TensorFlow through Jupyter (Python 3)

Apologies in advance, I think the issue is quite perplexing!
I would like to use TensorFlow through Jupyter, with a Python3 kernel.
However the command import tensorflow as tf returns the error: ImportError: No module named tensorflow when either Python2 or Python3 is specified as the Jupyter kernel.
I have Python 2 and Python 3 installed on my Mac and can access both versions through Terminal.
I installed TensorFlow for Python 3; however, I can only access it via Python 2 in the Terminal.
As such, this question is really two-fold:
I want to get TensorFlow working with Python 3,
...which should lead to TensorFlow working with Jupyter on the Python 3 kernel.
I had the same problem and solved it using the tutorial Using a virtualenv in an IPython notebook. I'll walk you through the steps I took.
I am using Anaconda, and I installed a new environment tensorflow using these instructions at tensorflow.org. After that, here is how I got tensorflow to work in a Jupyter notebook:
Open Terminal
Run source activate tensorflow. You should now see (tensorflow) at the beginning of the prompt.
Now that we are in the tensorflow environment, we want to install ipython and jupyter in this environment: Run
conda install ipython
and
conda install jupyter
Now follow the instructions in the tutorial linked above. I'll repeat them here with a bit more information added. First run
ipython kernelspec install-self --user
The result for me was Installed kernelspec python3 in /Users/charliebrummitt/Library/Jupyter/kernels/python3
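Side note: on newer IPython/Jupyter releases the kernelspec install-self subcommand has been removed; if it is not available on your version, the rough equivalent is:
python -m ipykernel install --user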
Run the following:
mkdir -p ~/.ipython/kernels
Then run the following, with <kernel_name> replaced by a name of your choice (I chose tfkernel), and replace the first path (i.e., ~/.local/share/jupyter/kernels/pythonX) with the path printed by the install-self command above:
mv ~/.local/share/jupyter/kernels/pythonX ~/.ipython/kernels/<kernel_name>
Now you'll see a new kernel if you open a Jupyter notebook and select Kernel -> Change kernel from the menu. But the new kernel will have the same name as your previous kernel (for me it was called Python 3). To give your new kernel a unique name, run in Terminal
cd ~/.ipython/kernels/tfkernel/
and then run vim kernel.json to edit the file kernel.json so that you replace the value of "display_name" from the default (Python 3) to a new name (I chose to call it "tfkernel"). Save and exit vim by typing :wq while in command mode.
Open a new Jupyter notebook and type import tensorflow as tf. If you didn't get an ImportError, then you are ready to go!
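To double-check which interpreter and TensorFlow build the new kernel actually picked up, a quick cell like the following can help (just a sanity check, nothing specific to this setup):

import sys
import tensorflow as tf
print(sys.executable)   # should point into the tensorflow conda environment
print(tf.__version__)   # confirms which TensorFlow install the kernel sees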
