Deploy python REST service on Ubuntu - python-3.x

I have started learning Django and Python.
I have developed one service using Visual Studio Code on my Windows machine.
It is working as expected on Windows machine.
Now I want to deploy the same service on an Ubuntu server, but I am failing because there is no 'bin' folder inside the virtual environment.
How can I do it? I know I am missing something basic here.
Could you please help/ point me where I can read about it?

Let's suppose you are deploying your application at path /opt/.
Follow the below steps to deploy:
To get started, we need to install some packages; run the following commands:
sudo apt-get update
sudo apt-get install python3-pip python3-dev libmysqlclient-dev ufw virtualenv
The above command installs the basic Python development packages on the Ubuntu server.
Create a virtualenv at a path of your choice (for example /opt/.env):
virtualenv -p python3 .env
Activate the environment: source .env/bin/activate
Install all the packages your Django application requires into the virtualenv (see the example below).
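If your project ships a requirements.txt file (the filename is an assumption here), this can be done with the virtualenv still active:
pip install -r requirements.txt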
Test your service manually by running: python manage.py runserver (this confirms that all dependencies are installed).
After that, you can install Gunicorn as the WSGI server and Supervisor as a process monitoring tool for your service (a rough sketch is shown below). Please refer to: http://rahmonov.me/posts/run-a-django-app-with-nginx-gunicorn-and-supervisor/
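As a rough sketch (the project name myproject, the paths, and the socket location are assumptions, not taken from the question), Gunicorn can be started from the virtualenv with:
/opt/.env/bin/gunicorn --workers 3 --bind unix:/opt/app.sock myproject.wsgi:application
and a minimal Supervisor program entry, e.g. /etc/supervisor/conf.d/myproject.conf, might look like:
[program:myproject]
command=/opt/.env/bin/gunicorn --workers 3 --bind unix:/opt/app.sock myproject.wsgi:application
directory=/opt/myproject
autostart=true
autorestart=true
After adding it, reload Supervisor with sudo supervisorctl reread && sudo supervisorctl update.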
Nginx can then be run on the server as the web server, routing requests to your application's port or socket file.
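A minimal Nginx server block matching the sketch above could look like this (the server_name and socket path are placeholders):
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://unix:/opt/app.sock;
    }
}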
Those are the high-level steps to deploy the Django application.

Related

How to publish azure function in python using docker from my machine to azure?

When I try to publish my Python Azure Function to Azure, I get the following error:
pip version 10.0.1, however version 19.2.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.
ERROR: cannot install cryptography-2.7 dependency: binary dependencies without wheels are not supported. Use the --build-native-deps option to automatically build and configure the dependencies using a Docker container. More information at https://aka.ms/func-python-publish
So I installed Docker and tried to publish my function using the following command:
func azure functionapp publish timertriggerforstreaming --build-native-deps
I didn't do anything else; I just installed Docker. When I tried to publish while Docker was using Windows containers (my machine is Windows 10), I got the following error:
Error running docker pull mcr.microsoft.com/azure-functions/python:2.0.12493-python3.6-buildenv.
output: 2.0.12493-python3.6-buildenv: Pulling from azure-functions/python
image operating system "Linux" cannot be used on this platform
Then I switched Docker from Windows containers to Linux containers and tried the same thing again; now it takes a long time and I keep seeing the same build output (screenshot not included).
Is what I'm doing right or wrong, or do I need to Dockerize the Azure Function myself and publish it that way?
Follow this doc for the deployment: https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-function-linux-custom-image
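Very roughly, the custom-image flow in that doc comes down to something like the following; the project, image, resource group, plan, and storage account names below are placeholders, and the exact flags may have changed, so check the doc for the current syntax:
func init MyFunctionProj --worker-runtime python --docker
cd MyFunctionProj
docker build -t <dockerhub-id>/myfunctionimage:v1.0.0 .
docker push <dockerhub-id>/myfunctionimage:v1.0.0
az functionapp create --resource-group myResourceGroup --plan myLinuxPlan --name timertriggerforstreaming --storage-account mystorageaccount --deployment-container-image-name <dockerhub-id>/myfunctionimage:v1.0.0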

Running 'install' command on dockerised App Service in Azure

I have a web API that is dockerized and published from Visual Studio. All I had to do was choose the system (Linux, in this case) and hit publish.
Now the problem is that I have to run the command
RUN apt-get install -y libc6-dev
on my Docker container, but I can't find any way to get access to it. Does any of you know how to install 'libc6-dev' on that kind of instance?
You can either include the installation command in your Dockerfile so that libc6-dev gets installed, e.g.:
FROM YOURIMAGE
RUN apt-get update && apt-get install -y libc6-dev
Or you can choose a different base image that already has libc6-dev installed.
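Either way, the image has to be rebuilt and pushed again so that App Service can pull the updated container; roughly (the registry and image names are placeholders):
docker build -t myregistry.azurecr.io/mywebapi:latest .
docker push myregistry.azurecr.io/mywebapi:latest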

pip install from locally copied repo

I wish to install connexion for Swagger on an Ubuntu server that DOES NOT have internet access.
I wish to run
pip install connexion[swagger-ui]
Can I somehow download this repo (via another machine that does have internet access), copy it over locally, and then install it that way?
If so, how?
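One common approach (a sketch; it assumes the online machine has a similar enough Python version and OS that the downloaded wheels are compatible with the server) is to download the package and all of its dependencies into a directory on the machine with internet access, copy that directory to the Ubuntu server, and install from it with pip's offline options:
pip download "connexion[swagger-ui]" -d ./packages
# copy the ./packages directory to the offline server, then there:
pip install --no-index --find-links ./packages "connexion[swagger-ui]"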

Install and work with Cloud Custodian in EC2 instance (Linux)?

I have executed the following command on my Linux instance:
yum install custodian
It is installed, but I don't know how to start it and use it. Can anybody help me with how to execute a YAML policy file?
On Linux, you can install Cloud Custodian with commands like these:
1) Create a virtualenv:
virtualenv custodian
2) Activate the virtualenv:
source custodian/bin/activate
3) Install the AWS CLI into it:
pip install awscli
4) Install c7n (the Cloud Custodian package):
pip install c7n
5) Configure your AWS credentials:
aws configure
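Once that is set up, Custodian is driven by a YAML policy file passed to custodian run. As a minimal sketch (the policy name, filename, and filter below are illustrative, not taken from the question), a file policy.yml could contain:
policies:
  - name: stop-tagged-ec2
    resource: ec2
    filters:
      - "tag:Action": stop
    actions:
      - stop
and you execute it with:
custodian run --output-dir=output policy.yml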

Step by step: Installing Python 3.3, Lighttpd & Pymongo on Ubuntu 12.04

I'm currently migrating to new computer and I need to reinstall the software I am using which are:
Python 3.3,
Lighttpd (newest version),
Pymongo (newest version),
Ubuntu 12.04 Desktop (The System I'm using)
I started by installing Python 3.3, downloading it from its official website (as a tar.bz2 file) and following this tutorial. Afterwards I installed Lighttpd and changed lighttpd.conf for Python, again following this tutorial.
I tried several paths for my cgi.assign; none of them worked. In particular /opt/python3.3/bin/python3.3 should be working, but it shows a 500 - Internal Server Error every time with a "hello world" test script.
Now, regardless of this problem, I have no clue how to install Pymongo. If I try to install pip OR easy_install for Python 3.3, I have to manually download it and execute setup.py with my python3.3 executable, right? Because this always fails with an error:
`Error missing zlib on a bundle called distribute-0.7.3 (is this even the right tool I need, because it seems to be a legacy wrapper !?) or unknown url type: https for pymongo2.6.2 itself.`
I'm going crazy with this setup. Why is this so difficult to handle? Other programs take just a few clicks to install, even on a system like Ubuntu, but these particular development tools seem to be really difficult to install. If anybody has an idea of how to install all three together, or information on a better solution, please help me out.
The system is used to write Python scripts in Eclipse and try them out directly on the system (Lighttpd). The database used is MongoDB. Python and MongoDB communicate over the Pymongo driver. I am planning to run the system on a server distribution at release, and it has to scale nicely to a high number of executions.
Thanks for your time,
It's easiest to use the Ubuntu repositories:
sudo apt-get update
sudo apt-get install python3 python3-pip lighttpd python-pymongo
Or if that only installs the python2.x pymongo, use pip, which you've just installed:
sudo pip-3.3 install pymongo
Or better yet, use a virtualenv with the help of virtualenvwrapper (docs)
sudo pip install virtualenvwrapper
... # follow instructions for installing virtualenvwrapper
mkvirtualenv --python=/usr/bin/python3 -i pymongo mongoppd
workon mongoppd
... which will segregate the environment I've called 'mongoppd' from the rest of your system, so you can't cause any trouble. Then you don't need sudo to pip-3.3 install things; just workon mongoppd and then pip-3.3 install [...]. Or put the package after the -i flag when you create the virtualenv to have it installed straight away.
In general, on Ubuntu, you should hardly ever have to install something manually. Your first attempt should be sudo apt-get install (use tab completion to see what's available, or just google "ubuntu 12.04 packages [...]" to find the list of packages). Then for Python, use pip install or pip-3.3 install as appropriate. You'll only need to run python setup.py install if you need a development version of a package or something obscure that's not on pip. I don't think there's a good reason to ever use easy_install these days.
