How to use Celery on Windows with PyCharm and Docker (Linux)

I just found out that Celery does not support Windows anymore. Unfortunately I need to work with it.
I tried creating a Linux Docker image and installed Python 3.8 on it (/usr/bin/python3).
I also configured the Python interpreter as follows:
Server: Docker
Image name: ubuntu:latest
Python interpreter path: /usr/bin/
Somehow I keep getting the same error message:
Can't retrieve image ID from build stream
Can someone instruct me how to develop with Celery on my Windows platform?
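For reference, here is a minimal Dockerfile sketch (my own illustration, not from the question) of an Ubuntu-based image that ends up with Python 3.8 at /usr/bin/python3, which is typically what PyCharm's "Python interpreter path" field expects (the executable, not the bare /usr/bin/ directory):
# Illustrative Dockerfile only: Ubuntu base with Python 3.8 and Celery.
FROM ubuntu:20.04
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3.8 python3-pip && \
    rm -rf /var/lib/apt/lists/*
# The Celery broker (e.g. Redis or RabbitMQ) is assumed to run in a separate container.
RUN pip3 install celery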

Related

Why does the docker container exit immediately? (on a PLC)

I am running Docker on a WAGO-PFC 8204 device and was able to install an image for the Node-RED flow-based editor. However, when I try to run the image, it creates a container but the container exits immediately. I am not running this image on a regular Linux machine but on this device instead. I ran docker logs [container name] but the output was Error: fatal error, line 0. Please see the attached image.
Please help if you can. Thanks
This is because version 3.0.0 of the Node-RED Docker container will not run on old versions of Docker that do not support 64-bit time on a 32-bit OS.
You can try adding --security-opt=seccomp=unconfined to the docker run command, or upgrade docker/libseccomp to a version supported by the latest Alpine base container.
Details of the minimum versions are in the release notes on GitHub:
https://wiki.alpinelinux.org/wiki/Release_Notes_for_Alpine_3.13.0#time64_requirements
https://github.com/node-red/node-red-docker/issues/319
Workaround:
Use the nodered/node-red:2.2.2 container rather than the new 3.0.0 (latest) tag
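Concretely, the two options above look roughly like this (the 1880 port mapping is just the standard Node-RED port, not something taken from the question):
# Option 1: run the 3.0.0 image with the seccomp restriction relaxed
docker run -it -p 1880:1880 --security-opt=seccomp=unconfined nodered/node-red:3.0.0
# Option 2: pin to the last 2.x tag, which does not need 64-bit time support
docker run -it -p 1880:1880 nodered/node-red:2.2.2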

How do I set up PhpStorm/WebStorm to work with Node.js in a docker container?

I am trying to use PhpStorm to debug a Node application that is running in a docker container. The image is called parsoid-dev; it only exists locally on my machine. I can run it using docker run parsoid-dev.
In the PhpStorm settings, I created a remote Node.js interpreter configuration for this image using the server type "Docker", image "parsoid-dev", and path "node".
But when I try to run the application, I get this error:
Failed to prepare environment: Cannot find image: parsoid-dev
But the image is clearly there, I can use it from the command line... What am I missing?
EDIT: I'm using PhpStorm 2021.3.3 on Windows 11 and Docker for Windows 4.6.0 (Docker Desktop engine 20.10.13). I'm running Ubuntu 20.04.4 in WSL2.
Please try specifying the image name with a tag, e.g. parsoid-dev:latest - does it help?
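If it is unclear which tag the local image carries, a generic check like the following (not part of the answer above) lists the exact repository:tag string to paste into the interpreter settings:
docker image ls parsoid-dev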

OpenCV 4.4.0: qt.qpa.xcb: could not connect to display on a remote EC2 instance. How to solve this issue?

I am running OpenCV 4.4.0 on an Ubuntu 20.04 AWS EC2 instance connected with VS Code through the Remote Explorer module.
I am trying to open an image that I have uploaded to the project.
import cv2
img = cv2.imread("imgs/cat1.jpg")
cv2.imshow("Output", img)
But when I run the file (pressing the green arrow) I get the following error:
(env) ubuntu#ip-xxx-xx-xx-xxx:~/vhosts/opencv-ml-images$ /bin/python3 /home/ubuntu/vhosts/opencv-ml-images/chapter1.py
qt.qpa.xcb: could not connect to display
qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "/home/ubuntu/.local/lib/python3.8/site-packages/cv2/qt/plugins" even though it was found.
This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
Available platform plugins are: xcb.
Aborted (core dumped)
Does someone know what is happening? Maybe it is related to the fact that I am running OpenCV on a remote computer? How do I solve it?
I think what you are trying to do is run it on Remote Desktop Connection. I was having the same problem but I am running WSL 2 (Both are quite similar). And I solved it by setting up environment for GUI applications on WSL 2 from https://wiki.ubuntu.com/WSL . Just start your X server and run the code again. It should work. If not, do reply so that we can sort it out together.
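If forwarding a display to the EC2 instance is not an option, a common headless alternative (my suggestion, not part of the answer above) is to skip cv2.imshow entirely and write the result to disk:
import cv2

img = cv2.imread("imgs/cat1.jpg")
if img is None:
    raise SystemExit("Could not read imgs/cat1.jpg")
# Save to a file instead of opening a Qt window; no X server is needed.
cv2.imwrite("output.jpg", img)
print("Saved output.jpg, shape:", img.shape)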

How to distribute python3 code which contains external libraries

I wrote a small script in Python 3 that uses numpy, matplotlib, and other libraries, developed with PyCharm CE on my Linux machine.
I used PyCharm to write the code and to create the virtual env.
The script only works inside PyCharm because of the dependencies.
A friend of mine wants to use my script on a Windows machine, and I'm not sure he even has Python installed.
How can I run my script outside PyCharm, or how can I activate the virtual env created by PyCharm to run the script?
And
How can I create a package or something so I can give the script to my friend or anyone else to freely use it?
Thanks
One way of going about it is to ask your friend to install Python 3.x and pip on his system. Meanwhile you create a requirements.txt which lists the libraries that need to be installed and their versions, in this format:
dj-database-url==0.5.0
Django==2.2.5
pytz==2019.2
sqlparse==0.3.0
psycopg2>=2.7,<3.0
Then ask your friend to run pip install -r <path to requirements.txt>. This will install all the required libraries, and if there are no OS-specific dependencies the project should run fine.
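As a sketch of that workflow (the first command is run inside your PyCharm virtual env so the exact versions are captured):
# On your Linux machine, with the project's virtual env activated
pip freeze > requirements.txt
# On your friend's Windows machine, after installing Python 3.x and pip
pip install -r requirements.txt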
Another way of doing it, in the case of a bigger project with OS-specific dependencies, is to use a containerization tool such as Docker. Containerization lets you run projects on other machines even when they depend on packages or environments that are only available/installed on your machine.
For example: imagine I created a Python-based application on my Debian machine which depends on multiple packages. I can build a Docker image using python3.x as the base and install the required packages inside the image at build time. It is fairly simple to do so. After doing so I can push the image to Docker Hub, which is a registry for storing Docker images. Do mind that the images stored there are publicly available. If you are worried about that, you can use a private AWS ECR registry to store your images. Once I have pushed the image, anyone with access to it can pull it and spin up a container. A container is an instance of an image which can run the applications/scripts/anything that the image was built to do. In order to spin up containers they will need Docker installed on their machine.
This way you can share your project and make it run on anyone's machine with as little hassle as possible. They will not need anything other than Docker installed on their machine. Unlike a virtual machine, Docker containers are not heavy on your machine.
In your case, using Docker you can build an image (much like an ISO image) with python3.x as the base, install all the required packages such as numpy, matplotlib and the other libraries, then copy the scripts the project needs into the image and push it to Docker Hub or a private registry of your choice. Then you can give your friend access to the image. Your friend will need Docker for Windows installed on his machine in order to spin up a container from the image you provide. This container will run your script, since you installed all the required dependencies in it while building the image; a minimal Dockerfile along these lines is sketched below.
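A minimal Dockerfile sketch (the file names and the Python version here are illustrative, not taken from the question):
# Base image with Python preinstalled
FROM python:3.8-slim
WORKDIR /app
# Install the dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the script and run it when the container starts.
# Note: inside a container matplotlib has no display, so the script should
# use a non-GUI backend (e.g. Agg) and plt.savefig instead of plt.show.
COPY my_script.py .
CMD ["python", "my_script.py"]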
For more info on Docker: https://www.docker.com/

Running Linux images on Docker for Windows

I have Windows Server 2016, which has Docker built into it, so I am able to create Windows-image-based containers and play around with them.
But now I want to create and run Linux-based images, and I am not able to do that; I get the error below:
PS C:\Users\harishr> docker pull hello-world
Using default tag: latest
latest: Pulling from library/hello-world
image operating system "linux" cannot be used on this platform
I did install docker-machine to create a Linux machine, but I am not finding the command-line options to do that.
