cx_Freeze issue when converting a .py file to .exe - PyQt

I am trying to develop an application using the following flow:
1. GUI made in Qt Designer using Qt 5.6.2
2. Conversion of the GUI file (extension .ui) to .py using pyuic5
3. Transfer of the code into Spyder and addition of a few loops to the freshly converted .py file.
Note: the code works fine in Spyder 3.1.4 and there are no errors. The GUI appears and fulfils the necessary functions.
4. Compilation using cx_Freeze 5.0.2
4.1 Creation of the setup.py file as shown below (I also tried other versions found on the internet):
from cx_Freeze import setup, Executable

setup(
    name = "vsc.py",
    version = "0.1",
    description = "",
    executables = [Executable("vsc.py")],
)
4.2 Run in cmd: python setup.py build (from the right directory, with both vsc.py and setup.py in the same directory)
4.3 A build folder is created with lots of files in it (including vsc.exe), but the executable does not run.
Errors I get for step 4.2:
Missing modules:
? IronPython.Runtime.Exceptions imported from nose.suite
? StringIO imported from numpy.lib.utils, numpy.testing.utils
? __builtin__ imported from numpy, numpy.core.numeric, numpy.core.numerictypes, numpy.distutils.misc_util, numpy.lib._iotools, numpy.lib.function_base, numpy.ma.core
? __main__ imported from bdb, pdb, rlcompleter
? _curses imported from curses, curses.has_key
? _dummy_threading imported from dummy_threading
? _frozen_importlib imported from importlib, importlib.abc
? _frozen_importlib_external imported from importlib, importlib._bootstrap, importlib.abc
? _posixsubprocess imported from multiprocessing.util, subprocess
? _scproxy imported from urllib.request
? _winreg imported from numpy.distutils.cpuinfo, platform
? anaconda_decrypt imported from site
? cPickle imported from numpy.core.numeric, numpy.lib.npyio, numpy.ma.core
? clr imported from nose.suite
? commands imported from numpy.distutils.cpuinfo
? compiler.consts imported from nose.pyversion
? copy_reg imported from numpy.core
? future_builtins imported from numpy.lib.npyio
? grp imported from distutils.archive_util, pathlib, shutil, tarfile
? java.lang imported from platform
? multiprocessing.AuthenticationError imported from multiprocessing.connection
? multiprocessing.BufferTooShort imported from multiprocessing.connection
? multiprocessing.Manager imported from nose.plugins.plugintest
? multiprocessing.SimpleQueue imported from concurrent.futures.process
? multiprocessing.TimeoutError imported from multiprocessing.pool
? multiprocessing.current_process imported from nose.plugins.plugintest
? multiprocessing.get_context imported from multiprocessing.managers, multiprocessing.pool, multiprocessing.sharedctypes
? multiprocessing.get_start_method imported from multiprocessing.spawn
? multiprocessing.set_start_method imported from multiprocessing.spawn
? new imported from nose.ext.dtcompat, nose.pyversion
? numarray imported from numpy.distutils.system_info
? numpy.core.float32 imported from numpy.testing.utils
? numpy.core.geterrobj imported from numpy.linalg.linalg
? numpy.core.integer imported from numpy.fft.helper
? numpy.core.intp imported from numpy.linalg.linalg
? numpy.core.longdouble imported from numpy.linalg.linalg
? numpy.core.object_ imported from numpy.linalg.linalg
? numpy.core.signbit imported from numpy.testing.utils
? numpy_distutils imported from numpy.f2py.diagnose
? numpy_distutils.command.build_flib imported from numpy.f2py.diagnose
? numpy_distutils.command.cpuinfo imported from numpy.f2py.diagnose
? numpy_distutils.cpuinfo imported from numpy.f2py.diagnose
? numpy_distutils.fcompiler imported from numpy.f2py.diagnose
? org.python.core imported from copy, pickle
? os.path imported from distutils.file_util, numpy.core.memmap, os, pkgutil, py_compile, sysconfig, tracemalloc, unittest, unittest.util
? pkg_resources imported from nose.plugins.manager
? posix imported from os
? pwd imported from distutils.archive_util, distutils.util, getpass, http.server, netrc, pathlib, posixpath, shutil, tarfile, webbrowser
? scipy imported from numpy.testing.nosetester
? sets imported from nose.util, numpy.distutils.command.build_ext, numpy.distutils.fcompiler, numpy.distutils.misc_util
? setuptools imported from numpy.distutils.core
? setuptools.command imported from numpy.distutils.core
? setuptools.command.bdist_rpm imported from numpy.distutils.command.bdist_rpm
? setuptools.command.develop imported from numpy.distutils.command.develop
? setuptools.command.egg_info imported from numpy.distutils.command.egg_info
? setuptools.command.install imported from numpy.distutils.command.install
? setuptools.command.sdist imported from numpy.distutils.command.sdist
? termios imported from getpass, tty
? unittest2.case imported from nose.plugins.skip
? urllib2 imported from numpy.lib._datasource
? urlparse imported from numpy.lib._datasource
? vms_lib imported from platform
This is not necessarily a problem - the modules may not be needed on this platform.
Copying data from package PyQt5...
copying C:\Users\aa82758\Anaconda3\lib\site-packages\PyQt5\QtCore.pyd -> build\exe.win-amd64-3.6\PyQt5\QtCore.pyd
copying C:\Users\aa82758\Anaconda3\Library\bin\Qt5Core.dll -> build\exe.win-amd64-3.6\Qt5Core.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\zlib.dll -> build\exe.win-amd64-3.6\zlib.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\icuin57.dll -> build\exe.win-amd64-3.6\icuin57.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\icuuc57.dll -> build\exe.win-amd64-3.6\icuuc57.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\icudt57.dll -> build\exe.win-amd64-3.6\icudt57.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\api-ms-win-crt-utility-l1-1-0.dll -> build\exe.win-amd64-3.6\api-ms-win-crt-utility-l1-1-0.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\MSVCP140.dll -> build\exe.win-amd64-3.6\MSVCP140.dll
copying C:\Users\aa82758\Anaconda3\lib\site-packages\PyQt5\QtGui.pyd -> build\exe.win-amd64-3.6\PyQt5\QtGui.pyd
copying C:\Users\aa82758\Anaconda3\Library\bin\Qt5Gui.dll -> build\exe.win-amd64-3.6\Qt5Gui.dll
copying C:\Users\aa82758\Anaconda3\Library\bin\libpng16.dll -> build\exe.win-amd64-3.6\libpng16.dll
copying C:\Users\aa82758\Anaconda3\lib\site-packages\PyQt5\QtWidgets.pyd -> build\exe.win-amd64-3.6\PyQt5\QtWidgets.pyd
copying C:\Users\aa82758\Anaconda3\Library\bin\Qt5Widgets.dll -> build\exe.win-amd64-3.6\Qt5Widgets.dll
Copying data from package collections...
Copying data from package concurrent...
Copying data from package ctypes...
Copying data from package curses...
Copying data from package distutils...
Copying data from package email...
Copying data from package encodings...
Copying data from package html...
Copying data from package http...
Copying data from package importlib...
Copying data from package json...
Copying data from package lib2to3...
Copying data from package logging...
Copying data from package multiprocessing...
Copying data from package nose...
Copying data from package numpy...
copying C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\core\multiarray.cp36-win_amd64.pyd -> build\exe.win-amd64-3.6\numpy\core\multiarray.cp36-win_amd64.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\core\umath.cp36-win_amd64.pyd -> build\exe.win-amd64-3.6\numpy\core\umath.cp36-win_amd64.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\fft\fftpack_lite.cp36-win_amd64.pyd -> build\exe.win-amd64-3.6\numpy\fft\fftpack_lite.cp36-win_amd64.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\linalg\lapack_lite.cp36-win_amd64.pyd -> build\exe.win-amd64-3.6\numpy\linalg\lapack_lite.cp36-win_amd64.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\random\mtrand.cp36-win_amd64.pyd -> build\exe.win-amd64-3.6\numpy\random\mtrand.cp36-win_amd64.pyd
Copying data from package pydoc_data...
Copying data from package scipy...
Copying data from package unittest...
Copying data from package urllib...
Copying data from package xml...
Copying data from package xmlrpc...
copying C:\Users\aa82758\Anaconda3\DLLs\_bz2.pyd -> build\exe.win-amd64-3.6\_bz2.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_ctypes.pyd -> build\exe.win-amd64-3.6\_ctypes.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_decimal.pyd -> build\exe.win-amd64-3.6\_decimal.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_hashlib.pyd -> build\exe.win-amd64-3.6\_hashlib.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_lzma.pyd -> build\exe.win-amd64-3.6\_lzma.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_multiprocessing.pyd -> build\exe.win-amd64-3.6\_multiprocessing.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_socket.pyd -> build\exe.win-amd64-3.6\_socket.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\_ssl.pyd -> build\exe.win-amd64-3.6\_ssl.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\win32\_win32sysloader.pyd -> build\exe.win-amd64-3.6\_win32sysloader.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\pyexpat.pyd -> build\exe.win-amd64-3.6\pyexpat.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\select.pyd -> build\exe.win-amd64-3.6\select.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\sip.pyd -> build\exe.win-amd64-3.6\sip.pyd
copying C:\Users\aa82758\Anaconda3\DLLs\unicodedata.pyd -> build\exe.win-amd64-3.6\unicodedata.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\win32\win32api.pyd -> build\exe.win-amd64-3.6\win32api.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\win32\pywintypes36.dll -> build\exe.win-amd64-3.6\pywintypes36.dll
copying C:\Users\aa82758\Anaconda3\lib\site-packages\win32\win32evtlog.pyd -> build\exe.win-amd64-3.6\win32evtlog.pyd
copying C:\Users\aa82758\Anaconda3\lib\site-packages\win32\win32pdh.pyd -> build\exe.win-amd64-3.6\win32pdh.pyd
Questions:
Has anybody run into this kind of problem before? Do you know which of the missing modules are necessary for PyQt5 compilation? For instance, IronPython is not available for Python 3.6. Are there other methods for compiling besides py2exe, which is not working either?
Thank you so much!
Edit:
Error message I get when I run the .exe file:
Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.
C:\Users\aa82758>cd C:\Users\aa82758\Desktop\build\exe.win-amd64-3.6
C:\Users\aa82758\Desktop\build\exe.win-amd64-3.6>vsc.exe
Traceback (most recent call last):
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\cx_Freeze\initscripts\__startup__.py", line 14, in run
    module.run()
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\cx_Freeze\initscripts\Console.py", line 26, in run
    exec(code, m.__dict__)
  File "vsc.py", line 2, in <module>
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\__init__.py", line 142, in <module>
    from . import add_newdocs
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\add_newdocs.py", line 13, in <module>
    from numpy.lib import add_newdoc
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\lib\__init__.py", line 8, in <module>
    from .type_check import *
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\lib\type_check.py", line 11, in <module>
    import numpy.core.numeric as _nx
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\core\__init__.py", line 36, in <module>
    from . import numeric
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\core\numeric.py", line 1842, in <module>
    from .arrayprint import array2string, get_printoptions, set_printoptions
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\core\arrayprint.py", line 24, in <module>
    from .fromnumeric import ravel
  File "C:\Users\aa82758\Anaconda3\lib\site-packages\numpy\core\fromnumeric.py", line 15, in <module>
    from . import _methods
ImportError: cannot import name '_methods'
C:\Users\aa82758\Desktop\build\exe.win-amd64-3.6>
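For context, a `cannot import name '_methods'` failure at runtime usually means cx_Freeze traced only part of numpy into the build folder. A common workaround is to force the whole package in through the build_exe options. Below is a minimal sketch of such a setup.py; the "packages" and "excludes" values are illustrative assumptions, not a confirmed fix for this exact build:

```python
# Sketch: force cx_Freeze to bundle numpy wholesale instead of tracing
# individual submodules (which can miss numpy.core._methods).
# The option values here are assumptions, adjust for your project.
from cx_Freeze import setup, Executable

build_exe_options = {
    "packages": ["numpy"],    # copy the numpy package in full
    "excludes": ["tkinter"],  # trim a toolkit the PyQt5 GUI does not use
}

setup(
    name="vsc",
    version="0.1",
    description="PyQt5 GUI application",
    options={"build_exe": build_exe_options},
    executables=[Executable("vsc.py")],
)
```

Rerunning `python setup.py build` with options like these makes cx_Freeze copy the listed packages verbatim rather than relying on import analysis.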

Related

undefined symbol: __atomic_exchange_8

I'm trying to run Google Assistant on my Raspberry Pi, following the steps at: https://developers.google.com/assistant/sdk/guides/service/python/embed/run-sample
all works fine until activating the Google Assistant with the command:
googlesamples-assistant-pushtotalk --project-id my-dev-project --device-model-id my-model
I'm getting the following ImportError:
Traceback (most recent call last):
File "/home/pi/env/bin/googlesamples-assistant-pushtotalk", line 5, in <module>
from googlesamples.assistant.grpc.pushtotalk import main
File "/home/pi/env/lib/python3.9/site-packages/googlesamples/assistant/grpc/pushtotalk.py", line 28, in <module>
import grpc
File "/home/pi/env/lib/python3.9/site-packages/grpc/__init__.py", line 22, in <module>
from grpc import _compression
File "/home/pi/env/lib/python3.9/site-packages/grpc/_compression.py", line 15, in <module>
from grpc._cython import cygrpc
ImportError: /home/pi/env/lib/python3.9/site-packages/grpc/_cython/cygrpc.cpython-39-arm-linux-gnueabihf.so: undefined symbol: __atomic_exchange_8
Any ideas on how to fix this?
I just ended up here since I ran into the same problem (on a different project), also involving Python 3.9 and cygrpc, on an RPi 4 with a recent Raspbian Lite (32-bit).
While I don't have a solution, here are my guesses:
formerly, __atomic_exchange_8 was defined in /lib/arm-linux-gnueabihf/libgcc_s.so.1, but now it seems to be defined in libatomic:
$ grep __atomic_exchange_8 /lib/arm-linux-gnueabihf/libatomic.so.1
grep: /lib/arm-linux-gnueabihf/libatomic.so.1: binary file matches
EDIT:
Solved it. I was looking at the patch that tried to solve this problem two years ago:
https://github.com/grpc/grpc/pull/20514/commits/b912fc7d8d401bb65b3147ee77d03beaa3d46038
I figured their test check_linker_need_libatomic() might be broken; after patching it again to always return True, the problem was fixed.
I had tried earlier to fix it by adding CFLAGS='-latomic' CPPFLAGS='-latomic', but that didn't help.
here's my tiny workaround (not fix!) for today's grpc git HEAD:
root@mypi:/home/pi/CODE/grpc# git diff
diff --git a/setup.py b/setup.py
index 1a72c5c668..60b7705cd2 100644
--- a/setup.py
+++ b/setup.py
@@ -197,6 +197,7 @@ ENABLE_DOCUMENTATION_BUILD = _env_bool_value(
 def check_linker_need_libatomic():
     """Test if linker on system needs libatomic."""
+    return True
     code_test = (b'#include <atomic>\n' +
                  b'int main() { return std::atomic<int64_t>{}; }')
     cxx = os.environ.get('CXX', 'c++')
diff --git a/tools/distrib/python/grpcio_tools/setup.py b/tools/distrib/python/grpcio_tools/setup.py
index 6b842f56b9..8d5f581ac7 100644
--- a/tools/distrib/python/grpcio_tools/setup.py
+++ b/tools/distrib/python/grpcio_tools/setup.py
@@ -85,6 +85,7 @@ BUILD_WITH_STATIC_LIBSTDCXX = _env_bool_value(
 def check_linker_need_libatomic():
     """Test if linker on system needs libatomic."""
+    return True
     code_test = (b'#include <atomic>\n' +
                  b'int main() { return std::atomic<int64_t>{}; }')
     cxx = os.environ.get('CXX', 'c++')
root@mypi:/home/pi/CODE/grpc#
EDIT:
as a quick test, cygrpc.cpython-39-arm-linux-gnueabihf.so needs to depend on libatomic:
pi@mypi:~/CODE/grpc $ ldd /usr/local/lib/python3.9/dist-packages/grpc/_cython/cygrpc.cpython-39-arm-linux-gnueabihf.so
linux-vdso.so.1 (0xbeef7000)
/usr/lib/arm-linux-gnueabihf/libarmmem-${PLATFORM}.so => /usr/lib/arm-linux-gnueabihf/libarmmem-v7l.so (0xb698b000)
libpthread.so.0 => /lib/arm-linux-gnueabihf/libpthread.so.0 (0xb695f000)
libatomic.so.1 => /lib/arm-linux-gnueabihf/libatomic.so.1 (0xb6946000)
libstdc++.so.6 => /lib/arm-linux-gnueabihf/libstdc++.so.6 (0xb67be000)
libm.so.6 => /lib/arm-linux-gnueabihf/libm.so.6 (0xb674f000)
libc.so.6 => /lib/arm-linux-gnueabihf/libc.so.6 (0xb65fb000)
/lib/ld-linux-armhf.so.3 (0xb6fcc000)
libgcc_s.so.1 => /lib/arm-linux-gnueabihf/libgcc_s.so.1 (0xb65ce000)
This works for me on RPI0 + Bullseye + Python3.9:
pip3 uninstall -y grpcio grpcio-tools
sudo apt install -y python3-grpcio python3-grpc-tools
EDIT: Update to gRPC v1.44.0. The issue has been fixed there; see the explanation below in the old answer.
There was a problem with the order of the parameters used by the compiler to compile some test code, whose result is used to determine whether libatomic needs to be linked or not.
The issue will be fixed in the next release of gRPC. If they maintain the same schedule as previous releases, it should be v1.44.0, which should come out some time next month.
In the meantime, you can git cherry-pick the proper fix and build gRPC yourself.

Could not load my own modules in ghci haskell

I am trying to import a module called Point.hs from a file called Circle.hs, like this:
module Figures.Circle
( Circle(..)
, getArea
, getPerimeter
) where
import qualified Figures.Point as P
Here is my Point.hs file:
module Figures.Point
( Point(..)
) where
This is my directory's tree:
Figures/
Point.hs
Circle.hs
And this is what GHCi's error says:
Circle.hs:7:1: error:
Could not find module `Figures.Point'
Locations searched:
Figures\Point.hs
Figures\Point.lhs
Figures\Point.hsig
Figures\Point.lhsig
|
7 | import qualified Figures.Point as P
I followed this guide.
I get the same error if I call GHCi from within the Figures directory; that can be avoided by calling GHCi from the parent directory, i.e. ghci Figures\Circle.hs.

Python error upon exif data extraction via Pillow module: invalid continuation byte

I am writing a piece of code to extract exif data from images using Python. I downloaded the Pillow module using pip3 and am using some code I found online:
from PIL import Image
from PIL.ExifTags import TAGS

imagename = "path to file"
image = Image.open(imagename)
exifdata = image.getexif()

for tagid in exifdata:
    tagname = TAGS.get(tagid, tagid)
    data = exifdata.get(tagid)
    if isinstance(data, bytes):
        data = data.decode()
    print(f"{tagname:25}: {data}")
On some images this code works. However, for images I took on my Olympus camera I get the following error:
GPSInfo : 734
Traceback (most recent call last):
File "_pathname redacted_", line 14, in <module>
data = data.decode()
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf0 in position 30: invalid continuation byte
When I remove the data = data.decode() part, I get the following:
GPSInfo : 734
PrintImageMatching : b"PrintIM\x000300\x00\x00%\x00\x01\x00\x14\x00\x14\x00\x02\x00\x01\x00\x00\x00\x03\x00\xf0\x00\x00\x00\x07\x00\x00\x00\x00\x00\x08\x00\x00\x00\x00\x00\t\x00\x00\x00\x00\x00\n\x00\x00\x00\x00\x00\x0b\x008\x01\x00\x00\x0c\x00\x00\x00\x00\x00\r\x00\x00\x00\x00\x00\x0e\x00P\x01\x00\x00\x10\x00`\x01\x00\x00 \x00\xb4\x01\x00\x00\x00\x01\x03\x00\x00\x00\x01\x01\xff\x00\x00\x00\x02\x01\x83\x00\x00\x00\x03\x01\x83\x00\x00\x00\x04\x01\x83\x00\x00\x00\x05\x01\x83\x00\x00\x00\x06\x01\x83\x00\x00\x00\x07\x01\x80\x80\x80\x00\x10\x01\x83\x00\x00\x00\x00\x02\x00\x00\x00\x00\x07\x02\x00\x00\x00\x00\x08\x02\x00\x00\x00\x00\t\x02\x00\x00\x00\x00\n\x02\x00\x00\x00\x00\x0b\x02\xf8\x01\x00\x00\r\x02\x00\x00\x00\x00 \x02\xd6\x01\x00\x00\x00\x03\x03\x00\x00\x00\x01\x03\xff\x00\x00\x00\x02\x03\x83\x00\x00\x00\x03\x03\x83\x00\x00\x00\x06\x03\x83\x00\x00\x00\x10\x03\x83\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\t\x11\x00\x00\x10'\x00\x00\x0b\x0f\x00\x00\x10'\x00\x00\x97\x05\x00\x00\x10'\x00\x00\xb0\x08\x00\x00\x10'\x00\x00\x01\x1c\x00\x00\x10'\x00\x00^\x02\x00\x00\x10'\x00\x00\x8b\x00\x00\x00\x10'\x00\x00\xcb\x03\x00\x00\x10'\x00\x00\xe5\x1b\x00\x00\x10'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x05\x05\x05\x00\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x05\x05\x05\x00\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
ResolutionUnit : 2
ExifOffset : 230
ImageDescription : OLYMPUS DIGITAL CAMERA
Make : OLYMPUS CORPORATION
Model : E-M10MarkII
Software : Version 1.2
Orientation : 1
DateTime : 2020:02:13 15:02:57
YCbCrPositioning : 2
YResolution : 350.0
Copyright :
XResolution : 350.0
Artist :
How should I fix this problem? Should I use a different Python module?
I did some digging and figured out the answer to the problem I posted about. I originally postulated that the rest of the metadata was in the byte data:
b"PrintIM\x000300\x00\x00%\x00\x01\x00\x14\x00\x14\x00\x02\x00\x01\x00\x00\x00\x03\x00\xf0\x00\x00\x00\x07\x00\x00\x00\x00\x00\x08\x00\x00\x00\x00\x00\t\x00\x00\x00\x00\x00\n\x00\x00\x00\x00\x00\x0b\x008\x01\x00\x00\x0c\x00\x00\x00\x00\x00\r\x00\x00\x00\x00\x00\x0e\x00P\x01\x00\x00\x10\x00`\x01\x00\x00 \x00\xb4\x01\x00\x00\x00\x01\x03\x00\x00\x00\x01\x01\xff\x00\x00\x00\x02\x01\x83\x00\x00\x00\x03\x01\x83\x00\x00\x00\x04\x01\x83\x00\x00\x00\x05\x01\x83\x00\x00\x00\x06\x01\x83\x00\x00\x00\x07\x01\x80\x80\x80\x00\x10\x01\x83\x00\x00\x00\x00\x02\x00\x00\x00\x00\x07\x02\x00\x00\x00\x00\x08\x02\x00\x00\x00\x00\t\x02\x00\x00\x00\x00\n\x02\x00\x00\x00\x00\x0b\x02\xf8\x01\x00\x00\r\x02\x00\x00\x00\x00 \x02\xd6\x01\x00\x00\x00\x03\x03\x00\x00\x00\x01\x03\xff\x00\x00\x00\x02\x03\x83\x00\x00\x00\x03\x03\x83\x00\x00\x00\x06\x03\x83\x00\x00\x00\x10\x03\x83\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\t\x11\x00\x00\x10'\x00\x00\x0b\x0f\x00\x00\x10'\x00\x00\x97\x05\x00\x00\x10'\x00\x00\xb0\x08\x00\x00\x10'\x00\x00\x01\x1c\x00\x00\x10'\x00\x00^\x02\x00\x00\x10'\x00\x00\x8b\x00\x00\x00\x10'\x00\x00\xcb\x03\x00\x00\x10'\x00\x00\xe5\x1b\x00\x00\x10'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x05\x05\x05\x00\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x05\x05\x05\x00\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00##\x80\x80\xc0\xc0\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
That assumption wasn't correct. Although the above is metadata, it simply isn't the metadata I am looking for (in my case, the FocalLength attribute); rather, it appears to be Olympus-specific metadata. The solution was to find all the metadata. I found a piece of code on Stack Overflow that worked very well: In Python, how do I read the exif data for an image?.
I used the following code by Nicolas Gervais:
import os, sys
from PIL import Image
from PIL.ExifTags import TAGS

for (k, v) in Image.open(sys.argv[1])._getexif().items():
    print('%s = %s' % (TAGS.get(k), v))
I replaced sys.argv[1] with the path name to the image file.
Alternate Solution
As MattDMo mentioned, there are also specific libraries for reading EXIF data in Python. One that I found that looks promising is ExifRead, which can be installed by typing the following in the terminal:
pip install ExifRead

"'JavaPackage' object is not callable" error executing explain() in Pyspark 3.0.1 via Zeppelin

I am running Pyspark 3.0.1 for Hadoop 2.7 in a Zeppelin notebook. In general all is well, however when I execute df.explain() on a DataFrame I get this error:
Fail to execute line 3: df.explain()
Traceback (most recent call last):
File "/tmp/1610595392738-0/zeppelin_python.py", line 158, in <module>
exec(code, _zcUserQueryNameSpace)
File "<stdin>", line 3, in <module>
File "/usr/local/spark/python/pyspark/sql/dataframe.py", line 356, in explain
print(self._sc._jvm.PythonSQLUtils.explainString(self._jdf.queryExecution(), explain_mode))
TypeError: 'JavaPackage' object is not callable
Has anyone come across and resolved this error before in the context of explain ?
My spark/jars folder contents:
activation-1.1.1.jar
aircompressor-0.10.jar
algebra_2.12-2.0.0-M2.jar
alluxio-2.4.1-client.jar
antlr4-runtime-4.7.1.jar
antlr-runtime-3.5.2.jar
aopalliance-1.0.jar
aopalliance-repackaged-2.6.1.jar
apacheds-i18n-2.0.0-M15.jar
apacheds-kerberos-codec-2.0.0-M15.jar
api-asn1-api-1.0.0-M20.jar
api-util-1.0.0-M20.jar
arpack_combined_all-0.1.jar
arrow-format-0.15.1.jar
arrow-memory-0.15.1.jar
arrow-vector-0.15.1.jar
audience-annotations-0.5.0.jar
automaton-1.11-8.jar
avro-1.8.2.jar
avro-ipc-1.8.2.jar
avro-mapred-1.8.2-hadoop2.jar
bonecp-0.8.0.RELEASE.jar
breeze_2.12-1.0.jar
breeze-macros_2.12-1.0.jar
cats-kernel_2.12-2.0.0-M4.jar
chill_2.12-0.9.5.jar
chill-java-0.9.5.jar
commons-beanutils-1.9.4.jar
commons-cli-1.2.jar
commons-codec-1.10.jar
commons-collections-3.2.2.jar
commons-compiler-3.0.16.jar
commons-compress-1.8.1.jar
commons-configuration-1.6.jar
commons-crypto-1.0.0.jar
commons-dbcp-1.4.jar
commons-digester-1.8.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-lang-2.6.jar
commons-lang3-3.9.jar
commons-logging-1.1.3.jar
commons-math3-3.4.1.jar
commons-net-3.1.jar
commons-pool-1.5.4.jar
commons-text-1.6.jar
compress-lzf-1.0.3.jar
core-1.1.2.jar
curator-client-2.7.1.jar
curator-framework-2.7.1.jar
curator-recipes-2.7.1.jar
datanucleus-api-jdo-4.2.4.jar
datanucleus-core-4.1.17.jar
datanucleus-rdbms-4.1.19.jar
derby-10.12.1.1.jar
dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar
flatbuffers-java-1.9.0.jar
generex-1.0.2.jar
gson-2.2.4.jar
guava-14.0.1.jar
guice-3.0.jar
guice-servlet-3.0.jar
hadoop-annotations-2.7.4.jar
hadoop-auth-2.7.4.jar
hadoop-client-2.7.4.jar
hadoop-common-2.7.4.jar
hadoop-hdfs-2.7.4.jar
hadoop-mapreduce-client-app-2.7.4.jar
hadoop-mapreduce-client-common-2.7.4.jar
hadoop-mapreduce-client-core-2.7.4.jar
hadoop-mapreduce-client-jobclient-2.7.4.jar
hadoop-mapreduce-client-shuffle-2.7.4.jar
hadoop-yarn-api-2.7.4.jar
hadoop-yarn-client-2.7.4.jar
hadoop-yarn-common-2.7.4.jar
hadoop-yarn-server-common-2.7.4.jar
hadoop-yarn-server-web-proxy-2.7.4.jar
HikariCP-2.5.1.jar
hive-beeline-2.3.7.jar
hive-cli-2.3.7.jar
hive-common-2.3.7.jar
hive-exec-2.3.7-core.jar
hive-jdbc-2.3.7.jar
hive-llap-common-2.3.7.jar
hive-metastore-2.3.7.jar
hive-serde-1.2.1.spark2.jar
hive-serde-2.3.7.jar
hive-shims-0.23-2.3.7.jar
hive-shims-1.2.1.spark2.jar
hive-shims-2.3.7.jar
hive-shims-common-2.3.7.jar
hive-shims-scheduler-2.3.7.jar
hive-storage-api-2.7.1.jar
hive-vector-code-gen-2.3.7.jar
hk2-api-2.6.1.jar
hk2-locator-2.6.1.jar
hk2-utils-2.6.1.jar
htrace-core-3.1.0-incubating.jar
httpclient-4.5.6.jar
httpcore-4.4.12.jar
istack-commons-runtime-3.0.8.jar
ivy-2.4.0.jar
jackson-annotations-2.10.0.jar
jackson-core-2.10.0.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.10.0.jar
jackson-dataformat-yaml-2.10.0.jar
jackson-datatype-jsr310-2.10.3.jar
jackson-jaxrs-1.9.13.jar
jackson-mapper-asl-1.9.13.jar
jackson-module-jaxb-annotations-2.10.0.jar
jackson-module-paranamer-2.10.0.jar
jackson-module-scala_2.12-2.10.0.jar
jackson-xc-1.9.13.jar
jakarta.activation-api-1.2.1.jar
jakarta.annotation-api-1.3.5.jar
jakarta.inject-2.6.1.jar
jakarta.validation-api-2.0.2.jar
jakarta.ws.rs-api-2.1.6.jar
jakarta.xml.bind-api-2.3.2.jar
janino-3.0.16.jar
javassist-3.25.0-GA.jar
javax.inject-1.jar
javax.jdo-3.2.0-m3.jar
javax.servlet-api-3.1.0.jar
javolution-5.5.1.jar
jaxb-api-2.2.2.jar
jaxb-runtime-2.3.2.jar
jcl-over-slf4j-1.7.30.jar
jdo-api-3.0.1.jar
jersey-client-2.30.jar
jersey-common-2.30.jar
jersey-container-servlet-2.30.jar
jersey-container-servlet-core-2.30.jar
jersey-hk2-2.30.jar
jersey-media-jaxb-2.30.jar
jersey-server-2.30.jar
jetty-6.1.26.jar
jetty-sslengine-6.1.26.jar
jetty-util-6.1.26.jar
JLargeArrays-1.5.jar
jline-2.14.6.jar
joda-time-2.10.5.jar
jodd-core-3.5.2.jar
jpam-1.1.jar
json-1.8.jar
json4s-ast_2.12-3.6.6.jar
json4s-core_2.12-3.6.6.jar
json4s-jackson_2.12-3.6.6.jar
json4s-scalap_2.12-3.6.6.jar
jsp-api-2.1.jar
jsr305-3.0.0.jar
jta-1.1.jar
JTransforms-3.1.jar
jul-to-slf4j-1.7.30.jar
kryo-shaded-4.0.2.jar
kubernetes-client-4.9.2.jar
kubernetes-model-4.9.2.jar
kubernetes-model-common-4.9.2.jar
leveldbjni-all-1.8.jar
libfb303-0.9.3.jar
libthrift-0.12.0.jar
log4j-1.2.17.jar
logging-interceptor-3.12.6.jar
lz4-java-1.7.1.jar
machinist_2.12-0.6.8.jar
macro-compat_2.12-1.1.1.jar
mesos-1.4.0-shaded-protobuf.jar
metrics-core-4.1.1.jar
metrics-graphite-4.1.1.jar
metrics-jmx-4.1.1.jar
metrics-json-4.1.1.jar
metrics-jvm-4.1.1.jar
minlog-1.3.0.jar
netty-all-4.1.47.Final.jar
objenesis-2.5.1.jar
okhttp-3.12.6.jar
okio-1.15.0.jar
opencsv-2.3.jar
orc-core-1.5.10.jar
orc-mapreduce-1.5.10.jar
orc-shims-1.5.10.jar
oro-2.0.8.jar
osgi-resource-locator-1.0.3.jar
paranamer-2.8.jar
parquet-column-1.10.1.jar
parquet-common-1.10.1.jar
parquet-encoding-1.10.1.jar
parquet-format-2.4.0.jar
parquet-hadoop-1.10.1.jar
parquet-jackson-1.10.1.jar
postgresql-42.2.14.jar
protobuf-java-2.5.0.jar
py4j-0.10.9.jar
pyrolite-4.30.jar
RoaringBitmap-0.7.45.jar
scala-collection-compat_2.12-2.1.1.jar
scala-compiler-2.12.10.jar
scala-library-2.12.10.jar
scala-parser-combinators_2.12-1.1.2.jar
scala-reflect-2.12.10.jar
scala-xml_2.12-1.2.0.jar
shapeless_2.12-2.3.3.jar
shims-0.7.45.jar
slf4j-api-1.7.30.jar
slf4j-log4j12-1.7.30.jar
snakeyaml-1.24.jar
snappy-java-1.1.7.5.jar
spark-catalyst_2.12-3.0.1.jar
spark-core_2.12-3.0.1.jar
spark-graphx_2.12-3.0.1.jar
spark-hive_2.12-3.0.1.jar
spark-hive-thriftserver_2.12-3.0.1.jar
spark-kubernetes_2.12-3.0.1.jar
spark-kvstore_2.12-3.0.1.jar
spark-launcher_2.12-3.0.1.jar
spark-mesos_2.12-3.0.1.jar
spark-mllib_2.12-3.0.1.jar
spark-mllib-local_2.12-3.0.1.jar
spark-network-common_2.12-3.0.1.jar
spark-network-shuffle_2.12-3.0.1.jar
spark-repl_2.12-3.0.1.jar
spark-sketch_2.12-3.0.1.jar
spark-sql_2.12-3.0.1.jar
spark-streaming_2.12-3.0.1.jar
spark-tags_2.12-3.0.1.jar
spark-tags_2.12-3.0.1-tests.jar
spark-unsafe_2.12-3.0.1.jar
spark-yarn_2.12-3.0.1.jar
spire_2.12-0.17.0-M1.jar
spire-macros_2.12-0.17.0-M1.jar
spire-platform_2.12-0.17.0-M1.jar
spire-util_2.12-0.17.0-M1.jar
ST4-4.0.4.jar
stax-api-1.0.1.jar
stax-api-1.0-2.jar
stream-2.9.6.jar
super-csv-2.2.0.jar
threeten-extra-1.5.0.jar
transaction-api-1.1.jar
univocity-parsers-2.9.0.jar
velocity-1.5.jar
xbean-asm7-shaded-4.15.jar
xercesImpl-2.12.0.jar
xml-apis-1.4.01.jar
xmlenc-0.52.jar
xz-1.5.jar
zjsonpatch-0.3.0.jar
zookeeper-3.4.14.jar
zstd-jni-1.4.4-3.jar
I gather the error is saying something might not be in my classpath, but I can't think what that might be.
I ran into this same issue on AWS with EMR 6.2.0 (also Spark 3.0.1, coincidentally?) and Jupyter notebooks. The issue appears to be related to how pyspark is initialized, specifically the py4j Java imports.
The following import is supposed to be executed while the notebook kernel is being initialized, but it seems to be skipped. You just need to run it once per session.
from py4j.java_gateway import java_import
java_import(spark._sc._jvm, "org.apache.spark.sql.api.python.*")
Now df.explain() works as expected.
For future reference - when you see 'JavaPackage' object is not callable, it often means that the target Java class was not found. Either the class doesn't exist or the expected import hasn't been called.
Mac Users / Linux Users
$ nano ~/.bash_profile
or
$ nano ~/.zshrc
add env
export SPARK_HOME=/usr/local/spark
export PATH=$SPARK_HOME/bin:$PATH
export PYSPARK_SUBMIT_ARGS="--master local[*]"
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/*.zip:$PYTHONPATH
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
Ctrl + O, then Enter  # save
Ctrl + X              # exit
$ source ~/.bash_profile
or
$ source ~/.zshrc
PySpark loads the class using the Spark context JVM:
sc._jvm.com.package.Class
If the jar containing the corresponding class file is not supplied to Zeppelin or Jupyter in its config, this error will be thrown.
Running from spark-submit
If your pyspark code requires additional jars, add them to spark-submit with the --jars option (options must come before the script), e.g.
spark-submit --jars abc.jar,xyz.jar pyspark-job.py
This answer explains the mechanism of java classes being called from pyspark
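When the session is created in code rather than through spark-submit, the same jars can be supplied via the spark.jars property when building the SparkSession. A sketch, with placeholder jar paths:

```python
# Sketch: attaching extra jars at SparkSession construction time so the
# JVM classes they contain are resolvable via spark._sc._jvm.
# The jar paths below are placeholders, not real artifacts.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("example")
    .config("spark.jars", "/path/to/abc.jar,/path/to/xyz.jar")
    .getOrCreate()
)
```

In Zeppelin or Jupyter, the equivalent is usually setting spark.jars in the interpreter or kernel configuration before the session starts.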

Bazel Error After Upgrading Nodejs Rules - ERROR: defs.bzl has been removed from build_bazel_rules_nodejs

After upgrading build_bazel_rules_nodejs from 0.42.2 to 1.0.1 I get this error:
ERROR: /home/flolu/.cache/bazel/_bazel_flolu/698f7adad10ea020bcdb85216703ce08/external/build_bazel_rules_nodejs/defs.bzl:19:5: Traceback (most recent call last):
File "/home/flolu/Desktop/minimal-bazel-monorepo/services/server/src/BUILD", line 76
nodejs_image(name = "server", <2 more arguments>)
File "/home/flolu/.cache/bazel/_bazel_flolu/698f7adad10ea020bcdb85216703ce08/external/io_bazel_rules_docker/nodejs/image.bzl", line 112, in nodejs_image
nodejs_binary(name = binary, <2 more arguments>)
File "/home/flolu/.cache/bazel/_bazel_flolu/698f7adad10ea020bcdb85216703ce08/external/build_bazel_rules_nodejs/defs.bzl", line 19, in nodejs_binary
fail(<1 more arguments>)
ERROR: defs.bzl has been removed from build_bazel_rules_nodejs
Please update your load statements to use index.bzl instead.
See https://github.com/bazelbuild/rules_nodejs/wiki#migrating-off-build_bazel_rules_nodejsdefsbzl for help.
ERROR: error loading package 'services/server/src': Package 'services/server/src' contains errors
INFO: Elapsed time: 0.119s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (1 packages loaded)
FAILED: Build did NOT complete successfully (1 packages loaded)
Line 76 in the error refers to this part of the BUILD file:
load("@io_bazel_rules_docker//nodejs:image.bzl", "nodejs_image")

nodejs_image(
    name = "server",
    data = [":lib"],
    entry_point = ":index.ts",
)
But there is no defs.bzl. So I am confused by the error.
So in detail I have upgraded from
http_archive(
    name = "build_bazel_rules_nodejs",
    sha256 = "16fc00ab0d1e538e88f084272316c0693a2e9007d64f45529b82f6230aedb073",
    urls = ["https://github.com/bazelbuild/rules_nodejs/releases/download/0.42.2/rules_nodejs-0.42.2.tar.gz"],
)
to
http_archive(
    name = "build_bazel_rules_nodejs",
    sha256 = "e1a0d6eb40ec89f61a13a028e7113aa3630247253bcb1406281b627e44395145",
    urls = ["https://github.com/bazelbuild/rules_nodejs/releases/download/1.0.1/rules_nodejs-1.0.1.tar.gz"],
)
You can recreate the error by cloning this repo: https://github.com/flolude/minimal-bazel-monorepo/tree/48add7ddcad4d25e361e1c7f7f257cf916a797b2 and running
bazel test //services/server/src:test
There are some breaking changes between those versions of build_bazel_rules_nodejs. Namely, this import path:
load("@build_bazel_rules_nodejs//:defs.bzl", <whatever>)
needs to become this:
load("@build_bazel_rules_nodejs//:index.bzl", <whatever>)
You also need to update io_bazel_rules_docker to at least v0.13.0; from looking at the release notes, it's the version compatible with 1.0.1 of rules_nodejs: https://github.com/bazelbuild/rules_docker/releases/
