Turn off logging from PhantomJS - python-3.x

I am trying to use PhantomJS on an Openshift Server (Linux) using Python and Selenium.
The error that I get, with no exception handling, is:
PermissionError: [Errno 13] Permission denied: 'ghostdriver.log'
This is the code:
try:
    browser = webdriver.PhantomJS()
except AttributeError:
    error = "Attribute Error"
except PermissionError:
    error = "Permission Error"
The solution to this is probably to turn off logging.
I tried this by changing this line:
browser = webdriver.PhantomJS()
To:
browser = webdriver.PhantomJS(service_log_path=os.path.devnull)
But now I get the AttributeError
What is a possible solution for this problem?
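A minimal sketch of the usual fix, assuming the goal is simply to discard ghostdriver.log: the documented cross-platform null-device constant is os.devnull, which can be passed as service_log_path (this assumes Selenium 3, since PhantomJS support was removed in Selenium 4):

```python
import os

# os.devnull is the documented cross-platform null device
# ('/dev/null' on Linux, 'nul' on Windows).
print(os.devnull)

# With Selenium 3 and a PhantomJS binary on PATH (not runnable here):
# from selenium import webdriver
# browser = webdriver.PhantomJS(service_log_path=os.devnull)
```

If os.path.devnull is unavailable in a given environment, os.devnull is the spelling documented in the standard library, which is one plausible source of the AttributeError above.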

Related

urllib [Errno 11001] getaddrinfo failed with windows proxy

I'm running Django 3.2 with django-tenants on a Windows local dev environment.
In my Windows hosts file I have:
127.0.0.1 *.localhost
...so that I am able to use subdomains with django-tenants. E.g. http://mysub.localhost:8000.
When running ./manage.py runserver the dev server runs perfectly. However, when trying to execute urlopen in my code I get an error:
>>> html = urlopen('http://mysub.localhost:8000')
Traceback (most recent call last):
[...]
urllib.error.URLError: <urlopen error [Errno 11001] getaddrinfo failed>
As far as I can tell the error is due to the proxy settings on my Windows machine (this does not fail in production), but I am unsure how to resolve it.
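A minimal sketch of one common workaround, assuming the Windows system proxy auto-detection is what breaks name resolution for the wildcard subdomain: build an opener with an empty ProxyHandler so urllib bypasses the system proxy entirely:

```python
import urllib.request

# An empty ProxyHandler disables the system proxy lookup that
# getaddrinfo may be tripping over on Windows.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

# Requires the dev server to be running, so left commented here:
# html = opener.open('http://mysub.localhost:8000').read()
```

This only helps if the failure really is proxy-related; if the hosts-file wildcard itself is not being honoured, the subdomain will still fail to resolve.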

Fetch and Read secrets to vault using python hvac

Hello, I have just tried the code below but am still getting an error. I have also referred to similar posts, but they did not help. Any idea how to solve this?
import hvac
client = hvac.Client(url='https://vault.xyz.com:1100/ui')
response = client.secrets.kv.v2.read_secret_version(path='/secrets/secret/show/ab/ss/dd')
Error:
raise_for_error
    raise exceptions.Forbidden(message, errors=errors, method=method, url=url)
hvac.exceptions.Forbidden: 1 error occurred: * permission denied
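One likely mismatch in the snippet above is that the client is pointed at the UI (the `/ui` suffix) and given the path copied from the UI URL, while hvac's KV v2 read expects a mount point plus a relative secret path. A hypothetical helper sketching that translation (the `secrets/<mount>/show/<path>` layout is an assumption about the Vault UI URL format):

```python
# Hypothetical helper: split a path copied from the Vault UI URL
# into the (mount_point, path) pair that hvac's KV v2 calls expect.
def ui_path_to_kv2(ui_path):
    parts = [p for p in ui_path.split('/') if p]
    # Assumed UI layout: secrets/<mount>/show/<secret path...>
    if len(parts) >= 3 and parts[0] == 'secrets' and parts[2] == 'show':
        return parts[1], '/'.join(parts[3:])
    raise ValueError('unrecognised UI path: %r' % ui_path)

mount, path = ui_path_to_kv2('/secrets/secret/show/ab/ss/dd')
# client.secrets.kv.v2.read_secret_version(mount_point=mount, path=path)
```

Separately, a `Forbidden: permission denied` can also simply mean the client has no token set, or the token's policy does not grant read on that path.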

Error while accessing .hdf5 file, shows error "OSError: Unable to open file"

When I tried to open an .hdf5 file with h5py:
import h5py
file = h5py.open(".../f.hdf5", 'r')
The following error was raised:
h5py/h5f.pyx in h5py.h5f.open()
OSError: Unable to open file (unable to lock file, errno = 11, error message = 'Resource temporarily unavailable')
Solutions: the error when opening .hdf5 files can be resolved as follows:
file = h5py.File(file_path, 'r')
Close the file afterwards with file.close(), or disable HDF5 file locking:
import os
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"
The error while opening the file will then be gone and you can keep working on it.
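A sketch of the environment-variable route, with the caveat that HDF5_USE_FILE_LOCKING is only read when the HDF5 library initialises, so it must be set before h5py is first imported:

```python
import os

# Must happen before the first `import h5py`, or it has no effect.
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

# Then (not run here, since it needs h5py and a real file):
# import h5py
# with h5py.File("f.hdf5", "r") as f:   # context manager closes the file
#     data = f["some_dataset"][()]      # hypothetical dataset name
```

The context-manager form is a safer alternative to calling file.close() by hand, since the file is released even if an exception is raised mid-read.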

ValueError: root_directory must be an absolute path: Error when accessing a directory in ADLS from a Synapse Workspace

When trying to access an ADLS directory with the following PySpark code in Apache Spark, I get the error:
ValueError: root_directory must be an absolute path. Got abfss://root#adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/ instead.
Traceback (most recent call last):
File "/home/trusted-service-user/cluster-env/env/lib/python3.6/site-packages/great_expectations/core/usage_statistics/usage_statistics.py", line 262, in usage_statistics_wrapped_method
result = func(*args, **kwargs)
The code that gives the above error when I'm trying to access the directory is as follows:
data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='abfss://root#adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/'),
)
context = BaseDataContext(project_config=data_context_config)
When I change the code to
data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='/abfss://root#adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/'),
)
I get the following error message:
PermissionError: [Errno 13] Permission denied: '/abfss:'
Traceback (most recent call last):
When I enter the following code
data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='/'),
)
context = BaseDataContext(project_config=data_context_config)
I get the error message:
PermissionError: [Errno 13] Permission denied: '/expectations'
Traceback (most recent call last):
However, I don't have a directory called '/expectations'.
As a side note, I'm trying to execute Great Expectations.
The developers of Great Expectations have informed me that this error will be fixed in a new release of Great Expectations.
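The pair of errors can be illustrated without Great Expectations at all, on the assumption that FilesystemStoreBackendDefaults validates root_directory as a local absolute path: an abfss:// URI is not an absolute local path, and prefixing it with '/' makes it "absolute" but points at a local directory literally named '/abfss:', which the process has no permission to create:

```python
import os.path

# An abfss:// URI does not start with '/', so it is rejected as
# "not an absolute path":
assert not os.path.isabs('abfss://root#adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/')

# Prefixing '/' passes the absolute-path check, but now names a local
# directory under '/abfss:'; creating it fails with PermissionError.
assert os.path.isabs('/abfss://root#adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/')
```

This suggests a remote ADLS location needs a non-filesystem store backend rather than the filesystem defaults, per the Great Expectations documentation.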

PermissionError: [Errno 13] Permission denied on windows (pyinstaller)

I am creating an application in Python (3.4) with tkinter and compiling it with PyInstaller. The code fragment that produces the error is this:
client = paramiko.SSHClient()
known_hosts = open(self.resource_path("known_hosts"))  # line 73
client.load_host_keys(known_hosts)
The error is thrown when I click a button that executes that part of the code; that is, the application otherwise runs fine. The error is this:
Exception in Tkinter callback
Traceback (most recent call last):
File "tkinter\__init__.py", line 1538, in __call__
File "prueba.py", line 73, in aceptar
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Hernan\\AppData\\Local\\Temp\\_MEI124282\\known_hosts'
I clarify that I am compiling and running it on Windows 10.
I tried executing the exe as administrator, but it still gives the same error. I verified the path of the file and it exists, so I rule out that the file is missing. I also tried compiling the exe in a cmd with administrator permissions, but that did not solve it either.
Any ideas?
PS: adding the code...
def resource_path(self, relative_path):
    """ Get absolute path to resource, works for dev and for PyInstaller """
    base_path = getattr(sys, '_MEIPASS', os.path.dirname(os.path.abspath(__file__)))
    return os.path.join(base_path, relative_path)
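One hedged workaround, assuming the PermissionError comes from the file sitting read-only inside the one-file bundle's _MEI… extraction directory: copy the bundled known_hosts to a writable location first and pass that path. Note also that paramiko's SSHClient.load_host_keys takes a filename, not an open file object, so the workaround passes the path directly:

```python
import os
import shutil
import tempfile

def writable_copy(src_path):
    """Copy a read-only bundled resource (e.g. extracted under
    sys._MEIPASS) to a writable temp location and return the new path."""
    dest = os.path.join(tempfile.gettempdir(), os.path.basename(src_path))
    shutil.copyfile(src_path, dest)
    return dest

# Hypothetical usage inside the app:
# client.load_host_keys(writable_copy(self.resource_path("known_hosts")))
```

This is a sketch, not a confirmed diagnosis; if the extracted file is in fact readable, the load_host_keys filename-vs-file-object mismatch above is the more likely culprit.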
