pytest - creating a plugin - python-3.x

I would like to create a pytest plugin that includes a hook function and fixtures.
I have created it locally in a conftest.py file located at my root folder, and everything works fine. Now I would like to use this code in other projects by converting the conftest.py code into a pytest plugin.
In theory I know I simply need to create a package; however, I am having a really hard time understanding the entry points I should use for this plugin.
I have an approximation of what I need, where my actual code works with the metadata of the test and reports it to a file.
The example code:
conftest.py
import pytest

@pytest.fixture(scope='function')
def upload_manager():
    def wrapper(val):
        # Do something
        return val
    return wrapper

def pytest_report_teststatus(report):
    if report.when == 'call':
        pass  # Do something
    if report.when == 'setup':
        pass  # Do something
test_foo.py
def test_bar(upload_manager):
    my_val = upload_manager(1000)
    assert my_val == 1000
How would I go about using this code (fixture and hooks) as a package?

Okay, found an answer: this requires adding the following key to the setup.py file:
entry_points={
    "pytest11": [
        "name_of_plugin = package.file_name",
    ],
},
I've used it for a Python plugin I wrote to extract data and metadata out of the tests themselves; it can be found here for future reference: pytest-data-extractor.
If for some reason someone needs it as a package, it can be installed directly from PyPI:
pip install pytest-data-extractor
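For context, a minimal setup.py using that entry point might look like the sketch below. The package and module names here are illustrative, not the actual layout of pytest-data-extractor:

```python
from setuptools import setup, find_packages

setup(
    name="pytest-data-extractor",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["pytest"],
    # The "pytest11" entry point group is how pytest discovers plugins:
    # on installation, pytest loads the referenced module automatically,
    # making its fixtures and hooks available without a conftest.py.
    entry_points={
        "pytest11": [
            "data_extractor = pytest_data_extractor.plugin",
        ],
    },
)
```

With this installed, the fixtures and hook implementations defined in `pytest_data_extractor/plugin.py` behave exactly as they did in the local conftest.py.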

Referencing top level modules python

After reading some existing answers about this topic, I still can't get my head around how this works.
I have a program that manages its imports the following way:
from ...src.folder.sub_folder import file_name
and then executes methods with
file_name.method_name()
This works, but the interpreter I'm using flags Attempted relative import beyond top-level package, so I wanted to import these files the right way (for learning, and also to remove the annoying red indicator).
Let's say I have the following tree (this is a snippet of the actual, bigger tree, but the idea stays the same):
PackageTest/
    main.py
    src/
        tasks/
            __init__.py
            tasks.py
        security/
            recon/
                __init__.py
                some_scan.py
With the following contents
tasks.py
def function_1():
    print("Function1")
    return

def function_2():
    print("Function2")
    return

def function_3():
    print("Function3")
    return
How can I reference the methods present in tasks.py from some_scan.py?
I'm doing the following, with no success:
some_scan.py
import src.tasks as tasks

def call_task():
    tasks.function1()

call_task()
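The question above is left as asked; for reference, here is a self-contained sketch of one layout that works: treat PackageTest as the project root, keep it on sys.path (which running main.py from that folder gives you), and use an absolute import. Note the question's code also calls tasks.function1() while the module defines function_1(). The tree is rebuilt in a temp directory here only so the sketch runs on its own:

```python
import os
import sys
import tempfile

# Recreate a slice of the tree from the question in a temp directory
# so this sketch is self-contained.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "src", "tasks"))
open(os.path.join(root, "src", "__init__.py"), "w").close()
open(os.path.join(root, "src", "tasks", "__init__.py"), "w").close()
with open(os.path.join(root, "src", "tasks", "tasks.py"), "w") as f:
    f.write("def function_1():\n    return 'Function1'\n")

# With the project root on sys.path (what you get by running
# "python main.py" from PackageTest/), an absolute import works
# from anywhere in the package -- no relative "..." dots needed.
sys.path.insert(0, root)
from src.tasks import tasks

print(tasks.function_1())
```

The same absolute form, `from src.tasks import tasks`, would replace the failing `from ...src.folder.sub_folder import file_name` style in some_scan.py.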

Python: Dynamically imported module fails when first created, then succeeds

I have a template engine named Contemplate which has implementations for php, node/js and python.
All work fine, except that lately the Python implementation has been giving me some issues. Specifically, the problem appears when first parsing a template and creating the template Python code, which is then dynamically imported as a module. When the template has already been created, everything works fine; but when the template needs to be parsed and saved to disk and THEN imported, it raises an error, e.g.
ModuleNotFoundError: No module named 'blah blah'
(Note this error appears to be random; it is not always raised. Many times it works even if the template was created just before importing; other times it fails, and then if run again with the template already created, it succeeds.)
Is there any way I can bypass this issue, maybe by adding a delay between saving a parsed template and then importing it as a module, or something else?
The code to import the module (the parsed template, which is now a Python class) is below:
def import_tpl( filename, classname, cacheDir, doReload=False ):
    # http://www.php2python.com/wiki/function.import_tpl/
    # http://docs.python.org/dev/3.0/whatsnew/3.0.html
    # http://stackoverflow.com/questions/4821104/python-dynamic-instantiation-from-string-name-of-a-class-in-dynamically-imported
    #_locals_ = {'Contemplate': Contemplate}
    #_globals_ = {'Contemplate': Contemplate}
    #if 'execfile' in globals():
    #    # Python 2.x
    #    execfile(filename, _globals_, _locals_)
    #    return _locals_[classname]
    #else:
    #    # Python 3.x
    #    exec(read_file(filename), _globals_, _locals_)
    #    return _locals_[classname]
    # http://docs.python.org/2/library/imp.html
    # http://docs.python.org/2/library/functions.html#__import__
    # http://docs.python.org/3/library/functions.html#__import__
    # http://stackoverflow.com/questions/301134/dynamic-module-import-in-python
    # http://stackoverflow.com/questions/11108628/python-dynamic-from-import
    # also: http://code.activestate.com/recipes/473888-lazy-module-imports/
    # using import instead of execfile usually takes advantage of Python's cached compiled code
    global _G
    getTplClass = None
    # add the dynamic import paths to sys.path
    basename = os.path.basename(filename)
    directory = os.path.dirname(filename)
    os.sys.path.append(cacheDir)
    os.sys.path.append(directory)
    currentcwd = os.getcwd()
    os.chdir(directory)  # change working directory so we know import will work
    if os.path.exists(filename):
        modname = basename[:-3]  # remove .py extension
        mod = __import__(modname)
        if doReload: reload(mod)  # might be out of date
        # a trick in order to pass the Contemplate super-class in a cross-module way
        getTplClass = getattr( mod, '__getTplClass__' )
    # restore current dir
    os.chdir(currentcwd)
    # remove the dynamic import paths from sys.path
    del os.sys.path[-1]
    del os.sys.path[-1]
    # return the tplClass if found
    if getTplClass: return getTplClass(Contemplate)
    return None
Note: the engine creates an __init__.py file in cacheDir if it is not there already.
If needed, I can change the import_tpl function to something else; I don't mind.
Python tested is Python 3.6 on Windows, but I don't think this is a platform-specific issue.
To test the issue you can download the GitHub repository (linked above) and run the /tests/test.py test after clearing all cached templates from the /tests/_tplcache/ folder.
UPDATE:
I am thinking of adding a while loop with some counter in import_tpl that catches the raised error, if any, and retries a specified number of times until it succeeds in importing the module. But I am also wondering whether this is a good solution, or whether there is something else I am missing here.
UPDATE (20/02/2019):
Added a loop to retry a specified number of times, plus a small delay of 1 sec, if the template module initially failed to import (see the online repository code), but it still sometimes raises the same error when templates are created right before being imported. Any solutions?
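As an aside (not confirmed as the cause here, but it matches the symptoms): one known source of intermittent ModuleNotFoundError for files created at runtime is the import system's cached directory listings. When a module file is written after the interpreter has started, a path finder may still hold a stale listing of that directory; calling importlib.invalidate_caches() before importing clears those caches. A minimal sketch:

```python
import importlib
import os
import sys
import tempfile

# Write a module file *after* the interpreter has started, mimicking a
# template that was just parsed and saved to disk.
cache_dir = tempfile.mkdtemp()
with open(os.path.join(cache_dir, "freshly_saved_tpl.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path.append(cache_dir)
# If a finder has already cached this directory's listing, a just-created
# file may not be seen, raising ModuleNotFoundError intermittently.
# invalidate_caches() clears those cached directory listings.
importlib.invalidate_caches()
mod = importlib.import_module("freshly_saved_tpl")
print(mod.VALUE)  # 42
```

If this is the cause, a single invalidate_caches() call in import_tpl, right before the __import__, would make the retry loop and the 1-second delay unnecessary.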
Right, using a "while" loop to handle the exception would be one way:
while True:
    try:
        # the module importing
        break
    except ModuleNotFoundError:
        print("NOPE! Module not found")
If it works for some module files and not others, the likely suspect is the template files themselves.

script runs fine, but can't get pytest to work

Trying to learn pytest. The following script runs fine, but with pytest it fails because it can't find the CSV file.
import csv

def load_data(file):
    mast_list = []
    with open(file) as csvfile:
        data = csvfile.read()
        phone_masts = data.split('\n')
        columns = csv.reader(phone_masts, delimiter=',')
        for row in columns:
            if len(row) > 0:
                mast_list.append(row)
    return mast_list
I am just trying to get something working, so I'm testing that the function returns a list, but it says no CSV file was found. I'm sure there are plenty of other issues, but I'm trying to do one bit at a time. Here is the test:
import pytest
import mobile_phone_data

def test_column_count():
    file = 'Mobile Phone Masts.csv'
    assert load_list() == type(list)
Why does the script work on its own, but the test fail because it can't find the CSV file?
It is actually a bit surprising that you get a file-not-found error: you create a function under one name and try to test a different function, which is an error to start with.
I called your first listing foo.py and modified your test script as follows:
test_foo.py
from foo import load_data

def test_column_count():
    file = 'spam.csv'
    assert isinstance(load_data(file), list)
There is also a file called spam.csv; all three files are in the same folder. pytest runs this test and it passes.
Other issues in your code:
- your CSV reading does unnecessary work: you do not have to split on newlines the hard way, just pass the open file object straight to the reader instead
- isinstance should be used for type checking
- you might create a temp file for unit testing and destroy it afterwards
- the initial function load_data() can be split in two: one that reads a file and the other that parses its content, which should eventually make it easier to test.
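The temp-file suggestion above can be sketched with pytest's built-in tmp_path fixture. The load_data below is a simplified variant of the question's function (passing the file object straight to csv.reader, as suggested), so the test never depends on a CSV file sitting in the repository:

```python
import csv

def load_data(file):
    # Simplified variant of the question's function: read a CSV into
    # a list of rows, skipping empty lines.
    mast_list = []
    with open(file, newline="") as csvfile:
        for row in csv.reader(csvfile):
            if row:
                mast_list.append(row)
    return mast_list

def test_load_data_returns_list(tmp_path):
    # tmp_path is a built-in pytest fixture: a fresh temporary directory
    # unique to this test, cleaned up automatically afterwards.
    csv_file = tmp_path / "masts.csv"
    csv_file.write_text("name,height\nmast1,30\n")
    result = load_data(str(csv_file))
    assert isinstance(result, list)
    assert result[0] == ["name", "height"]
```

Because the test writes its own input file, it passes no matter which directory pytest is run from, which is exactly the failure mode in the question.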

restart python (or reload modules) in py.test tests

I have a (python3) package that has completely different behaviour depending on how it's init()ed (perhaps not the best design, but rewriting is not an option). The module can only be init()ed once; a second time gives an error. I want to test this package (both behaviours) using py.test.
Note: the nature of the package makes the two behaviours mutually exclusive; there is no possible reason to ever want both in a single program.
I have several test_xxx.py modules in my test directory. Each module will init the package in the way it needs (using fixtures). Since py.test starts the Python interpreter once, running all test modules in one py.test run fails.
Monkey-patching the package to allow a second init() is not something I want to do, since there is internal caching etc. that might result in unexplained behaviour.
Is it possible to tell py.test to run each test module in a separate Python process (thereby not being influenced by inits in another test module)?
Is there a way to reliably reload a package (including all sub-dependencies, etc.)?
Is there another solution? (I'm thinking of importing and then unimporting the package in a fixture, but this seems excessive.)
To reload a module, try using reload() from the importlib library.
Example:
from importlib import reload
import some_lib

# do something
reload(some_lib)
Also, launching each test in a new process is viable, but multiprocessed code is kind of painful to debug.
Example:
import some_test
from multiprocessing import Manager, Process

# create a new return value holder, in this case a list
manager = Manager()
return_value = manager.list()

# create a new process
process = Process(target=some_test.some_function, args=(arg, return_value))

# execute the process
process.start()

# wait for the process to finish
process.join()

# you can now use your return value as if it were a normal list,
# as long as it was assigned in your subprocess
Delete all your module imports, and also the test imports that import your modules, from sys.modules:
import sys

for key in list(sys.modules.keys()):
    if key.startswith("your_package_name") or key.startswith("test"):
        del sys.modules[key]
You can use this as a fixture by defining it in your conftest.py file with the @pytest.fixture decorator.
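A self-contained sketch of why this works (module names here are illustrative, and the module is written to a temp directory only so the sketch runs on its own): removing an entry from sys.modules forces the next import to re-execute the module body, giving a fresh init. Note this only works cleanly for pure-Python modules; state held in C extensions or references kept by other modules will survive.

```python
import importlib
import os
import sys
import tempfile

# Create a throwaway module whose top level produces fresh state on each
# execution, standing in for a package that can only be init()ed once.
mod_dir = tempfile.mkdtemp()
with open(os.path.join(mod_dir, "once_only.py"), "w") as f:
    f.write("import random\nTOKEN = random.random()\n")
sys.path.insert(0, mod_dir)
importlib.invalidate_caches()  # the file was created after startup

import once_only
first = once_only.TOKEN

# Removing the cached entry forces the next import to re-run the module body.
del sys.modules["once_only"]
import once_only
assert once_only.TOKEN != first  # fresh module state
print("re-import re-executed the module")
```

In the fixture from the answer above, the deletion loop plays the role of the `del sys.modules[...]` line, run before each test module inits the package.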
I once had a similar problem; quite bad design, though..
@pytest.fixture()
def module_type1():
    mod = importlib.import_module('example')
    mod._init(10)
    yield mod
    del sys.modules['example']

@pytest.fixture()
def module_type2():
    mod = importlib.import_module('example')
    mod._init(20)
    yield mod
    del sys.modules['example']

def test1(module_type1):
    pass

def test2(module_type2):
    pass
The example/__init__.py had something like this:
def _init(val):
    if 'sample' in globals():
        logger.info(f'example already imported, val: {sample}')
    else:
        globals()['sample'] = val
        logger.info(f'importing example with val: {val}')
output:
importing example with val : 10
importing example with val : 20
No clue as to how complex your package is, but if it's just global variables, then this probably helps.
I had the same problem, and found three solutions:
1. reload(some_lib)
2. Patch the SUT: since the imported name is bound in the SUT's own namespace, you can patch the SUT itself. For example, if you use f2 of m2 in m1, you can patch m1.f2 instead of m2.f2.
3. Import the module, and use module.function.
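The "patch where it is used" rule from solution 2 can be sketched with unittest.mock. The module names m1/m2 are the ones the answer uses as examples; the modules are written to a temp directory only so the sketch runs on its own:

```python
import importlib
import os
import sys
import tempfile
from unittest import mock

# Build two tiny modules: m2 defines f2, and m1 imports it with
# "from m2 import f2", binding the name f2 in m1's own namespace.
mod_dir = tempfile.mkdtemp()
with open(os.path.join(mod_dir, "m2.py"), "w") as f:
    f.write("def f2():\n    return 'real'\n")
with open(os.path.join(mod_dir, "m1.py"), "w") as f:
    f.write("from m2 import f2\n\ndef caller():\n    return f2()\n")
sys.path.insert(0, mod_dir)
importlib.invalidate_caches()  # the files were created after startup

import m1

# Patching m2.f2 does NOT affect m1: m1 holds its own reference to f2.
with mock.patch("m2.f2", return_value="fake"):
    print(m1.caller())  # real

# Patch the name where it is looked up: m1.f2.
with mock.patch("m1.f2", return_value="fake"):
    print(m1.caller())  # fake
```

This is also why solution 3 (import the module and call module.function) sidesteps the problem: the lookup then goes through the module object at call time, so patching m2.f2 would be seen.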

Importing one module from different other modules only executes it once. Why?

I am confused about some behavior of Python. I always thought importing a module basically meant executing it. (Like they say here: Does python execute imports on importation) So I created three simple scripts to test something:
main.py
import config

print(config.a)
config.a += 1
print(config.a)

import test
print(config.a)
config.py
def get_a():
    print("get_a is called")
    return 1

a = get_a()
test.py
import config

print(config.a)
config.a += 1
The output when running main.py is:
get_a is called
1
2
2
3
Now I am confused because I expected get_a() to be called twice, once from main.py and once from test.py. Can someone please explain why it is not? What if I really wanted to import config a second time, like it was in the beginning with a=1?
(Fortunately, for my project this behavior is exactly what I wanted, because get_a() corresponds to a function, which reads lots of data from a database and of course I only want to read it once, but it should be accessible from multiple modules.)
Because the config module is already loaded, there's no need to 'run' it anymore; the import just returns the already-loaded module instance.
Some standard library modules make use of this, for example random. It creates an object of class Random on first import and reuses it when it gets imported again. A comment in the module reads:
# Create one instance, seeded from current time, and export its methods
# as module-level functions. The functions share state across all uses
# (both in the user's code and in the Python libraries), but that's fine
# for most programs and is easier for the casual user than making them
# instantiate their own Random() instance.
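To address the follow-up question (re-importing config so that a is 1 again): importlib.reload() re-executes a cached module's body in place. A minimal sketch, using a stand-in for config.py written at runtime so the example is self-contained:

```python
import importlib
import os
import sys
import tempfile

# A stand-in for the question's config.py, written to a temp directory.
mod_dir = tempfile.mkdtemp()
with open(os.path.join(mod_dir, "config_demo.py"), "w") as f:
    f.write(
        "def get_a():\n"
        "    print('get_a is called')\n"
        "    return 1\n"
        "\n"
        "a = get_a()\n"
    )
sys.path.insert(0, mod_dir)
importlib.invalidate_caches()  # the file was created after startup

import config_demo    # prints: get_a is called
config_demo.a += 1
print(config_demo.a)  # 2

# reload() re-executes the module body, so get_a() is called again
# and a is reset to 1.
importlib.reload(config_demo)
print(config_demo.a)  # 1
```

Note that for the database-loading use case described in the question, the caching behaviour is exactly what you want, so reload() should stay a deliberate, explicit operation.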
