How to monkey patch class instance variable in pytest - python-3.x

I have a class that has a variable in __init__ and a method. The variable stores the S3 bucket name as a string, which is used by the method.
class Manage:
    def __init__(self):
        self.bucket = 'doc'

    def read_file(self):
        bucket = self.bucket
        ...
        return file
I am writing a test case for the method, but I want it to use a different bucket, so I want to patch the value of that variable.
The code I tried:
import Manage

def test_manage(monkeypatch):
    monkeypatch.setattr(Manage, 'bucket', 'doc2')
    ...
The above code is not working, since bucket is not a class-level variable (I guess). I don't know how I should change that.

As @gold_cy suggested, you can set it after instantiating the object:
import Manage

def test_manage():
    manage = Manage()
    manage.bucket = 'doc2'
    ...
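If you want the patch to be undone automatically after the block, a stdlib alternative is unittest.mock.patch.object on the instance. This is a minimal sketch; the Manage class below is a stand-in for the real one, defined inline only so the example is self-contained:

```python
from unittest import mock

class Manage:
    # stand-in for the real class, for illustration only
    def __init__(self):
        self.bucket = 'doc'

def test_manage():
    manage = Manage()
    # patch the instance attribute; the original value is restored on exit
    with mock.patch.object(manage, 'bucket', 'doc2'):
        assert manage.bucket == 'doc2'
    assert manage.bucket == 'doc'
```

Unlike assigning manage.bucket directly, the context manager guarantees the attribute is reset even if the test body raises.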


Python mocking using MOTO for SSM

Taken from this answer:
Python mock AWS SSM
I now have this code:
test_2.py
from unittest import TestCase

import boto3
import pytest
from moto import mock_ssm

@pytest.yield_fixture
def s3ssm():
    with mock_ssm():
        ssm = boto3.client("ssm")
        yield ssm

@mock_ssm
class MyTest(TestCase):
    def setUp(self):
        ssm = boto3.client("ssm")
        ssm.put_parameter(
            Name="/mypath/password",
            Description="A test parameter",
            Value="this is it!",
            Type="SecureString",
        )

    def test_param_getting(self):
        import real_code
        resp = real_code.get_variable("/mypath/password")
        assert resp["Parameter"]["Value"] == "this is it!"
and this is the code under test (a cut-down example):
real_code.py
import boto3

class ParamTest:
    def __init__(self) -> None:
        self.client = boto3.client("ssm")

    def get_parameters(self, param_name):
        print(self.client.describe_parameters())
        return self.client.get_parameters_by_path(Path=param_name)

def get_variable(param_name):
    p = ParamTest()
    param_details = p.get_parameters(param_name)
    return param_details
I have tried a number of solutions, and switched between pytest and unittest quite a few times!
Each time I run the code it doesn't reach out to AWS, so the mock does seem to be affecting the boto3 client, but it doesn't return the parameter. If I edit real_code.py to not have a class inside it, the test passes.
Is it not possible to patch the client inside the class in the real_code.py file? I'm trying to do this without editing real_code.py at all, if possible.
Thanks,
The get_parameters_by_path call returns all parameters whose names are prefixed with the supplied path.
When providing /mypath, it would return /mypath/password.
But when providing /mypath/password, as in your example, it will only return parameters that look like this: /mypath/password/..
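The matching rule can be sketched as a toy model in plain Python (this is only an illustration of the semantics, not how SSM is implemented):

```python
def by_path(params, path):
    # mimic get_parameters_by_path: a parameter matches when its
    # name starts with the path followed by "/"
    prefix = path.rstrip("/") + "/"
    return [p for p in params if p.startswith(prefix)]

names = ["/mypath/password"]
print(by_path(names, "/mypath"))           # ['/mypath/password']
print(by_path(names, "/mypath/password"))  # [] -- nothing below that path
```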
If you are only looking to retrieve a single parameter, the get_parameter call would be more suitable:
class ParamTest:
    def __init__(self) -> None:
        self.client = boto3.client("ssm")

    def get_parameters(self, param_name):
        # Decrypt the value, as it is stored as a SecureString
        return self.client.get_parameter(Name=param_name, WithDecryption=True)
Edit: Note that Moto behaves the same as AWS in this.
From https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm.html#SSM.Client.get_parameters_by_path:
[The Path-parameter is t]he hierarchy for the parameter. [...] The hierarchy is the parameter name except the last part of the parameter. For the API call to succeed, the last part of the parameter name can't be in the path.

How to share variables across Python modules when getter and setter methods are required

How can I share variables across different modules of my Python project if I need these variables to have setter and getter methods?
The reason I need setter/getter methods is that when getting and setting the variables I need backward compatibility with code that stored these variables as environment variables, so I need to write and read using os.environ too.
Usually all I need to do is create a class with class-level variables, import the class in each module, and access it as follows:
datastore.py
class DataStore:
    target_server_ip: str = '10.10.10.100'

consumer.py
from project.datastore import DataStore

def print_target_server_ip():
    print(DataStore.target_server_ip)
This doesn't work (at least not in Python 3.6.5) if the variables require property getter and setter methods.
The reason is that I cannot define a class level method as a property. The following code just isn't possible:
datastore.py
class DataStore:
    target_server_ip: str = '10.10.10.100'

    @classmethod
    @property
    def target_server_ip(cls):
        return cls.target_server_ip

    @classmethod
    @target_server_ip.setter
    def target_server_ip(cls, value):
        cls.target_server_ip = value
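As an aside, one way to get a working class-level property is to define it on a metaclass, since attribute lookup on a class consults its metaclass. This is a hedged sketch, not part of the original post; the TARGET_SERVER_IP environment variable name and _target_server_ip attribute are illustrative:

```python
import os

class DataStoreMeta(type):
    # a property on the metaclass is a class-level property:
    # it fires on DataStore.target_server_ip
    @property
    def target_server_ip(cls):
        # check os.environ first for backward compatibility,
        # then fall back to the class attribute
        return os.environ.get('TARGET_SERVER_IP', cls._target_server_ip)

    @target_server_ip.setter
    def target_server_ip(cls, value):
        cls._target_server_ip = value
        os.environ['TARGET_SERVER_IP'] = value

class DataStore(metaclass=DataStoreMeta):
    _target_server_ip = '10.10.10.100'

DataStore.target_server_ip = '10.10.10.200'
print(DataStore.target_server_ip)  # '10.10.10.200'
```

Note that a metaclass property is only visible on the class itself, not on its instances.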
To solve this issue I propose the following code section. It is split into two classes.
The first class works at the class level and maintains a two-level nested dictionary keyed by datastore name and variable name.
The second class is the datastore itself. It has the minimum required code to keep it visually simple.
This specific implementation has one known error-prone limitation: if you declare two or more variables with the same name in different datastore classes, i.e. you define a class FrameworkDatastore and another class SecondDatastore with the same variable in both, the environment will have only one of them.
import inspect
import logging
import os
from typing import Any, Dict, Type

logger = logging.getLogger(__name__)

class _BaseDataStoreWithEnvironSupport:
    """
    The class supports global storage of variables in a class-level dictionary, allowing all instances of the
    datastore to access the same values.
    This class is backward compatible: it also stores the global variables in os.environ.
    """
    _members: Dict[str, Dict[str, Any]] = {}  # holds all the members of the datastore

    @classmethod
    def get_value(cls) -> Any:
        datastore_name: str = cls.__name__
        member_name: str = inspect.stack()[1][3]
        env_value: str = os.environ.get(member_name)
        ds_value: Any = cls._members[datastore_name][member_name]
        if env_value:
            type_ds_value: Type = type(ds_value)
            if type_ds_value is bool:
                value: bool = (env_value == True.__str__())
            else:
                value: Any = type(ds_value)(env_value)
            if value != ds_value:
                logger.warning('Environment stored value is different from Datastore value. Check your implementation')
        else:
            value: Any = ds_value
        return value

    @classmethod
    def set_value(cls, value: Any) -> None:
        datastore_name: str = cls.__name__
        name: str = inspect.stack()[1][3]
        if datastore_name not in cls._members.keys():
            cls._members[datastore_name] = {}
        cls._members[datastore_name][name] = value
        os.environ[name] = str(value)

    def validate_datastore(self):
        members = set([attr for attr in dir(self) if not callable(getattr(self, attr)) and not attr.startswith("_")])
        if members.__len__() == 0:
            raise RuntimeError(f'There are no members in the datastore or the validation runs at the start of __init__')
        datastore_name: str = self.__class__.__name__
        dict_keys: set = set(self._members[datastore_name].keys())
        if members != dict_keys:
            missing_members: set = members - dict_keys
            raise NotImplementedError(f'Datastore is missing get and set methods for members: {missing_members}')
class FrameworkDatastore(_BaseDataStoreWithEnvironSupport):
    """
    This class stores all variables that are currently saved as global or os.environ variables.
    If the data stored here becomes irrelevant after a code change, or is seldom used, remove it and merge its
    functionality into other sections
    """
    def __init__(self):
        """
        Predefine all the members of the datastore.
        Members which don't implement get/set methods will be flagged by the validate_datastore check
        """
        self.run_traffic_validations: bool = True  # Should Ixia traffic validations run in the current suite
        # The validation of the datastore must come at the end of the __init__ method
        self.validate_datastore()

    @property
    def run_traffic_validations(self):
        return self.get_value()

    @run_traffic_validations.setter
    def run_traffic_validations(self, value: Any):
        self.set_value(value)

if __name__ == '__main__':
    # This tests the datastore code
    fd1 = FrameworkDatastore()
    fd2 = FrameworkDatastore()
    print(fd1.run_traffic_validations)
    print(fd2.run_traffic_validations)
    fd1.run_traffic_validations = False
    print(fd1.run_traffic_validations)
    print(fd2.run_traffic_validations)
    fd2.run_traffic_validations = True
    print(fd1.run_traffic_validations)
    print(fd2.run_traffic_validations)

How to create a constant object for AWS DynamoDB in Python?

I want to know how I can check whether the boto3 client object has already been instantiated or not. I have the below code for creating the object for AWS DynamoDB:
import boto3

def aws_dynamodb():
    return boto3.resource("dynamodb")

def get_db_con():
    dynamo_conn = aws_dynamodb()
    return dynamo_conn
Now the above get_db_con() returns the connection to DynamoDB, but I want to make sure that get_db_con does not create the client object from aws_dynamodb() every time it is called.
For example:
def aws_db_table(table):
    con = get_db_con()
    return con.table(table)

account_table = aws_db_table("my_ac_table")
audit_table = aws_db_table("audit_table")
So here, whenever I call aws_db_table, it should not create the client for aws_dynamodb() every time.
How can I check whether aws_dynamodb() is already instantiated, so that it does not create a new client object? Creating the client object every time is costly.
Note: I want to run the code in a Lambda function.
Please help us with this.
Thanks
I usually go with a low-tech solution with a global variable that looks something like this:
import boto3

# Underscore-prefix to indicate this is something private
_TABLE_RESOURCE = None

def get_table_resource():
    global _TABLE_RESOURCE
    if _TABLE_RESOURCE is None:
        _TABLE_RESOURCE = boto3.resource("dynamodb").Table("my_table")
    return _TABLE_RESOURCE

def handler(event, context):
    table = get_table_resource()
    # ...
Global variables are persisted across Lambda invocations that reuse the same execution environment, which is why this works.
Another option would be to use the lru_cache from functools, which uses memoization.
from functools import lru_cache

import boto3

@lru_cache(maxsize=128)
def get_table_resource():
    return boto3.resource("dynamodb").Table("my_table")

def handler(event, context):
    table = get_table_resource()
    # ...
For those not familiar with memoization the first solution is probably easier to read + understand.
(Note: I wrote the code from memory, there may be bugs)
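To see the memoization behavior without touching AWS, here is a small self-contained sketch; expensive_resource and call_count are illustrative names standing in for the boto3 resource factory:

```python
from functools import lru_cache

call_count = 0

@lru_cache(maxsize=1)
def expensive_resource():
    # stands in for boto3.resource(...).Table(...); the body
    # only runs on the first call, later calls hit the cache
    global call_count
    call_count += 1
    return object()

a = expensive_resource()
b = expensive_resource()
assert a is b           # the cached object is reused
assert call_count == 1  # the body ran exactly once
```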

python 3: mock a method of the boto3 S3 client

I want to unit test some code that calls a method of the boto3 s3 client.
I can't use moto because this particular method (put_bucket_lifecycle_configuration) is not yet implemented in moto.
I want to mock the S3 client and assure that this method was called with specific parameters.
The code I want to test looks something like this:
# sut.py
import boto3

class S3Bucket(object):
    def __init__(self, name, lifecycle_config):
        self.name = name
        self.lifecycle_config = lifecycle_config

    def create(self):
        client = boto3.client("s3")
        client.create_bucket(Bucket=self.name)
        rules = ...  # some code that computes rules from self.lifecycle_config
        # I want to test that `rules` is correct in the following call:
        client.put_bucket_lifecycle_configuration(Bucket=self.name,
                                                  LifecycleConfiguration={"Rules": rules})

def create_a_bucket(name):
    lifecycle_policy = ...  # a dict with a bunch of key/value pairs
    bucket = S3Bucket(name, lifecycle_policy)
    bucket.create()
    return bucket
In my test, I'd like to call create_a_bucket() (though instantiating an S3Bucket directly is also an option) and make sure that the call to put_bucket_lifecycle_configuration was made with the correct parameters.
I have messed around with unittest.mock and botocore.stub.Stubber but have not managed to crack this. Unless otherwise urged, I am not posting my attempts since they have not been successful so far.
I am open to suggestions on restructuring the code I'm trying to test in order to make it easier to test.
Got the test to work with the following, where ... is the remainder of the arguments that are expected to be passed to s3.put_bucket_lifecycle_configuration().
# test.py
from unittest.mock import patch
import unittest

import sut

class MyTestCase(unittest.TestCase):
    @patch("sut.boto3")
    def test_lifecycle_config(self, cli):
        s3 = cli.client.return_value
        sut.create_a_bucket("foo")
        s3.put_bucket_lifecycle_configuration.assert_called_once_with(Bucket="foo", ...)

if __name__ == '__main__':
    unittest.main()
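The key mechanism here is that patching sut.boto3 makes cli.client.return_value the very object the code under test receives from boto3.client("s3"), and a MagicMock records every call made on it. A minimal stdlib illustration of that recording (toy arguments, no boto3 involved):

```python
from unittest.mock import MagicMock

client = MagicMock()

# the code under test would call the client like this:
client.put_bucket_lifecycle_configuration(
    Bucket="foo", LifecycleConfiguration={"Rules": []}
)

# the mock recorded the call, so the test can verify it exactly
client.put_bucket_lifecycle_configuration.assert_called_once_with(
    Bucket="foo", LifecycleConfiguration={"Rules": []}
)
```

If the recorded arguments differ from the expected ones, assert_called_once_with raises an AssertionError, which is what makes the test fail on a wrong rules value.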

How to store a function's return value in a variable (kivy properties)

import pickle

class Data(object):
    def get_key_nicks(self):
        '''
        It returns the key and nicks objects
        '''
        file = open(self.key_address, 'rb')
        key = pickle.load(file)
        file.close()
        file = open(self.nicks_address, 'rb')
        nicks = pickle.load(file)
        file.close()
        return (key, nicks)
Above is the data API and the function which I want to use in kivy:
class MainScreen(FloatLayout):
    data = ObjectProperty(Data())
    key, nicks = ListProperty(data.get_key_nicks())
It gives an error like: AttributeError: 'kivy.properties.ObjectProperty' object has no attribute 'get_key_nicks'
Properties are descriptors, which basically means they look like normal attributes when accessed from instances of the class, but at class level they are objects on their own. That's the nature of the problem here - at class level data is an ObjectProperty, even though if you access it from an instance of the class you'll get your Data() object that you passed in as the default value.
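The built-in property behaves the same way, so the distinction can be shown without kivy at all (an illustrative sketch; Widget and data are made-up names):

```python
class Widget:
    @property
    def data(self):
        # instance access goes through the descriptor protocol
        return 42

w = Widget()
print(w.data)             # 42 -- the getter runs for instances
print(type(Widget.data))  # class-level access yields the property object itself
```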
That said, I don't know what your code is actually trying to do, do you want key and nicks to be separate ListProperties?
Could you expand a bit more on what you're trying to do?
I think all you actually need to do is:
class MainScreen(FloatLayout):
    data = ObjectProperty(Data())

    def get_key_nicks(self):
        return self.data.get_key_nicks()
