How to access public attributes in RPyC on nested data? - python-3.x

I am trying to access public attributes over an RPyC call by following this document, but it doesn't work as described.
The documentation says that if you don't specify protocol_config={'allow_public_attrs': True}, public attributes, even of builtin data types, won't be accessible. However, even when we do specify this, public attributes of nested data structures are not accessible. Why?
RPyC Server code.
import json

import rpyc

class MyService(rpyc.Service):
    def on_connect(self, conn):
        # code that runs when a connection is created
        # (to init the service, if needed)
        pass

    def on_disconnect(self, conn):
        # code that runs after the connection has already closed
        # (to finalize the service, if needed)
        pass

    def exposed_get_answer(self):  # this is an exposed method
        return 42

    exposed_the_real_answer_though = 43  # an exposed attribute

    def get_question(self):  # while this method is not exposed
        return "what is the airspeed velocity of an unladen swallow?"

    def exposed_hello(self, collection):
        print("Collection is ", collection)
        print("Collection type is ", type(collection).__name__)
        for item in collection:
            print("Item type is ", type(item).__name__)
            print(item)

    def exposed_hello2(self, collection):
        for item in collection:
            for key, val in item.items():
                print(key, val)

    def exposed_hello_json(self, collection):
        for item in collection:
            item = json.loads(item)
            for key, val in item.items():
                print(key, val)

if __name__ == "__main__":
    from rpyc.utils.server import ThreadedServer
    t = ThreadedServer(
        MyService(),
        port=3655,
        protocol_config={'allow_public_attrs': True}
    )
    t.start()
Client Side Calls
>>> import rpyc
>>> rpyc.__version__
(4, 0, 2)
>>> c = rpyc.connect('a.b.c.d', 3655) ; client=c.root
# Case 1
If the data is in a nested structure (using builtin data types), it doesn't work.
>>> data
[{'a': [1, 2], 'b': 'asa'}]
>>> client.hello2(data)
...
AttributeError: cannot access 'items'
========= Remote Traceback (2) =========
Traceback (most recent call last):
File "/root/lydian.egg/rpyc/core/protocol.py", line 329, in _dispatch_request
res = self._HANDLERS[handler](self, *args)
File "/root/lydian.egg/rpyc/core/protocol.py", line 590, in _handle_call
return obj(*args, **dict(kwargs))
File "sample.py", line 33, in exposed_hello2
for key, val in item.items():
File "/root/lydian.egg/rpyc/core/netref.py", line 159, in __getattr__
return syncreq(self, consts.HANDLE_GETATTR, name)
File "/root/lydian.egg/rpyc/core/netref.py", line 75, in syncreq
return conn.sync_request(handler, proxy, *args)
File "/root/lydian.egg/rpyc/core/protocol.py", line 471, in sync_request
return self.async_request(handler, *args, timeout=timeout).value
File "/root/lydian.egg/rpyc/core/async_.py", line 97, in value
raise self._obj
_get_exception_class.<locals>.Derived: cannot access 'items'
Case 2: Workaround, pass nested data as a JSON string (poor man's pickle) and decode it at the server end.
>>> import json
>>> jdata = [json.dumps({'a': [1, 2], 'b': 'asa'})]
>>> client.hello_json(jdata) # Prints following at remote endpoint.
a [1, 2]
b asa
Case 3:
Interestingly, at the first level, builtin items are accessible, as in the hello method. But calling hello2 on nested data gives the error above.
>>> client.hello([1,2,3,4]) # Prints following at remote endpoint.
Collection is [1, 2, 3, 4]
Collection type is list
Item type is int
1
Item type is int
2
Item type is int
3
Item type is int
4
I have a workaround (case 2 above), but I'm looking for an explanation of why this is not allowed, or whether it is a bug. Thanks for any input.

The issue is not related to nested data.
Your problem is that you are not allowing public attributes in the client side.
The solution is simple:
c = rpyc.connect('a.b.c.d', 3655, config={'allow_public_attrs': True})
Keep in mind that rpyc is a symmetric protocol (see https://rpyc.readthedocs.io/en/latest/docs/services.html#decoupled-services).
In your case, the server tries to access the client's object, so allow_public_attrs must be set on the client side.
In fact, for your specific example, there is no need to set allow_public_attrs on the server side at all.
Regarding case 3:
In the line for item in collection:, the server tries to access two fields: collection.__iter__ and collection.__next__.
Both of these fields are considered by default as "safe attributes", and this is why you didn't get error there.
To inspect the default configuration dictionary in rpyc:
>>> import rpyc
>>> rpyc.core.protocol.DEFAULT_CONFIG
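To see concretely what the default policy blocks, here is a minimal, stdlib-only sketch; the Netref class and SAFE_ATTRS set below are a toy stand-in for rpyc's real netref machinery, not its actual API. Iteration only needs dunder attributes that rpyc treats as safe by default, while dict.items is a public attribute that gets filtered unless allow_public_attrs is set:

```python
# Toy model of rpyc's attribute filtering -- NOT the real rpyc API.
SAFE_ATTRS = {'__iter__', '__next__', '__len__', '__getitem__'}

class Netref:
    """Stand-in for an rpyc netref: mediates every attribute access."""
    def __init__(self, obj, allow_public_attrs=False):
        self._obj = obj
        self._allow = allow_public_attrs

    def attr(self, name):
        # Safe dunders are always allowed; everything else needs the flag.
        if name in SAFE_ATTRS or self._allow:
            return getattr(self._obj, name)
        raise AttributeError("cannot access %r" % name)

proxy = Netref({'a': [1, 2], 'b': 'asa'})
proxy.attr('__iter__')   # safe attribute: iteration works by default
try:
    proxy.attr('items')  # public attribute: blocked by default
except AttributeError as e:
    print(e)             # cannot access 'items'

open_proxy = Netref({'a': [1, 2], 'b': 'asa'}, allow_public_attrs=True)
print(dict(open_proxy.attr('items')()))  # {'a': [1, 2], 'b': 'asa'}
```

The real list of always-allowed names lives under the safe_attrs key of rpyc's DEFAULT_CONFIG dictionary shown above.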


Decorators or assertions in setters to check property type?

In a python project, my class has several properties that I need to be of specific type. Users of the class must have the ability to set the property.
What is the best way to do this? Two solutions come to my mind:
1. Have test routines in each setter function.
2. Use decorators for attributes
My current solution is 1 but I am not happy with it due to the code duplication. It looks like this:
class MyClass(object):
    @property
    def x(self):
        return self._x

    @x.setter
    def x(self, val):
        if not isinstance(val, int):
            raise Exception("Value must be of type int")
        self._x = val

    @property
    def y(self):
        return self._y

    @y.setter
    def y(self, val):
        if not isinstance(val, (tuple, set, list)):
            raise Exception("Value must be of type tuple or set or list")
        self._y = val
From what I know of decorators, it should be possible to have a decorator before def x(self) handle this job. Alas, I fail miserably at this, as all the examples I found (like this or this) are not targeted at what I want.
The first question is thus: Is it better to use a decorator to check property types? If yes, the next question is: What is wrong with the decorator below (I want to be able to write @accepts(int))?
def accepts(types):
    """Decorator to check types of property."""
    def outer_wrapper(func):
        def check_accepts(prop):
            getter = prop.fget
            if not isinstance(self[0], types):
                msg = "Wrong type."
                raise ValueError(msg)
            return self
        return check_accepts
    return outer_wrapper
Appetizer
Callables
This is likely beyond your needs, since it sounds like you're dealing with end-user input, but I figured it may be helpful for others.
Callables include functions defined with def, built-in functions/methods such as open(), lambda expressions, callable classes, and many more. Obviously, if you only want to allow certain types of callables, you can still use isinstance() with types.FunctionType, types.BuiltinFunctionType, types.LambdaType, etc. But if this is not the case, the best solution I am aware of is demonstrated by the MyDecoratedClass.z property, using isinstance() with collections.abc.Callable. It's not perfect, and will return false positives in extraordinary cases (for example, if a class defines a __call__ function that doesn't actually make the class callable). The callable(obj) built-in is the only foolproof check to my knowledge; the MyClass.z property demonstrates its use, but you'd have to write another decorator function (or modify the existing one in MyDecoratedClass) to support check functions other than isinstance().
Iterables (and Sequences and Sets)
The y property in the code you provided is supposed to be restricted to tuples, sets, and lists, so the following may be of some use to you.
Instead of checking whether arguments are of individual types, you might want to consider using Iterable, Sequence, and Set from the collections.abc module. Please use caution though, as these types are far less restrictive than simply passing (tuple, set, list) as you have. abc.Iterable (as well as the others) works near-perfectly with isinstance(), although it too sometimes returns false positives (e.g. a class defines an __iter__ function but doesn't actually return an iterator -- who hurt you?). The only foolproof method of determining whether an argument is iterable is to call the iter(obj) built-in and let it raise a TypeError if it's not iterable, which could work in your case. I don't know of any built-in alternatives to abc.Sequence and abc.Set, but almost every sequence/set object is also iterable as of Python 3, if that helps. The MyClass.y2 property implements iter() as a demonstration; however, the decorator function in MyDecoratedClass does not (currently) support check functions other than isinstance(), so MyDecoratedClass.y2 uses abc.Iterable instead.
For completeness' sake, here is a quick comparison of their differences:
>>> from collections.abc import Iterable, Sequence, Set
>>> def test(x):
... print((isinstance(x, Iterable),
... isinstance(x, Sequence),
... isinstance(x, Set)))
...
>>> test(123) # int
(False, False, False)
>>> test("1, 2, 3") # str
(True, True, False)
>>> test([1, 2, 3]) # list
(True, True, False)
>>> test(range(3)) # range
(True, True, False)
>>> test((1, 2, 3)) # tuple
(True, True, False)
>>> test({1, 2, 3}) # set
(True, False, True)
>>> import numpy as np
>>> test(np.arange(3)) # numpy.ndarray
(True, False, False)
>>> test(zip([1, 2, 3],[4, 5, 6])) # zip
(True, False, False)
>>> test({1: 4, 2: 5, 3: 6}) # dict
(True, False, False)
>>> test({1: 4, 2: 5, 3: 6}.keys()) # dict_keys
(True, False, True)
>>> test({1: 4, 2: 5, 3: 6}.values()) # dict_values
(True, False, False)
>>> test({1: 4, 2: 5, 3: 6}.items()) # dict_items
(True, False, True)
Other Restrictions
Virtually all other argument type restrictions that I can think of must use hasattr(), which I'm not going to get into here.
Main Course
This is the part that actually answers your question. assert is definitely the simplest solution, but it has its limits.
class MyClass:
    @property
    def x(self):
        return self._x

    @x.setter
    def x(self, val):
        assert isinstance(val, int)  # raises AssertionError if val is not of type 'int'
        self._x = val

    @property
    def y(self):
        return self._y

    @y.setter
    def y(self, val):
        assert isinstance(val, (list, set, tuple))  # raises AssertionError if val is not of type 'list', 'set', or 'tuple'
        self._y = val

    @property
    def y2(self):
        return self._y2

    @y2.setter
    def y2(self, val):
        iter(val)  # raises TypeError if val is not iterable
        self._y2 = val

    @property
    def z(self):
        return self._z

    @z.setter
    def z(self, val):
        assert callable(val)  # raises AssertionError if val is not callable
        self._z = val

    def multi_arg_example_fn(self, a, b, c, d, e, f, g):
        assert isinstance(a, int)
        assert isinstance(b, int)
        # let's say 'c' is unrestricted
        assert isinstance(d, (int, float, str))
        assert isinstance(e, int)
        assert isinstance(f, int)
        assert isinstance(g, int)
        self._a = a
        self._b = b
        self._c = c
        self._d = d
        self._e = e
        self._f = f
        self._g = g
        return a + b * e - f // g
Pretty clean overall, besides the multi-argument function I threw in there at the end, demonstrating that asserts can get tedious. However, I'd argue that the biggest drawback here is the lack of Exception messages/variables. If the end-user sees an AssertionError, it has no message and is therefore mostly useless. If you write intermediate code that could except these errors, that code will have no variables/data to be able to explain to the user what went wrong. Enter the decorator function...
from collections.abc import Callable, Iterable

class MyDecoratedClass:
    def isinstance_decorator(*classinfo_args, **classinfo_kwargs):
        '''
        Usage:
        Always remember that each classinfo can be a type OR tuple of types.
        If the decorated function takes, for example, two positional arguments...
        * You only need to provide positional arguments up to the last positional argument that you want to restrict the type of. Take a look:
          1. Restrict the type of only the first argument with '@isinstance_decorator(<classinfo_of_arg_1>)'
             * Notice that a second positional argument is not required
             * Although if you'd like to be explicit for clarity (in exchange for a small amount of efficiency), use '@isinstance_decorator(<classinfo_of_arg_1>, object)'
             * Every object in Python must be of type 'object', so restricting the argument to type 'object' is equivalent to no restriction whatsoever
          2. Restrict the types of both arguments with '@isinstance_decorator(<classinfo_of_arg_1>, <classinfo_of_arg_2>)'
          3. Restrict the type of only the second argument with '@isinstance_decorator(object, <classinfo_of_arg_2>)'
             * Every object in Python must be of type 'object', so restricting the argument to type 'object' is equivalent to no restriction whatsoever
        Keyword arguments are simpler: @isinstance_decorator(<a_keyword> = <classinfo_of_the_kwarg>, <another_keyword> = <classinfo_of_the_other_kwarg>, ...etc)
        * Remember that you only need to include the kwargs that you actually want to restrict the type of (no using 'object' as a keyword argument!)
        * Using kwargs is probably more efficient than using example 3 above; I would avoid having to use 'object' as a positional argument as much as possible

        Programming-Related Errors:
        Raises IndexError if given more positional arguments than decorated function
        Raises KeyError if given keyword argument that decorated function isn't expecting
        Raises TypeError if given argument that is not of type 'type'
        * Raised by 'isinstance()' when fed improper 2nd argument, like 'isinstance(foo, 123)'
        * Virtually all UN-instantiated objects are of type 'type'
          Examples:
              example_instance = ExampleClass(*args)
              # Neither 'example_instance' nor 'ExampleClass(*args)' is of type 'type', but 'ExampleClass' itself is
              example_int = 100
              # Neither 'example_int' nor '100' is of type 'type', but 'int' itself is
              def example_fn(): pass
              # 'example_fn' is not of type 'type'.
              print(type(example_fn).__name__)   # function
              print(type(isinstance).__name__)   # builtin_function_or_method
              # As you can see, there are also several types of callable objects
              # If needed, you can retrieve most function/method/etc. types from the built-in 'types' module

        Functional/Intended Errors:
        Raises TypeError if a decorated function argument is not an instance of the type(s) specified by the corresponding decorator argument
        '''
        def isinstance_decorator_wrapper(old_fn):
            def new_fn(self, *args, **kwargs):
                for i in range(len(classinfo_args)):
                    classinfo = classinfo_args[i]
                    arg = args[i]
                    if not isinstance(arg, classinfo):
                        raise TypeError("%s() argument %s takes argument of type%s' but argument of type '%s' was given" %
                                        (old_fn.__name__, i,
                                         "s '" + "', '".join([x.__name__ for x in classinfo]) if isinstance(classinfo, tuple) else " '" + classinfo.__name__,
                                         type(arg).__name__))
                for k, classinfo in classinfo_kwargs.items():
                    kwarg = kwargs[k]
                    if not isinstance(kwarg, classinfo):
                        raise TypeError("%s() keyword argument '%s' takes argument of type%s' but argument of type '%s' was given" %
                                        (old_fn.__name__, k,
                                         "s '" + "', '".join([x.__name__ for x in classinfo]) if isinstance(classinfo, tuple) else " '" + classinfo.__name__,
                                         type(kwarg).__name__))
                return old_fn(self, *args, **kwargs)
            return new_fn
        return isinstance_decorator_wrapper
    @property
    def x(self):
        return self._x

    @x.setter
    @isinstance_decorator(int)
    def x(self, val):
        self._x = val

    @property
    def y(self):
        return self._y

    @y.setter
    @isinstance_decorator((list, set, tuple))
    def y(self, val):
        self._y = val

    @property
    def y2(self):
        return self._y2

    @y2.setter
    @isinstance_decorator(Iterable)
    def y2(self, val):
        self._y2 = val

    @property
    def z(self):
        return self._z

    @z.setter
    @isinstance_decorator(Callable)
    def z(self, val):
        self._z = val

    @isinstance_decorator(int, int, e=int, f=int, g=int, d=(int, float, str))
    def multi_arg_example_fn(self, a, b, c, d, e, f, g):
        # Identical to the assertions in MyClass.multi_arg_example_fn
        self._a = a
        self._b = b
        self._c = c
        self._d = d
        return a + b * e - f // g
Clearly, multi_arg_example_fn is one place where this decorator really shines. The clutter made by assertions has been reduced to a single line. Let's take a look at some example error messages:
>>> test = MyClass()
>>> dtest = MyDecoratedClass()
>>> test.x = 10
>>> dtest.x = 10
>>> print(test.x == dtest.x)
True
>>> test.x = 'Hello'
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 7, in x
AssertionError
>>> dtest.x = 'Hello'
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 100, in new_fn
TypeError: x() argument 0 takes argument of type 'int' but argument of type 'str' was given
>>> test.y = 1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 15, in y
AssertionError
>>> test.y2 = 1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 23, in y2
TypeError: 'int' object is not iterable
>>> dtest.y = 1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 100, in new_fn
TypeError: y() argument 0 takes argument of types 'list', 'set', 'tuple' but argument of type 'int' was given
>>> dtest.y2 = 1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 100, in new_fn
TypeError: y2() argument 0 takes argument of type 'Iterable' but argument of type 'int' was given
>>> test.z = open
>>> dtest.z = open
>>> test.z = None
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 31, in z
AssertionError
>>> dtest.z = None
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 100, in new_fn
TypeError: z() argument 0 takes argument of type 'Callable' but argument of type 'NoneType' was given
Far superior in my opinion. Everything looks good except...
>>> test.multi_arg_example_fn(9,4,[1,2],'hi', g=2,e=1,f=4)
11
>>> dtest.multi_arg_example_fn(9,4,[1,2],'hi', g=2,e=1,f=4)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 102, in new_fn
KeyError: 'd'
>>> print('I forgot that you have to merge args and kwargs in order for the decorator to work properly with both but I dont have time to fix it right now. Absolutely safe for properties for the time being though!')
I forgot that you have to merge args and kwargs in order for the decorator to work properly with both but I dont have time to fix it right now. Absolutely safe for properties for the time being though!
Edit Notice: My previous answer was completely incorrect. I was suggesting the use of type hints, forgetting that they aren't actually ensured in any way. They are strictly a development/IDE tool. They still are insanely helpful though; I recommend looking into using them.

I am getting the error : ValueError: not enough values to unpack (expected 2, got 1). What could be the problem?

I'm running a MapReduce job, but it keeps failing, saying input is missing. Unfortunately, it's not showing where it's missing either.
from mrjob.job import MRJob
from mrjob.step import MRStep
import re

class flight_combination(MRJob):
    def steps(self):
        return [MRStep(mapper=self.mapper_1, reducer=self.reducer_1)]

    def mapper_1(self, _, value):
        group1 = {}
        group2 = {}
        parts = value.split(",")
        destination = parts[0]
        origin = parts[1]
        count = parts[2]
        group1[destination] = {'Origin': origin, 'count': count}
        group2[origin] = {'Destination': destination, 'count': count}
        yield group1
        yield group2

    def reducer_1(self, key, value):
        g1, g2 = data
        for key1 in g1:
            for key2 in g2:
                if g1[key1]['Origin'] == g2[key2]['Destination']:
                    total = int(g1[key1]['count']) * int(g2[key2]['count'])
                    yield (key1, key2, total)

if __name__ == '__main__':
    flight_combination.run()
Following is the error:
`File "wd.py", line 35, in <module>
flight_combination.run()
…...
File "/usr/lib/python3.6/site-packages/mrjob/job.py", line 536, in run_mapper
for out_key, out_value in mapper(key, value) or ():
ValueError: not enough values to unpack (expected 2, got 1)`
Look at the last two frames of the traceback: mrjob consumes your mapper's output with for out_key, out_value in mapper(key, value), so every item a mapper yields must be a (key, value) pair. mapper_1 yields a bare dictionary (yield group1), and unpacking a one-key dict into two names fails with exactly this ValueError.
To fix this -
yield 2-tuples from the mapper, e.g. yield destination, {'Origin': origin, 'count': count}.
the reducer has the same contract and should also yield pairs. (Note also that data in reducer_1 is never defined; you presumably meant the value argument, which is an iterator over the values grouped under key.)
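The unpack failure can be reproduced without mrjob at all; the loop below mimics the for out_key, out_value in mapper(key, value) line from the traceback (the route data here is made up for illustration):

```python
def bad_mapper(_, value):
    # yields a single dict, as mapper_1 does -- not a (key, value) pair
    yield {'LAX': {'Origin': 'SFO', 'count': '3'}}

def good_mapper(_, value):
    destination, origin, count = value.split(",")
    # mrjob-style mappers must yield 2-tuples: (key, value)
    yield destination, {'Origin': origin, 'count': count}

line = 'LAX,SFO,3'
try:
    for out_key, out_value in bad_mapper(None, line):  # what mrjob does internally
        pass
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)

for out_key, out_value in good_mapper(None, line):
    print(out_key, out_value)
```

Unpacking a one-key dict iterates over its keys and finds only one value, which is exactly the error in the traceback.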

Pool.apply_async().get() causes _thread.lock pickle error

I've recently made a python program that would benefit a lot from a consumer/producer parallel computing strategy.
I've tried to develop a module (Class) to ease the implementation of such processing strategy, but I quickly ran into a problem.
My ProducerConsumer class:
from collections import deque
from multiprocessing import Pool

class ProducerConsumer(object):
    def __init__(self, workers_qt, producer, consumer, min_producer_qt=1):
        self.producer_functor = producer  # Pointer to the producer function
        self.consumer_functor = consumer  # Pointer to the consumer function
        self.buffer = deque([])  # Thread-safe double-ended queue for intermediate results
        self.workers_qt = workers_qt
        self.min_producer_qt = min_producer_qt  # Minimum quantity of active producers (if enough remaining input data)
        self.producers = []  # List of producers' async results
        self.consumers = []  # List of consumers' async results

    def produce(self, params, callback=None):
        result = self.producer_functor(*params)  # Execute the producer function
        if callback is not None:
            callback()  # Call the callback (if there is one)
        return result

    def consume(self, params, callback=None):
        result = self.consumer_functor(params)  # Execute the consumer function
        if callback is not None:
            callback()  # Call the callback (if there is one)
        return result

    # Map a list of producer's input data to a list of consumer's output data
    def map_result(self, producers_param):
        result = []  # Result container
        producers_param = deque(producers_param)  # Convert input to double-ended queue (for popleft() member)
        with Pool(self.workers_qt) as p:  # Create a worker pool
            while self.buffer or producers_param or self.consumers or self.producers:  # Work remaining
                # Create consumers
                if self.buffer and (len(self.producers) >= self.min_producer_qt or not producers_param):
                    consumer_param = self.buffer.popleft()  # Pop one set from the consumer param queue
                    if not isinstance(consumer_param, tuple):
                        consumer_param = (consumer_param,)  # Force tuple type
                    self.consumers.append(p.apply_async(func=self.consume, args=consumer_param))  # Start new consumer
                # Create producers
                elif producers_param:
                    producer_param = producers_param.popleft()  # Pop one set from the producer param queue
                    if not isinstance(producer_param, tuple):
                        producer_param = (producer_param,)  # Force tuple type
                    self.producers.append(p.apply_async(func=self.produce, args=producer_param))  # Start new producer
                # Filter finished async_tasks
                finished_producers = [r for r in self.producers if r.ready()] if self.producers else []
                finished_consumers = [r for r in self.consumers if r.ready()] if self.consumers else []
                # Remove finished async_tasks from the running tasks list
                self.producers = [r for r in self.producers if r not in finished_producers]
                self.consumers = [r for r in self.consumers if r not in finished_consumers]
                # Extract results from finished async_tasks
                for r in finished_producers:
                    assert r.ready()
                    self.buffer.append(r.get())  # Get the producer result and put it in the buffer
                for r in finished_consumers:
                    assert r.ready()
                    result.append(r.get())  # Get the consumer result and put it in the function-local result var
        return result
In the member map_result(), when I try to get() the result of apply_async(), I get the following error (note that I'm running Python 3):
Traceback (most recent call last):
File "ProducerConsumer.py", line 91, in <module>
test()
File "ProducerConsumer.py", line 85, in test
result = pc.map_result(input)
File "ProducerConsumer.py", line 64, in map_result
self.buffer.append(r.get()) # Get the producer result and put it in the buffer
File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
raise self._value
File "/usr/lib/python3.5/multiprocessing/pool.py", line 385, in _handle_tasks
put(task)
File "/usr/lib/python3.5/multiprocessing/connection.py", line 206, in send
self._send_bytes(ForkingPickler.dumps(obj))
File "/usr/lib/python3.5/multiprocessing/reduction.py", line 50, in dumps
cls(buf, protocol).dump(obj)
TypeError: can't pickle _thread.lock objects
And here is some code to reproduce my error (dependent on the class obviously) :
def test_producer(val):
    return val * 12

def test_consumer(val):
    return val / 4

def test():
    pc = ProducerConsumer(4, test_producer, test_consumer)
    input = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]  # Input for the test of the ProducerConsumer class
    expected = [0, 3, 6, 9, 12, 15, 18, 21, 24, 27]  # Expected output (val * 12 / 4 == val * 3)
    result = pc.map_result(input)
    print('got : {}'.format(result))
    print('expected : {}'.format(expected))

if __name__ == '__main__':
    test()
Note that in the map_result() member of my class I only "get()" results that are "ready()".
From what I know about pickling (which I admit is not that much), I'd say the fact that I call Pool.apply_async(...) on a member function could play a role, but I'd really like to keep the class structure if I can.
Thank you for the help!
So, the problem was corrected when I also fixed some design errors:
My 3 buffer variables (buffer, producers, consumers) had no business being members of the class, since they were semantically bound to the map_result() member itself.
So the patch was to delete these members and create them as local variables of map_result().
Problem is, even though the design was faulty, I still have a hard time understanding why the worker couldn't pickle the lock (of the parameter, I now suppose), so...
If anyone has a clear explanation of what was going on (or a link to one), that would be really appreciated.
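For what it's worth, a likely explanation: apply_async(func=self.consume, ...) has to pickle self.consume, and pickling a bound method pickles the whole instance, including self.producers and self.consumers, whose AsyncResult objects hold thread locks internally. Locks are not picklable, which is easy to reproduce with the stdlib alone:

```python
import pickle
import threading

# A lock (like the ones buried inside multiprocessing's AsyncResult
# objects) cannot cross a process boundary, so pickling it fails.
try:
    pickle.dumps(threading.Lock())
except TypeError as e:
    print(e)  # e.g. "cannot pickle '_thread.lock' object"
```

Moving the three collections into local variables of map_result() means they are no longer dragged along when self is pickled, which is consistent with the fix working.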

Returning dictionary with list values in python 3

I have been tasked with updating some code from python 2.7 to python 3.6
Currently the code breaks with:
TypeError: 'map' object is not subscriptable
Original code:
def test_create_page(self):
    """Ensure we can make a page"""
    response = DispatchTestHelpers.create_page(self.client)
    id = response.data['id']
    self.assertEqual(response.status_code, status.HTTP_201_CREATED)
    try:
        page = Page.objects.get(pk=response.data['id'])
    except Page.DoesNotExist:
        self.fail('The page should exist in the database')
    print()
    print()
    print(response.data)
    print()
    print()
    # Check Data
    self.assertEqual(response.data['title'], 'Test Page')
    self.assertEqual(response.data['slug'], 'test-page')
    self.assertEqual(response.data['snippet'], 'This is a test snippet')
    self.assertEqual(response.data['content'][0]['type'], 'paragraph')
    self.assertEqual(response.data['content'][0]['data'], 'This is some paragraph text')
Original Output:
======================================================================
ERROR: test_create_page (dispatch.tests.test_api_pages.PagesTest)
Ensure we can make a page
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/icenine/code/ubyssey-
dev3/dispatch/dispatch/tests/test_api_pages.py", line 67, in
test_create_page
self.assertEqual(response.data['content'][0]['type'], 'paragraph')
TypeError: 'map' object is not subscriptable
Thus far I have tried to solve this by casting the map object to a list; however, this returns an empty list. As far as I can tell, the map object has not been touched in any way prior to being cast to a list, as discussed here: https://stackoverflow.com/a/45018536/6448060.
The dictionary is returned from the following function:
@classmethod
def create_page(cls, client, title='Test Page', slug='test-page'):
    """Create dummy page"""
    url = reverse('api-pages-list')
    data = {
        'title': title,
        'slug': slug,
        'snippet': 'This is a test snippet',
        'content': [
            {
                'type': 'paragraph',
                'data': 'This is some paragraph text'
            }
        ]
    }
    return client.post(url, data, format='json')
Attempted Solution:
def test_create_page(self):
    """Ensure we can make a page"""
    response = DispatchTestHelpers.create_page(self.client)
    id = response.data['id']
    self.assertEqual(response.status_code, status.HTTP_201_CREATED)
    try:
        page = Page.objects.get(pk=response.data['id'])
    except Page.DoesNotExist:
        self.fail('The page should exist in the database')
    data_list = list(response.data['content'])
    print()
    print(response.data)
    print()
    print(data_list)
    print()
    # Check Data
    self.assertEqual(response.data['title'], 'Test Page')
    self.assertEqual(response.data['slug'], 'test-page')
    self.assertEqual(response.data['snippet'], 'This is a test snippet')
    self.assertEqual(data_list[0]['type'], 'paragraph')
    self.assertEqual(data_list[0]['data'], 'This is some paragraph text')
Attempted Solution Output:
{'id': 1, 'slug': 'test-page', 'title': 'Test Page', 'featured_image': None, 'snippet': 'This is a test snippet', 'content': <map object at 0x7ff15f04e048>, 'published_at': None, 'is_published': False, 'published_version': None, 'current_version': 1, 'latest_version': 1, 'preview_id': '8720814f-a5e8-4892-b592-8cbb4d0d019f', 'template': OrderedDict([('id', 'default'), ('name', 'Default')]), 'template_data': {}, 'seo_keyword': None, 'seo_description': None}
[]
======================================================================
ERROR: test_create_page (dispatch.tests.test_api_pages.PagesTest)
Ensure we can make a page
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/icenine/code/ubyssey-
dev3/dispatch/dispatch/tests/test_api_pages.py", line 77, in
test_create_page
self.assertEqual(data_list[0]['type'], 'paragraph')
IndexError: list index out of range
As you can see in the first line, the 'content' key's value is a map object.
How can I access the list values from the returned dictionary?
response.data['content'][0]['type']
I suppose that response.data['content'] is not a list, so an attempt to access its 0th element fails.
'content': <map object at 0x7ff15f04e048>,
Look up the documentation for that map object; it's likely in the docs for the library you use to actually hit your URLs.
So the issue was in several module files associated with Django: several functions returned map objects, one of which converted JSON data. I wrapped the returned maps in a list like so:
return list(map(<whatever is being mapped>))
And the problem, for now, appears to be solved.
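One gotcha worth noting with that fix: a map object is a one-shot iterator, so it can only be consumed once. That is the likely reason the first attempt at list(response.data['content']) came back empty: something (a serializer pass, an earlier iteration) had already consumed it. A quick demonstration:

```python
m = map(str.upper, ['a', 'b', 'c'])
print(list(m))  # ['A', 'B', 'C']
print(list(m))  # [] -- the iterator is already exhausted
```

This is why wrapping the map in list() at the point where it is *created* (as above) is safer than casting it later, after other code may have iterated over it.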

python3: getting defined functions from a code object?

In python3, I have the following code:
path = '/path/to/file/containing/python/code'
source = open(path, 'r').read()
codeobject = compile(source, path, 'exec')
I have examined codeobject, but I don't see any way to get a list of all the functions that are defined within that object.
I know I can search the source string for lines that begin with def, but I want to get this info from the code object, if at all possible.
What am I missing?
A code object is a nested structure; functions are created when the code object is executed, with their bodies embedded as separate code objects that are part of the constants:
>>> example = '''\
... def foobar():
... print('Hello world!')
... '''
>>> codeobject = compile(example, '', 'exec')
>>> codeobject
<code object <module> at 0x11049ff60, file "", line 1>
>>> codeobject.co_consts
(<code object foobar at 0x11049fe40, file "", line 1>, 'foobar', None)
>>> codeobject.co_consts[0]
<code object foobar at 0x11049fe40, file "", line 1>
>>> codeobject.co_consts[0].co_name
'foobar'
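If all you need is the embedded code objects themselves, you can walk co_consts recursively; here is a minimal sketch (keep in mind that comprehensions and class bodies also produce code objects, so not everything found this way is a plain def function):

```python
import types

def nested_code_objects(code):
    """Recursively yield every code object embedded in a code object's co_consts."""
    for const in code.co_consts:
        if isinstance(const, types.CodeType):
            yield const
            yield from nested_code_objects(const)

src = "def outer():\n    def inner():\n        pass\n"
names = [c.co_name for c in nested_code_objects(compile(src, '<example>', 'exec'))]
print(names)  # ['outer', 'inner']
```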
When you disassemble the top-level code object you can see that the function objects are created from such code objects:
>>> import dis
>>> dis.dis(codeobject)
1 0 LOAD_CONST 0 (<code object foobar at 0x11049fe40, file "", line 1>)
2 LOAD_CONST 1 ('foobar')
4 MAKE_FUNCTION 0
6 STORE_NAME 0 (foobar)
8 LOAD_CONST 2 (None)
10 RETURN_VALUE
The MAKE_FUNCTION opcode takes the code object, the function name, and any default argument values from the stack; you can see the LOAD_CONST opcodes preceding it that put the code object and name there.
Not all code objects are functions however:
>>> compile('[i for i in range(10)]', '', 'exec').co_consts
(<code object <listcomp> at 0x1105cb030, file "", line 1>, '<listcomp>', 10, None)
>>> compile('class Foo: pass', '', 'exec').co_consts
(<code object Foo at 0x1105cb0c0, file "", line 1>, 'Foo', None)
If you wanted to list what functions are loaded in the bytecode, your best bet is to use the disassembly, not look for code objects:
import dis
from itertools import islice
# old itertools example to create a sliding window over a generator
def window(seq, n=2):
"""Returns a sliding window (of width n) over data from the iterable
s -> (s0,s1,...s[n-1]), (s1,s2,...,sn), ...
"""
it = iter(seq)
result = tuple(islice(it, n))
if len(result) == n:
yield result
for elem in it:
result = result[1:] + (elem,)
yield result
def extract_functions(codeobject):
codetype = type(codeobject)
signature = ('LOAD_CONST', 'LOAD_CONST', 'MAKE_FUNCTION', 'STORE_NAME')
for op1, op2, op3, op4 in window(dis.get_instructions(codeobject), 4):
if (op1.opname, op2.opname, op3.opname, op4.opname) == signature:
# Function loaded
fname = op2.argval
assert isinstance(op1.argval, codetype)
yield fname, op1.argval
This generates (name, codeobject) tuples for all functions that are loaded in a given code object.
