What are the advantages and disadvantages of using a list comprehension in Python 2.5/2.6?

I've heard that list comprehensions can be slow sometimes, but I'm not sure why. I'm new to Python (coming from a C# background), and I'd like to know more about when to use a list comprehension versus a for loop. Any ideas, suggestions, advice, or examples? Thanks for all the help.

Use a list comprehension (LC) when it's appropriate.
For example, if you are passing any ol' iterable to a function, a generator expression (genexpr) is often more appropriate, and a LC is wasteful:
"".join([str(n) for n in xrange(10)])
# becomes
"".join(str(n) for n in xrange(10))
Or, if you don't need a full list, a for-loop with a break statement would be your choice. The itertools module also has tools, such as takewhile.
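For instance, here is a rough sketch of both approaches (the names numbers and limit are just placeholders for illustration):

import itertools

numbers = [1, 3, 5, 8, 2, 9]   # placeholder data
limit = 6

# for-loop with break: stop as soon as the condition fails
small = []
for n in numbers:
    if n >= limit:
        break
    small.append(n)

# itertools.takewhile does the same thing lazily
small_lazy = list(itertools.takewhile(lambda n: n < limit, numbers))

Both produce [1, 3, 5] here; neither builds the full list only to throw most of it away.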

Related

Can we write this code in a different way by using a list comprehension?

Is there any other, more optimized way to write the code below?
t1 = []
for i in range(0, 10):
    x = int(input())
    t1.append(x)
You can go for
t1 = [int(input()) for _ in range(10)]
which does the same thing but does not optimize the code in any meaningful way. While list comprehensions might be faster than for-loops, this doesn't matter when your code has to wait for user input. Furthermore: You probably have to do some input checking/parsing (What happens if the input cannot be cast to an integer?) and this really shouldn't be done inside a list comprehension.
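For instance, a rough sketch of what that checking could look like in a plain loop (retrying on bad input is just one possible choice here):

t1 = []
while len(t1) < 10:
    raw = input("Enter an integer: ")
    try:
        t1.append(int(raw))
    except ValueError:
        print("Not an integer, try again:", raw)

A try/except like this has no natural place inside a list comprehension, which is exactly why the plain loop can be the better fit.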

What are the upsides of generators in Python 3?

I know that you can use generators/list comprehensions as filters. You can do a lot with lists, but what can you do with generators? Python would only include something like a generator if it were useful.
The biggest benefit of a generator is that it doesn't need to reserve memory for every element of a sequence, it generates each item as needed.
Because of this, a generator doesn't need to have a defined size. It can generate an infinite sequence if needed.
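A minimal sketch of an infinite generator, here just counting upwards (the name count_up is made up for illustration; itertools.count does the same thing):

def count_up(start=0):
    n = start
    while True:        # never terminates on its own
        yield n
        n += 1

counter = count_up()
print(next(counter))   # 0
print(next(counter))   # 1

Each value exists only for the moment it is consumed, so the whole (unbounded) sequence never sits in memory.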

Comprehension with Lists of Dictionaries

I know this isn't exactly the first question about list comprehensions, but I've been looking around and experimenting for a while and can't figure this one out. I'll apologize in advance, I'm a self-taught novice learning from failures.
This is working code, but it screams list comprehension. I understand list comprehensions and use them, but the combination of stacking for and working with the dictionaries within the lists is breaking my brain. How would you simplify this:
results = []
for system in systems:  # list of dicts
    for result in telnet_results:  # list of dicts
        if system['master_ip'] == result['master_ip']:
            combined = {**system, **result}  # merge dicts, right takes precedence
            results.append(combined)
Thanks in advance for any help on this.
results = [{**system, **result} for system in systems for result in telnet_results
           if system['master_ip'] == result['master_ip']]
It can also be split more logically:
results = [{**system, **result}
           for system in systems
           for result in telnet_results
           if system['master_ip'] == result['master_ip']]
Is this "simplified"? I'm not sure. List comprehensions are not magic, and they do not always simplify the code or make it more readable.

Time Complexity for Python built-ins?

Is there any good reference resource for the time complexity of Python's built-in functions like dict.fromkeys() and .lower()? I found links like this UCI resource, which lists the time complexity of basic list and set operations, but of course not for all built-ins. I also found Python Reference - The Right Way, but most of its references have #TODO for time complexity.
I also tried reading the source code of the Python built-ins to figure out how functions like dict.fromkeys() are implemented, but felt lost.
This is a great place to start:
https://wiki.python.org/moin/TimeComplexity
It says that Get Item is O(1) and Iteration is O(n) (Average Case).
So then, if .fromkeys() iterates over just the keys of the dict and inserts each one into a new dict while setting its value, I'd think you'd have roughly O(n) work overall (one pass over the keys plus an O(1) insertion per key), where n is the number of keys in the first dict.
Sorry that I can't offer more than conjecture, but hopefully that link is helpful.
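For illustration, a small sketch of dict.fromkeys() in action (the real implementation is in C, so this only shows the shape of the work, not how it is actually coded):

old = {'a': 1, 'b': 2, 'c': 3}

# dict.fromkeys visits each key once and inserts it with the given value,
# so the work grows linearly with the number of keys
new = dict.fromkeys(old, 0)
print(new)  # {'a': 0, 'b': 0, 'c': 0}

# roughly equivalent pure-Python version
new_manual = {k: 0 for k in old}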

k-combinations in Python 3.x

What is the most efficient way to generate k-tuples with all k-combinations from a given Python set? Is there an appropriate built-in function? Something tells me it should be possible with a two-line for loop.
P.S. I did conduct a search and found various entries on the topic of "combinations from lists, etc. in Python", but all of the proposed solutions seem rather un-Pythonic. I am hoping for a mind-blowing, idiomatic Python expression.
itertools has all of those types of functions:
import itertools

for combination in itertools.combinations(iterable, k):
    print(combination)
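A concrete usage example on a small set (sets are unordered, so the order of the resulting tuples can vary):

import itertools

s = {1, 2, 3, 4}
k = 2

combos = list(itertools.combinations(s, k))
print(combos)  # e.g. [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]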

Resources