How could you set a variable equal to 'O' or '-' and then put that in an if statement like the one below:
if variable == 'O':
    print 'hi'
How could you do that for the following code:
import threading
from array import array
from Queue import Queue, Full

import pyaudio

CHUNK_SIZE = 1024
MIN_VOLUME = 500
BUF_MAX_SIZE = CHUNK_SIZE * 10


def main():
    stopped = threading.Event()
    q = Queue(maxsize=int(round(BUF_MAX_SIZE / CHUNK_SIZE)))

    listen_t = threading.Thread(target=listen, args=(stopped, q))
    listen_t.start()
    record_t = threading.Thread(target=record, args=(stopped, q))
    record_t.start()

    try:
        while True:
            listen_t.join(0.1)
            record_t.join(0.1)
    except KeyboardInterrupt:
        stopped.set()

    listen_t.join()
    record_t.join()


def record(stopped, q):
    while True:
        if stopped.wait(timeout=0):
            break
        chunk = q.get()
        vol = max(chunk)
        if vol >= MIN_VOLUME:
            # TODO: write to file
            print "O",
        else:
            print "-",


def listen(stopped, q):
    stream = pyaudio.PyAudio().open(
        format=pyaudio.paInt16,
        channels=2,
        rate=44100,
        input=True,
        frames_per_buffer=1024,
    )

    while True:
        if stopped.wait(timeout=0):
            break
        try:
            q.put(array('h', stream.read(CHUNK_SIZE)))
        except Full:
            pass  # discard


if __name__ == '__main__':
    main()
Could you use that so that if the output is 'O', it prints 'hi'? Would somebody write the code for me? I have been trying for a while to write this and I still have not been able to make it work. Thank you.
To use an if statement in Python, first assign the variable a value of the appropriate type:
s = "O"
if s == 'O':
    print 'hi'
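Applying the same idea to the audio code above, you would not check the printed output at all; you would branch on vol (or on a status variable set from it) inside record(). Here is a rough sketch of that record() function; the status variable and the print('hi') are mine, everything else is taken from the question's code:

def record(stopped, q):
    while True:
        if stopped.wait(timeout=0):
            break
        chunk = q.get()
        vol = max(chunk)
        # keep the status in a variable instead of only printing it
        status = 'O' if vol >= MIN_VOLUME else '-'
        if status == 'O':
            # TODO: write to file
            print('hi')
        else:
            print('-')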
For example, this code is not working:
from pynput import keyboard

def on_press(key):
    global play
    if str(key) == "'x'":
        play = 'Play'

play = ''
with keyboard.Listener(on_press = on_press, suppress=True) as listener:
    while listener.is_alive():  # infinite loop which runs all the time
        def pump():
            num = 0
            while True:
                yield num  # generator
                num += 1
        if play == 'Play':
            next(gen)
    listener.join()
No matter where I put the generator function or how I use a global variable, I always get a StopIteration error or other errors.
I want to control the generator, so I can "play" and "pause" with keyboard keys.
While this code works:
def pump():
    num = 0
    while True:
        yield num
        num += 1

gen = pump()
next(gen)  # play
next(gen)
I'm so bewildered...
Here is some sample code to prove the generator is working.
from pynput import keyboard
import sys

def on_press(key):
    global play
    if str(key) == "'x'":
        play = 'Play'
    # added to provide a way of escaping the while loop
    elif str(key) == "'q'":
        sys.exit(1)

def pump():
    num = 0
    while True:
        # added as proof your generator is working
        print(num)
        yield num
        num += 1

gen = pump()
play = ''
with keyboard.Listener(on_press=on_press, suppress=True) as listener:
    while listener.is_alive():
        if play == 'Play':
            next(gen)
    listener.join()
This will infinitely call next(gen) since there is no way to unset play from "Play" in the code. Once it's set, it will continue infinitely looping and if play == 'Play': will always be True.
Now, onto your example.
This line:
next(gen)
will error.
gen is never defined in your code.
I assume this was just a typo and you have gen = pump() somewhere.
Additionally, defining pump in your loop will almost certainly have unintended consequences.
Suppose you do have gen = pump() immediately after the definition.
play will be set to "Play" when you press the x key. You will continuously call next(gen).
However, this time, after each pass through the loop, pump will be redefined and num reset to 0. It will yield 0 and continue to do so indefinitely.
So what did you mean to do?
Probably something like this:
from pynput import keyboard
import sys

def on_press(key):
    global play
    if str(key) == "'x'":
        play = 'Play'
    elif str(key) == "'q'":
        sys.exit(1)

def pump():
    global play
    num = 0
    while True:
        play = ''
        print(num)
        yield num
        num += 1

gen = pump()
play = ''
with keyboard.Listener(on_press = on_press, suppress=True) as listener:
    while listener.is_alive():
        if play == 'Play':
            next(gen)
    listener.join()
Personally, I would avoid the use of globals.
However, this calls next(gen) once per x key press. The generator sets play back to "", avoiding subsequent calls of next(gen) until x is pressed again.
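If you do want to avoid the globals, one option, not part of the answer above but a sketch under the same pynput assumptions, is to keep the flag on a small helper object (Controller is a made-up name) and reset it in the main loop instead of inside the generator:

from pynput import keyboard

class Controller:
    """Holds the play flag so no global variable is needed."""
    def __init__(self):
        self.play = False

    def on_press(self, key):
        if str(key) == "'x'":
            self.play = True

def pump():
    num = 0
    while True:
        yield num
        num += 1

gen = pump()
ctrl = Controller()
with keyboard.Listener(on_press=ctrl.on_press, suppress=True) as listener:
    while listener.is_alive():
        if ctrl.play:
            ctrl.play = False   # pause again until the next x press
            print(next(gen))
    listener.join()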
I have the following:
from multiprocessing import Pool

def process_elements(index_of_data_inputs):
    <process>
    if <condition>:
        # I would like to change the size of data_inputs

if __name__ == '__main__':
    pool = Pool()  # create a multiprocessing Pool
    pool.map(process_elements, range(0, len(data_inputs)))  # process data_inputs iterable with pool
How can I change the size of data_inputs, and thereby change the number of times process_elements is called?
The underlying work I would like to parallelize is:
i = 0
while i < len(elements):
    new_elems = process_some_elements(x, y)
    if len(new_elems) > 0:
        elements = elements + new_elems
    i += 1
Consider a simple example of communication between processes with the multiprocessing module in Python:
import multiprocessing
import queue
import random

def process_elements(num, comq):
    val = random.random()
    if val > 0.5:
        comq.put(1)
    return num, int(1000 * val)

if __name__ == '__main__':
    # initial data
    numbers = list(range(10))
    # data structure for communication between multiple processes
    m = multiprocessing.Manager()
    q = m.Queue()

    with multiprocessing.Pool(processes=4) as pool:
        # get answer for original data
        ans = pool.starmap(process_elements, [(num, q) for num in numbers])
        print(numbers)
        print(ans)

        # create additional data based on the answer for initial data
        new_numbers = numbers[-1:]
        try:
            while True:
                new_numbers.append(new_numbers[-1] + q.get_nowait())
        except queue.Empty:
            pass

        # get answer for additional data
        new_ans = pool.starmap(process_elements, [(num, q) for num in new_numbers[1:]])
        print(new_numbers)
        print(new_ans)
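If you need to keep growing the input until a pass produces nothing new, closer to the while loop in the question, the same pattern can be wrapped in a loop. This is only a sketch of one possible way, reusing the process_elements worker and Manager queue from above; the termination condition is my assumption:

import multiprocessing
import queue
import random

def process_elements(num, comq):
    val = random.random()
    if val > 0.5:
        comq.put(1)
    return num, int(1000 * val)

if __name__ == '__main__':
    m = multiprocessing.Manager()
    q = m.Queue()
    answers = []
    pending = list(range(10))          # initial data
    with multiprocessing.Pool(processes=4) as pool:
        while pending:
            answers += pool.starmap(process_elements, [(num, q) for num in pending])
            # build the next batch from whatever the workers reported back
            next_batch = []
            last = pending[-1]
            try:
                while True:
                    last += q.get_nowait()
                    next_batch.append(last)
            except queue.Empty:
                pass
            pending = next_batch
    print(len(answers), 'elements processed in total')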
I spent nearly the whole day on this and have reached the end of my knowledge:
I want to change a shared multiprocessing.Value string in a subprocess, but Python hangs as soon as the subprocess tries to change the shared value.
Below is an example:
from multiprocessing import Process, Value, freeze_support
from ctypes import c_wchar_p

def test(x):
    with x.get_lock():
        x.value = 'THE TEST WORKED'
    return

if __name__ == "__main__":
    freeze_support()
    value = Value(c_wchar_p, '')
    p = Process(target=test, args=(value,))
    p.start()
    print(p.pid)
    # this try block is to also allow p.run()
    try:
        p.join()
        p.terminate()
    except:
        pass
    print(value.value)
What I tried that does not work:
I tried ctypes c_wchar_p and c_char_p, but both result in the same freezing.
I also tried without x.get_lock().
I also tried without freeze_support().
What works (but does not help):
Using a float as the shared value (value = Value('d',0) and x.value = 1).
Running the Process without starting a subprocess (replace p.start() with p.run()).
I am using Windows 10 64 bit and Python 3.6.4 (Spyder, but also tried outside of Spyder).
Any help welcome!
A shared pointer won't work in another process because the pointer is only valid in the process in which it was created. Instead, use an array:
import multiprocessing as mp

def test(x):
    x.value = b'Test worked!'

if __name__ == "__main__":
    x = mp.Array('c', 15)
    p = mp.Process(target=test, args=(x,))
    p.start()
    p.join()
    print(x.value)
Output:
b'Test worked!'
Note that array type 'c' is specialized and returns a SynchronizedString vs. other types that return SynchronizedArray. Here's how to use type 'u' for example:
import multiprocessing as mp
from ctypes import *

def test(x):
    x.get_obj().value = 'Test worked!'

if __name__ == "__main__":
    x = mp.Array('u', 15)
    p = mp.Process(target=test, args=(x,))
    p.start()
    p.join()
    print(x.get_obj().value)
Output:
Test worked!
Note that non-atomic operations on the wrapped value, such as += (which does a read/modify/write), should be protected with a with x.get_lock(): context manager.
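For example, here is a minimal sketch of a guarded read/modify/write on the same 'u'-typed array; append_mark is a made-up helper, not part of the answer above:

import multiprocessing as mp

def append_mark(x):
    # the read + concatenate + write below is not atomic, so take the lock
    with x.get_lock():
        x.get_obj().value = x.get_obj().value + '!'

if __name__ == "__main__":
    x = mp.Array('u', 15)
    x.get_obj().value = 'Test worked'
    procs = [mp.Process(target=append_mark, args=(x,)) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(x.get_obj().value)   # Test worked!!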
I use feedparser to get RSS feeds from some sites; my core code is like this:
def parseworker(procnum, result_queue, return_dict, source_link):
    try:
        data = feedparser.parse(source_link)
        return_dict[procnum] = data
    except Exception as e:
        print(str(e))
    result_queue.put(source_link + 'grabbed')

def infoworker(procnum, timeout, result_queue, source_name, source_link):
    text = 'recheck ' + source_name + ': ' + '...'
    progress = ''
    for x in range(timeout):
        progress += '.'
        sys.stdout.write('\r' + text + progress)
        sys.stdout.flush()
        time.sleep(1)
    result_queue.put('time out')

def parsecaller(link, timeout, timestocheck):
    return_dict = multiprocessing.Manager().dict()
    result_queue = multiprocessing.Queue()
    counter = 1
    jobs = []
    result = []
    while not (counter > timestocheck):
        p1 = multiprocessing.Process(target=infoworker, args=(11, timeout, result_queue, source_name, link))
        p2 = multiprocessing.Process(target=parseworker, args=(22, result_queue, return_dict, link))
        jobs.append(p1)
        jobs.append(p2)
        p1.start()
        p2.start()
        result_queue.get()
        p1.terminate()
        p2.terminate()
        p1.join()
        p2.join()
        result = return_dict.values()
        if not result or result[0].bozo:
            print(' bad - no data', flush=True)
            result = -1
        else:
            print(' ok ', flush=True)
            result = result[0]
            break
        counter += 1
    if result == -1:
        raise bot_exceptions.ParserExceptionData()
    elif result == -2:
        raise bot_exceptions.ParserExceptionConnection()
    else:
        return result

if __name__ == '__main__':
    multiprocessing.freeze_support()
    multiprocessing.set_start_method('spawn')
    try:
        data = parsecaller(source_link, timeout=wait_time, timestocheck=check_times)
    except Exception as e:
        print(str(e))
        continue
It works well, but after some random amount of time it goes into a suspended state and does nothing, like an infinite boot loop. It may suspend after 4 hours or after 3 days; it's random.
I tried to solve that problem with multiprocessing: use the main process with a timer, like infoworker. When infoworker stops, it puts a 'result' into the queue, which unblocks result_queue.get() in parsecaller, which then continues and terminates both processes. But it does not work. Today, after 11 hours, I found my code suspended in multiprocessing's managers.py:
def serve_forever(self):
    '''
    Run the server forever
    '''
    self.stop_event = threading.Event()
    process.current_process()._manager_server = self
    try:
        accepter = threading.Thread(target=self.accepter)
        accepter.daemon = True
        accepter.start()
        try:
            while not self.stop_event.is_set():
                self.stop_event.wait(1)
        except (KeyboardInterrupt, SystemExit):
            pass
    finally:
        if sys.stdout != sys.__stdout__:  # what about stderr?
            util.debug('resetting stdout, stderr')
            sys.stdout = sys.__stdout__
            sys.stderr = sys.__stderr__
        sys.exit(0)
The whole time, it was stuck in:
while not self.stop_event.is_set():
    self.stop_event.wait(1)
I think that somewhere either the GIL does not allow any other threads in the processes to run, or feedparser goes into an infinite loop. And of course it gets suspended with random RSS sources.
My environment:
macOS 10.12.6 (I also had this situation on Windows 7 and Windows 10)
Python 3.7.0 (I also had this situation on 3.6.2 and 3.6.5)
PyCharm 2017.2.2
My questions:
How can I understand why it gets suspended (what should I do; is there any recipe)?
How can I bypass that state (what should I do; is there any recipe)?
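Not an answer to the hang itself, but one way to at least see where every thread is stuck when it happens is the standard faulthandler module. The sketch below is only a suggestion, and the timeout value is arbitrary:

import faulthandler
import signal
import sys

faulthandler.enable()                      # dump tracebacks on hard crashes
if hasattr(signal, 'SIGUSR1'):
    # on Unix: `kill -USR1 <pid>` prints every thread's stack of the stuck process
    faulthandler.register(signal.SIGUSR1)
# or periodically dump all stacks so a long hang leaves a trace on stderr
faulthandler.dump_traceback_later(timeout=3600, repeat=True, file=sys.stderr)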
I am making a console app and I would like a loader animation: for example, 3 dots that appear one at a time until it reaches the third dot, then the loop restarts and does it all over again. Could someone show me how to do this, please?
You could run the loop in a background thread:
import threading
import time
import sys

should_quit = False
num_dots = 3

def print_dots():
    count = 0
    while not should_quit:
        time.sleep(.25)
        # every num_dots prints, wipe the previously printed dots
        if 0 == count % num_dots:
            print(f"\r{' ' * num_dots}\r", end='')
        print('.', end='')
        sys.stdout.flush()
        count += 1

t = None
try:
    # daemon thread so the animation never blocks program exit
    t = threading.Thread(target=print_dots)
    t.daemon = True
    t.start()
except:
    print("Error: unable to start thread")

# the main thread does the real work; here it just waits for Enter
try:
    input()
except KeyboardInterrupt:
    pass

# signal the animation thread to stop, then wait for it to finish
should_quit = True
t.join()