Blitted image in pygame disappears immediately after I release my mouse - python-3.x

def options():
    options = True
    while options:
        for event in pygame.event.get():
            win.fill(WHITE)
            win.blit(background, (0, 0))
            ...  # Blitting text and switch buttons
            options_x, options_y = pygame.mouse.get_pos()
            if event.type == pygame.QUIT:
                # Exit button
                pygame.quit()
                quit()
            elif event.type == pygame.MOUSEBUTTONDOWN:
                # Sound effect tuning
                if 470 > options_x > 390 and 220 > options_y > 185:
                    # Checking if mouse click is on the ON SWITCH
                    mouse_click.play()
                    screen.blit(off_switch, (off_switch_x, off_switch_y))
                    pygame.mixer.stop()
                    # But doesn't stop sound from playing when I quit options section
                # Music effect tuning
                elif 470 > options_x > 390 and 300 > options_y > 260:
                    # Checking if mouse click is on the ON SWITCH
                    mouse_click.play()
                    screen.blit(off_switch, (off_switch_x, off_switch_y))
                    pygame.mixer.music.stop()
                # Interactive BACK button
                elif 90 > options_x > 42 and 75 > options_y > 25:
                    mouse_click.play()
                    options = False
        pygame.display.update()
So this is a part of my HANGMAN game where I try to set up the OPTIONS section which allows you to configure the volume.
The problem is in the "Music" and "Sound" effect tuning.
Pressing on the ON SWITCH button for both "Music" and "Sound" will display the OFF SWITCH, but as soon as I release my mouse, they go back to their original state.
The music stops, but not the sound effects (mouse clicks, plop sounds, etc...).
I would like the blitted image to stay, and the sound effects to be stopped.
How can I fix this?

You undo the blit in every loop. At the top of the loop I see code which clears/resets everything:
win.fill(WHITE)
win.blit(background, (0, 0))
....
You blit the change inside the event handler:
screen.blit(off_switch, (off_switch_x, off_switch_y))
The event switch blit will be cleared at the next event loop (probably mouse move).
Think of the game as a series of states. The code at the top of the loop should draw the current state of the game.
for event in pygame.event.get():
    win.fill(WHITE)
    win.blit(background, (0, 0))
    if SwitchIsOn:
        screen.blit(on_switch, (on_switch_x, on_switch_y))
    else:
        screen.blit(off_switch, (off_switch_x, off_switch_y))
    ....
The event handler should be used to change the game state.
if 470 > options_x > 390 and 220 > options_y > 185:
    # Checking if mouse click is on the ON SWITCH
    mouse_click.play()
    SwitchIsOn = not SwitchIsOn  # reverse switch position
This will prevent your event changes from being cleared.
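Putting it together, a minimal sketch of the restructured options() loop (assuming win is the surface used for all blits, and showing only the sound switch and the BACK button from the question):
def options():
    sound_on = True          # current state of the sound-effects switch
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                pygame.quit()
                quit()
            elif event.type == pygame.MOUSEBUTTONDOWN:
                x, y = event.pos
                if 470 > x > 390 and 220 > y > 185:
                    # Click on the sound switch: flip the state, don't blit here
                    mouse_click.play()
                    sound_on = not sound_on
                    if not sound_on:
                        pygame.mixer.stop()
                elif 90 > x > 42 and 75 > y > 25:
                    # BACK button
                    mouse_click.play()
                    running = False
        # Draw the current state once per frame, outside the event handler
        win.fill(WHITE)
        win.blit(background, (0, 0))
        if sound_on:
            win.blit(on_switch, (on_switch_x, on_switch_y))
        else:
            win.blit(off_switch, (off_switch_x, off_switch_y))
        pygame.display.update()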

Related

question about combining def function() and PWM duty_ns() in micropython

As a MicroPython beginner, I combined a few pieces of code found on different forums in order to achieve a higher resolution for ESC signal control. The code generates a pulse from a MIN of 1000000 nanoseconds to a MAX of 2000000 nanoseconds, but I could only do it in increments of 100. My code is kind of messy, sorry if that hurts your eyes. My question is: does it represent an actual 100 ns of resolution, and what's the trick to make it work in increments of 1? (Not sure whether that is even necessary, but I still hope someone can share some wisdom.)
from machine import Pin, PWM, ADC
from time import sleep

MIN = 10000
MAX = 20000

class setPin(PWM):
    def __init__(self, pin: Pin):
        super().__init__(pin)
    def duty(self, d):
        super().duty_ns(d*100)
        print(d*100)

pot = ADC(0)
esc = setPin(Pin(7))
esc.freq(500)
esc.duty(MIN)  # arming ESC at 1000 us.
sleep(1)

def map(x, in_min, in_max, out_min, out_max):
    return int((x - in_min)*(out_max - out_min)/(in_max - in_min) + out_min)

while True:
    pot_val = pot.read_u16()
    pulse_ns = map(pot_val, 256, 65535, 10000, 20000)
    if pot_val < 300:  # makes ESC more stable at startup.
        esc.duty(MIN)
        sleep(0.1)
    if pot_val > 65300:  # gives less tolerance when reaching MAX.
        esc.duty(MAX)
        sleep(0.1)
    else:
        esc.duty(pulse_ns)  # generates 1000000ns to 2000000ns of pulse.
        sleep(0.1)
Try changing esc.freq(500) => esc.freq(250). For example, with x = 3600:
print(map(3600, 256, 65535, 10000, 20000)*100)
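For finer steps, one possible approach is to skip the *100 scaling and map the ADC reading straight to nanoseconds with duty_ns(). This is only a minimal sketch, assuming the same pot on ADC(0) and ESC on Pin(7); whether the PWM hardware can actually resolve steps this small depends on its clock:
from machine import Pin, PWM, ADC
from time import sleep

MIN_NS = 1000000   # 1000 us
MAX_NS = 2000000   # 2000 us

def map_range(x, in_min, in_max, out_min, out_max):
    return int((x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min)

pot = ADC(0)
esc = PWM(Pin(7))
esc.freq(500)
esc.duty_ns(MIN_NS)   # arm the ESC at 1000 us
sleep(1)

while True:
    pot_val = pot.read_u16()
    # Map the raw 16-bit reading directly to nanoseconds, so the step size is
    # limited by the ADC range (~65000 counts over 1000000 ns, roughly 15 ns per step).
    esc.duty_ns(map_range(pot_val, 256, 65535, MIN_NS, MAX_NS))
    sleep(0.1)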

Console Screen Buffer Info shows incorrect X position

I recently found a great short piece of code (Why the irrelevant code made a difference?) for obtaining console screen buffer info (which I include below) that replaces the huge code accompanying the standard 'CONSOLE_SCREEN_BUFFER_INFO()' method (which I won't include here!).
import ctypes
import struct
print("xxx",end="") # I added this to show what the problem is
hstd = ctypes.windll.kernel32.GetStdHandle(-11) # STD_OUTPUT_HANDLE = -11
csbi = ctypes.create_string_buffer(22)
res = ctypes.windll.kernel32.GetConsoleScreenBufferInfo(hstd, csbi)
width, height, curx, cury, wattr, left, top, right, bottom, maxx, maxy = struct.unpack("hhhhHhhhhhh", csbi.raw)
# The following two lines are also added
print() # To bring the cursor to the next line for displaying info
print(width, height, curx, cury, wattr, left, top, right, bottom, maxx, maxy) # Display what we got
Output:
80 250 0 7 7 0 0 79 24 80 43
This output is for the Windows 10 MSDOS console, with the screen cleared before running the code. However, 'curx' = 0 although it should be 3 (after printing "xxx"). The same phenomenon also happens with the 'CONSOLE_SCREEN_BUFFER_INFO()' method. Any idea what the problem is?
Also, any suggestion for a method of obtaining current cursor position -- besides 'curses' library -- will be welcome!
You need to flush the print buffer if you don't output a linefeed:
print("xxx",end="",flush=True)
Then I get the correct curx=3 with your code:
xxx
130 9999 3 0 14 0 0 129 75 130 76
BTW, the "great" code is the original answer in the posted question. The "bitness" of HANDLE can break your code, and not defining .argtypes as a "shortcut" is usually the cause of most ctypes problems.
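As a minimal sketch of that advice (assuming Windows and the same 22-byte buffer layout as above), declaring .restype/.argtypes keeps the HANDLE from being truncated on 64-bit Python:
import ctypes
from ctypes import wintypes
import struct

kernel32 = ctypes.windll.kernel32
kernel32.GetStdHandle.restype = wintypes.HANDLE
kernel32.GetStdHandle.argtypes = [wintypes.DWORD]
kernel32.GetConsoleScreenBufferInfo.restype = wintypes.BOOL
kernel32.GetConsoleScreenBufferInfo.argtypes = [wintypes.HANDLE, ctypes.c_char_p]

print("xxx", end="", flush=True)           # flush so the cursor really moves
hstd = kernel32.GetStdHandle(-11)          # STD_OUTPUT_HANDLE = -11
csbi = ctypes.create_string_buffer(22)
if kernel32.GetConsoleScreenBufferInfo(hstd, csbi):
    (width, height, curx, cury, wattr,
     left, top, right, bottom, maxx, maxy) = struct.unpack("hhhhHhhhhhh", csbi.raw)
    print()
    print(width, height, curx, cury, wattr, left, top, right, bottom, maxx, maxy)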

cannot set position of wx.frame on openbox

I am playing with wxPython and trying to set the position of a frame:
import wx
app = wx.App()
p = wx.Point(200, 200)
frame = wx.Frame(None, title = 'test position', pos = p)
frame.Show(True)
print('frame position: ', frame.GetPosition())
app.MainLoop()
Even though print('frame position: ', frame.GetPosition()) shows the correct position, the frame is shown in the top-left corner of the screen.
Alternatively I tried
frame.SetPosition(p)
frame.Move(p)
without success.
my environment: ArchLinux 5.3.13, python 3.8.0, wxpython 4.0.7, openbox 3.6.1
On Cinnamon the code works as expected. How can I solve this on Openbox?
Edit 07.12.2019:
I could set the position of a dialog in the Openbox config ~/.config/openbox/rc.xml:
<application name="fahrplan.py"
             class="Fahrplan.py"
             groupname="fahrplan.py"
             groupclass="Fahrplan.py"
             title="Fahrplan *"
             type="dialog">
  <position force="no">
    <x>760</x>
    <y>415</y>
  </position>
</application>
I got the name, class, etc. from obxprop. x and y are calculated to center a dialog of 400 x 250 px on a 1920 x 1080 px screen.
This static solution is not suitable for me. I want to place dynamically generated popups.
I had the same problem under Windows and played around with the style flags. With the wxICONIZE style set active, the window finally used the positioning information.
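A minimal sketch of that workaround (assuming wxPython 4; whether it helps is window-manager dependent):
import wx

app = wx.App()
# Pass the ICONIZE style in addition to the defaults when creating the frame.
frame = wx.Frame(None, title='test position', pos=wx.Point(200, 200),
                 style=wx.DEFAULT_FRAME_STYLE | wx.ICONIZE)
frame.Show(True)
print('frame position:', frame.GetPosition())
app.MainLoop()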
The position is provided to the window manager as a "hint". It is totally up to the window manager whether it will actually honor the hint or not. Check the openbox settings or preferences and see if there is anything relevant that can be changed.

kivy: badly aligned widgets

Example code fragment from a base class:
def build_extra_content(self):
    grp = 'choice_dialog'
    extra_content = GridLayout(cols=2)
    lb_width = self.width - 2 * self.choice_height
    for choice in self.choices:
        cb = CheckBox(group=grp,
                      size_hint=(None, None), size=(self.choice_height, self.choice_height))
        lb = Label(markup=True, text=choice, halign='left', valign='middle',
                   size_hint=(None, None), size=(lb_width, self.choice_height))
        lb.texture_size = (lb_width, self.choice_height)
        extra_content.add_widget(cb)
        extra_content.add_widget(lb)
        # TODO: check the checkbox when the label is touched.
        def _lb_press(*args):
            print(args)
        cb.bind(on_touch_down=_lb_press)
    return extra_content
The content is displayed in this dialog:
I have two questions. First: why is the text aligned to the center? I have already set absolute sizes for both the label and its texture size, and set halign='left'. But the text is still aligned to the center. Why?
Second: I wanted the labels to be clickable/touchable. E.g. the checkboxes should be selected by touching their corresponding labels. Whenever I click on a single label or checkbox, this is printed:
(<kivy.uix.checkbox.CheckBox object at 0x0B2C01B8>, <MouseMotionEvent button="left" device="mouse" double_tap_time="0" dpos="(0.0, 0.0)" dsx="0.0" dsy="0.0" dsz="0.0" dx="0.0" dy="0.0" dz="0.0" grab_current="None" grab_exclusive_class="None" grab_list="[]" grab_state="False" id="mouse3" is_double_tap="False" is_mouse_scrolling="False" is_touch="True" is_triple_tap="False" opos="(644.0, 379.0)" osx="0.503125" osy="0.47375" osz="0.0" ox="644.0" oy="379.0" oz="0.0" pos="(644.0, 379.0)" ppos="(644.0, 379.0)" profile="['pos', 'button']" psx="0.503125" psy="0.47375" psz="0.0" push_attrs="('x', 'y', 'z', 'dx', 'dy', 'dz', 'ox', 'oy', 'oz', 'px', 'py', 'pz', 'pos')" push_attrs_stack="[]" px="644.0" py="379.0" pz="0.0" shape="None" spos="(0.503125, 0.47375)" sx="0.503125" sy="0.47375" sz="0.0" time_end="-1" time_start="1507722998.229789" time_update="1507722998.229789" triple_tap_time="0" ud="{}" uid="3" x="644.0" y="379.0" z="0.0">)
(<kivy.uix.checkbox.CheckBox object at 0x0B5FC2D0>, <MouseMotionEvent button="left" device="mouse" double_tap_time="0" dpos="(0.0, 0.0)" dsx="0.0" dsy="0.0" dsz="0.0" dx="0.0" dy="0.0" dz="0.0" grab_current="None" grab_exclusive_class="None" grab_list="[]" grab_state="False" id="mouse3" is_double_tap="False" is_mouse_scrolling="False" is_touch="True" is_triple_tap="False" opos="(644.0, 379.0)" osx="0.503125" osy="0.47375" osz="0.0" ox="644.0" oy="379.0" oz="0.0" pos="(644.0, 379.0)" ppos="(644.0, 379.0)" profile="['pos', 'button']" psx="0.503125" psy="0.47375" psz="0.0" push_attrs="('x', 'y', 'z', 'dx', 'dy', 'dz', 'ox', 'oy', 'oz', 'px', 'py', 'pz', 'pos')" push_attrs_stack="[]" px="644.0" py="379.0" pz="0.0" shape="None" spos="(0.503125, 0.47375)" sx="0.503125" sy="0.47375" sz="0.0" time_end="-1" time_start="1507722998.229789" time_update="1507722998.229789" triple_tap_time="0" ud="{}" uid="3" x="644.0" y="379.0" z="0.0">)
(<kivy.uix.checkbox.CheckBox object at 0x0B5F3768>, <MouseMotionEvent button="left" device="mouse" double_tap_time="0" dpos="(0.0, 0.0)" dsx="0.0" dsy="0.0" dsz="0.0" dx="0.0" dy="0.0" dz="0.0" grab_current="None" grab_exclusive_class="None" grab_list="[]" grab_state="False" id="mouse3" is_double_tap="False" is_mouse_scrolling="False" is_touch="True" is_triple_tap="False" opos="(644.0, 379.0)" osx="0.503125" osy="0.47375" osz="0.0" ox="644.0" oy="379.0" oz="0.0" pos="(644.0, 379.0)" ppos="(644.0, 379.0)" profile="['pos', 'button']" psx="0.503125" psy="0.47375" psz="0.0" push_attrs="('x', 'y', 'z', 'dx', 'dy', 'dz', 'ox', 'oy', 'oz', 'px', 'py', 'pz', 'pos')" push_attrs_stack="[]" px="644.0" py="379.0" pz="0.0" shape="None" spos="(0.503125, 0.47375)" sx="0.503125" sy="0.47375" sz="0.0" time_end="-1" time_start="1507722998.229789" time_update="1507722998.229789" triple_tap_time="0" ud="{}" uid="3" x="644.0" y="379.0" z="0.0">)
Actually it doesn't matter where I click. Even if I click outside the GridLayout, all labels always trigger the touch event. But why? I only want the one under my finger.
Thanks
First: why is the text aligned to the center? I have already set absolute sizes for both the label and its texture size, and set halign='left'. But the text is still aligned to the center. Why?
You're setting texture_size, but you should set text_size instead.
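A minimal sketch of that change, reusing lb_width and self.choice_height from the question:
lb = Label(markup=True, text=choice, halign='left', valign='middle',
           size_hint=(None, None), size=(lb_width, self.choice_height))
# Constrain the text layout box so halign/valign take effect.
lb.text_size = (lb_width, self.choice_height)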
Second: I wanted the labels to be clickable/touchable. E.g. the checkboxes should be selected by touching their corresponding labels. Whenever I click on a single label or checkbox, this is printed:
In Kivy, every widget receives touch events. You should manually check whether the touch happened inside your label:
def on_touch_down(self, touch):
    if self.collide_point(*touch.pos):
        # The touch has occurred inside the widget's area. Do stuff!
        pass
Here is the answer to your second question.
Note
By default, touch events are dispatched to all currently displayed widgets. This means widgets receive the touch event whether it occurs within their physical area or not.
In order to provide the maximum flexibility, Kivy dispatches the events to all the widgets and lets them decide how to react to them.
If you only want to respond to touch events inside the widget, you simply check:
Example
class ProjectSelectButton(Button):
    def click_on_button(self, instance, touch, *args):
        print(instance)
        if self.collide_point(*touch.pos):
            if touch.button == 'right':
                print(self.id, "right mouse clicked")
            elif touch.button == 'left':
                print(self.id, "left mouse clicked")
            return True
        return super(ProjectSelectButton, self).on_touch_down(touch)
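Applied to the dialog from the question, a minimal sketch (assuming the cb/lb pairs created in build_extra_content) would bind the label rather than the checkbox and use collide_point before toggling:
def _make_label_handler(checkbox):
    # Build a per-row handler so each label toggles its own checkbox
    # (a plain closure over cb inside the loop would late-bind to the last row).
    def _lb_press(label, touch):
        if label.collide_point(*touch.pos):
            checkbox.active = True   # select this choice in the group
            return True              # consume the touch
        return False
    return _lb_press

# inside the for loop of build_extra_content:
# lb.bind(on_touch_down=_make_label_handler(cb))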

Qt 5.4 Linux Touchscreen Input with Tslib on Raspberry Pi failing with LinuxFB QPA Platform Plugin

I bought a Tontec 2.4 Inch Touchscreen ( http://elinux.org/MZTX-PI-EXT ) for my Raspberry Pi. The touchscreen controller requires the "tsc2007.ko" and "tsp_raspi.ko" kernel modules as described in the elinux post. The tsc2007.ko module is in the Raspbian Kernel tree but the tsp_raspi.ko can be found here: https://github.com/osandov/raspi/tree/master/tsc2007.
I've cross compiled a new Kernel for the Pi with those modules and they load fine and create a /dev/input/event0 device in Raspbian. If I 'evtest' that device and touch the screen, I get output so I know the events are being delivered in Linux:
pi@raspberry /dev/input $ evtest
Available devices:
/dev/input/event0:  TSC2007 Touchscreen
Select the device event number [0-0]: 0
Input driver version is 1.0.1
Input device ID: bus 0x18 vendor 0x0 product 0x0 version 0x0
Input device name: "TSC2007 Touchscreen"
Supported events:
  Event type 0 (EV_SYN)
  Event type 1 (EV_KEY)
    Event code 330 (BTN_TOUCH)
  Event type 3 (EV_ABS)
    Event code 0 (ABS_X)
      Value   1922
      Min        0
      Max     4095
      Fuzz      64
    Event code 1 (ABS_Y)
      Value   2221
      Min        0
      Max     4095
      Fuzz      64
    Event code 24 (ABS_PRESSURE)
      Value      0
      Min        0
      Max     4095
      Fuzz      64
Properties:
Testing ... (interrupt to exit)
Event: time 1425521704.199489, type 1 (EV_KEY), code 330 (BTN_TOUCH), value 1
Event: time 1425521704.199489, type 3 (EV_ABS), code 1 (ABS_Y), value 2085
Event: time 1425521704.199489, type 3 (EV_ABS), code 24 (ABS_PRESSURE), value 538
Event: time 1425521704.199489, -------------- SYN_REPORT ------------
Event: time 1425521704.209174, type 3 (EV_ABS), code 0 (ABS_X), value 1455
...
I installed tslib and ran a quick ts_calibrate. I also made sure that ts_test spit out data when I touched the screen.
I added the following environment variables to /etc/profile for tslib support in Qt5:
## For Qt5 Touchscreen Support
export QT_DEBUG_PLUGINS=1
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/arm-linux-gnueabihf/
export QT_PLUGIN_PATH=/usr/lib/plugins
export QT_QPA_FONTDIR=/usr/lib/fonts
export QT_QPA_PLATFORM_PLUGIN_PATH=/usr/lib/plugins/platforms
export QT_QPA_PLATFORM=linuxfb
export QT_QPA_GENERIC_PLUGINS=tslib:/dev/input/event0
export QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS=/dev/input/event0
export TSLIB_TSEVENTTYPE='INPUT'
export TSLIB_CALIBFILE='/etc/pointercal'
export TSLIB_CONFFILE='/etc/ts.conf'
export TSLIB_CONSOLEDEVICE='none'
export TSLIB_FBDEVICE='/dev/fb0'
export TSLIB_PLUGINDIR='/usr/lib/ts'
export TSLIB_TSDEVICE='/dev/input/event0'
I read up on the Qt5 docs and how to get touch events in my app. I have a main Widget and set the appropriate flags in the constructor:
MainWidget::MainWidget(QLabel *parent)
    : QLabel(parent)
{
    qDebug() << "Setting WA_AcceptTouchEvents on MainWidget...";
    // Accept touch events
    setAttribute(Qt::WA_AcceptTouchEvents);
    setAttribute(Qt::WA_StaticContents);
}
I set up an event filter to try to catch the touch events:
bool MainWidget::eventFilter( QObject* target, QEvent* e )
{
    qDebug() << "Event Type: " << e->type();
    return false;
    return QLabel::eventFilter(target, e);
}
I launch my app like this:
myapp -platform linuxfb:fb=/dev/fb0 -plugin tslib:/dev/input/event0
I also uncommented a single printf in Qt's source code for the qtslib.cpp:
void QTsLibMouseHandler::readMouseData()
{
    ts_sample sample;
    while (get_sample(m_dev, &sample, m_rawMode)) {
        bool pressed = sample.pressure;
        int x = sample.x;
        int y = sample.y;

        // work around missing coordinates on mouse release
        if (sample.pressure == 0 && sample.x == 0 && sample.y == 0) {
            x = m_x;
            y = m_y;
        }

        if (!m_rawMode) {
            //filtering: ignore movements of 2 pixels or less
            int dx = x - m_x;
            int dy = y - m_y;
            if (dx*dx <= 4 && dy*dy <= 4 && pressed == m_pressed)
                continue;
        }
        QPoint pos(x, y);

        //printf("handleMouseEvent %d %d %d %ld\n", m_x, m_y, pressed, sample.tv.tv_usec);
        QWindowSystemInterface::handleMouseEvent(0, pos, pos, pressed ? Qt::LeftButton : Qt::NoButton);

        m_x = x;
        m_y = y;
        m_pressed = pressed;
    }
}
When I launch my Qt app I see the plugins are loading OK (it even shows the correct event0 file). I also see that the Qt tslib plugin is receiving touch events when I touch the screen. The problem is that the event filter is NEVER called!
Here is the app being launched:
Got keys from plugin meta data ("tslib", "tslibraw")
QFactoryLoader::QFactoryLoader() checking directory path "/home/pi/generic" ...
loaded library "/usr/lib/qt/plugins/generic/libqtslibplugin.so"
QTsLibMouseHandler "tslib" ""
QTsLibMouseHandler "tslib" "/dev/input/event0"
QFactoryLoader::QFactoryLoader() checking directory path "/usr/lib/qt/plugins/styles" ...
QFactoryLoader::QFactoryLoader() checking directory path "/home/pi/styles" ...
Setting WA_AcceptTouchEvents on MainWidget...
-----------------------------------------
Waiting for data now...
-----------------------------------------
handleMouseEvent 0 0 1 751196
handleMouseEvent 0 0 1 751196
handleMouseEvent 1696 1615 1 771075
handleMouseEvent 1696 1615 1 771075
handleMouseEvent 1679 1622 1 781368
handleMouseEvent 1671 1638 1 781368
handleMouseEvent 1679 1622 1 781368
handleMouseEvent 1671 1638 1 781368
...
I found a few forum posts where people are having problems with touch input with the linuxfb platform plugin:
http://comments.gmane.org/gmane.comp.lib.qt.user/5686
http://qt-project.org/forums/viewthread/35757
http://qt-project.org/forums/viewthread/36120/
I've tried all their suggestions and still have the problem - no touch events are received by my app even though the Qt tslib plugin says it is receiving them.
It seems that the tslib plugin is having problems injecting the event it receives into my app's event loop with this:
QWindowSystemInterface::handleMouseEvent(0, pos, pos, pressed ? Qt::LeftButton : Qt::NoButton);
I also tried the Qt5.4 touch fingerpaint example and see the same behavior - no touch events are received.
I'm not sure where to go from here. I would greatly appreciate any help solving this issue. Thanks!
UPDATE:
I changed my event filter so it looks like this:
bool MainWidget::eventFilter(QObject *obj, QEvent *event)
{
    qDebug() << "Event received" << obj->metaObject()->className() << event->type();
    switch (event->type()) {
    case QEvent::TouchBegin:
        qDebug() << "TouchBegin";
    case QEvent::TouchUpdate:
        qDebug() << "TouchUpdate";
    case QEvent::TouchEnd:
        qDebug() << "TouchEnd";
        {
            // QTouchEvent *touch = static_cast<QTouchEvent *>(event);
            // QList<QTouchEvent::TouchPoint> touchPoints = static_cast<QTouchEvent *>(event)->touchPoints();
            // foreach (const QTouchEvent::TouchPoint &touchPoint, touchPoints) {
            //     switch (touchPoint.state()) {
            //     case Qt::TouchPointStationary:
            //         // don't do anything if this touch point hasn't moved
            //         continue;
            //     default:
            //         {
            //         }
            //         break;
            //     }
            // }
            // break;
        }
    //default:
    //    return QLabel::event(event);
    }
    //return true;
}
Now I can see 'socket notifier' events intermingled with the Qt tslib plugin's prints whenever I touch the screen. Any ideas as to why I get event type 50 but no touch events?
Event received QSocketNotifier 50
handleMouseEvent 2702 2618 0 557715
Event received QSocketNotifier 50
handleMouseEvent 2698 2612 1 547758
Event received QSocketNotifier 50
handleMouseEvent 2706 2802 1 759928
Event received QSocketNotifier 50
Event received QSocketNotifier 50
UPDATE #2:
I installed the event filter only to try to catch any events. I'm not sure in Qt5 what translates an event type 50 ( QSocketNotifier ) to a QTouch* or QMouse* event.
Here is some more information:
When I run evtest, I see that the screen resolution is huge (~2500 x ~2500) while the actual screen is 320x240. I tried changing the /dev/fb0 framebuffer size in /boot/config.txt to 320x240 and rebooted, but the evtest and ts_calibrate steps still show the huge resolution.
Because of the large resolution, I tried making my main widget 10000x10000 to see if I would get a touch or mouse event, but I still only get the QSocketNotifier.
I then tried to force the tslib plugin to always inject events at screen position X=50, Y=50, but I still only get the event type 50 QSocketNotifier.
The problem was solved by making sure the tslib plugins were installed on the Raspberry Pi.
TSLIB_PLUGINDIR=/usr/lib/ts
The directory /usr/lib/ts was not present on the Pi.
