Why do the dimensions of a Kivy app change after deployment? - python-3.x

As the title says, I built a Kivy app and deployed it to my Android phone. The app works perfectly on my laptop, but after deploying it the font size suddenly changes and becomes very small.
I can't debug this since everything works fine; the only problem is the design, or rather the UI.
Has anyone had this issue before? Do you have a suggestion for how to deal with it?
PS: I can't provide reproducible code here since everything works fine. I assume it is a limitation of the framework, but I'm not sure.

It sounds like you coded everything in terms of pixel sizes (the default units for most things). The difference on the phone is probably just that the pixels are smaller.
Use the kivy.metrics.dp helper function to apply a rough scaling according to pixel density. You'll probably find that if you currently have e.g. width: 50, then width: dp(50) will look the same on the desktop, while on the phone it will be about twice as big as before.
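For illustration, a minimal runnable sketch of that advice (the widget and the sizes here are made up, not taken from the question):

    from kivy.app import App
    from kivy.metrics import dp, sp
    from kivy.uix.button import Button

    class DemoApp(App):
        def build(self):
            # dp() converts density-independent pixels to real pixels, so the
            # button keeps roughly the same physical size on desktop and phone.
            # sp() does the same for font sizes, additionally honouring the
            # user's font-size preference.
            return Button(
                text='Click me',
                size_hint=(None, None),
                size=(dp(150), dp(50)),
                font_size=sp(18),
            )

    if __name__ == '__main__':
        DemoApp().run()

In kv language the same idea reads width: dp(50) and font_size: sp(18); dp and sp are available in kv files without any import.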
PS: I can't provide reproducible code here since everything works fine.
Providing a minimal runnable example would, in fact, have let the reader verify whether you were attempting to compensate for pixel density.

Related

Vector3.ToString "rounds" decimals after build

I'm trying to get a very precise Input.mousePosition in Unity3D with Input.mousePosition.x.ToString("F4"); and in the editor everything works fine: I get 4 decimal places, exactly how I need them. However, after I build the app, I get x.0000 all the time.
What's important is that, in the editor, I only get the non-zero decimals when I choose an exact resolution rather than an aspect ratio or Free Aspect. I have tried playing around with the build resolution settings, for example disabling all aspect ratios, setting an exact resolution, and windowed/fullscreen mode. Nothing has helped so far.
It seems like the built app somehow disables in-between pixel positions, and I have no idea why.
I know that there are plenty of questions and info on Vector3 decimals, but my problem is mostly about display resolution (in-between pixel coordinates) in Unity. Thanks in advance for any advice.

Getting a UI to be adaptive for iPhone and iPad

I'm working on a Xamarin app, and the goal is to have the UI be adaptive for both phone and tablet; so far it seems Android has a far easier way to achieve this. I'm reading this article on the matter and it honestly doesn't make a lick of sense.
All I want is for everything to grow in proportion to the view as it gets bigger because of the screen size (like Android does); every time I try to do it, the control just gets anchored to the right and dragged along with it.
I mainly just need a simple explanation of how I can have everything grow with the view.
I finally figured it out. After some deep breathing, I saw what I was missing when I was reading the documentation.
For me, it was the autosize options, which let me anchor the controls the way I wanted; everything scales correctly for me now.

Nvidia High Performance Processor Setting leads to graphical bug (Seizure Warning) with current lighting system, drawing completely in the shader code

I followed the lighting tutorial on LearnOpenGL, modifying some of the code to work in a 2D game engine. Everything was looking great, my team got our game done, and the lights were quite simple for our designers to use. However, we ran into a rare bug, as shown here: https://www.youtube.com/watch?v=to0mMP5I0cs. One team member was able to recreate the bug by switching his Nvidia settings to use the "High Performance Processor" as opposed to "Integrated Graphics"; otherwise everything renders properly. The bug doesn't appear when there are no lights and everything is rendered in its full color. We have gone through a lot of ideas already, but they haven't worked, and now I am at a loss. Does anyone have any ideas about what is going on?
Always make sure you initialize your variables. Apparently some cards and drivers automatically initialize a vec3 to (0,0,0), but others don't, and that was what was going on here: garbage values causing different colors at each fragment. Initializing the resulting color vec3 to (0,0,0) at the beginning fixed the problem.
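The fix itself is a single initializer in the fragment shader. A minimal sketch, compiled here from Python with moderngl as a stand-in harness for the original C++ engine (the shader and the light count are illustrative assumptions, not the asker's code):

    import moderngl

    # Headless GL context; shader compilation happens inside ctx.program().
    ctx = moderngl.create_standalone_context()

    prog = ctx.program(
        vertex_shader="""
            #version 330
            in vec2 in_pos;
            void main() { gl_Position = vec4(in_pos, 0.0, 1.0); }
        """,
        fragment_shader="""
            #version 330
            uniform vec3 lights[4];
            out vec4 frag_color;
            void main() {
                // The fix: explicit zero-init. Some driver/GPU combinations
                // zero-initialize locals, others leave garbage that differs
                // per fragment.
                vec3 result = vec3(0.0);
                for (int i = 0; i < 4; i++)
                    result += lights[i];
                frag_color = vec4(result, 1.0);
            }
        """,
    )
    print('shaders compiled OK')

Without the vec3(0.0) initializer the program still compiles on both GPUs; it just reads whatever happened to be in the register on cards that don't zero-initialize, which is exactly the kind of bug that only shows up on one vendor's hardware.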

DXGI: trying to get correct display mode from output (monitor)

I'm currently stuck on a pesky little issue. I developed an application that zeroes out the DXGI mode description structure and calls FindClosestMatchingMode() to, as advertised, "gravitate towards the desktop resolution".
This works fine as long as the laptop(s) run entirely on their own display -- as soon as I plug in another monitor it goes berserk. If I extend my desktop, it will still correctly get the laptop monitor's resolution, yet the attached one (running 1080p) will yield a preference for 800*480 :) (sure, poor man's 16:10, but...)
Doing the same thing with the monitors cloned/combined (which results in one output device), even if their resolutions are equal, gives the same 800*480 crap.
What gives? Has anyone found a way to properly get a display's current mode through DXGI, or a pointer towards a wholly different yet functional approach to this problem?
Life was easier back in the D3D9 days =)
-- Update
As it turns out, any FindClosestMatchingMode() call made on the IDXGIOutput instance belonging to the external monitor behaves differently (and in most cases plain wrong) compared to the internal display, even though their native resolutions are identical. To top it all off, other systems don't have this issue, yet I can't get around supporting this particular laptop, including its drivers.
Time for a good old setup dialog.
Not the best solution, but as I was constrained to these exact machines, I settled for getting the monitor's current resolution through GetSystemMetrics() (SM_CXSCREEN/SM_CYSCREEN), which admittedly only works for the primary monitor (though there are ways around that), and feeding this resolution into the ModeToMatch structure passed to FindClosestMatchingMode().
It then settles for the correct (desktop) resolution.
Better answers are very welcome of course ;)
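For reference, the query half of that workaround in miniature, done here from Python with ctypes instead of the original C++ (Windows only; the SM_* constants are the standard Win32 ones, and the DXGI side is left out):

    import ctypes

    SM_CXSCREEN = 0  # width of the primary display, in pixels
    SM_CYSCREEN = 1  # height of the primary display, in pixels

    user32 = ctypes.windll.user32
    width = user32.GetSystemMetrics(SM_CXSCREEN)
    height = user32.GetSystemMetrics(SM_CYSCREEN)

    # These two values go into the Width/Height of the DXGI_MODE_DESC
    # (the ModeToMatch argument) handed to FindClosestMatchingMode().
    print(width, height)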

Making Software ready for Retina Display - Why is this necessary?

Now that the new MacBook Pro is coming out with a Retina display, there are a lot of resources out there on how to make Mac apps, and now even websites, "Retina Display friendly". Even Google is updating Chrome for the Retina display...
Why is this necessary at all? From what I understand, Retina Display is just a higher resolution screen. Right?
I thought that when you develop GUIs for desktop software and develop websites, you are developing something that is supposed to work and scale properly at virtually any resolution... When you resize an app's window, or display it on a higher- or lower-resolution display, it is supposed to scale and display properly.
So why are these people coming out with guides on how to make something look good on a Retina Display? Shouldn't it already look fine by default? Is there something about Retina Display that I'm not understanding?
And for the record, I'm not talking about the iPhone 4 Retina display. Most iOS devs make their apps with fixed-position elements, since they know the screens won't change size or shape. So I understand the importance of developing an app to look good on the iPhone 4/4S vs. the 3G/3GS.
With the Retina display, apps don't actually scale as if they were being resized; all the controls are drawn at twice the size. If an app were scaled normally, rather than by scaling all the controls, you wouldn't see anything, because everything would be too small. It's the same difference as between the Retina and lower-resolution displays on the iPhone 4 and iPhone 3GS.
An example (two screenshots, omitted here, that are the same physical size; only the pixel densities differ). And here's how it looks when proper scaling is disabled (using an app to turn it off): http://cloudmancer.com/images/trueretina.jpg
I thought that when you develop GUIs for desktop software and develop websites, you are developing something that is supposed to work and scale properly at virtually any resolution... When you resize an app's window, or display it on a higher- or lower-resolution display, it is supposed to scale and display properly (StackOverflow, for example, uses a 960px-wide container).
From a web developer's standpoint, you are often asked to develop fixed-width websites (normally ranging from 940 to 1000 pixels wide), and they don't scale at all. There are a lot of websites like this, and many apps simply aren't designed to increase in size.
Also, apps that do grow in size usually expect that a bigger resolution also means a bigger screen, so they simply stretch the main application panels and are done with it.
Now, consider static elements, like a 150x50 button that says 'Click me'. This button is not intended to become bigger, and is perfectly acceptable on a regular 1440x900 display. Now the Retina screen comes in with its 2880x1800 resolution. The app sees the resolution change, but it thinks "Hey, that user must have a huge screen", so it keeps the button at the same pixel size.
The problem is that, because both resolutions apply to the same physical screen, the button now appears at a fraction of its original size. Depending on the user's vision, they might not be able to read the text on it, and might have a hard time clicking it, depending on the mouse settings.
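To make the shrink concrete, a quick back-of-the-envelope calculation (the 331 mm panel width is an assumed figure for a 15.4" 16:10 screen):

    # Physical width of a fixed 150 px button on the same panel
    # at the two resolutions discussed above.
    PANEL_WIDTH_MM = 331
    BUTTON_PX = 150

    for horizontal_res in (1440, 2880):
        width_mm = BUTTON_PX * PANEL_WIDTH_MM / horizontal_res
        print(f"{horizontal_res} px wide: button is {width_mm:.1f} mm across")

    # 1440 px wide: button is 34.5 mm across
    # 2880 px wide: button is 17.2 mm across -- half the physical size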
To fix that problem, Apple and Microsoft used two different solutions:
Microsoft decided to tell the app that the display has a 2880x1800 resolution, but that the user wants everything scaled to 200 DPI. This means that if an app does not follow the guidelines, it will look smaller. Many apps simply ignore the DPI setting (though this might change with Windows 8);
Apple decided to report to apps that the resolution of the monitor is 1440x900, but that it can display higher-resolution elements if asked to. This means that apps which existed before the Retina display appear the same size as before to the end user (with added benefits like crisper text if they use the default Apple APIs), but they can choose to provide high-DPI images that will look much better on the display.
Both solutions require apps to be aware that the display is high-DPI ('Retina'), but the way Apple handled it means the static websites and apps mentioned earlier keep looking just fine, except that they won't have super-crisp, high-resolution images. To opt in to the Retina features, they have to provide, for example, a 200x200 image for a 100x100 canvas, and Apple takes care of the rest.
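A toy sketch of the difference between the two reporting strategies (the numbers mirror the example above; nothing here is a real OS API):

    # The same 2880x1800 panel under the two models.
    PHYSICAL = (2880, 1800)
    SCALE = 2  # 200% / "2x"

    # Microsoft: report physical pixels plus a DPI scale the app must honour.
    # A DPI-unaware app that draws a 150 px button gets a tiny button.
    reported_to_windows_app = PHYSICAL

    # Apple: report logical points; the system maps points to pixels itself,
    # so an unmodified app still lays out against 1440x900 and keeps its size.
    reported_to_mac_app = (PHYSICAL[0] // SCALE, PHYSICAL[1] // SCALE)

    # Opting in on the Apple side: a 100x100-point canvas wants a
    # 200x200-pixel ("@2x") asset to look crisp.
    asset_px = (100 * SCALE, 100 * SCALE)

    print(reported_to_windows_app, reported_to_mac_app, asset_px)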
