I was wondering if there's a way to update the default window color when an image is buffering. Currently it's white.
For example, if I have a workspace open with just vscode, when I switch between workspaces there's a brief moment when the screen is white, and I'd prefer the monitor be painted black while the image is buffering.
Sometimes this also happens when moving windows around, but it covers a smaller portion of the screen.
Just the color
There is a setting that may be precisely the answer to your question: change the buffer color to black (check the corresponding section in the manual). In particular, for i3 version 4.21, the config would be this:
client.background #000000
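After editing your config (~/.config/i3/config or ~/.i3/config), reload i3 so the change takes effect, either with the default $mod+Shift+c binding or from a terminal:

i3-msg reload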
Getting a Compositor
If that does not solve the issue, perhaps you could address it from a different perspective. i3wm is a window manager with no compositing capability. Among other things, this sometimes causes a loss of performance in some graphical tasks, such as problems with smooth scrolling in the browser, and/or flashing colors before the desktop is rendered (in your case, white).
So you could obtain a compositor for i3wm. Compton is an option that seems popular in the community. I haven't tried it myself, but the lack of a compositor may be the root of your problem. There is an interesting related question in the i3wm FAQs. This is part of the answer:
i3 depends on an external application for compositing and Compton is an excellent choice if you want to improve rendering quality or apply hardware-accelerated translucency effects.
There are two issues I am aware of that affect stock compositing. One is screen tearing, which you may notice with animated effects such as Firefox's smooth scrolling. Another is a flash of partially rendered content when switching workspaces, or opening and closing windows. Using Compton should resolve those issues if it is configured correctly.
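If you go this route, the compositor is usually started from the i3 config itself. A minimal sketch, assuming compton is installed and on your PATH (the -b flag makes it fork to the background):

exec --no-startup-id compton -b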
What is the recommended way of supporting multiple screen resolutions/aspect ratios across devices like iPad, iPhone, Windows Phones, and Android phones/tablets? Should I simply #if/#else specific code for each device? I don't know how well this would work. Especially for Android phones/tablets which come in all different sizes. Any pointers would be greatly appreciated.
Here is what we are doing for our game:
All menu or UI elements are positioned based on the screen size (we implement horizontal and vertical alignment)
All levels scroll, so on some devices you just see less of the level at a time
Our levels also zoom in on smaller devices where needed
Design fixed levels (ones that don't scroll) so that a bit of unused space is on the edges of the screen. This way it can get cropped on some devices no problem.
Make 3 sizes of images: small (3GS), medium (iPhone 4, Android, WP7, iPad), large (iPad 3)
Position sprites/UI elements based on an image's size
Take advantage of the @2x naming scheme for images
We made an iPhone-only and an iPad-only version of the app; this helps in only having to put 2 sets of images in each app
Using the screen size for positioning is your best bet. Being able to center or dock to the bottom or right of the screen is also very helpful in general.
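To make that concrete, here is a minimal sketch of what alignment-based placement against the screen size can look like (hypothetical names, not our actual engine code):

// Minimal sketch of alignment-based UI placement: compute an element's
// top-left corner from the screen size, an anchor, and a pixel margin.
#include <cstdio>

enum class HAlign { Left, Center, Right };
enum class VAlign { Top, Middle, Bottom };

struct Size  { int w, h; };
struct Point { int x, y; };

Point Place(Size screen, Size element, HAlign h, VAlign v, int margin = 0) {
    Point p{0, 0};
    switch (h) {
        case HAlign::Left:   p.x = margin;                        break;
        case HAlign::Center: p.x = (screen.w - element.w) / 2;    break;
        case HAlign::Right:  p.x = screen.w - element.w - margin; break;
    }
    switch (v) {
        case VAlign::Top:    p.y = margin;                        break;
        case VAlign::Middle: p.y = (screen.h - element.h) / 2;    break;
        case VAlign::Bottom: p.y = screen.h - element.h - margin; break;
    }
    return p;
}

int main() {
    // The same call works unchanged at 480x320, 960x640, 1024x768, ...
    Point p = Place({1024, 768}, {200, 50}, HAlign::Right, VAlign::Bottom, 8);
    std::printf("%d,%d\n", p.x, p.y);  // prints 816,710
}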
I could tell more, but I can't reveal specifics about our game yet.
I'm currently designing my first ever GUI for Windows. I'm using MFC and Visual Studio 2008. The monitor I have been designing my program on has 1680x1050 native resolution. If I compile and send my program to one of my coworkers to run on their computer (generally a laptop running at 1024x768), my program will not fit on their screen.
I have been trying to read up on how to design an MFC application so that it will run on all resolutions, but I keep finding misleading information. Everywhere I look it seems that DLUs are supposed to resize your application for you, and that the only time you should run into problems is when you have an actual bitmap whose resolution you need to worry about. But if this is the case, why will my program no longer fit on my screen when I set my monitor to a lower resolution? Instead of my program "shrinking" to take up the same amount of screen real estate that it uses at 1680x1050, it gets huge and grainy.
The "obvious" solution here is to set my resolution to 1024x768 and redesign my program to fit on the screen. Except that I've already squished everything on my dialogs as much as possible to try and get my program to fit on screen running at 1024x768. My dialog fonts are set to Microsoft Sans Serif 8 but still appear huge (much larger than 8 points) when running at 1024x768.
I know there HAS to be a way to make my program keep the same scaling... right? Or is this the wrong way to approach the problem? What is the correct/standard way to go about designing an MFC program so that it can run on many resolutions, say 800x600 and up?
I assume your application GUI is dialog based (the main window is a dialog)?
In that case you have a problem, because, as you discovered, MFC has no support for resizing a dialog correctly. Your options are:
Redesign your GUI to use an SDI or MDI GUI.
Use a dialog resize extension. There are many available; for some very good suggestions see this question. Other options are this one and this one.
Don't use MFC. wxWidgets has much better support for dialog resizing.
MFC is only a thin wrapper over the Windows API. They both make an assumption that is hardly ever true: that if you have a higher-resolution screen, you will adjust the DPI or font size in Windows to get larger characters. Most of the time, a higher resolution just means a larger physical monitor, or a laptop where you want to squeeze as much information onto a small screen as possible; people value more information over greater detail. Thus the assumption fails.
If you can't squeeze your entire UI into the smallest size screen you need to support, you'll have to find another way to make it smaller. Without knowing anything about your UI, I might suggest using tabs to group the controls into pages.
I've had good luck making my windows resizable, so that people with larger screens can see more information at once. You need to do this the hard way: respond to the WM_SIZE message sent to the window and decide which controls should be made larger and which ones should just move.
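A rough sketch of what that can look like in MFC (the control IDs and layout constants here are hypothetical):

// On WM_SIZE, stretch the main list to fill the client area and keep
// the OK button pinned to the bottom-right corner.
void CMyDialog::OnSize(UINT nType, int cx, int cy)
{
    CDialog::OnSize(nType, cx, cy);
    if (nType == SIZE_MINIMIZED)
        return;

    const int margin = 7, btnW = 75, btnH = 23;

    if (CWnd* pList = GetDlgItem(IDC_MAIN_LIST))   // grows with the dialog
        pList->MoveWindow(margin, margin,
                          cx - 2 * margin, cy - btnH - 3 * margin);

    if (CWnd* pOk = GetDlgItem(IDOK))              // keeps its size, just moves
        pOk->MoveWindow(cx - btnW - margin, cy - btnH - margin, btnW, btnH);
}

You also need ON_WM_SIZE() in the message map and a resizable border on the dialog (the "Resizing" border style in the resource editor); the GetDlgItem checks guard against WM_SIZE arriving before the controls exist.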
There is no automatic way to resize the content of your dialogs when the resolution changes, so you need to set some boundaries.
Option 1.
If you are developing your app for customers, pick one minimum resolution (like 1024x768) and redesign your dialogs so that everything fits. Maybe break some of them up into several, or use a tab strip control.
Option 2.
Create separate dialog forms for each resolution you'd like to support, but use the same class to handle them. At runtime, detect the resolution and use the appropriate form (see the sketch after this list).
Option 3.
Write your own resizing functionality, so that the user can adjust the size of your dialogs to their liking.
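For option 2, the selection logic can be as simple as checking the screen size before creating the dialog. A sketch, with hypothetical template IDs and assuming CMyDialog forwards the template ID to CDialog:

// Pick the dialog template that fits the current screen resolution,
// but handle all of them with the same dialog class.
UINT ChooseTemplate()
{
    const int cx = ::GetSystemMetrics(SM_CXSCREEN);
    const int cy = ::GetSystemMetrics(SM_CYSCREEN);
    if (cx >= 1600 && cy >= 1000) return IDD_MYDIALOG_LARGE;   // e.g. 1680x1050
    if (cx >= 1024 && cy >= 768)  return IDD_MYDIALOG_MEDIUM;
    return IDD_MYDIALOG_SMALL;                                 // 800x600 fallback
}

void ShowMyDialog()
{
    CMyDialog dlg(ChooseTemplate());   // same class, different template
    dlg.DoModal();
}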
I love making fluid layouts, but as we all know, one of their problems is that on larger screens, text can extend horizontally to uncomfortable line lengths.
I myself only have a moderately sized screen, so I'm wondering: do people with gargantuan screens typically restore their window and set it to a more moderate width (to multitask), or do they maximize it?
As far as design goes, it's no big deal to set a max width, but still, I just have to know.
Thanks.
You can answer this either way.
Right now I have my gargantuan screen maximized; later I might not.
Catering for both is your best option, but limit the fluidness to a certain max width.
I think it depends on your focus group. "Power" users tend to not run their window maximized while others seem to prefer to have the window maximized.
I especially hear it from advocates of the "business" notebooks manufactured by IBM/Lenovo, HP, and (maybe) Dell that "business users do not need quality screens". They stick in the worst possible LCDs out there (even if high-resolution ones) and dare to sell that crap. You can't even distinguish hue variations like light yellow vs. light grey.
Am I really missing something? Do you all agree that the color reproduction of a developer's display is irrelevant, and that even a grayscale display would do?
I understand that most developers work with text but... at times there is design work to be done which is not doable on cheap LCDs.
And besides, wouldn't you enjoy fresh, saturated colors even in a development environment? Bright, cheerful icons on menus? Isn't it better to sit in a sunny office with green trees and flowers outside the window than in a garage with dark colors and weak artificial lighting?
P.S. Inspired by the topic about keyboards: Keyboard for programmers
The question about displays and developers has interested me for a very long time.
Even though I don't need a high-quality screen, I appreciate the difference, and like esnoeijs said, an occasion will arise where I'll need to critique some graphic design work, and there a quality monitor will make a difference.
I think, "developer" is too broad to give a precise answer.
If you are a code-crafter of programs reading text emitting text, without the need to make some colors look nice, then yes, then you really can go with a monochrome screen. you need black as a background, white as the foreground and some reversing to highlight matching braces. In this particular case, I would value high resolutions far far more important than colors, since usually it is about seeing more code (and especially, more things about and around the current piece of code, like documentation, tests, a quick interpreter loop, some research paper, you name it).
If you are a developer just learning a language and if you have an editor with syntax highlighting, then color is a massive, massive usability leap. I would not want to miss the ability to display keywords in a bright pink, strings in a brigt cyan and similar things (all on a black background)
If you are a frontend-designer, then it is a completely different story. If you are a frontend designer, you will need a high quality display with good color display abilities. You do not need the best one possible, but your display should at least be able to display the colors your regular user will use, so you will not put in green, because you wanted blue and your users see yellow (or other nonsenses).
if you use tools that require the use of colors in order to encode information, color is crucial, because you might be unable to see the additional information.
...
So I think most programmers do not need ridiculous color-display abilities, even though a good, solid color display is helpful most of the time, because they need to work on some frontend or want to learn some language.
HTH,
Tetha
Better-quality color monitors can come in handy in a lot of ways. The first way that comes to mind is if you are using a code development tool that can highlight keywords, as Zend does.
I once spent half a day trying to add zebra striping to a table in my company's webapp that already had it, because both my screen and QA's screen were unable to display the different colors of the zebra stripes (they rendered as the same color). Likewise, I once had my boss ask me to change the color of part of an icon; to me the icon looked like a uniform blue, but on his much better monitor you could clearly see both shades of blue, and it looked really nice... it was hard to make that edit without being able to see what I was doing.
I guess the developers in my company end up doing some design work in addition to real dev. I do spend most of my time in the shell though, so aside from the constant flickering that gives me headaches (yes, it's an LCD), a low-qual monitor is OK.
I'm a developer, but being in webdev land I've picked up enough design sense to be critical about it, so I mostly try to get Samsung screens with a good colour range.
With a good monitor, you can adjust it to your liking.
Personally, I have a $700 Fujitsu Siemens monitor bought (afaik) in 2000 and a $340 BenQ bought in 2005, and I prefer coding on the first monitor, as I don't have to crank up the brightness (reducing headaches) and can still see everything I want to see (subpixel-hinted 6-point fonts, subtle variations in syntax highlighting, etc.).
At least one author would disagree. He ranked color accuracy on four notebooks:
Lenovo ThinkPad W700
IBM/Lenovo ThinkPad T60
Dell Inspiron Mini 9
Apple late-2008 MacBook Pro 15 inch
I'm less picky about the actual monitor I have and more picky that I have two monitors that are exactly the same model and use the same video connector.
As a web developer, it can be frustrating to have colors that don't match because one of your monitors is VGA and the other is DVI.
Possibly the sort of "business user" who works on invoices all day does not need a very good display, but anyone who works on anything whose appearance counts, from software developers to business users who need to make PowerPoint presentations, does.
If you are a hardcore terminal+vim user like me, then color quality and fidelity are almost irrelevant, except for the quality of blue (which I use in some situations, like directory names), which tends to be too faint to be seen on my black background. Nothing that cannot be fixed with some tinkering, though, but I am used to blue.
That said, I actually have a couple of things to say about the new screen on the MacBook unibody. The glossy finish is a real pain. So annoying. And the color fidelity is very low. I spent an evening trying to understand why, on a gradient from light green to white, I had a pinkish stripe. Turns out the pink is an artifact of the MacBook screen; another screen does not show the issue. On the plus side, the LED backlight is very powerful and nice, making the colors very vibrant.
This is to say that color fidelity is fundamental if you use color-intensive tools like Eclipse (which communicates a lot through different shades of color), and of course for web frontend development. If you just need a terminal and vim running, I don't think color fidelity makes a real difference, once you have a comfortable setup with low reflections and good contrast.
(Note: it's been a few years since I've shopped for a monitor, so this may be out of date.)
I find it interesting that nobody has really defined "quality" yet, other than to say more vibrant colors. Generally, LCD panels fall into one of two tracks:
Good color/image reproduction (S-IPS panels and similar)
Good response time (TN panels)
I consider S-IPS and similar panels a must for development for one crucial reason: viewing angle. The image doesn't change colors or do other weird things as your angle to the screen changes. Very important for collaboration.
At the high end of this scale are monitors that are designed to perform well with color calibration. Most developers won't need anything this fancy.
TN panels are decent for gaming, movies, and other things featuring fast motion. They are optimized for pixel response time, and it's usually the main feature touted for these panels. Many cheaper panels are going to be of this variety.
In a monitor, I look for four things:
panel type (S-IPS or similar)
brightness (no more than 300 cd/m2)
dot pitch (for good text, go with a small dot pitch: 0.27 mm is too big)
good contrast/ light leakage/ etc. (how black is black, and how uniform)
Although I love S-IPS panels, I must admit that any LCD monitor that can meet criteria 2-4 above would be a good choice, even if it's a cheaper TN panel.
It depends what you're doing.
If you're processing images, then yes, a good "quality" monitor is important... but it is equally (or more) important to have it set up correctly and calibrated.
If you're doing web design, having a decent monitor is important, but again only if it's set up correctly (contrast/brightness/colour balance).
If you're just "writing code", having a monitor your eyes like is important; the colour reproduction isn't. A monochrome monitor might be stretching it, and syntax highlighting is nice, but even vim and its 16 colours is "enough".
The term "quality" is also a bit "it depends". CRTs have far better colour reproduction than TFTs, but I wouldn't recommend them (I always found reading text on them difficult, and they are hard to find, bulky, and generally deprecated now).
For web design, pretty much any monitor will be fine as long as it's not a 10-year-old CRT with a broken red tube. Again, as long as it's set up correctly, most monitors are capable of displaying colour "good enough".
For "writing code", I think size/resolution/number of screens is more important than colour reproduction, as shown by most answers to these sorts of questions.