Semitransparency / Alphalevels in J2ME? - java-me

My first question on stackoverflow, so my apologies if I'm missing something.
I read in a reply in another J2ME transparency question from 2009 here at stackoverflow that "You should note that alpha is sometimes ignored on some implementations and sometimes quantized to ugly levels (Some motorola phones snap alpha values to the nearest 2-bit value)."
I am currently experimenting with transparency in J2ME, testing on rather new devices (at least in the J2ME world).
They all return 256 when I call Display.numAlphaLevels();
Yet, I can only count at most 8 actual levels (and mostly only 5 that I can tell apart) when I set an image to slowly fade in.
And it seems it is, as the above quote states, the nearest 2-bit value:
1% visible at level 2-3
3% visible at level 4-7
6% visible at level 8-15
12% visible at level 16-31
25% visible at level 32-63
50% visible at level 64-127
100% visible at level 128-255
Tested on a Sony Ericsson Aino, a Vivaz, and a Nokia N8, with both an 8-bit PNG and a 24-bit PNG.
I consider those devices to be some of the latest models that can run J2ME.
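Roughly, my fade test looks like this (a simplified sketch rather than the exact test code; the FadeCanvas class and the setAlpha helper are just illustrative names). It scales the image's per-pixel alpha and draws the result with Graphics.drawRGB(..., true), so the device's alpha blending is exercised directly:

    import javax.microedition.lcdui.Canvas;
    import javax.microedition.lcdui.Graphics;
    import javax.microedition.lcdui.Image;

    public class FadeCanvas extends Canvas {
        private final int[] basePixels;   // original ARGB data of the test image
        private final int[] fadedPixels;  // working buffer with rescaled alpha
        private final int imgWidth;
        private final int imgHeight;

        public FadeCanvas(Image img) {
            imgWidth = img.getWidth();
            imgHeight = img.getHeight();
            basePixels = new int[imgWidth * imgHeight];
            fadedPixels = new int[imgWidth * imgHeight];
            img.getRGB(basePixels, 0, imgWidth, 0, 0, imgWidth, imgHeight);
        }

        // Set a global alpha (0..255) by scaling each pixel's own alpha.
        public void setAlpha(int alpha) {
            for (int i = 0; i < basePixels.length; i++) {
                int srcAlpha = (basePixels[i] >>> 24) & 0xFF;
                int scaled = (srcAlpha * alpha) / 255;
                fadedPixels[i] = (scaled << 24) | (basePixels[i] & 0x00FFFFFF);
            }
            repaint();
        }

        protected void paint(Graphics g) {
            g.setColor(0xFFFFFF);                              // white background
            g.fillRect(0, 0, getWidth(), getHeight());
            g.drawRGB(fadedPixels, 0, imgWidth, 0, 0, imgWidth, imgHeight, true);
        }
    }

A MIDlet then calls setAlpha() with slowly increasing values on a timer, and I count how many visually distinct steps actually appear.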
The above quote makes it sound like not all J2ME-enabled phones behave like this, though. And that's why I ask my question:
Can anyone confirm that this is just how alpha levels are on J2ME-enabled phones? If not, how is it handled on other phones, and which phones are they?
Do you know of any J2ME-enabled phone that actually has all 256 levels? Or one that can show, e.g., 75% visibility?
Thanks

Well, I'm not quite sure what I did wrong, but apparently something in my test code gave me these incorrect results, which just happened to match a quote from another Java ME developer and therefore seemed plausible.
Since posting the question, the alpha levels suddenly appear the way I want them to: nice and smooth. I'm not sure what I changed in the code, and I'm too busy/lazy to investigate at the moment.
So this can be closed.

Related

Samsung and micromax tablet both picking assets from mdpi

I have two Android tablets, one Samsung and one Micromax. The Samsung has 800 x 1280 pixels at 10.1 inches (~149 ppi pixel density) and the Micromax has 600 x 1024 pixels at 10.1 inches (~118 ppi pixel density).
I want their assets to be separate, but both are picking from mdpi. How can I do that?
Is it possible to use something like drawable-w600-mdpi and drawable-w800-mdpi?
No. Have a look at the Providing Resources page over at Android Developers. In Table 1 you can find all admissible qualifiers. Personally, I would suggest you let the OS handle the layout and choose the right resource. However, with the newer numeric qualifiers (w<N>dp and h<N>dp), you might be able to get what you want, since the tablets have different sizes.
Still, unless you have a very good reason to do so, I wouldn't do it. Often when you think you need to do something that is discouraged, there is a flaw in your design, or the requirements can be met in an easier, more standard way.
Update:
As per the comments, your problem is probably related to the devices not correctly reporting their DPI and, therefore, Android not choosing the correct resource. An Android Developers blog post I recently read details the problem and shows a few examples of how to use the new numeric selectors tied to screen width and height to resolve some of those problems.
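If it helps, a quick way to see what a device actually reports is to dump its DisplayMetrics (a minimal sketch; the activity name and log tag are arbitrary):

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.DisplayMetrics;
    import android.util.Log;

    public class MetricsDumpActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Log what the device claims, so you can tell which density bucket
            // (ldpi/mdpi/hdpi/...) and which w<N>dp qualifiers will match.
            DisplayMetrics dm = getResources().getDisplayMetrics();
            Log.d("MetricsDump", "widthPixels=" + dm.widthPixels
                    + " heightPixels=" + dm.heightPixels
                    + " densityDpi=" + dm.densityDpi
                    + " density=" + dm.density
                    + " widthDp=" + (int) (dm.widthPixels / dm.density));
        }
    }

If both tablets report densityDpi=160, they both land in the mdpi bucket regardless of their physical ppi, which would match the behaviour you are seeing.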

Android Qualifiers Not Working for the Notion Ink Adam

I'm making an app and I'm nearing completion; now I'm trying to optimize it for different screen sizes and pixel densities. One of the devices (using an emulator) is really frustrating me. I can't seem to find a qualifier that targets the Notion Ink Adam (1024x800 or something, 10.1 inches). According to this: http://developer.android.com/guide/practices/screens_support.html , the Notion Ink Adam at 10.1 inches should be considered "xlarge" in a qualifier. However, when I use this in my qualifier, like "layout-xlarge", the Notion Ink Adam emulator doesn't follow it.
I also tried using "layout-xlarge-hdpi" because I have another folder, "layout-hdpi", that the Notion Ink Adam follows, but I'm using THAT qualifier for other devices. I've also tried "layout-hdpi-long", but it also affects my other "long" hdpi devices. The Notion Ink Adam is a tablet, and I'm just trying to separate: 1) tablets like the Notion Ink Adam, 2) MDPI screens, the smaller screens, and 3) long hdpi screens like the Nexus One and Motorola Droid.
My main problem is finding a qualifier that separates 1 and 3; the tablet always follows my qualifier for the long hdpi screens.
Support for xlarge devices was introduced only in Android 2.3 (Gingerbread) and later. If your Adam is still running Froyo, it will report itself as "large" and will not find xlarge resources.
I developed an app, "ScreenInfo", which will cause an Android device to report its screen size and density classification. You can find it in the Market, or grab the source.
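Roughly, that kind of check boils down to the following (a sketch with my own class name, not the actual ScreenInfo source):

    import android.app.Activity;
    import android.content.res.Configuration;
    import android.os.Bundle;
    import android.util.DisplayMetrics;
    import android.util.Log;

    public class SizeBucketActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            // Which size bucket does the device claim to be?
            int sizeBucket = getResources().getConfiguration().screenLayout
                    & Configuration.SCREENLAYOUT_SIZE_MASK;
            String size;
            switch (sizeBucket) {
                case Configuration.SCREENLAYOUT_SIZE_SMALL:  size = "small";  break;
                case Configuration.SCREENLAYOUT_SIZE_NORMAL: size = "normal"; break;
                case Configuration.SCREENLAYOUT_SIZE_LARGE:  size = "large";  break;
                case 4: /* SCREENLAYOUT_SIZE_XLARGE, only defined from API 9 on */
                    size = "xlarge"; break;
                default: size = "undefined"; break;
            }

            // And which density bucket?
            DisplayMetrics dm = getResources().getDisplayMetrics();
            Log.d("SizeBucket", "size=" + size + ", densityDpi=" + dm.densityDpi);
        }
    }

On a Froyo Adam this should log size=large, which is consistent with the point above: a pre-Gingerbread device cannot classify itself as xlarge.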
To help you sort out the various categories:
small-screened phones (like the original G1): normal-mdpi
most high-end smartphones w/3.7-4.5 inch screens: normal-hdpi
small-screened tablets (7-inch): large-mdpi, or in the case of the Galaxy Tab 7, large-hdpi
large-screened tablets (10-inch): xlarge-mdpi
As far as I'm aware, you're already doing everything correctly: using -xlarge for tablets, and -hdpi, -mdpi, and so on for the appropriate screen densities. If the Adam's emulator (or the actual device) doesn't pull from the -xlarge layout already, it's probably in your best interest to simply ignore it. It's not a particularly popular tablet now that the Android 3.x devices are out (it probably wasn't even before that, but I don't know), and if they're ignoring standards, that's all the more reason to ignore them in favor of what works for the majority of devices.
In terms of common qualifiers, I'm not sure what you mean, but if you go by the information in the documentation you linked, that's what's "common."
The Adam reports itself as a large device, so xlarge resources won't work on the Adam.

What's the standard "minimum" resolution I should support with a website? [duplicate]

Duplicate:
Recommended website resolution (width and height)?
I tend to think of 1024x768 as the minimum Screen Resolution that a modern web browser will run in, but I worry when using this resolution for a business website because I feel that I might be hurting the folks out there who are stuck with something smaller/older. So I ask, realistically, what is the minimum screen resolution I should expect my website to function perfectly in with the browser "maximized"?
Look to the Netbooks for a new minimum. I'd say 1024x600 is reasonable.
Edit: You can always look to any number of sites that give you statistics on browser usage. Here's one that Google turned up for me:
http://www.w3schools.com/browsers/browsers_display.asp
From an article in Jakob Nielsen's Alertbox called Screen Resolution and Page Layout:
Optimize Web pages for 1024x768, but use a liquid layout that stretches well for any resolution, from 800x600 to 1280x1024.
Depends on your audience. If it's mainly American consumers at home, then I think you're safe with 1024x768. For schools, corporate and global international audiences you'll want 800x600 because schools and businesses are less likely to have upgraded computers, and international audiences in various countries may not have larger screens available for whatever reason.
Can I also suggest you test a maximum resolution as well. Many sites are unusable (without zooming) at 1920x1200 due to people using fixed font sizes and the like.
Dear God, have we forgotten?
The WHOLE POINT of HTML - a LOGICAL page description language - is that you NEVER have to think or worry about the display device.
What happens if the display device is a text-reader for the blind?
Or a text-only browser on a console?
But that's not the main point; the main point is that HTML LOGICALLY describes the page. If you in your logical description of the page are making PURELY PHYSICAL descriptions then you GOT IT WRONG. You're writing web-pages like you're Word emitting HTML!
You need to write your web-site so it works LOGICALLY - which is to say, you leave the problem of rendering PURELY in the hands of the rendering agent. If you're not doing that, you've got it WRONG.
Consider using a fluid layout that adapts to the user's screen. Most sites with a fixed layout force the majority of users to view the site targeted to the least common denominator even though 90% of the visitors have a much higher resolution available. This results in layouts that are overly populated with navigational chrome and little content.
If you must use a fixed layout, consider taking a cue from MSN where you split the screen into 760 and 224 pixel columns. If the visitor has a resolution of 800 (which you can detect in JavaScript) then hide the 224 pixel column.
UPDATE from comments: As for determining a safe minimum, I'd set your screen to 800x600, then browse some of the popular general-public sites (MSN, Yahoo, etc.) and see what they do. It's a good bet they've invested a lot of research in this area, and adopting what they've done is usually safe.
1024x768 is fine. Most people have that resolution setting, and the ones who don't won't have a heavily compromised user experience. Also, to ensure your page fits nicely into the browser, taking into account scroll bars and such, make the width of your pages 960px.
1) My browser is not maximized. The size of my screen doesn't matter. The size of my browser window does.
2) The iPhone's screen resolution is 480x320. NewEgg currently lists at least one 1920x1080 monitor for under $200. Designing to either of those resolutions will make your site completely unusable on the other. Even if you split the difference and design to 1024x768, you'll get a stripe covering half the screen width on the $200 monitor (which, IMO, looks like crap) and it will still be completely unusable on the iPhone.
Screens aren't just getting bigger. They're also getting smaller. The trend is moving to fluid layout instead of fixed-width and it's for a damn good reason.
I usually design websites 800 wide.
Height isn't a problem, as the user can scroll.
As Mark said, there are a number of netbooks around now.
Most of them now have the 1024x600 size, but there are also some of the "older" netbooks that have a lower resolution than that. Mine, for example, has *wince* 800x480.
If you want to be really compatible, go for 800; otherwise, I'd say you're good with 1024, and as for the height, the user can always scroll.
Don't forget that scrollbars, toolbars, and sidebars can constrict the space a little. Even if you assume the resolution is at least 1024*768, don't make your page 1024 wide.
Definitely 1024 wide (as in 980px or so usable), but please don't design for a fixed height.
I'd take a look at these statistics: http://www.w3schools.com/browsers/browsers_display.asp.
As of January 2009, only 4% of people visiting W3Schools were using 800x600 as their resolution. The rest were using at least 1024x768.
Beware of how much of that 4% could be part of your users, though.

Are web-safe colors still relevant?

Since the vast majority of monitors are 16-bit color or more, including mobile devices, does it make sense to even consider web-safe colors when choosing color schemes? Or is it something that ought to be relegated to history as a piece of trivia?
For those of you that don't know what web-safe colors are:
Another set of 216 color values is commonly considered to be the "web-safe" color palette, developed at a time when many computer displays were only capable of displaying 256 colors. A set of colors was needed that could be shown without dithering on 256-color displays; the number 216 was chosen partly because computer operating systems customarily reserved sixteen to twenty colors for their own use; it was also selected because it allows exactly six shades each of red, green, and blue (6 × 6 × 6 = 216).
The list of colors is often presented as if it has special properties that render them immune to dithering. In fact, on 256-color displays applications can set a palette of any selection of colors that they choose, dithering the rest. These colors were chosen specifically because they matched the palettes selected by the then leading browser applications. [Wikipedia]
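To make the 6 × 6 × 6 arithmetic concrete, here is a small illustrative sketch (my own, not from the quoted source) that enumerates the 216 web-safe values:

    // Enumerate the classic 216-color web-safe palette: six shades
    // (00, 33, 66, 99, CC, FF) per channel, 6 * 6 * 6 = 216 combinations.
    public class WebSafePalette {
        public static void main(String[] args) {
            int[] shades = {0x00, 0x33, 0x66, 0x99, 0xCC, 0xFF};
            int count = 0;
            for (int r = 0; r < shades.length; r++)
                for (int g = 0; g < shades.length; g++)
                    for (int b = 0; b < shades.length; b++) {
                        System.out.printf("#%02X%02X%02X%n",
                                shades[r], shades[g], shades[b]);
                        count++;
                    }
            System.out.println("total: " + count); // prints 216
        }
    }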
For me, the web-safe color palette is no longer a primary concern. Optimize for the largest target audience.
According to W3Schools' visitor statistics:
In January 2009, 1% of site visitors had 256-color displays, while 95% had 24- or 32-bit color.
[Update] In January 2015, 0.5% had 256 colours, 0.5% had 24-bit and 99% had 32-bit.
I found similar numbers from a business app site that I look after:
32-bit 79.01%
24-bit 15.64%
16-bit 5.27%
8-bit 0.08%
I don't think web safe colors are relevant any more. To me, a much bigger problem for smartphones are all the fixed-width 960-pixel wide web pages.
I think the most important thing when choosing a colour palette is keeping in mind colour-blindness. There are a few different types that I know of, but the main thing is making sure that you have enough contrast between colours.
For example green text on a red background might be easier for some to read, but very difficult or maybe impossible for others (5-10% of males!), especially if the values of the colours are close.
For those of us (like me) who didn't know exactly what web-safe colors are: see the Wikipedia excerpt quoted in the question above.
It's hard to imagine any of this applying to today's modern displays, since almost nobody runs their display in 256 colors anymore (unless perhaps they are playing an old version of Leisure Suit Larry).
It depends on what you mean by web-safe colours.
In terms of 16-bit colour, it's probably not worth worrying about. However, colours do not appear the same across devices. This can lead to all sorts of problems, particularly if a designer's gamma settings are different from your particular monitor setup.
So you still need to test your design across multiple setups.
In my opinion, it's just history.
Yes, it's definitely a thing of the past. Place its importance right next to your marquee tags.
IMHO the point is really moot. Colors that aren't web safe are dithered anyway. It may not look the best in 256-color modes, but as long as the functional elements of the page/application do not depend on those colors, it will not disturb the user experience that much.
Also, most users surfing in 256-color modes will be aware that colors will be dithered, as I don't think a lot of sites adhere to web-safe color schemes anymore.
According to research, even the web safe colors were not web safe. It was an interesting idea while it was relevant, thankfully that's over now.
Web safe colors are pretty much not a problem anymore unless you are dealing with consumers that will have legacy (think > 10 year old) video display equipment.
It's still important if you target very poor or developing nations, such as the countries here in South Asia. I personally have a full-blown IPS monitor with Windows 10, so it's not a problem for me, but we are a minority; the majority have old hardware, computers, and operating systems, except for mobile phones, since it's cheap to buy the latest mobile phone but computer hardware is expensive due to taxes, import costs, etc., compared to the salary of an average person. I have personally witnessed many people still using old Windows XP and 98 PCs with 256 colors on Pentium 4 processors. So if you target such an audience, it's better to use web-safe colors. If you are running a business it's probably not worth it, as these users are less likely to be your customers; but if you are building an information site, a blog, or an activist site that people can read and get informed from without having to pay for anything, then always provide a fallback theme with web-safe colors. Since such people are mostly on Windows XP/98, try to detect the OS, and if the user is on one of those operating systems, serve the fallback theme.
So remember that most people in this world are poor, and most people still use old hardware and technologies. If you want to cover them all without doing a lot of extra work on your current theme, always use fallback themes: one for old mobiles, one for old desktop displays, and one for modern displays and modern mobiles (responsive).
Some colors do not display on some mobile devices. (trying to make a list)
The title bars are supposed to be a blue fade from CSS:
background: linear-gradient(to bottom, #0099CC1, #0033CC) repeat-x scroll 0 0 #006DCC;
On many devices the background is not visible, and the header looks like white-on-white.
So, I'm just trying to explain a tip for determining browser-safe colors just by looking at the hex code.
A hex color #xxyyzz is browser safe if each pair of digits is a doubled digit:
positions 1-2, i.e. the two x digits, are the same
positions 3-4, i.e. the two y digits, are the same
positions 5-6, i.e. the two z digits, are the same
and the repeated digit in each pair is one of the allowed values: 0, 3, 6, 9, C, F.
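Expressed as code, the same rule looks roughly like this (an illustrative sketch; the class and method names are mine):

    // Illustrative sketch of the rule above: a 6-digit hex color is "web safe"
    // when each channel is a doubled digit from {0, 3, 6, 9, C, F}.
    public class WebSafe {
        private static final String ALLOWED = "0369CF";

        public static boolean isWebSafe(String hex) {
            if (hex.startsWith("#")) hex = hex.substring(1);
            if (hex.length() != 6) return false;
            hex = hex.toUpperCase();
            for (int i = 0; i < 6; i += 2) {
                char a = hex.charAt(i);
                char b = hex.charAt(i + 1);
                if (a != b || ALLOWED.indexOf(a) < 0) return false;
            }
            return true;
        }

        public static void main(String[] args) {
            System.out.println(isWebSafe("#0033CC")); // true  -> web safe
            System.out.println(isWebSafe("#0099C1")); // false -> not web safe
        }
    }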

Do all developers consider monitor quality (colors, not resolution) to be irrelevant?

I especially hear it from advocates of the "business" notebooks manufactured by IBM/Lenovo, HP, and Dell (maybe), who claim that "business users do not need quality screens". They stick in the worst possible LCDs out there (even if with a high resolution) and dare to sell that crap. You can't even distinguish hue variations like light yellow vs. light grey.
I really don't get it: do all of you agree that the color reproduction of a developer's display is irrelevant, that even a grayscale display would do?
I understand most developers work with text but... at times there is some design work to be done, which is not doable on cheap LCDs.
And besides, wouldn't you enjoy fresh, saturated colors even in a development environment? Bright, cheerful icons on menus? Isn't it better to sit in a sunny office with green trees and flowers outside the window than in a garage with dark colors and weak artificial lighting?
P.S. Inspired by the topic about keyboards: Keyboard for programmers
The question about displays and developers has really interested me for a very long time.
Even though I don't need a high quality screen, I appreciate the difference, and like esnoeijs said, an occasion will arise where I'll need to critique some graphic design work where the quality monitor will make a difference.
I think "developer" is too broad to give a precise answer.
If you are a code-crafter of programs that read text and emit text, without any need to make colors look nice, then yes, you really can go with a monochrome screen. You need black as a background, white as the foreground, and some reverse video to highlight matching braces. In this particular case, I would value a high resolution far, far more than colors, since usually it is about seeing more code (and especially more things about and around the current piece of code, like documentation, tests, a quick interpreter loop, some research paper, you name it).
If you are a developer just learning a language and you have an editor with syntax highlighting, then color is a massive, massive usability leap. I would not want to miss the ability to display keywords in bright pink, strings in bright cyan, and similar things (all on a black background).
If you are a frontend designer, then it is a completely different story: you will need a high-quality display with good color reproduction. You do not need the best one possible, but your display should at least be able to display the colors your regular user will see, so you do not put in green because you wanted blue while your users see yellow (or other nonsense).
If you use tools that encode information in color, then color is crucial, because otherwise you might miss that additional information.
...
So, I think most programmers do not need ridiculous color-display abilities, even though, most of the time, a good solid color display is helpful, because they need to work on some frontend or because they want to learn some language.
HTH,
Tetha
Better quality color monitors can come in handy in a lot of ways. The first way that comes to mind is if you are using a code development tool that has the capability of highlighting keywords such as Zend does.
I once spent half a day trying to add zebra striping to a table in my company's webapp that already had it because both my screen and QA's screen were unable to display the different colors of the zebra stripes (they rendered as the same color). Likewise, I once had my boss ask me to change the color of part of an icon, and to me it made the icon look like a uniform blue, but on his much better monitor, you could clearly see both shades of blue and it looked really nice... it was hard to make that edit without being able to see what I was doing.
I guess the developers in my company end up doing some design work in addition to real dev. I do spend most of my time in the shell though, so aside from the constant flickering that gives me headaches (yes, it's an LCD), a low-qual monitor is OK.
I'm a developer, but being in webdev land I've picked up enough design knowledge to be critical about it, so I mostly try to get Samsung screens with a good colour range.
With a good monitor, you can adjust it to your liking.
Personally, I have a $700 Fujitsu Siemens monitor bought in (afaik) 2000 and a $340 BenQ bought in 2005, and I prefer coding on the first monitor, as I don't have to crank up the brightness (reducing headaches) and can still see everything I want to see (subpixel-hinted 6-point fonts, subtle variations in syntax highlighting, etc.).
At least one author would disagree. He ranked color accuracy on four notebooks:
Lenovo ThinkPad W700
IBM/Lenovo ThinkPad T60
Dell Inspiron Mini 9
Apple late-2008 MacBook Pro 15 inch
I'm less picky about the actual monitor I have and more picky that I have two monitors that are exactly the same model and use the same video connector.
As a web developer, it can be frustrating to have colors that don't match because one of your monitors is VGA and the other is DVI.
Possibly the sort of "business user" who works on invoices all day does not need a very good display, but anyone who works on anything whose appearance counts, from software developers to business users who need to make Powerpoint presentations, does.
If you are a hardcore terminal+vim user like me, then color quality and fidelity are almost irrelevant, except for the quality of blue (which I use in some situations, like directory names), which tends to be too faint to be seen on my black background. Nothing that cannot be fixed with some tinkering, though, but I am used to blue.
That said, I actually have a couple of things to say about the new screen on the macbook unibody. The glossy finish is a real pain. So annoying. And the color fidelity is very low. I spent an evening trying to understand why on a gradient from light green to white I had a pinkish stripe. Turns out that the pink is an artifact of the macbook screen. Another screen does not show the issue. On the plus side, the LED backlight is very powerful and nice, making the colors very vibrant.
All this is to say that color fidelity is fundamental if you use color-intensive tools like Eclipse (which also communicates a lot through different shades of color), and of course for web frontend development. If you just need a terminal and vim running, I don't think color fidelity makes a real difference, once you have a comfortable setup with low reflections and good contrast.
(note: it's been a few years since I've shopped for a monitor. this may be out of date)
I find it interesting that nobody has really defined "quality" yet, other than to say more vibrant colors. Generally, LCD panels fall into one of two tracks:
Good color/image reproduction (S-IPS panels and similar)
Good response time (TN panels)
I consider S-IPS and similar panels a must for development for one crucial reason: viewing angle. The image doesn't change colors or do other weird things as your angle to the screen changes. Very important for collaboration.
At the high end of this scale are monitors that are designed to perform well with color calibration. Most developers won't need anything this fancy.
TN panels are decent for gaming, movies, and other things featuring fast motion. They are optimized for pixel response time, and it's usually the main feature touted for these panels. Many cheaper panels are going to be of this variety.
In a monitor, I look for four things:
panel type (S-IPS or similar)
brightness (no more than 300cd/m2)
dot pitch (for good text, go with a small dot pitch: 0.27 is too big)
good contrast/ light leakage/ etc. (how black is black, and how uniform)
Although I love S-IPS panels, I must admit that any LCD monitor that can meet criteria 2-4 above would be a good choice, even if it's a cheaper TN panel.
It depends on what you're doing.
If you're processing images, then yes, a good "quality" monitor is important... but it's equally (or more) important to have it set up correctly and calibrated.
If you're doing web design, having a decent monitor is important, but again only if it's set up correctly (contrast/brightness/colour balance).
If you're just "writing code", having a monitor your eyes like is important; the colour replication isn't. A monochrome monitor might be stretching it, and syntax highlighting is nice, but even vim and its 16 colours are "enough".
The term "quality" is also a bit "it depends". CRTs have far better colour replication than TFTs, but I wouldn't recommend them (I always found reading text on them difficult, and they are hard to find, bulky, and generally deprecated now).
For web design, pretty much any monitor will be fine as long as it's not a 10-year-old CRT with a broken red cathode tube. Again, as long as it's set up correctly, most monitors are capable of displaying colour "good enough".
For "writing code", I think size/resolution/number-of-screens is more important than colour replication, as shown by most answers to any of these questions
