Are there standard 8-bit color palettes? - colors

I'm attempting to reverse engineer a file format from a logging program. It seems to use a 1-2 byte field which may be used for label colors, as it seems to be the only thing that's unexplained. Short of iterating through other label colors and generating new output files (which is a pain in the ass, hence me trying to reverse-engineer the format in the first place), are there any standard 8-bit color palettes these values may line up with? I've found some (e.g. the 216-color web-safe palette), but they're not enumerated in any particular way.
I believe 0x32 is light green (e.g. 0, 255, 0), 0x0A is cyan (0, 255, 255), and 0x28 is red. Each field may also include a trailing 0x00 byte.
Or I could be totally wrong.

Well, it could be one of the old standard VGA palettes. This is the one for mode 13h:
However, the numbers you posted seem to be off by one from the index in the usual ordering.
EDIT: Here are the 24-bit RGB values for the palette entries as a Python-formatted array. It is a bit long, so be warned!
[0x000000, 0x0000a8, 0x00a800, 0x00a8a8, 0xa80000, 0xa800a8, 0xa85400, 0xa8a8a8, 0x545454, 0x5454fc, 0x54fc54, 0x54fcfc, 0xfc5454, 0xfc54fc, 0xfcfc54, 0xfcfcfc, 0x000000, 0x141414, 0x202020, 0x2c2c2c, 0x383838, 0x444444, 0x505050, 0x606060, 0x707070, 0x808080, 0x909090, 0xa0a0a0, 0xb4b4b4, 0xc8c8c8, 0xe0e0e0, 0xfcfcfc, 0x0000fc, 0x4000fc, 0x7c00fc, 0xbc00fc, 0xfc00fc, 0xfc00bc, 0xfc007c, 0xfc0040, 0xfc0000, 0xfc4000, 0xfc7c00, 0xfcbc00, 0xfcfc00, 0xbcfc00, 0x7cfc00, 0x40fc00, 0x00fc00, 0x00fc40, 0x00fc7c, 0x00fcbc, 0x00fcfc, 0x00bcfc, 0x007cfc, 0x0040fc, 0x7c7cfc, 0x9c7cfc, 0xbc7cfc, 0xdc7cfc, 0xfc7cfc, 0xfc7cdc, 0xfc7cbc, 0xfc7c9c, 0xfc7c7c, 0xfc9c7c, 0xfcbc7c, 0xfcdc7c, 0xfcfc7c, 0xdcfc7c, 0xbcfc7c, 0x9cfc7c, 0x7cfc7c, 0x7cfc9c, 0x7cfcbc, 0x7cfcdc, 0x7cfcfc, 0x7cdcfc, 0x7cbcfc, 0x7c9cfc, 0xb4b4fc, 0xc4b4fc, 0xd8b4fc, 0xe8b4fc, 0xfcb4fc, 0xfcb4e8, 0xfcb4d8, 0xfcb4c4, 0xfcb4b4, 0xfcc4b4, 0xfcd8b4, 0xfce8b4, 0xfcfcb4, 0xe8fcb4, 0xd8fcb4, 0xc4fcb4, 0xb4fcb4, 0xb4fcc4, 0xb4fcd8, 0xb4fce8, 0xb4fcfc, 0xb4e8fc, 0xb4d8fc, 0xb4c4fc, 0x000070, 0x1c0070, 0x380070, 0x540070, 0x700070, 0x700054, 0x700038, 0x70001c, 0x700000, 0x701c00, 0x703800, 0x705400, 0x707000, 0x547000, 0x387000, 0x1c7000, 0x007000, 0x00701c, 0x007038, 0x007054, 0x007070, 0x005470, 0x003870, 0x001c70, 0x383870, 0x443870, 0x543870, 0x603870, 0x703870, 0x703860, 0x703854, 0x703844, 0x703838, 0x704438, 0x705438, 0x706038, 0x707038, 0x607038, 0x547038, 0x447038, 0x387038, 0x387044, 0x387054, 0x387060, 0x387070, 0x386070, 0x385470, 0x384470, 0x505070, 0x585070, 0x605070, 0x685070, 0x705070, 0x705068, 0x705060, 0x705058, 0x705050, 0x705850, 0x706050, 0x706850, 0x707050, 0x687050, 0x607050, 0x587050, 0x507050, 0x507058, 0x507060, 0x507068, 0x507070, 0x506870, 0x506070, 0x505870, 0x000040, 0x100040, 0x200040, 0x300040, 0x400040, 0x400030, 0x400020, 0x400010, 0x400000, 0x401000, 0x402000, 0x403000, 0x404000, 0x304000, 0x204000, 0x104000, 0x004000, 0x004010, 0x004020, 0x004030, 0x004040, 0x003040, 0x002040, 0x001040, 0x202040, 0x282040, 0x302040, 0x382040, 0x402040, 0x402038, 0x402030, 0x402028, 0x402020, 0x402820, 0x403020, 0x403820, 0x404020, 0x384020, 0x304020, 0x284020, 0x204020, 0x204028, 0x204030, 0x204038, 0x204040, 0x203840, 0x203040, 0x202840, 0x2c2c40, 0x302c40, 0x342c40, 0x3c2c40, 0x402c40, 0x402c3c, 0x402c34, 0x402c30, 0x402c2c, 0x40302c, 0x40342c, 0x403c2c, 0x40402c, 0x3c402c, 0x34402c, 0x30402c, 0x2c402c, 0x2c4030, 0x2c4034, 0x2c403c, 0x2c4040, 0x2c3c40, 0x2c3440, 0x2c3040, 0x000000, 0x000000, 0x000000, 0x000000, 0x000000, 0x000000, 0x000000, 0x000000]
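If it helps to check the guesses from the question against this table, here is a minimal lookup sketch; VGA_PALETTE is an assumed name for the list above, and the off-by-one neighbour is printed alongside each guess:
from typing import Tuple

# Minimal lookup sketch; VGA_PALETTE is the list of 24-bit values above.
def vga_rgb(index: int) -> Tuple[int, int, int]:
    value = VGA_PALETTE[index]
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF

# Check the indices guessed in the question, plus the index one below
# in case the file format really is off by one.
for guess in (0x32, 0x0A, 0x28):
    print(hex(guess), vga_rgb(guess), vga_rgb(guess - 1))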

There is no single standard 8-bit palette; the palettes are hardware-specific. Teletext, for example, uses a fixed palette of pure colours.
http://en.wikipedia.org/wiki/List_of_8-bit_computer_hardware_palettes
Edit: Some devices can show different shades of a colour by adjusting brightness, but the Teletext palette itself is fixed.

Related

Openpyxl: Decode 'auto' color to rgb value

I am trying to determine the color of borders set in an Excel document. I use the openpyxl library (latest version 3.0.9) and encountered a problem extracting the RGB color code when the user sets the border via the default 'auto' color.
When I extract the cell color property, I see this (in my debugger, see screenshot): none of the cell fields are set with an RGB code, and the fields indicate some problem with extracting the color information. I assume this is due to the 'auto' color.
The index color probably refers to this in documentation:
Default Color Index as per 18.8.27 of ECMA Part 4
COLOR_INDEX = (
'00000000', '00FFFFFF', '00FF0000', '0000FF00', '000000FF', #0-4
'00FFFF00', '00FF00FF', '0000FFFF', '00000000', '00FFFFFF', #5-9
'00FF0000', '0000FF00', '000000FF', '00FFFF00', '00FF00FF', #10-14
'0000FFFF', '00800000', '00008000', '00000080', '00808000', #15-19
'00800080', '00008080', '00C0C0C0', '00808080', '009999FF', #20-24
'00993366', '00FFFFCC', '00CCFFFF', '00660066', '00FF8080', #25-29
'000066CC', '00CCCCFF', '00000080', '00FF00FF', '00FFFF00', #30-34
'0000FFFF', '00800080', '00800000', '00008080', '000000FF', #35-39
'0000CCFF', '00CCFFFF', '00CCFFCC', '00FFFF99', '0099CCFF', #40-44
'00FF99CC', '00CC99FF', '00FFCC99', '003366FF', '0033CCCC', #45-49
'0099CC00', '00FFCC00', '00FF9900', '00FF6600', '00666699', #50-54
'00969696', '00003366', '00339966', '00003300', '00333300', #55-59
'00993300', '00993366', '00333399', '00333333', #60-63
)
# indices 64 and 65 are reserved for the system foreground and background colours respectively
The documentation says that Index 64 is reserved so there is no point in trying to access the color array (max index is 63).
Source: https://openpyxl.readthedocs.io/en/stable/_modules/openpyxl/styles/colors.html
I could be brave and treat index 64 as black, which seems to be the usual auto-color anyway, but I am wondering whether there is a proper way to decode the 'auto' color from Excel?
System colours are set by the application. Windows defaults are a white background and black foreground. You can check the colors section of styles.xml to see if these have been overridden, but really the client application gets to decide.
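For what it's worth, here is a rough sketch of one way to handle this when reading a border colour with openpyxl, falling back to black for the reserved system indices; the fallback value is my own assumption, not something openpyxl provides:
from openpyxl.styles.colors import COLOR_INDEX

def border_color_to_argb(color, system_fg='00000000'):
    # color is an openpyxl Color object, e.g. cell.border.top.color, or None.
    if color is None:
        return system_fg                        # nothing recorded: treat as 'auto' foreground
    if color.type == 'rgb' and color.rgb:
        return color.rgb                        # explicit ARGB string such as 'FF00B050'
    if color.type == 'indexed':
        if color.indexed < len(COLOR_INDEX):
            return COLOR_INDEX[color.indexed]   # legacy indexed palette (0-63)
        return system_fg                        # 64/65: system foreground/background
    return system_fg                            # theme/auto colours: fall back as well

# Usage: border_color_to_argb(ws['A1'].border.top.color)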

What is the formula to generate two colour shade combos?

Attaching a sample image, in which you can see a vivid pattern using two colours and the shades in between.
I earlier thought of using the difference between the two colours and adding random values in that range to the smaller (hex) colour, but it gave me a very unpleasant palette.
The first image is what I need; the second is what I get.
When you generate colors, you should remember that they are built up from (R)ed (G)reen (B)lue, and sometimes Alpha-transparency.
Think of it: if you pick a random number between #000000 and #ffffff, you could get something like #840523. Note that it's not the gray you might have expected.
If you want it "random", you should pick random values for each channel.
So, in your example, do this:
Color1: #297EA6 split: #29 #7E #A6
Color2: #00101C split: #00 #10 #1C
Red : get a random value between #29 and #00 --> #20?
Green: get a random value between #7E and #10 --> #61?
Blue : get a random value between #A6 and #1C --> #8D?
New Color: #20 + #61 +#8D --> #20618D
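A minimal Python sketch of that per-channel approach (the function name is just for illustration):
import random

def random_shade(color1, color2):
    # color1/color2 are '#RRGGBB' strings; pick every channel somewhere between the two.
    c1 = [int(color1[i:i + 2], 16) for i in (1, 3, 5)]
    c2 = [int(color2[i:i + 2], 16) for i in (1, 3, 5)]
    mixed = [random.randint(min(a, b), max(a, b)) for a, b in zip(c1, c2)]
    return '#{:02X}{:02X}{:02X}'.format(*mixed)

print(random_shade('#297EA6', '#00101C'))  # e.g. '#20618D'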
If you want to keep the same tone, you should consider converting to Hue/Saturation/Luminance values and playing with those. You might get more pleasing results.
In this answer I've explained how to do what you were trying to do, but make sure to read this too, because it goes deeper into interpolating colors: Interpolate from one color to another

Process image with Imagemagick to "default palette" (for example 16 or 256 colours)

Currently I'm processing an image to extract its main colors with:
-resize '50x50' -colors '8' -colorspace 'RGB' -quantize 'RGB' '/tmp/downsampled20190502-27373-iqgqom.png'
However, it returns a lot of colours, so I want to limit them to a main palette (like white/black/red/etc.); I guess 8 or 16 colours would be enough for me.
I thought that -colors '8' would do this, however it only returns the 8 most prominent colours from the image.
Do you have any ideas about how I could extract the colours and map them to a 3-bit (8-color) palette?
I thought of converting to GIF, however GIF uses a 256-colour palette.
I think you want to map all colours to one of 8 "primaries". So, let's make a palette of acceptable colours:
convert xc:red xc:lime xc:blue xc:cyan xc:magenta xc:yellow xc:white xc:black +append palette.gif
And enlarge it and look at it (because at the moment it is only 8x1 pixels):
Now take this colorwheel:
and remap all the colours to your "acceptable" palette without dithering:
convert colorwheel.png +dither -remap palette.gif result.png
and now remap with dithering:
convert colorwheel.png -remap palette.gif result.png
You can make your own palette - you don't have to use my colours, and you can make any RGB/HSL, hex colour you like, e.g.:
convert xc:"rgb(10,20,200)" xc:"#ff7832" xc:"hsl(10,40,90)" +append palette.gif
If you want the names and hex values of the colours in the resulting images:
convert result.png -unique-colors txt:
Sample Output
# ImageMagick pixel enumeration: 7,1,65535,srgb
0,0: (65535,0,0) #FF0000 red
1,0: (0,65535,0) #00FF00 lime
2,0: (65535,65535,0) #FFFF00 yellow
3,0: (0,0,65535) #0000FF blue
4,0: (65535,0,65535) #FF00FF magenta
5,0: (0,65535,65535) #00FFFF cyan
6,0: (65535,65535,65535) #FFFFFF white
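If you would rather do the same remapping from Python instead of the command line, Pillow can quantize to a fixed palette as well. This is only a sketch, assuming Pillow is installed and colorwheel.png is the input; the spelling of the dither constant varies between Pillow versions, so a bare 0 is used here:
from PIL import Image

# Palette of "acceptable" colours: red, lime, blue, cyan, magenta, yellow, white, black.
PRIMARIES = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (0, 255, 255),
             (255, 0, 255), (255, 255, 0), (255, 255, 255), (0, 0, 0)]

palette_img = Image.new('P', (1, 1))
palette_img.putpalette([channel for rgb in PRIMARIES for channel in rgb])

img = Image.open('colorwheel.png').convert('RGB')
remapped = img.quantize(palette=palette_img, dither=0)  # 0 = no dithering
remapped.convert('RGB').save('result.png')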

Pygame: Fill transparent areas of text with a color

I have several fonts that I would like to use that are basically outlines of letters, but the insides are transparent. How would I go about filling only the inside areas of these fonts with a color? I suspect it would involve the special blitting BLEND_RGBA modes, but I am not familiar with their functionality.
Here is an example of the font I am using:
https://www.dafont.com/fipps.font?back=bitmap
Right now, I am simply rendering the font onto a surface, and I've written a helper function for that. Ideally I would be able to integrate this into my function.
def renderText(surface, text, font, color, position):
    x, y = position[0], position[1]
    width, height = font.size(text)
    position = x - width // 2, y - height // 2
    render = font.render(text, 1, color)
    surface.blit(render, position)
Thank you so much for any help you can give me!
An option is to define a surface the size of the text, fill that with the color you want, and blit the text on that. For example you could do this:
text = font.render('Hello World!', True, (255, 255, 255))
temp_surface = pygame.Surface(text.get_size())
temp_surface.fill((192, 192, 192))
temp_surface.blit(text, (0, 0))
screen.blit(temp_surface, (0, 0))
This will create a temporary surface that fills in the transparent pixels of the text surface. There is another option of using set_at(), but it's too expensive in processing power for what you are doing and is best used for pre-processing surfaces.
I'm certain that a better option using BLEND_RGBA_MULT will come from a more experienced user; I'm not too good with blending modes either.
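If you want to fold this into the helper from the question, it might look something like this sketch (fill_color is an assumed extra parameter):
import pygame

def renderText(surface, text, font, color, position, fill_color=(192, 192, 192)):
    # Render the outline font, then blit it onto a solid background surface
    # so the transparent interior of each glyph shows fill_color.
    render = font.render(text, True, color)
    width, height = render.get_size()
    temp_surface = pygame.Surface((width, height))
    temp_surface.fill(fill_color)
    temp_surface.blit(render, (0, 0))
    surface.blit(temp_surface, (position[0] - width // 2, position[1] - height // 2))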

About skia antialias

Recently I have been learning the Skia library (Google's open-source 2D engine, used on Android, Chromium, etc.). I want to use it on Windows instead of GDI+, since GDI+ doesn't support clipping an area with antialiasing. While doing so, I found a problem at the pixel level.
The upper rectangle has antialiasing set; the lower one does not.
The main code is:
paint.setStrokeWidth(1);
paint.setStyle(SkPaint::kStroke_Style);
paint.setAntiAlias(true);
canvas.drawRect(skrect,paint); //draw up rect
skrect.fTop += 110;
skrect.fBottom += 110;
paint.setAntiAlias(false);
canvas.drawRect(skrect, paint); //draw down rect
As you can see, for the same rect, if I don't set antialiasing the border is 1 pixel wide (I set the stroke width to 1), but if I do set antialiasing the border is 2 pixels wide and becomes a bit lighter, even though I set the color to black.
I don't know why. Can anyone tell me?
Thanks.
Now maybe I know.
The Skia canvas seems to behave like HTML5's canvas: every stroke has an infinitely thin center line, and the stroke width extends outward from that center. So if we draw a 1px line, it actually covers two half-pixel strips; the display device can't light half a pixel, so it fills 2 pixels and makes the color lighter to approximate the coverage.
I will search for more material to prove it.
This happens because OpenGL assumes that each pixel is centered at [0.5; 0.5]. The rasterizer then has to draw two pixels and interpolate the alpha. Read more here.
If you would like to draw sharp 1px lines with antialiasing turned on, just offset the coordinates of the line or other figure by 0.5f.
For example, a point at [10; 10] should have coordinates [10.5; 10.5], and so on.
