I'm trying to "compress" an array of 16 numbers ranging from 0 to 15 into a single number.
Since each element of the array is at most 15, I can represent it with just 4 bits, so I expect to need 4 bits * 16 = 64 bits, which fits in a number.
To do this "compression" I use the following function:
function compressArray(array) {
    return array.reduce(
        (pre, curr, index) => {
            return (pre | (curr * Math.pow(2, index * 4)));
        },
        0
    );
}
But after index 8 it keeps giving me the same output, without computing the correct |.
What am I missing here?
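(An aside on the likely cause, which would be worth verifying: JavaScript's bitwise operators such as | coerce their operands to 32-bit signed integers, so bits at position 32 and above, i.e. everything from index 8 on here, get dropped. A sketch of the same packing with BigInt, which keeps all 64 bits:)

```javascript
// Same packing idea, but with BigInt so the | and << operators
// are not limited to 32 bits (a sketch, not the original code).
function compressArray(array) {
    return array.reduce(
        (pre, curr, index) => pre | (BigInt(curr) << BigInt(index * 4)),
        0n
    );
}

// 1 -> nibble 0, 2 -> nibble 1, 3 -> nibble 2
console.log(compressArray([1, 2, 3]).toString(16)); // "321"
```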
This question already has answers here:
How to convert 24 hr format time in to 12 hr Format?
(16 answers)
Closed 3 years ago.
I'm new to Groovy and I want to convert 24-hour format to 12-hour format. What code should be used for it? Are there any built-in methods?
I just want Groovy code, not Java.
Kevin's answer is correct, and should get the tick... I only post this as it's slightly shorter
import java.time.*
import static java.time.format.DateTimeFormatter.ofPattern
String time = '13:23:45'
String result = LocalTime.parse(time).format(ofPattern('h:mm:ss a'))
println result
I thought this question was somewhat similar to How to convert 24 hr format time in to 12 hr Format?. It's just that Java and Groovy share a lot of similarities. To point that out, I took Cloud's answer from the mentioned question and converted it to Groovy.
import java.time.LocalTime
import java.time.format.DateTimeFormatter
final String time = "21:49"
String result = LocalTime.parse(time, DateTimeFormatter.ofPattern("HH:mm")).format(DateTimeFormatter.ofPattern("hh:mm a"));
println(result)
If you want to build your own time function you can try to customize the code below.
final String time = "21:49"

String _24to12(final String input) {
    if (!input.contains(":"))
        throw new IllegalArgumentException("expected a time in HH:mm format")
    final String[] temp = input.split(":")
    if (temp.size() != 2)
        throw new IllegalArgumentException("expected a time in HH:mm format")
    // This does not support time strings with seconds
    int h = temp[0] as int // if h or m is not a number, a
    int m = temp[1] as int // java.lang.NumberFormatException is raised,
                           // which can be caught or left to terminate the program
    String dn
    if (h < 0 || h > 23)
        throw new IllegalArgumentException("hour can't be less than 0 or larger than 23")
    if (m < 0 || m > 59)
        throw new IllegalArgumentException("minutes can't be less than 0 or larger than 59")
    if (h == 0) {
        h = 12
        dn = "AM"
    } else if (h == 12) {
        dn = "PM"
    } else if (h > 12) {
        h -= 12
        dn = "PM"
    } else {
        dn = "AM"
    }
    // pad minutes so 21:05 becomes "9:05 PM", not "9:5 PM"
    return "${h}:${m.toString().padLeft(2, '0')} ${dn}"
}

println(_24to12(time))
How can I parse an RGB color in web color format (3 or 6 hex digits) to a Color from image/color? Does Go have any built-in parser for that?
I want to be able to parse both the #XXXXXX and #XXX color formats.
The color docs say nothing about it: https://golang.org/pkg/image/color/
but this task is very common, so I believe Go has some functions for it (which I just didn't find).
Update: I created a small Go library based on the accepted answer: github.com/g4s8/hexcolor
Foreword: I released this utility (the "2. Fast solution" below) in github.com/icza/gox, see colorx.ParseHexColor().
1. Elegant solution
Here's another solution using fmt.Sscanf(). It's certainly not the fastest solution, but it is elegant. It scans right into the fields of a color.RGBA struct:
func ParseHexColor(s string) (c color.RGBA, err error) {
    c.A = 0xff
    switch len(s) {
    case 7:
        _, err = fmt.Sscanf(s, "#%02x%02x%02x", &c.R, &c.G, &c.B)
    case 4:
        _, err = fmt.Sscanf(s, "#%1x%1x%1x", &c.R, &c.G, &c.B)
        // Double the hex digits:
        c.R *= 17
        c.G *= 17
        c.B *= 17
    default:
        err = fmt.Errorf("invalid length, must be 7 or 4")
    }
    return
}
Testing it:
hexCols := []string{
    "#112233",
    "#123",
    "#000233",
    "#023",
    "invalid",
    "#abcd",
    "#-12",
}
for _, hc := range hexCols {
    c, err := ParseHexColor(hc)
    fmt.Printf("%-7s = %3v, %v\n", hc, c, err)
}
Output (try it on the Go Playground):
#112233 = { 17 34 51 255}, <nil>
#123 = { 17 34 51 255}, <nil>
#000233 = { 0 2 51 255}, <nil>
#023 = { 0 34 51 255}, <nil>
invalid = { 0 0 0 255}, input does not match format
#abcd = { 0 0 0 255}, invalid length, must be 7 or 4
#-12 = { 0 0 0 255}, expected integer
2. Fast solution
If performance matters, fmt.Sscanf() is a really bad choice. It requires a format string which the implementation has to parse, then it parses the input according to that format, and it uses reflection to store the results in the pointed values.
Since the task is basically just "parsing" a hexadecimal value, we can do better than this. We don't even have to call into a general hex parsing library (such as encoding/hex); we can do it ourselves. We don't even have to treat the input as a string, or as a series of runes; we may lower to the level of treating it as a series of bytes. Yes, Go stores string values as UTF-8 byte sequences in memory, but if the input is a valid color string, all its bytes must be in the range 0..127, which map to runes 1-to-1. If that were not the case, the input would already be invalid, which we will detect, and what color we return in that case does not matter.
Now let's see a fast implementation:
var errInvalidFormat = errors.New("invalid format")

func ParseHexColorFast(s string) (c color.RGBA, err error) {
    c.A = 0xff

    // Check the length first so an empty input can't panic on s[0]:
    if len(s) == 0 || s[0] != '#' {
        return c, errInvalidFormat
    }

    hexToByte := func(b byte) byte {
        switch {
        case b >= '0' && b <= '9':
            return b - '0'
        case b >= 'a' && b <= 'f':
            return b - 'a' + 10
        case b >= 'A' && b <= 'F':
            return b - 'A' + 10
        }
        err = errInvalidFormat
        return 0
    }

    switch len(s) {
    case 7:
        c.R = hexToByte(s[1])<<4 + hexToByte(s[2])
        c.G = hexToByte(s[3])<<4 + hexToByte(s[4])
        c.B = hexToByte(s[5])<<4 + hexToByte(s[6])
    case 4:
        c.R = hexToByte(s[1]) * 17
        c.G = hexToByte(s[2]) * 17
        c.B = hexToByte(s[3]) * 17
    default:
        err = errInvalidFormat
    }
    return
}
Testing it with the same inputs as in the first example, the output is (try it on the Go Playground):
#112233 = { 17 34 51 255}, <nil>
#123 = { 17 34 51 255}, <nil>
#000233 = { 0 2 51 255}, <nil>
#023 = { 0 34 51 255}, <nil>
invalid = { 0 0 0 255}, invalid format
#abcd = { 0 0 0 255}, invalid format
#-12 = { 0 17 34 255}, invalid format
3. Benchmarks
Let's benchmark these 2 solutions. The benchmarking code will include calling them with long and short formats. Error case is excluded.
func BenchmarkParseHexColor(b *testing.B) {
    for i := 0; i < b.N; i++ {
        ParseHexColor("#112233")
        ParseHexColor("#123")
    }
}

func BenchmarkParseHexColorFast(b *testing.B) {
    for i := 0; i < b.N; i++ {
        ParseHexColorFast("#112233")
        ParseHexColorFast("#123")
    }
}
And here are the benchmark results:
go test -bench . -benchmem
BenchmarkParseHexColor-4 500000 2557 ns/op 144 B/op 9 allocs/op
BenchmarkParseHexColorFast-4 100000000 10.3 ns/op 0 B/op 0 allocs/op
As we can see, the "fast" solution is roughly 250 times faster and uses no allocation (unlike the "elegant" solution).
An RGBA color is just 4 bytes, one each for the red, green, blue, and alpha channels. For three or six hex digits the alpha byte is usually implied to be 0xFF (AABBCC is considered the same as AABBCCFF, as is ABC).
So parsing the color string is as simple as normalizing it, such that it is of the form "RRGGBBAA" (4 hex encoded bytes), and then decoding it:
package main

import (
    "encoding/hex"
    "fmt"
    "image/color"
    "log"
)

func main() {
    colorStr := "102030FF"
    colorStr, err := normalize(colorStr)
    if err != nil {
        log.Fatal(err)
    }
    b, err := hex.DecodeString(colorStr)
    if err != nil {
        log.Fatal(err)
    }
    color := color.RGBA{b[0], b[1], b[2], b[3]}
    fmt.Println(color) // Output: {16 32 48 255}
}

func normalize(colorStr string) (string, error) {
    // left as an exercise for the reader
    return colorStr, nil
}
Try it on the playground: https://play.golang.org/p/aCX-vyfMG4G
You can convert any 2 hex digits into an integer using strconv.ParseUint
strconv.ParseUint(str, 16, 8)
The 16 indicates base 16 (hex) and the 8 indicates the bit count, in this case, one byte.
You can use this to parse each 2 characters into their components
https://play.golang.org/p/B56B8_NvnVR
func ParseHexColor(v string) (out color.RGBA, err error) {
    out.A = 0xff // alpha is implied to be fully opaque
    if len(v) != 7 {
        return out, errors.New("hex color must be 7 characters")
    }
    if v[0] != '#' {
        return out, errors.New("hex color must start with '#'")
    }
    var red, redError = strconv.ParseUint(v[1:3], 16, 8)
    if redError != nil {
        return out, errors.New("red component invalid")
    }
    out.R = uint8(red)
    var green, greenError = strconv.ParseUint(v[3:5], 16, 8)
    if greenError != nil {
        return out, errors.New("green component invalid")
    }
    out.G = uint8(green)
    var blue, blueError = strconv.ParseUint(v[5:7], 16, 8)
    if blueError != nil {
        return out, errors.New("blue component invalid")
    }
    out.B = uint8(blue)
    return
}
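For the 3-digit #XXX form the question also asks about, the same strconv.ParseUint idea can be adapted by doubling each digit before parsing it. This is only a sketch building on the approach above (ParseHexColorShort is my own hypothetical name, not from any library):

```go
package main

import (
	"errors"
	"fmt"
	"image/color"
	"strconv"
)

// ParseHexColorShort handles the 4-character "#XXX" form: each digit
// is doubled ("#123" -> "#112233") and parsed as one byte.
func ParseHexColorShort(v string) (out color.RGBA, err error) {
	out.A = 0xff
	if len(v) != 4 || v[0] != '#' {
		return out, errors.New("short hex color must look like #XXX")
	}
	parts := []*uint8{&out.R, &out.G, &out.B}
	for i, p := range parts {
		// Double the digit, e.g. "1" -> "11", then parse it as one byte.
		n, err := strconv.ParseUint(string([]byte{v[i+1], v[i+1]}), 16, 8)
		if err != nil {
			return out, errors.New("invalid hex digit")
		}
		*p = uint8(n)
	}
	return out, nil
}

func main() {
	c, err := ParseHexColorShort("#123")
	fmt.Println(c, err) // {17 34 51 255} <nil>
}
```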
Edit: Thanks to Peter for the correction
I have a function that generates a random filename in the format "greetingXX.gif" where "XX" is a number between 1 and 20. See code below:
 1 function getGIF(callback) {
 2     let randomGIF;
 3     let gifName = "greeting";
 4     const GIF_EXTENSION = ".gif";
 5     const MAX_GREETING = 20;
 6     randomGIF = Math.floor(Math.random() * MAX_GREETING) + 1;
 7     if (randomGIF < 10) {
 8         gifName += "0" + randomGIF;
 9     } else {
10         gifName += randomGIF;
11     }
12     gifName += GIF_EXTENSION;
13     callback(gifName);
14 }
The function works, BUT in WebStorm I get the following warnings:
Unused Variable randomGIF (Line 2)
Unused constant MAX_GREETING (Line 5)
Element MAX_GREETING is not imported (Line 6)
Variable gifName might not have been initialised (Line 8 and Line 10)
Like I say, the function does exactly what it is supposed to do. But why am I getting these warnings? And more specifically, how do I change my code so I don't get them?
I was able to fix this by invalidating caches (File | Invalidate caches, Invalidate and restart). Thanks to lena for this!
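Unrelated to the IDE warnings themselves, the zero-padding if/else in such a function can also be written with String.prototype.padStart; just a sketch of the same behavior:

```javascript
// Sketch: same "greetingXX.gif" output, with padStart doing the
// zero-padding instead of the if/else branch.
function getGIF(callback) {
    const GIF_EXTENSION = ".gif";
    const MAX_GREETING = 20;
    // Random integer in 1..MAX_GREETING, left-padded to two digits.
    const randomGIF = Math.floor(Math.random() * MAX_GREETING) + 1;
    callback("greeting" + String(randomGIF).padStart(2, "0") + GIF_EXTENSION);
}

getGIF(name => console.log(name)); // e.g. "greeting07.gif"
```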
I'm trying to read a DHT22 sensor in ULP deep sleep, and it's not working. Can someone perhaps tell me what I'm doing wrong?
I'm reimplementing the arduino-DHT library because it works in non-ULP mode with my sensor. It looks like this:
digitalWrite(pin, LOW); // Send start signal
pinMode(pin, OUTPUT);
delayMicroseconds(800);
pinMode(pin, INPUT);
digitalWrite(pin, HIGH); // Switch bus to receive data

// We're going to read 83 edges:
// - First a FALLING, RISING, and FALLING edge for the start bit
// - Then 40 bits: RISING and then a FALLING edge per bit
// To keep our code simple, we accept any HIGH or LOW reading if it's max 85 usecs long

uint16_t rawHumidity = 0;
uint16_t rawTemperature = 0;
uint16_t data = 0;

for ( int8_t i = -3 ; i < 2 * 40; i++ ) {
    byte age;
    startTime = micros();

    do {
        age = (unsigned long)(micros() - startTime);
        if ( age > 90 ) {
            error = ERROR_TIMEOUT;
            return;
        }
    } while ( digitalRead(pin) == (i & 1) ? HIGH : LOW );

    if ( i >= 0 && (i & 1) ) {
        // Now we are being fed our 40 bits
        data <<= 1;

        // A zero max 30 usecs, a one at least 68 usecs.
        if ( age > 30 ) {
            data |= 1; // we got a one
        }
    }

    switch ( i ) {
        case 31:
            rawHumidity = data;
            break;
        case 63:
            rawTemperature = data;
            data = 0;
            break;
    }
}
Looks simple enough :D. I tried the first part with this code, but it doesn't work:
rtc_gpio_init(15);
rtc_gpio_set_direction(15, RTC_GPIO_MODE_INPUT_OUTPUT);
And in my ulp script file:
.set temp_humidity_sensor_pin, 13 // gpio 15 (adc 13)

send_start_signal:
    /* disable hold on gpio 15 (data pin) */
    WRITE_RTC_REG(RTC_IO_TOUCH_PAD3_REG, RTC_IO_TOUCH_PAD3_HOLD_S, 1, 0)
    /* switch to output mode */
    WRITE_RTC_REG(RTC_GPIO_OUT_W1TS_REG, RTC_GPIO_OUT_DATA_W1TS_S + temp_humidity_sensor_pin, 1, 1)
    /* send start signal (LOW) */
    WRITE_RTC_REG(RTC_GPIO_OUT_REG, RTC_GPIO_OUT_DATA_S + temp_humidity_sensor_pin, 1, 0)
    /* pull low for 800 microseconds (8 MHz) */
    wait 6400
    /* switch to input mode */
    WRITE_RTC_REG(RTC_GPIO_OUT_W1TC_REG, RTC_GPIO_OUT_DATA_W1TC_S + temp_humidity_sensor_pin, 1, 1)
    /* switch bus to receive data (HIGH) */
    WRITE_RTC_REG(RTC_GPIO_OUT_REG, RTC_GPIO_OUT_DATA_S + temp_humidity_sensor_pin, 1, 1)

wait_for_sensor_preparation_low:
    READ_RTC_REG(RTC_GPIO_IN_REG, RTC_GPIO_IN_NEXT_S + temp_humidity_sensor_pin, 1)
    and r0, r0, 1
    jump wait_for_sensor_preparation_low, eq

wait_for_sensor_preparation_high:
    READ_RTC_REG(RTC_GPIO_IN_REG, RTC_GPIO_IN_NEXT_S + temp_humidity_sensor_pin, 1)
    and r0, r0, 0
    jump wait_for_sensor_preparation_high, eq

    jump wake_up // <-- never called :(
Any ideas?
Your "and r0, r0, 0" instruction (near the end) always sets r0 to zero (x & 0 == 0 by definition), which means the following jump instruction will loop forever. Remember that the "eq" flag doesn't really mean "equal". It means "zero". I think you want:
wait_for_sensor_preparation_low:
    READ_RTC_REG(RTC_GPIO_IN_REG, RTC_GPIO_IN_NEXT_S + temp_humidity_sensor_pin, 1)
    and r0, r0, 1
    jump wait_for_sensor_preparation_high, eq
    jump wait_for_sensor_preparation_low

wait_for_sensor_preparation_high:
    READ_RTC_REG(RTC_GPIO_IN_REG, RTC_GPIO_IN_NEXT_S + temp_humidity_sensor_pin, 1)
    and r0, r0, 1
    jump wait_for_sensor_preparation_high, eq
By the way, I wrote a C compiler for the ULP, which should make your life easier. It's on https://github.com/jasonful/lcc