How to convert hexadecimal string to signed integer? [duplicate]

I'm pulling in data in the form of long hexadecimal strings, which I need to convert to decimal notation, truncate by 18 decimal places, and then serve up in JSON.
For example I may have the hex string:
"0x00000000000000000000000000000000000000000000d3c21bcecceda1000000"
At first I attempted to use ParseUint(), but since it only supports values that fit in 64 bits, my number ends up being way too big.
This example, after conversion and truncation, results in 10^6.
However, there are instances where this number can be up to 10^12 (meaning 10^30 pre-truncation!).
What is the best strategy to attack this?

Use math/big for working with numbers larger than 64 bits.
From the Int.SetString example:
s := "d3c21bcecceda1000000"
i := new(big.Int)
i.SetString(s, 16)
fmt.Println(i)
https://play.golang.org/p/vf31ce93vA
The math/big types also support the encoding.TextMarshaler and fmt.Scanner interfaces.
For example
i := new(big.Int)
fmt.Sscan("0x000000d3c21bcecceda1000000", i)
Or
i := new(big.Int)
fmt.Sscanf("0x000000d3c21bcecceda1000000", "0x%x", i)
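Putting the pieces together for the original question, here is a minimal sketch, assuming the "truncate 18 decimal places" step means integer division by 10^18 and that a plain base-10 number is what should land in the JSON:

package main

import (
    "encoding/json"
    "fmt"
    "math/big"
    "strings"
)

func main() {
    s := "0x00000000000000000000000000000000000000000000d3c21bcecceda1000000"

    // SetString with base 16 does not accept the "0x" prefix, so strip it first.
    i := new(big.Int)
    if _, ok := i.SetString(strings.TrimPrefix(s, "0x"), 16); !ok {
        panic("not a valid hex string")
    }

    // Truncate 18 decimal places via integer division by 10^18.
    i.Quo(i, new(big.Int).Exp(big.NewInt(10), big.NewInt(18), nil))

    // *big.Int implements json.Marshaler, so it serializes as a plain decimal number.
    out, err := json.Marshal(i)
    if err != nil {
        panic(err)
    }
    fmt.Println(string(out)) // 1000000
}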

Related

Determine number length in a file in TwinCAT 3

I am reading a file on my computer that contains the following information:
cellcount=011 (INT)
currentdensity=1.112 (REAL)
REAL2=2.1145 (REAL)
INT1=41823 (INT)
REAL3=4.2023 (REAL)
INT=11 (INT)
Currently I am storing the ReadBuffer in a STRING(1000) because I thought that was the easiest way to manipulate the content. I want to be able to extract the numbers shown above and store them in variables. I want it to be dynamic, so users can enter any number (no REALs into INTs, but otherwise anything goes).
So far I have looked at the string functions of TwinCAT 3, and using MID() and FIND() I can make something work, but then I need to know the length of the numbers. Like this:
test.CellCount := STRING_TO_INT(MID(sTest, number_of_chars, FIND(sTest, 'cellcount=') + 10));
Any idea how to make this dynamic?
Square brackets after a string variable will allow you to extract the ASCII code of a particular character. Knowing that digits 0-9 are ASCII codes 48-57, you can iterate through the characters following your search string until no more digits are found. For example:
loc1 := FIND(sTest, 'cellcount=') + 9; // FIND is 1-based but [] indexing is 0-based, so +9 lands on the first digit
FOR i := loc1 TO (loc1 + 10) DO // 10 = maximum length of the number
    IF (sTest[i] >= 48 AND sTest[i] <= 57) OR sTest[i] = 46 THEN // digit 0-9 or decimal point
        loc2 := i;
    ELSE
        EXIT;
    END_IF
END_FOR
number_of_chars := loc2 - loc1 + 1;
ASCII code 46 is the decimal point, included so that floating-point values can be parsed as well.
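The same find-and-scan idea, sketched in Go for comparison (extractNumber is a hypothetical helper, not part of any library):

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// extractNumber mirrors the ST logic above: find the key, then scan
// forward over digits (ASCII 48-57) and the decimal point (46).
func extractNumber(s, key string) (float64, error) {
    loc1 := strings.Index(s, key)
    if loc1 < 0 {
        return 0, fmt.Errorf("key %q not found", key)
    }
    loc1 += len(key)
    loc2 := loc1
    for loc2 < len(s) && (s[loc2] >= '0' && s[loc2] <= '9' || s[loc2] == '.') {
        loc2++
    }
    return strconv.ParseFloat(s[loc1:loc2], 64)
}

func main() {
    v, _ := extractNumber("cellcount=011", "cellcount=")
    fmt.Println(v) // 11
}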

Convert large decimal number to hexadecimal notation

When creating a String object in Swift you can use a String Format Specifier to convert an integer to hexadecimal notation.
print(String(format:"%x", 1234))
// output: 4d2
// expected output: 4d2
But when numbers become bigger, the output is not as expected.
print(String(format:"%x", 12345678901234))
// output: 73ce2ff2
// expected output: b3a73ce2ff2
It seems that the output of String(format:"%x", n) is truncated at 8 characters. I don't think in hexadecimal natively, which makes debugging hard. I have seen answers for other programming languages explaining that you need to break up the large integer into parts, but that seems wrong to me.
What am I doing wrong here?
What is the right way to convert decimal numbers to hexadecimal numbers in Swift?
You need to use %lx or %llx
print(String(format:"%lx", 12345678901234))
b3a73ce2ff2
Table 2 on the site you linked specifies them:
l - Length modifier specifying that a following d, o, u, x, or X conversion specifier applies to a long or unsigned long argument.
A plain x is for unsigned 32-bit integers, which only go up to 2^32 - 1 (4,294,967,295).

Golang Random Sha256

I am having trouble getting a random sha256 hash using a timestamp seed:
https://play.golang.org/p/2-_VPe3oFr (don't test this on the playground: the time is always the same there)
Does anyone understand why it always returns the same result? (non-playground runs)
Because you do this:
timestamp := time.Now().Unix()
log.Print(fmt.Sprintf("%x", sha256.Sum256([]byte(string(timestamp))))[:45])
You print the hex form of the SHA-256 digest of the data:
[]byte(string(timestamp))
What is it exactly?
timestamp is of type int64; converting it to string falls under this rule from the spec:
Converting a signed or unsigned integer value to a string type yields a string containing the UTF-8 representation of the integer. Values outside the range of valid Unicode code points are converted to "\uFFFD".
But its value is not a valid Unicode code point, so the result is always "\uFFFD", which is efbfbd in UTF-8. Your code therefore always prints the SHA-256 digest of the bytes []byte{0xef, 0xbf, 0xbd}, which is (or rather its first 45 hex digits, because you slice the result):
83d544ccc223c057d2bf80d3f2a32982c32c3c0db8e26
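If the goal really was a digest derived from the timestamp, the int64 has to be turned into bytes explicitly, for example via strconv; a minimal sketch:

package main

import (
    "crypto/sha256"
    "fmt"
    "strconv"
    "time"
)

func main() {
    timestamp := time.Now().Unix()
    // FormatInt gives the decimal text of the number; string(timestamp) does not.
    data := []byte(strconv.FormatInt(timestamp, 10))
    fmt.Printf("%x\n", sha256.Sum256(data))
}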
I guess you wanted to generate some random bytes and calculate the SHA-256 of that, something like this:
rand.Seed(time.Now().UnixNano()) // seed math/rand so each run differs
data := make([]byte, 10)
for i := range data {
    data[i] = byte(rand.Intn(256))
}
fmt.Printf("%x", sha256.Sum256(data))
Note that if you'd use the crypto/rand package instead of math/rand, you could fill a slice of bytes with random values using the rand.Read() function, and you don't even have to set seed (and so you don't even need the time package):
data := make([]byte, 10)
if _, err := rand.Read(data); err == nil {
    fmt.Printf("%x", sha256.Sum256(data))
}
Yes. This:
string(timestamp)
does not do what you think it does; see the spec. Long story short, the timestamp is not a valid Unicode code point, so the result is always "\uFFFD".

How to convert strings to array of byte and back

I must write strings to a binary MIDI file. The standard requires one to know the length of the string in bytes. As I want to write for mobile as well, I cannot use AnsiString, which was a good way to ensure that the string used one byte per character; that simplified things. I tested the following code:
type
  TByte = array of Byte;

function TForm3.convertSB(arg: string): TByte;
var
  i: Int32;
begin
  Label1.Text := IntToStr(SizeOf(Char)); // IntToStr needed to assign an integer to a string property
  for i := Low(arg) to High(arg) do
  begin
    Label1.Text := Label1.Text + ' ' + IntToStr(Ord(arg[i]));
  end;
end; // convertSB //

convertSB('MThd');
convertSB ('MThd');
It returns 2 77 84 104 100 (as the label text) on Windows as well as on Android. Does this mean that Delphi treats strings as UTF-8 by default? That would greatly simplify things, but I couldn't find it in the help. And what is the best way to convert this to an array of bytes? Read each character, test whether it takes 1, 2 or 4 bytes, and allocate that much space in the array? And for converting back to a character: just read the array of bytes until a byte < 128 is encountered?
Delphi strings are encoded internally as UTF-16. There was a big clue in the fact that SizeOf(Char) is 2.
The reason all your characters have ordinals in the ASCII range is that UTF-16 extends ASCII: characters 0 to 127 have the same ordinal values in UTF-16 as in ASCII, and all of your characters happen to be ASCII characters.
That said, you do not need to worry about the internal storage. You simply convert between string and byte array using the TEncoding class. For instance, to convert to UTF-8 you write:
bytes := TEncoding.UTF8.GetBytes(str);
And in the opposite direction:
str := TEncoding.UTF8.GetString(bytes);
The class supports many other encodings, as described in the documentation. It's not clear from the question which encoding you need to use. Hopefully you can work the rest out from here.

How do int-to-string casts work in Go?

I only started Go today, so this may be obvious but I couldn't find anything on it.
What does var x uint64 = 0x12345678; y := string(x) give y?
I know var x uint8 = 65; y := string(x) would give y the byte 65, character A, and common sense would suggest (since types larger than uint8 are allowed to be converted to strings) that the bytes would simply be packed into native byte order (i.e. little-endian) and assigned to the variable.
This does not seem to be the case:
hex.EncodeToString([]byte(y)) ==> "efbfbd"
My first thought was that this is an address with the last byte left off because of some weird null-terminator thing, but if I allocate two variables with two different values and print them out, I get the same result.
var x, x2 uint64 = 0x10000000, 0x20000000
y, y2 := string(x), string(x2)
fmt.Println(hex.EncodeToString([]byte(y))) // "efbfbd"
fmt.Println(hex.EncodeToString([]byte(y2))) // "efbfbd"
Maddeningly, I can't find the implementation of the string type anywhere, although I probably haven't looked hard enough.
This is covered in the Spec: Conversions: Conversions to and from a string type:
Converting a signed or unsigned integer value to a string type yields a string containing the UTF-8 representation of the integer. Values outside the range of valid Unicode code points are converted to "\uFFFD".
So effectively when you convert a numeric value to string, it can only yield a string having one rune (character). And since Go stores strings as the UTF-8 encoded byte sequences in memory, that is what you will see if you convert your string to []byte:
Converting a value of a string type to a slice of bytes type yields a slice whose successive elements are the bytes of the string.
When you try to convert the 0x12345678, 0x10000000 and 0x20000000 values to string, since they are outside the range of valid Unicode code points, as per the spec they are converted to "\uFFFD", which in UTF-8 encoding is []byte{239, 191, 189}; when encoded to a hex string:
fmt.Println(hex.EncodeToString([]byte("\uFFFD"))) // Output: efbfbd
Or simply:
fmt.Printf("%x", "\uFFFD") // Output: efbfbd
Read the blog post Strings, bytes, runes and characters in Go for more details about string internals.
And by the way, since Go 1.5 the Go runtime is implemented (mostly) in Go, so these conversions are now implemented in Go too and can be found in the runtime package: see runtime/string.go and look for the intstring() function.
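If what you actually wanted was the integer packed in native (little-endian) byte order, as the question guessed string conversion would do, the encoding/binary package does that explicitly; a minimal sketch:

package main

import (
    "encoding/binary"
    "encoding/hex"
    "fmt"
)

func main() {
    var x uint64 = 0x12345678
    buf := make([]byte, 8)
    binary.LittleEndian.PutUint64(buf, x) // pack x in little-endian byte order
    fmt.Println(hex.EncodeToString(buf))  // "7856341200000000"
}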
