I have a big hex value:
var Hex = "ad6eb61316ff805e9c94667ab04aa45aa3203eef71ba8c12afb353a5c7f11657e43f5ce4483d4e6eca46af6b3bde4981499014730d3b233420bf3ecd3287a2768da8bd401f0abd7a5a137d700f0c9d0574ef7ba91328e9a6b055820d03c98d56943139075d";
How can I convert it to a big integer in node.js? I tried searching, but all I found was
var integer = parseInt(Hex, 16);
But that doesn't work for a hex value this large.
The result is
1.1564501846672726e+243
How can I get a proper big integer back? I want to use this value as the modulus in RSA encryption. Actually, I'm not sure whether I need to convert it at all.
You need precise integers to do modular arithmetic for RSA, but the largest integer JavaScript can represent without losing precision is 9007199254740991. You cannot represent a larger integer as a Number. You would need to devise a way to do modular arithmetic on the large integer in chunks, or simply use one of the available libraries, like the big number arithmetic in JSBN, which also provides a full implementation of RSA including PKCS#1 v1.5 padding.
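As an alternative to a library, if your Node.js version has BigInt support (10.4+), a minimal sketch is to prefix the string with "0x" so BigInt parses it as base 16 (note that BigInt has no built-in modular exponentiation, so a library is still convenient for full RSA):
const n = BigInt("0x" + Hex); // Hex is the string from the question
console.log(n);               // the full integer, no precision loss
console.log(n % 65537n);      // BigInt arithmetic (%, *, **) works at this size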
I'm using node.js + express for my project. I end up with a big string like "365958975201738764" which needs to be converted into an int.
So I need to take "365958975201738764" which is a string and make it 365958975201738764 which is an int.
The problem is that whenever I do the conversion, JS loses precision.
I did some research and found out that 9007199254740991 is the biggest integer JS can work with safely, so is there a workaround for this?
Numbers in JavaScript are represented as double-precision floats, which means they have limited precision. The Number.MAX_SAFE_INTEGER constant gives the greatest integer that can be used safely: 2^53 − 1 = 9,007,199,254,740,991. Beyond that, adjacent integers can no longer be told apart.
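A quick way to see the precision loss:
> 9007199254740992 === 9007199254740993
// returns true, because 9007199254740993 cannot be represented exactly and rounds to 9007199254740992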
BigInts are a new numeric primitive in JavaScript that can represent integers with arbitrary precision. With BigInts, you can safely store and operate on large integers even beyond the safe integer limit for Numbers.
To create a BigInt, add the n suffix to any integer literal. For example, 789 is a Number and 789n is a BigInt. The global BigInt(value) function can be used to convert a Number or a numeric string into a BigInt.
> BigInt(Number.MAX_SAFE_INTEGER)
// returns 9007199254740991n
For example, you can raise MAX_SAFE_INTEGER to the power of two:
> 9007199254740991n ** 2n
// returns 81129638414606663681390495662081n
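Applied to the string from the question, BigInt also accepts a decimal string directly, so nothing is lost:
> BigInt("365958975201738764")
// returns 365958975201738764n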
For more detailed information and examples, see "BigInt: arbitrary-precision integers in JavaScript" and BigInt - JavaScript | MDN.
In my RSA algorithm implementation I need to evaluate the following expression:
b = ((m - da) * H) mod (p - 1)
While all the other elements in the expression are integers, m is a String, and that is a problem for me. How can I use a String in this case? In particular, I need this in order to sign the message m.
How can I use a String in this case?
You can't; m needs to be an integer. You need to devise an encoding: first convert the String to a byte array using a character encoding such as UTF-8, then deserialize that byte array into an integer. Have a look at how big-endian and little-endian integer encodings work.
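A minimal Node.js sketch of the big-endian idea (the function names here are just illustrative):
// Interpret the UTF-8 bytes of a string as one big-endian unsigned integer.
function stringToBigInt(str) {
  const bytes = Buffer.from(str, "utf8");
  let n = 0n;
  for (const b of bytes) {
    n = (n << 8n) + BigInt(b); // shift in each byte, most significant first
  }
  return n;
}

// The reverse direction, to check the round trip.
function bigIntToString(n) {
  const bytes = [];
  while (n > 0n) {
    bytes.unshift(Number(n & 0xffn)); // peel off the least significant byte
    n >>= 8n;
  }
  return Buffer.from(bytes).toString("utf8");
}

console.log(stringToBigInt("Hi"));   // 18537n ("H" = 0x48, "i" = 0x69 -> 0x4869)
console.log(bigIntToString(18537n)); // "Hi"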
I am very much a beginner in encryption/hashing, and I want to know how to hash a variable-length string (maybe 10 or 100 letters) to a fixed-length code, e.g. 128 bits, regardless of the underlying programming language, while keeping collisions spread relatively evenly among the bins.
Specifically, how do I deal with inputs of different lengths and make the hash code evenly distributed?
There are many different ways to do this.
For non-cryptographic applications, it's common to hash strings by iterating over the characters in sequence and applying some operation to mix in the bits of the new character with the accumulated hash bits. There are many variations on how exactly you'd carry this out. One common approach is shown here:
// Polynomial rolling hash over the characters of the input string.
unsigned long long hashString(const std::string& str) {
    const unsigned long long kSmallPrime = 31;             /* some small prime */
    const unsigned long long kLargePrime = 1000000007ULL;  /* some large prime (10^9 + 7) */
    unsigned long long result = 0;
    for (char ch : str) {
        // Cast so bytes above 0x7F don't go negative on platforms with signed char.
        result = (result * kSmallPrime + static_cast<unsigned char>(ch)) % kLargePrime;
    }
    return result;
}
More complex combination steps are possible to get better distributions. These approaches don't require the string to have any specific length; they work for input of any size. The number of bits you get back depends on the internal storage you use for mixing the bits, though there's not necessarily a strong theoretical reason (beyond empirical evidence) to believe the distribution is good.
For cryptographic applications, string hash functions are often derived from block ciphers. Constructions like Merkle–Damgård let you start with a secure compression function (often built from a block cipher) and produce a secure hash function. They work by padding the string up to a multiple of the block size using a secure padding scheme (one that ensures different strings remain different after padding), breaking the string apart into blocks, and hashing the blocks in a chain. The final output is then derived from the underlying primitive, which naturally outputs a large number of bits, and the nice distribution comes from its strength: in principle it should be indistinguishable from random.
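For example, here is a sketch using Node.js's built-in crypto module, since the rest of this page is JavaScript-centric (the hash128 name is just illustrative). MD5 produces a fixed 128-bit digest for any input length; for anything security-sensitive you would pick SHA-256 instead.
const crypto = require("crypto");

// Hash an arbitrary-length string to a fixed 128-bit (16-byte) digest.
function hash128(input) {
  return crypto.createHash("md5").update(input, "utf8").digest("hex");
}

console.log(hash128("short"));                       // 32 hex characters = 128 bits
console.log(hash128("a much longer input string"));  // also 32 hex characters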
Is there any way to encode a long string to a unique number (integer) and then decode this number back to the original string? (I mean, to reduce the size of a long string.)
The simple answer is no.
The complex answer is maybe.
What you are looking for is compression. Compression can reduce the size of the String, but there is no guarantee as to how small it can make it. In particular, you can never guarantee being able to fit it into an integer of a given size.
There are concepts like "hashing" which may help you, depending on exactly what you are trying to do with this number.
Alternatively, if you use the same string in a lot of different places, you can store it once and pass references/pointers to that single instance of the String around.
First you hash it to a string, e.g. with MD5. Then you convert the characters of the hash string into numbers according to their alphabetical position.
We have an alphanumeric string (up to 32 characters) and we want to transform it into an integer (bigint). Now we're looking for an algorithm to do that. Collisions aren't a big problem (that's why we use a bigint, to reduce them a little); the important thing is that the calculated integers are evenly distributed over the bigint range and that the calculated integer is always the same for a given string.
This page has a few. You'll need to port them to 64-bit, but that should be trivial. A C# port of the SDBM hash is here. Another page of hash functions is here.
Most programming languages come with a built-in construct or a standard library call to do this. Without knowing the language, I don't think anyone can help you.
Yes, a "hash" should be the right description for my problem. I know, that there is CRC32, but it only provides an 32-bit int (in PHP) and this 32-bit integers are at least 10 characters long, so a huge range of integer number is unused!?
Mostly we have a short string like "PX38IEK" or a 36-character UUID like "24868d36-a150-11df-8882-d8d385ffc39c", so the strings are arbitrary, yes.
It doesn't have to be reversible (so collisions aren't bad). It also doesn't matter what int a string is converted to; my only wish is that the full bigint range is used as well as possible.
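One way to do that, sketched in Node.js to match the rest of this page (PHP's hash functions can do the same; the stringToBigint64 name is just illustrative): take the first 8 bytes of a cryptographic digest and read them as a 64-bit unsigned integer. The result is deterministic for a given string and spreads roughly uniformly over the full 64-bit range; collisions remain possible but are rare.
const crypto = require("crypto");

// Map an arbitrary string to a 64-bit BigInt: hash it, then interpret
// the first 8 bytes of the digest as a big-endian unsigned integer.
function stringToBigint64(str) {
  const digest = crypto.createHash("sha256").update(str, "utf8").digest();
  return digest.readBigUInt64BE(0); // readBigUInt64BE needs Node.js 12+
}

console.log(stringToBigint64("PX38IEK"));
console.log(stringToBigint64("24868d36-a150-11df-8882-d8d385ffc39c"));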