Using Express, is it possible to return 0.0 (as a number) in the response JSON?
What we are observing is that if the value is 0.0, JavaScript keeps only 0 and the response JSON just has 0.
We have a consuming system which infers the data type of each JSON property from the data, and it wrongly infers the field to be Long instead of Decimal.
You can use .toFixed() to set the number of decimal places. For example:
var number = 0.000;
console.log(number); // <--- prints only 0
var preciseNumber = number.toFixed(4); // the 4 inside .toFixed(4) is the number of places after the decimal point
console.log(preciseNumber); // <--- prints '0.0000'
Do note that preciseNumber is now a string.
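Since .toFixed() yields a string, it won't by itself put a literal 0.0 into the response JSON. One workaround (a sketch; the payload and field names here are hypothetical) is to mark the decimal fields with a placeholder during JSON.stringify and patch the quotes out afterwards:

```javascript
// Sketch: JavaScript has no distinct 0.0 value, so the trailing ".0"
// has to be added at serialization time, after JSON.stringify runs.
const payload = { price: 0.0, count: 3 }; // hypothetical payload

// Fields that should be serialized as decimals (assumed to be known in advance).
const decimalFields = new Set(['price']);

const json = JSON.stringify(payload, (key, value) =>
  decimalFields.has(key) && Number.isInteger(value)
    ? `__DEC__${value}__` // temporary placeholder string
    : value
).replace(/"__DEC__(-?\d+)__"/g, '$1.0');

console.log(json); // {"price":0.0,"count":3}
```

In Express you would then send this string with res.type('json').send(json) rather than res.json(payload), since res.json would re-serialize the object and drop the .0 again.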
I'm working on my web server using Node.js and Express. I need to insert a hex value into a Buffer, starting from a binary string. My code is something like this:
var buffer = fs.readFileSync("myBINfile.dat");
setValue(buffer, 4);
function setValue(buffer, i) {
    var value1 = parseInt(buffer[i].toString(16), 16).toString(2).toString();
    value1 = "0" + value1.substring(1, value1.length);
    var hex = parseInt(value1, 2).toString(16);
    console.log(hex); // prints a8 (correct)
    buffer[i] = hex;
    console.log(buffer[i]); // prints 0 (why?)
}
The buffer contains a hex file. value1 is read correctly. How can I fix this problem?
Thanks
You're writing a string value into your Buffer object rather than a numerical value that it expects. Replace the line:
var hex = parseInt(value1, 2).toString(16);
with:
var hex = parseInt(value1, 2);
"a8" is really just the integer value 168 (you'll find if you console.log(value1) before your var hex = parseInt(value1, 2).toString(16); line, you'll get 10101000 (168 in binary)). When you write this value to the buffer, you really just want to write the integer value, not the string "a8".
The "hex value" as you put it, is actually just a number, hex is simply a presentation format. You don't store "hex numbers" or "binary numbers", you just store the numbers themselves.
As a result, doing this you'll find console.log(hex); outputs 168 instead and might think "That's wrong though!", but it's not, because 168 is a8.
The reason it works with some values but not others is that any value which results in a purely numerical hex string (e.g. "22" or "67") will be automatically converted to its numerical equivalent (22 or 67). In your case, however, "a8" cannot be converted to the number type required by the buffer, so it is discarded and 0 is written.
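This coercion is easy to see in isolation (a small sketch using Buffer.alloc, independent of the original file):

```javascript
// Buffer elements are bytes; assigning a string coerces it with Number().
// Non-numeric strings become NaN, which is stored as 0.
const buf = Buffer.alloc(2);

buf[0] = "22"; // Number("22") === 22, so 22 is stored
buf[1] = "a8"; // Number("a8") is NaN, so 0 is stored

console.log(buf[0]); // 22
console.log(buf[1]); // 0

buf[1] = parseInt("a8", 16); // store the numeric value 168 instead
console.log(buf[1]); // 168
```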
Node.js is mutating the number value passed into the test function. Any idea why, and how to fix it? Thank you!
const val1 = 61368443939717130;
const val2 = 861368443939717130;
const val3 = 161368443939717130;
const test = (v) => {
    console.log(v);
}
test(val1);
test(val2);
test(val3);
// Output:
// 61368443939717130
// 861368443939717100
// 161368443939717120
It's because the number 161368443939717130 is bigger than the maximum safe integer that JavaScript can represent, which is provided by the constant Number.MAX_SAFE_INTEGER. When working with big numbers you should always check that the values are below this limit, or you will get unexpected results. See the documentation for Number.MAX_SAFE_INTEGER for more info.
Your numbers are out of range for JavaScript.
The largest integer JavaScript can safely handle is 9007199254740991 (9,007,199,254,740,991, or roughly 9 quadrillion).
It is exposed as the Number.MAX_SAFE_INTEGER constant. The reasoning behind that number is that JavaScript uses double-precision floating-point numbers as specified in IEEE 754, which can only safely represent integers between -(2^53 - 1) and 2^53 - 1.
One option is BigInt, which has no practical upper limit, but BigInt can't handle decimals.
Also note that converting from a Number to a BigInt and back again can lose precision.
console.log(Number.MAX_SAFE_INTEGER)
// 9007199254740991
console.log(Number.MAX_SAFE_INTEGER + 10)
// 9007199254741000
console.log(BigInt(Number.MAX_SAFE_INTEGER) + 10n)
// 9007199254741001n
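Applied to the original snippet, the same idea looks like this (a sketch using BigInt literals, which keep every digit exact):

```javascript
// BigInt literals (note the trailing "n") are not limited to 2^53 - 1.
const val1 = 61368443939717130n;
const val2 = 861368443939717130n;
const val3 = 161368443939717130n;

const test = (v) => {
  console.log(v.toString());
};

test(val1); // 61368443939717130
test(val2); // 861368443939717130
test(val3); // 161368443939717130
```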
I am trying to do the relatively simple task of writing 1000000000000000000000 (as a number, without quotes) to a JSON file using Node.js. I have this number defined as a constant:
const NUM = 1000000000000000000000
If I simply write it, it comes out as 1e+21. Everything else I tried (big number...) ends up converting it to a string.
Can someone please help?
Update
To clarify what I am after, please look at the code below:
// 22 digits
const bigNum = 1000000000000000000000;
console.log(JSON.stringify(bigNum)); // Output: 1e+21
// 19 digits
const bigNum2 = 1000000000000000000;
console.log(JSON.stringify(bigNum2)); // Output: 1000000000000000000
What I would like is to be able to output 1000000000000000000000 in the first example instead of 1e+21.
If I simply write it, it comes out as 1e+21. Everything else I tried (big number...) ends up converting it to a string.
JSON will always be a string in its serialized representation. It also does not impose any limits on number sizes, although you might of course hit limits imposed by the runtime you use to serialize your data into JSON:
> let num = JSON.stringify(Number.MAX_SAFE_INTEGER + 1)
undefined
> JSON.parse(num)
9007199254740992
> Number.MAX_SAFE_INTEGER + 1
9007199254740992
The exponential notation does not change the value but just saves bytes and is easier to read:
> 1e+21 === 1000000000000000000000
true
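If the goal really is to get all 22 digits, unquoted, into the JSON text, one sketch is to hold the value as a BigInt and splice its digits into the output by hand (the key name here is hypothetical):

```javascript
// BigInt keeps all 22 digits; ${bigNum} interpolates them as plain digits,
// bypassing Number's exponential notation entirely.
const bigNum = 1000000000000000000000n;
const json = `{"value":${bigNum}}`;

console.log(json); // {"value":1000000000000000000000}
// fs.writeFileSync('out.json', json) would then write the bare number.
```

Be aware that most consumers, including JSON.parse, will read that value back as a double and lose precision again, so the reader needs BigInt-aware parsing for this to be useful end to end.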
I am trying to form a number with a decimal separator from a string, in order to be able to apply the .toLocaleString() method, which cannot be applied to strings.
const initialString = 122; // number
const decimalStr = initialString.toFixed(initialString); // string
const formattedStr = decimalStr.toLocaleString('de-DE'); // error, decimalStr is a string
If any conversion is applied to decimalStr, the decimal digits are lost (i.e. decimalStr = +decimalStr or decimalStr = Number(decimalStr)). That's the problem: I need to keep the decimal digits.
How can I keep the decimal digits and make .toLocaleString() see the calling value as a number?
The Number.toLocaleString method accepts a second argument called options.
You can use this argument to control the number of fraction digits. Therefore, your call would be:
const initialNumber = 122;
const formattedStr = initialNumber.toLocaleString('de-DE', {
    minimumFractionDigits: 2,
    maximumFractionDigits: 2
});
For more details, check the documentation for Number.prototype.toLocaleString.
Also, notice that both toFixed and the maximumFractionDigits have limits: 100 and 20, respectively. So the example you provide fails on the second line.
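Putting it together (a quick check; de-DE uses a comma as the decimal separator, assuming a Node build with full ICU locale data):

```javascript
const initialNumber = 122;

// Both fraction-digit bounds set to 2, so exactly two decimals are shown.
const formattedStr = initialNumber.toLocaleString('de-DE', {
  minimumFractionDigits: 2,
  maximumFractionDigits: 2
});

console.log(formattedStr); // "122,00"
```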
I use a random string generator, based on this:
http://stackoverflow.com/questions/1344221/how-can-i-generate-random-8-character-alphanumeric-strings-in-c
Every now and then, it will generate a 20-char string like
aaaaaaaaaaaaaaaaaaaa when it should generate a 20-char string full of random chars (e.g. 63TSRVvbVDJiMNwneB5l), as if the C# Random object returned 0 every time over the 20 iterations.
public static string GetRandomAlphaNumericString(int charCount)
{
    var result = new string(
        Enumerable.Repeat(CHARS, charCount)
            .Select(s =>
            {
                var nxt = SHARED_RANDOM.Next(s.Length);
                var rval = s[nxt];
                return rval;
            })
            .ToArray());
    return result;
}

const string CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
public static Random SHARED_RANDOM = new Random(Guid.NewGuid().GetHashCode());
It doesn't make sense to me; what could I be overlooking in this generator? The generator works OK, but every now and then, very rarely, it produces a streak of aaaaaaa strings for a short period of time.
I don't see a bug in the code, so to me it's one of two things: either the C# Random object is acting funny in some situations, returning 0 for a short period of time, or the CHARS const string changes from "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789" to "a" in the assembly for some reason, for a short period of time.
I can't believe it could be either of the above; I prefer to believe that I'm overlooking something. Do you see, or can you imagine, how this could possibly be happening?
UPDATE OCT 23 / 2017
I coded a test routine so that if GetRandomAlphaNumericString produces the evil aaaaa string, it throws an error, but logs a quick report first.
The report generates 3 random numbers in a row and adds them to the log:
SHARED_RANDOM.Next(20);
SHARED_RANDOM.Next(20);
SHARED_RANDOM.Next(20);
All 3 numbers produced are 0. Nice, huh?
This tells me that the issue is not in GetRandomAlphaNumericString, but in the Random object itself.
There is some sort of corruption that causes the object to return those results, so basically it could be anything?!