Is there a way to get the occupied memory size in bytes of BigInt numbers?
let a = BigInt(99999n)
console.log(a.length) // yields undefined
Thanks
V8 developer here. There is generally no way to determine the occupied memory size of an object, and BigInts are no exception. Why do you want to access it?
As far as the internal implementation in V8 is concerned, a BigInt has a small object header (currently two pointer sizes; this might change over time), followed by one bit of storage for every bit of the BigInt's magnitude, rounded up to a multiple of the pointer size. 99999 is a 17-bit number, so in your example let a = 99999n (the BigInt(99999n) wrapper is superfluous!), the allocated BigInt will consume (2 + Math.ceil(17/64)) * 64 = 192 bits, i.e. 24 bytes, on a 64-bit system.
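To make that arithmetic concrete, here is a back-of-the-envelope estimator following the layout described above. estimateBigIntBytes is a hypothetical helper of mine, not a V8 API, and the constants only hold for current 64-bit builds:
// Rough heap footprint of a BigInt on 64-bit V8, per the layout above.
// Hypothetical helper, not a V8 API; the header size may change over time.
function estimateBigIntBytes(value) {
  const magnitude = value < 0n ? -value : value;
  const bits = magnitude.toString(2).length;   // bit length of the magnitude
  const headerWords = 2;                       // object header: two pointers
  const payloadWords = Math.ceil(bits / 64);   // digits, rounded up to 64-bit words
  return (headerWords + payloadWords) * 8;     // 8 bytes per word
}
console.log(estimateBigIntBytes(99999n)); // 24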
It may or may not make sense to add length-related properties or methods (.bitLength?) to BigInts in the future. If you have a use case, I suggest you file an issue at https://github.com/tc39/proposal-bigint/issues so that it can be discussed.
In 2018 a Tech Lead at Google said they were working to "support buffers way beyond 4GiB" in V8 on 64-bit systems. Did that happen?
Trying to load a large file into a buffer like:
const fileBuffer = fs.readFileSync(csvPath);
in Node v12.16.1 and getting the error:
RangeError [ERR_FS_FILE_TOO_LARGE]: File size (3461193224) is greater than possible Buffer: 2147483647 bytes.
and in Node v14.12.0 (latest), getting the error:
RangeError [ERR_FS_FILE_TOO_LARGE]: File size (3461193224) is greater than 2 GB
This looks to me like a limit imposed by using 32-bit integers for buffer addressing. But I don't understand why that would be a limitation on 64-bit systems... Yes, I realize I can use streams or read from the file at a specific offset, but I have massive amounts of memory lying around, and I'm limited to 2147483647 bytes because Node is stuck at 32-bit addressing?
Surely having a high-frequency, random-access data set fully loaded into a buffer rather than streamed has performance benefits. The code involved in directing requests across some multi-buffer alternative structure is going to cost something, however small...
I can use the --max-old-space-size=16000 flag to increase the maximum memory used by Node, but I suspect the buffer cap is a hard limit in V8 itself. However, I still have to ask, since the tech lead at Google did claim they were increasing the maximum buffer size past 4GiB: is there any way in 2020 to have a buffer beyond 2147483647 bytes in Node.js?
Edit: relevant V8 tracker issue, where apparently they have been working on fixing this since at least last year: https://bugs.chromium.org/p/v8/issues/detail?id=4153
Did that happen?
Yes, V8 supports very large (many gigabytes) ArrayBuffers nowadays.
Is there any way to have a buffer beyond 2147483647 bytes in Node.js?
Yes:
$ node
Welcome to Node.js v14.12.0.
Type ".help" for more information.
> let b = Buffer.alloc(3461193224)
undefined
> b.length
3461193224
That said, it appears that the fs.readFile family has its own limit: https://github.com/nodejs/node/blob/master/lib/internal/fs/promises.js#L5
I have no idea what it would take to lift that. I suggest you file an issue on Node's bug tracker.
FWIW, Buffer has yet another limit:
> let buffer = require("buffer")
undefined
> buffer.kMaxLength
4294967295
And again, that's Node's decision, not V8's.
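If you need the whole file in a single Buffer today, one workaround is to bypass the fs.readFile family and fill a large allocation with plain read() calls. A minimal sketch, assuming the file fits in RAM and its size stays under buffer.constants.MAX_LENGTH:
// Read a file larger than fs.readFile's cap into one Buffer by looping
// over chunked readSync calls (a sketch, not production-hardened code).
const fs = require("fs");

function readBigFileSync(path) {
  const fd = fs.openSync(path, "r");
  try {
    const size = fs.fstatSync(fd).size;
    const buf = Buffer.alloc(size); // must be <= buffer.constants.MAX_LENGTH
    let offset = 0;
    while (offset < size) {
      const chunk = Math.min(size - offset, 1 << 30); // at most 1 GiB per call
      offset += fs.readSync(fd, buf, offset, chunk, offset);
    }
    return buf;
  } finally {
    fs.closeSync(fd);
  }
}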
I wonder what unit the memory usage returned by vtkPVMemoryUseInformation.GetProcMemoryUse (reference) is reported in. Is it bits, bytes, kilobytes? Where can I find this in the documentation?
Update 1
I'm calling the mentioned function from a Python script with servermanager.vtkPVMemoryUseInformation().GetProcMemoryUse(<index>). We don't have size_t in Python, right? The main question is: how can I convert the returned value into something human-readable like MB or GB?
This method internally uses vtksys::SystemInformation, which returns system RAM used in units of KiB, so divide the value by 1024 for MiB and by 1024 * 1024 for GiB.
https://github.com/Kitware/VTK/blob/master/Utilities/KWSys/vtksys/SystemInformation.hxx.in
The documentation should be improved here.
I learned that a reference takes 64 bits:
That is, a referential structure will typically use 64-bits for the memory address stored in the array, on top of whatever number of bits are used to represent the object that is considered the element.
How could I see it in action?
In [75]: patients = ["trump", "Trump", "trumP"]
In [76]: id(patients[1])
Out[76]: 4529777048
In [77]: math.log2(4529777048)
Out[77]: 32.076792897710234
It's 2**32 rather than 2**64.
With math.log2(id(obj)) you are asking: "2 raised to what power gives the address of obj in memory?". That is not how id() works. id() gives you a value that is constant and unique for the lifetime of each object; in CPython it happens to be the object's address in memory.
On 64-bit systems it makes sense to store this address in a 64-bit variable, since a 32-bit variable could not cover the full address space.
However, a 64-bit reference does not mean that every object sits at an address near 2**64. As of 2018 that would not even be possible, since x86_64 PCs expose only a 48-bit address space. That the id of your first patient was near 2**32 is (mostly) coincidence.
id() returns the address in memory, so this is not what you are looking for.
The usual way to get the size in memory of something in Python is sys.getsizeof(). However, that returns the size of the object itself; you are interested in the size of a reference to that object.
You can still calculate it more or less as follows: struct.calcsize("P") gives the size of a pointer in bytes, so 8 * struct.calcsize("P") is the size of a reference in bits. This essentially reveals whether you are on a 32-bit or 64-bit build, and from that you know what the size of a reference is. Actually measuring it by inspecting a single reference, I don't know whether that is possible.
Is it possible to write 64-bit BigInts into a Buffer in Node.js (10.7+) yet?
Or do I still have to do it in two operations?
let buf = Buffer.allocUnsafe(16);
buf.writeUInt32BE(Number(time >> 32n), 0);
buf.writeUInt32BE(Number(time & 4294967295n), 4);
I can't find anything promising in the docs, but there are other barely documented methods such as BigInt.asUintN, so I thought I'd ask.
I was just faced with a similar problem (needing to build and write 64-bit IDs consisting of a 41-bit timestamp, 13-bit node ID, and a 10-bit counter). The largest single value I was able to write to a buffer was 48-bit using buf.writeIntLE(). So I ended up building up / writing the high 48 bits, and low 16 bits independently. If there's a better way to do it, I'm not aware of it.
Did you already try this package?
https://github.com/substack/node-bigint#tobufferopts
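For what it's worth, here's a sketch of the split-write approach from the first answer; note also that Buffer.prototype.writeBigUInt64BE / writeBigUInt64LE landed in Node v12.0.0 (backported to v10.20.0), so on recent versions a single call suffices:
// Write a 64-bit BigInt big-endian: high 48 bits, then low 16 bits.
const buf = Buffer.allocUnsafe(8);
const time = 0x0123456789abcdefn; // example value

buf.writeUIntBE(Number(time >> 16n), 0, 6);   // top 48 bits (fits safely in a Number)
buf.writeUInt16BE(Number(time & 0xffffn), 6); // bottom 16 bits

// On Node v12.0.0+ (or v10.20.0+), one call does it:
// buf.writeBigUInt64BE(time, 0);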
I got a fatal error reading a file that was too big to fit in a buffer.
FATAL ERROR: v8::Object::SetIndexedPropertiesToExternalArrayData() length exceeds max acceptable value
Or,
RangeError: "size" argument must not be larger than 2147483647
at Function.Buffer.allocUnsafe (buffer.js:209:3)
If I try to allocate a 1 GB Buffer I get the same fatal error:
var oneGigInBytes = 1073741824;
var my1GBuffer = new Buffer(oneGigInBytes); // crashes
What is the maximum size of a Node.js Buffer class instance?
Maximum length of a typed array in V8 is currently set to kSmiMaxValue, which depending on the platform is either:
1 GiB - 1 byte on 32-bit systems
2 GiB - 1 byte on 64-bit systems
Relevant constant in the code is v8::internal::JSTypedArray::kMaxLength (source).
V8 team is working on increasing this even further on 64-bit platforms, where currently ArrayBuffer objects can be up to Number.MAX_SAFE_INTEGER large (2**53 - 1). See bug 4153.
This is now documented as part of Node's buffer api, the maximum size is buffer.constants.MAX_LENGTH.
buffer.constants.MAX_LENGTH <integer> The largest size allowed for a single Buffer instance.
On 32-bit architectures, this value is (2^30)-1 (~1GB).
On 64-bit architectures, this value is (2^31)-1 (~2GB).
This value is also available as buffer.kMaxLength.
So you can figure out how big it is by doing
> (require('buffer').constants.MAX_LENGTH + 1) / 2**30
2
It seems the current max buffer size is 2147483647 bytes, i.e. about 2.147 GB.
Source: https://stackoverflow.com/a/44994896/3973137 (and my own code)
The actual maximum size of a buffer changes across platforms and versions of Node.js. To find out the limit in bytes on a given platform, run this:
import buffer from "buffer";
console.log(buffer.constants.MAX_LENGTH);