Bignum
/big'nuhm/ (Originally from MIT MacLISP) A multiple-precision computer representation for very large integers.
Most computer languages provide a data type called "integer", but such computer integers are usually limited in size: typically they must be smaller than 2^31 (2,147,483,648) or, on a bitty box, 2^15 (32,768). If you want to work with numbers larger than that, you have to use floating-point numbers, which are usually accurate to only six or seven decimal places. Computer languages that provide bignums can instead perform exact calculations on very large numbers, such as 1000! (the factorial of 1000, which is 1000 times 999 times 998 times ... times 2 times 1). For example, the MacLISP system used bignums to compute the exact value of 1000!, a 2,568-digit integer.
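The same calculation is easy to reproduce in any language with bignum support. The sketch below uses Python, whose built-in int type is an arbitrary-precision integer; it is an illustration of the idea, not the original MacLISP code.

    # Compute 1000! exactly using Python's native bignum integers.
    # Every multiplication is exact: no overflow, no rounding.
    result = 1
    for n in range(2, 1001):
        result *= n

    print(result)              # prints all 2,568 digits of 1000!
    print(len(str(result)))    # -> 2568

By contrast, a 32-bit signed integer already overflows at 13! (6,227,020,800 exceeds 2,147,483,647).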