We compared multi-digit multiplication between Python and C (GNU GMP). The calculation times on a personal computer (4 GHz) are shown below (unit: seconds). The product has 300 million decimal digits in the largest case; starting from a single multiplication at that size, each step halves the number of digits and doubles the number of multiplications, so the rightmost value in each row is the time for 128 multiplications of 2.36-million-digit numbers.

Python : 1627, 1084, 723, 482, 322, 215, 143, 96 (s)
GNU GMP : 4.5, 3.9, 3.7, 3.4, 3.2, 2.8, 2.6, 2.2 (s)

From these numbers you can see that Python uses the Karatsuba algorithm, while GMP uses FMT (an integer FFT) above roughly 1000 digits. With Karatsuba, doubling the number of digits triples the time of a single multiplication; since each step also halves the number of multiplications, each total in the Python row is about 1.5 times the one to its right. Since GMP's FFT-based cost is roughly (number of digits) x (number of multiplications) x log(number of digits), and digits x multiplications is constant across the row, its totals differ only by the log(digits) factor. See the multi-digit multiplication page at https://ecc-256.com for the Python source and detailed results for both.

Interestingly, at 20 to 200 decimal digits Python is only about twice as slow (both are extremely fast at that size). This is probably because the time there is dominated by allocating space to store the result and by pointer handling.

Python is also slow at converting an integer a to a decimal string with d10 = format(a). When the number of digits doubles, multiplication takes three times as long, but the conversion takes four times as long. The multiplication and conversion times at 1.18 million, 2.36 million, and 4.72 million digits are, in order:

Multiplication: 0.24, 0.72, 2.19 (s)
Conversion: 19, 77, 308 (s)
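As a quick check of the 1.5x pattern, the consecutive ratios of the Python row can be computed directly; this uses only the timings already reported above:

```python
# Consecutive ratios of the reported Python timings. Each total should be
# about 1.5x the one to its right: doubling the digits triples one
# Karatsuba multiplication, but only half as many multiplications are run.
times = [1627, 1084, 723, 482, 322, 215, 143, 96]
for slow, fast in zip(times, times[1:]):
    print(f"{slow} / {fast} = {slow / fast:.2f}")   # prints ~1.50 each time
```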
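For readers who want to reproduce the experiment, here is a minimal sketch of the halve-the-digits, double-the-count benchmark. It is not the program from ecc-256.com; the total size is scaled down from 300 million to 1 million digits so it finishes quickly, and TOTAL_DIGITS is a parameter of this sketch only.

```python
# Minimal benchmark sketch (assumed structure, scaled-down sizes).
import random
import time

TOTAL_DIGITS = 1_000_000        # product digits in the largest case

for step in range(8):
    count = 2 ** step                   # 1, 2, 4, ..., 128 multiplications
    digits = TOTAL_DIGITS // count      # decimal digits of each product
    half = digits // 2                  # each operand has about half of them
    a = random.randrange(10 ** (half - 1), 10 ** half)
    b = random.randrange(10 ** (half - 1), 10 ** half)

    t0 = time.perf_counter()
    for _ in range(count):
        c = a * b                       # CPython uses Karatsuba at these sizes
    elapsed = time.perf_counter() - t0
    print(f"{count:4d} x {digits:>9,}-digit products: {elapsed:.3f} s")
```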
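The conversion slowdown can be checked the same way. The sketch below times one multiplication and one d10 = format(a) conversion at doubling digit counts (again scaled down from the article's millions of digits); the roughly 3x vs. 4x growth per doubling should be visible. Note that CPython 3.11+ limits int-to-string conversion by default, so the sketch lifts that limit first.

```python
# Sketch: multiplication vs. int-to-decimal-string conversion timing.
import random
import sys
import time

if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(0)       # lift the CPython 3.11+ conversion limit

for digits in (50_000, 100_000, 200_000):
    a = random.randrange(10 ** (digits - 1), 10 ** digits)
    b = random.randrange(10 ** (digits - 1), 10 ** digits)

    t0 = time.perf_counter()
    c = a * b                           # ~3x slower each time digits double
    t_mul = time.perf_counter() - t0

    t0 = time.perf_counter()
    d10 = format(a)                     # ~4x slower each time digits double
    t_conv = time.perf_counter() - t0

    print(f"{digits:>7,} digits: mul {t_mul:.4f} s, conv {t_conv:.4f} s")
```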