author     Dmytro Lytovchenko <[email protected]>   2016-01-21 17:20:47 +0100
committer  Dmytro Lytovchenko <[email protected]>   2016-02-02 11:32:58 +0100
commit     c96b6c2f58642b457d806c0a8a5bed03d16e35f1 (patch)
tree       6b70c74b6d5e96559811589c4fb5ccef19ac3d6d /erts/emulator/beam/big.h
parent     0236a875929729eca1933cbb854267f584734b26 (diff)
Better list_to_integer
The conversion now uses the whole width of a signed word (Sint) as its
accumulator, which halves the number of multiplications needed to parse long
integers. The new code is 2-3 times faster than the old code for large inputs
(tens and hundreds of digits); behavior should not change for small inputs.
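A minimal sketch of the batching idea (hypothetical names and types, not the actual code in erts/emulator/beam/big.c): several characters are folded into one machine word, and the bignum accumulator is updated once per chunk instead of once per character.

    #include <stddef.h>

    /* Hedged sketch of digit batching; MySint stands in for the emulator's
     * word-sized Sint. Character validation is omitted; the caller picks
     * max_digits small enough that the accumulator cannot overflow
     * (e.g. 18 decimal digits on a 64-bit word). */
    typedef long MySint;

    /* Accumulate up to max_digits base-'base' characters into one word.
     * Returns the number of characters consumed; *chunk receives their value. */
    static size_t parse_chunk(const char *s, size_t len, unsigned base,
                              size_t max_digits, MySint *chunk)
    {
        MySint acc = 0;
        size_t i;
        for (i = 0; i < len && i < max_digits; i++) {
            int c = s[i];
            int d = (c <= '9') ? c - '0' : (c | 0x20) - 'a' + 10;
            acc = acc * (MySint)base + d;
        }
        *chunk = acc;
        return i;
    }

Judging by the removed D_DECIMAL_EXP/D_DECIMAL_BASE defines, the old code batched 9 decimal digits (a 32-bit-sized chunk) at a time; a full 64-bit Sint fits roughly twice as many digits per chunk, which matches the halved multiplication count.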
The test was run 10k times with a GC forced between attempts; "el" is the input length in list elements (digits).
Was (R17):
720 el base 10: 0.14682 sec; base 16: 0.192722 sec; base 36: 0.337118 sec.
2800 el base 10: 1.794133 sec; base 16: 2.735106 sec; base 36: 4.761108 sec.
6500 el base 10: 9.316434 sec; base 16: 14.109469 sec; base 36: 25.319263 sec.
Now (R19 Dev):
720 el base 10: 0.10265 sec; base 16: 0.10851 sec; base 36: 0.160478 sec.
2800 el base 10: 1.002793 sec; base 16: 1.360649 sec; base 36: 2.174309 sec.
6500 el base 10: 4.722197 sec; base 16: 6.60522 sec; base 36: 10.552795 sec.
Added tests for corner cases and sign-bit corruption. Replaced the macros with
inline functions and hid them inside the C file to avoid polluting the global
namespace. Fixed an old bug in #define LG2_LOOKUP: it was replaced with an
inline function, and the table was recalculated for all bases 2 to 36 (it
previously covered bases 2 to 64).
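The macro-to-inline change can be sketched roughly as below; the names are hypothetical and the table values are only a coarse bits-per-digit ceiling for bases 2..36, not the actual OTP table.

    /* Illustrative sketch: a lookup table kept private to the .c file plus a
     * static inline accessor replacing a LG2_LOOKUP-style macro. */
    static const unsigned char bits_per_digit[36 - 2 + 1] = {
        /* base 2 */ 1,
        /* 3..4   */ 2, 2,
        /* 5..8   */ 3, 3, 3, 3,
        /* 9..16  */ 4, 4, 4, 4, 4, 4, 4, 4,
        /* 17..32 */ 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5,
        /* 33..36 */ 6, 6, 6, 6
    };

    /* Inline accessor: keeps the table out of the global namespace and can
     * range-check the base, which a bare macro could not do as safely. */
    static inline unsigned lookup_bits_per_digit(unsigned base)
    {
        return (base >= 2 && base <= 36) ? bits_per_digit[base - 2] : 0;
    }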
Diffstat (limited to 'erts/emulator/beam/big.h')
-rw-r--r--   erts/emulator/beam/big.h   22
1 file changed, 11 insertions, 11 deletions
diff --git a/erts/emulator/beam/big.h b/erts/emulator/beam/big.h
index 85807d6eea..9c92de6b55 100644
--- a/erts/emulator/beam/big.h
+++ b/erts/emulator/beam/big.h
@@ -54,9 +54,6 @@ typedef Uint32 ErtsHalfDigit;
 #error "can not determine machine size"
 #endif
 
-#define D_DECIMAL_EXP 9
-#define D_DECIMAL_BASE 1000000000
-
 typedef Uint dsize_t;	 /* Vector size type */
 
 #define D_EXP (ERTS_SIZEOF_ETERM*8)
@@ -173,12 +170,15 @@ Eterm erts_sint64_to_big(Sint64, Eterm **);
 
 Eterm erts_chars_to_integer(Process *, char*, Uint, const int);
 
-#define LTI_BAD_STRUCTURE 0
-#define LTI_NO_INTEGER 1
-#define LTI_SOME_INTEGER 2
-#define LTI_ALL_INTEGER 3
-
-int do_list_to_integer(Process *p, Eterm orig_list,
-                       Eterm *integer, Eterm *rest);
-
+/* How list_to_integer classifies the input, was it even a string? */
+typedef enum {
+    LTI_BAD_STRUCTURE = 0,
+    LTI_NO_INTEGER = 1,
+    LTI_SOME_INTEGER = 2,
+    LTI_ALL_INTEGER = 3
+} LTI_result_t;
+
+LTI_result_t erts_list_to_integer(Process *BIF_P, Eterm orig_list,
+                                  const Uint base,
+                                  Eterm *integer_out, Eterm *tail_out);
 #endif
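For context, a hypothetical caller of the new entry point might look like the sketch below. The includes, THE_NON_VALUE and am_error are OTP emulator internals; the function itself is invented for illustration and is not the code the BIFs actually use.

    #include "sys.h"
    #include "erl_vm.h"
    #include "global.h"
    #include "big.h"

    /* Illustration only: dispatch on the LTI_result_t classification
     * returned by erts_list_to_integer(). */
    static Eterm classify_list_to_integer(Process *p, Eterm list, Uint base)
    {
        Eterm integer = THE_NON_VALUE;
        Eterm tail = THE_NON_VALUE;

        switch (erts_list_to_integer(p, list, base, &integer, &tail)) {
        case LTI_BAD_STRUCTURE:   /* not a proper list of characters */
        case LTI_NO_INTEGER:      /* a list, but no digits at the front */
            return am_error;
        case LTI_SOME_INTEGER:    /* parsed a prefix; 'tail' holds the rest */
        case LTI_ALL_INTEGER:     /* the whole list was an integer */
            return integer;
        }
        return am_error;          /* unreachable; silences compiler warnings */
    }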