| author | 2018-05-23 14:44:39 +0100 |
|---|---|
| committer | 2018-05-28 08:45:37 +0100 |
| commit | ffaf87a429766ed80e6afee5bebea93db539620b (patch) |
| tree | d79637f4b6a564facf4b837c3ff125bb3755594e /runtime/quick_exception_handler.cc |
| parent | 5513c2b68a08109a5bfd811c7b2c8bbc66244e8e (diff) |
Optimize register mask and stack mask in stack maps.
Use BitTable to store the masks as well and move the
deduplication responsibility to the BitTable builders.
Don't generate entries for masks which are all zeros.
This saves 0.2% of .oat file size on both Arm64 and Arm.
Encode registers as (value+shift) due to trailing zeros.
This saves 1.0% of .oat file size on Arm64 and 0.2% on Arm.
Test: test-art-host-gtest
Change-Id: I636b7edd49e10e8afc9f2aa385b5980f7ee0e1f1
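
The (value+shift) encoding above exploits the trailing zeros common in callee-save register masks: only the significant bits of the mask and the shift are stored. A minimal sketch of the idea, using illustrative names rather than ART's actual BitTable/CodeInfo API:

```cpp
#include <bit>
#include <cstdint>
#include <optional>
#include <utility>

// Hypothetical sketch of the (value + shift) register-mask encoding
// described in the commit message; names are not ART's real API.
//
// Register masks usually end in a run of zero bits, so storing
// (mask >> shift, shift) takes fewer bits than storing the raw mask.

// Returns nullopt for an all-zero mask: no table entry is generated.
std::optional<std::pair<uint32_t, uint32_t>> EncodeRegisterMask(uint32_t mask) {
  if (mask == 0) {
    return std::nullopt;  // All-zero masks are omitted entirely.
  }
  const uint32_t shift = std::countr_zero(mask);  // Count trailing zeros.
  return std::make_pair(mask >> shift, shift);
}

uint32_t DecodeRegisterMask(uint32_t value, uint32_t shift) {
  return value << shift;
}
```

For example, a mask of 0b11000000 encodes as (value = 0b11, shift = 6), and an all-zero mask produces no entry at all, matching the "don't generate entries for masks which are all zeros" behavior.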
Diffstat (limited to 'runtime/quick_exception_handler.cc')
| -rw-r--r-- | runtime/quick_exception_handler.cc | 2 |
|---|---|---|

1 file changed, 1 insertion, 1 deletion
```diff
diff --git a/runtime/quick_exception_handler.cc b/runtime/quick_exception_handler.cc
index de613d3b20..26489209b8 100644
--- a/runtime/quick_exception_handler.cc
+++ b/runtime/quick_exception_handler.cc
@@ -439,7 +439,7 @@ class DeoptimizeStackVisitor FINAL : public StackVisitor {
           const uint8_t* addr = reinterpret_cast<const uint8_t*>(GetCurrentQuickFrame()) + offset;
           value = *reinterpret_cast<const uint32_t*>(addr);
           uint32_t bit = (offset >> 2);
-          if (bit < code_info.GetNumberOfStackMaskBits() && stack_mask.LoadBit(bit)) {
+          if (bit < stack_mask.size_in_bits() && stack_mask.LoadBit(bit)) {
            is_reference = true;
          }
          break;
```
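
The other half of the change, moving deduplication into the BitTable builders, amounts to the builder returning the index of an existing identical row instead of appending a duplicate. A minimal sketch of that idea, assuming a simple hash-map-based builder (not ART's real BitTable interface):

```cpp
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Illustrative builder-side deduplication: identical masks are stored
// once, and callers get back the index of the canonical copy.
class MaskTableBuilder {
 public:
  // Returns the index of `mask` in the table, adding it only if an
  // identical mask has not been seen before.
  size_t Dedup(uint32_t mask) {
    auto it = index_.find(mask);
    if (it != index_.end()) {
      return it->second;  // Reuse the existing entry.
    }
    const size_t index = rows_.size();
    rows_.push_back(mask);
    index_.emplace(mask, index);
    return index;
  }

 private:
  std::vector<uint32_t> rows_;
  std::unordered_map<uint32_t, size_t> index_;
};
```

The diff above reflects this refactoring: the bounds check now asks the stack mask itself for its size via stack_mask.size_in_bits() instead of going through code_info.GetNumberOfStackMaskBits().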