WebKit Bugzilla
Attachment 369919 Details for Bug 197732: [JSC] Add CodeOffset, 32bit offset in JIT memory pool
Description: Patch
Filename: bug-197732-20190514192221.patch
MIME Type: text/plain
Creator: Yusuke Suzuki
Created: 2019-05-14 19:22:21 PDT
Size: 58.68 KB
Flags: patch, obsolete
>Subversion Revision: 245315 >diff --git a/Source/JavaScriptCore/ChangeLog b/Source/JavaScriptCore/ChangeLog >index 312863185dc8be5ea8d89570fd5919333e6c814d..443d5828aa25d0da2f13cb7b5d5f3c42f9df0bbc 100644 >--- a/Source/JavaScriptCore/ChangeLog >+++ b/Source/JavaScriptCore/ChangeLog >@@ -1,3 +1,100 @@ >+2019-05-14 Yusuke Suzuki <ysuzuki@apple.com> >+ >+ [JSC] Add CodeOffset, 32bit offset in JIT memory pool >+ https://bugs.webkit.org/show_bug.cgi?id=197732 >+ >+ Reviewed by NOBODY (OOPS!). >+ >+ * CMakeLists.txt: >+ * JavaScriptCore.xcodeproj/project.pbxproj: >+ * bytecode/ByValInfo.h: >+ (JSC::ByValInfo::ByValInfo): >+ * bytecode/CallLinkInfo.cpp: >+ (JSC::CallLinkInfo::callReturnLocation): >+ (JSC::CallLinkInfo::patchableJump): >+ (JSC::CallLinkInfo::hotPathBegin): >+ (JSC::CallLinkInfo::slowPathStart): >+ * bytecode/CallLinkInfo.h: >+ (JSC::CallLinkInfo::hotPathOther): >+ * bytecode/InlineAccess.cpp: >+ (JSC::linkCodeInline): >+ (JSC::InlineAccess::rewireStubAsJump): >+ * bytecode/StructureStubInfo.cpp: >+ (JSC::StructureStubInfo::StructureStubInfo): >+ * bytecode/StructureStubInfo.h: >+ (JSC::StructureStubInfo::baseGPR const): >+ (JSC::StructureStubInfo::slowPathCallLocation): >+ (JSC::StructureStubInfo::doneLocation): >+ (JSC::StructureStubInfo::slowPathStartLocation): >+ (JSC::StructureStubInfo::patchableJump): >+ (JSC::StructureStubInfo::valueRegs const): >+ * dfg/DFGJumpReplacement.cpp: >+ (JSC::DFG::JumpReplacement::fire): >+ (JSC::DFG::JumpReplacement::installVMTrapBreakpoint): >+ * dfg/DFGJumpReplacement.h: >+ * dfg/DFGOSREntry.h: >+ * dfg/DFGOSRExit.cpp: >+ (JSC::DFG::OSRExit::codeLocationForRepatch const): >+ * dfg/DFGOSRExit.h: >+ * ftl/FTLLazySlowPath.cpp: >+ (JSC::FTL::LazySlowPath::generate): >+ * ftl/FTLLazySlowPath.h: >+ (JSC::FTL::LazySlowPath::patchableJump const): >+ (JSC::FTL::LazySlowPath::done const): >+ * ftl/FTLOSRExit.cpp: >+ (JSC::FTL::OSRExit::codeLocationForRepatch const): >+ * ftl/FTLOSRExit.h: >+ * jit/CodeOffset.h: Added. 
>+ (JSC::CodeOffsetCommon:: const): >+ (JSC::CodeOffsetCommon::differenceBetweenCodeOffset): >+ (JSC::CodeOffsetCommon::isValid const): >+ (JSC::CodeOffsetCommon::CodeOffsetCommon): >+ (JSC::CodeOffsetCommon::retagged const): >+ (JSC::CodeOffsetInstruction::CodeOffsetInstruction): >+ (JSC::CodeOffsetLabel::CodeOffsetLabel): >+ (JSC::CodeOffsetLabel::location const): >+ (JSC::CodeOffsetLabel::retagged): >+ (JSC::CodeOffsetLabel:: const): >+ (JSC::CodeOffsetJump::CodeOffsetJump): >+ (JSC::CodeOffsetJump::location const): >+ (JSC::CodeOffsetJump::retagged): >+ (JSC::CodeOffsetCall::CodeOffsetCall): >+ (JSC::CodeOffsetCall::location const): >+ (JSC::CodeOffsetCall::retagged): >+ (JSC::CodeOffsetNearCall::CodeOffsetNearCall): >+ (JSC::CodeOffsetNearCall::location const): >+ (JSC::CodeOffsetNearCall::callMode): >+ (JSC::CodeOffsetDataLabel32::CodeOffsetDataLabel32): >+ (JSC::CodeOffsetDataLabel32::location const): >+ (JSC::CodeOffsetDataLabelCompact::CodeOffsetDataLabelCompact): >+ (JSC::CodeOffsetDataLabelCompact::location const): >+ (JSC::CodeOffsetDataLabelPtr::CodeOffsetDataLabelPtr): >+ (JSC::CodeOffsetDataLabelPtr::location const): >+ (JSC::CodeOffsetConvertibleLoad::CodeOffsetConvertibleLoad): >+ (JSC::CodeOffsetConvertibleLoad::location const): >+ (JSC::CodeOffsetCommon<tag>::instructionAtOffset): >+ (JSC::CodeOffsetCommon<tag>::labelAtOffset): >+ (JSC::CodeOffsetCommon<tag>::jumpAtOffset): >+ (JSC::CodeOffsetCommon<tag>::callAtOffset): >+ (JSC::CodeOffsetCommon<tag>::nearCallAtOffset): >+ (JSC::CodeOffsetCommon<tag>::dataLabelPtrAtOffset): >+ (JSC::CodeOffsetCommon<tag>::dataLabel32AtOffset): >+ (JSC::CodeOffsetCommon<tag>::dataLabelCompactAtOffset): >+ (JSC::CodeOffsetCommon<tag>::convertibleLoadAtOffset): >+ * jit/JITCodeMap.h: >+ (JSC::JITCodeMap::Entry::codeLocation): >+ * jit/JITMathIC.h: >+ (JSC::isProfileEmpty): >+ * jit/JITOpcodes.cpp: >+ (JSC::JIT::privateCompileHasIndexedProperty): >+ * jit/JITOpcodes32_64.cpp: >+ (JSC::JIT::privateCompileHasIndexedProperty): >+ * jit/JITPropertyAccess.cpp: >+ (JSC::JIT::privateCompileGetByVal): >+ (JSC::JIT::privateCompileGetByValWithCachedId): >+ (JSC::JIT::privateCompilePutByVal): >+ (JSC::JIT::privateCompilePutByValWithCachedId): >+ > 2019-05-14 Keith Miller <keith_miller@apple.com> > > Fix issue with byteOffset on ARM64E >diff --git a/Source/JavaScriptCore/CMakeLists.txt b/Source/JavaScriptCore/CMakeLists.txt >index 5d6d1701b34fd7f2d03f58b672fe0297d2c47d57..2c9792a5d88d2bd42af6956366222736e2d3e487 100644 >--- a/Source/JavaScriptCore/CMakeLists.txt >+++ b/Source/JavaScriptCore/CMakeLists.txt >@@ -681,6 +681,7 @@ set(JavaScriptCore_PRIVATE_FRAMEWORK_HEADERS > > jit/AssemblyHelpers.h > jit/CCallHelpers.h >+ jit/CodeOffset.h > jit/ExecutableAllocator.h > jit/FPRInfo.h > jit/GCAwareJITStubRoutine.h >diff --git a/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj b/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj >index 7e21e0b07c532cc7a81b7510041ab2224c9d2e71..aeafbdf0846cd15b28211665c3fe66fd44ac0a6f 100644 >--- a/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj >+++ b/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj >@@ -1784,6 +1784,7 @@ > E35CA1541DBC3A5C00F83516 /* DOMJITHeapRange.h in Headers */ = {isa = PBXBuildFile; fileRef = E35CA1521DBC3A5600F83516 /* DOMJITHeapRange.h */; settings = {ATTRIBUTES = (Private, ); }; }; > E35CA1561DBC3A5F00F83516 /* DOMJITAbstractHeap.h in Headers */ = {isa = PBXBuildFile; fileRef = E35CA1501DBC3A5600F83516 /* DOMJITAbstractHeap.h */; settings = 
{ATTRIBUTES = (Private, ); }; }; > E35E03601B7AB43E0073AD2A /* InspectorInstrumentationObject.h in Headers */ = {isa = PBXBuildFile; fileRef = E35E035E1B7AB43E0073AD2A /* InspectorInstrumentationObject.h */; settings = {ATTRIBUTES = (Private, ); }; }; >+ E367FD86228A8D130061DBF3 /* CodeOffset.h in Headers */ = {isa = PBXBuildFile; fileRef = E367FD85228A8D0E0061DBF3 /* CodeOffset.h */; settings = {ATTRIBUTES = (Private, ); }; }; > E36CC9472086314F0051FFD6 /* WasmCreationMode.h in Headers */ = {isa = PBXBuildFile; fileRef = E36CC9462086314F0051FFD6 /* WasmCreationMode.h */; settings = {ATTRIBUTES = (Private, ); }; }; > E3794E761B77EB97005543AE /* ModuleAnalyzer.h in Headers */ = {isa = PBXBuildFile; fileRef = E3794E741B77EB97005543AE /* ModuleAnalyzer.h */; settings = {ATTRIBUTES = (Private, ); }; }; > E3850B15226ED641009ABF9C /* DFGMinifiedIDInlines.h in Headers */ = {isa = PBXBuildFile; fileRef = E3850B14226ED63E009ABF9C /* DFGMinifiedIDInlines.h */; }; >@@ -4784,6 +4785,7 @@ > E35E035D1B7AB43E0073AD2A /* InspectorInstrumentationObject.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = InspectorInstrumentationObject.cpp; sourceTree = "<group>"; }; > E35E035E1B7AB43E0073AD2A /* InspectorInstrumentationObject.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = InspectorInstrumentationObject.h; sourceTree = "<group>"; }; > E35E03611B7AB4850073AD2A /* InspectorInstrumentationObject.js */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.javascript; path = InspectorInstrumentationObject.js; sourceTree = "<group>"; }; >+ E367FD85228A8D0E0061DBF3 /* CodeOffset.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = CodeOffset.h; sourceTree = "<group>"; }; > E36CC9462086314F0051FFD6 /* WasmCreationMode.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WasmCreationMode.h; sourceTree = "<group>"; }; > E3794E731B77EB97005543AE /* ModuleAnalyzer.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = ModuleAnalyzer.cpp; sourceTree = "<group>"; }; > E3794E741B77EB97005543AE /* ModuleAnalyzer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ModuleAnalyzer.h; sourceTree = "<group>"; }; >@@ -5778,6 +5780,7 @@ > 62D755D31B84FB39001801FA /* CallFrameShuffler64.cpp */, > DC17E8161C9C802B008A6AB3 /* CCallHelpers.cpp */, > 0F24E53D17EA9F5900ABB217 /* CCallHelpers.h */, >+ E367FD85228A8D0E0061DBF3 /* CodeOffset.h */, > 0FF054F71AC35B4400E5BE57 /* ExecutableAllocationFuzz.cpp */, > 0FF054F81AC35B4400E5BE57 /* ExecutableAllocationFuzz.h */, > A7B48DB60EE74CFC00DCBDB6 /* ExecutableAllocator.cpp */, >@@ -8822,6 +8825,7 @@ > 0F96EBB316676EF6008BADE3 /* CodeBlockWithJITType.h in Headers */, > A77F1822164088B200640A47 /* CodeCache.h in Headers */, > 86E116B10FE75AC800B512BC /* CodeLocation.h in Headers */, >+ E367FD86228A8D130061DBF3 /* CodeOffset.h in Headers */, > 0FBD7E691447999600481315 /* CodeOrigin.h in Headers */, > 0F21C27D14BE727A00ADC64B /* CodeSpecializationKind.h in Headers */, > 0F0B83A714BCF50700885B4F /* CodeType.h in Headers */, >diff --git a/Source/JavaScriptCore/bytecode/ByValInfo.h b/Source/JavaScriptCore/bytecode/ByValInfo.h >index 3399d581e1f2c8fc7cf172e666e7c72878f4b545..cfa1bd503182b057a2df726df6ea25526180fa95 100644 >--- a/Source/JavaScriptCore/bytecode/ByValInfo.h >+++ b/Source/JavaScriptCore/bytecode/ByValInfo.h >@@ -26,7 +26,7 @@ > 
#pragma once > > #include "ClassInfo.h" >-#include "CodeLocation.h" >+#include "CodeOffset.h" > #include "IndexingType.h" > #include "JITStubRoutine.h" > #include "Structure.h" >@@ -228,10 +228,10 @@ struct ByValInfo { > ByValInfo(unsigned bytecodeIndex, CodeLocationJump<JSInternalPtrTag> notIndexJump, CodeLocationJump<JSInternalPtrTag> badTypeJump, CodeLocationLabel<ExceptionHandlerPtrTag> exceptionHandler, JITArrayMode arrayMode, ArrayProfile* arrayProfile, CodeLocationLabel<JSInternalPtrTag> badTypeDoneTarget, CodeLocationLabel<JSInternalPtrTag> badTypeNextHotPathTarget, CodeLocationLabel<JSInternalPtrTag> slowPathTarget) > : notIndexJump(notIndexJump) > , badTypeJump(badTypeJump) >- , exceptionHandler(exceptionHandler) > , badTypeDoneTarget(badTypeDoneTarget) > , badTypeNextHotPathTarget(badTypeNextHotPathTarget) > , slowPathTarget(slowPathTarget) >+ , exceptionHandler(exceptionHandler) > , arrayProfile(arrayProfile) > , bytecodeIndex(bytecodeIndex) > , slowPathCount(0) >@@ -242,12 +242,12 @@ struct ByValInfo { > { > } > >- CodeLocationJump<JSInternalPtrTag> notIndexJump; >- CodeLocationJump<JSInternalPtrTag> badTypeJump; >+ CodeOffsetJump<JSInternalPtrTag> notIndexJump; >+ CodeOffsetJump<JSInternalPtrTag> badTypeJump; >+ CodeOffsetLabel<JSInternalPtrTag> badTypeDoneTarget; >+ CodeOffsetLabel<JSInternalPtrTag> badTypeNextHotPathTarget; >+ CodeOffsetLabel<JSInternalPtrTag> slowPathTarget; > CodeLocationLabel<ExceptionHandlerPtrTag> exceptionHandler; >- CodeLocationLabel<JSInternalPtrTag> badTypeDoneTarget; >- CodeLocationLabel<JSInternalPtrTag> badTypeNextHotPathTarget; >- CodeLocationLabel<JSInternalPtrTag> slowPathTarget; > ArrayProfile* arrayProfile; > unsigned bytecodeIndex; > unsigned slowPathCount; >diff --git a/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp b/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp >index ced9d4d67baddbd47fe0d6d41f46f775d09ca82d..adf6bce2f39d659f87b0df54b0f5533070fff827 100644 >--- a/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp >+++ b/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp >@@ -98,25 +98,25 @@ void CallLinkInfo::unlink(VM& vm) > CodeLocationNearCall<JSInternalPtrTag> CallLinkInfo::callReturnLocation() > { > RELEASE_ASSERT(!isDirect()); >- return CodeLocationNearCall<JSInternalPtrTag>(m_callReturnLocationOrPatchableJump, NearCallMode::Regular); >+ return CodeLocationNearCall<JSInternalPtrTag>(m_callReturnLocationOrPatchableJump.location(), NearCallMode::Regular); > } > > CodeLocationJump<JSInternalPtrTag> CallLinkInfo::patchableJump() > { > RELEASE_ASSERT(callType() == DirectTailCall); >- return CodeLocationJump<JSInternalPtrTag>(m_callReturnLocationOrPatchableJump); >+ return CodeLocationJump<JSInternalPtrTag>(m_callReturnLocationOrPatchableJump.location()); > } > > CodeLocationDataLabelPtr<JSInternalPtrTag> CallLinkInfo::hotPathBegin() > { > RELEASE_ASSERT(!isDirect()); >- return CodeLocationDataLabelPtr<JSInternalPtrTag>(m_hotPathBeginOrSlowPathStart); >+ return CodeLocationDataLabelPtr<JSInternalPtrTag>(m_hotPathBeginOrSlowPathStart.location()); > } > > CodeLocationLabel<JSInternalPtrTag> CallLinkInfo::slowPathStart() > { > RELEASE_ASSERT(isDirect()); >- return m_hotPathBeginOrSlowPathStart; >+ return m_hotPathBeginOrSlowPathStart.location(); > } > > void CallLinkInfo::setCallee(VM& vm, JSCell* owner, JSObject* callee) >diff --git a/Source/JavaScriptCore/bytecode/CallLinkInfo.h b/Source/JavaScriptCore/bytecode/CallLinkInfo.h >index fa0a115b5890e5978cc4eb5815cbc274dfffb007..b532546f160b2803ed32a4ba1d9156df09083091 100644 >--- 
a/Source/JavaScriptCore/bytecode/CallLinkInfo.h >+++ b/Source/JavaScriptCore/bytecode/CallLinkInfo.h >@@ -26,7 +26,7 @@ > #pragma once > > #include "CallMode.h" >-#include "CodeLocation.h" >+#include "CodeOffset.h" > #include "CodeSpecializationKind.h" > #include "PolymorphicCallStubRoutine.h" > #include "WriteBarrier.h" >@@ -188,7 +188,7 @@ class CallLinkInfo : public PackedRawSentinelNode<CallLinkInfo> { > > CodeLocationNearCall<JSInternalPtrTag> hotPathOther() > { >- return m_hotPathOther; >+ return m_hotPathOther.location(); > } > > void setCallee(VM&, JSCell*, JSObject* callee); >@@ -348,9 +348,9 @@ class CallLinkInfo : public PackedRawSentinelNode<CallLinkInfo> { > > private: > uint32_t m_maxNumArguments { 0 }; // For varargs: the profiled maximum number of arguments. For direct: the number of stack slots allocated for arguments. >- CodeLocationLabel<JSInternalPtrTag> m_callReturnLocationOrPatchableJump; >- CodeLocationLabel<JSInternalPtrTag> m_hotPathBeginOrSlowPathStart; >- CodeLocationNearCall<JSInternalPtrTag> m_hotPathOther; >+ CodeOffsetLabel<JSInternalPtrTag> m_callReturnLocationOrPatchableJump; >+ CodeOffsetLabel<JSInternalPtrTag> m_hotPathBeginOrSlowPathStart; >+ CodeOffsetNearCall<JSInternalPtrTag> m_hotPathOther; > WriteBarrier<JSCell> m_calleeOrCodeBlock; > WriteBarrier<JSCell> m_lastSeenCalleeOrExecutable; > RefPtr<PolymorphicCallStubRoutine> m_stub; >diff --git a/Source/JavaScriptCore/bytecode/InlineAccess.cpp b/Source/JavaScriptCore/bytecode/InlineAccess.cpp >index d7e6c35cf4ebc88d3d77e2b4773f121602df77a5..b77603f93965e9cce1aa4cdf2d2d1c701979a990 100644 >--- a/Source/JavaScriptCore/bytecode/InlineAccess.cpp >+++ b/Source/JavaScriptCore/bytecode/InlineAccess.cpp >@@ -154,7 +154,7 @@ ALWAYS_INLINE static bool linkCodeInline(const char* name, CCallHelpers& jit, St > { > if (jit.m_assembler.buffer().codeSize() <= stubInfo.patch.inlineSize()) { > bool needsBranchCompaction = true; >- LinkBuffer linkBuffer(jit, stubInfo.patch.start, stubInfo.patch.inlineSize(), JITCompilationMustSucceed, needsBranchCompaction); >+ LinkBuffer linkBuffer(jit, stubInfo.patch.start.location(), stubInfo.patch.inlineSize(), JITCompilationMustSucceed, needsBranchCompaction); > ASSERT(linkBuffer.isValid()); > function(linkBuffer); > FINALIZE_CODE(linkBuffer, NoPtrTag, "InlineAccessType: '%s'", name); >@@ -363,7 +363,7 @@ void InlineAccess::rewireStubAsJump(StructureStubInfo& stubInfo, CodeLocationLab > > // We don't need a nop sled here because nobody should be jumping into the middle of an IC. 
> bool needsBranchCompaction = false; >- LinkBuffer linkBuffer(jit, stubInfo.patch.start, jit.m_assembler.buffer().codeSize(), JITCompilationMustSucceed, needsBranchCompaction); >+ LinkBuffer linkBuffer(jit, stubInfo.patch.start.location(), jit.m_assembler.buffer().codeSize(), JITCompilationMustSucceed, needsBranchCompaction); > RELEASE_ASSERT(linkBuffer.isValid()); > linkBuffer.link(jump, target); > >diff --git a/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp b/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp >index 77f2c2a5e03b7fc0d41b4bb6fac61143d3abb4c7..0489abb62f6ed78ddf8fb2fafebbc0a730495339 100644 >--- a/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp >+++ b/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp >@@ -40,13 +40,8 @@ static const bool verbose = false; > } > > StructureStubInfo::StructureStubInfo(AccessType accessType) >- : callSiteIndex(UINT_MAX) >+ : bufferingCountdown(Options::repatchBufferingCountdown()) > , accessType(accessType) >- , cacheType(CacheType::Unset) >- , countdown(1) // For a totally clear stub, we'll patch it after the first execution. >- , repatchCount(0) >- , numberOfCoolDowns(0) >- , bufferingCountdown(Options::repatchBufferingCountdown()) > , resetByGC(false) > , tookSlowPath(false) > , everConsidered(false) >diff --git a/Source/JavaScriptCore/bytecode/StructureStubInfo.h b/Source/JavaScriptCore/bytecode/StructureStubInfo.h >index d3a35acafb0a88735ac020c2d5dae0b003ecea75..99ecc9f87b1bc06d8e85314b8b8be9f372ee6c3b 100644 >--- a/Source/JavaScriptCore/bytecode/StructureStubInfo.h >+++ b/Source/JavaScriptCore/bytecode/StructureStubInfo.h >@@ -167,8 +167,39 @@ class StructureStubInfo { > > bool containsPC(void* pc) const; > >+ GPRReg baseGPR() const >+ { >+ return patch.baseGPR; >+ } >+ >+ CodeLocationCall<JSInternalPtrTag> slowPathCallLocation() { return patch.slowPathCallLocation.location(); } >+ CodeLocationLabel<JSInternalPtrTag> doneLocation() { return patch.doneLocation.location(); } >+ CodeLocationLabel<JITStubRoutinePtrTag> slowPathStartLocation() { return patch.slowPathStartLocation.location(); } >+ >+ CodeLocationJump<JSInternalPtrTag> patchableJump() >+ { >+ ASSERT(accessType == AccessType::InstanceOf); >+ return patch.start.jumpAtOffset<JSInternalPtrTag>(0).location(); >+ } >+ >+ JSValueRegs valueRegs() const >+ { >+ return JSValueRegs( >+#if USE(JSVALUE32_64) >+ patch.valueTagGPR, >+#endif >+ patch.valueGPR); >+ } >+ > CodeOrigin codeOrigin; >- CallSiteIndex callSiteIndex; >+ CallSiteIndex callSiteIndex { UINT_MAX }; >+ >+ // We repatch only when this is zero. If not zero, we decrement. >+ // 1 is used for a totally clear stub, we'll patch it after the first execution. >+ uint8_t countdown { 1 }; >+ uint8_t repatchCount { 0 }; >+ uint8_t numberOfCoolDowns { 0 }; >+ uint8_t bufferingCountdown; > > union { > struct { >@@ -185,16 +216,16 @@ class StructureStubInfo { > StructureSet bufferedStructures; > > struct { >- CodeLocationLabel<JITStubRoutinePtrTag> start; // This is either the start of the inline IC for *byId caches. or the location of patchable jump for 'instanceof' caches. >- CodeLocationLabel<JSInternalPtrTag> doneLocation; >- CodeLocationCall<JSInternalPtrTag> slowPathCallLocation; >- CodeLocationLabel<JITStubRoutinePtrTag> slowPathStartLocation; >+ CodeOffsetLabel<JITStubRoutinePtrTag> start; // This is either the start of the inline IC for *byId caches. or the location of patchable jump for 'instanceof' caches. 
>+ CodeOffsetLabel<JSInternalPtrTag> doneLocation; >+ CodeOffsetCall<JSInternalPtrTag> slowPathCallLocation; >+ CodeOffsetLabel<JITStubRoutinePtrTag> slowPathStartLocation; > > RegisterSet usedRegisters; > > uint32_t inlineSize() const > { >- int32_t inlineSize = MacroAssembler::differenceBetweenCodePtr(start, doneLocation); >+ int32_t inlineSize = MacroAssembler::differenceBetweenCodePtr(start.location(), doneLocation.location()); > ASSERT(inlineSize >= 0); > return inlineSize; > } >@@ -208,38 +239,8 @@ class StructureStubInfo { > GPRReg thisTagGPR; > #endif > } patch; >- >- GPRReg baseGPR() const >- { >- return patch.baseGPR; >- } >- >- CodeLocationCall<JSInternalPtrTag> slowPathCallLocation() { return patch.slowPathCallLocation; } >- CodeLocationLabel<JSInternalPtrTag> doneLocation() { return patch.doneLocation; } >- CodeLocationLabel<JITStubRoutinePtrTag> slowPathStartLocation() { return patch.slowPathStartLocation; } >- >- CodeLocationJump<JSInternalPtrTag> patchableJump() >- { >- ASSERT(accessType == AccessType::InstanceOf); >- return patch.start.jumpAtOffset<JSInternalPtrTag>(0); >- } >- >- JSValueRegs valueRegs() const >- { >- return JSValueRegs( >-#if USE(JSVALUE32_64) >- patch.valueTagGPR, >-#endif >- patch.valueGPR); >- } >- >- > AccessType accessType; >- CacheType cacheType; >- uint8_t countdown; // We repatch only when this is zero. If not zero, we decrement. >- uint8_t repatchCount; >- uint8_t numberOfCoolDowns; >- uint8_t bufferingCountdown; >+ CacheType cacheType { CacheType::Unset }; > bool resetByGC : 1; > bool tookSlowPath : 1; > bool everConsidered : 1; >diff --git a/Source/JavaScriptCore/dfg/DFGJumpReplacement.cpp b/Source/JavaScriptCore/dfg/DFGJumpReplacement.cpp >index 1ebf694ce4cf8781c25467f21980d8e51722587b..9d52c46f1caa6dad3bc8e35c903a029c41cea3e2 100644 >--- a/Source/JavaScriptCore/dfg/DFGJumpReplacement.cpp >+++ b/Source/JavaScriptCore/dfg/DFGJumpReplacement.cpp >@@ -38,13 +38,13 @@ void JumpReplacement::fire() > { > if (Options::dumpDisassembly()) > dataLogF("Firing jump replacement watchpoint from %p, to %p.\n", m_source.dataLocation(), m_destination.dataLocation()); >- MacroAssembler::replaceWithJump(m_source, m_destination); >+ MacroAssembler::replaceWithJump(m_source.location(), m_destination.location()); > } > > void JumpReplacement::installVMTrapBreakpoint() > { > #if ENABLE(SIGNAL_BASED_VM_TRAPS) >- MacroAssembler::replaceWithVMHalt(m_source); >+ MacroAssembler::replaceWithVMHalt(m_source.location()); > #else > UNREACHABLE_FOR_PLATFORM(); > #endif >diff --git a/Source/JavaScriptCore/dfg/DFGJumpReplacement.h b/Source/JavaScriptCore/dfg/DFGJumpReplacement.h >index 88bd78b8526926505f92edb383b7adf12e2a36db..edbc58e7942954d5666506d5624ea37e1b6c6247 100644 >--- a/Source/JavaScriptCore/dfg/DFGJumpReplacement.h >+++ b/Source/JavaScriptCore/dfg/DFGJumpReplacement.h >@@ -27,7 +27,7 @@ > > #if ENABLE(DFG_JIT) > >-#include "CodeLocation.h" >+#include "CodeOffset.h" > > namespace JSC { namespace DFG { > >@@ -44,8 +44,8 @@ class JumpReplacement { > void* dataLocation() const { return m_source.dataLocation(); } > > private: >- CodeLocationLabel<JSInternalPtrTag> m_source; >- CodeLocationLabel<OSRExitPtrTag> m_destination; >+ CodeOffsetLabel<JSInternalPtrTag> m_source; >+ CodeOffsetLabel<OSRExitPtrTag> m_destination; > }; > > } } // namespace JSC::DFG >diff --git a/Source/JavaScriptCore/dfg/DFGOSREntry.h b/Source/JavaScriptCore/dfg/DFGOSREntry.h >index 50cb0605c1067de7bda2fe18c9fcdc59351eae72..e07dcd6e8616d398a247396e97c7afe4c660254d 100644 >--- 
a/Source/JavaScriptCore/dfg/DFGOSREntry.h >+++ b/Source/JavaScriptCore/dfg/DFGOSREntry.h >@@ -54,7 +54,7 @@ struct OSREntryReshuffling { > > struct OSREntryData { > unsigned m_bytecodeIndex; >- CodeLocationLabel<OSREntryPtrTag> m_machineCode; >+ CodeOffsetLabel<OSREntryPtrTag> m_machineCode; > Operands<AbstractValue> m_expectedValues; > // Use bitvectors here because they tend to only require one word. > BitVector m_localsForcedDouble; >diff --git a/Source/JavaScriptCore/dfg/DFGOSRExit.cpp b/Source/JavaScriptCore/dfg/DFGOSRExit.cpp >index 09b30ba3cfcd486f9485d08fb41b773073936022..50a0a555f6dbe2376c8101426ad06bb4f8f3f271 100644 >--- a/Source/JavaScriptCore/dfg/DFGOSRExit.cpp >+++ b/Source/JavaScriptCore/dfg/DFGOSRExit.cpp >@@ -946,7 +946,7 @@ OSRExit::OSRExit(ExitKind kind, JSValueSource jsValueSource, MethodOfGettingAVal > > CodeLocationJump<JSInternalPtrTag> OSRExit::codeLocationForRepatch() const > { >- return CodeLocationJump<JSInternalPtrTag>(m_patchableJumpLocation); >+ return CodeLocationJump<JSInternalPtrTag>(m_patchableJumpLocation.location()); > } > > void OSRExit::emitRestoreArguments(CCallHelpers& jit, const Operands<ValueRecovery>& operands) >diff --git a/Source/JavaScriptCore/dfg/DFGOSRExit.h b/Source/JavaScriptCore/dfg/DFGOSRExit.h >index f5422ea2925e804dee8ad85a501bef6ea5f1ba9f..084b02532b28138fc3aa9666b32a5cac9eee2b70 100644 >--- a/Source/JavaScriptCore/dfg/DFGOSRExit.h >+++ b/Source/JavaScriptCore/dfg/DFGOSRExit.h >@@ -146,7 +146,7 @@ struct OSRExit : public OSRExitBase { > static void JIT_OPERATION compileOSRExit(ExecState*) WTF_INTERNAL; > static void executeOSRExit(Probe::Context&); > >- CodeLocationLabel<JSInternalPtrTag> m_patchableJumpLocation; >+ CodeOffsetLabel<JSInternalPtrTag> m_patchableJumpLocation; > MacroAssemblerCodeRef<OSRExitPtrTag> m_code; > > RefPtr<OSRExitState> exitState; >diff --git a/Source/JavaScriptCore/ftl/FTLLazySlowPath.cpp b/Source/JavaScriptCore/ftl/FTLLazySlowPath.cpp >index 34cc1eab02ea49966f49803f88c53eba6f0c0dca..d19ccab78a2f9e1a37c136981d5c75f736e17f8e 100644 >--- a/Source/JavaScriptCore/ftl/FTLLazySlowPath.cpp >+++ b/Source/JavaScriptCore/ftl/FTLLazySlowPath.cpp >@@ -64,12 +64,12 @@ void LazySlowPath::generate(CodeBlock* codeBlock) > m_generator->run(jit, params); > > LinkBuffer linkBuffer(jit, codeBlock, JITCompilationMustSucceed); >- linkBuffer.link(params.doneJumps, m_done); >+ linkBuffer.link(params.doneJumps, m_done.location()); > if (m_exceptionTarget) > linkBuffer.link(exceptionJumps, m_exceptionTarget); > m_stub = FINALIZE_CODE_FOR(codeBlock, linkBuffer, JITStubRoutinePtrTag, "Lazy slow path call stub"); > >- MacroAssembler::repatchJump(m_patchableJump, CodeLocationLabel<JITStubRoutinePtrTag>(m_stub.code())); >+ MacroAssembler::repatchJump(m_patchableJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(m_stub.code())); > } > > } } // namespace JSC::FTL >diff --git a/Source/JavaScriptCore/ftl/FTLLazySlowPath.h b/Source/JavaScriptCore/ftl/FTLLazySlowPath.h >index a5b3a7252a413776852f41fa25124b908378db13..994c0393fdda4142ce9a77ca2233e0756ad6d5b2 100644 >--- a/Source/JavaScriptCore/ftl/FTLLazySlowPath.h >+++ b/Source/JavaScriptCore/ftl/FTLLazySlowPath.h >@@ -74,8 +74,8 @@ class LazySlowPath { > CallSiteIndex, RefPtr<Generator> > ); > >- CodeLocationJump<JSInternalPtrTag> patchableJump() const { return m_patchableJump; } >- CodeLocationLabel<JSInternalPtrTag> done() const { return m_done; } >+ CodeLocationJump<JSInternalPtrTag> patchableJump() const { return m_patchableJump.location(); } >+ CodeLocationLabel<JSInternalPtrTag> 
done() const { return m_done.location(); } > const RegisterSet& usedRegisters() const { return m_usedRegisters; } > CallSiteIndex callSiteIndex() const { return m_callSiteIndex; } > >@@ -84,8 +84,8 @@ class LazySlowPath { > MacroAssemblerCodeRef<JITStubRoutinePtrTag> stub() const { return m_stub; } > > private: >- CodeLocationJump<JSInternalPtrTag> m_patchableJump; >- CodeLocationLabel<JSInternalPtrTag> m_done; >+ CodeOffsetJump<JSInternalPtrTag> m_patchableJump; >+ CodeOffsetLabel<JSInternalPtrTag> m_done; > CodeLocationLabel<ExceptionHandlerPtrTag> m_exceptionTarget; > RegisterSet m_usedRegisters; > CallSiteIndex m_callSiteIndex; >diff --git a/Source/JavaScriptCore/ftl/FTLOSRExit.cpp b/Source/JavaScriptCore/ftl/FTLOSRExit.cpp >index 09958d2c4f9a5d89b5fbbb65f86bb6dabcdf18e3..78bccdc0bf578d4299638c5f024cd587ceab1977 100644 >--- a/Source/JavaScriptCore/ftl/FTLOSRExit.cpp >+++ b/Source/JavaScriptCore/ftl/FTLOSRExit.cpp >@@ -111,7 +111,7 @@ OSRExit::OSRExit( > CodeLocationJump<JSInternalPtrTag> OSRExit::codeLocationForRepatch(CodeBlock* ftlCodeBlock) const > { > UNUSED_PARAM(ftlCodeBlock); >- return m_patchableJump; >+ return m_patchableJump.location(); > } > > } } // namespace JSC::FTL >diff --git a/Source/JavaScriptCore/ftl/FTLOSRExit.h b/Source/JavaScriptCore/ftl/FTLOSRExit.h >index da87ccf3273fe4be360683b8af3bbf516d6225eb..88215f2fde7a2de8072ad8baaae6491a2000fcd3 100644 >--- a/Source/JavaScriptCore/ftl/FTLOSRExit.h >+++ b/Source/JavaScriptCore/ftl/FTLOSRExit.h >@@ -121,7 +121,7 @@ struct OSRExit : public DFG::OSRExitBase { > OSRExitDescriptor* m_descriptor; > MacroAssemblerCodeRef<OSRExitPtrTag> m_code; > // This tells us where to place a jump. >- CodeLocationJump<JSInternalPtrTag> m_patchableJump; >+ CodeOffsetJump<JSInternalPtrTag> m_patchableJump; > Vector<B3::ValueRep> m_valueReps; > > CodeLocationJump<JSInternalPtrTag> codeLocationForRepatch(CodeBlock* ftlCodeBlock) const; >diff --git a/Source/JavaScriptCore/jit/CodeOffset.h b/Source/JavaScriptCore/jit/CodeOffset.h >new file mode 100644 >index 0000000000000000000000000000000000000000..f71af5ade13611a2c91086e1045c8a1e6a5327be >--- /dev/null >+++ b/Source/JavaScriptCore/jit/CodeOffset.h >@@ -0,0 +1,385 @@ >+/* >+ * Copyright (C) 2019 Apple Inc. All rights reserved. >+ * >+ * Redistribution and use in source and binary forms, with or without >+ * modification, are permitted provided that the following conditions >+ * are met: >+ * 1. Redistributions of source code must retain the above copyright >+ * notice, this list of conditions and the following disclaimer. >+ * 2. Redistributions in binary form must reproduce the above copyright >+ * notice, this list of conditions and the following disclaimer in the >+ * documentation and/or other materials provided with the distribution. >+ * >+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY >+ * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE >+ * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. 
OR >+ * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, >+ * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, >+ * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR >+ * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY >+ * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT >+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE >+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. >+ */ >+ >+#pragma once >+ >+#include "CodeLocation.h" >+#include "ExecutableAllocator.h" >+#include <wtf/Packed.h> >+ >+#if ENABLE(ASSEMBLER) >+ >+namespace JSC { >+ >+template<PtrTag> class CodeOffsetInstruction; >+template<PtrTag> class CodeOffsetLabel; >+template<PtrTag> class CodeOffsetJump; >+template<PtrTag> class CodeOffsetCall; >+template<PtrTag> class CodeOffsetNearCall; >+template<PtrTag> class CodeOffsetDataLabelCompact; >+template<PtrTag> class CodeOffsetDataLabel32; >+template<PtrTag> class CodeOffsetDataLabelPtr; >+template<PtrTag> class CodeOffsetConvertibleLoad; >+ >+// The CodeOffset* types are all pretty much do-nothing wrappers around >+// CodePtr (or MacroAssemblerCodePtr, to give it its full name). These >+// classes only exist to provide type-safety when linking and patching code. >+// >+// The one new piece of functionallity introduced by these classes is the >+// ability to create (or put another way, to re-discover) another CodeOffset >+// at an offset from one you already know. When patching code to optimize it >+// we often want to patch a number of instructions that are short, fixed >+// offsets apart. To reduce memory overhead we will only retain a pointer to >+// one of the instructions, and we will use the *AtOffset methods provided by >+// CodeOffsetCommon to find the other points in the code to modify. 
>+template<PtrTag tag> >+class CodeOffsetCommon { >+public: >+ template<PtrTag resultTag = tag> CodeOffsetInstruction<resultTag> instructionAtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetLabel<resultTag> labelAtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetJump<resultTag> jumpAtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetCall<resultTag> callAtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetNearCall<resultTag> nearCallAtOffset(int offset, NearCallMode); >+ template<PtrTag resultTag = tag> CodeOffsetDataLabelPtr<resultTag> dataLabelPtrAtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetDataLabel32<resultTag> dataLabel32AtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetDataLabelCompact<resultTag> dataLabelCompactAtOffset(int offset); >+ template<PtrTag resultTag = tag> CodeOffsetConvertibleLoad<resultTag> convertibleLoadAtOffset(int offset); >+ >+ template<typename T = void*> >+ T untaggedExecutableAddress() const >+ { >+ if (!isValid()) >+ return bitwise_cast<T>(static_cast<void*>(nullptr)); >+ return bitwise_cast<T>(startOfFixedExecutableMemoryPool<uint8_t*>() + m_offset.get()); >+ } >+ >+ template<PtrTag newTag, typename T = void*> >+ T retaggedExecutableAddress() const >+ { >+ if (!isValid()) >+ return bitwise_cast<T>(static_cast<void*>(nullptr)); >+ return retagCodePtr<T, tag, newTag>(executableAddress()); >+ } >+ >+ template<typename T = void*> >+ T dataLocation() const >+ { >+ if (!isValid()) >+ return bitwise_cast<T>(static_cast<void*>(nullptr)); >+ uint8_t* pointer = startOfFixedExecutableMemoryPool<uint8_t*>() + m_offset.get(); >+ ASSERT_VALID_CODE_POINTER(pointer); >+#if CPU(ARM_THUMB2) >+ // To use this pointer as a data address remove the decoration. >+ return bitwise_cast<T>(pointer ? 
pointer - 1 : nullptr); >+#else >+ return bitwise_cast<T>(pointer); >+#endif >+ } >+ >+ template<typename T = void*> >+ T executableAddress() const >+ { >+ if (!isValid()) >+ return bitwise_cast<T>(static_cast<void*>(nullptr)); >+ void* pointer = startOfFixedExecutableMemoryPool<uint8_t*>() + m_offset.get(); >+ return tagCodePtr<T, tag>(pointer); >+ } >+ >+ template<PtrTag aTag, PtrTag bTag> >+ static ptrdiff_t differenceBetweenCodeOffset(const CodeOffsetCommon<aTag>& a, const CodeOffsetCommon<bTag>& b) >+ { >+ return b.template dataLocation<ptrdiff_t>() - a.template dataLocation<ptrdiff_t>(); >+ } >+ >+ bool isValid() const { return m_offset.get() != UINT32_MAX; } >+ >+protected: >+ CodeOffsetCommon() = default; >+ >+ CodeOffsetCommon(MacroAssemblerCodePtr<tag> location) >+ { >+ uint8_t* pointer = location.template untaggedExecutableAddress<uint8_t*>(); >+ if (pointer) { >+ ASSERT(isJITPC(pointer)); >+ m_offset = static_cast<unsigned>(pointer - startOfFixedExecutableMemoryPool<uint8_t*>()); >+ } >+ } >+ >+ template<PtrTag newTag> >+ MacroAssemblerCodePtr<newTag> retagged() const >+ { >+ if (!isValid()) >+ return MacroAssemblerCodePtr<newTag>(); >+ return MacroAssemblerCodePtr<newTag>::createFromExecutableAddress(retaggedExecutableAddress<newTag>()); >+ } >+ >+ Packed<unsigned> m_offset { UINT32_MAX }; >+}; >+ >+template<PtrTag tag> >+class CodeOffsetInstruction : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetInstruction() = default; >+ explicit CodeOffsetInstruction(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetInstruction(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetInstruction(CodeLocationInstruction<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+}; >+ >+template<PtrTag tag> >+class CodeOffsetLabel : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetLabel() = default; >+ explicit CodeOffsetLabel(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetLabel(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetLabel(CodeLocationLabel<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationLabel<tag> location() const >+ { >+ return CodeLocationLabel<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+ >+ template<PtrTag newTag> >+ CodeOffsetLabel<newTag> retagged() { return CodeOffsetLabel<newTag>(*this); } >+ >+ template<typename T = void*> >+ T untaggedExecutableAddress() const { return CodeOffsetCommon<tag>::template untaggedExecutableAddress<T>(); } >+ >+ template<typename T = void*> >+ T dataLocation() const { return CodeOffsetCommon<tag>::template dataLocation<T>(); } >+}; >+ >+template<PtrTag tag> >+class CodeOffsetJump : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetJump() = default; >+ explicit CodeOffsetJump(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetJump(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetJump(CodeLocationJump<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationJump<tag> location() const >+ { >+ return CodeLocationJump<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+ >+ template<PtrTag newTag> >+ CodeOffsetJump<newTag> retagged() { return CodeOffsetJump<newTag>(*this); } >+}; >+ >+template<PtrTag tag> 
>+class CodeOffsetCall : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetCall() = default; >+ explicit CodeOffsetCall(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetCall(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetCall(CodeLocationCall<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationCall<tag> location() const >+ { >+ return CodeLocationCall<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+ >+ template<PtrTag newTag> >+ CodeOffsetCall<newTag> retagged() { return CodeOffsetCall<newTag>(*this); } >+}; >+ >+template<PtrTag tag> >+class CodeOffsetNearCall : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetNearCall() = default; >+ explicit CodeOffsetNearCall(MacroAssemblerCodePtr<tag> location, NearCallMode callMode) >+ : CodeOffsetCommon<tag>(location), m_callMode(callMode) { } >+ explicit CodeOffsetNearCall(void* location, NearCallMode callMode) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)), m_callMode(callMode) { } >+ CodeOffsetNearCall(CodeLocationNearCall<tag> location, NearCallMode callMode) >+ : CodeOffsetCommon<tag>(location), m_callMode(callMode) { } >+ CodeOffsetNearCall(CodeLocationNearCall<tag> location) >+ : CodeOffsetCommon<tag>(location), m_callMode(location.callMode()) { } >+ >+ CodeLocationNearCall<tag> location() const >+ { >+ return CodeLocationNearCall<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress()), m_callMode); >+ } >+ >+ NearCallMode callMode() { return m_callMode; } >+ >+private: >+ NearCallMode m_callMode { NearCallMode::Regular }; >+}; >+ >+template<PtrTag tag> >+class CodeOffsetDataLabel32 : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetDataLabel32() = default; >+ explicit CodeOffsetDataLabel32(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetDataLabel32(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetDataLabel32(CodeLocationDataLabel32<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationDataLabel32<tag> location() const >+ { >+ return CodeLocationDataLabel32<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+}; >+ >+template<PtrTag tag> >+class CodeOffsetDataLabelCompact : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetDataLabelCompact() = default; >+ explicit CodeOffsetDataLabelCompact(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetDataLabelCompact(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetDataLabelCompact(CodeLocationDataLabelCompact<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationDataLabelCompact<tag> location() const >+ { >+ return CodeLocationDataLabelCompact<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+}; >+ >+template<PtrTag tag> >+class CodeOffsetDataLabelPtr : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetDataLabelPtr() = default; >+ explicit CodeOffsetDataLabelPtr(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetDataLabelPtr(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetDataLabelPtr(CodeLocationDataLabelPtr<tag> location) >+ : 
CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationDataLabelPtr<tag> location() const >+ { >+ return CodeLocationDataLabelPtr<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+}; >+ >+template<PtrTag tag> >+class CodeOffsetConvertibleLoad : public CodeOffsetCommon<tag> { >+public: >+ CodeOffsetConvertibleLoad() = default; >+ explicit CodeOffsetConvertibleLoad(MacroAssemblerCodePtr<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ explicit CodeOffsetConvertibleLoad(void* location) >+ : CodeOffsetCommon<tag>(MacroAssemblerCodePtr<tag>(location)) { } >+ CodeOffsetConvertibleLoad(CodeLocationConvertibleLoad<tag> location) >+ : CodeOffsetCommon<tag>(location) { } >+ >+ CodeLocationConvertibleLoad<tag> location() const >+ { >+ return CodeLocationConvertibleLoad<tag>(MacroAssemblerCodePtr<tag>::createFromExecutableAddress(this->executableAddress())); >+ } >+}; >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetInstruction<resultTag> CodeOffsetCommon<tag>::instructionAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetInstruction<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetLabel<resultTag> CodeOffsetCommon<tag>::labelAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetLabel<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetJump<resultTag> CodeOffsetCommon<tag>::jumpAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetJump<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetCall<resultTag> CodeOffsetCommon<tag>::callAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetCall<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetNearCall<resultTag> CodeOffsetCommon<tag>::nearCallAtOffset(int offset, NearCallMode callMode) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetNearCall<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset), callMode); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetDataLabelPtr<resultTag> CodeOffsetCommon<tag>::dataLabelPtrAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetDataLabelPtr<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetDataLabel32<resultTag> CodeOffsetCommon<tag>::dataLabel32AtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetDataLabel32<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetDataLabelCompact<resultTag> CodeOffsetCommon<tag>::dataLabelCompactAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return CodeOffsetDataLabelCompact<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+template<PtrTag tag> >+template<PtrTag resultTag> >+inline CodeOffsetConvertibleLoad<resultTag> CodeOffsetCommon<tag>::convertibleLoadAtOffset(int offset) >+{ >+ ASSERT_VALID_CODE_OFFSET(offset); >+ return 
CodeOffsetConvertibleLoad<resultTag>(tagCodePtr<resultTag>(dataLocation<uint8_t*>() + offset)); >+} >+ >+} // namespace JSC >+ >+#endif // ENABLE(ASSEMBLER) >diff --git a/Source/JavaScriptCore/jit/JITCodeMap.h b/Source/JavaScriptCore/jit/JITCodeMap.h >index e1308f335b8a374759a88700d8d3ff754e79f597..db09e0dac4c2f1050c506316ab69ff8b857499e4 100644 >--- a/Source/JavaScriptCore/jit/JITCodeMap.h >+++ b/Source/JavaScriptCore/jit/JITCodeMap.h >@@ -44,11 +44,11 @@ class JITCodeMap { > { } > > inline unsigned bytecodeIndex() const { return m_bytecodeIndex; } >- inline CodeLocationLabel<JSEntryPtrTag> codeLocation() { return m_codeLocation; } >+ inline CodeLocationLabel<JSEntryPtrTag> codeLocation() { return m_codeLocation.location(); } > > private: > unsigned m_bytecodeIndex; >- CodeLocationLabel<JSEntryPtrTag> m_codeLocation; >+ CodeOffsetLabel<JSEntryPtrTag> m_codeLocation; > }; > > public: >diff --git a/Source/JavaScriptCore/jit/JITMathIC.h b/Source/JavaScriptCore/jit/JITMathIC.h >index 5645e42deffb4e6fd1cff23d176f5986e499f09b..e9c06c38b87fd2a6de876bdfa793278163c423f9 100644 >--- a/Source/JavaScriptCore/jit/JITMathIC.h >+++ b/Source/JavaScriptCore/jit/JITMathIC.h >@@ -29,6 +29,7 @@ > > #include "ArithProfile.h" > #include "CCallHelpers.h" >+#include "CodeOffset.h" > #include "JITAddGenerator.h" > #include "JITMathICInlineResult.h" > #include "JITMulGenerator.h" >@@ -61,9 +62,9 @@ class JITMathIC { > { > } > >- CodeLocationLabel<JSInternalPtrTag> doneLocation() { return m_inlineEnd; } >- CodeLocationCall<JSInternalPtrTag> slowPathCallLocation() { return m_slowPathCallLocation; } >- CodeLocationLabel<JSInternalPtrTag> slowPathStartLocation() { return m_slowPathStartLocation; } >+ CodeOffsetLabel<JSInternalPtrTag> doneLocation() { return m_inlineEnd; } >+ CodeOffsetCall<JSInternalPtrTag> slowPathCallLocation() { return m_slowPathCallLocation; } >+ CodeOffsetLabel<JSInternalPtrTag> slowPathStartLocation() { return m_slowPathStartLocation; } > > bool generateInline(CCallHelpers& jit, MathICGenerationState& state, bool shouldEmitProfiling = true) > { >@@ -128,8 +129,8 @@ class JITMathIC { > auto jump = jit.jump(); > // We don't need a nop sled here because nobody should be jumping into the middle of an IC. 
> bool needsBranchCompaction = false; >- RELEASE_ASSERT(jit.m_assembler.buffer().codeSize() <= static_cast<size_t>(MacroAssembler::differenceBetweenCodePtr(m_inlineStart, m_inlineEnd))); >- LinkBuffer linkBuffer(jit, m_inlineStart, jit.m_assembler.buffer().codeSize(), JITCompilationMustSucceed, needsBranchCompaction); >+ RELEASE_ASSERT(jit.m_assembler.buffer().codeSize() <= static_cast<size_t>(CodeOffsetLabel<JSInternalPtrTag>::differenceBetweenCodeOffset(m_inlineStart, m_inlineEnd))); >+ LinkBuffer linkBuffer(jit, m_inlineStart.location(), jit.m_assembler.buffer().codeSize(), JITCompilationMustSucceed, needsBranchCompaction); > RELEASE_ASSERT(linkBuffer.isValid()); > linkBuffer.link(jump, CodeLocationLabel<JITStubRoutinePtrTag>(m_code.code())); > FINALIZE_CODE(linkBuffer, NoPtrTag, "JITMathIC: linking constant jump to out of line stub"); >@@ -137,9 +138,9 @@ class JITMathIC { > > auto replaceCall = [&] () { > #if COMPILER(MSVC) && !COMPILER(CLANG) >- ftlThunkAwareRepatchCall(codeBlock, slowPathCallLocation().retagged<JSInternalPtrTag>(), callReplacement); >+ ftlThunkAwareRepatchCall(codeBlock, slowPathCallLocation().retagged<JSInternalPtrTag>().location(), callReplacement); > #else >- ftlThunkAwareRepatchCall(codeBlock, slowPathCallLocation().template retagged<JSInternalPtrTag>(), callReplacement); >+ ftlThunkAwareRepatchCall(codeBlock, slowPathCallLocation().template retagged<JSInternalPtrTag>().location(), callReplacement); > #endif > }; > >@@ -159,8 +160,8 @@ class JITMathIC { > > LinkBuffer linkBuffer(jit, codeBlock, JITCompilationCanFail); > if (!linkBuffer.didFailToAllocate()) { >- linkBuffer.link(generationState.slowPathJumps, slowPathStartLocation()); >- linkBuffer.link(jumpToDone, doneLocation()); >+ linkBuffer.link(generationState.slowPathJumps, slowPathStartLocation().location()); >+ linkBuffer.link(jumpToDone, doneLocation().location()); > > m_code = FINALIZE_CODE_FOR( > codeBlock, linkBuffer, JITStubRoutinePtrTag, "JITMathIC: generating out of line fast IC snippet"); >@@ -201,8 +202,8 @@ class JITMathIC { > if (linkBuffer.didFailToAllocate()) > return; > >- linkBuffer.link(endJumpList, doneLocation()); >- linkBuffer.link(slowPathJumpList, slowPathStartLocation()); >+ linkBuffer.link(endJumpList, doneLocation().location()); >+ linkBuffer.link(slowPathJumpList, slowPathStartLocation().location()); > > m_code = FINALIZE_CODE_FOR( > codeBlock, linkBuffer, JITStubRoutinePtrTag, "JITMathIC: generating out of line IC snippet"); >@@ -238,10 +239,10 @@ class JITMathIC { > > ArithProfile* m_arithProfile; > MacroAssemblerCodeRef<JITStubRoutinePtrTag> m_code; >- CodeLocationLabel<JSInternalPtrTag> m_inlineStart; >- CodeLocationLabel<JSInternalPtrTag> m_inlineEnd; >- CodeLocationLabel<JSInternalPtrTag> m_slowPathCallLocation; >- CodeLocationLabel<JSInternalPtrTag> m_slowPathStartLocation; >+ CodeOffsetLabel<JSInternalPtrTag> m_inlineStart; >+ CodeOffsetLabel<JSInternalPtrTag> m_inlineEnd; >+ CodeOffsetCall<JSInternalPtrTag> m_slowPathCallLocation; >+ CodeOffsetLabel<JSInternalPtrTag> m_slowPathStartLocation; > bool m_generateFastPathOnRepatch { false }; > GeneratorType m_generator; > }; >diff --git a/Source/JavaScriptCore/jit/JITOpcodes.cpp b/Source/JavaScriptCore/jit/JITOpcodes.cpp >index 18ceba6ad90b808c4e04ffd18f1cb926522f4262..8c57f3033620be97b3e0e8296ed12740f10852d3 100644 >--- a/Source/JavaScriptCore/jit/JITOpcodes.cpp >+++ b/Source/JavaScriptCore/jit/JITOpcodes.cpp >@@ -1224,16 +1224,16 @@ void JIT::privateCompileHasIndexedProperty(ByValInfo* byValInfo, ReturnAddressPt > > 
LinkBuffer patchBuffer(*this, m_codeBlock); > >- patchBuffer.link(badType, byValInfo->slowPathTarget); >- patchBuffer.link(slowCases, byValInfo->slowPathTarget); >+ patchBuffer.link(badType, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(slowCases, byValInfo->slowPathTarget.location()); > >- patchBuffer.link(done, byValInfo->badTypeDoneTarget); >+ patchBuffer.link(done, byValInfo->badTypeDoneTarget.location()); > > byValInfo->stubRoutine = FINALIZE_CODE_FOR_STUB( > m_codeBlock, patchBuffer, JITStubRoutinePtrTag, > "Baseline has_indexed_property stub for %s, return point %p", toCString(*m_codeBlock).data(), returnAddress.value()); > >- MacroAssembler::repatchJump(byValInfo->badTypeJump, CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); >+ MacroAssembler::repatchJump(byValInfo->badTypeJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); > MacroAssembler::repatchCall(CodeLocationCall<NoPtrTag>(MacroAssemblerCodePtr<NoPtrTag>(returnAddress)), FunctionPtr<OperationPtrTag>(operationHasIndexedPropertyGeneric)); > } > >diff --git a/Source/JavaScriptCore/jit/JITOpcodes32_64.cpp b/Source/JavaScriptCore/jit/JITOpcodes32_64.cpp >index 25a9c8f350adc3e69440daca34d82422ee4f182c..3732c8b8aaa9c6d80fa6db4f498e9ca570f316dc 100644 >--- a/Source/JavaScriptCore/jit/JITOpcodes32_64.cpp >+++ b/Source/JavaScriptCore/jit/JITOpcodes32_64.cpp >@@ -1095,16 +1095,16 @@ void JIT::privateCompileHasIndexedProperty(ByValInfo* byValInfo, ReturnAddressPt > > LinkBuffer patchBuffer(*this, m_codeBlock); > >- patchBuffer.link(badType, byValInfo->slowPathTarget); >- patchBuffer.link(slowCases, byValInfo->slowPathTarget); >+ patchBuffer.link(badType, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(slowCases, byValInfo->slowPathTarget.location()); > >- patchBuffer.link(done, byValInfo->badTypeDoneTarget); >+ patchBuffer.link(done, byValInfo->badTypeDoneTarget.location()); > > byValInfo->stubRoutine = FINALIZE_CODE_FOR_STUB( > m_codeBlock, patchBuffer, JITStubRoutinePtrTag, > "Baseline has_indexed_property stub for %s, return point %p", toCString(*m_codeBlock).data(), returnAddress.value()); > >- MacroAssembler::repatchJump(byValInfo->badTypeJump, CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); >+ MacroAssembler::repatchJump(byValInfo->badTypeJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); > MacroAssembler::repatchCall(CodeLocationCall<NoPtrTag>(MacroAssemblerCodePtr<NoPtrTag>(returnAddress)), FunctionPtr<OperationPtrTag>(operationHasIndexedPropertyGeneric)); > } > >diff --git a/Source/JavaScriptCore/jit/JITPropertyAccess.cpp b/Source/JavaScriptCore/jit/JITPropertyAccess.cpp >index 519ad7aec364aba9266014c2d3eb462d259b3713..fe7b5d9bfbfb62d5c51f1804d5528b80389acf59 100644 >--- a/Source/JavaScriptCore/jit/JITPropertyAccess.cpp >+++ b/Source/JavaScriptCore/jit/JITPropertyAccess.cpp >@@ -1333,16 +1333,16 @@ void JIT::privateCompileGetByVal(const ConcurrentJSLocker&, ByValInfo* byValInfo > > LinkBuffer patchBuffer(*this, m_codeBlock); > >- patchBuffer.link(badType, byValInfo->slowPathTarget); >- patchBuffer.link(slowCases, byValInfo->slowPathTarget); >+ patchBuffer.link(badType, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(slowCases, byValInfo->slowPathTarget.location()); > >- patchBuffer.link(done, byValInfo->badTypeDoneTarget); >+ patchBuffer.link(done, byValInfo->badTypeDoneTarget.location()); > > byValInfo->stubRoutine = 
FINALIZE_CODE_FOR_STUB( > m_codeBlock, patchBuffer, JITStubRoutinePtrTag, > "Baseline get_by_val stub for %s, return point %p", toCString(*m_codeBlock).data(), returnAddress.value()); > >- MacroAssembler::repatchJump(byValInfo->badTypeJump, CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); >+ MacroAssembler::repatchJump(byValInfo->badTypeJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); > MacroAssembler::repatchCall(CodeLocationCall<NoPtrTag>(MacroAssemblerCodePtr<NoPtrTag>(returnAddress)), FunctionPtr<OperationPtrTag>(operationGetByValGeneric)); > } > >@@ -1359,9 +1359,9 @@ void JIT::privateCompileGetByValWithCachedId(ByValInfo* byValInfo, ReturnAddress > > ConcurrentJSLocker locker(m_codeBlock->m_lock); > LinkBuffer patchBuffer(*this, m_codeBlock); >- patchBuffer.link(slowCases, byValInfo->slowPathTarget); >- patchBuffer.link(fastDoneCase, byValInfo->badTypeDoneTarget); >- patchBuffer.link(slowDoneCase, byValInfo->badTypeNextHotPathTarget); >+ patchBuffer.link(slowCases, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(fastDoneCase, byValInfo->badTypeDoneTarget.location()); >+ patchBuffer.link(slowDoneCase, byValInfo->badTypeNextHotPathTarget.location()); > if (!m_exceptionChecks.empty()) > patchBuffer.link(m_exceptionChecks, byValInfo->exceptionHandler); > >@@ -1376,7 +1376,7 @@ void JIT::privateCompileGetByValWithCachedId(ByValInfo* byValInfo, ReturnAddress > "Baseline get_by_val with cached property name '%s' stub for %s, return point %p", propertyName.impl()->utf8().data(), toCString(*m_codeBlock).data(), returnAddress.value()); > byValInfo->stubInfo = gen.stubInfo(); > >- MacroAssembler::repatchJump(byValInfo->notIndexJump, CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); >+ MacroAssembler::repatchJump(byValInfo->notIndexJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); > MacroAssembler::repatchCall(CodeLocationCall<NoPtrTag>(MacroAssemblerCodePtr<NoPtrTag>(returnAddress)), FunctionPtr<OperationPtrTag>(operationGetByValGeneric)); > } > >@@ -1418,9 +1418,9 @@ void JIT::privateCompilePutByVal(const ConcurrentJSLocker&, ByValInfo* byValInfo > Jump done = jump(); > > LinkBuffer patchBuffer(*this, m_codeBlock); >- patchBuffer.link(badType, byValInfo->slowPathTarget); >- patchBuffer.link(slowCases, byValInfo->slowPathTarget); >- patchBuffer.link(done, byValInfo->badTypeDoneTarget); >+ patchBuffer.link(badType, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(slowCases, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(done, byValInfo->badTypeDoneTarget.location()); > if (needsLinkForWriteBarrier) { > ASSERT(removeCodePtrTag(m_calls.last().callee.executableAddress()) == removeCodePtrTag(operationWriteBarrierSlowPath)); > patchBuffer.link(m_calls.last().from, m_calls.last().callee); >@@ -1437,7 +1437,7 @@ void JIT::privateCompilePutByVal(const ConcurrentJSLocker&, ByValInfo* byValInfo > m_codeBlock, patchBuffer, JITStubRoutinePtrTag, > "Baseline put_by_val_direct stub for %s, return point %p", toCString(*m_codeBlock).data(), returnAddress.value()); > } >- MacroAssembler::repatchJump(byValInfo->badTypeJump, CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); >+ MacroAssembler::repatchJump(byValInfo->badTypeJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); > 
MacroAssembler::repatchCall(CodeLocationCall<NoPtrTag>(MacroAssemblerCodePtr<NoPtrTag>(returnAddress)), FunctionPtr<OperationPtrTag>(isDirect ? operationDirectPutByValGeneric : operationPutByValGeneric)); > } > // This function is only consumed from another translation unit (JITOperations.cpp), >@@ -1459,8 +1459,8 @@ void JIT::privateCompilePutByValWithCachedId(ByValInfo* byValInfo, ReturnAddress > > ConcurrentJSLocker locker(m_codeBlock->m_lock); > LinkBuffer patchBuffer(*this, m_codeBlock); >- patchBuffer.link(slowCases, byValInfo->slowPathTarget); >- patchBuffer.link(doneCases, byValInfo->badTypeDoneTarget); >+ patchBuffer.link(slowCases, byValInfo->slowPathTarget.location()); >+ patchBuffer.link(doneCases, byValInfo->badTypeDoneTarget.location()); > if (!m_exceptionChecks.empty()) > patchBuffer.link(m_exceptionChecks, byValInfo->exceptionHandler); > >@@ -1475,7 +1475,7 @@ void JIT::privateCompilePutByValWithCachedId(ByValInfo* byValInfo, ReturnAddress > "Baseline put_by_val%s with cached property name '%s' stub for %s, return point %p", (putKind == Direct) ? "_direct" : "", propertyName.impl()->utf8().data(), toCString(*m_codeBlock).data(), returnAddress.value()); > byValInfo->stubInfo = gen.stubInfo(); > >- MacroAssembler::repatchJump(byValInfo->notIndexJump, CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); >+ MacroAssembler::repatchJump(byValInfo->notIndexJump.location(), CodeLocationLabel<JITStubRoutinePtrTag>(byValInfo->stubRoutine->code().code())); > MacroAssembler::repatchCall(CodeLocationCall<NoPtrTag>(MacroAssemblerCodePtr<NoPtrTag>(returnAddress)), FunctionPtr<OperationPtrTag>(putKind == Direct ? operationDirectPutByValGeneric : operationPutByValGeneric)); > } > // This function is only consumed from another translation unit (JITOperations.cpp),
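
For readers skimming the patch, the core idea is that JIT code locations previously held as full tagged pointers (the CodeLocation* types) can instead be held as 32-bit byte offsets from the start of the fixed executable memory pool, shrinking structures such as StructureStubInfo, CallLinkInfo, and ByValInfo. Below is a minimal, standalone sketch of that storage scheme only; it is not JSC code, and names such as Offset32 and g_jitPoolBase are illustrative stand-ins for CodeOffsetCommon's Packed<unsigned> offset and startOfFixedExecutableMemoryPool().

// Sketch: store a 32-bit offset into a fixed pool instead of a 64-bit pointer,
// and rebuild the absolute address on demand. UINT32_MAX marks "no location",
// mirroring the invalid-offset sentinel used in the patch.
#include <cassert>
#include <cstdint>
#include <cstdio>

// Assumed: all JIT code lives in one contiguous pool whose base is known.
static uint8_t* g_jitPoolBase = nullptr;

class Offset32 {
public:
    Offset32() = default;

    explicit Offset32(void* location)
    {
        if (!location)
            return; // stays invalid
        auto* p = static_cast<uint8_t*>(location);
        assert(p >= g_jitPoolBase); // must point into the pool
        m_offset = static_cast<uint32_t>(p - g_jitPoolBase);
    }

    bool isValid() const { return m_offset != UINT32_MAX; }

    // Rebuild the absolute address from the pool base plus the stored offset.
    void* location() const
    {
        if (!isValid())
            return nullptr;
        return g_jitPoolBase + m_offset;
    }

private:
    uint32_t m_offset { UINT32_MAX }; // sentinel for "invalid"
};

int main()
{
    static uint8_t fakePool[1024]; // stand-in for the real JIT pool
    g_jitPoolBase = fakePool;

    Offset32 label(fakePool + 128);          // pointer in, 4-byte offset stored
    assert(label.location() == fakePool + 128);
    assert(!Offset32().isValid());           // default-constructed == invalid

    std::printf("sizeof(void*)=%zu sizeof(Offset32)=%zu\n",
        sizeof(void*), sizeof(Offset32));
    return 0;
}

In the actual patch the offset is wrapped in Packed<unsigned> so containing structs stay tightly packed, and each use site keeps a typed wrapper (CodeOffsetLabel, CodeOffsetJump, CodeOffsetCall, and so on) whose location() recreates the corresponding CodeLocation* with the right PtrTag.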
Attachments on bug 197732: 369825 | 369829 | 369906 | 369919