[DON'T MERGE THIS] Normative: Allow transferring SharedArrayBuffer objects #336
Conversation
Thinking about this, it introduces a mechanism in ECMA-262 to create a new SAB object from an existing data block ("clone" seems to have the semantics of "copy" in the spec), where before this was a capability left to hosts.
As additional info, the main difference here from Structured Cloning is that this proposed change ignores any additional aspects of the SAB objects, like extra properties or subclassing.
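A hypothetical sketch of that difference, assuming the semantics proposed in this PR and an environment where SharedArrayBuffer is exposed (the class name and property are illustrative, not part of the proposal):

class TaggedSAB extends SharedArrayBuffer {}
const sab = new TaggedSAB(16);
sab.tag = "metadata";

const realm = new ShadowRealm();
// Under this PR the callable boundary would pass along a fresh SharedArrayBuffer
// backed by the same data block; subclass identity and own properties are dropped.
const inspect = realm.evaluate(`
  (buf) => [buf instanceof SharedArrayBuffer, "tag" in buf].join()
`);
inspect(sab); // "true,false"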
spec.html
Outdated
1. Return ! WrappedFunctionCreate(_callerRealm_, _value_).
1. If _value_ has an [[ArrayBufferData]] internal slot, then
  1. If IsSharedArrayBuffer(_value_) is *true*,
    1. Let _clone_ be ? OrdinaryCreateFromConstructor(%SharedArrayBuffer%, "%SharedArrayBuffer.prototype%", « [[ArrayBufferData]], [[ArrayBufferByteLength]] »).
you should use the realm notation to point to the proper intrinsic, e.g.: callerRealm.[[Intrinsics]].[[%Function.prototype%]]
this PR also brings up an interesting question: what we call _callerRealm_ is really just the receiver realm of the wrapped value.
I thought it was the calling realm at the time of the initial call to evaluate, which may be different from the calling realm of the wrapped function exotic (after #329).
And I got that wrong again. It would be the realm of the ShadowRealm intrinsic, since that's plumbed through when evaluate is first called.
See https://github.com/tc39/test262/pull/3230/files#r738857432
Thanks!
@caridy I did some fixes, including but not limited to:
- usage of "%SharedArrayBuffer.prototype%", which doesn't need to be prefixed by the realm. The constructor argument for OrdinaryCreateFromConstructor is the important piece there, and SharedArrayBuffer.prototype is not configurable, so it should be safe.
- I renamed GetWrappedValue to GetTransferrableValue, as it does more than wrapping now (it already did more before, with the primitive values).
- I renamed _callerRealm_ to _receiverRealm_ for clarity of this abstraction.
- I created a new abstraction that can be reused to clone SharedArrayBuffers in general. In the future, the constructor argument can become optional, defaulting to the existing %SharedArrayBuffer% whenever the abstraction is used. We don't need that for the ShadowRealms usage, which targets the specific one from the receiverRealm.
Force-pushed from c74544b to e9d11a9.
Force-pushed from e9d11a9 to 01a2614.
I think this is interesting and I'm hoping to see it ship in the first release. But isn't it a big change after Stage 3?
I intend to present this at the next TC39 meeting. I'd be fine with it being a Stage 3 change, but we need to have all the delegates on board with the motivation and proposed solution. I believe some delegates might want more time to think it through, so a follow-up proposal seems pragmatic.
spec.html
Outdated
1. Return _value_.
</emu-alg>
</emu-clause>

<emu-clause id="sec-clonesharedarraybuffer" aoid="CloneSharedArrayBuffer">
  <h1>CloneSharedArrayBuffer ( _buffer_, _constructor_ )</h1>
this should expect a realm argument to avoid calling this with arbitrary constructors.
spec.html
Outdated
1. If IsCallable(_value_) is *true*,
  1. Return ! WrappedFunctionCreate(_receiverRealm_, _value_).
1. If _value_ has an [[ArrayBufferData]] internal slot, then
  1. If IsSharedArrayBuffer(_value_) is *true*, return ? CloneSharedArrayBuffer(_value_, _receiverRealm_.[[Intrinsics]].%SharedArrayBuffer%).
pass the receiver realm instead.
Like I mentioned above:
I created a new abstraction that can be reused to clone SharedArrayBuffers in general. In the future, the constructor argument can become optional, defaulting to the existing %SharedArrayBuffer% whenever the abstraction is used. We don't need that for the ShadowRealms usage, which targets the specific one from the receiverRealm.
spec.html
Outdated
<h1>CloneSharedArrayBuffer ( _buffer_, _constructor_ )</h1>
<emu-alg>
  1. Assert: _buffer_ is an Object with a [[SharedArrayBuffer]] internal slot.
  1. Let _clone_ be ? OrdinaryCreateFromConstructor(_constructor_, "%SharedArrayBuffer.prototype%", « [[ArrayBufferData]], [[ArrayBufferByteLength]] »).
use the receiver realm's intrinsic here.
the second argument is meant to be a string with the default proto. This proto will be fetched from the given constructor.
I also mentioned this in a comment above.
usage of "%SharedArrayBuffer.prototype%", which doesn't need to be prefixed by the realm. The constructor argument for OrdinaryCreateFromConstructor is the important piece there, and SharedArrayBuffer.prototype is not configurable, so it should be safe.
In fact, it's meant to be just a string, regardless of realms. This string is used to access the matching intrinsic property of the given constructor.
Also, this name is just symbolic, as %SharedArrayBuffer%.prototype will always be available:
OrdinaryCreateFromConstructor ( constructor, intrinsicDefaultProto [ , internalSlotsList ] )
1. Assert: intrinsicDefaultProto is this specification's name of an intrinsic object. The corresponding object must be an intrinsic that is intended to be used as the [[Prototype]] value of an object.
2. Let proto be ? GetPrototypeFromConstructor(constructor, intrinsicDefaultProto).
3. Return ! OrdinaryObjectCreate(proto, internalSlotsList).
GetPrototypeFromConstructor ( constructor, intrinsicDefaultProto )
1. Assert: intrinsicDefaultProto is this specification's name of an intrinsic object. The corresponding object must be an intrinsic that is intended to be used as the [[Prototype]] value of an object.
2. Let proto be ? Get(constructor, "prototype").
3. If Type(proto) is not Object, then
a. Let realm be ? GetFunctionRealm(constructor).
b. Set proto to realm's intrinsic object named intrinsicDefaultProto.
4. Return proto.
The name would only be used in step 3.b of the abstraction above, but %SharedArrayBuffer%.prototype is non-configurable and non-writable, and the constructor is always the intrinsic one. That means OrdinaryCreateFromConstructor results in OrdinaryObjectCreate(_receiverRealm_.[[Intrinsics]].%SharedArrayBuffer.prototype%, « [[ArrayBufferData]], [[ArrayBufferByteLength]] »).
I can actually shift usage from OrdinaryCreateFromConstructor to OrdinaryObjectCreate. The current draft uses the former to match the slice method in SharedArrayBuffer.
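A rough JavaScript-level reading of that fallback (a sketch only; the helper name and the realmIntrinsics parameter are illustrative, not spec constructs):

// Sketch of GetPrototypeFromConstructor as used by OrdinaryCreateFromConstructor:
// the intrinsicDefaultProto name only matters when constructor.prototype is not
// an object (step 3 of the algorithm quoted above).
function getPrototypeFromConstructorSketch(constructor, realmIntrinsics, intrinsicDefaultProto) {
  const proto = constructor.prototype;           // step 2: Get(constructor, "prototype")
  if (Object(proto) === proto) return proto;     // it is an object: use it directly
  return realmIntrinsics[intrinsicDefaultProto]; // step 3.b: the realm's named intrinsic
}

// %SharedArrayBuffer%.prototype is non-writable and non-configurable, so for the
// intrinsic constructor the first branch always wins and the result is simply the
// receiver realm's SharedArrayBuffer.prototype.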
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The other reason for using OrdinaryCreateFromConstructor would be if the new abstraction CloneSharedArrayBuffer becomes reusable within ECMAScript, but apparently I need to avoid this new abstraction as it became noise here.
f2f487e avoids the noise from the abstraction I previously introduced, removes unnecessary steps to clone the SharedArrayBuffer, and avoids the confusion of using an intrinsic name inside a string.
spec.html
Outdated
1. Set _clone_.[[ArrayBufferData]] to _value_.[[ArrayBufferData]].
1. Set _clone_.[[ArrayBufferByteLength]] to _value_.[[ArrayBufferByteLength]].
1. Return _clone_.
1. throw a TypeError exception.
Suggested change:
- 1. throw a TypeError exception.
+ 1. Throw a TypeError exception.
Force-pushed from c95c7af to c288595.
OrdinaryCreateFromConstructor matches other generic usages that create a new instance from a given constructor; in this case, the given steps simply reduce to OrdinaryObjectCreate without any extra observable steps.
Force-pushed from c288595 to b82d6b2.
Post-MVP should be fine, although it's still definitely worth raising with TC39 earlier, in case it changes how implementors implement the rest of the proposal.
I have two concerns:
We identified objections to the full serialization. The shared array buffer is the only thing necessary for resolving the use cases we identified to motivate this PR. Keep in mind, we intend to propose this change as a follow-up proposal to avoid blocking ShadowRealms.
I would love to learn more, but if this becomes a separate (follow-up) proposal that goes through the stages, there's no rush.
@annevk one of the big issues with serialize is that it does not support all the features of the language, e.g. Proxies. We are certainly looking at serialization, and currently looking at explicit vs. implicit serialization mechanisms, but for this to get any traction, we will need to figure out how to reform the serialize mechanism to support all features of the language.
Maybe, but serialization exists today. Adding a new mechanism that does roughly the same thing, but not exactly, does not seem like a step in the right direction.
@annevk, I believe that @caridy's point is that structured clone as specified by HTML would face objections to being exposed as-is by JavaScript. While moving the algorithm definition of serialize/deserialize is considered for inclusion in ecma262 for maintenance reasons (tc39/ecma262#2555), these operations would likely not be used anywhere in the JavaScript spec with their current definition, in particular because they allow a program to directly test whether an object is a proxy. While I'm hopeful we can find a cloning mechanism that is both backwards compatible with structured clone and satisfies the current objections to structured clone being included in JavaScript, this is a much larger project. Because of that, we have to start somewhere that allows us to "transmit" a subset of objects through the callable boundary, either explicitly or implicitly, while not painting ourselves into a corner regarding any future serialization/clone mechanism in JavaScript. I am personally in favor of an explicit cloning mechanism, exactly because it's unclear how an implicit mechanism could be extended to non-shareable objects. However, I also admit that the only concern here is a future discrepancy where some specific objects are allowed implicitly and some need explicit handling. That wouldn't be a fatal flaw, just odd, as you point out.
Slight correction to what @mhofman said: "considered for inclusion in ecma262 for maintenance reasons" should be "considered for moving under tc39". Given the problems explained in the tc39/ecma262#2555 thread, I do not consider it a candidate for inclusion in ecma262. It is too inconsistent with the JavaScript language. However, tc39 should maintain it as a separate report, likely by a separate TG.
@mhofman this would also face objections, for reasons stated above.
@annevk, could you clarify exactly what would face objections?
Introducing an operation to move objects across "isolated" realms that's incompatible with StructuredSerialize and StructuredDeserialize. Changing those algorithms to accommodate more use cases is an option, though webcompat might put limits on that. Other than proxies I didn't see a lot in tc39/ecma262#2555, btw. E.g., where is the argument made that supports the model put forward here? (Also, I'm not entirely sure how you could realistically tackle proxies.)
This is exactly my hope when I say I want to find a cloning mechanism that is backwards compatible with structured clone but that removes existing objections from TC39 delegates. I don't think allowing new values that currently throw would be considered a compatibility risk, but maybe I'm mistaken? Also, a new cloning mechanism to support the
What I'm thinking about right now is that the cloning mechanism would have hooks, and that it would be acceptable to require users of proxies (membranes) to use those hooks if they want their proxies to behave transparently through cloning.
Removing exceptions (and following a different code path) is generally fine, indeed. And yes, starting with
If the divergence is the inability to express transfer semantics in this proposal, I entirely agree, and that is why I'm in support of an alternative explicit cloning mechanism for ShadowRealms, even for SAB, to avoid creating a special-case precedent. I have a draft for such an explicit proposal, but it needs updating. However, the ShadowRealm champions have expressed concerns with the added complexity resulting from this explicit cloning. It's all very early; hopefully we can find a solution that answers everyone's requirements.
@leobalter pointed out the divergence in #336 (comment). (And to be clear, I'm not sure how okay I am with duplicating part of HTML's algorithms and not reusing them explicitly. That creates a lot of risk for divergence going forward. I am okay with moving those algorithms elsewhere, as also stated in the other thread.) (@erights btw, separate document/TC sounds very much like the old Appendix B wishful thinking. I also somewhat doubt anyone is willing to sign up for it in that form, nor does it seem like an improvement over the status quo.)
Does HTML's structured clone copy other properties of SABs? I'm almost certain it does not.
@bakkot the functionality I observed for transferring a SAB through workers does copy the properties. Maybe I assumed that would come in the package with structured cloning, and I'd be happy to be corrected. I'd like to reuse the SAB internals, but I have no use for copying the object properties.
You can try out a compliant structuredClone:

const buffer = new SharedArrayBuffer(10);
buffer.foo = 10;
const clonedBuffer = structuredClone(buffer);
'foo' in clonedBuffer; // false
clonedBuffer.foo === undefined; // true
Yeah, I see the same behavior as @Jamesernator; that's also what I see in the spec (note that the …).
For anyone else looking to play with it in the browser, SAB is available on https://first-party-test.glitch.me/coep?coep=credentialless&coop=same-origin&
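The same check can be sketched with a worker, since postMessage uses the same structured clone machinery (the worker.js file name and the cross-origin-isolated setup are assumptions):

// main.js: requires a cross-origin isolated page so SharedArrayBuffer is exposed
const sab = new SharedArrayBuffer(8);
sab.foo = 10;
const worker = new Worker("worker.js");
worker.postMessage(sab);

// worker.js
self.onmessage = (event) => {
  // Structured clone only carries the shared data block, not own properties.
  console.log("foo" in event.data);   // false
  console.log(event.data.byteLength); // 8
};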
@annevk I discussed this with some people involved in this proposal, not limited to the champions, and I believe there is a bigger challenge right now to reusing the Structured Cloning algorithms here, as they have their own set of challenges at TC39. I think the current PR has steps that, as observed from user land, match step 13.2 of the HTML spec's StructuredSerializeInternal. I actually crafted the current PR abstraction based on it, but they look different editorially. If TC39 eventually agrees to bring in Structured Cloning, I'm happy to reuse it. I'd say probably limiting it initially to SharedArrayBuffer, which is where our pain point remains, but I'm open to discussing the other aspects of Structured Cloning. WDYT, if the observed behavior here doesn't differ and we commit to keeping it this way? Of course, for now, limiting this logic to SABs.
@bakkot thanks for the clarification. I missed the
Our inclination is that the challenges with structured serializing/deserializing ought to be resolved before an equivalent mechanism is added elsewhere.
Does that mean the web won't be adding new things to structured cloning until these things can be resolved?
To add clarity, we won't propose this change to the ongoing ShadowRealms proposal but will bring it as a separate proposal following up on ShadowRealms. As discussed in the plenary today, we are considering exploring alternatives that reuse Structured Serialization, and so this work will be on hold for at least some Stage 3-like stage. We can advance a few things that might not conflict, such as spinning up a new repo and closing this PR, setting up a specific explainer, and drafting a spec using the structured serialization parts, with what we estimate it would become. I understand this work might go to Stage 1. For Stage 2 we need to resolve the contention with @mhofman, who is proposing an alternative API.
This PR is to explore the possibility of allowing SharedArrayBuffers to be transferred across shadow realms, as discussed internally by @rwaldron and @caridy. They might expand on the use cases, and we also have #298 and #322.
The feature only accounts for the buffer data and size, reusing the intrinsic SharedArrayBuffer. Object properties are not transferred here.
This code example quickly shows how it is expected to work:
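A minimal sketch, assuming the semantics proposed in this PR (the variable names and sizes are illustrative):

const realm = new ShadowRealm();
const sab = new SharedArrayBuffer(1024);

// With this change the callable boundary clones the SAB object but keeps the
// same underlying shared data block, instead of throwing a TypeError.
const fill = realm.evaluate(`
  (buffer) => { new Uint8Array(buffer)[0] = 42; return buffer.byteLength; }
`);

fill(sab);              // 1024
new Uint8Array(sab)[0]; // 42: both realms observe the same memory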
This might be considered as a Stage 3 change or a follow-up proposal.