
[DON'T MERGE THIS] Normative: Allow transferring SharedArrayBuffer objects #336

Closed
wants to merge 5 commits

Conversation

leobalter
Member

@leobalter leobalter commented Oct 28, 2021

This PR is to explore the possibility of allowing SharedArrayBuffers to be transferred across shadow realms, as discussed internally by @rwaldron and @caridy. This might expand the use cases; we also have #298 and #322.

The feature only accounts for the buffer data and size, reusing the intrinsic SharedArrayBuffer. Object properties are not transferred here.

This code example quickly shows how it is expected to work:

const r = new ShadowRealm();
const sab = new SharedArrayBuffer(8);
const getFoo = r.evaluate('(clonedSab) => clonedSab.foo');

sab.foo = 2;

// The ShadowRealm creates a new SAB reusing the already allocated sharedarraybuffer data.
getFoo(sab); // undefined

const cloneBack = r.evaluate(`
  (clonedSab) => {
    new Uint8Array(clonedSab)[0] = 1;
    return clonedSab;
  };
`);

const otherSab = cloneBack(sab);

otherSab === sab; // false
new Uint8Array(sab)[0]; // 1
new Uint8Array(otherSab)[0]; // 1

This might be considered as a Stage 3 change or a follow-up proposal.

@mhofman
Member

mhofman commented Oct 28, 2021

Thinking about this: it introduces a mechanism in ECMA262 to create a new SAB object from an existing data block ("clone" seems to have the semantics of "copy" in the spec), where before this was a capability left to hosts.

@leobalter
Member Author

As additional info: the main difference here from Structured Cloning is that this proposed change ignores any additional aspects of the SAB objects, like extra properties or subclassing.

spec.html Outdated
1. Return ! WrappedFunctionCreate(_callerRealm_, _value_).
1. If _value_ has an [[ArrayBufferData]] internal slot, then
1. If IsSharedArrayBuffer(_value_) is *true*,
1. Let _clone_ be ? OrdinaryCreateFromConstructor(%SharedArrayBuffer%, "%SharedArrayBuffer.prototype%", « [[ArrayBufferData]], [[ArrayBufferByteLength]] »).
Collaborator

you should use the realm notation to point to the proper intrinsic, e.g.: callerRealm.[[Intrinsics]].[[%Function.prototype%]]

Collaborator

this PR also raises an interesting question: what we call _callerRealm_ is really just the receiver realm of the wrapped value.

Member

I thought it was the calling realm at the time of the initial call to evaluate, which may be different from the calling realm of the wrapped function exotic (after #329).

Member

And I got that wrong again. It would be the realm of the ShadowRealm intrinsic, since that's plumbed through when evaluate is first called.

See https://github.com/tc39/test262/pull/3230/files#r738857432

Member Author

Thanks!

@caridy I did some fixes, including but not limited to:

  • usage of "%SharedArrayBuffer.prototype%", which doesn't need to be prefixed by the realm. The _constructor_ argument for OrdinaryCreateFromConstructor is the important piece there, and since SharedArrayBuffer.prototype is not configurable, it should be safe.
  • I renamed GetWrappedValue to GetTransferrableValue, as it does more than wrapping now (it already did more before, with the primitive values).
  • I renamed _callerRealm_ to _receiverRealm_ for clarity of this abstraction.
  • I created a new abstraction that can be reused to clone SharedArrayBuffers in general. In the future, _constructor_ can become optional, defaulting to the existing %SharedArrayBuffer% wherever the abstraction is used. We don't need that for the ShadowRealms usage, which targets the specific one from the _receiverRealm_.

@Jack-Works
Member

I think this is interesting and hope to see it ship in the first release. But isn't it a big change after Stage 3?

@leobalter
Member Author

I intend to present this at the next TC39 meeting. I'd be fine with it being a Stage 3 change but we need to have all the delegates onboard with the motivation and proposed solution. I believe some delegates might want more time to think it through, so a follow-up proposal seems pragmatic.

spec.html Outdated
1. Return _value_.
</emu-alg>
</emu-clause>

<emu-clause id="sec-clonesharedarraybuffer" aoid="CloneSharedArrayBuffer">
<h1>CloneSharedArrayBuffer ( _buffer_, _constructor_ )</h1>
Collaborator

this should expect a realm argument to avoid calling this with arbitrary constructors.

spec.html Outdated
1. If IsCallable(_value_) is *true*,
1. Return ! WrappedFunctionCreate(_receiverRealm_, _value_).
1. If _value_ has an [[ArrayBufferData]] internal slot, then
1. If IsSharedArrayBuffer(_value_) is *true*, return ? CloneSharedArrayBuffer(_value_, _receiverRealm_.[[Intrinsics]].%SharedArrayBuffer%).
Collaborator

pass the receiver realm instead.

Member Author

Like I mentioned above:

I created a new abstraction that can be reused to clone SharedArrayBuffers in general. In the future, _constructor_ can become optional, defaulting to the existing %SharedArrayBuffer% wherever the abstraction is used. We don't need that for the ShadowRealms usage, which targets the specific one from the _receiverRealm_.

spec.html Outdated
<h1>CloneSharedArrayBuffer ( _buffer_, _constructor_ )</h1>
<emu-alg>
1. Assert: _buffer_ is an Object with a [[SharedArrayBuffer]] internal slot.
1. Let _clone_ be ? OrdinaryCreateFromConstructor(_constructor_, "%SharedArrayBuffer.prototype%", « [[ArrayBufferData]], [[ArrayBufferByteLength]] »).
Collaborator

use the receiver realm's intrinsic here.

Member Author

the second argument is meant to be a string with the default proto. This proto will be fetched from the given constructor.

Member Author

I also mentioned this in a comment above.

usage of "%SharedArrayBuffer.prototype%", which doesn't need to be prefixed by the realm. The _constructor_ argument for OrdinaryCreateFromConstructor is the important piece there, and since SharedArrayBuffer.prototype is not configurable, it should be safe.

In fact, it's meant to be just a string, regardless of realms. This string is going to be used to access the matching intrinsic property of the given constructor.

Also, this name is just symbolic, as %SharedArrayBuffer%.prototype will always be available:

OrdinaryCreateFromConstructor ( constructor, intrinsicDefaultProto [ , internalSlotsList ] )

1. Assert: intrinsicDefaultProto is this specification's name of an intrinsic object. The corresponding object must be an intrinsic that is intended to be used as the [[Prototype]] value of an object.
2. Let proto be ? GetPrototypeFromConstructor(constructor, intrinsicDefaultProto).
3. Return ! OrdinaryObjectCreate(proto, internalSlotsList).

GetPrototypeFromConstructor ( constructor, intrinsicDefaultProto )

1. Assert: intrinsicDefaultProto is this specification's name of an intrinsic object. The corresponding object must be an intrinsic that is intended to be used as the [[Prototype]] value of an object.
2. Let proto be ? Get(constructor, "prototype").
3. If Type(proto) is not Object, then
  a. Let realm be ? GetFunctionRealm(constructor).
  b. Set proto to realm's intrinsic object named intrinsicDefaultProto.
4. Return proto.

The name would only be used in step 3.b of the abstraction above, but %SharedArrayBuffer%.prototype is non-configurable and non-writable, and the constructor is always the intrinsic one.

That means OrdinaryCreateFromConstructor results in OrdinaryObjectCreate(_receiverRealm_.[[Intrinsics]].%SharedArrayBuffer.prototype%, « [[ArrayBufferData]], [[ArrayBufferByteLength]] »).

I can actually shift usage from OrdinaryCreateFromConstructor to OrdinaryObjectCreate. The current draft uses it to match the slice method in SharedArrayBuffer.

Member Author

The other reason for using OrdinaryCreateFromConstructor is only if the new abstraction CloneSharedArrayBuffer becomes reusable within ECMAScript, but apparently I need to avoid this new abstraction, as it became noise here.

@leobalter
Member Author

f2f487e avoids the noise from the abstraction I previously introduced, removes unnecessary steps to clone the SharedArrayBuffer, and avoids the confusion of using an intrinsic name inside a string.

@leobalter leobalter requested a review from caridy October 29, 2021 19:22
spec.html Outdated (resolved)
spec.html Outdated
1. Set _clone_.[[ArrayBufferData]] to _value_.[[ArrayBufferData]].
1. Set _clone_.[[ArrayBufferByteLength]] to _value_.[[ArrayBufferByteLength]].
1. Return _clone_.
1. throw a TypeError exception.
Collaborator

Suggested change
1. throw a TypeError exception.
1. Throw a TypeError exception.

OrdinaryCreateFromConstructor matches other generic usages that create a new instance from a given constructor; in this case, the given steps just reduce to OrdinaryObjectCreate without any observable steps.
@Jamesernator

I intend to present this at the next TC39 meeting. I'd be fine with it being a Stage 3 change but we need to have all the delegates onboard with the motivation and proposed solution. I believe some delegates might want more time to think it through, so a follow-up proposal seems pragmatic.

Post-MVP should be fine, although it's still definitely worth raising with TC39 earlier in case it changes how implementors implement the rest of the proposal.

@annevk
Member

annevk commented Dec 9, 2021

I have two concerns:

  1. Why does this not reuse serialize and deserialize semantics? Especially as those will be defined by JS, this discrepancy could turn out to be odd in hindsight.
  2. This syntax does not really allow for the transfer semantics you might want for non-shared buffers.

@leobalter
Member Author

We identified objections to the full serialization.

The shared array buffer is the only thing necessary for resolving the use cases we identified to motivate this PR.

Keep in mind, we intend to propose this change as a follow up proposal to avoid blocking ShadowRealms.

@annevk
Member

annevk commented Dec 9, 2021

I would love to learn more, but if this becomes a separate (follow-up) proposal that goes through the stages there's no rush.

@leobalter
Member Author

@annevk sure! I'll make sure we keep in touch with you on this!

@rwaldron and @caridy might present this to the TC39 meeting next week and identify the next steps.

I understand there are some contention points to smooth out here and we plan to work through all of it.

@caridy
Collaborator

caridy commented Dec 15, 2021

I have two concerns:

  1. Why does this not reuse serialize and deserialize semantics? Especially as those will be defined by JS, this discrepancy could turn out to be odd in hindsight.

@annevk one of the big issues with serialize is that it does not support all the features of the language, e.g. Proxies. We are certainly looking at serialization, and currently looking at explicit vs. implicit serialization mechanisms, but for this to get any traction, we will need to figure out how to reform the serialize mechanism to support all features of the language.

@annevk
Member

annevk commented Dec 15, 2021

Maybe, but serialization exists today. Adding a new mechanism that does roughly the same thing, but not exactly, does not seem like a step in the right direction.

@mhofman
Member

mhofman commented Dec 15, 2021

@annevk, I believe that @caridy's point is that structured clone as specified by HTML would face objections to being exposed as-is by JavaScript. While moving the algorithm definition of serialize/deserialize into ecma262 is being considered for maintenance reasons (tc39/ecma262#2555), these operations would likely not be used anywhere in the JavaScript spec with their current definition, in particular because they allow a program to directly test whether an object is a proxy.

While I'm hopeful we can find a cloning mechanism that is both backwards compatible with structured clone and satisfies the current objections to structured clone being included in JavaScript, this is a much larger project. Because of that, we have to start somewhere that allows us to "transmit" a subset of objects through the callable boundary, either explicitly or implicitly, while not painting ourselves into a corner regarding any future serialization/clone mechanism in JavaScript.

I am personally in favor of an explicit cloning mechanism, exactly because it's unclear how an implicit mechanism could be extended to non-shareable objects. However I also admit that the only concern here is future discrepancy where some specific objects are allowed implicitly and some need explicit handling. That wouldn't be a fatal flaw, just odd as you point out.

@erights
Collaborator

erights commented Dec 15, 2021

Slight correction to what @mhofman said:

considered for inclusion in ecma262

should be "considered for moving under tc39".

Given the problems explained in the tc39/ecma262#2555 thread, I do not consider it a candidate for inclusion in ecma262. It is too inconsistent with the JavaScript language. However, tc39 should maintain it as a separate report, likely by a separate TG.

@annevk
Member

annevk commented Dec 16, 2021

@mhofman this would also face objections, for reasons stated above.

@mhofman
Member

mhofman commented Dec 16, 2021

@annevk, could you clarify exactly what would face objections?

@annevk
Member

annevk commented Dec 16, 2021

Introducing an operation to move objects across "isolated" realms that's incompatible with StructuredSerialize and StructuredDeserialize. Changing those algorithms to accommodate more use cases is an option though. Although webcompat might put limits on that.

Other than proxies I didn't see a lot in tc39/ecma262#2555 btw. E.g., where is the argument made that supports the model put forward here? (Also not entirely sure how you could realistically tackle proxies.)

@mhofman
Member

mhofman commented Dec 16, 2021

Changing those algorithms to accommodate more use cases is an option though. Although webcompat might put limits on that.

This is exactly my hope when I say I want to find a cloning mechanism that is backwards compatible with structured clone but that removes existing objections from TC39 delegates. I don't think allowing new values that currently throw would be considered a compatibility risk, but maybe I'm mistaken?

Also, a new cloning mechanism to support the ShadowRealm use case does not need to support all use cases of structured clone at first, as long as it doesn't diverge and there remains a way to build up to it in the future. For example, starting with an explicit cloning of only SharedArrayBuffer objects would be totally fine.

Also not entirely sure how you could realistically tackle proxies

What I'm thinking about right now is that the cloning mechanism would have hooks, and that it would be acceptable to require users of proxies (membranes) to use those hooks if they want their proxies to behave transparently through cloning.
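
To sketch what such a hook could look like (purely illustrative; the names `cloneHook` and `cloneValue` below are invented, and nothing of the sort is specified anywhere):

```javascript
// Hypothetical clone hook, invented for illustration only.
const cloneHook = Symbol('cloneHook');

function cloneValue(value) {
  // If the object opts in, let it produce its own cloneable stand-in.
  if (value !== null && typeof value === 'object' && cloneHook in value) {
    return value[cloneHook]();
  }
  // Otherwise fall back to whatever structural cloning is available.
  return structuredClone(value);
}

// A membrane proxy could use the hook to behave transparently:
const target = { n: 1 };
const proxy = new Proxy(target, {
  has: (t, key) => key === cloneHook || key in t,
  get: (t, key) => (key === cloneHook ? () => ({ ...t }) : t[key]),
});

cloneValue(proxy).n; // 1 (the proxy controlled its own cloning)
```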

@annevk
Member

annevk commented Dec 16, 2021

Removing exceptions (and following a different code path) is generally fine, indeed.

And yes, starting with SharedArrayBuffer might be okay, but the questions in #336 (comment) stand. With the proposal here there is divergence and it's not clear how to take it further in the future.

@mhofman
Member

mhofman commented Dec 16, 2021

If the divergence is the inability to express transfer semantics in this proposal, I entirely agree, and that is why I'm in support of an alternative explicit cloning mechanism for Shadow realms, even for SAB, to avoid creating a special case precedent. I have a draft for such an explicit proposal, but it needs updating. However the shadow realm champions have expressed concerns with added complexity resulting from this explicit cloning. It's all very early, hopefully we can find a solution that answers everyone's requirements.

@annevk
Member

annevk commented Dec 16, 2021

@leobalter pointed out the divergence in #336 (comment). (And to be clear, I'm not sure how okay I am with duplicating part of HTML's algorithms and not reusing them explicitly. That creates a lot of risk for divergence going forward. I am okay with moving those algorithms elsewhere, as also stated in the other thread.)

(@erights btw, separate document/TC sounds very much like the old Appendix B wishful thinking. I also somewhat doubt anyone is willing to sign up for it in that form, nor does it seem like an improvement over the status quo.)

@bakkot

bakkot commented Dec 21, 2021

this proposed change ignore any additional aspects of the sab objects like extra properties or subclassing

Does HTML's structured clone copy other properties of SABs? I'm almost certain it does not.

@leobalter
Member Author

@bakkot the functionality I observed for transferring a SAB through workers does copy the properties. Maybe I assumed that would come in the package for structured cloning, and I'd be happy to be corrected. I'd like to reuse the SAB internals, but I have no use for copying the object properties.

@Jamesernator

@bakkot the functionality I observed for transferring a SAB through workers does copy the properties. Maybe I assumed that would come in the package for structured cloning, and I'd be happy to be corrected. I'd like to reuse the SAB internals, but I have no use for copying the object properties.

You can try out a compliant structuredClone in Chrome or Node; it doesn't clone properties on shared array buffers:

const buffer = new SharedArrayBuffer(10);
buffer.foo = 10;
const clonedBuffer = structuredClone(buffer);
'foo' in clonedBuffer; // false
clonedBuffer.foo === undefined; // true
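
For what it's worth, even though properties are dropped, the clone does share the same underlying data block, which is the part this PR relies on (sketch; assumes an environment exposing both structuredClone and SharedArrayBuffer, such as recent Node):

```javascript
const buffer = new SharedArrayBuffer(10);
const clonedBuffer = structuredClone(buffer);

// Writes through a view on one object are visible through the other,
// because both wrap the same shared data block.
new Uint8Array(buffer)[0] = 42;
new Uint8Array(clonedBuffer)[0]; // 42
```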

@bakkot

bakkot commented Dec 21, 2021

Yeah, I see the same behavior as @Jamesernator; that's also what I see in the spec (note that the [[ArrayBufferData]] branch does not set deep to true).

For anyone else looking to play with it in the browser, SAB is available on https://first-party-test.glitch.me/coep?coep=credentialless&coop=same-origin&

@leobalter
Member Author

@annevk I discussed this with some people involved in this proposal, not limited to the champions, and I believe there is a bigger challenge right now in reusing the Structured Cloning algorithms here, as they have their own set of challenges at TC39.

I think the current PR has steps that, as observed in user land, match the HTML Spec's StructuredSerializeInternal step 13.2. I actually crafted the current PR abstraction based on it, but they look different editorially.

If TC39 eventually agrees to bring in Structured Cloning, I'm happy to reuse it. I'd say probably limiting it initially to SharedArrayBuffer, which is where our pain point remains, but I'm open to discussing the other aspects of Structured Cloning.

WDYT if the observed behavior here doesn't differ and we commit to keeping it this way? Of course, for now limiting this logic to SABs.

@leobalter
Member Author

@bakkot thanks for the clarification. I missed the deep value, and this PR seems to end up closer to that abstraction than I imagined.

@annevk
Member

annevk commented Jan 21, 2022

Our inclination is that the challenges with structured serializing/deserializing ought to be resolved before an equivalent mechanism is added elsewhere.

@ljharb
Member

ljharb commented Jan 21, 2022

Does that mean the web won’t be adding new things to structured cloning until these things can be resolved?

@leobalter
Member Author

To add clarity: we won't propose this change to the ongoing ShadowRealms proposal, but will bring it as a separate proposal following up on ShadowRealms.

As discussed in the plenary today, we are considering exploring the alternatives that reuse the Structured serialization, and so this work will be on hold through at least some Stage 3-like stage.

We can advance a few things that might not conflict, such as spinning up a new repo and closing this PR, setting up a specific explainer, and drafting a spec using the structured serialization parts, with what we estimate it would become. I understand this work might get Stage 1. For Stage 2 we need to resolve the contention with @mhofman, who is proposing an alternative API.

9 participants