Replies: 7 comments
-
Quite a mystery! @syg Could you help us figure this out? Thanks.
-
Played around with logs and source here d8-logs.zip, tested against this source:

```js
if (arguments[0] === 'freeze') {
  console.log('freezing')
  Object.freeze(Uint8Array.prototype);
  console.log('frozen')
}
console.log('start')
let count = 0;
const run = () => {
  // throw away the assumption that a specific buffer size is optimized by adding count
  const x = new Uint8Array(10000 + count);
  for (let i = 0; i < x.length; i++) {
    x[i] = 1;
  }
  count++;
};
const runtime = 10 * 1000;
const experiment = () => {
  const start = Date.now();
  let now = Date.now();
  // run for a fixed time, count how many runs
  while (now - start < runtime) {
    run();
    now = Date.now();
  }
};
experiment();
console.log(count);
console.log('done')
```

There are some key differences in the pretty graphs but I don't know what to do with that information 🐈
-
I have reported the behavior here: https://bugs.chromium.org/p/chromium/issues/detail?id=1463387

Also, a fresh Node.js v20 shows different opt logs.

Fast case:
Slow case:
-
@naugtur If we conclude that using a Proxy over all TypedArrays is a net win for performance, it would make sense for SES to tame all the TypedArrays in this way, at least within compartments, and also implement |
Beta Was this translation helpful? Give feedback.
-
@kriskowal I removed my comment before you replied because upon further investigation I was not successful with a proxy. The only times when I got any improvements from using the proxy were ones where I broke something and got length to default to zero.
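For context, the kind of breakage described (length defaulting to zero) can happen when a Proxy handler only handles numeric indices. This is a hypothetical illustration of the pitfall, not the actual code from the removed comment:

```js
// Hypothetical illustration: a get trap that only forwards numeric indices
// makes `length` come back undefined, so loops like
// `for (let i = 0; i < arr.length; i++)` run zero iterations --
// which looks like a huge "speedup" in a benchmark.
const ta = new Uint8Array(100);
const broken = new Proxy(ta, {
  get(target, prop) {
    const index = Number(prop);
    if (Number.isInteger(index)) return target[index];
    return undefined; // bug: swallows `length` (and every other property)
  },
});

let iterations = 0;
for (let i = 0; i < broken.length; i++) iterations++;
// iterations stays 0 because broken.length is undefined
```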
-
If this did work, it would have made things much better on v8 but much worse everywhere else. Thus, if we did make this into a SES repair, it should be conditional on the target being v8. We already make some of our Error taming conditional on v8, so this would not be a new precedent. The key thing is that the observable semantics after repair be as similar as possible across platforms, where performance differences are not considered observable.
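As an illustration only, a minimal sketch of how such a conditional repair might be gated; the `isV8` heuristic and `maybeRepairTypedArrays` name are my assumptions, not an agreed SES mechanism:

```js
// Hypothetical sketch: only apply the repair when the engine looks like V8.
// Error.captureStackTrace is a V8-specific extension, so its presence is a
// common (if imperfect) heuristic for detecting V8-based hosts.
const isV8 = typeof Error.captureStackTrace === 'function';

function maybeRepairTypedArrays(repair) {
  if (isV8) {
    repair(); // apply the V8-specific taming
  }
  // On other engines, leave TypedArrays untouched so the observable
  // semantics stay as close as possible across platforms.
}
```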
-
At this point, having tried a few things, I would not recommend putting it in SES because I can't see a way around the problem without tradeoffs in logic or semantics. It could work as a vetted shim for specific use cases, where it could override DataView to return an Array or Array-based proxy and internally map the changes onto a typed array. I think we may get a fix from Chromium this time, now that we were able to pinpoint the exact misbehavior: labeling a function that uses a frozen typed array as megamorphic.
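A rough sketch of what such a vetted shim could look like; `makeShimmedBytes` is a hypothetical name and this is one possible shape of the idea, not a tested implementation:

```js
// Hypothetical vetted-shim sketch: hand out a Proxy over a plain Array while
// mirroring every indexed write into a backing Uint8Array, so consumers never
// touch the (frozen, deopt-prone) TypedArray prototype directly.
function makeShimmedBytes(length) {
  const backing = new Uint8Array(length);
  const facade = new Array(length).fill(0);
  return new Proxy(facade, {
    set(target, prop, value) {
      const index = Number(prop);
      if (Number.isInteger(index) && index >= 0 && index < backing.length) {
        backing[index] = value;          // the TypedArray write wraps to byte range
        target[index] = backing[index];  // keep the facade in sync
      } else {
        target[prop] = value;            // non-index properties pass through
      }
      return true;
    },
  });
}
```

Usage would look like `const bytes = makeShimmedBytes(4); bytes[0] = 300;`, after which `bytes[0]` reads back as 44, matching Uint8Array's modulo-256 behavior.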
-
Consider the following experiment setup:
Results:
It looks like warming up the typed array before lockdown helps prevent the performance drop. Let's explore the warmup.
None of the above! :D
See the modified experiment
So it's beginning to look like using a frozen TypedArray in a function causes V8 to bail out of optimizing it, but if it's already been optimized, it's fine.
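A minimal sketch of the warmup-before-freeze ordering described above; the names, sizes, and iteration counts are arbitrary, and this only illustrates the order of operations, not a rigorous benchmark:

```js
// Warm up the hot function first so V8 has a chance to optimize it,
// then freeze the prototype, then keep calling it. Per the observation
// above, already-optimized code keeps running fast after the freeze.
const buf = new Uint8Array(10000);
const fill = () => {
  for (let i = 0; i < buf.length; i++) {
    buf[i] = 1;
  }
};

for (let i = 0; i < 10000; i++) fill(); // warmup: let the optimizer kick in
Object.freeze(Uint8Array.prototype);    // lockdown-style freeze afterwards
fill();                                 // observed to stay fast
```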
If I add `--trace-opt --trace-deopt` to node arguments, I get:

We can see 2 failed attempts to optimize the `run` function.

Note that the first attempt in the second run is using TurboFan OSR - a different kind of compilation attempt.
This is a minimal sample that triggers a repro in Node.js:
I've generated 1MB of TurboFan trace output for the 2 variants and am trying to find any hints as to how V8 decides what to do.
To be continued.