
onReady() has significant performance issues #192

Closed
dominikjasek opened this issue Mar 28, 2023 · 13 comments

@dominikjasek

According to your docs, we should delay rendering the application with the following code:

optimizelyClient.onReady().then(() => {
  setIsDone(true);
  isClientValid() && setIsClientReady(true);
});

This causes a significant performance issue in our application: it takes ~1s to initialize Optimizely this way. Do you have any recommendations to improve this? Maybe some caching strategy in localStorage or anything else?
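
Something like the following is what I have in mind (just a sketch; I haven't verified that the datafile option and getOptimizelyConfig().getDatafile() behave this way when sdkKey is also set):

import { createInstance } from "@optimizely/react-sdk";

// Idea: seed the client with the last known datafile from localStorage so
// onReady() can resolve from the cached copy while a fresh one is fetched.
const CACHE_KEY = "optimizely-datafile"; // arbitrary key, just for illustration

const cachedDatafile = window.localStorage.getItem(CACHE_KEY);

const optimizelyClient = createInstance({
  sdkKey: "<sdk_key>",
  ...(cachedDatafile ? { datafile: cachedDatafile } : {}),
});

// Once the client is ready, refresh the cache with the latest datafile
optimizelyClient.onReady().then(() => {
  const config = optimizelyClient.getOptimizelyConfig();
  if (config) {
    window.localStorage.setItem(CACHE_KEY, config.getDatafile());
  }
});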

Thanks

@mikechu-optimizely
Contributor

I've created an internal bug ticket (FSSDK-9623) to review and test the contributing factors, so we can resolve this Promise faster.

@teichmaa

Hello @mikechu-optimizely, any updates on this?

We rely heavily on Optimizely and need to load it before rendering our application, but its performance overhead is significant and is becoming a real challenge. For our team, loading speed is currently the main reason we are considering Optimizely alternatives.

Thanks

@mikechu-optimizely
Contributor

Hi @teichmaa and @dominikjasek, we're in the process of releasing JS SDK v5.0.0, followed by React SDK v3.0.0, which removes the need to wait for a userContext during instantiation. This is a significant departure from previous versions.

To accomplish this, we're establishing a visitor user ID (VUID) and using that as the identifier until the userContext (with a user ID) is established.

For the sake of continuity, we still recommend awaiting onReady() to resolve, but as of PR #229 I removed the userPromise in client.ts.

The associated ticket above will help me revisit this issue and put benchmarks around instantiation and readiness.

Please keep the feedback coming. I appreciate the collab and the reports.

@teichmaa

This is good news, thank you! Looking forward to the public release of JS SDK v5.0.0!

@mikechu-optimizely
Contributor

The JS SDK just released. I have 1-2 PRs to do to get React out the door. I'll also see if I can run some manual e2e tests locally before the v3 release to spot problems.

@mikechu-optimizely
Contributor

We just released v3.0.0 of the SDK and would like feedback when you have a chance.

@mikechu-optimizely
Contributor

Update: I'm chasing down a performance report in the underlying JS SDK. I want to make sure that is solved before coming back to the onReady work.

@mikechu-optimizely
Contributor

We'll soon be publishing JS SDK 5.0.1 to address some memory allocation reports. Once that's done, I'll run load/performance tests (ClinicJS) and then publish React 3.0.1.

@mikechu-optimizely
Contributor

JS SDK 5.0.1 is out. Today I'm pulling it into this SDK and beginning load testing.

@mikechu-optimizely
Contributor

React 3.0.1 is out. I'm working to diagnose the onReady() delay when AAT (Advanced Audience Targeting) is enabled.

Does explicitly turning off AAT help as a workaround?

const optimizelyClient = createInstance({
  sdkKey: "<sdk_key>",
  odpOptions: {
    disabled: true, // Not using Advanced Audience Targeting
  },
})

@mikechu-optimizely
Contributor

I'm testing this idea a bit more, but let's try putting the onReadyPromise into a useEffect:


import { useEffect, useState } from "react"
import { createInstance } from "@optimizely/react-sdk"

// Client and promise are created once at module scope,
// so onReadyPromise keeps the same identity across renders
const optimizelyClient = createInstance({
  sdkKey: "<sdk-key>",
})
const onReadyPromise = optimizelyClient.onReady()

export default function App() {
  const [isDone, setIsDone] = useState(false)

  useEffect(() => {
    setIsDone(true)
  }, [onReadyPromise])
// ...

I'm continuing to dig further down the stack (likely into the JS SDK) to see how to improve the onReady() time.

@mikechu-optimizely
Contributor

mikechu-optimizely commented Mar 6, 2024

My test cases of putting onReadyPromise into the dependency array of useEffect do seem more performant, but that's just in my trivial test app. I'd love to hear real-world results if you're available.

I've jumped down into the JS SDK (v5.1.0) now and begun setting up clinicJS to thrash and measure onReady to profile for bottlenecks...
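
If anyone wants to reproduce the measurement, this is roughly the kind of harness I mean, run under ClinicJS (the SDK key and iteration count are placeholders; assumes the Node build of @optimizely/optimizely-sdk v5):

// measure-onready.mjs -- run under e.g. `clinic flame -- node measure-onready.mjs`
import { createInstance } from "@optimizely/optimizely-sdk";

const ITERATIONS = 20; // placeholder

for (let i = 0; i < ITERATIONS; i++) {
  const start = performance.now();
  const client = createInstance({ sdkKey: "<sdk-key>" });
  await client.onReady();
  console.log(`onReady resolved in ${(performance.now() - start).toFixed(1)} ms`);
  await client.close();
}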

@mikechu-optimizely
Contributor

One of the things I'm noticing echoes an Edge deployment of the JS SDK. In this scenario, we typically advise that a local datafile be available as close as possible to createInstance(). For example, a local .json file that is updated via an external process, or JSON stored in an Edge KV store.
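
As a rough sketch of that setup, assuming the Node build of the JS SDK and a datafile that an external process keeps fresh on disk (the file path is a placeholder):

import { readFileSync } from "node:fs";
import { createInstance } from "@optimizely/optimizely-sdk";

// The datafile is refreshed by something outside the SDK (cron, deploy hook, KV sync)
const datafile = JSON.parse(readFileSync("./optimizely-datafile.json", "utf8"));

// With the datafile supplied up front, onReady() doesn't have to wait on the CDN
const optimizelyClient = createInstance({ datafile });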

I'm seeing (in a Node context) that receiving and parsing the response from our CDN is actually the most latent portion of the flame chart, at around 5.2% of the primary app (excluding dependencies, Node processes, etc.).

[flame chart screenshot]

The next longest process is parsing the JSON to establish the experiment setup.

If I bring the same JSON local, reading it from disk, the longest delay occurs when parsing the project JSON to establish the experiment setup, at about 0.9% of the primary app (excluding dependencies, Node processes, etc.).

[flame chart screenshot]

The next longest process is user bucketing.

Actions to reduce onReady latency in performance-sensitive situations:

  1. Bring the datafile close to the execution, preferably in memory or on local disk, and update it outside the SDK (see the sketch above).
  2. To continue using the SDK's polling for datafile updates, ensure QoS to the Optimizely CDNs.
  3. Reduce the size of the JSON datafile by splitting experiments into different projects to reduce parsing time.
  4. For React, try putting onReadyPromise into the dependency array of a useEffect as illustrated yesterday ☝️

We could explore ways to optimize the datafile structure and JSON parsing, but these have deeper implications for this SDK and the supporting Experimentation platform as a whole. I think we have a roadmap item for this.

Any additional evidence the community has is greatly appreciated as points of comparison. I'll hold the investigation open for a bit.
