If we were to run this code, the browser would be like, what the heck is this thing called document? I have no idea what that is. And third-party scripts are often written with DOM operations just like this. These operations must be able to function without any issues, since we cannot modify how the third-party scripts were initially written. Thus, it becomes necessary to find a solution that allows the scripts to run seamlessly without interfering with the performance of the main UI thread.
We recognize that third-party scripts access global objects that are only available on the main thread, not on the worker thread. To bridge that gap, Partytown uses JavaScript Proxies to intercept any DOM operations made in the Web Worker and forward them to the main thread. By doing so, Partytown allows a Web Worker to indirectly access the main thread's window and document objects. The proxy acts as middleware: it intercepts the DOM-related requests made by the third-party scripts running in the Web Worker and forwards them to the main thread. The main thread processes these requests and returns the corresponding values back to the worker thread. This approach lets Partytown maintain a seamless, consistent experience for third-party script code, regardless of whether it's running on the main thread or the worker thread.
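To make the interception idea concrete, here's a minimal sketch of how a Proxy can stand in for the worker's document object. The names `forwardToMainThread` and `documentProxy` are hypothetical stand-ins: instead of actually messaging the main thread, this sketch just records each intercepted operation and returns a placeholder value.

```javascript
// Hypothetical stand-in for the worker-side document proxy.
// In a real setup, forwardToMainThread would relay the operation to the
// main thread and return the real value; here it just records the call.
const forwarded = [];

function forwardToMainThread(op, prop) {
  forwarded.push({ op, prop });
  return `main-thread value for "${prop}"`; // simulated reply
}

const documentProxy = new Proxy({}, {
  // Intercept property reads like document.title
  get(target, prop) {
    return forwardToMainThread('get', String(prop));
  },
  // Intercept property writes like document.cookie = '...'
  set(target, prop, value) {
    forwardToMainThread('set', String(prop));
    return true; // report success to the caller
  },
});

// Third-party code uses it like a normal document object:
const title = documentProxy.title;
documentProxy.cookie = 'theme=dark';
console.log(title);            // main-thread value for "title"
console.log(forwarded.length); // 2
```

The key point is that the third-party code never knows it's talking to a Proxy: every read and write goes through the traps, which is where the forwarding to the main thread can happen.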
Now, as previously mentioned, communication between the main thread and the Web Worker is asynchronous, because the threads run in separate execution contexts. This poses a challenge for Partytown, especially when it comes to running third-party scripts. Third-party scripts are often written with the assumption that they're running on the main thread and, as such, they heavily rely on synchronous operations: accessing the DOM, modifying styles, and reading cookies, among many other things. For Partytown to work effectively, it needs to ensure that these operations behave exactly as if they were executed on the main thread. Instead of trying to replicate the entire DOM in the Web Worker, Partytown proxies any DOM operation to the main thread, executes the command there, and returns the value back to the Web Worker. This allows third-party scripts to run without any modification and with the same level of performance as they would have on the main thread. By doing so, Partytown ensures that third-party scripts can work despite the asynchronous nature of communication between the threads.
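Here's a small sketch of why plain message passing can't serve a synchronous getter. The function `requestFromMainThread` is a hypothetical stand-in for a postMessage round trip: the reply can only arrive on a later turn of the event loop, so the best it can do is return a Promise, while the caller expects a plain string.

```javascript
// Hypothetical stand-in for a postMessage + listener round trip:
// the reply arrives asynchronously, so all we can return is a Promise.
function requestFromMainThread(prop) {
  return new Promise((resolve) => {
    // Simulated main-thread reply, delivered on a later event-loop turn.
    setTimeout(() => resolve(`value of ${prop}`), 0);
  });
}

// A getter must return immediately, so the Promise itself leaks out:
const fakeDocument = {
  get title() {
    return requestFromMainThread('title'); // a Promise, not a string!
  },
};

const result = fakeDocument.title;
console.log(typeof result.then); // "function" — the caller got a Promise
```

A third-party script doing `document.title.toUpperCase()` would break here, because `result` is a Promise rather than the string it expects. That's the gap the synchronous XHR trick described next closes.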
So here's a straightforward example of third-party script code reading document.title. When the script calls document.title, it's a blocking call, and the getter is expected to return a value. It's not expected to return a promise; it must return the string value of document.title. This is actually the biggest challenge that Partytown has to solve. As we talked about earlier, no matter what we do, sending messages between the main thread and the worker thread is an asynchronous task: we have to use the postMessage API and listen for those messages on the opposite thread, which is entirely asynchronous. The service worker is the other important piece of the puzzle, and it's what allows the web worker to communicate with the main thread synchronously. In this diagram, the worker thread calls a getter expecting a value, but our proxy fires off a synchronous XHR request, which is intercepted by the service worker so that it can talk to the main thread. In the end, the synchronous XHR request returns the getter value. And as far as the third-party script running in the web worker is concerned, this was one synchronous, blocking task.
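The round trip above can be sketched as follows. This is a simulation, not the real implementation: `syncRequest` stands in for a synchronous XMLHttpRequest (opened with `async: false`) that a service worker would intercept and answer with the main thread's value. Here it returns a canned string immediately, so the sketch runs anywhere.

```javascript
// Hypothetical stand-in for the blocking round trip. In the browser this
// would be roughly:
//   const xhr = new XMLHttpRequest();
//   xhr.open('GET', '/proxy?prop=' + prop, false); // false = synchronous
//   xhr.send(); // service worker intercepts and replies with the value
//   return xhr.responseText;
function syncRequest(prop) {
  return `main thread says: ${prop}`; // simulated service-worker reply
}

const workerDocument = new Proxy({}, {
  get(target, prop) {
    return syncRequest(String(prop)); // blocks until the value arrives
  },
});

// From the third-party script's point of view, this is one ordinary,
// synchronous property read that yields a plain string:
const workerTitle = workerDocument.title;
console.log(workerTitle); // main thread says: title
```

The design choice worth noting is that synchronous XHR, normally discouraged because it blocks, is exactly what's wanted here: blocking the worker thread is harmless to the UI, and it's what makes the getter look synchronous to the third-party script.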