So, the glitch problem: if you implement signals the naive way, where setting a state immediately reruns every computed that depends on it, that's how a lot of the previous generation of frameworks worked. When React came out, a big piece of its value was that it didn't do this; it only evaluated things when they were needed. But before that, in AngularJS and Ember, when something changed, the framework would figure out which things depended on it and eagerly update those, and that kind of becomes a mess.
In particular, if you set a signal, and you have some effect that reacts to that signal changing, you might observe intermediate states of the computation. In this case, t.get() should always be greater than seconds.get(), because t is defined as one more than seconds. But if the update traversal visits the left branch before the right one, the effect might log false, because it might see one of them updated and the other not yet updated. This image is from Wikipedia's great article about reactive programming.
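To make that concrete, here's a deliberately naive, hypothetical push-based sketch; the `state`/`subscribe` helpers are made up for illustration, not any real framework's API. Setting a state eagerly re-runs subscribers in registration order, so the effect can read a stale computed:

```js
// Naive push-based reactivity: set() eagerly runs subscribers in order.
function state(value) {
  const subscribers = [];
  return {
    get: () => value,
    set(next) {
      value = next;
      for (const run of subscribers) run(); // eager push, no ordering guarantees
    },
    subscribe(run) {
      subscribers.push(run);
    },
  };
}

const seconds = state(0);
// Model the computed "t = seconds + 1" as another state kept in sync by a subscriber.
const t = state(seconds.get() + 1);

// The effect happens to be registered before the subscriber that updates t,
// so it runs while t is still stale.
seconds.subscribe(() => {
  console.log(t.get() > seconds.get()); // should always be true...
});
seconds.subscribe(() => t.set(seconds.get() + 1));

seconds.set(1); // logs false: the effect saw t === 1 while seconds === 1
```

Because the effect was registered before the subscriber that keeps t up to date, it observes the intermediate state where seconds has changed but t has not. That's the glitch.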
So, the solution is a little bit complicated. Historically, frameworks tried to implement parts of this, and at this point everyone realizes, no, we just have to do the correct solution, which is to topologically sort the dependencies and evaluate them in a coherent order. The dependencies form a directed acyclic graph; if there's a cycle, that can cause an error when the cycle is created, so it can be detected. But you can sort the nodes so that you never run a computed before everything it depends on has already run. Pull-based evaluation is also crucial to this, because you do this sort when a get occurs. There are multiple algorithms for evaluating this. There's a blog post by Kristen, formerly of the EmberJS team, about using revision numbers. Another algorithm, from an explanation by Milo from the Solid team, is to push dirtiness forward through the graph and then re-evaluate nodes based on that. I don't have time today to go into the algorithms, but even though there are multiple algorithms, they have the same observable behavior: they run the computeds in the same order.
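Since I don't have time for the algorithms themselves, here is just a minimal, hypothetical sketch (not Solid's or Ember's actual code) of the general shape of the dirtiness-pushing approach: a write only marks the downstream subgraph dirty without running anything, and a computed re-runs lazily inside get(), so nothing can ever read it before its inputs are up to date. A revision-number scheme like the one in Kristen's post has the same observable behavior.

```js
// Push-dirty / pull-evaluate signals, as a minimal sketch.
let currentConsumer = null; // the computed currently evaluating, for dependency tracking

class State {
  #value;
  #dependents = new Set();
  constructor(value) {
    this.#value = value;
  }
  get() {
    if (currentConsumer) this.#dependents.add(currentConsumer);
    return this.#value;
  }
  set(value) {
    this.#value = value;
    // Push phase: mark the downstream subgraph dirty, but do NOT run anything.
    for (const dep of this.#dependents) dep.markDirty();
  }
}

class Computed {
  #fn;
  #value;
  #dirty = true;
  #dependents = new Set();
  constructor(fn) {
    this.#fn = fn;
  }
  markDirty() {
    if (this.#dirty) return; // already dirty, downstream already notified
    this.#dirty = true;
    for (const dep of this.#dependents) dep.markDirty();
  }
  get() {
    if (currentConsumer) this.#dependents.add(currentConsumer);
    if (this.#dirty) {
      // Pull phase: recompute now, which transitively pulls fresh input values.
      // (A real implementation would also clear stale dependencies between runs.)
      const prev = currentConsumer;
      currentConsumer = this;
      this.#value = this.#fn();
      currentConsumer = prev;
      this.#dirty = false;
    }
    return this.#value;
  }
}

// The earlier example, now glitch-free:
const seconds = new State(0);
const t = new Computed(() => seconds.get() + 1);
seconds.set(1);
console.log(t.get() > seconds.get()); // always true: t re-runs before being read
```

A real system would also schedule effects to pull after the dirty-marking phase finishes, which is exactly what keeps them from observing intermediate states.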
And ultimately, what's happened with the frameworks can be compared to carcinisation: the phenomenon where a bunch of different animals have all convergently evolved toward the body plan of a crab, because for certain lifestyles it's very effective. I think we've seen the same thing within JavaScript frameworks. It's not as if Solid invented signals and everybody copied them. Historically, different frameworks approached the problem from different sides, saw different pieces of it, influenced each other directly, and also independently solved the same problem and arrived at the same place.
So, when we at Bloomberg think about how we want to construct our front end and what kind of framework to use: so far, our software development has been largely based on standards. As I said, the core terminal is based on Chromium, so HTML, CSS, and JavaScript. Also, for software supply chain security, to detect and remediate outdated pieces of software, we use OWASP CycloneDX SBOMs, which is standardized in Ecma as well, just like JavaScript.