I don’t mean BETTER. That’s a different conversation. I mean cooler.
An old CRT display was literally a small-scale particle accelerator, firing angry electron beams at a sizable fraction of light speed toward the viewer, bent by electromagnets alternating at ultra high frequencies, stopped by a rounded rectangle of glowing phosphors.
If a CRT goes bad, it can actually make people sick.
That’s just. Conceptually a lot COOLER than a modern LED panel, which really is just a bajillion very tiny lightbulbs.
But that’s not a thing for Intel CPUs either, at least not anymore.
I’m not sure why, but Nvidia hasn’t been making chipsets/motherboards for quite a while. Or was there a point in time when it only made chipsets for Intel CPUs?