doom turned thirty: typos

Wouter Groeneveld 2023-12-11 10:40:20 +01:00
parent 143c5c5b3d
commit e001e1948b
1 changed file with 1 addition and 1 deletion

@@ -33,7 +33,7 @@ My 486-66 MHz has `8 MB` RAM and a Cirrus Logic VLB VGA card with a whopping `1
- From early dial-up modems to broadband WiFi speeds;
- ...
-It's just crazy what can happen in "just" thirty years in the computing world. DOOM seems to perfectly exemplify the technological revolution of the last three decades, and I do wonder what's next for the coming three ones. In 1993, I was eight years old, so I grew up with that old stuff---and, thankfully, without the existence of a smartphone permanently connected to 4G. Since we have our daughter, we do worry more about the future of tech. When she's eight, I the unified Mac M5 chip might be the standard, and the above DOOM Eternal screenshot will probably look even more hyper-realistic. She'll grow up with that, like I grew up with the pixelated first screenshot. What will be in store for her for the coming thirty years?
+It's just crazy what can happen in "just" thirty years in the computing world. DOOM seems to perfectly exemplify the technological revolution of the last three decades, and I do wonder what's next for the coming three ones. In 1993, I was eight years old, so I grew up with that old stuff---and, thankfully, without the existence of a smartphone permanently connected to 4G. Since we have our daughter, we do worry more about the future of tech. When she will be eight, the unified Mac M5 chip coupled with a GeForce GTX 9999 might be the standard, and the above DOOM Eternal screenshot will probably look even more hyper-realistic. She'll grow up with that, like I grew up with the pixelated first screenshot. What will be in store for her for the coming thirty years?
I hope she'll be able to enjoy most of the enhancements as much as I did when I grew up, but I highly doubt she will. Not because of a lack of interest---that's her choice to make, not mine---but because of the ever-increasing complexity in these hardware and software systems. I can understand how a VGA signal works when you give me a schematic and I'll probably be able to program something for it. I can understand single-threaded CPU architectures and can probably write assembly or an emulator for it. But I have a lot of trouble understanding the internals of the digital 4K HDMI/USB-C output port, and even if you give me three months, I will never grasp even the basics of what's under the hood of modern CPU chips. That's a shame.