If random encounters go down as CPUs get faster, my CPU is so much faster than one from the 90s that my random encounters should approach zero, but I had plenty.
I mean, some napkin math and averages would tell me that your base clock speed is roughly 8 times faster than the fastest computers they would have tested on. Is 8 times faster truly enough to bring the random event rate to “near zero”? Probably not. And with an old game like this it’s not as easy as just comparing clock speeds, because it depends on which CPU you have. Do you have E-cores? If so, is your computer scheduling the game on those or on your P-cores? And in either case, is it running at base clock speed or boost clock speed? How do your drivers fit into all this?
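Just to spell that napkin math out (the ~500 MHz and ~4 GHz figures are rough ballpark assumptions, not measurements):

```python
# Napkin math: ratio of a modern base clock to a late-90s high-end clock.
# Both figures are rough assumptions for illustration, not measured values.
old_clock_hz = 500e6   # ~500 MHz, roughly the top end of that era
new_clock_hz = 4.0e9   # ~4 GHz base clock on a typical modern desktop CPU

ratio = new_clock_hz / old_clock_hz
print(f"Modern base clock is roughly {ratio:.0f}x the old top-end clock")  # ~8x
```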
There’s also the fact that while the encounter rate is tied to CPU speed, it’s not a 1:1 relationship either. The encounter system also factors in tiles and in-game days.
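To make the “tied to CPU speed but not 1:1” point concrete, here’s a purely made-up toy model. None of the numbers or mechanics come from the actual game; the frame-rate coupling, the roll interval, and the modifiers are all assumptions for illustration only:

```python
import random

# Hypothetical model -- NOT the real Fallout encounter code.
# Idea: the world map advances a fixed distance per rendered frame, so a
# slower machine (fewer frames per second) spends more real time per tile.
# If the encounter roll fires on a real-time interval, slower machines get
# more rolls per tile -- but the chance per roll still depends on the tile's
# terrain and the in-game day, so CPU speed alone doesn't set the rate.

TILE_DISTANCE = 1.0          # map distance per tile
DISTANCE_PER_FRAME = 0.02    # fixed distance advanced each frame (assumption)
ROLL_INTERVAL_S = 1.0        # one encounter roll per real-time second (assumption)

def encounters_for_trip(fps, tiles, terrain_chance, day_modifier, rng):
    """Simulate one trip of `tiles` map tiles at a given frame rate."""
    encounters = 0
    time_since_roll = 0.0
    frames_needed = int(tiles * TILE_DISTANCE / DISTANCE_PER_FRAME)
    for _ in range(frames_needed):
        time_since_roll += 1.0 / fps
        while time_since_roll >= ROLL_INTERVAL_S:
            time_since_roll -= ROLL_INTERVAL_S
            if rng.random() < terrain_chance * day_modifier:
                encounters += 1
    return encounters

rng = random.Random(0)
# Same 20-tile trip, same terrain and in-game day, different frame rates:
for fps in (15, 60, 240):
    n = encounters_for_trip(fps, tiles=20, terrain_chance=0.02,
                            day_modifier=1.5, rng=rng)
    print(f"{fps:4d} fps -> {n} encounters")
```

In this toy version the slower machine sees noticeably more encounters over the same trip, but the exact rate is still steered by the terrain chance and the day modifier rather than by frame rate alone.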
that they built and tested the game on higher-end machines than many of their customers had, and that faster CPUs resulted in the correct encounter rate while slower CPUs resulted in dozens of encounters.
Like I’ve already said, they accounted for lower CPU clocks at the time. They designed the encounter rate for clock speeds between 200 MHz and 450-500 MHz, the whole range for the time. You’re also acting like Fallout 1 wasn’t a cheap side project half made for free by people working off the clock. It wasn’t some big-budget release. Or as if Fallout 2 wasn’t an incredibly rushed game shoved out the door by a financially failing company.
I’d sooner believe that the game working differently at different clock rates was an oversight than how they intended it to work.
It was neither. It was simply an engine limitation they had to account for as best they could, because the first two games were functionally just tabletop RPGs under the hood: they ran on a modified version of GURPS and relied on dice rolls for practically everything.
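For anyone unfamiliar with what that kind of tabletop-style resolution looks like, here’s the generic GURPS-style check being alluded to (a 3d6 roll-under test; this is the generic tabletop mechanic, not code from the Fallout engine):

```python
import random

# Generic GURPS-style success check: roll 3d6 and succeed if the total is
# at or under the character's effective skill. Illustrative only -- not
# taken from the Fallout engine.
def skill_check(effective_skill, rng=random):
    roll = sum(rng.randint(1, 6) for _ in range(3))
    return roll <= effective_skill

print(skill_check(12))  # roughly a 74% chance of success at skill 12
```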