cap on encounter rates, why do they all appear to be at about the rate I experienced?
Well it’s clearly not a cap if you’re seeing people having more frequent encounters than you are.
And why would we not assume that that cap was the intended design?
Because they tied the encounter system to CPU frequency, and the highest consumer CPU frequency at the time was around 500 MHz. Why on earth would you assume the developers designed the rate not around what hardware was capable of at the time, but around what would be capable 15 years later?
You’re suggesting that the developers got into a room together and said “Let’s design this so that it won’t play the way we intend for it to be played until 15 years pass”
By cap, I mean lower bound. I see random encounters. If random encounters go down as CPUs get faster, my CPU is so much faster than one from the 90s that my random encounters should approach zero, but I had plenty. I just didn’t have what that person experienced where it felt like too many. In fact, it felt so right to me that I didn’t question that anything might be wrong, but I would if I saw dozens. You’re right: there’s no way they could foresee how fast my CPU would be in 2024 or 2013/2014, so how would their logic still output what feels like an acceptable encounter rate that matches other games in the genre by accident?
You’re suggesting that the developers got into a room together and said “Let’s design this so that it won’t play the way we intend for it to be played until 15 years pass”
What would make sense to me, based on how those games played for me (and feel free to contradict me with an interview or some other evidence), is that they built and tested the game on higher-end machines than many of their customers had, and that faster CPUs resulted in the correct encounter rate while slower CPUs resulted in dozens. I’d sooner believe that the game working differently at different clock rates was an oversight rather than how they intended for it to work. Then again, the person in that Reddit thread is playing the same GOG version I did and still reproduced that higher encounter rate.
If random encounters go down as CPUs get faster, my CPU is so much faster than one from the 90s that my random encounters should approach zero, but I had plenty.
I mean, some napkin math and averages would tell me that your base clock speed is roughly 8 times faster than the fastest computers they would have tested on. Is 8 times faster truly enough to bring the random event rate to “near zero”? Probably not. And with an old game like this it’s not as easy as just comparing clock speeds, because it depends on which CPU you have. Do you have E-cores? If so, is your computer scheduling the game on those or on your P-cores? And in either case, is it using base clock speed or boost clock speed? How do your drivers fit into all this?
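To put rough numbers on that napkin math (every figure here is made up for illustration, not measured from the game): even if the encounter rate fell perfectly inversely with clock speed, an 8x speedup only divides the rate by 8, which is a lower rate but nowhere near zero.

```python
# Illustrative arithmetic only -- the clock speeds and encounter
# counts below are invented to mirror the napkin math above.
old_clock_mhz = 500    # roughly the fastest consumer CPU of the era
new_clock_mhz = 4000   # ballpark modern base clock

speedup = new_clock_mhz / old_clock_mhz

# Even under the strongest assumption -- that the rate scales
# perfectly inversely with clock speed -- an 8x speedup just
# divides a hypothetical baseline by 8:
encounters_per_hour_then = 20            # made-up baseline
encounters_per_hour_now = encounters_per_hour_then / speedup

print(speedup, encounters_per_hour_now)  # 8.0 2.5
```

So even the most aggressive inverse-scaling assumption still leaves a couple of encounters per hour, which is "plenty," not "approaching zero."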
There’s also the fact that while the encounter rate is tied to CPU speed, it’s not a 1:1 relationship either. The encounter system also factors in tiles and in-game days.
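As a sketch of how a rate can factor in tiles and in-game days rather than being a direct function of the clock — none of these names, numbers, or formulas come from the actual game, they're purely hypothetical:

```python
import random

# Hypothetical per-tile encounter chances by terrain type; the real
# game's tables and formula are not reproduced here.
TERRAIN_CHANCE = {"desert": 0.10, "mountain": 0.20, "city": 0.05}

def encounter_check(terrain: str, game_days: float,
                    rng: random.Random) -> bool:
    """Roll once per world-map tile crossed.

    Because the roll happens per tile, CPU speed only changes how fast
    tiles are crossed in real time, not the chance on any given tile.
    """
    chance = TERRAIN_CHANCE[terrain]
    # Example in-game-days factor: danger ramps up slowly, capped at 2x.
    chance *= min(1.0 + game_days / 100.0, 2.0)
    return rng.random() < chance

rng = random.Random(1)
# On in-game day 30 in "desert", the effective chance is 0.10 * 1.3,
# so about 130 of 1000 tile crossings should trigger an encounter.
trip = sum(encounter_check("desert", 30.0, rng) for _ in range(1000))
print(trip)
```

Under a design like this, a faster CPU that crosses tiles faster produces more encounters per real-time minute while the count per journey stays the same — which is one way both sides of this thread could be describing real behavior.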
that they built and tested the game on higher end machines than many of their customers had, and that faster CPUs resulted in the correct encounter rate while slower CPUs resulted in dozens.
Like I’ve already said, they accounted for the lower CPU clocks of the time. They designed the encounter rate for clock speeds between 200 MHz and 450-500 MHz, the whole range for the era. You’re also acting like Fallout 1 wasn’t a cheap side project half made for free by people working off company hours. It wasn’t some big-budget release. Or as if Fallout 2 wasn’t an incredibly rushed game shoved out the door by a financially failing company.
I’d sooner believe that the game working differently at different clock rates was an oversight rather than how they intended for it to work.
It was neither. It was simply an engine limitation they had to account for as best they could, because the first two games were functionally just tabletop RPGs under the hood that ran on a modified version of GURPS and relied on dice rolls for practically everything.
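For what “dice rolls for practically everything” looks like in practice, here is a generic tabletop-style percentile check — purely illustrative, not the actual formula from either game:

```python
import random

def percentile_check(chance: int, rng: random.Random) -> bool:
    """Classic tabletop-style check: roll 1-100 and succeed if the
    roll is at or under `chance`. Illustrative, not the game's code."""
    return rng.randint(1, 100) <= chance

rng = random.Random(0)
# At a 25% chance, about a quarter of 10,000 rolls should succeed.
hits = sum(percentile_check(25, rng) for _ in range(10_000))
print(hits / 10_000)
```

An engine built on checks like this doesn't care how fast the CPU is — each roll is independent — so the rate problem only appears in whatever loop decides *how often* those rolls happen.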