wait holy crap what? you can overclock CRT monitors?
Yes. Well, the older ones anyway, before they got full digital phase locking.
I had a dumb but rather high-tech 15-inch CRT, surprisingly capable for something from 1994. No smart logic though, just a few relays you had to trigger with certain frequencies. No error messages, no safety checks; the thing either accepted the signal or it didn't. Or it could explode, that was the fun in trying, and yeah, it actually worked and didn't explode!
How did I do it? Well, by then I had an XP system and an Nvidia GeForce FX 5200. Wait, this was in late 2005 or early 2006, come to think of it…
Anyway, I checked all the details of the monitor's supported frequencies, both horizontal and vertical, and worked out the maximum resolution by compromising on framerate: I used 25 Hz interlaced to achieve it, set up with the Nvidia control panel of the time…
man that’s dope! never got to play around with CRTs because I was born too late. maybe for the better 'cause i’d probably instantly electrocute myself
Probably so, I got zapped quite a few times myself.
That’s probably what’s wrong with me ain’t it? 🤔
Wait, what happens when you overclock a crt and for why?
It was officially rated for a max of 1280x1024 at 50 Hz progressive. The more common resolution of choice back then was 1024x768 at 60 Hz.
Well, I wanted a much higher resolution than that, so by absolutely maxing out the horizontal output transistor frequency to 64 kHz and doing some quick number crunching, I was able to make a custom display mode of 2048x1536 at 25 Hz interlaced.
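The number crunching behind a mode like that can be sketched in a few lines. The vertical blanking figure below is an assumption (the exact timings aren't given here), but it shows the shape of the tradeoff: at the rated 50 Hz a 1536-line frame would need a line rate well past a 64 kHz horizontal limit, while dropping to a 25 Hz frame rate (50 Hz interlaced fields) brings it back within reach:

```python
# Rough feasibility check for a custom CRT mode, the kind of number
# crunching you'd do by hand before typing timings into the driver.
# The 50-line vertical blanking figure is an assumption, not the
# actual timing used.

def horizontal_freq_hz(v_active, frame_rate_hz, v_blank_lines=50):
    """Scan lines the monitor must draw per second.

    Interlacing doubles the field rate but halves the lines per
    field, so the line rate depends only on the frame rate and the
    total line count per frame.
    """
    v_total = v_active + v_blank_lines  # visible lines + blanking
    return frame_rate_hz * v_total

HOT_LIMIT_HZ = 64_000  # horizontal output transistor rating

# Rated max mode: 1280x1024 @ 50 Hz progressive
print(horizontal_freq_hz(1024, 50))  # 53700 -> ~53.7 kHz, in spec

# 2048x1536 @ 50 Hz progressive would blow past the limit...
print(horizontal_freq_hz(1536, 50))  # 79300 -> ~79.3 kHz, too fast

# ...but at a 25 Hz frame rate (50 Hz interlaced fields) it fits
print(horizontal_freq_hz(1536, 25))  # 39650 -> ~39.7 kHz
```

Real blanking intervals shift the numbers a bit, but the relationship holds: line rate scales with frame rate times total lines per frame.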
Although the vertical refresh rate got both cut in half and switched to interlaced, it still absolutely qualifies as overclocking, because the horizontal output transistor (HOT) is basically the most stressed non-high-voltage component in any CRT monitor.
Running the HOT at the bleeding edge of the manufacturer's 64 kHz frequency rating could and would indeed burn it out prematurely if run like that too long, especially without additional cooling, so I didn't run it that way for long; I just wanted to see if it was even possible.
The monitor itself was from 1994, so it was effectively all analog: no digital on-screen menus, no signal checking, and no on-screen error message to say boohoo, video mode not supported. The monitor just tuned itself to the signal and frequencies I calculated and arranged for it.
It did, however, have quite a few analog image-transform buttons on its front panel, for things like trapezoid and shear distortion, raster rotation, corner bowing, etc. Lots of things most monitors from 1994 didn't have, which meant the CRT yoke probably had twice as many deflection coils as a regular consumer CRT.
Not bad for a monitor from 1994 that probably never saw anything over 1024x768 in practical use before I acquired it. I got literally four times the pixels of the typical desktop resolution of the time.
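The "four times the pixels" figure checks out arithmetically, and comparing per-second pixel throughput shows what the framerate sacrifice bought (a quick sketch, using only the mode numbers above):

```python
# Pixel-count comparison between the typical mode of the era and the
# overclocked custom mode described above.

typical = 1024 * 768      # 786_432 pixels per frame
custom = 2048 * 1536      # 3_145_728 pixels per frame
print(custom // typical)  # 4 -> exactly four times the pixels

# Throughput tradeoff: pixels per frame times frames per second.
print(typical * 60)       # 47_185_920 pixels/s at 1024x768 @ 60 Hz
print(custom * 25)        # 78_643_200 pixels/s at 2048x1536 @ 25 Hz
```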
Was it worth it? For daily use, no. For learning experience, absolutely!
…I didn’t understand any of that, but it was awesome to read. Thank you for trying to explain it to me!
TL;DR - Much higher pixel resolution, at the cost of framerate.
Gotcha. Could you please give me an example of what an image might look like before and after overclocking?