Violet (my birth flower) + 08 (my birth year). Pretty basic.

  • over_clox@lemmy.world · 5 days ago

    Yes. Well, the older ones anyway, before they got full digital phase locking.

    I had a dumb but rather high-tech 15-inch CRT for something from 1994. No smart logic though, just a few relays you had to trigger with certain frequencies. No error messages, no safety checks; the thing either accepted the signal or it didn’t. Or it could explode, which was the fun in trying, and yeah, it actually worked and didn’t explode!

    How did I do it? Well, by then I had an XP system and an Nvidia GeForce FX 5200. Wait, this was in late 2005 or early 2006, come to think of it…

    Anyway, I checked all the details of the monitor’s supported frequencies, both horizontal and vertical. I found the max resolution by compromising on the framerate: I used 25 Hz interlaced to achieve it, set up through the Nvidia Control Center of the time…
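
    Roughly the sort of number checking that takes, as a minimal sketch (the frequency ranges and blanking figure below are placeholder assumptions for illustration, not the monitor’s actual datasheet values):

    ```python
    # Hypothetical mode-fit check; the ranges and blanking are assumed.

    H_RANGE_KHZ = (30.0, 64.0)   # supported horizontal scan range (assumed)
    V_RANGE_HZ = (50.0, 120.0)   # supported vertical field-rate range (assumed)

    def mode_fits(active_lines, frame_rate_hz, interlaced, v_blank_lines=64):
        """Check whether a mode's demands fall inside the monitor's ranges."""
        total_lines = active_lines + v_blank_lines
        h_khz = total_lines * frame_rate_hz / 1000.0   # lines drawn per second
        field_hz = frame_rate_hz * (2 if interlaced else 1)
        return (H_RANGE_KHZ[0] <= h_khz <= H_RANGE_KHZ[1]
                and V_RANGE_HZ[0] <= field_hz <= V_RANGE_HZ[1])

    print(mode_fits(768, 60, interlaced=False))  # 1024x768 @ 60 Hz
    print(mode_fits(1536, 25, interlaced=True))  # 2048x1536 @ 25 Hz interlaced
    ```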

    • brachypelmide@lemmy.zip · 5 days ago

      man that’s dope! never got to play around with CRTs because I was born too late. maybe for the better 'cause i’d probably instantly electrocute myself

      • over_clox@lemmy.world · 5 days ago

        Probably so; I got zapped quite a few times myself.

        That’s probably what’s wrong with me ain’t it? 🤔

          • over_clox@lemmy.world · 4 days ago (edited)

            It was officially rated for a max of 1280x1024 at 50 Hz progressive. The more common resolution of choice back then was 1024x768 at 60 Hz.

            Well, I wanted a much higher resolution than that, so by absolutely maxing out the horizontal output transistor frequency to 64 kHz and doing some quick number crunching, I was able to make a custom display mode of 2048x1536 at 25 Hz interlaced.

            Although the vertical refresh rate was both cut in half and switched to interlaced, it still absolutely qualifies as overclocking, because the horizontal output transistor (HOT) is basically the most stressed non-high-voltage component in any CRT monitor.

            Running the HOT at the bleeding edge of the manufacturer’s 64 kHz rating would indeed burn it out prematurely if run that way too long, especially without additional cooling, so I didn’t run it like that for long; I just wanted to see if it was even possible.
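
            For a feel of that quick number crunching, a minimal sketch (the vertical blanking count here is an assumption, so the printed values are illustrative; the real demanded line rate depends on the blanking in the actual modeline):

            ```python
            # Ceiling math for the custom mode; V_BLANK_LINES is assumed,
            # so treat the printed values as illustrative only.

            MAX_HOT_KHZ = 64.0   # the monitor's rated horizontal scan ceiling
            V_BLANK_LINES = 64   # assumed vertical blanking lines per frame

            def max_progressive_hz(active_lines):
                """Highest full-frame refresh rate the scan ceiling allows."""
                return MAX_HOT_KHZ * 1000.0 / (active_lines + V_BLANK_LINES)

            print(max_progressive_hz(1536))   # ~40 Hz tops for 1536 lines

            # Interlacing splits each frame into two half-line fields, so a
            # 25 Hz frame rate shows 50 fields per second, taming flicker
            # without raising the line rate demanded of the monitor.
            line_rate_khz = (1536 + V_BLANK_LINES) * 25 / 1000.0
            print(line_rate_khz)              # line rate at 25 Hz interlaced
            ```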

            The monitor itself was from 1994, so it was effectively all analog: no digital on-screen menus, no signal checking, and no on-screen error message to say boohoo, video mode not supported. The monitor just tuned itself to the signal and frequencies I calculated and arranged for it.

            It did, however, have quite a few analog image-transform buttons on its front panel, for things like trapezoid and shear distortion, raster rotation, corner bowing, etc. Lots of things most monitors from 1994 didn’t have, which meant the CRT yoke probably had twice as many deflection coils as a regular consumer CRT.

            Not bad for a monitor from 1994 that probably never saw anything over 1024x768 in practical use before I ever acquired it. I got literally 4 times the pixels of the typical desktop resolution of the time.

            Was it worth it? For daily use, no. For learning experience, absolutely!

                  • over_clox@lemmy.world · 3 days ago (edited)

                    I long ago lost that monitor and the related hardware to the sands of time, moving from one place to another multiple times.

                    The benefit, though, was that I effectively quadrupled the number of pixels on screen from the common 1024x768 resolution of the time.

                    1024 × 2 = 2048, 768 × 2 = 1536
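
                    A quick sanity check on that quadrupling (just the arithmetic from above):

                    ```python
                    # Doubling both axes quadruples the pixel count.
                    old = 1024 * 768    # 786,432 pixels
                    new = 2048 * 1536   # 3,145,728 pixels
                    print(new // old)   # 4
                    ```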

                    I was basically pioneering early extra high definition video output before it was even a thing.

                    The images themselves wouldn’t look any different, except smaller, as each pixel was only 1/4 of its original size, giving me a much larger visible pixel area for image editing.

                    It wouldn’t have helped gameplay much though, as I had to sacrifice framerate to accomplish that.

                    Edit: You definitely can’t do shit like that on modern LCDs; that category of overclocking is exclusive to old CRTs.