My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • Kale@lemmy.zip · 1 year ago

    A decade ago I had a little extra money and chose to buy a 144 Hz gaming monitor and a video card to drive it. I don’t have great eyesight, nor do I play games that require twitch reflexes, but even so, a 144 Hz refresh rate (with the game configured to run above 100 fps) was very noticeable. I’d much rather play at 1080p above 100 fps than at 4K at 60 fps or below.
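
    Just to put rough numbers on it (plain arithmetic on my part, not tied to any particular game or engine): each frame sits on screen for 1000 ms divided by the frame rate, so going from 60 to 144 fps cuts that time by roughly 10 ms.

    ```python
    # Back-of-the-envelope frame-time math: a frame lasts 1000 ms / fps.
    for fps in (30, 60, 100, 144):
        print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
    # 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 100 fps -> 10.0 ms, 144 fps -> 6.9 ms
    ```

    At 60 Hz you wait almost 17 ms for the screen to reflect what just happened; at 144 Hz it’s under 7 ms, and that’s the difference I could feel even without twitch reflexes.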

    I don’t believe I have great eyesight, depth perception, color perception, etc., but I am really sensitive to motion. I built my second computer (an AMD Athlon 64, I think?) and spent a significant sum on a CRT that supported higher refresh rates. I can’t use a CRT at 60 Hz; I perceive the flicker. I couldn’t use Linux on that computer until I saved up even more to buy an LCD monitor. I can’t perceive flicker on LCDs, and 60 Hz is fine for work.

    But for gaming, a high refresh rate is noticeable, even for someone who normally doesn’t notice visual stuff, like me.