My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.
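
For concreteness, the raw timing difference is just arithmetic: each frame's time budget is 1000 ms divided by the frame rate. A quick Python sketch (the frame rates are picked purely for illustration):

```python
# Per-frame time budget at a few common frame rates.
# Unlike film, which bakes motion blur into each frame during exposure,
# games usually present a series of discrete, sharp frames, so the gap
# between frames (and the added input latency) is easier to notice.
for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
```

Going from 30 fps to 60 fps halves the gap between frames (33.3 ms down to 16.7 ms), and in a game that gap is also tied to how quickly your input shows up on screen, which is something TV and film never have to deal with.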

  • magnetosphere@beehaw.org (OP)

    I’ve gotten a lot of helpful answers, but yours was the only one that included a visual aid! Thanks!

    What’s interesting is that when I focused on the UFOs, I didn’t notice a difference between the 30 fps and the 60 fps stars. When I let my eyes go out of focus, though, I was able to see a clear difference between them.
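
    For a rough sense of scale, here's a quick Python sketch of how far a moving object jumps between consecutive frames (the 960 px/s speed is a made-up example, not the demo's actual setting):

    ```python
    # Distance a moving object travels between consecutive frames.
    # Bigger per-frame jumps read as stutter, especially in peripheral vision.
    speed_px_per_s = 960  # hypothetical horizontal speed of the UFO
    for fps in (30, 60, 120):
        print(f"{fps:>3} fps: jumps {speed_px_per_s / fps:.0f} px per frame")
    ```

    At 30 fps each object leaps twice as far between frames as at 60 fps, which I'm guessing is the stutter I was picking up when my eyes went out of focus.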

    • moody@lemmings.world

      That’s because the center of your vision sees detail much better but is less sensitive to movement, while the outer part of your vision sees motion much better, so it notices the stutter more easily.

      It’s also why low refresh rates are more noticeable on larger screens: more of the image sits in the motion-sensitive part of your peripheral vision.
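
      You can put rough numbers on that: what matters is the angular size of each per-frame jump at your eye. A back-of-the-envelope Python sketch (the screen widths, viewing distance, and motion speed are all assumed for illustration):

      ```python
      import math

      # Angular size at the eye of the jump between two consecutive frames.
      # All the numbers below are illustrative assumptions, not measurements.
      def jump_angle_deg(jump_m, distance_m):
          return math.degrees(2 * math.atan((jump_m / 2) / distance_m))

      fps = 30
      sweep_s = 2.0      # assume the object crosses the screen width in 2 s
      distance_m = 2.5   # same viewing distance for both screens

      for name, width_m in (("32-inch TV", 0.71), ("65-inch TV", 1.44)):
          jump_m = (width_m / sweep_s) / fps  # metres moved between frames
          print(f"{name}: {jump_angle_deg(jump_m, distance_m):.2f} deg per frame")
      ```

      At the same viewing distance, the same motion on the bigger screen makes roughly twice the angular jump per frame, and more of it lands in your peripheral vision.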

      • Max-P@lemmy.max-p.me

        We’re better at seeing detail in the center, and yet technically each eye has a big blind spot not far from it, where the optic nerve exits the retina. The brain just fills in the gaps.

    • PupBiru@kbin.social

      afaik the edges of your vision are better at picking up movement too (the “seeing it out of the corner of your eye” kind of thing), so it’s possible that while you’re trying to make out specific things by looking directly at them, you’re not using the parts of your eye that are best at picking up the higher FPS?

      just a guess though