These days, kids identify them by the aspect ratio.

  • jet@hackertalks.com · 11 months ago

    And video quality. Watching some historical videos from my childhood, like TV shows on YouTube… the quality is pure potato. Either the archiving is terrible, or we just accepted much worse quality back then.

    • Hypersapien@lemmy.world (OP) · 11 months ago

      People always said that Betamax was better quality than VHS. What never gets mentioned is that regular consumer TVs at the time weren’t capable of displaying the difference in quality. To the average person, they looked the same.

      • jeffw@lemmy.world · 11 months ago

        You kinda can tell, though. CRTs didn’t really use pixels, so it’s not like watching on today’s video equipment.

        • NuPNuA@lemm.ee · 11 months ago

          CRT screens definitely used pixels, but they updated per horizontal line rather than per pixel. This is why early flatscreen LCDs were worse than CRTs in a lot of ways: they had much more motion blur, because “sample and hold” meant a pixel wasn’t refreshed each frame if its colour info didn’t change. CRTs gave you a fresh image every frame regardless.
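
          To make that concrete, here’s a minimal numerical sketch (my own toy model, not anything from a real display pipeline). It assumes a 1-D scene and an eye that perfectly tracks the motion, and compares how far a moving edge smears when a frame is held lit for the whole frame time (sample-and-hold) versus flashed briefly (CRT-like):

          ```python
          import numpy as np

          def perceived_edge(hold_fraction, speed=8.0, width=64, samples=200):
              # Average what a tracking eye sees while one frame stays on screen.
              # hold_fraction ~1.0 models LCD sample-and-hold; ~0.05 a brief CRT flash.
              x = np.arange(width)
              acc = np.zeros(width)
              for i in range(samples):
                  t = (i / samples) * hold_fraction  # time into the hold period
                  shift = t * speed                  # the eye keeps moving; the image doesn't
                  acc += (x + shift > width / 2)     # the held edge, seen at eye-shifted coords
              return acc / samples

          for name, hold in [("sample-and-hold", 1.0), ("CRT-like flash", 0.05)]:
              img = perceived_edge(hold)
              smear = np.count_nonzero((img > 0.02) & (img < 0.98))
              print(f"{name}: edge smeared across ~{smear} px")
          ```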

          • Psyduck_world@lemmy.world · 11 months ago

            I’ve heard that the pixels on CRTs are round while LCD/LED pixels are square, and that’s why aliasing isn’t very noticeable on CRTs. Is this true, or just more internet BS?

            • NuPNuA@lemm.ee · 11 months ago

              They’re not round per se, but they aren’t as sharp, so they bleed light into one another, giving a natural anti-aliasing effect. This is why some old games, where the art was designed to account for this blurring, look wrong when played on pixel-perfect modern TVs.
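
              A toy illustration of that bleed (illustrative numbers only, with scipy’s gaussian_filter as a crude stand-in for phosphor glow): nearest-neighbour upscaling keeps only hard 0/1 stair-steps, while a slight blur introduces the in-between shades that act as free anti-aliasing.

              ```python
              import numpy as np
              from scipy.ndimage import gaussian_filter

              sprite = np.array([  # tiny 1-bit "pixel art" diagonal
                  [0, 0, 0, 1],
                  [0, 0, 1, 1],
                  [0, 1, 1, 0],
                  [1, 1, 0, 0],
              ], dtype=float)

              scale = 8
              sharp = np.kron(sprite, np.ones((scale, scale)))  # nearest-neighbour upscale
              soft = gaussian_filter(sharp, sigma=scale / 3)    # crude phosphor-bleed stand-in

              print("grey levels, sharp:", len(np.unique(sharp)))              # just 0 and 1
              print("grey levels, soft :", len(np.unique(np.round(soft, 2))))  # many in-betweens
              ```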

          • zero_gravitas@aussie.zone · 11 months ago (edited)

            What they’re referring to is that analogue CRTs don’t really have a fixed horizontal resolution. The screen has a finite number of horizontal lines (i.e. rows), which the beam steps down through at a fixed rate, but as the beam scans across each line it’s essentially continuous (limited only by the signal bandwidth and the radius of the beam). This is why screen resolutions are referred to by their vertical resolution alone (e.g. 360p = 360 lines, progressive scan [as opposed to interlaced]).

            I’m probably wrong on the specifics, but that gives the gist and enough keywords to find a better explanation.

            [EDIT: A word.]
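
            A rough numerical way to see the asymmetry (entirely made-up signal, purely for illustration): in the model below the line count is a fixed property of the display, but the horizontal sample count is a free choice made at digitization time, because along a line the signal is continuous.

            ```python
            import numpy as np

            LINES = 360  # fixed by the display: "360p" names the scanline count

            def scanline(y, xs):
                # toy continuous brightness signal along one line (pure illustration)
                return 0.5 + 0.5 * np.sin(2 * np.pi * (3 * xs + y / LINES))

            for h_samples in (320, 640, 1280):  # all equally "valid" horizontal samplings
                xs = np.linspace(0.0, 1.0, h_samples)
                frame = np.stack([scanline(y, xs) for y in range(LINES)])
                print(f"{h_samples:>4} horizontal samples -> frame shape {frame.shape}")
            ```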

      • fuckwit_mcbumcrumble@lemmy.world · 11 months ago

        VHS was capable of decent quality; people just had a lot of bad equipment.

        Some TV shows (if they were crazy) were shot on film, so you could re-digitize them now in 4K or 8K and they’d look amazing. But there was also a lot of junk out there.

        And as others have mentioned, if you do an awful job of digitizing a tape, you can take something that looked good and throw all of that quality away. And if the tape wasn’t stored in good condition, it may struggle to be digitized at all, even when done properly.
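
        A quick sketch of that “quality thrown away at capture time” point (toy 1-D signal, not real video): once the capture is too coarse, no amount of later upscaling recovers the detail.

        ```python
        import numpy as np

        rng = np.random.default_rng(0)
        source = rng.standard_normal(4096)  # stand-in for a high-detail source

        def capture_then_upscale(signal, factor):
            x = np.arange(len(signal))
            coarse_x = x[::factor]          # sloppy capture: keep only every Nth sample
            coarse = signal[::factor]
            # best-effort "remaster": linear interpolation back to full length
            return np.interp(x, coarse_x, coarse)

        for factor in (2, 4, 8):
            err = np.sqrt(np.mean((capture_then_upscale(source, factor) - source) ** 2))
            print(f"captured at 1/{factor} detail -> RMS error after upscaling: {err:.2f}")
        ```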

    • Capt. Wolf@lemmy.world · 11 months ago

      There’s a lot of archival video that is just terrible. Digital video compression has damaged a lot of old footage that’s been shared over the years, especially through YouTube’s encoders, which will just straight up murder a video to save bandwidth. There’s also a lot of stuff that just doesn’t look great when it’s upscaled from magnetic media that was 320×240 at best.

      However, there’s also a lot of stuff that was bad to begin with and just took advantage of things like scanlines and dithering to make up for poor video quality. Take old games, for example: a lot of developers took advantage of CRT TVs to create shading, smoothing, and the illusion of a higher resolution than the console was actually capable of. There’s a lot of contention in the retro gaming community over whether games looked better with scanlines or look better now without them (a bare-bones sketch of the scanline effect follows this comment).

      For example.

      Personally, I prefer them without. I like the crisp, pixelly edges, but I was also lucky enough to play most of my games on a high-quality monitor instead of a TV back then. Then emulators, upscaling, and pixel smoothing became a thing…
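
      For the curious, the core of a basic scanline filter really is this simple (a bare-bones sketch, nothing like the fancier CRT shaders emulators ship): darken every other row of an upscaled frame.

      ```python
      import numpy as np

      def apply_scanlines(frame, darken=0.5):
          # frame: 2-D array of brightness values in [0, 1]
          out = frame.copy()
          out[1::2, :] *= darken  # dim odd rows, like the gaps between CRT scanlines
          return out

      frame = np.kron(np.eye(4), np.ones((8, 8)))  # toy 32x32 "game frame"
      filtered = apply_scanlines(frame)
      print("mean brightness before:", round(frame.mean(), 3))
      print("mean brightness after :", round(filtered.mean(), 3))
      ```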

    • Dandroid@dandroid.app · 11 months ago

      I watch a lot of hockey. Games from the 2000s are full-on potato. I don’t remember them looking that bad back then.

      • NuPNuA@lemm.ee · 11 months ago

        All sports have been. There’s also the rise of faster-refresh LCDs; those early flat screens blurred a lot.

      • Hazdaz@lemmy.world · 11 months ago (edited)

        Radio vs TV for Boomers

        B&W vs Color for Gen X

        SD vs HD for Millennials

        4K vs HD for Zoomers

        • drz@lemmy.ca · 11 months ago

          I’m a millennial and I can’t really tell the difference between SD and HD. Do you mean like when YouTube switches to 360p instead of 1080p?

    • Grimlo9ic@kbin.social · 11 months ago

      That’s such a trip. Only a six-year difference between the two of you, yet you experienced the dawn of something and they didn’t, and it shapes both of your perspectives so much.

      Even though it technically applies to transistors, Moore’s Law has been a good barometer for the increase in complexity and capability of technology in general. And now, because of your comment, I’m thinking that since that law seems to be nearing its end, either tech will stagnate in the next decade (possible, but I think unlikely) or we’re due for another leapfrog to a higher level of sophistication (more likely).

  • NuPNuA@lemm.ee · 11 months ago

    Even early 16:9 stuff looks pretty dated now if it hasn’t been remastered to 1080p/4K.

  • 🇨🅾️🇰🅰️N🇪@lemmy.world · 11 months ago

    When I was a kid, I thought black and white meant the TV show used to be in color but had turned black and white as it got old. My thought process was that shows lost their color just like old people’s hair turns grey… This was 35 years ago, before the internet.

  • some_guy@lemmy.sdf.org · 11 months ago

    I identified them by the awkward haircuts and clothing styles. I knew something was off, but it wasn’t until adulthood that I was able to piece it together.