  • pulsewidth@lemmy.world · 3 days ago

    Consumers have spoken, and they don’t care, not even about 4K. As happened with 3D and curved TVs, 8K is a solution looking for a problem so that more TVs get sold.

    In terms of physical media: at stores in Australia, the 4K Blu-ray section takes up a single rack of shelves. Standard Blu-rays and DVDs take up about 20.

    Even DVDs still sell well because many consumers don’t see a big difference in quality, and certainly not enough to justify the added cost of Blu-ray, let alone 4K editions. A current example: Superman is $20 on DVD, $30 on Blu-ray (a 50% cost increase), or $40 on 4K (a 100% cost increase). Streaming services have similar pricing curves for increased fidelity.

    It sucks for fans of high res, but it’s the reality of the market. 4K will be more popular in the future if and when it becomes cheaper, and until then nobody (figuratively) will give a hoot about 8K.

  • n1ckn4m3@lemmy.world · 3 days ago

    As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.

    High end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem, or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and its own round of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.

    I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.

    Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.

    8K was a solution in search of a problem. I’m certain that even when I was 20 and still had good eyesight, sitting 6 inches from a 90-inch TV, the difference between 4K and 8K would have been barely noticeable.

  • BlackVenom@lemmy.world · 3 days ago

    For what content? Video gaming (GPUs) has barely gotten to 4K. Movies? 4K streaming is a joke; you’re better off with a 1080p BD. If you care about quality, go physical… but UHD BDs are hard to find, you have to wait and hunt to get them at reasonable prices… and these days there are only a couple of UHD BD player manufacturers left.

  • happydoors@lemmy.world · 3 days ago

    I am a filmmaker and have shot in 6K+ resolution since 2018. The extra pixels are great on the filmmaking side: pixel binning when stepping down resolutions gives better noise performance, color reproduction, and sharper detail, and leaves room for re-framing/cropping. 99% of my clients still want their stuff in 1080p! I barely even feel the urge to jump up to 4K unless the quality of the project somehow justifies it. Images have gotten to a good place; more detail won’t add much to human enjoyment. I hope they continue to focus on dynamic range, HDR, color accuracy, motion clarity, efficiency, etc. I won’t say no when we step up to 8K as an industry, but computing as a whole is not close yet.
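
    A minimal sketch of what that binning step buys you (toy numbers, not any real camera’s pipeline): averaging each non-overlapping 2x2 block of a noisy high-resolution frame into one output pixel cuts uncorrelated sensor noise roughly in half.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A 6K-class frame (illustrative size): flat mid-gray plus simulated sensor noise.
    h, w = 3456, 6144
    noisy = 0.5 + rng.normal(0.0, 0.05, size=(h, w))

    # 2x2 "bin": average each non-overlapping 2x2 block into one output pixel.
    binned = noisy.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    print(noisy.std())   # ~0.050
    print(binned.std())  # ~0.025 -> noise halved (sqrt of the 4 samples averaged)
    ```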

  • skisnow@lemmy.ca · 3 days ago

    I hate the wording of the headline, because it makes it sound like it’s the consumers’ fault that the industry isn’t delivering on something it promised. It’s like marketing a fusion-powered sex robot that’s missing the power core, then turning around and saying “nobody wants fusion-powered sex robots”.

    Side note: I’d like people to stop insisting that 60fps looks “cheap”, so that we can start getting good 60fps content. Heck, at this stage I’d be willing to compromise at 48fps if it gets more directors on board. We’ve got the camera sensor technology in 2025 for this to work in the same lighting we used to need for 24fps, so that excuse has flown.

  • Wolf@lemmy.today · 2 days ago

    I would love to have an 8K TV or monitor if I had an internet connection up to the task and enough content in 8K to make it worth it, or if I had a PC powerful enough to run games smoothly at that resolution.

    I think it’s silly to say ‘nobody wants this’ when the infrastructure for it isn’t even close to adequate.

    I will admit that there are diminishing returns now: going from 4K to 8K was less impressive than FHD to 4K, and I imagine 8K will probably be where it stops, at least for anything that can reasonably fit in a house.

  • Showroom7561@lemmy.ca · 3 days ago

    The difference between 1080 and 4K is pretty visible, but the difference between 4K and 8K, especially from across a room, is so negligible that it might as well be placebo.

    Also, 8K content takes up a fuckload more storage space. So there’s that, too.
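
    The raw math backs that up; a quick sketch (the bitrates here are assumptions for illustration, not measured values):

    ```python
    # 8K has exactly 4x the pixels of 4K per frame.
    px_4k = 3840 * 2160   #  8,294,400
    px_8k = 7680 * 4320   # 33,177,600
    print(px_8k / px_4k)  # 4.0

    # If bitrate scaled linearly with pixel count (real codecs do somewhat
    # better), a hypothetical 20 Mbit/s 4K stream becomes ~80 Mbit/s at 8K,
    # and a 2-hour movie grows from ~18 GB to ~72 GB on disk.
    for mbps in (20, 80):
        print(mbps * 1e6 * 2 * 3600 / 8 / 1e9, "GB")  # 18.0 GB, 72.0 GB
    ```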

  • kylian0087@lemmy.dbzer0.com · 4 days ago

    I do want a dumb 8K TV. I do not want all the so-called smart features of a TV. A small Linux device with Kodi works way better.

      • viking@infosec.pub · 4 days ago

        Some Xiaomi TVs have root exploits, so you can manually disinfect the OS, but it’s cumbersome, since you need to enter adb commands via the remote control to get there in the first place.

        It’s easier to just use an external device and the TV as a screen only. Personally, I’ve been using an Nvidia Shield for 5+ years now and regret nothing.

      • elucubra@sopuli.xyz · 4 days ago

        It’s not ideal, but you can air-gap the TV from the network and use a small SBC, or even a Fire Stick or Android box. That’s what I do. Stremio?

    • Don_alForno@feddit.org · 4 days ago

      I do want a TV that can access Netflix etc. without another box. I just don’t want the surveillance that comes with it.

    • GreatAlbatross@feddit.uk · 3 days ago

      I just run mine without ever connecting it to the internet.
      I run an Apple TV (shock, walled garden!), as it is the only device I’ve seen that consistently matches frame rates properly on the output.

  • acosmichippo@lemmy.world · 4 days ago

    The article took forever to get to the bottom line: content. 8K content essentially does not exist. TV manufacturers were putting the cart before the horse.

    • themeatbridge@lemmy.world · 4 days ago

      4K TVs existed before the content did. I think the larger issue is that the difference between what is and what could be is not worth the additional expense, especially at a time when most people struggle to pay for rent, food, and medicine. More people watch videos on their phones than watch broadcast television. 8K is a solution looking for a problem.

      • Fredselfish@lemmy.world · 4 days ago

        Hell, I still don’t own a 4K TV and don’t plan to go out of my way to buy one unless the need arises. I don’t see why I need one when a normal flat-screen looks fine to me.

        I actually have some tube TVs and have been thinking of just hooking my VCR back up and watching old tapes. I don’t need fancy resolutions in my shows or movies.

        Only time I even think of those things is with video games.

        • NauticalNoodle@lemmy.ml · 4 days ago

          4K hardly even makes sense unless your TV is over 70" and you’re watching it from less than 4 feet away. I do think VR could benefit from ultra-high resolution, though.

          • sp3ctr4l@lemmy.dbzer0.com · 4 days ago

            https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship

            An extensive write-up on this whole issue; it even includes a calculator tool.

            But, basically:

            Yeah, going by angular resolution, even leaving the 8K content drought aside…

            8K might make sense for a computer monitor you sit about 2 feet / 0.6m away from, if the diagonal size is 35 inches / ~89cm, or greater.

            Take your viewing distance up to 8 feet / 2.4m away?

            Your screen diagonal now has to be about 125 inches / ~318cm, or larger, for you to be able to maybe notice a difference with a jump from 4K to 8K.

            The largest 8K TV I can find for sale anywhere near me… which costs ~$5,000 USD… is 85 inches.

            I see a single 98-inch one listed for $35,000. That’s the largest I can find, but it’s… uh, wildly more expensive.

            So with a $5,000, 85 inch TV, that works out to…

            You would have to be sitting closer than about 5 feet / ~1.5 meters to notice a difference.

            And that’s assuming you have 20/20 vision.
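
            A rough way to check those numbers (a sketch assuming a 16:9 panel and that 20/20 vision resolves about 1 arcminute; past the printed distance, 4K pixels already blur together, so 8K can’t add visible detail):

            ```python
            import math

            def max_useful_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
                """Farthest distance at which individual pixels are still resolvable."""
                width_in = diagonal_in * 16 / math.hypot(16, 9)    # 16:9 screen width
                pixel_pitch_in = width_in / horizontal_px
                one_arcmin = math.radians(1 / 60)                  # ~20/20 acuity limit
                return pixel_pitch_in / math.tan(one_arcmin) / 12  # inches -> feet

            print(max_useful_distance_ft(85, 3840))   # ~5.5 ft: the ~5 feet above
            print(max_useful_distance_ft(125, 3840))  # ~8.1 ft: the ~125" at 8 feet
            ```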

            So yeah, VR goggle displays… seem to me to be the only really practical use case for 8K… other than basically being the kind of person who owns a home with a dedicated theater room.

            • tankplanker@lemmy.world · 4 days ago

              What this chart is missing is the impact of the quality of the screen and the source material being played on it.

              A shit screen is a shit screen, just like a badly filmed TV show from the 80s will look like crap on anything other than an old CRT.

              People buy a 4K screen from Walmart for $200, then wonder why they can’t tell it’s any better than their old 1080p screen.

              The problem with pushing resolution up is that the cost of a good set right now is so high that it’s a niche within a niche of people who actually want it. Even a good 4K set, with proper HDR support and big enough to make a difference, is expensive. And even when 8K moves past the early-adopter markups, it’s still going to be expensive, especially compared to the tat you can buy at the supermarket.

              • sp3ctr4l@lemmy.dbzer0.com · 3 days ago

                It is totally true that things are even more complex than just resolution, but that is why I linked the much more exhaustive write up.

                It’s even more complicated in practice than all the things they bring up; they focus mainly on the movie-watching experience, not the game-playing experience.

                They don’t go into LED vs QLED vs OLED vs other actual display techs, response latency, refresh rates, or, as you say, all the different kinds of HDR and color gamut support… I’m sure I’m forgetting things…

                Power consumption may be a significant thing for you, image quality at various viewing angles…

                Oh right, FreeSync vs GSync, VRR… blargh there are so many fucking things that can be different about displays…

          • WanderingThoughts@europe.pub · 4 days ago

            At 1.6 meters, for the metric-minded. If you can really stretch out and hit the TV with your toes, it’s about the right distance.

    • jqubed@lemmy.world · 4 days ago

      I think it’s NHK, or one of the Japanese broadcasters anyway, that has actually been pushing for 8K since the 1990s. They didn’t have content back then, and I doubt they have much today, but that’s what they wanted HD to be.

      • NuXCOM_90Percent@lemmy.zip · 4 days ago

        Not familiar with NHK specifically (or, to be clear, I think I am, but not with enough certainty), but it really makes a lot of sense for news networks to push for 8K, or even 16K, at this point.

        Because it’s a chicken-and-egg thing. Nobody is going to buy an 8K TV if everything they watch is 1440p. But, similarly, there aren’t going to be widespread 8K releases if everyone is watching on 1440p screens, and so forth.

        But that ALSO means there is no way to justify using 8K cameras if the best you can hope for is a premium 4K stream of a sporting event. And news outlets are fairly regularly the only source of video evidence of literally historic events.

        From a much more banal perspective, it’s why there’s a gap in TV/film where you go from 1080p or even 4K re-releases, to increasingly shady upscaling of 720p or even 480p content, and then back to everything being natively 4K. Oversimplifying: it’s because we used MUCH higher-quality cameras than we really needed for a long time, before switching to cheaper film and outright digital sensors because “there is no point”. Obviously this ALSO depends on the high-resolution originals having been saved, but… yeah.

        • acosmichippo@lemmy.world · 4 days ago

          It’s not exactly “there is no point”. It’s more like “the incremental benefit of filming and broadcasting in 8K does not justify the large cost difference”.

          • paraphrand@lemmy.world · 4 days ago

              I’m sorry, but if we are talking about 8k viability in TVs, we are not talking about shooting in 8k for 4k delivery.

              You should be pointing out that shooting in higher than 8k, so you have the freedom to crop in post, is part of the reason 8k is burdensome and expensive.

            • Knock_Knock_Lemmy_In@lemmy.world · 3 days ago

              Then correct the person above me; they wrote about shooting in 8K.

              The RED V-Raptor is expensive by consumer standards, but it’s nothing compared to some film equipment. There are lenses more expensive than an 8K camera.

          • NuXCOM_90Percent@lemmy.zip · 4 days ago

            Which, for all intents and purposes, means there is no point. No news network is going to respond to “Hey boss, I want us to buy a bunch of really expensive cameras that our audience will never notice, because it will make our tape library more valuable. Oh, not to sell, but to donate to museums.” with anything other than laughter, and MAYBE firing your ass.

            • acosmichippo@lemmy.world · 4 days ago

              The point is, the cost/benefit calculation will change over time as the price of everything goes down. It’s not a forever “no point”.

              • NuXCOM_90Percent@lemmy.zip · 4 days ago

                … Almost like it would be more viable to film in higher resolution if more consumers had higher resolution displays?

    • Bobo The Great@startrek.website · 4 days ago

      Not only does the content not exist yet, it’s just not practical. Even now, 4K broadcasting is rare, and 4K streaming is a paid premium (and not always with a good bitrate, which matters a lot more) when it was once promised as a cost-free future. Now imagine 8K, which would roughly quadruple the amount of data to transmit (and transmission cost isn’t linear; 4x the speed would probably be at least 8x the cost).

      And I seriously think no one except the nerdiest of nerds would notice a difference between 4K and 8K.

    • Broken@lemmy.ml · 4 days ago

      Not only does it not exist, it isn’t wanted. People are content watching videos on YouTube and Netflix; they don’t care about 4K. Even if they pay extra for Netflix 4K (which I highly doubt they do), I still question whether they actually get 4K given their bandwidth and other limiting factors, which means they’re not watching 4K and are fine with it.

  • Photuris@lemmy.ml · 4 days ago

    I don’t care about 8k.

    I just want an affordable dumb TV. No on-board apps whatsoever. No smart anything. No Ethernet port, no WiFi. I have my own stuff to plug into HDMI already.

    I’m aware of commercial displays. It just sucks that I have to pay way more to have fewer features now.

    • dan@upvote.au · 4 days ago

      You can have a smart TV and just never set up any of the smart features. I have two LG OLED TVs but rarely touch anything on the TV itself. I’ve got Nvidia Shields for streaming, and turning one on or off also turns the TV on or off. Same with my Xbox.

      I just need to figure out if I can use CEC with my SFF gaming PC (so that turning it on also turns the TV on, and turning it off turns the TV off), then I won’t have to touch the TV’s remote again.
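
      For what it’s worth, here’s a sketch of one way to do that from a Linux PC, assuming a Pulse-Eight USB-CEC adapter and libcec’s cec-client (most GPU HDMI outputs don’t wire up the CEC pin, hence the adapter; the script name below is hypothetical):

      ```python
      import subprocess
      import sys

      def cec(command: str) -> None:
          # -s = single-command mode (read one command from stdin, then exit);
          # -d 1 keeps logging quiet. "0" is the TV's CEC logical address.
          subprocess.run(["cec-client", "-s", "-d", "1"],
                         input=command.encode(), check=True)

      if __name__ == "__main__":
          # e.g. `python tv-power.py on` at login/resume,
          #      `python tv-power.py off` at shutdown/suspend
          cec("on 0" if sys.argv[1] == "on" else "standby 0")
      ```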

      An Ethernet port or WiFi is good for controlling the TV with something like Home Assistant. I have my TVs on a separate, isolated VLAN with no internet access, plus an automation that runs when the TV turns on to also turn on some LED lights behind it.

      • Photuris@lemmy.ml · 4 days ago

        Fine, but I don’t want the smart features to be installed at all in the first place.

        I don’t want a WiFi antenna or Ethernet port in there.

        I know that sounds ridiculous, since I can “simply not use them,” but I want to spend my money on an appliance, not a consumer data collection tool.

        I don’t want them to have any of my data, and I don’t want to spend money “voting” with my dollar for these data collection devices.

        Some of these devices have even been known to look for other similar devices within WiFi range, and phone home that way (i.e., send analytics data via a neighbor’s connected TV as a proxy).

        Fuuuck that. I don’t want my dollar supporting this, at all, plain and simple. And I don’t want to pay a premium for the privilege of buying a technically simpler device. I do, but it’s bullshit, and I’m unhappy about it.

        • Null User Object@lemmy.world · 4 days ago

          Some of these devices have even been known to look for other similar devices within WiFi range, and phone home that way (i.e., send analytics data via a neighbor’s connected TV as a proxy).

          Ummm, wut? I’m going to need some quality sources to back this claim up.

          • BassTurd@lemmy.world · 4 days ago

            Yeah, this paragraph feels like fear mongering. I’m not saying OP didn’t see that somewhere, but from a tech standpoint, the TV still has to authenticate with any device whose WiFi it’s trying to piggyback off. Perhaps if there were an open network in range it could theoretically happen, but I’m guessing it’s not.

            I do remember reading that some smart TV was able to use its speakers as a mic to record in-room audio and pass it along when connected. It may have been theoretical, or it might have been a zero-day I read about. It’s been some years now.

            • shortwavesurfer@lemmy.zip · 3 days ago

              Actually, it’s true. Amazon Sidewalk works in a similar way: if a sensor is not connected to the internet, it will talk to nearby internet-connected Echo devices, like your speakers, and pass its data to Amazon through your device’s network.

              TVs will look for open Wi-Fi networks. And failing that, they could very well do this exact same thing.

              Edit: The way it works is that Echo devices contain a separate radio operating in the 868–915 MHz industrial, scientific, and medical (ISM) band. The sensor talks to your Echo over that radio, and the Echo relays the traffic to the network as if it were coming from the Echo itself, not another device. To your network, the only thing visible is the Echo; the Echo sees both your network, which it’s connected to, and the sensor, so it acts as a relay.

              • BassTurd@lemmy.world · 3 days ago

                I forgot Sidewalk is a thing. While that tech does kind of do what OP was saying, Sidewalk is limited to Amazon Sidewalk-compatible devices, like the Echo line and Ring. At a quick glance, there are no smart TVs that can connect to that network.

                That said, it’s an opt-out service, which is awful. No smart TVs will connect, but I’d recommend disabling it for anyone who uses Amazon devices.

        • dan@upvote.au · 4 days ago

          I totally get where you’re coming from. It’s hard to find devices like that. I think the issue is that regular customers are demanding the smart features, and using them without caring about privacy aspects.

        • vithigar@lemmy.ca · 4 days ago

          I know that sounds ridiculous, since I can “simply not use them,” but I want to spend my money on an appliance, not a consumer data collection tool.

          For what it’s worth you’re actually spending the manufacturer’s money (or at least some of their profit margin) on a data collection device that they won’t get to use.

          Smart devices are cheaper because the data collection subsidizes them.

        • ccunix@sh.itjust.works · 4 days ago

          They’re called “digital signage panels”, and they cost an arm and a leg.

          The data collection subsidises the cost of your TV, so that brings the cost down. Also, digital signage panels are rated for 24/7 use, which significantly increases their cost.

        • FreedomAdvocate@lemmy.net.au · 4 days ago

          Some of these devices have even been known to look for other similar devices within WiFi range, and phone home that way (i.e., send analytics data via a neighbor’s connected TV as a proxy).

        • olympicyes@lemmy.world · 4 days ago

          Your TV’s price is subsidized by the presence of those network connections. I recommend using a universal remote.

      • 4am@lemmy.zip · 4 days ago

        Sometimes that doesn’t even matter anymore; they’ll refuse to work now without a network set up.

        • dan@upvote.au · 4 days ago

          If it wants a network then stick it on an isolated VLAN with no internet access.

          • grue@lemmy.world · 4 days ago

            That’s not what that means and you know it. It refuses to work unless it can successfully phone home over the Internet.

            • dan@upvote.au · 4 days ago

              So people in rural areas without good internet, or places where the network is air-gapped, can’t use them at all? Seems like there’d be a way around it.

    • olympicyes@lemmy.world · 4 days ago

      I blacklist the TV’s Ethernet and WiFi MAC addresses. I strongly encourage using a computer, an Apple TV, or anything else, so the TV can’t fingerprint everything you use it for.

    • iopq@lemmy.world · 4 days ago

      No, I want only one DP port and a separate box that selects sources. That way I have the ports I want.