8K, what do you envisage?

I don't know if you are being sarcastic? Your comments are hilarious and you are making a fool of yourself over here.
We had a nice laugh; I like a sense of humour, but don't make a clown of yourself here. No hard feelings.

Peace!!
I recently got a 2K monitor, a jump from 1080p to 2K for gaming, and there's a lot of difference. And I'm not talking about the refresh rate difference, which is night and day. I'm not sure how this dude is unable to see a difference when he watches 4K.
 

I recently got a 2K monitor, a jump from 1080p to 2K for gaming, and there's a lot of difference. And I'm not talking about the refresh rate difference, which is night and day. I'm not sure how this dude is unable to see a difference when he watches 4K stuff and, majorly, he says HDR is a gimmick, lmao, oh god.

Off topic: how is the 3080 Ti doing for high refresh rate gaming?
 
I'm new, but do mods really care? In a different thread, I was called a derogatory term by him, completely unprovoked, when I simply asked who the insults were for. As it was the festive period and he was much older than me, and considering that the mod(s) chose to ignore his direct comment against me and an FM told me to ignore it, I left it at that.

I assumed trolling is allowed on this forum as long as posts end with "As always, everyone is entitled to their views; this is mine." But then, what even is the point of moderation?
Mods don't willingly ignore. They just don't or can't read every post, and sometimes they don't have a clear picture of what has happened. They see one post that is against forum rules and they send a warning note.
 
Imo 8K is always better than 4K, 4K is better than 2K, and 2K is better than 1080p. Higher resolution is always better. There should not be a question mark here, especially on big screens.

Suppose you had the option to play your favorite movie or show in 8K, 4K, or 1080p: what would you choose? The answer is simple and clear imo. The only question is when we will have such an option ready. The common variables are screen size, internet bandwidth, and source material availability.

HDR and upscaling are not good reasons to upgrade to 8K imo. As for the trash they currently produce on TV, I wonder who wants to watch that in 8K. Once your favorite material is available in 8K, you will jump onto it. For starters, that would be video games and live sports events.
 
You can read the forum rules, which apply to everyone, and anyone can report if someone doesn't follow them. Mods have banned some accounts for not following forum rules. So rest assured, anyone who is trolling or name-calling will get a warning note. I have lost my cool at times and got warning notes for name-calling and trolling, but the person I trolled or insulted may not get an update that the mods sent a warning note, because that would only increase their workload. They are here, just like us, to participate in casual talk about audio and video.
You're right. I should be more considerate.

He is dismissing all the proven objective data and enthusiasts' preferences so he can stay relevant in the enthusiast community while holding a preference most enthusiasts won't share.
Yep, we have those people everywhere. A good example is the YouTube channel QuantumTV. It's hard to believe how much his small group of followers trust him even while being aware of Rtings and Vincent.

Suppose you had the option to play your favorite movie or show in 8K, 4K, or 1080p: what would you choose? The answer is simple and clear imo. The only question is when we will have such an option ready. The common variables are screen size, internet bandwidth, and source material availability.
Also, with codecs like VVC and AV1, which are more efficient the higher the resolution, bandwidth may not even be a big issue despite 4 times the pixels. But 8K alone isn't a significant enough jump, because we can only sit so close.
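To put rough numbers on that codec point: pixel count grows 4x per resolution step, but bits-per-pixel drops with newer codecs. The efficiency factors below are illustrative assumptions (real savings vary by content and encoder), not measured figures; a minimal sketch:

```python
# Back-of-the-envelope bitrate scaling: pixel count grows 4x per
# resolution step, but newer codecs need fewer bits per pixel.
PIXELS = {"1080p": 1920 * 1080, "4K": 3840 * 2160, "8K": 7680 * 4320}

# Hypothetical relative bits-per-pixel (H.264 = 1.0); illustrative only.
CODEC_FACTOR = {"H.264": 1.0, "HEVC": 0.6, "AV1/VVC": 0.4}

def relative_bandwidth(res, codec, base_res="1080p", base_codec="H.264"):
    """Bandwidth relative to 1080p H.264 under the assumptions above."""
    return (PIXELS[res] / PIXELS[base_res]) * (
        CODEC_FACTOR[codec] / CODEC_FACTOR[base_codec]
    )

# 8K has 16x the pixels of 1080p, but with an AV1/VVC-class codec the
# relative bandwidth under these assumptions comes out near 6.4x, not 16x.
print(round(relative_bandwidth("8K", "AV1/VVC"), 1))
```

So under these (assumed) efficiency numbers, the bandwidth penalty of a resolution jump is much smaller than the raw pixel multiplier suggests.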
 
What do you feel: how long would it take for 8K to become the norm in households?
It will take several years. I expect only die-hard video enthusiasts to invest in such tech, after a small donation of their own kidneys. Every aspect of the video chain will be expensive.

Anything over 4K poses a bottleneck for production companies themselves. It requires a change in how we approach the work and in the hardware needed to support such a resolution. You're going to need higher frame rates for it to look any good too.

Has anyone been to Disney World and seen Avatar Flight of Passage? It's a theme park ride, in stereo, 10K at 60fps, and a visual spectacle. It's all frames stitched from 2K or 3K content and projected by three projectors, if I remember correctly; I don't quite remember the exact resolution we bin-packed to. That was my first VFX project. We could not render it locally due to the memory footprint of the work and the compute time it would have taken.

I am not going to get into HDR. That deserves its own "What's your HDR experience" thread, and I know too little about it from a technical standpoint. We talk about HDR during filmmaking, but that's just the usual "dynamic range" conversation, not a question of whether the film has HDR or not. Anything you shoot on a professional video camera carries HDR; whether that reaches you, and how well it is implemented, is a different ball game.
 
HDR, Dolby Vision, Dolby Atmos, and ever higher resolutions are the future.

And even more future tech will come.

The fundamental point is this: media and video consumption are changing for the better. The reference versions of such tech are, yes, yet to reach the masses, and sometimes marketing muddies the principle.

Displays are coming in bigger sizes, so higher resolutions will accompany them for realism. An immersive experience is what is ultimately aimed for.

I mean, the mindset of CRT TV watching cannot and should not be confused with 16ft x 10ft micro-LED multi-panels (which are coming in the future). Look at the pro displays for designers and gamers, look at monitors, look at home video and the demand for higher resolution cameras et al.

And from the perspective of content creators and providers, why would they want to be stuck at 640p/720p/1080p?

Even look at our smartphone displays: compare the one you had 10 years ago with your 2K screen now. The size went from 3" to 6". Did the resolution change bring more...?

Apple is famous for its Retina display.

And 4K smartphone displays as mainstream are not far off. And we are talking 6" screens.

PS: Back in 2008, 1080p HD was my default for home videos. For the last 10 years or so, 4K is what I record my family and personal videos in. It won't be long before I preserve my personal memories in 8K. My future grandchildren can then relive these memories in full glory.
 
Off topic: how is the 3080 Ti doing for high refresh rate gaming?
It's amazing, to say the least. I was earlier on a 60Hz monitor and this one is 240Hz. Sure, it's difficult and sometimes not possible to keep up with 240 in many recent games, but the jump from 60Hz is very much a night and day difference. Games feel smoother and more responsive, and above all you can compete in competitive FPS titles like Battlefield or Halo. Basically, you are seeing 240 frames a second instead of 60, so you get more visual information to react to, and that helps so much in 1v1 fights. Can't go back to 60Hz or even 120 now.
The 4000 series will launch next year. If they are powerful enough, I may get a 4080 Ti or a 4080. Then this monitor will be put to best use.

Actually, I was really inclined to get a 49" G9 monitor, but the size was against MP gaming. Plus, the monitor had some problems and units were recalled, and the newer model, the G9 Neo, an improved product over the original G9, was not available anywhere, so I dropped the idea of buying it.
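For anyone wondering what the 60Hz-to-240Hz jump means in concrete terms, it is mostly about the time between frames. A trivial sketch of the arithmetic (refresh interval only; actual input latency involves more stages in the chain):

```python
# Frame interval shrinks as refresh rate rises; the 60 -> 240 Hz jump
# cuts the time between new frames from ~16.7 ms to ~4.2 ms.
def frame_time_ms(hz):
    """Milliseconds between frames at a given refresh rate."""
    return 1000 / hz

for hz in (60, 120, 165, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")
```

That ~12 ms difference per frame is where the "more visual information to react to" feeling comes from.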
 
Oh, I don't disagree with your general sentiment. Like I said, companies are only trying to make money rather than improve the experience, so they keep putting out new things to drive the market; if they did not, people might be content to sit on their existing TVs for years. They call it the advancement of technology, but it's really just marketing and a consumerist society. Of course, they need to actually provide something so that customers believe they are getting something for their money, and so they do. I don't doubt at all that eventually 8K will be mainstream and 4K will then be looked down upon like 1080p. Before the 1080p explosion on TVs, there was not that much demand for TVs, nor did people have as much spare money to spend, so it's really all part of the strategy to drive the consumerist market.

By the way, what's "fratures"? Did you mean "features" by chance?

Just a note - You can use your 4K TV at 1080p as well.
Yes, features. Sorry for the typo.
 
It's amazing, to say the least. I was earlier on a 60Hz monitor and this one is 240Hz. Sure, it's difficult and sometimes not possible to keep up with 240 in many recent games, but the jump from 60Hz is very much a night and day difference. Games feel smoother and more responsive, and above all you can compete in competitive FPS titles like Battlefield or Halo. Basically, you are seeing 240 frames a second instead of 60, so you get more visual information to react to, and that helps so much in 1v1 fights. Can't go back to 60Hz or even 120 now.
The 4000 series will launch next year. If they are powerful enough, I may get a 4080 Ti or a 4080. Then this monitor will be put to best use.

Actually, I was really inclined to get a 49" G9 monitor, but the size was against MP gaming. Plus, the monitor had some problems and units were recalled, and the newer model, the G9 Neo, an improved product over the original G9, was not available anywhere, so I dropped the idea of buying it.
Yup, no doubt that high refresh rate monitors are a different ballgame altogether. I was curious how the 3080 Ti helps in gaming at those refresh rates; I have a 2080 Ti AMP Edition and a 2K 165Hz display. I intend to get the 3080 Ti when the 4xxx series launches, if the price disparity is high. However, for some games, achieving even 120 at extreme settings (with the 2080 Ti AMP) seems difficult, so I wanted to see if the 3080 Ti comfortably crosses that barrier.
 
It's not been very long since Atmos and 4K were rare; now we have several OTT streaming options. To top it up, Dolby Vision has changed the game.
What do you feel: how long would it take for 8K to become the norm in households?

This information might help FMs who are buying new gear or upgrading, for whom 8K is one big factor to consider.
Absolutely, 100% worth it. The content available is very limited, but on a 55/65-inch TV, which is fast becoming the norm, the difference is visible when viewed up close.

Insofar as TVs are concerned, if you don't have budget constraints, there's no reason not to go for it, especially if you're upgrading from a non-4K, mid-tier, smallish TV.

Insofar as AVRs are concerned, I'd hold my horses. Users have reported a number of problems getting a stable handshake at 8K/60 with the new Denon receivers, so I'd let this play out before buying the newer crop. A better idea, perhaps, would be to get a TV with multiple 8K-capable HDMI inputs and use your existing AVR (assuming it has eARC) to output the sound via the TV's eARC port. That way, there will be no need to upgrade unless you have more than three 8K sources (assuming your TV has 4 such HDMI ports, similar to mine).
 

I recently got a 2K monitor, a jump from 1080p to 2K for gaming, and there's a lot of difference. And I'm not talking about the refresh rate difference, which is night and day. I'm not sure how this dude is unable to see a difference when he watches 4K.
Isn't 2K a 1080p display with more horizontal pixels, 2048 x 1080 vs 1920 x 1080?
Eyes struggle to find a difference between a 1080p TV and a 4K TV of 65 inches at a 10-feet viewing distance. 4K is about 8 MP in image size; 8K is around 33 MP. Unless pixel peeping, one cannot find a difference between 4K and 8K at a 10-feet distance on a TV. 8K is indeed a gimmick. What we want is HDR, and that makes a huge difference; 1080p vs 4K doesn't. I sit 10 feet away from a 110-inch screen projecting a 1080p image and cannot see individual pixels at all. I wonder how 8K will make a difference. And we see people talking about 16K, 32K, etc. Seriously? 32K is 500+ megapixels.

Do we really complain about the picture quality of a large 2K cinema that projects a 2048 x 1080 image?
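The "eyes struggle at 10 feet" claim can be sanity-checked with basic geometry. A sketch using the common ~60 pixels-per-degree rule of thumb for 20/20 vision (the threshold value and the 65"/10ft setup are assumptions for illustration; individual eyesight varies):

```python
import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    """Horizontal pixels per degree of visual angle at a given distance."""
    deg = math.degrees(2 * math.atan((screen_width_in / 2) / distance_in))
    return h_pixels / deg

# A 65" 16:9 TV is about 56.7" wide; 10 ft = 120".
width = 65 * 16 / math.hypot(16, 9)
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(px, width, 120):.0f} px/deg")
```

Under these assumptions, 1080p already comes out above the ~60 px/deg threshold at 10 feet, which lines up with the post above: the extra pixels of 4K and 8K are hard to resolve without sitting closer or going much bigger.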
 
He keeps dragging on about how 4K/HDR is useless and 1080p is better for the masses. He is just a troll looking for unnecessary arguments;
he needs to be monitored by the Mods.
@captrajesh @Nikhil
He may come across as abrasive, with contrarian views that may even appear outrageous, but he's not indulging in personal attacks, unlike some FMs. The forum rules are very clear.
I'm new, but do mods really care? In a different thread, I was called a derogatory term by him, completely unprovoked, when I simply asked who the insults were for. As it was the festive period and he was much older than me, and considering that the mod(s) chose to ignore his direct comment against me and an FM told me to ignore it, I left it at that. I assumed trolling is allowed on this forum as long as posts end with "As always, everyone is entitled to their views; this is mine." But then, what even is the point of moderation?
If someone counters your view in decent language, that can't be termed trolling, but if anyone makes personal remarks, is abusive, or indulges in personal attacks, you have every right to report it, and we assure you of action as deemed fit.
 
Isn't 2K a 1080p display with more horizontal pixels, 2048 x 1080 vs 1920 x 1080?
Eyes struggle to find a difference between a 1080p TV and a 4K TV of 65 inches at a 10-feet viewing distance. 4K is about 8 MP in image size; 8K is around 33 MP. Unless pixel peeping, one cannot find a difference between 4K and 8K at a 10-feet distance on a TV. 8K is indeed a gimmick. What we want is HDR, and that makes a huge difference; 1080p vs 4K doesn't. I sit 10 feet away from a 110-inch screen projecting a 1080p image and cannot see individual pixels at all. I wonder how 8K will make a difference. And we see people talking about 16K, 32K, etc. Seriously? 32K is 500+ megapixels.

Do we really complain about the picture quality of a large 2K cinema that projects a 2048 x 1080 image?
Since he sits close and plays games using a high-end card, the difference in sharpness between 1080p and 2K monitors will be clearly visible. Also, if the previous rig was a normal 1080p monitor with an average card, the difference will be huge after moving to a 2K gaming monitor paired with a high-end card. The comparison method also matters: many compare 4K and 1080p on the same 4K TV and conclude the difference is huge. That again is a flawed method.
 
Isn't 2K a 1080p display with more horizontal pixels, 2048 x 1080 vs 1920 x 1080?
Eyes struggle to find a difference between a 1080p TV and a 4K TV of 65 inches at a 10-feet viewing distance. 4K is about 8 MP in image size; 8K is around 33 MP. Unless pixel peeping, one cannot find a difference between 4K and 8K at a 10-feet distance on a TV. 8K is indeed a gimmick. What we want is HDR, and that makes a huge difference; 1080p vs 4K doesn't. I sit 10 feet away from a 110-inch screen projecting a 1080p image and cannot see individual pixels at all. I wonder how 8K will make a difference. And we see people talking about 16K, 32K, etc. Seriously? 32K is 500+ megapixels.

Do we really complain about the picture quality of a large 2K cinema that projects a 2048 x 1080 image?

2K changes the resolution both horizontally and vertically (also referred to as QHD, 2560 x 1440). It helps for monitors since text is still crisp and clearly visible without needing to enlarge it, vs a 4K monitor, which I ended up scaling to 150% for text and icons. For movies, I honestly would not go pixel peeping. For games, it's not an apples-to-apples comparison, as we change the card to support higher resolutions, and new-gen cards render much better, so the differences seem larger than when using an old-gen card on the same monitor.

Gimmick or not, we will be forced to buy sources and other gadgets with such specs. For example, your next AVR will automatically support 8K, next-gen consoles support 8K@30fps, other sources will support 8K, and naturally, if an 8K TV fits your budget, you will end up with one whenever you plan your next TV purchase. However, I totally agree that there is no point being an early adopter of 8K unless you end up with 8K sources in your house. Personally, I can still live with FHD content, though I do enjoy watching 4K content that is recorded really well.
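On the text-sharpness point a couple of posts up, pixel density is the number that matters for desktop use. A quick sketch (the 27" panel size is an illustrative assumption):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

# Same hypothetical 27" panel at three resolutions:
for name, w, h in [("1080p", 1920, 1080), ("QHD", 2560, 1440), ("4K", 3840, 2160)]:
    print(f'{name} @ 27": {ppi(w, h, 27):.0f} PPI')
```

Roughly 163 PPI is why 27" 4K monitors commonly end up at 150% scaling, while ~109 PPI QHD is usable at 100%; the same reasoning scales to other panel sizes.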
 
Since he sits close and plays games using a high-end card, the difference in sharpness between 1080p and 2K monitors will be clearly visible. Also, if the previous rig was a normal 1080p monitor with an average card, the difference will be huge after moving to a 2K gaming monitor paired with a high-end card. The comparison method also matters: many compare 4K and 1080p on the same 4K TV and conclude the difference is huge. That again is a flawed method.
Yeah. I have a 24-inch QHD monitor and it is bloody sharp. I agree with your point.
Gimmick or not, we will be forced to buy sources and other gadgets with such specs. For example, your next AVR will automatically support 8K, next-gen consoles support 8K@30fps, other sources will support 8K, and naturally, if an 8K TV fits your budget, you will end up with one whenever you plan your next TV purchase. However, I totally agree that there is no point being an early adopter of 8K unless you end up with 8K sources in your house. Personally, I can still live with FHD content, though I do enjoy watching 4K content that is recorded really well.
That's the truth.
 
Yup, no doubt that high refresh rate monitors are a different ballgame altogether. I was curious how the 3080 Ti helps in gaming at those refresh rates; I have a 2080 Ti AMP Edition and a 2K 165Hz display. I intend to get the 3080 Ti when the 4xxx series launches, if the price disparity is high. However, for some games, achieving even 120 at extreme settings (with the 2080 Ti AMP) seems difficult, so I wanted to see if the 3080 Ti comfortably crosses that barrier.
The 3080 Ti easily crosses 120fps at 2K max settings in the few MP games I've tried, like Battlefield 5 and Halo Infinite. Halo Infinite runs at 150fps at 2K maxed out. I've lowered the resolution and set everything to low since I don't care about visual fidelity in an MP game; I want the maximum fps I can get. I easily get 240fps at the lowest settings.
Battlefield 2042 is clunky, mainly because it's not optimised yet, so I can't speak to its performance.

Overall, 2K 144Hz can be achieved with this combo, a 5950X and a 3080 Ti.
 
Isn't 2K a 1080p display with more horizontal pixels, 2048 x 1080 vs 1920 x 1080?
Eyes struggle to find a difference between a 1080p TV and a 4K TV of 65 inches at a 10-feet viewing distance. 4K is about 8 MP in image size; 8K is around 33 MP. Unless pixel peeping, one cannot find a difference between 4K and 8K at a 10-feet distance on a TV. 8K is indeed a gimmick. What we want is HDR, and that makes a huge difference; 1080p vs 4K doesn't. I sit 10 feet away from a 110-inch screen projecting a 1080p image and cannot see individual pixels at all. I wonder how 8K will make a difference. And we see people talking about 16K, 32K, etc. Seriously? 32K is 500+ megapixels.

Do we really complain about the picture quality of a large 2K cinema that projects a 2048 x 1080 image?
Actually, it's not 2K; it's called QHD.
The resolution is 2560 x 1440, known as 1440p for short.
I notice a major difference. In fact, when watching movies I sit 10ft away from a 120-inch scope screen, and I can easily make out the difference between 1080p and 4K. 4K is a cleaner image.
 
Yup, no doubt that high refresh rate monitors are a different ballgame altogether. I was curious how the 3080 Ti helps in gaming at those refresh rates; I have a 2080 Ti AMP Edition and a 2K 165Hz display. I intend to get the 3080 Ti when the 4xxx series launches, if the price disparity is high. However, for some games, achieving even 120 at extreme settings (with the 2080 Ti AMP) seems difficult, so I wanted to see if the 3080 Ti comfortably crosses that barrier.
Oh, and I forgot to add: Doom Eternal's Ancient Gods easily crosses 150 at 2K maxed out. I can't say more about other SP games since I really don't get time to try them; most of my free time goes into BF or Halo.
 