Author Topic: 4K TVs.. waste of time?  (Read 1683 times)

September 12, 2015, 03:11:43 AM
Reply #30

Einhander

  • The public-school system failed me.
  • *****
  • Information Offline
  • Hero Member
  • Posts: 567
But see, that's the thing. I don't want to be "there". I want to be watching the movie, not in it. I can't stand it when it looks so real that you feel like you're on set with the actors. Looks sooooo damn fake. Maybe for sports and travel shows, but I don't really watch those. I pretty much only like scripted movies and shows. Sitcoms, cartoons, blockbuster movies. Stuff like that. Don't really care for reality TV, be it sports, news, anything on TLC, etc, don't like any of it. The closest I get to liking "real life" on TV is game shows like The Price Is Right or Family Feud.


Man, now you're starting to act like me. I guess I can see where you're coming from. Change is hard sometimes. But you can't deny how awesome the 4K TVs look. My suggestion would be to keep your 1080p for general purposes, but for movies and shows with really good scenery, use your 4K TV. You don't have to watch Seinfeld or Friends in 4K. By the way, this applies to two years down the road. Right now nothing is in 4K anyway.

See, for me, there's just something magical about CRTs and retro games. Maybe it's nostalgia, but at times I can't put my finger on what it is exactly. Sometimes I feel like modern games look too fake by trying to go for the blockbuster feel, so I start telling myself that CRTs and retro games have better graphics. But man... these 4K TVs look way better than any video game. How could it look fake? I mean, not when you're watching real things.
« Last Edit: September 12, 2015, 03:14:57 AM by Einhander »

September 12, 2015, 07:44:17 AM
Reply #31

wiggy

  • The one.. the only... whatever
  • **
  • Information Offline
  • Maximum Volume Poster
  • Posts: 8241
  • Extra cheese please!
    • Rose Colored Gaming
Personally, I don't think home TVs should look crisper/"better" than movie theaters. I think theaters should be the end-all, the highest quality you can get.

For me, theaters have always had crap picture.  The IMAX ones are OK, but theaters are for SOUND.  Nothing compares to the epic boom of a good theater sound system, especially for films designed for it like Inception or Jurassic Park.  
As for picture, I remember seeing Man of Steel in IMAX, NOT IMAX 3D, and thinking it would look better on my TV at home.  Sure enough, it did.  
I like the crystal-clear, 260 fps smoothness I get.  I like feeling like it's happening right in front of me.  That soap opera effect is called image smoothing or something, and I love it - though that is a matter of taste.

However, for GAMES...I think 4k is a bit much right now until we get some hardware that can actually display it.  Now I have heard there are some PC games that do 4k, but I have yet to see anything like that.  

Funny you would think that, as the films used in movie production have a "resolution" which far exceeds 4K. It's not until movies are made digital that they actually lose a great deal of detail or resolution.  That said, a lot of films are shot digitally these days, which is quite sad IMO.

Also, movies are not shot at 260fps. Not even close, so I'm sort of confused as to exactly what you're seeing at 260fps?
« Last Edit: September 12, 2015, 07:46:05 AM by wiggy »

September 12, 2015, 01:44:52 PM
Reply #32

Doom

  • *
  • Information Offline
  • Devoted Member
  • Cover Admin
  • Posts: 1906
  • Pac-Man CE DX
I've been meaning to do the math on 4K for a few days. There are charts online with arbitrary "optimal distances" for when "4K becomes noticeable" -- but I think our knowledge of optics, along with some math, can give a more meaningful answer.
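Here's a quick sketch of that math. It assumes the usual 20/20 benchmark of one arcminute of visual acuity and uses a 55-inch 16:9 screen as the example -- both are my own assumptions, not anything official:

```python
import math

def max_useful_distance_in(diag_in, horiz_px, aspect=(16, 9)):
    """Distance (in inches) beyond which a single pixel subtends less
    than one arcminute -- the conventional 20/20 acuity limit, so the
    extra resolution is no longer resolvable."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # diagonal -> panel width
    pixel_pitch = width_in / horiz_px           # inches per pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

# For a 55" panel: how close must you sit for the extra pixels to matter?
print(round(max_useful_distance_in(55, 3840) / 12, 1))  # 4K:    ~3.6 ft
print(round(max_useful_distance_in(55, 1920) / 12, 1))  # 1080p: ~7.2 ft
```

In other words, on a 55" set you'd have to sit closer than about 7 feet before 4K can even theoretically show you anything 1080p can't -- which is roughly what those charts claim, just with the reasoning shown.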

4K TVs are usually paired with the higher standards that come with the spec. HDR (high dynamic range) for example. I want to see a rigorous test comparing 1080p material on a 4K TV with the same content at 4K -- ideally, it would use every feature of 4K TVs except the 4K itself.

In a few years, it won't matter. The reason 4K exists in the first place is that screens are manufactured in sheets. To get a 42-inch 1080p TV, manufacturers would actually take a bigger 4K sheet and cut it into fourths. Part of the reason was that yields were poor - a sheet often couldn't be used as one giant 4K screen because of imperfections in the process.

Now the process has improved, so more and more sheets are usable as one large 4K TV. The process will continue to improve, and soon it will be almost impossible not to get a 4K TV, just as it's now almost impossible to get a 720p TV.
Also, movies are not shot at 260fps. Not even close, so I'm sort of confused as to exactly what you're seeing at 260fps?
Frame interpolation from (probably) 24 fps to 260 fps.
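For anyone curious what interpolation means here: the TV synthesizes extra frames between the real ones. Actual sets use motion-compensated interpolation; this naive linear-blend sketch (my own toy example, not how any real TV does it) just shows the idea:

```python
def interpolate(frame_a, frame_b, steps):
    """Synthesize `steps` intermediate frames between two real frames
    by linearly blending pixel values. Frames are flat lists of
    brightness values here, to keep the sketch simple."""
    out = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # blend weight, 0 = frame_a, 1 = frame_b
        out.append([(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)])
    return out

# 24 fps -> 240 fps means 9 synthetic frames between every real pair
mids = interpolate([0.0, 100.0], [100.0, 0.0], 9)
print(len(mids))   # 9
print(mids[4])     # the middle frame: [50.0, 50.0]
```

The "soap opera" look comes from these synthetic in-between frames smoothing out motion the director shot at 24 fps.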

September 12, 2015, 04:51:27 PM
Reply #33

Megatron

  • *******
  • Information Offline
  • Devoted Member
  • Posts: 1718
  • "...I still function!"
    • Email

Funny you would think that, as the films used in movie production have a "resolution" which far exceeds 4K. It's not until movies are made digital that they actually lose a great deal of detail or resolution.  That said, a lot of films are shot digitally these days, which is quite sad IMO.

Also, movies are not shot at 260fps. Not even close, so I'm sort of confused as to exactly what you're seeing at 260fps?

Quick note - I meant 240fps, not 260.  Typo, my bad...

Anyway...I don't know the specs of film vs digital, nor do I really care.  Theaters are big and nice, but when it comes to fine detail (which is important to me), they're severely lacking.  So this may be taboo to film "purists", but I prefer the Blu-ray digital over film. 

As for 60 fps vs 120 vs 240 - the hardware adds smoothing and motion blur to fill in the gaps between frames.  In many cases this is the "soap opera" effect.  A good Blu-ray to look at is anything with a car chase or something else really fast-moving.  The Dark Knight (the scene where the truck flips) is actually really cool to watch in standard 60 fps, then 120, then 240 in succession.  And yes, the difference is noticeable.  Some don't like it, some do.  Different strokes.   

September 12, 2015, 05:29:05 PM
Reply #34

Thom Grayson

  • *****
  • Information Offline
  • Hero Member
  • Posts: 544
Anyway...I don't know the specs of film vs digital, nor do I really care.  Theaters are big and nice, but when it comes to fine detail (which is important to me), they're severely lacking.  So this may be taboo to film "purists", but I prefer the Blu-ray digital over film. 

I don't think the home video versions are more detailed than the theater versions. I think you're talking about pixel size - which is obviously smaller on a home TV than on a massive screen - and a smaller pixel can make things look sharper even though no detail has actually been added. PPI has more to do with perceived sharpness than resolution does, in many cases.
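To put rough numbers on that (the screen sizes here are my own ballpark assumptions - a 55" home set vs. a ~50-foot theater screen running DCI 4K):

```python
import math

def ppi(diag_in, horiz_px, vert_px):
    """Pixels per inch along the diagonal: diagonal pixel count
    divided by diagonal size in inches."""
    return math.hypot(horiz_px, vert_px) / diag_in

# 55" 1080p home TV vs. a 4K DCI (4096x2160) projection
# on a roughly 50-foot (600-inch) diagonal theater screen
print(round(ppi(55, 1920, 1080)))   # ~40 PPI at home
print(round(ppi(600, 4096, 2160)))  # ~8 PPI in the theater
```

The theater image carries more total pixels, but each one is spread over far more screen, which is exactly why the smaller home picture can *look* sharper.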

September 19, 2015, 09:05:29 AM
Reply #35

Superchop

  • *****
  • Information Offline
  • Hero Member
  • Posts: 625
Anyway...I don't know the specs of film vs digital, nor do I really care.  Theaters are big and nice, but when it comes to fine detail (which is important to me), they're severely lacking.  So this may be taboo to film "purists", but I prefer the Blu-ray digital over film. 

I don't think the home video versions are more detailed than the theater versions. I think you're talking about pixel size - which is obviously smaller on a home TV than on a massive screen - and a smaller pixel can make things look sharper even though no detail has actually been added. PPI has more to do with perceived sharpness than resolution does, in many cases.


wrong! ppi, what is that, no one advertises that. resolution is what is important

Do you know what ppi even is?  If not how can you say that he's wrong?

Resolution is only as important as every other factor involved in watching a movie...by itself it means nothing.

And not for nothing, but nobody advertises PPI because the average consumer won't understand what it even means.  Advertisements keep things simple so people understand.  You don't hear Xbox or Sony commercials talking about how much RAM their systems have...
Xbl: superchop83
Psn: superchop83
WiiU: Superchop

September 19, 2015, 06:18:35 PM
Reply #36

wiggy

  • The one.. the only... whatever
  • **
  • Information Offline
  • Maximum Volume Poster
  • Posts: 8241
  • Extra cheese please!
    • Rose Colored Gaming

Funny you would think that, as the films used in movie production have a "resolution" which far exceeds 4K. It's not until movies are made digital that they actually lose a great deal of detail or resolution.  That said, a lot of films are shot digitally these days, which is quite sad IMO.

Also, movies are not shot at 260fps. Not even close, so I'm sort of confused as to exactly what you're seeing at 260fps?

Quick note - I meant 240fps, not 260.  Typo, my bad...

Anyway...I don't know the specs of film vs digital, nor do I really care.  Theaters are big and nice, but when it comes to fine detail (which is important to me) it is severely lacking.  So this may be taboo to film "purists", but I prefer the blu ray digital over film. 

As for 60fps vs 120 vs 240 - the hardware adds smoothing and motion blur to fill in the gap between frames.  In many cases this is the "Soap Opera" effect.  A good blu ray to look at is anything with a car chase or something else really fast moving.  The Dark Knight (scene where the truck flips) is actually really cool to watch in standard 60fps, then 120 then 240 in succession.  And yes, the difference is noticeable.  Some don't like it, some do.  Different strokes.   

But it's not an opinion thing, it's just fact. Film has a "resolution" that far exceeds ANYTHING you'll ever watch on that 4K TV. I'm not arguing whether or not you prefer it, but the assertion that there's more detail in an upscaled 1080p image, or even 4K, simply isn't true.  It's not "taboo".

The goofy frame rate thing is about as silly as Lucas going back and adding all sorts of stupid CGI to his old films. It's taking what is NOT broken and attempting to "fix" it.  That's my opinion, but the fact is that nothing you watch at 240 FPS was ever intended to be viewed with that sort of filtering.

September 19, 2015, 06:34:33 PM
Reply #37

Megatron

  • *******
  • Information Offline
  • Devoted Member
  • Posts: 1718
  • "...I still function!"
    • Email

But it's not an opinion thing, it's just fact. Film has a "resolution" that far exceeds ANYTHING you'll ever watch on that 4K TV. I'm not arguing whether or not you prefer it, but the assertion that there's more detail in an upscaled 1080p image, or even 4K, simply isn't true.  It's not "taboo".

The goofy frame rate thing is about as silly as Lucas going back and adding all sorts of stupid CGI to his old films. It's taking what is NOT broken and attempting to "fix" it.  That's my opinion, but the fact is that nothing you watch at 240 FPS was ever intended to be viewed with that sort of filtering.

OK, so the theater has more "insert whatever here" than home video.  That's fine.  But if I can't see the fine detail in the theater, then I don't care as much. 

And as for "how it is intended" - I don't care what someone INTENDS me to do with their product.  Movies, games, whatever. When I pay for it, I do what I want with it.

The difference between the 240 fps thing and the Lucas stuff is that there is no way to remove what the producers put onto the disc.  If I feel like watching a more "traditional" style movie, I can turn the smoothing off.  It isn't trying to fix anything; it's watching the movie with a filter - turn it on or leave it off, your choice.

But to say that something is only meant to be enjoyed the way it was "intended" is naive, especially coming from someone who modifies older hardware and games.  NES games were never meant to be upscaled to HD, yet people still have consoles that play NES games in high resolution via HDMI.  I play Call of Duty - I don't touch the multiplayer.  Ever.  In fact, I don't play ANY multiplayer games.  The developers would insist that I am "missing the full experience."  Does that mean I am enjoying it incorrectly? 

September 20, 2015, 11:44:52 AM
Reply #38

Thom Grayson

  • *****
  • Information Offline
  • Hero Member
  • Posts: 544
I play Call of Duty - I don't touch the multiplayer.  Ever.  In fact, I don't play ANY multiplayer games.  The developers would insist that I am "missing the full experience."  Does that mean I am enjoying it incorrectly? 

Good to see there is someone else who actually knows Call of Duty has single-player! I know too many people who never play single-player, and in fact skip it intentionally.

I actually quite enjoy their single-player, with multiplayer only getting cursory attention if my friends have been bugging me to play with them.

November 29, 2015, 08:30:43 AM
Reply #39

Einhander

  • The public-school system failed me.
  • *****
  • Information Offline
  • Hero Member
  • Posts: 567
Hi, I was wondering if someone could help me with a question I have. The PS3 and 360 display at a 720 resolution. Now what would happen if you played them on a 4K TV? Would they look worse or better? Because we all know what happens when you play standard definition on an HDTV - the image gets distorted. Would playing consoles that are 720 and trying to upscale the games on a 4K TV actually make the games worse?

And if that's the case, then what are we gonna do each time we get new consoles and TVs? We can't just keep a TV with a different resolution for each console we get. This is something that concerns me.

I want 4K in the future. But if my PS3 and PS4 games look worse at a higher resolution, that's gonna be a problem, because a CRT and one HDTV are all I have room for.

November 29, 2015, 08:55:40 AM
Reply #40

Arseen

  • Amiibo lover extraordinaire
  • *
  • Information Offline
  • This one has about 10 percent of all posts
  • Oversight
  • Posts: 20562
Well, both consoles output 1080p, so that alone is definitely an improvement.

And if the 4K TV has a great upscaler, the improvement is tremendous.

Anyhow, the quality should only go up; how much depends on the TV.
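Part of why 1080p maps so cleanly onto a 4K panel is that 3840x2160 is exactly twice 1920x1080 in each direction, so even the dumbest upscaler - nearest neighbour - introduces no distortion. A toy sketch of that (my own illustration, not what any particular TV's scaler does):

```python
def upscale_nn(img, factor):
    """Nearest-neighbour upscale: each source pixel becomes a
    factor x factor block of identical pixels. With an integer
    factor (1080p -> 2160p is exactly 2x), no pixel is ever
    stretched unevenly, so the image stays undistorted."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img for _ in range(factor)]

small = [[1, 2], [3, 4]]
print(upscale_nn(small, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Standard definition on an early HDTV looked bad partly because 480 lines into 1080 is a messy non-integer ratio; 1080p into 2160p avoids that problem entirely.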


November 29, 2015, 01:32:57 PM
Reply #41

TDIRunner

  • All round awesome dude!
  • *
  • Information Offline
  • Post Whore
  • Posts: 5086
    • My MediaFire Account
Pretty much what Arseen said.  Your concern about older systems looking like crap on modern TVs is valid.  However, look at it this way.  Standard definition video games should be played on standard definition TVs.  High def video games should be played on high def TVs.  Sure, the PS3 and 360 were meant for 1080p, but it shouldn't be a problem to run them on a 4K TV since it will upscale the image.  You shouldn't need a different TV for each generation of video game system.  You should really only need 2.
Maybe, just once, someone will call me "sir" without adding, "you're making a scene."

My Raw Scans

November 29, 2015, 04:38:59 PM
Reply #42

Einhander

  • The public-school system failed me.
  • *****
  • Information Offline
  • Hero Member
  • Posts: 567
Pretty much what Arseen said.  Your concern about older systems looking like crap on modern TVs is valid.  However, look at it this way.  Standard definition video games should be played on standard definition TVs.  High def video games should be played on high def TVs.  Sure, the PS3 and 360 were meant for 1080p, but it shouldn't be a problem to run them on a 4K TV since it will upscale the image.  You shouldn't need a different TV for each generation of video game system.  You should really only need 2.

Okay, that makes sense. However, I wonder if HD will no longer be considered HD anymore. 720p and 4K are very different. And when 8K comes out, I wonder if 720p or 1080p will even be considered HD at all.

November 29, 2015, 05:23:15 PM
Reply #43

TDIRunner

  • All round awesome dude!
  • *
  • Information Offline
  • Post Whore
  • Posts: 5086
    • My MediaFire Account
Pretty much what Arseen said.  Your concern about older systems looking like crap on modern TVs is valid.  However, look at it this way.  Standard definition video games should be played on standard definition TVs.  High def video games should be played on high def TVs.  Sure, the PS3 and 360 were meant for 1080p, but it shouldn't be a problem to run them on a 4K TV since it will upscale the image.  You shouldn't need a different TV for each generation of video game system.  You should really only need 2.

Okay, that makes sense. However, I wonder if HD will no longer be considered HD anymore. 720p and 4K are very different. And when 8K comes out, I wonder if 720p or 1080p will even be considered HD at all.

8K is so far out from today you shouldn't even be worrying about it.  If you are worried about 8K, then you should also be worried about 16K, 32K, 64K and so on. 
Maybe, just once, someone will call me "sir" without adding, "you're making a scene."

My Raw Scans

November 29, 2015, 06:20:59 PM
Reply #44

Einhander

  • The public-school system failed me.
  • *****
  • Information Offline
  • Hero Member
  • Posts: 567
Pretty much what Arseen said.  Your concern about older systems looking like crap on modern TVs is valid.  However, look at it this way.  Standard definition video games should be played on standard definition TVs.  High def video games should be played on high def TVs.  Sure, the PS3 and 360 were meant for 1080p, but it shouldn't be a problem to run them on a 4K TV since it will upscale the image.  You shouldn't need a different TV for each generation of video game system.  You should really only need 2.

Okay, that makes sense. However, I wonder if HD will no longer be considered HD anymore. 720p and 4K are very different. And when 8K comes out, I wonder if 720p or 1080p will even be considered HD at all.

8K is so far out from today you shouldn't even be worrying about it.  If you are worried about 8K, then you should also be worried about 16K, 32K, 64K and so on. 


Really? 4K will be standard in two years, right? Isn't 8K a couple of years after that?