I've been meaning to do the math on 4K for a few days. There are charts online with arbitrary "optimal distances" for when "4K becomes noticeable" -- but I think our knowledge of optics, along with some math, can give a more meaningful answer.
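Here's a rough sketch of that math in Python, assuming the usual ~1 arcminute (1/60 of a degree) figure for 20/20 visual acuity and a 16:9 panel (the function and its names are just mine for illustration):

```python
import math

# Standard 20/20 acuity figure: ~1 arcminute per resolvable detail (assumption).
ARCMIN_RAD = math.radians(1 / 60)

def max_useful_distance(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance (inches) beyond which adjacent pixels can no longer be resolved."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # panel width from the diagonal
    pixel_pitch = width_in / horizontal_px          # size of one pixel, in inches
    return pixel_pitch / math.tan(ARCMIN_RAD)       # where one pixel subtends 1 arcmin

for label, px in [("1080p", 1920), ("4K", 3840)]:
    d = max_useful_distance(42, px)
    print(f'42" {label}: pixel grid unresolvable beyond ~{d / 12:.1f} ft')
```

For a 42-inch panel this works out to roughly 5.5 ft for 1080p and 2.7 ft for 4K: sit farther back than about 5.5 ft and you can't resolve even the 1080p grid, so the extra 4K detail falls below the acuity limit anyway.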
4K TVs usually come bundled with the higher standards that accompany the spec -- HDR (high dynamic range), for example. I want to see a rigorous test comparing 1080p material on a 4K TV with the same content at 4K -- ideally one that uses every feature of 4K TVs except the 4K itself.
In a few years, it won't matter. The reason 4K exists in the first place is that screens are manufactured in sheets. To get a 42-inch 1080p TV, manufacturers would take a bigger 4K sheet and cut it into fourths. Part of the reason was low yields: the sheet usually couldn't be sold as one giant 4K screen because of imperfections in the process.
Now the process has improved, so more and more sheets are usable as one large 4K TV. It will keep improving, and soon it will be hard not to end up with a 4K TV, just as it's now almost impossible to find a 720p TV.
Also, movies are not shot at 260 fps. Not even close, so I'm a bit confused about what exactly you're seeing at 260 fps?
Frame interpolation from (probably) 24 fps to 260 fps.
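In other words, the TV synthesizes extra frames between the original 24. As a very naive sketch of the idea in Python (plain cross-fading between neighboring frames; real TVs use motion-compensated interpolation, and this function is purely illustrative):

```python
import numpy as np

def interpolate_frames(frames, src_fps=24, dst_fps=260):
    """Naive frame-rate upconversion: linearly cross-fade between the two
    nearest source frames at each output timestamp. Real TVs estimate motion
    vectors instead; this only illustrates the resampling step."""
    frames = np.asarray(frames, dtype=np.float32)
    duration = (len(frames) - 1) / src_fps        # clip length in seconds
    n_out = int(duration * dst_fps) + 1
    out = []
    for i in range(n_out):
        t = i / dst_fps * src_fps                 # position in source-frame units
        lo = int(np.floor(t))
        hi = min(lo + 1, len(frames) - 1)
        alpha = t - lo                            # blend weight toward the next frame
        out.append((1 - alpha) * frames[lo] + alpha * frames[hi])
    return np.stack(out)

# Example: one second of 24 fps grayscale "frames" upconverted to 260 fps.
clip = np.random.rand(24, 4, 4)
print(interpolate_frames(clip).shape)  # roughly 10-11x as many frames out
```

Plain blending like this produces ghosting on anything that moves, which is exactly why real sets estimate motion between frames instead; that motion estimation is also what gives interpolated footage the "soap opera" look people complain about.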