http://www.appleinsider.com/articles/11/09/20/intels_ivy_bridge_support_for_4k_resolution_could_pave_way_for_retina_macs.html
There is a real need for 'retina' desktop monitors - resolutions where we no longer see individual pixels. For example, the 27" Apple Cinema Display is otherwise brilliant, but reading text or even web pages on it is less than optimal: the pixels on such a big screen are simply too big. Apple's laptops are way better in this regard.
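To put rough numbers on that (my own back-of-the-envelope figures, not Apple's; the one-arcminute acuity limit and the 24" viewing distance are assumptions):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_threshold_ppi(viewing_distance_in, arcminutes=1.0):
    """PPI needed so one pixel subtends less than the given visual angle."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(arcminutes / 60.0))
    return 1.0 / pixel_pitch_in

print(ppi(2560, 1440, 27.0))       # 27" Cinema Display: ~109 ppi
print(ppi(1440, 900, 13.3))        # 13" MacBook Air:    ~128 ppi
print(retina_threshold_ppi(24.0))  # needed at ~24" desk distance: ~143 ppi
```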
But it seems Intel is going there with the traditional approach. Why?
We now have Thunderbolt. It's essentially a PCI Express bus over a thin serial wire. So why not put the GPU *inside the display*? The distinction is not big for the consumer - they might not even be aware of it. Same cable. But instead of streaming each and every pixel across, such a GPU-within-display device would be sent the OS X display elements and drawing commands - for text, the characters instead of the rendered bitmaps.
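A rough comparison shows why sending commands beats sending pixels (again my own numbers; the 4K-at-60-Hz panel, 32-bit color, and the 10 Gbit/s per-channel Thunderbolt figure are assumptions for illustration):

```python
# Back-of-the-envelope: raw pixel streaming vs. drawing commands over the cable.
width, height, bytes_per_pixel, refresh_hz = 3840, 2160, 4, 60

pixel_stream_gbit_s = width * height * bytes_per_pixel * refresh_hz * 8 / 1e9
print(f"raw 4K60 pixel stream: {pixel_stream_gbit_s:.1f} Gbit/s")  # ~15.9 Gbit/s
print("one Thunderbolt channel: 10 Gbit/s")

# A hypothetical drawing command ("draw this paragraph in font X at point Y")
# is a few hundred bytes, no matter how many pixels the GPU inside the
# display ends up rendering from it.
command_bytes = 300
print(f"one text-drawing command: {command_bytes} bytes")
```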
This also makes sense in other ways, especially when thinking of a small laptop plus a large stationary "office" display. Excess heat is easier to dissipate in the display. One would get better graphics performance when working with the big screen. And why carry that GPU chip along in the laptop everywhere? Actually, the latest MacBook Airs don't have a dedicated GPU. See - maybe Apple is indeed thinking along these lines.
It is *so* obvious to me as a computer engineer that this is the way forward. Intel - please say this is what you have in mind. We don't need retina display support done the old way.
2 comments:
Soon we will see 300+ dpi moving from smartphones to tablets. UHDTV is coming to TVs within a couple of years. Of course 4K will be expected on monitors.
Thanks, unknown visitor, but you are missing the point of my blog entry. It was about how to achieve such resolutions. But then again, for most of us it would not matter, since it's a technical, under-the-hood detail whether the GPU sits in the computer or in the display.