Future OLED Tech. (Was: Tandem OLED, the next big thing...)

JLnyc

Smack-Fu Master, in training
7
Because they are 1/50th of the size. Brightness limitations on TV-sized panels are 100% related to power consumption. A Tandem OLED would have the same limitation. That's why they have similar peak output numbers. Using 20W for 1000 nits in an 11in panel for a short period of time is completely different from using 1000W in a 77in display.

I'm sure you have seen the roadmaps for a possible Apple 20.3in folding tandem OLED to be potentially released sometime in late 2025/2026. That represents about a 54% jump in size compared to the present 13in tandem, and that will be accomplished with a small battery. Using this same math, it's conceivable that more large jumps are possible. j
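For a rough sense of what that diagonal jump means, here's a back-of-the-envelope sketch. It assumes the aspect ratio stays the same and that panel power scales with emitting area at a fixed brightness; the 13in and 20.3in figures are the ones discussed above:

```python
# Back-of-the-envelope: how much bigger (and hungrier) a rumored 20.3in panel would be vs. the current 13in,
# assuming the same aspect ratio and the same power draw per unit area at a given brightness.
current_diag = 13.0   # inches, current tandem iPad Pro panel
future_diag = 20.3    # inches, rumored foldable panel

diag_jump = future_diag / current_diag - 1      # ~0.56 -> roughly a 54-56% larger diagonal
area_jump = (future_diag / current_diag) ** 2   # ~2.4x the emitting area

print(f"Diagonal: +{diag_jump:.0%}, area (and roughly power at fixed nits): {area_jump:.1f}x")
```

So a ~54% larger diagonal is roughly 2.4x the area, and roughly 2.4x the panel power at the same nit level, which is why the battery and thermals are the real constraints.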
 

ScifiGeek

Ars Legatus Legionis
16,404
When announcing the new iPads Apple explicitly mentioned the M4 having a new display controller that was necessary to drive the screen. If there’s nothing special about the display driver then what was Apple referencing? I’m not questioning what you’re saying, just trying to better understand the tech.
I think Apple marketing went a bit overboard on Tandem OLED.
 

ttnuagmada

Smack-Fu Master, in training
72
I'm sure you have seen the roadmaps for a possible Apple 20.3in folding tandem OLED to be potentially released sometime in late 2025/2026. That represents about a 54% jump in size compared to the present 13in tandem, and that will be accomplished with a small battery. Using this same math, it's conceivable that more large jumps are possible. j

That time frame is consistent with when blue PHOLED is expected to start showing up in phone/tablet screens. That will reduce power consumption. Blue PHOLED won't have the same lifespan as blue FOLED, but that shouldn't really be an issue with a tandem display in a tablet. It will probably be a while before we see it in TVs and monitors, though, I would imagine.
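For context on why the tandem structure relaxes the lifespan requirement, here's an illustrative sketch. It assumes the commonly cited lifetime-vs-drive-current relationship (lifetime roughly proportional to 1/J^n, with an acceleration exponent around 1.5-2) and that a two-unit stack needs roughly half the current for the same brightness; the exact exponent varies by material, so treat the numbers as ballpark only:

```python
# Illustrative only: how a tandem (two-unit) stack stretches emitter lifetime.
# Assumes lifetime ~ 1/J**n (n is the acceleration exponent, commonly quoted as ~1.5-2)
# and that each emission unit carries roughly half the current for the same total brightness.
current_ratio = 0.5   # per-unit drive current vs. a single-stack panel at equal nits

for n in (1.5, 2.0):
    lifetime_gain = (1 / current_ratio) ** n
    print(f"n = {n}: roughly {lifetime_gain:.1f}x the emitter lifetime")
# -> roughly 2.8x to 4x, which is why a shorter-lived blue PHOLED is less of a problem in a tandem panel.
```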
 

JLnyc

Smack-Fu Master, in training
7
That time frame is consistent with when blue PHOLED is expected to start showing up in phone/tablet screens. That will reduce power consumption. Blue PHOLED won't have the same lifespan as blue FOLED, but that shouldn't really be an issue with a tandem display in a tablet. It will probably be a while before we see it in TVs and monitors, though, I would imagine.
edited:
Based on the ability of a potential 20in tandem OLED to run off a small battery (one that will no doubt be optimized for long run times), plug-in power with no battery constraints would offer a whole lot more power headroom to a desktop version. If the 20in version pans out, I could see a 42in tandem OLED coming not so far out from 2026. It's ironic you mention blue, as many now think the blue tech is running well behind where most had hoped it would be; indeed, it may be many years before a blue emitter delivers anywhere near the efficiency and brightness of tandem OLED. Of course, as you mentioned, even a blue that is only a little better will work out much better in a tandem design than it will in regular/current-day OLED tech.

It seems to me that the OLED industry will do a lot to bring tandem OLED into larger screen sizes, if only to save consumer OLED tech from the significant progress LCD has made and will continue to make.

edit: Now that you mention next-gen blue, it's got me thinking that if none of the new blues actually wind up significantly improving the brightness of current-gen OLEDs (especially to the level of Sony's new mini-LED Bravias), that may very well push OLED makers to accelerate the development of tandem OLED for larger applications across the board. j
 
Last edited:

ttnuagmada

Smack-Fu Master, in training
72
Based on the ability of a potential 20in tandem OLED to run off a small battery (one that will no doubt be optimized for long run times), plug-in power with no battery constraints would offer a whole lot more power headroom to a desktop version. If the 20in version pans out, I could see a 42in tandem OLED coming not so far out from 2026. It's ironic you mention blue, as many now think the blue tech is running well behind where most had hoped it would be; indeed, it may be many years before a blue emitter delivers anywhere near the efficiency and brightness of tandem OLED. Of course, as you mentioned, even a blue that is only a little better will work out much better in a tandem design than it will in regular/current-day OLED tech.

It seems to me that the OLED industry will do a lot to bring tandem OLED into larger screen sizes, if only to save consumer OLED tech from the significant progress LCD has made and will continue to make. j

A 20in tablet screen is not going to be optimized for long run times, and the logic behind putting something like that in a tablet is that most of the time you won't be using the full 20in display. It will also likely be two panels fused together at the seam, as making panels larger than 16in or so with the same manufacturing method is a major limitation. If you scaled this exact technology to a 42in panel, it would need 15x as much power as an 11in screen to produce a sustained 1000-nit full field.

I think you are overblowing the significance of Tandem. It's significant for small displays because it's new in that segment. The same capabilities have been fundamental to the TV panels from day one. It's the same two companies making tandem and TV panels. The same OLED materials are even being used. You are falling victim to the Apple hype machine.


that may very well push OLED makers to accelerate the development of tandem OLED for larger applications across the board. j
There isn't anything significant to be gained from Tandem at the larger screen sizes. If you made a 77in Tandem OLED TV, it too would need 1000W to hit 1000 nits full field, and thus would be limited to 200-300 nits just like the current panels.
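Those multipliers follow directly from area scaling. A quick sketch, assuming full-field power scales with emitting area at a fixed nit level and anchoring to the ~20W/1000-nit 11in figure quoted earlier in the thread:

```python
# Scale the ~20W/1000-nit figure quoted for an 11in panel by emitting area.
base_diag = 11.0    # inches
base_power = 20.0   # watts, approximate short-burst figure from earlier in the thread

for diag in (42.0, 77.0):
    area_ratio = (diag / base_diag) ** 2
    print(f"{diag:.0f}in: {area_ratio:.0f}x the area -> ~{base_power * area_ratio:.0f} W for 1000 nits full-field")
# 42in: ~15x -> ~290W; 77in: ~49x -> ~980W, i.e. roughly the 1000W figure above.
```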
 
  • Like
Reactions: continuum

JLnyc

Smack-Fu Master, in training
7
A 20in tablet screen is not going to be optimized for long run times, and the logic behind putting something like that in a tablet is that most of the time you won't be using the full 20in display. It will also likely be two panels fused together at the seam, as making panels larger than 16in or so with the same manufacturing method is a major limitation. If you scaled this exact technology to a 42in panel, it would need 15x as much power as an 11in screen to produce a sustained 1000-nit full field.

I think you are overblowing the significance of Tandem. It's significant for small displays because it's new in that segment. The same capabilities have been fundamental to the TV panels from day one. It's the same two companies making tandem and TV panels. The same OLED materials are even being used. You are falling victim to the Apple hype machine.



There isn't anything significant to be gained from Tandem at the larger screen sizes. If you made a 77in Tandem OLED TV, it too would need 1000W to hit 1000 nits full field, and thus would be limited to 200-300 nits just like the current panels.

Sorry to have offended you so much that you need to call me a victim of Apple. No need for the trash talk. I'm more interested in plain conversation here.

The fact is, brightness is big now and getting bigger. Manufacturers and consumers appear to want very bright screens (one could say the manufacturers are pushing the consumer to do so, but that has always been the case anyway), with full-screen brightness above 600/700 nits and screens where automatic brightness limiting is not messing with the picture... and fast motion also works better with more brightness. Many enthusiasts on these forums may not like or seek very bright displays, but it does appear to be an unstoppable market phenomenon. j
 

ttnuagmada

Smack-Fu Master, in training
72
Sorry to have offended you so much that you need to call me a victim of Apple. No need for the trash talk. I'm more interested in plain conversation here.
Trash talk? I mean the context of all of your posts seems to be a misunderstanding of what Tandem OLED means in the scope of things, which is directly derived from Apple's marketing. I was just making an observation.

The fact is, brightness is big now and getting bigger. Manufacturers and consumers appear to want very bright screens (one could say the manufacturers are pushing the consumer to do so, but that has always been the case anyway), with full-screen brightness above 600/700 nits and screens where automatic brightness limiting is not messing with the picture... and fast motion also works better with more brightness. Many enthusiasts on these forums may not like or seek very bright displays, but it does appear to be an unstoppable market phenomenon. j

Well sure. HDR has been the driving force behind that push.
 

OrangeCream

Ars Legatus Legionis
55,386
The fact is, brightness is big now and getting bigger. Manufacturers and consumers appear to want very bright screens (one could say the manufacturers are pushing the consumer to do so, but that has always been the case anyway)
I’m not sure I agree there. I’ve been buying the brightest screens I can get because for decades, literally, I was using laptops where the screen topped out at 180 nits. 250 nits (empirically) is the minimum for me in an office (meaning no window behind me), though 300 is better.

However that means most TVs aren’t bright enough by a long shot when there is a window in the room. I need my laptop at about 50% for it to be usable in that condition; meaning a TV would need to be roughly 500 nits, minimum, to be optimal. During the evening it can be lower, at 250 nits, without issues, because there is no direct sun or sky at those times.

I’m sure I’m not the only one who has similarly noticed that an open window makes their TV or computer unusable, or that Macs, with their 1k nits brightness, are perfectly usable in the same situation.

Of course I’m not an enthusiast, per se. I just want to see my screen comfortably.
 
  • Like
Reactions: JLnyc

JLnyc

Smack-Fu Master, in training
7

"LG Display, the world’s leading innovator of display technologies, announced today that it will start mass-production of the 17-inch Foldable OLED panel for laptops, a representation of the company’s earnest commitment to the expansion of its OLED business for IT devices.

LG Display’s 17-inch Foldable OLED panel features a Tandem OLED structure previously applied to automotive displays, with a dramatically increased lifespan suitable for IT devices."...

"The company’s groundbreaking 17-inch Foldable OLED for laptops integrates a specialized material that minimizes creasing in the folding area of the screen. This results in a seamless display and crystal-clear picture quality, unlike conventional foldable panels."

I do not believe this display is stitched or fused together, nor do I believe that it will see less usage than a normal display. It would seem logical that Apple will use this same tech on their upcoming 20in foldable device as well as their rumored 16in MacBook display... Ideas about larger displays like these using too much power are overblown.

The fact that LG says this is adapted from their automotive display but has been modified for "a dramatically increased lifespan" tells me that they likely can apply these changes to their recently announced automotive 34in tandem OLED as well. j
 
Last edited:

ScifiGeek

Ars Legatus Legionis
16,404
I’m sure I’m not the only one who has similarly noticed that an open window makes their TV or computer unusable, or that Macs, with their 1k nits brightness, are perfectly usable in the same situation.

Yet somehow, we used CRTs for everything for decades, and their brightness was even lower than that of OLEDs.
 
  • Like
Reactions: continuum

OrangeCream

Ars Legatus Legionis
55,386
Yet somehow, we used CRTs for everything for decades, and their brightness was even lower than that of OLEDs.
The technology was different. A scanning gun lit up phosphors on a CRT, and each pixel was considerably brighter than an LCD pixel. However, because the majority of the screen was dark at any given instant, the overall brightness of the screen wasn't rated as high:

You can see a side-by-side visual comparison. The scanline is far brighter.

I can't find an authoritative source, but many claim that the spot where the beam of electrons hits the phosphor coating reaches tens of thousands of nits. However, only one spot is blasted at a time, so it had to be bright enough to trigger the persistence effect.

A forum post at Ars trying to explain it is here:
This is also why it is extremely hard to simulate the low persistence of CRT using a digital display -- even for strobed backlights and pulsed rolling-scan OLEDs.

CRT electron gun beam dot shines pretty much north of 10,000 nits for ~0.1 millisecond (from 50%-to-50%).

Display motion blur is related to persistence (pixel visibility time) from the ON-to-OFF, whereas 1ms of persistence translates to 1 pixel of display motion blur per 1000 pixels/second.

[image: crt-phosphor-versus-strobe-backlights.png]


So you need a far, far brighter OLED to be sunlight viewable.
 

ScifiGeek

Ars Legatus Legionis
16,404
The technology was different. A scanning gun lit up phosphors on a CRT, and each pixel was considerably brighter than an LCD pixel. However, because the majority of the screen was dark at any given instant, the overall brightness of the screen wasn't rated as high:

You can see a side-by-side visual comparison. The scanline is far brighter.

I can't find an authoritative source, but many claim that the spot where the beam of electrons hits the phosphor coating reaches tens of thousands of nits. However, only one spot is blasted at a time, so it had to be bright enough to trigger the persistence effect.

A forum post at Ars trying to explain it is here:
This is also why it is extremely hard to simulate the low persistence of CRT using a digital display -- even for strobed backlights and pulsed rolling-scan OLEDs.

CRT electron gun beam dot shines pretty much north of 10,000 nits for ~0.1 millisecond (from 50%-to-50%).

Display motion blur is related to persistence (pixel visibility time) from the ON-to-OFF, whereas 1ms of persistence translates to 1 pixel of display motion blur per 1000 pixels/second.

[image: crt-phosphor-versus-strobe-backlights.png]


So you need a far, far brighter OLED to be sunlight viewable.

For brightness comparisons, that is irrelevant because our eyes integrate brightness over time. We aren't seeing 5000+ nits; we are seeing the average it delivers.

CRTs were typically maxing out around 150 nits, and they looked no brighter than an LCD that was also set to 150 nits.

So my original point remains. We spent decades on CRTs that were significantly dimmer than even OLEDs, and they were not considered unusable.
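The integration argument can be put in rough numbers, using the figures quoted above (a ~10,000-nit spot lit for ~0.1 ms per 60 Hz refresh). This sketch ignores the longer phosphor decay tail, which is part of why real CRTs measured somewhat higher:

```python
# Time-averaged luminance of a CRT spot: the eye integrates the brief flash over the whole frame.
peak_nits = 10_000    # approximate spot brightness quoted above
flash_s = 0.0001      # ~0.1 ms visible flash (50%-to-50%), from the quoted post
frame_s = 1 / 60      # ~16.7 ms per refresh at 60 Hz

avg_nits = peak_nits * (flash_s / frame_s)
print(f"Time-averaged: ~{avg_nits:.0f} nits")  # ~60 nits; slower phosphor decay tails push real sets toward ~100-150
```

Which is why a 150-nit CRT competes with ambient light about as well as a 150-nit LCD: what counts against a bright window is the integrated light, not the instantaneous peak.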
 

OrangeCream

Ars Legatus Legionis
55,386
For brightness comparisons, that is irrelevant because our eyes integrate brightness over time. We aren't seeing 5000+ nits; we are seeing the average it delivers.

CRTs were typically maxing out around 150 nits, and they looked no brighter than an LCD that was also set to 150 nits.

So my original point remains. We spent decades on CRTs that were significantly dimmer than even OLEDs, and they were not considered unusable.
That’s irrelevant. We are talking about being able to use a panel with sunlight streaming through an open window.

A phosphor lit up at 10k nits is absolutely bright enough to overcome the brightness of a window. Even if the overall screen is 150 nits, each pixel of that screen was 10k nits.

It’s like the difference between a hammer and an ax. Swing both and one will chop down a tree while the other one crushes a few layers of bark and skin.

TLDR: Each pixel on an OLED or LCD needs to be 1000 nits to overpower the sun. A CRT can trivially hit 1000 nits per pixel.
 

ScifiGeek

Ars Legatus Legionis
16,404
A phosphor lit up at 10k nits is absolutely bright enough to overcome the brightness of a window. Even if the overall screen is 150 nits, each pixel of that screen was 10k nits.

It’s like the difference between a hammer and an ax. Swing both and one will chop down a tree while the other one crushes a few layers of bark and skin.

TLDR: Each pixel on an OLED or LCD needs to be 1000 nits to overpower the sun. A CRT can trivially hit 1000 nits per pixel.

That is absurd nonsense. I used CRTs everywhere for years before LCDs were a thing, both at home and at work, and used them together for a period of transition.

CRT clearly had lower levels of practical brightness than LCDs (and lower than modern OLED).

Sub-millisecond peak levels don't matter. It's the integration of total light over time that we both measure and perceive as brightness.
 

OrangeCream

Ars Legatus Legionis
55,386
That is absurd nonsense. I used CRTs everywhere for years before LCDs were a thing, both at home and at work, and used them together for a period of transition.

CRT clearly had lower levels of practical brightness than LCDs (and lower than modern OLED).
Right, I’m not disputing that. Which is why I’m saying your post is irrelevant to mine.
Sub-millisecond peak levels don't matter. It's the integration of total light over time that we both measure and perceive as brightness.
And that’s why I’m saying it’s irrelevant.

I’m not talking about the brightness of the screen, I’m talking about being able to use it in a bright environment.

Due to persistence of vision your brain will ‘remember’ a pixel being lit up for a fraction of a second. That’s how CRTs work. An LCD or OLED doesn’t take advantage of that. Instead the pixel stays lit for the entirety of the refresh cycle.

That means when a CRT pixel is lit up at 10k nits the pixel is bright enough to outshine the sun. Your brain remembers that pixel even if it goes dark for a fraction of a second and the next pixel gets lit up. This is true even if the average brightness of the screen is only 150 nits. Every pixel is individually bright enough to be sunlight viewable.

The same is not true for an LCD or OLED. If the screen is 150 nits then it means each pixel is also 150 nits (I’m ignoring peak brightness for this discussion). Therefore even if the average brightness is identical, no pixel is lit up brightly enough to be viewable in the sun.
 

ScifiGeek

Ars Legatus Legionis
16,404
That means when a CRT pixel is lit up at 10k nits the pixel is bright enough to outshine the sun. Your brain remembers that pixel even if it goes dark for a fraction of a second and the next pixel gets lit up. This is true even if the average brightness of the screen is only 150 nits. Every pixel is individually bright enough to be sunlight viewable.

Nope. That is silly nonsense. It's not how it works at all. When you don't understand something, don't make up nonsense to cover it, that just makes you look more like someone willing to die on the Dunning-Kruger peak:

[image: Mount-Stupid_1.jpg]


I've had a CRT and an LCD sitting next to each other, and in a bright, daylit room the measurably brighter LCD was more visible, just as the brightness measurements indicate.

150 nits of CRT brightness behaves just like 150 nits of LCD/OLED brightness when it comes to competing with higher ambient light levels.
 

ttnuagmada

Smack-Fu Master, in training
72
The technology was different. A scanning gun lit up phosphors on a CRT, and each pixel was considerably brighter than an LCD pixel. However, because the majority of the screen was dark at any given instant, the overall brightness of the screen wasn't rated as high:

You can see a side-by-side visual comparison. The scanline is far brighter.

I can't find an authoritative source, but many claim that the spot where the beam of electrons hits the phosphor coating reaches tens of thousands of nits. However, only one spot is blasted at a time, so it had to be bright enough to trigger the persistence effect.

A forum post at Ars trying to explain it is here:
This is also why it is extremely hard to simulate the low persistence of CRT using a digital display -- even for strobed backlights and pulsed rolling-scan OLEDs.

CRT electron gun beam dot shines pretty much north of 10,000 nits for ~0.1 millisecond (from 50%-to-50%).

Display motion blur is related to persistence (pixel visibility time) from the ON-to-OFF, whereas 1ms of persistence translates to 1 pixel of display motion blur per 1000 pixels/second.

[image: crt-phosphor-versus-strobe-backlights.png]


So you need a far, far brighter OLED to be sunlight viewable.

No offense, but I would have to assume that you've never even seen a CRT if you think they were somehow easy to see in a lit up room lol.

When I was a kid, I would literally use chairs to hang a blanket on in order to block the light from the living room window, because otherwise watching Saturday AM cartoons was impossible.
 

OrangeCream

Ars Legatus Legionis
55,386
No offense, but I would have to assume that you've never even seen a CRT if you think they were somehow easy to see in a lit up room lol.

When I was a kid, I would literally use chairs to hang a blanket on in order to block the light from the living room window, because otherwise watching Saturday AM cartoons was impossible.
Maybe I am misremembering then. I was just basing my statements on the last CRTs I owned, a Trinitron TV and a Trinitron monitor.
150 nits of CRT brightness behaves just like 150 nits of LCD/OLED brightness when it comes to competing with higher ambient light levels
I admit it's been literally decades since I had a CRT. I've probably erased all my pain. In any case, I can't imagine ever using a display with brightness below 300 nits, and I definitely prefer brighter.
 

ttnuagmada

Smack-Fu Master, in training
72
Maybe I am misremembering then. I was just basing my statements on the last CRTs I owned, a Trinitron TV and a Trinitron monitor.

I admit it's been literally decades since I had a CRT. I've probably erased all my pain. In any case, I can't imagine ever using a display with brightness below 300 nits, and I definitely prefer brighter.

My eyes strain at anything over a consistent 180 nits or so, even in a lit environment. My office at work is pretty well lit, and I have cheapo Dell monitors that probably max out at 200-250 nits, and they're at about 2/3 backlight.

I'm sitting in a room right now that has multiple windows and my C1 is only at 70 OLED light. The only time it's even an issue is if whatever I'm watching is extremely dark and has a lot of shadow detail, and that's not really something a high nit display even helps with anyway, because the full-field brightness doesn't even matter when we're talking about low APL scenes where ABL isn't even going to be triggered. It's all about reflection handling in those cases.

People have to remember that an OLED doing 250-300 nits full-field is literally talking about a 100% white screen. That type of situation is extremely rare. A 50% window is going to be much more indicative of real-world use cases.
 

ScifiGeek

Ars Legatus Legionis
16,404
My eyes strain at anything over a consistent 180 nits or so, even in a lit environment.

That was my problem when I first went LCD. It was brighter on ZERO brightness than my CRT was on max brightness, and it was kind of painful to use. With those CCFL backlights, a lot of early LCDs had minimum brightness near 200 nits.

I had to turn the LCD brightness to ZERO, then lower the brightness more using computer settings, but that would just cut off the top end of the monitor's range, making contrast worse.
 

cerberusTI

Ars Tribunus Angusticlavius
6,463
Subscriptor++
That was my problem when I first went LCD. It was brighter on ZERO brightness than my CRT was on max brightness, and it was kind of painful to use. With those CCFL backlights, a lot of early LCDs had minimum brightness near 200 nits.

I had to turn the LCD brightness to ZERO, then lower the brightness more using computer settings, but that would just cut off the top end of the monitor's range, making contrast worse.
That is a large part of the reason I bought an OLED.

I have it set at 20% brightness in SDR mode with the brightness stabilizer on. It is less than CRT brightness, and is usable even in a large, well-lit room with two open windows. I am facing the room, so my corner is a bit dark in comparison to the rest, but unless it is in direct sunlight it is perfectly fine.

The big secret there is to get a strong matte screen with an anti-glare coating, which is how that was dealt with in the CRT era as well.

I did have a CRT that could get fairly bright. My last one was a Mitsubishi 2070SB, where the SB stood for SuperBright, and it had a button that would greatly increase the brightness when pressed (I would guess into the 300-nit range). I very rarely used it, but I have always kept my monitors on the low side where possible.

Even at 0 brightness, LCDs were always much too bright for me.
 

Semi On

Senator
89,445
Subscriptor++
When announcing the new iPads Apple explicitly mentioned the M4 having a new display controller that was necessary to drive the screen.

Knowing what Honor is using to drive their Tandem OLED screen, this was just marketing bullshit.

It’s not a digital problem, it’s analog. I think they probably have an insufficient power supply in the M3’s chipset to drive the screen. The M4 was probably defined after they decided to use Tandem so they were able to add it. Their statement isn’t technically wrong but it’s pretty disingenuous.

It’s a discrete part in our chipset specifically so that it’s easy to add for OEMs that want to and doesn’t burden those that don’t.

Edit: Isn’t not is.
 
Last edited:
  • Like
Reactions: continuum

ttnuagmada

Smack-Fu Master, in training
72
Knowing what Honor is using to drive their Tandem OLED screen, this was just marketing bullshit.

It’s not a digital problem, it’s analog. I think they probably have an insufficient power supply in the M3’s chipset to drive the screen. The M4 was probably defined after they decided to use Tandem so they were able to add it. Their statement isn’t technically wrong but it’s pretty disingenuous.

It’s a discrete part in our chipset specifically so that it’s easy to add for OEMs that want to and doesn’t burden those that don’t.

Edit: Isn’t not is.
That makes sense. Driving two pixels (sets of transistors) takes more power.

Any functionality on the M4 is going to be related purely to video processing or general display control, and will have nothing at all to do with the fact that the panel is a tandem OLED. It's probably ready for next-gen displays in terms of bandwidth/DSC etc., possibly some video-processing-related stuff too. Whatever it consists of, I can pretty well guarantee you that it's panel agnostic, or at the very least has nothing to do with the fact that the panel is dual layer (you might have some OLED-specific video processing for handling something like near-black chrominance overshoot, etc.).
 
Last edited:
  • Like
Reactions: ScifiGeek

OrangeCream

Ars Legatus Legionis
55,386
Any functionality on the M4 is going to be related purely to video processing or general display control, and will have nothing at all to do with the fact that the panel is a tandem OLED. It's probably ready for next-gen displays in terms of bandwidth/DSC etc., possibly some video-processing-related stuff too. Whatever it consists of, I can pretty well guarantee you that it's panel agnostic, or at the very least has nothing to do with the fact that the panel is dual layer (you might have some OLED-specific video processing for handling something like near-black chrominance overshoot, etc.).
The conversation was contrasting the M3's purported inability to drive tandem OLEDs with what change was needed for the M4 to drive them.

I never meant to imply that the change was tandem-specific, only that, because Apple claimed a change was needed, we were speculating about what that change could be. If the M3 didn't have the necessary voltage to drive double the transistors per pixel, that would mean the M4 was given the capability to do so.
 
  • Like
Reactions: Semi On

ScifiGeek

Ars Legatus Legionis
16,404
If the M3 didn't have the necessary voltage to drive double the transistors per pixel, that would mean the M4 was given the capability to do so.

There is no evidence they have double the transistors. It's double the emission layers in the OLED stack.

That stack will be controlled by the same transistors and won't require more input voltage. The whole point of transistors is that they are tunable controlled amplifiers. They need negligible control current to control any output power needed by the device.

On top of that, the fine control of individual voltages lives in the display HW itself, which presents a generic, common interface to the driving device. Screens in modern devices use LVDS or embedded DisplayPort and work much like an external display from the device's perspective. All the individual pixel-control tech is handled by the electronics in the display itself.

Devices are NOT connecting to each individual pixel and controlling it directly. They are sending high-level signals that only get broken down into individual pixel control within the display.

As stated before, this whole Tandem OLED display explanation from Apple is overdone marketing BS.
 

OrangeCream

Ars Legatus Legionis
55,386
There is no evidence they have double the transistors. It's double the emission layers in the OLED stack.
Ah, I forgot my EE basics. An OLED is a type of diode, not a transistor. However, I think my point still holds. Two diodes in series means twice the voltage (in other words, if each diode requires 1.4V, then a stack of two needs 2.8V to light up).
 

ScifiGeek

Ars Legatus Legionis
16,404
Ah, I forgot my EE basics. An OLED is a type of diode, not a transistor. However, I think my point still holds. Two diodes in series means twice the voltage (in other words, if each diode requires 1.4V, then a stack of two needs 2.8V to light up).

It's unknown what the voltage differences are at the pixel, but that is irrelevant because that comes from the power supply, not the control circuit, and again, the iPad isn't wired to each individual pixel. There aren't millions of wires running from the SoC to the display.

The display runs through a standard interface like LVDS or eDP: a standard interface that has standard voltage and power requirements.

Do you think driving an external OLED vs. an LCD requires different power from your GPU when they are each connected with a DisplayPort cable? The answer is obviously no. And the same applies for internal displays. These are all standard digital interfaces.
 

ttnuagmada

Smack-Fu Master, in training
72
The conversation was contrasting the M3's purported inability to drive tandem OLEDs with what change was needed for the M4 to drive them.

I never meant to imply that the change was tandem-specific, only that, because Apple claimed a change was needed, we were speculating about what that change could be. If the M3 didn't have the necessary voltage to drive double the transistors per pixel, that would mean the M4 was given the capability to do so.
That doesn't make any sense though. The CPU isn't what drives the panel.
 

OrangeCream

Ars Legatus Legionis
55,386
It's unknown what the voltage differences are at the pixel, but that is irrelevant because that comes from the power supply, not the control circuit, and again, the iPad isn't wired to each individual pixel. There aren't millions of wires running from the SoC to the display.

The display runs through a standard interface like LVDS or eDP: a standard interface that has standard voltage and power requirements.

Do you think driving an external OLED vs. an LCD requires different power from your GPU when they are each connected with a DisplayPort cable? The answer is obviously no. And the same applies for internal displays. These are all standard digital interfaces.
Hmm, that's a good point.
That doesn't make any sense though. The CPU isn't what drives the panel.
The M4 isn’t a CPU either. I honestly don’t know how Apple’s driving their displays.

This says they use eDP:

That also says a second device is necessary to drive the backlight, as well as power to the panel. Ostensibly that is built into the M4.
OSCAR takes care of driving the LED backlight and regulating the panel power supply
 

ScifiGeek

Ars Legatus Legionis
16,404
Hmm, that's a good point.

The M4 isn’t a CPU either. I honestly don’t know how Apple’s driving their displays.

As I said before, either LVDS or eDP. It's a standard interface. Solved problem. No point reinventing the wheel here.

This says they use eDP:

Then eDP it is. Again, the display itself handles display work like setting the individual voltages to pixels.

Ultimately this is essentially just like hooking up an external display to a PC GPU with a DP cable.

From an external (to the display) interface perspective there is NOTHING that would require a special GPU to drive the Tandem OLED display. It's simply resolution (2752x2064) x refresh (120 Hz) as far as the GPU is concerned, and that combo can easily be driven by just about any modern SoC. It's essentially the same external DP spec as the 12.9" Pro screen in the M1 iPad Pro.
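As a sanity check on that claim, here's a rough link-budget sketch. The exact lane count, bit depth, and blanking overhead of the iPad Pro panel aren't public, so the 4-lane HBR3 and 10-bit-per-channel figures are assumptions:

```python
# Rough uncompressed payload for the 13in iPad Pro panel vs. a stock eDP link.
h, v, hz = 2752, 2064, 120        # active resolution and refresh rate from the post above
bpp = 30                          # assuming 10 bits per channel, RGB

payload_gbps = h * v * hz * bpp / 1e9   # ~20 Gbit/s of active pixel data (ignoring blanking)
edp_hbr3_gbps = 4 * 8.1 * 0.8           # 4 lanes at HBR3, ~80% usable after 8b/10b coding

print(f"Panel payload ~{payload_gbps:.1f} Gbit/s vs. 4-lane eDP HBR3 ~{edp_hbr3_gbps:.1f} Gbit/s")
```

Even before Display Stream Compression, that's within reach of an ordinary eDP link, so nothing about the tandem structure itself demands an exotic interface.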
 

ScifiGeek

Ars Legatus Legionis
16,404
Tandem OLED in upcoming Dell Laptop:


LGD has been producing tandem OLED displays since 2019, mainly for the automotive industry. This expertise has enabled it to be Apple's main tandem OLED display supplier for its 2024 iPad Pro devices, and now to be the first one to produce tandem laptop panels.


LGD's first tandem laptop OLED is a 13" 2880x1800 panel. According to reports from last month, LGD's first customer for these panels is Dell, which ordered 100,000 such panels for its upcoming 2024 XPS 13 laptop.

Which again makes all the Apple claims of stacked dual displays, and of needing a special new SoC to drive them, seem very misleading.
 

ScifiGeek

Ars Legatus Legionis
16,404
Nice article on PHOLED:

It also discusses current OLED TV tech.

As was brought up before, TVs already use multiple emitters per subpixel, so Tandem is really only a big deal for mobile. Multiple emitter layers are the norm in monitors and TVs, so Tandem won't bring benefits there; they are essentially already tandem. QD-OLED already uses 3 blue emitter layers to compensate for the blue problem.

"To get the necessary brightness out of today’s fluorescent blue OLEDs, Samsung now uses three layers of the material. PHOLEDs will allow Samsung to cut that number to two..."


Red and Green are already PHOLED, and they are MUCH more efficient.

They claim this is the year of the Blue PHOLED, but we still aren't seeing any marketing on it, or even mentions in leaked roadmaps, so I'm not holding my breath. They were claiming they were almost ready in 2015.

It's really Blue OLED that is holding us back. It's fluorescent, so more of its energy goes into heat than light. PHOLEDs emit most of their energy as light, which may be a game changer.
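The heat-vs-light point has a simple basis in spin statistics. A sketch of the theoretical ceilings only (real-world gains also depend on outcoupling, lifetime, and drive losses):

```python
# Spin statistics: why fluorescent blue turns most of its input energy into heat.
singlet_fraction = 0.25          # only ~25% of excitons are singlets, the only kind fluorescence can use

fluorescent_iqe_ceiling = singlet_fraction   # ~25% internal quantum efficiency, at best
phosphorescent_iqe_ceiling = 1.0             # PHOLEDs can harvest the ~75% triplets too, up to ~100%

print(f"IQE ceiling: fluorescent ~{fluorescent_iqe_ceiling:.0%}, PHOLED ~{phosphorescent_iqe_ceiling:.0%} "
      f"(~{phosphorescent_iqe_ceiling / fluorescent_iqe_ceiling:.0f}x more excitons converted to light)")
```

That roughly 4x ceiling is the same reason red and green, which went phosphorescent years ago, run so much more efficiently than today's fluorescent blue.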
 

OrangeCream

Ars Legatus Legionis
55,386
Reading the article, I see that Samsung OLEDs use multiple blue fluorescent OLED layers because blue is so inefficient in comparison to red or green phosphorescent OLEDs.

Meaning it's conceptually the same in using three blue layers to get brightness equivalent to a single green or red phosphorescent OLED, but it's also not technically the same, because it has no red or green OLEDs at all, relying on quantum dots to turn blue light into red or green light.

LG uses two blue layers and a single each of red and green.

Apple's tandem, I think, is a double blue to each red and green, in two layers, meaning two each of the double blue, red, and green.

So yes, technically double or triple blue layers are conceptually the same as double red and double green layers, but the linked paper seems to suggest that neither Samsung nor LG has tandem red or tandem green subpixels. If I'm reading other articles correctly, LG uses a white OLED, which I think means a single white subpixel is composed of a red, green, and double blue layer to generate white light, with a subtractive filter to transform the white light into red, blue, and green.
 
So yes, technically double or triple blue layers are conceptually the same as double red and double green layers, but the linked paper seems to suggest that neither Samsung nor LG has tandem red or tandem green subpixels. If I'm reading other articles correctly, LG uses a white OLED, which I think means a single white subpixel is composed of a red, green, and double blue layer to generate white light, with a subtractive filter to transform the white light into red, blue, and green.

Obviously it's not exactly the same. But the point is that multiple emission layers are common everywhere except mobile. When I first posted about Tandem, it sounded like it could improve Monitors and TVs, but it can't because they already have multiple layers.
Technically you could double all the layers, but that would be 8 emission layers for LG's stack, which likely isn't practical and/or economical.

So "Tandem" AKA more emission layers is already common and won't lead to any real OLED gains outside of mobile.

But PHOLED blue is a real potential game changer everywhere that OLED exists. Samsung may be the best positioned to take advantage, since their monitors and TVs are built entirely on a blue OLED base.

It also seems like most OLED heat production comes from the blue pixels, which also have the greatest energy loss. Getting PHOLED blue should lead to cooler-running, more efficient OLEDs.
 
Double red and green emission layers would increase lifespan and decrease burn-in even in a TV.

LG has two layers of blue to compensate for the current weak blue.

If you double red and green, then it's imbalanced again because you don't have enough blue anymore, and it's more expensive/impractical because you are now up to 6 layers.

Plus blue is still going to burn in at the current rate, so you have added cost and complexity for no real benefit.
 
Last edited: