CES 2026 Field Notes: Quantum Dots, HDR Gamut Rings Beyond L*100, and Why “100% BT.2020” Is the Wrong Finish Line
Nanosys booth conversations with Jeff Yurek and Chris Chinnock, plus my own workflow takeaways and HDRmaster 2026 updates.
Florian Friedrich — January 2026
🎬 Follow-up materials
Test and demo materials based on the source footage will follow on my channel: https://www.youtube.com/@FlorianFriedrich
🧭 Why this booth conversation mattered
CES is usually a mix of real innovation and noisy marketing. This year, one booth conversation stood out to me because it exposed a practical blind spot: we still talk about “percent of gamut” (coverage or area) with diagrams that do not include luminance. In HDR, that missing dimension is often exactly where things go wrong first.
I collaborated with Nanosys on a short booth video showing vials filled with red, green, and blue quantum dots (targeting BT.2020 wavelengths). I did the editing, post-production, and the technical overlay analysis using InnoPQ HDRmaster. The clip ran at the Nanosys booth in two versions: one clean, and one with analysis overlays.
💡 Key question for 2026:
Are we discussing wide gamut as a marketing claim—or as measurable, scene-relevant behavior that survives HDR tone mapping and gamut mapping?
✨ Quantum dots: my personal bias (and a bit of HDR history)
I’ll be transparent: I’m a fan of quantum dots. Not because they are fashionable, but because they helped push HDR and wide color gamut from “paper specs” into something people could actually see. When HDR was still messy and inconsistent across products, one of the missing pieces was reliable wide-gamut capability at high brightness. Quantum dots—implemented well—helped close that gap for LCD-based systems by improving spectral separation and practical color volume in ways that show up in real scenes, not only in charts.
Quantum dots are also not a marketing invention. The Nobel Prize in Chemistry 2023 recognized the discovery and synthesis of quantum dots. That doesn’t make every implementation perfect, but it underlines the depth of the underlying science.
- 🏅 Nobel Prize summary (Quantum Dots): https://www.nobelprize.org/prizes/chemistry/2023/summary/
- 📘 Nobel Prize popular information: https://www.nobelprize.org/prizes/chemistry/2023/popular-information/
🔍 What I watch for in QD implementations (beyond the marketing slide)
- Are the colors still stable when the set is bright (high APL, highlights, real tone mapping)?
- Is the “extra gamut” usable in real content, or does it collapse into pastel clipping at high luminance?
- Are we extending gamut responsibly (coverage and volume) without creating unnecessary metamerism risk?
🌈 SQDs and BT.2020-wavelength quantum dots (a CES 2026 snapshot)
Jeff describes the vials as red, green, and blue quantum dots produced at BT.2020 wavelengths. In the broader CES context, you also see brands talking about “Super Quantum Dots (SQDs)”—an evolution aiming for deeper red wavelengths, narrower spectral bandwidth, and better stability over time (especially challenging in red). This is not a single leap; it’s a continuation of a decade-long refinement process.
🧪 Discussion point (for reviewers and engineers)
“Narrower” primaries can be beneficial—but they also make mapping, calibration, and observer variability more important. For me, the interesting question is not only “how pure,” but “how usable and stable across real content and real viewers.”
🧱 The QD roadmap: QDEF, QDP, QDCC, QD-EL—and beyond displays
In another booth segment, Jeff summarized why quantum dots are still a long-term roadmap technology—not something that has already peaked. Quantum dots have been commercialized in displays since 2013, and the story in 2026 is still about pushing performance while keeping reliability and manufacturability under control.
- QDEF: still improving, with SQD as one visible example of better purity and stability.
- QDP (diffuser plate): fewer layers, lower cost, and (per Jeff) now moving into higher brightness/premium territory; longer-term, a heavy-metal-free target around 2028 is mentioned.
- QDCC (color conversion inks): improvements in brightness and performance, aligning with brighter QD-OLED products.
- High flux goals: longer-horizon work for AR and microLED contexts where luminance scale changes dramatically.
- QD-EL: electroluminescent QDs as a late-decade topic; Jeff mentions 2029 as a reasonable target for first market appearance.
- Non-display applications: optics, sunlight conversion for agriculture, pigments (including QD-embedded 3D printer filament), sensing.
🧠 “Chromaticity isn’t color”: why CIE 1931 xy is insufficient for HDR
In the analysis overlay, the lower-right corner shows CIE 1931 xy chromaticity. Jeff points to it and says something that should be obvious, but is still underrepresented in public discussions: chromaticity isn’t color, because it excludes luminance.
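Jeff's point is easy to demonstrate in a few lines: the xy projection normalizes away absolute scale, so two stimuli that differ only in luminance land on the exact same spot. A minimal Python sketch (illustrative only, not HDRmaster code):

```python
def xyz_to_xy(X, Y, Z):
    """Project CIE XYZ onto the 1931 xy chromaticity plane.
    The normalization divides out absolute scale, so luminance is discarded."""
    s = X + Y + Z
    return (X / s, Y / s)

# Two D65 whites: one at 100 nits, one ten times brighter.
dim = (95.047, 100.0, 108.883)        # XYZ scaled so Y = 100
bright = tuple(10 * c for c in dim)   # same spectrum, 10x the luminance

x_dim, y_dim = xyz_to_xy(*dim)
x_bri, y_bri = xyz_to_xy(*bright)
# Identical chromaticity despite a 10x luminance gap: the chart cannot tell them apart.
print(round(x_dim, 4), round(y_dim, 4))  # 0.3127 0.329
```

That is the whole problem in one picture: every HDR failure mode that lives in the luminance axis is invisible on this chart.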
When I created the analysis version with InnoPQ HDRmaster, it was important to me to show the conventional chromaticity context next to something more meaningful, while keeping the original footage in the same frame. Not to “trash” the classic diagram, but to show its limits in HDR: the moment luminance becomes dynamic, a 2D chart can hide the failure modes we care about most.
❓ Questions I now ask whenever someone shows me a triangle chart
- Where does saturation go when the display gets brighter?
- Do hues stay stable through tone mapping and gamut mapping, scene by scene?
- Is the chart describing the display—or one carefully chosen operating point?
- And what about accuracy inside the triangle: do the colors match the signal and the creative intent?
If you want to look at the tool used for overlays: http://hdrmaster.com
🪐 Gamut rings, in plain terms
The practical answer Jeff points to is gamut rings. The classic implementation uses ten rings, each representing a lightness (L*) slice of CIELAB—typically in steps of 10 from L* 0 to L* 100. The result is a 2D plot that still carries information about color behavior across lightness, instead of collapsing everything into a single 2D chromaticity projection.
Jeff adds a key point: L* is relative, not absolute. That becomes a major issue the moment we try to apply the same scheme to HDR, because CIELAB and L* were standardized around SDR assumptions.
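For intuition, the classic ten-ring binning can be sketched in a few lines: CIELAB L* from relative luminance, then a slice index in steps of 10. This is my own luminance-only reduction for illustration (real gamut rings are computed over full color), not Masaoka's published algorithm:

```python
def lab_lightness(Y_rel):
    """CIELAB L* from relative luminance Y (0..1, with 1.0 = diffuse white)."""
    eps = (6 / 29) ** 3  # threshold of the linear segment near black
    f = Y_rel ** (1 / 3) if Y_rel > eps else (841 / 108) * Y_rel + 4 / 29
    return 116 * f - 16

def ring_index(Y_rel, n_rings=10):
    """Classic gamut-ring slice: ring 1 covers L* 0-10, ..., ring 10 covers L* 90-100."""
    L = lab_lightness(Y_rel)
    return min(int(L // 10) + 1, n_rings)

print(lab_lightness(1.0))             # 100.0: diffuse white tops out the scale
print(round(lab_lightness(0.18), 1))  # 49.5: 18% grey sits near mid-lightness
print(ring_index(0.18))               # 5
```

The relative nature of L* is visible right in the signature: everything is expressed against a diffuse-white normalization, which is exactly what breaks once HDR highlights exceed it.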
🚀 The HDR extension idea: rings beyond L*100 (500 / 750 / 4000 nits)
So the natural question is: what if you want to go beyond L*100?
Jeff describes an idea being discussed within standards organizations: keep the familiar SDR rings for the perceptually grounded 0–100 region, but add extra rings for HDR behavior beyond L*100. Three additional rings are used:
- ☀️ to ~500 nits: increased brightness beyond the L*100 region (bright diffuse behavior).
- ✨ to ~750 nits: a specular highlight regime.
- 🧨 to ~4,000 nits: “signal extensions” (high-end signal headroom beyond typical diffuse assumptions).
Even though the software allows flexible values, this is a good compromise in my view: it keeps the standardized SDR foundation people already understand (I used 203 nits as the diffuse white, i.e. the L* reference), but adds just enough structure to describe the HDR part of the signal in a useful way (diffuse vs. specular vs. extension).
🛠 Practical note
In HDRmaster, the nit levels for these rings (and the diffuse white reference) are user-definable. That flexibility matters, because HDR “reference levels” are not a single global constant in the real world.
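To make the extension concrete, here is a toy mapping from absolute nits to a ring index, using the 203-nit diffuse white mentioned above and the 500/750/4,000-nit bounds from the booth discussion. The numbering and the clipping behavior at the top are my own assumptions for illustration, not a standardized scheme or HDRmaster internals:

```python
DIFFUSE_WHITE_NITS = 203.0                 # user-definable in practice
HDR_RING_BOUNDS = (500.0, 750.0, 4000.0)   # bright diffuse / specular / signal extension

def lab_lightness(Y_rel):
    """CIELAB L* from relative luminance (0..1)."""
    eps = (6 / 29) ** 3
    f = Y_rel ** (1 / 3) if Y_rel > eps else (841 / 108) * Y_rel + 4 / 29
    return 116 * f - 16

def ring_for_nits(nits, diffuse_white=DIFFUSE_WHITE_NITS, hdr_bounds=HDR_RING_BOUNDS):
    """Rings 1-10: classic L* slices up to diffuse white; rings 11-13: HDR extensions."""
    if nits <= diffuse_white:
        L = lab_lightness(nits / diffuse_white)
        return min(int(L // 10) + 1, 10)
    for i, bound in enumerate(hdr_bounds, start=11):
        if nits <= bound:
            return i
    return 10 + len(hdr_bounds)  # clip anything above the last bound into the top ring

print(ring_for_nits(203))   # 10: at diffuse white
print(ring_for_nits(400))   # 11: bright diffuse, beyond L*100
print(ring_for_nits(600))   # 12: specular regime
print(ring_for_nits(2000))  # 13: signal extension
```

Note how the SDR portion stays perceptually spaced (via L*) while the HDR rings are simple nit thresholds; that asymmetry is precisely what the standardization discussion has to resolve.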
📐 Standardization: why this can’t stay a booth idea
Chris asked whether Jeff is discussing this direction with Kenichiro Masaoka (the pioneer most strongly associated with gamut rings). Jeff answers carefully (as he should), but confirms that discussions are happening and that Masaoka-san is supportive of this direction.
This matters because without standardization we get five vendors, five definitions, and charts that can’t be compared. With standardization, we get a shared measurement language that reviewers can cite, engineers can optimize toward, and content creators can use to predict failure modes.
📚 Further reading: Masaoka, “Gamut Rings Color Scope” (SID / Information Display):
https://sid.onlinelibrary.wiley.com/doi/full/10.1002/msid.1456
🧰 HDRmaster 2026: what we changed (and why it matches this discussion)
Around CES 2026 we pushed hard on HDRmaster upgrades. I’m mentioning them here because the booth conversation is not theoretical for me—I’ve been building parts of the tooling specifically to answer these questions in a practical way.
- 🪐 HDR gamut rings with flexible additional rings for highlights, speculars, and signal extensions.
- 📊 Gamut ring statistics: percentage of pixels per ring plus saturation distributions (to distinguish “one bright pixel” from scene-level stress).
- 🎞️ Scene-level gamut rings: one diagram per scene, not only per frame—often closer to how HDR decisions are actually made.
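The per-ring statistic in the second bullet boils down to a histogram over ring assignments. This toy version (hypothetical function name, not HDRmaster internals) shows how a single specular pixel stays visible as a tiny percentage instead of silently dominating a peak-value readout:

```python
from collections import Counter

def ring_histogram(ring_indices):
    """Percentage of pixels per ring, from a list of per-pixel ring assignments."""
    counts = Counter(ring_indices)
    total = len(ring_indices)
    return {ring: 100 * counts[ring] / total for ring in sorted(counts)}

# Hypothetical frame: 990 shadow pixels (ring 2), 9 midtones (ring 9), 1 specular (ring 12).
rings = [2] * 990 + [9] * 9 + [12]
print(ring_histogram(rings))  # {2: 99.0, 9: 0.9, 12: 0.1}
```

A frame-level peak metric would report "ring 12" for this frame; the distribution makes clear that the scene is overwhelmingly dark with one isolated highlight, which is exactly the "one bright pixel vs. scene-level stress" distinction.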
We also implemented HDR10+ Advanced. I like it because richer metadata fields can support better tone-mapping and gamut-mapping decisions scene by scene, and those fields also turned out to be useful for more robust scene detection. Many more small but meaningful improvements landed in the 2026 cycle; I'll unpack those in separate posts, plus videos that show the features in action.
🧩 Beyond RGB: cyan and yellow are not free lunches
Chris and Jeff discussed Hisense showing two “beyond RGB” ideas at CES: an LCD with RGB miniLED backlight plus cyan, and a direct-view microLED with RGB plus yellow. Jeff’s skepticism is healthy. The incoming signal is still RGB, the subpixels are still RGB, and once you add extra primaries you need algorithms that map RGB content into architectures with additional degrees of freedom—without creating new artifacts.
My own take: cyan and yellow are hard to manage in an RGB world, exactly as Jeff implied. However, I hope they will not be used to stretch the gamut triangle into odd shapes for marketing slides, but rather in the third dimension, color volume, where they can actually reduce real-world restrictions.
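To see why the mapping problem is real, consider the most naive possible RGB-to-RGBC split: move the shared green/blue energy onto a cyan channel. This toy assumes the cyan primary sits exactly at the green+blue mixture point, which no real panel satisfies; it only illustrates that the decomposition is underdetermined (note the free `cyan_gain` parameter), so every multiprimary display needs an explicit mapping policy:

```python
def rgb_to_rgbc(r, g, b, cyan_gain=1.0):
    """Toy RGB -> RGBC split: route the overlapping green/blue energy to cyan.
    A real algorithm must account for the cyan primary's actual chromaticity and
    luminance; this sketch only shows the extra degree of freedom."""
    c = min(g, b) * cyan_gain
    return (r, g - c, b - c, c)

# A sky-like tone: its green/blue overlap can ride the cyan channel entirely ...
print(rgb_to_rgbc(0.25, 0.5, 0.75))                 # (0.25, 0.0, 0.25, 0.5)
# ... or not at all: nominally the same target color, different drive signals.
print(rgb_to_rgbc(0.25, 0.5, 0.75, cyan_gain=0.0))  # (0.25, 0.5, 0.75, 0.0)
```

Between those two extremes lies a continuum of drive choices with different luminance, power, and artifact trade-offs, and the incoming signal says nothing about which to pick.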
🌊 My hypothesis: where cyan / yellow could matter (if done responsibly)
- 🟦 Cyan: reduce restrictions in skies and oceans, where P3 often runs out of room in real scenes.
- 🟨 Yellow: reduce restrictions in sun-lit yellows and difficult gold/brass tones that tend to break (dulling or hue shift).
- 📦 Overall goal: preserve saturation at high luminance instead of stretching a 2D triangle.
📎 BT.2020: area vs coverage (and why I consider the area debate mostly obsolete)
Chris and Jeff made a distinction that should be mandatory in marketing and reviews:
- 📐 Area: how large the triangle is in a chromaticity diagram.
- ✅ Coverage: how much of the BT.2020 triangle you actually cover.
Many brands can claim “100% BT.2020” if they mean area (and some can claim more than 100% area by shifting primaries). But that does not mean the primaries match BT.2020—which would be the only way to cover 100% of BT.2020.
Jeff also makes the practical point: BT.2020 primaries sit on the spectral locus (pure monochromatic light in the idealized sense). If you are off by a small amount in peak wavelength, geometry punishes you—coverage drops fast. That's why true 100% coverage is extremely difficult for non-laser systems, and why even nominally perfect primaries can be degraded by LCD color-filter crosstalk.
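The area/coverage distinction is easy to compute from published xy primaries: area is a shoelace formula, coverage is the area of the intersection with the BT.2020 triangle (here via Sutherland-Hodgman clipping). The "hypothetical" display with near-monochromatic 700/520/460 nm primaries is my own invented example of area above 100% with coverage below it:

```python
def shoelace_area(poly):
    """Unsigned area of a polygon given as [(x, y), ...]."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1] - poly[(i + 1) % n][0] * poly[i][1]
            for i in range(n))
    return abs(s) / 2

def clip(subject, clip_poly):
    """Sutherland-Hodgman: intersect `subject` with a convex CCW `clip_poly`."""
    def inside(p, a, b):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0
    def intersect(p, q, a, b):
        x1, y1, x2, y2 = p[0], p[1], q[0], q[1]
        x3, y3, x4, y4 = a[0], a[1], b[0], b[1]
        den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    out = list(subject)
    for i in range(len(clip_poly)):
        a, b = clip_poly[i], clip_poly[(i + 1) % len(clip_poly)]
        inp, out = out, []
        for j in range(len(inp)):
            p, q = inp[j], inp[(j + 1) % len(inp)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    out.append(intersect(p, q, a, b))
                out.append(q)
            elif inside(p, a, b):
                out.append(intersect(p, q, a, b))
    return out

# CIE 1931 xy primaries (R, G, B), counter-clockwise:
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
HYPO   = [(0.7347, 0.2653), (0.0743, 0.8338), (0.1440, 0.0297)]  # 700/520/460 nm

ref = shoelace_area(BT2020)
area_pct = 100 * shoelace_area(DCI_P3) / ref
coverage_pct = 100 * shoelace_area(clip(DCI_P3, BT2020)) / ref
hypo_area_pct = 100 * shoelace_area(HYPO) / ref
hypo_cov_pct = 100 * shoelace_area(clip(HYPO, BT2020)) / ref
print(f"P3 vs BT.2020 -- area: {area_pct:.1f}%, coverage: {coverage_pct:.1f}%")
print(f"hypothetical  -- area: {hypo_area_pct:.0f}%, coverage: {hypo_cov_pct:.0f}%")
```

P3 sits almost entirely inside BT.2020, so its two numbers nearly coincide; the hypothetical display reports an area well above 100% while its coverage stays below 100%, which is exactly the marketing gap Chris and Jeff are pointing at.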
🧭 My personal position
BT.2020 area discussions are increasingly obsolete because they are confusing. Coverage and volume behavior are what I care about—because what is the value of area when there is no matching signal (or stable perceptual behavior) for it?
✅ Better questions than “What percent of BT.2020 are you?”
- Which real colors (in real content) are improved beyond P3, and which are still compromised?
- How stable are hues under tone mapping and gamut mapping, scene by scene?
- What is your strategy for wide gamut without creating unnecessary metamerism risk?
🤝 Closing: the next conversation I want us to have
If I could change one habit after CES 2026, it would be this: stop treating a single 2D number as a proxy for HDR color quality.
Instead, I want us to adopt a shared language that connects content (what creators master), perception (what viewers notice), display behavior (tone mapping + gamut mapping under constraints), and measurement (repeatable and honest about luminance).
Gamut rings—especially with a sensible HDR extension concept—are a practical bridge between engineering and perception. They are not the only answer, but they are one of the few tools that scale from labs to real conversations without collapsing everything into a marketing metric.
🔗 If you want to go deeper (or collaborate)
- HDRmaster (analysis + HDR workflows): http://hdrmaster.com
- HDRforum (downloads + classroom + discussion): http://hdrforum.com
- My work / contact: http://www.ff.de
I support manufacturers, post houses, and reviewers with reference content, post-production / HDR mastering, measurement-driven analysis, consulting, and seminars—always with the goal of translating specs into what people actually see.