I have at least one friend who wants individually-addressable bayer layouts, but that likely won't happen.
But OLED is a remarkably personal technology. Some people also have issues with how the images are "strobed" during updates, etc.
But fret not, they announced at CES there will be OLED with vertical pixel arrangements really soon.
I've moved back to using a pair of 4K LCDs that I had, and honestly the resolution and aspect ratio are better for text and programming anyhow.
My PC monitors are my only remaining LCD screens largely due to the text fringing issues mentioned in this article and bezel size.
These monitors are great when scaled but have real issues rendering text and other fine details at their native resolution.
If I had to guess, something in the manufacturing process makes it more difficult.
I remember getting one of the early Samsung OLED PenTile displays, and despite it having a higher on-paper resolution than the display of the LCD phone it replaced, the fuzzy, fringey text made it far less readable in practice. There were other issues with that phone, so I was happy to resell it and go back to my previous one.
PenTile for example (as another commenter pointed out) was woeful with text, and made things look fuzzy.
I'm not a fan of ClearType, but even on Linux, OLED text rendering just isn't as good in my experience (at normal desktop monitor DPI).
Perhaps it's down to the algorithms most OSes use instead of ClearType, but why hasn't this been solved by now, even outside Windows?
I'd say that because the article documents my experience at this point in time, the only poor timing is that my old-ish monitor died and I went looking for a replacement; the article documents how that went.
Yes, there is color fringing if I take a zoomed-in picture with my camera, but nothing I notice in day-to-day use. And I've been highly annoyed by missing or bad ClearType rendering.
I specifically went for the 27" to get the extra pixel density though. I might not have been happy with the 32" variant.
That said, the true blacks and bright whites are something else. For me it was a very significant upgrade both on the desktop and in games, despite the previous monitor being an upper-level LCD when I bought it 4-5 years ago.
Here is a more detailed look at several different subpixel arrangements: https://pcmonitors.info/articles/qd-oled-and-woled-fringing-...
And encouragingly, both LG and Samsung were demoing RGB stripe (LCD-style) arrangements at CES this year.
LCD can have a uniform layout because it's a passive layer doing the filtering. In OLED, each pixel is active and that blue one is trying to burn itself out much faster than the other two.
OTOH the OLED panels on Apple's iPad Pro series are outstanding, and I'm excited for them to come to MacBooks. Yet another example of Apple's hardware team taking its time to do things right.
He's also got ClearType on and set to RGB stripe even though the OLED is not RGB stripe (though to be fair, Windows doesn't really make it clear what each page of the ClearType tuner does).
But yeah, if you use a _tiny_ font and sit _really_ close to the screen, you see fringing. In practice for me, it's been unnoticeable.
You act like being able to see this is a minority opinion; it's not, and it's a known issue. And you don't need to be sitting close, or using a tiny font, to notice it.
My 4K OLED is noticeably less clear compared to an IPS display and I’d never use it for productivity as a result, because why would I willingly subject myself to an objectively worse experience?
Exactly what I want for the thing I stare at for 8 hours/day. Not great. Not good. Perfectly adequate.
It actually helped me enforce a separation between WFH/where I relax anyway, so not the worst.
Personally I prefer VA to IPS by far, because IPS looks washed out to me.
Then as I went back to where he was describing the problem ("fringing") and scrolled back to the images, I kept forgetting which was which (and which image was supposed to be "worse").
I'm on a 2025 MacBook, so maybe the laptop's monitor masks the issue?
That's an interesting point you mention about not seeing it, because prior to buying an OLED I'd read a bunch about fringing and in many articles I just... couldn't see it. I couldn't tell what was being illustrated in the images.
It wasn't until I sat in front of one for a few hours, in my room and lighting and with my apps and had funny-feeling eyes and a this-seems-off feeling that I decided to investigate. And yes, those macro photos show fringing, but it /is/ hard to understand how the subpixel pattern translates to on-screen weirdness until you've seen it for yourself.
The subpixel geometry on Samsung's QD-OLED needs very specific font configuration to be displayed correctly, and even then it merely stops looking bad.
At one point in time 95%+ of HN comments were cheering on Atom the text editor, and later VSCode, as being fast enough or imperceptibly slow, while Sublime users were baffled as to why. And Sublime wasn't even the fastest text editor before Zed came out.
Yes, 10 times out of 10 I could tell OLED font rendering from LCD. I wish I couldn't. Some people call it taste, some call it an absurd requirement.
I could go on and on. The point is most people aren't very picky, and "picky" is defined relative to the average. But there are those of us who have, let's say, very high standards: caring about PPI, refresh rate, colour accuracy, keyboard key travel distance, trackpad responsiveness, all the tiny details that I wish I could unsee and un-feel.
As the article states, RGB tandem OLED is coming out, and I can't wait to see it in person. I have been pro-LCD on laptops for so long that when I learned Apple will soon ditch LCD for OLED I was worried. Hopefully the new subpixel layout will fix it.
I'm sure I would notice and be annoyed by this fringing too unless the pixels were so small I really couldn't see them. Probably needs to be slightly higher than 200 then. But I haven't seen oled monitors with such high DPIs. The highest I've seen is 4K on 27" which wouldn't even do for me on LCD.
I have Dell P2415Q, from 2015. There are, like, 4 other (legacy) models of 24" 4K out there, and that's it. I've no idea why they don't manufacture them.
I also have one, and it's holding up pretty well. A month or so ago I broke out my colorimeter and it had almost 100% sRGB at around 120 cd/m2. I don't recall the delta E, but it was very low.
While I didn't measure the backlight, it does seem to not go as bright as before, judging by the levels I set in the OSD. I never went above 70% or so when the sun was shining in the room (not directly on the screen, though), so it didn't have any effect on me.
I understand there are two versions; I have the second one. But I don't think there's a difference in the panel itself; I think the change was related to HDMI support.
I can't comment on the latency; the only games I played on it were Civilization and Anno Something. Never had a problem with this.
The last new one I bought, in 2018, cost the same as my first one in 2015, so it is one of those few computer accessories that held its price over its lifetime rather than decreasing.
If you cannot see the P2415Q degrading and/or being generally crap in any metric (EXCEPT DPI) when compared to even the non-IPS black Dell monitors from this decade, you are simply blind. They are early HiDPI-revival-era panels, and it shows.
Some of the newer IPS Black panels are so good that it is tempting to just take the DPI hit and go 27''... albeit with care, as it seems Dell decided this last year to add some filter that further increases blurriness.
I actually have a newer "ips black" dell, an ultrasharp 3223qe and yes, it's much better.
But what I'm saying is that the old one is still good. I never pretended it was as good as current models; that's moving the goalposts. The initial comment was about the display degrading, so I compared it to itself when new (not even to other similar models from that era!). Mine only seems to have become somewhat dimmer, but not enough to matter in my day-to-day use since it's still brighter than I need.
I have the same monitor, and I believe over 200 PPI is pointless for a desk monitor unless you are very close to it. It makes sense for laptops, which you use much closer, but I think most people keep desk monitors much, much farther from their eyes.
Source https://www.sven.de/dpi/
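The viewing-distance point can be made concrete with pixels per degree of visual angle. A minimal sketch (the sizes and distances below are illustrative assumptions, not taken from the linked calculator):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def ppd(ppi_value, distance_in):
    """Approximate pixels per degree of visual angle at a viewing distance."""
    # One degree of visual angle spans distance * tan(1 deg) inches on screen.
    return ppi_value * distance_in * math.tan(math.radians(1))

laptop = ppd(ppi(3840, 2160, 14), 18)   # 4K 14" laptop viewed at 18"
desktop = ppd(ppi(3840, 2160, 27), 30)  # 4K 27" desktop viewed at 30"
print(round(laptop), round(desktop))    # 99 85
```

The extra distance largely compensates for the lower density: the ~163 PPI desktop at arm's length resolves nearly as many pixels per degree as the ~315 PPI laptop up close.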
A 32" 4K screen is nice enough, and a reasonable one [0] can be had for a third of that. My I-don't-know-how-old desktop that I saved from the bin at work, sporting an i5-6500, could drive that with no issues.
---
[0] Around 2020 I bought an LG something-or-other for 350 Euros for work: 32", 4K, some form of VA panel. It had pretty good colors and better contrast than the IPS monitor I use as a Sunday photographer.
that's a weird start. for me the start is 4k with proper blacks & proper color calibration
> given that a couple of hundred more over 5 years would have surely improved productivity by a little bit
no company wants the bulk of its people to improve their productivity by even a little bit. you should be productive enough, that's it.
> It felt almost done out of spite to keep people in their place
otherwise amazon and the likes would have competitors in every country. but I don't think it's out of spite.
it's the 'established' interpersonal culture between employers and employees, like in packs without natural alphas: if one beta-beta steals the show from the beta-alpha a few times too many, he's a goner. in packs with alphas the performer gets commended and a chance to compete for the top, because you want your team to be led by the currently best. hasn't been the case in our species for a long while now.
companies don't treat their employees badly out of spite, it's so they can stick to low, moderate(d) standards and cultures, ... and have an easy work life
The only thing that really needed an upgrade was the display. I ditched the crappy 1366x768 TN for a 1440p IPS and an LVDS-eDP conversion board. Looks fantastic. Runs great.
Most of my friends, and nearly everyone on the internet, were like: who buys laptops because of speakers? They all sound the same, get a Dell. I think it was on either AnandTech or Tom's Hardware. It was certainly before the Reddit era.
Somewhere along the line, maybe 2015 to 2020, YouTube reviewers started bashing Dell and other laptops for their crappy cost-saving speakers. (Thank you, Dave2D.) And they managed to actually show in video how awful they were. All of a sudden "consumers" took notice and have since demanded better speakers. Laptop speakers have improved tremendously in the past 5 years. As it turns out, people need to learn how to compare. And once they do, they can't unsee it.
But in all honesty, in the past few years I have really, really wished I didn't have the ability to tell the difference. To not have the mentality that something could be "better". To stop thinking how everything, from Food, Furniture, Tech Stack, UI, Buildings, anything, could be better.
Some say it is a gift, I think it is more of a curse. And it is a struggle and tiring. I then discovered my retreat for peace was to go out to nature and enjoy the creation of god.
Now, I take a more 80/20 approach: I clearly define my needs and shut down any thoughts about features and capabilities that I don't need right now. Frankly, after years of thinking that I might use a feature later, I realise that I never do and never recover my investment in these kinds of gadgets.
Finding a trustworthy review source is key — by trustworthy, I mean mostly in line with your own standards. However, if you can try it yourself, that's always better.
For sound on small devices with clear voice and a good dynamic range, Samsung is quite good with its high-end Galaxy Tab line.
This is how I became quite learned in sound reproduction (incl. acoustics and psychoacoustics) then bought Genelec loudspeakers, for example. But I don't care about finding Samsung B-dies (I think?) for my RAM.
Nothing man-made could even compare with that perfection.
You know about things that are, to others, unknown unknowns. Since ignorance is bliss, it definitely feels like a curse to you; and since what one doesn't know can hurt them, others would see it as a blessing.
Funny world we live in.
I second this. I can tell, and I would never wish that ability on my worst enemy. Very glad there's a (slim, but existing) market catering to that — and that I no longer have to buy a monitor that costs as much as a small motorcycle to not be constantly infuriated at everything in my field of vision when working.
I can see the difference between 60Hz and 120Hz on phone screens, and I think it's worth the impact on battery life.
I can see the difference in speed between VSCode and editors like Sublime or Zed, however in this case I prefer the additional features at the cost of speed/smoothness.
I kind of want to see one of the Atom vs Sublime people use that for a hot minute.
At home I upgraded from 60hz to 165hz and was underwhelmed. I see the difference .. but eh.
After asking the owner of said screen how he could stand that... "stand what?"
Yep, I guess most people are not that picky.
When DLP projectors first came out, I couldn’t watch them. I would see colors breaking in fast motion scenes and whenever I would move my head even slightly (and … we all move our head slightly often when watching a movie).
When I told other people, some of them nodded in understanding, but the vast majority thought I was making things up - for them, it was a rock solid picture.
One of my friends replied: “I can see about 300hz. Not all the time - only when I have saccadic movements; but that means many fluorescents, DLPs and other light sources drive me crazy. I guess you’re also a member of the crazy club”
Some people can hear 26khz. Some people can see DLPs. Some people can see the alternating pattern….
My homework at that time revealed a couple of things:
1. Liquid crystals are individually driven by AC waveforms, not DC as one might assume. This is the nature of the beast. The frequency at which the signal alternates is not necessarily very high. Thus, sometimes, this alternating nature is visible.
2. Some displays use dithering. A given display might support just -- say -- 6 bits per subpixel. To get the full 8 or 10 or whatever number of bits that are expected as a final output, the in-between steps are approximated by switching between two values -- sometimes (again) at a fairly low frequency that is visible.
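Point 2 can be sketched numerically. This is a toy model of temporal dithering (frame rate control), not any specific panel's algorithm: a 6-bit panel fakes an 8-bit level by alternating between the two nearest 6-bit levels across frames.

```python
def frc_frames(level_8bit, frames=4):
    """Emit `frames` 6-bit values whose average approximates an
    8-bit target level on a panel with only 6 bits per subpixel."""
    lo = level_8bit >> 2        # nearest 6-bit level at or below target
    hi = min(lo + 1, 63)
    frac = level_8bit & 0b11    # remainder decides how often `hi` is shown
    # Show `hi` in `frac` of every 4 frames, `lo` otherwise.
    return [hi if i < frac else lo for i in range(frames)]

# 8-bit level 130 averages out to 6-bit 32.5: two frames at 33, two at 32.
# If the alternation frequency is low enough, this flicker becomes visible.
seq = frc_frames(130)  # -> [33, 33, 32, 32]
```

The averaging only holds over time, which is exactly why some people perceive it as shimmer or flicker rather than a steady in-between shade.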
...
But anyway, that ViewSonic monitor: Most people thought it looked fine, but it drove me nuts.
Windows?
I doubt it's ClearType; the close-up photo of the U3223QE shows all subpixels uniformly dimmed on the fringes. The post also says the monitor is attached to a Mac mini, and a previous post about OpenSCAD has a screenshot with macOS window decorations.
Had I been using this on Windows I would have started to solve it by trying to tune that.
You have to understand there's enormous effort that needs to go into this to make things look good. It's absolutely no surprise to me that Jobs era Apple used to stick to integer scaling ratios with relatively low-res phones while the competition battled with paper specs.
The trick to make things look good is to be mindful of the pixel grid, (and the subpixel layout). You have to choose font spacing and fonts so that major font features line up with the pixel grid. You sometimes have to slide a letter a bit to the left or right, which might result in inconsistent spacing, you might even want to have multiple versions of characters to hide these issues.
This applies to borders as well (both spacing and thickness).
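A toy illustration of the grid-snapping idea described above (the advance widths are hypothetical, and real font engines do far more than this):

```python
def layout_snapped(advances, start=0.0):
    """Place glyphs left to right, snapping each pen position to the
    pixel grid so vertical stems land on pixel boundaries."""
    positions = []
    pen = start
    for adv in advances:
        positions.append(round(pen))  # snap to nearest whole pixel
        pen += adv
    return positions

# Fractional advances (e.g. 7.3 px per glyph) accumulate drift; snapping
# trades perfectly even spacing for sharp stems, so gaps vary by a pixel.
print(layout_snapped([7.3, 7.3, 7.3, 7.3]))  # -> [0, 7, 15, 22]
```

Note the uneven gaps (7, 8, 7 pixels): that is exactly the "slide a letter a bit to the left or right" inconsistency mentioned above.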
Sometimes you can't make it look good no matter what you try - and the designer has to change it.
While flexbox and other super-duper layout algorithms might be very clever, if you use them, there's no way to line things up perfectly, and if you expect to, you might be in a world of hurt.
Adding PPI is a poor way to fix this. Even if you 2x the resolution, going from 1080p, to 4K, these issues still persist, and you quickly run out of hardware beyond that.
It's no wonder modern 'flat' UIs usually have 1-2 pixel-ish gradients on the edges of features, or use smooth transitions; it's a cop-out, but it's the only thing that works with flexible layouts and differing pixel densities. Subtly, though, you notice things looking a bit blurry, just as if the images were low-res, even when the pixel density is insane.
ClearType is brilliant, but it only works with a certain set of assumptions, like a bit of light bleed (which exists on LCD but not on OLED afaik), and needs to know the subpixel layout, and can result in absolutely gorgeous looking fonts with relatively low res displays.
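The core idea can be sketched as a toy (this is a crude stand-in, not Microsoft's actual filter): treat each pixel's R, G, B stripes as three horizontal coverage samples, then blur across neighboring subpixels so no single color channel carries a hard edge alone.

```python
def subpixel_row(coverage_3x):
    """Map glyph coverage sampled at 3x horizontal resolution onto the
    R,G,B subpixels of one RGB-stripe row, with a small low-pass filter
    to tame color fringing. Assumes len(coverage_3x) % 3 == 0."""
    # Pad the edges, then mix each subpixel with its neighbors (1-2-1 kernel).
    padded = [coverage_3x[0]] + list(coverage_3x) + [coverage_3x[-1]]
    filtered = [(padded[i - 1] + 2 * padded[i] + padded[i + 1]) / 4
                for i in range(1, len(padded) - 1)]
    # Group triples into (R, G, B) pixel values; more coverage = darker.
    return [tuple(round(255 * (1 - c)) for c in filtered[i:i + 3])
            for i in range(0, len(filtered), 3)]

# A hard black edge on white gets a slightly warm/cool transition pixel
# instead of a jagged one -- the source of both the sharpness and the
# fringing when the assumed subpixel order is wrong.
print(subpixel_row([0, 0, 0, 1, 1, 1]))  # -> [(255, 255, 191), (64, 0, 0)]
```

This only works if the renderer's assumed stripe order matches the physical panel, which is exactly what breaks on non-stripe OLED layouts.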
There's a reason why Windows 95-era UIs have a cult following - it's insane how sharp they looked even on hardware that on paper is much worse than modern stuff.
I got unbearable eye strain from it, even though I use rather large fonts, and the ppd was the same as with my previous IPS. Yes, the “more fuzzy” text was very much noticeable too.
Maybe it varies by person, maybe it’s influenced by things like astigmatism, but I totally see where the author is coming from, and I too am waiting for the new OLED panels to see if there’s an improvement.
I do have astigmatism. You do make me wonder if this plays a part as well...
I also have significant problems with blue LEDs around the house, to the point where I've removed, replaced, or covered almost all of them. They really, really bother me because it feels like my eyes never focus on them and they leave me feeling slightly disoriented.
There is no Cleartype; this is macOS. And as mentioned I couldn't see the fringing from normal use, that only became evident with macro photos. During normal use it just looked sparkly or weird or artifacted.
And yes, the fonts are small, but that default size in VS Code or Numbers.app -- the example photos -- work well for me. And look fine on an LCD.
And no, no Cleartype here because (as mentioned in the first paragraph) it's a Mac running macOS.
There's no Cleartype in use here, as the first paragraph says, it's macOS.
And I'm using the default font sizes because they work well for me on an LCD. The point of this post is to document my experience with trying a current-gen generally-available OLED and how it did not work out well because of the subpixel arrangement.
It's also not just an issue on text, it affects any high contrast edges, especially perfectly vertical or horizontal ones. This meant that CAD stuff, spreadsheets (the grid), and large colored sections in graphic design software looked off as well.
Also, many people can see and are bothered by particular non-rectangular pixel layouts - it doesn't require doing odd things.
It will take so long for burn-in that you'll probably want to buy something new anyway.
My LG OLED 4k tv is by far the best picture I ever had.
My Samsung 4k 27" gaming OLED is beautiful when gaming too
Meanwhile I’ve got an MSI OLED 32” 240Hz @ 4k monitor which was super expensive but is absolutely incredible. It takes getting used to a monitor that performs a maintenance routine on itself any time you leave it active for more than a few hours. But it’s great for work (with some aggressive zoom levels) and gaming (with some aggressive black point levels).
Are you saying this is a con? It sounds like a pro to me.
Off topic, but JFYI, with last year's firmware update (OLED CARE 2.0), you can now delay the refresh notification for up to 24 hours. I haven't seen the notification pop up since updating.
The article also compares a budget Dell S series display (OLED) vs. a higher-end Dell U.
Only Microsoft can fix it, and as far as I know, they don't seem interested.
https://github.com/snowie2000/mactype
https://github.com/CoolOppo/GDI-PlusPlus
I use MacType and it works really well. You can tune many more things than with ClearType.
Anyway, OLED is great; I'm sitting two arm lengths away from the panel.
People complaining are probably Gen Z who never sat in front of an ol' CRT in the 90s and are spoiled by smartphones running 4K on minuscule 7" displays at 460 PPI.
It's really annoying because all I really want is to disable ClearType on my primary high DPI monitor while keeping it with default settings for my two side monitors, but Windows does not let you configure it per monitor.
The other issue is that it's not just a text problem. It affects any high color contrast edges, especially directly vertical or horizontal ones. So subpixel rendering tweaks for text rendering (eg: Cleartype) don't solve the whole problem.
The tradeoff is worth it in a lot of scenarios, but I've been thinking about getting a "coding only" monitor that I use for long sessions instead.
I begin to feel like we may need the equivalent of what audio shops used to have: a listening room with normal furniture and a big switch to test different things, but for eyes not ears.
I only buy s/h Dell monitors 3-5 years behind bleeding edge because I am a cheapskate with old eyes, who can't see past the dust on my glasses anyway. But I genuinely can relate to this problem. It would suck to invest in the best you can afford and find it's not doing what the dollars expect for you.
It doesn’t even need this. The old way of buying stuff was from a shop. With big ticket items you build a relationship with staff and their recommendations helped. Now it’s all shipped or bought from faceless big box stores.
It seems that LCD has long been the best technology for desktop monitors - but interestingly, despite its popularity, may never have been the best technology for TVs. CRT, plasma and now OLED have all had better image quality than contemporary LCDs.
https://www.heise.de/en/news/OLED-with-true-RGB-LG-and-Samsu...
When the OS assumes correctly what the monitor actually looks like, you get even better text rendering. When it guesses wrong you get a horrible mess.
I know it's the panel, because I have an office BenQ 21 inch with technology called something like EyeCare as the second monitor. Compared to the gaming one, it's a balm on my eyes. Sometimes I'll put a document or LLM window there even when the main screen is empty, just because of how nice it feels on the eyes.
Except for a few situations where the poor man's HDR does help me see more details in the dark (a small edge in multiplayer), I believe I would be better off with a monitor engineered for office work.
Think about it: I got lured by all the gaming hype, but what I really needed was a monitor that is 80% office and 20% gaming, not 100% gaming. And I believe that's the case with others complaining about eye strain.
Moisturizing eye drops and SafeEyes (an eye-exercise Linux app) help a lot though. SafeEyes has an alternative on Mac called EyeLeo (but I never used that). I recommend them both to everyone in this thread. Take care of your eyes. They are the part of our body most susceptible to drying, and eye exercises help with that. And they are fed by only a minuscule artery; eye exercises help keep good throughput on that too.
> but as noted we found this quite obvious to the eye as well
... and then 2 images that look exactly the same to me :(
> but sir, what about fringing
Skill issue. Just configure your text rendering correctly.
> but sir, what about burn-in
Didn't happen to me.
Must say my first-generation Samsung display looks amazing both for gaming and programming though. If it wasn't for the annoying smart-TV stuff and the mini connectors, it would be a perfect monitor.
> Within the past few weeks LG has announced RGB stripe OLED panels which will resolve this problem, but there aren’t currently any monitors available using these panels
with two links to five known upcoming devices.
https://news.lgdisplay.com/en/2025/12/lg-display-unveils-wor...
https://www.analyticsinsight.net/news/ces-2026-first-rgbstri...
3440x1440 @ 34" (110ppi): Asus PG34WCDN, Asus XG34WCDMS, MSI MEG X, MSI MPG 341CQR X36
Digging into this further, I also found another Asus panel that's closer to the usual Mac pixel densities, if not all the way there yet (5K would be; this is 4K):
3840x2160 @ 27" (163ppi): Asus PG27UCWM
I'll still be waiting for 5K @ 27" with the new tech, but I'm really glad to see they finally solved this!
MS ClearType isn't always compatible with OLED.
The OP seems to have a problem not only with fonts but also with straight lines in CAD.
It's a problem with how this subpixel problem aligns with my use (mostly static content with high contrast edges) at the display scaling I'm using.
It just doesn't work for me, so until I can get a more traditional pixel pattern OLED at a price I'm willing to pay I'll just go back to LCD.
I used an LG C2 42" as a monitor for a few years. The color fringing was particularly bad for me because I like yellow text and LG uses RWBG. 4K 42" and 1440p 27" are about 110 DPI. This is not enough. 4K 27" is about 160 DPI. That is enough. We've already pushed past needing to care about subpixel layouts if you properly weight pixel density in your selection.
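For what it's worth, the densities quoted can be checked directly from resolution and diagonal (rounded; the 42" figure actually lands nearer 105 than 110):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 42)))  # 4K 42"    -> 105
print(round(ppi(2560, 1440, 27)))  # 1440p 27" -> 109
print(round(ppi(3840, 2160, 27)))  # 4K 27"    -> 163
```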
I actually am using it, but I didn't want to go down the rabbithole of an all-encompassing article on displays, PPI, scaling, etc. Using it to scale the display really helps, but I find that for the size of things I like 3008x1692 (on a native 3840x2160 panel) and this looks fine on an LCD. And is better than native res on the OLED, but still not great. It still bugged my eyes.
I just went with native res for demoing things because it's a worst case for the fringing problem, which affects all strong-contrast edges, not just text. It was also really noticeable on thin/narrow lines, such as when doing CAD or between cells in spreadsheets.
Which is why I ended up buying a replacement 24" 1920x1080 recently. I needed pixels large enough to distinguish with my fave small font. If I want more pixels at once on the screen I need a larger screen.
The screen shown in this blog looks like it's ~140 PPI. Sure these screens are cheap, but they're best used for moving graphical content.
In the demonstration image the text is just 9 pixels tall; while that is legible, it is unacceptable for long-term reading and is completely reliant on subpixel rendering to produce an impression of smoothness.
I've definitely enjoyed having the extra screen real estate over the 27" monitor, and the extra resolution has been very helpful for having a bunch of windows open in Unity.
This year at CES there were a number of new monitors unveiled that compete in this space. There's a new Samsung monitor (G80HS) that is a 32" 6k with a higher refresh rate than what you'd find with existing offerings. Unfortunately it has the matte coating instead of glossy, so clarity will suffer.
Also of interest are the new 27" 4k offerings with true RGB stripe subpixel layout. This should fix text rendering problems, especially on Windows. Both Samsung and LG are making these OLED monitors with the true RGB layout. There will almost certainly be glossy coatings offered with these panels, and they'll have higher refresh rates than IPS. The main downside will be brightness for full screen white windows. I think the Samsung panel is a bit better than LG in terms of brightness.
This is akin to how I've (technically?) stepped back from a 5K 27" to a 4K 32". Likely due to scaling and how far I sit from the screen (about 24" -- average I think) things look the same? At least, I don't notice that the 4K is any worse.
Me being me, I can't help but think I should have a 5K or 6K or whatever, but the price is... high. So I figured I'd try a 4K 32" since the OLED was cheap and the result was this post because the subpixel pattern messed with me. But now for the replacement I'm looking at a simple (but nice color / high end) 4K 32" IPS LCD.
And having been using one for the last day, I'm pretty content with it. It's like everything I wanted from the OLED without the eye strain.
With all that said, I would still recommend it over anything not retina.
Now, I spent an amount of my work life staring at a company-issued 27 inch 1080p display, and that was absolutely horrible, but with 4K, I'm not sure if I would even be able to see the improvement if I went to 6K or 8K even, which I always thought was mostly useless outside of gigantic television sets. Is it really worth it? Can you really genuinely see the text blurring on a 4K monitor?
If you're not using text around 9 pixels tall, as in the article, you're probably going to be okay. On a 27 inch screen at a typical office screen distance, I'd probably want 6k, but 4k is pretty good and 1080p is terrible.
He bought a factory-defective monitor. Displays like this exist; Samsung sold them as AMOLEDs with the PenTile layout. They are hideous.
Until subpixel is eradicated from existence, we shall continue to suffer this.
The problem is such:
1) Windows exists.
2) Windows invented Cleartype for Vista, it was ugly and fringed hard because they misunderstood human perception, the sRGB standard, and math.
3) Windows then readjusted Cleartype for DirectWrite. OSX before Retina and Freetype use subpixel tuning also compatible with human perception, sRGB, and math.
4) Many applications on Windows do not know how to ask Windows what the subpixel orientation is; either they assume RGB all the time (do not do this!) or they only read the first monitor (do not do this!). Windows can tell you per monitor, this is the only correct way. This API has existed since Vista.
5) This problem also affects DPI: they either cannot scale or only scale for the first monitor. Letting Windows scale for you causes _exceptionally bad_ color fringing for subpixel rendering. Again, the API for that has existed since Vista.
6) Many monitors do not list their subpixel orientation in their EDID. Ones that do and also are rotation sensitive do not set their EDID for RGB->VRGB->BGR->VBGR as you rotate them. Windows assumes RGB for any monitor missing that EDID field.
7) Windows only knows (V)RGB and (V)BGR. It does not know W+RGB, it doesn't know any sort of complex multi-row arrangement (such as Pentile).
8) Many applications ignore your Cleartype settings in Windows, and use RGB at default color weights no matter what you do, even if you turn subpixel off and do greyscale only.
9) And the worst sin of all: people embed screenshots in their documentation and websites, and they never update them. The Internet is full of Vista-era fringe-filled Cleartype text. This ties into 5, but is worth mentioning as its own.
Edit: On second reading, the author says they're on a Mac. Macs don't have subpixel rendering anymore. I don't understand the author's complaint, they have greyscale and avoid all these problems.
I will bitch about Apple's mistakes over the past decade, but removing subpixel rendering from their ecosystem was a smart move: makes all their text rendering compatible with all future monitors forever, and moving to HiDPI eliminates the need for it anyways.
_wire_•16h ago
The display the author doesn't like is a specific model Dell QD-OLED with a sub pixel arrangement that causes a fringe above / below text.
There are macro photos that reveal the subpixel details of a preferred LCD compared to the disliked Dell OLED, and it's easy to agree with the author's discontentment.
But the categorical complaint about "OLED" is an over generalization.
Treat the report as a warning to investigate the sub pixel characteristics of any monitor you may be considering.
JoshTriplett•14h ago
Yeah, I've had subpixel antialiasing disabled for a long time, since before my first OLED; I prefer grayscale antialiasing.
c0nsumer•8h ago
I'd been having an issue with a vertical dark bar during wakeup for a few months, but it'd go away after the whole screen came up so I pushed off opening a case. Then one day the whole thing started having problems.
[1] https://youtu.be/JtbTQ4ldSkI
theshackleford•16h ago
It’s in fact most, if not all, of the PC display OLEDs on the market today, because they almost all use non-standard subpixel arrangements; that will change soon with the introduction of newer-generation panels.
> But the categorical complaint about "OLED" is an over generalization
It’s really not, given again we are talking about the majority of PC OLEDs in production today having subpixel layouts that cause issues for users with text rendering.
duckfruit•15h ago
While the author's complaint is perfectly valid, in practice the advantages of even current OLEDs - for me - far outweigh any disadvantages due to subpixel layout fringing and everything else. Even for programming, the lack of backlight and resulting infinite contrast makes such a huge difference in my day to day life that I refuse to use a non-OLED monitor anymore for any purpose whatsoever. Heck it could be half the DPI and I'd still go for OLED anytime.
The only reason LCDs still exist is price, nothing else.