I am not aware of LED bulbs (and here I am talking about home lighting, not phones or laptops) that dim by shutting down some of the (multiple) LEDs.
Most home lighting bulbs appear to have several LED elements. A circuit could enable dimming by simply shutting some of them off — running the rest full-on. 50% dim would of course shut half the LEDs off. No PWM required.
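A minimal sketch of what that scheme would offer (Python, purely illustrative):

```python
# Hypothetical: dim by running k of n identical LED elements at full current.
# No PWM and no flicker, but only n discrete levels plus off.
def discrete_levels(n_leds):
    return [k / n_leds for k in range(n_leds + 1)]

print(discrete_levels(10))
# [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
```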
This is different from a bulb like a Hue etc. that has the ability to dim itself through whatever mechanism.
There are two ways to dim an LED: supply less current at the same voltage, or PWM dim it with a fast enough switching speed that you don't notice the flicker (this being slower than it needs to be is what the article is about). A current source is pretty easy to build, and doesn't flicker, but it does dissipate all the excess energy as heat. That's not what you want inside the dimmer switch in your wall; it can be quite a lot of heat and would be a fire hazard in such a confined area. It does work for things like photography lamps, which can have exterior heat sinking.
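Some rough numbers for that heat argument (all values are assumptions for illustration, not from any product):

```python
# A linear current source drops the excess supply voltage across itself,
# so everything above the LED string's forward drop becomes regulator heat.
v_supply = 24.0   # V regulated DC input (assumed)
v_led    = 18.0   # V forward drop of the LED string at the chosen current
i_led    = 0.7    # A drive current

p_led = v_led * i_led                # W delivered to the LEDs
p_reg = (v_supply - v_led) * i_led   # W dissipated in the regulator as heat

print(f"LED string: {p_led:.1f} W, regulator heat: {p_reg:.1f} W")
# LED string: 12.6 W, regulator heat: 4.2 W -- manageable on a photography
# lamp's heat sink, not inside a wall box.
```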
No. That's only true for a linear regulator, which is just one, very terrible, implementation of a current source that's only used for very low power applications. Linear regulators are never used for things like room illumination.
The alternative, and what's used for all commercially available DC LED drivers (plentiful and cheap), is to just use a regular AC->DC switching supply in current mode (current feedback rather than voltage feedback). The only flicker is the ripple left in the filtered output.
Why aren't these used? Because most dimmer switches use tech from the incandescent age and just chop off parts of the AC sine wave, so the bulbs are designed around the switches you can buy in the store. Why do dimmer switches chop? Because that's what the bulbs you can buy at the store expect; dimming them some other way can sometimes damage them.
You can buy in-wall DC dimmer switches from any LED supply store, but they require DC lighting, also only found at LED supply stores. It's entirely a momentum problem, and one that's slowly going away.
Additionally, bulbs that are used in regular household fixtures basically need a way to convert TRIAC-chopped 50/60 Hz AC into constant current... which makes things even more expensive. Smart bulbs that are supplied constant, non-chopped AC can do it more easily, but it's still expensive to do DC dimming.
When I was in high school we were messing around with liquid nitrogen and overvolting LEDs, and noticed the odd effect that the color of the LED would change if you overvolt it. It was years before I found out why:
https://www.reddit.com/r/AskElectronics/comments/v28qbh/why_...
https://spectrum.ieee.org/a-definitive-explanation-for-led-d...
It takes one MOSFET to turn an LED on/off from an MCU GPIO, but if you want to do DC dimming, you have to either add more passive components or turn to a special IC; both cost more.
For OLED, I remember reading that PWM dimming is necessary because DC dimming causes shifts in color/white point.
It's difficult for PWM-sensitive Mac users right now, as the majority of Apple devices for years have had rough PWM and of course there is no hardware alternative.
I'm stuck on a MacBook from years ago because the only current MacBook I can buy is the Air line, which I'll probably buy soon to replace my aging 2018 MacBook.
No currently for sale iPhone is PWM-free. The iPhone 11 (non-Pro) was the last mainstream device Apple made with a PWM-free backlight. The SE 3 (2022) was also PWM-free, but is no longer available from Apple beyond what stock is still around.
It identifies a "health risk", describes the mechanism in terms that sound very convincing, assigns numbers to its cause and effects, and provides a table grading the health risks of various products, all without linking to a single scientific study demonstrating that the effect is anything other than nocebo. The closest they come is an image of a table that refers to a few institutions that apparently did studies related to PWM (leaving it as an exercise to the reader to find the studies they're supposedly referencing) and a link to a Wikipedia page which links to a Scientific American article which says:
> In 1989, my colleagues and I compared fluorescent lighting that flickered 100 times a second with lights that appeared the same but didn’t flicker. We found that office workers were half as likely on average to experience headaches under the non-flickering lights. No similar study has yet been performed for LED lights. But because LED flickering is even more pronounced, with the light dimming by 100% rather than the roughly 35% of fluorescent lamps, there’s a chance that LEDs could be even more likely to cause headaches.
I'm willing to entertain the idea that LED flicker is actually problematic, but I wish essays like this would be honest about the degree of confidence we have given the current state of the evidence. This piece instead takes it as a given that there's a problem, to the point where they confidently label devices on a scale of Low to Extremely High health risks.
There is nothing anecdotal about flickering LED light causing health risks.
What I'm asking for is for articles like this that cite numbers and provide tables purporting to quantify the degree of harm caused by various devices to point to where they're getting their numbers from or, if they can't do that, stop making up numbers and assigning things to "harm" scales that they invented themselves based on vibes.
Either there's a study showing that 246 Hz flickering poses "Extremely High" health risks or there isn't.
> Either there's a study showing that 246 Hz flickering poses "Extremely High" health risks or there isn't.
They calculated it using the definition from the standard.
They list the 'Xiaomi 15 Ultra' as having a 'Moderately High' health risk, and cite it as having a 2.16 kHz PWM frequency at 30-75% modulation depth.
The IEEE article has recommended practices that state:
8.1.2.3 Example 3: PWM dimming. Using Figure 20, the recommended practice for PWM dimming at 100% modulation depth is that the frequency satisfies f > 1.25 kHz. This can also be derived using Recommended Practice 1 and solving 100% = 0.08 × f_Flicker. This level of flicker could help minimize the visual distractions such as the phantom array effects.
Seems like even at 100% mod depth, >1.25 kHz is just fine.
Also, the article does not seem to distinguish modulation at reduced brightness, which the IEEE article specifically calls out as something that is unlikely to cause issues. E.g., movie theaters using film all flicker at 48 Hz and nobody complains about that.
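For anyone who wants to check a device against the quoted recommended practice, the arithmetic is one line; the device numbers below are the ones cited in this thread:

```python
# IEEE 1789 Recommended Practice 1, as quoted above: at flicker frequency f
# (Hz), low risk requires modulation depth Mod% <= 0.08 * f.
def within_low_risk(freq_hz, modulation_pct):
    return modulation_pct <= 0.08 * freq_hz

print(within_low_risk(2160, 75))    # True  -- the Xiaomi 15 Ultra figures above
print(within_low_risk(1250, 100))   # True  -- the Example 3 boundary case
print(within_low_risk(246, 100))    # False -- 246 Hz at full depth
```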
You can see on page 27 how this is meant to be used: it should produce a per-hazard matrix.
You might be thinking of Figure 18 on page 29, which does identify Low-risk and No-effect regions by Modulation % and Frequency, but that also does not claim to identify high-risk regions, it just identifies the regions we can be highly confident are safe. And importantly, as a sibling comment notes, TFA's table actually contradicts the line on Figure 18, labeling several devices as higher than Low even when they're squarely within the Low-Risk and No-Effect zones.
Sure, PWM light can cause health risks for some people, in some contexts. But taking research out of context is bad science.
Do you genuinely believe the Pixel 7 and 8 Pro have an "extremely high health risk", in the context of what a lay person would understand?
Edit: I specify 'lay-person' because clearly this is an introductory blog post (or advertisement for Daylight Computer). If they want to use a more specific definition of health risk, then they better define it.
The standard also links to the research in its discussion.
Please read it, instead of just randomly throwing things out hoping they support your argument.
Cite the exact page number and quote that you claim justifies the assertion that 246 Hz PWM carries an "extremely high" health risk. Then we can talk.
If you want to redo the numbers and check if they fit the definition, please feel free to do so, but you will need to put some work in (since the flicker-Hz-to-risk mapping shown in the article is a computed value, you need to find the modulation value and plug it in too).
I understand your fight and your idea; I am just saying that in this specific instance, this is not a fight to be fought. The article is generally correct, and if you want to complain about the writing style or it being an ad, that's up to you. But this is not the same situation as with GMO stuff.
No, they said that IEEE 1789 also uses Modulation % (which they've renamed Flicker %) to calculate risks. That is pointedly not the same thing as claiming that they used IEEE 1789's formulas.
You're reading their copy generously, but that doesn't usually pay with marketing copy. Articles like this always like to wave in the general direction of official-sounding sources while carefully refraining from actually claiming that they got their numbers from anywhere in particular.
The simplest ones always strobe at line frequency or double it (due to cheaping out on the power supply). Those have visible strobe. Simple is bad with LED light.
Find some not-too-cheap dimmable warm-colored bulbs. They won't be cheap, but they might contain both a high-frequency driver and fluorescent afterglow, and my guess is you will not notice anything.
The simplest LED sources running from AC mains power strobe at mains frequency, which is very visible and very annoying.
Fancy LED sources don't strobe at all. I'm using an LED panel intended for videography as a room light; any flickering could show up as scanlines in video, so most lights intended for that purpose are flicker-free.
PWM sensitivity is real and has nothing to do with someone's belief system.
And perceived brightness is equal to the peak of the PWM wave?
That image courtesy of the Daylight Computer Company is consuming too much of my attention.
"To understand why PWM bulbs have so much flicker, imagine them being controlled by a robot arm flicking the on/off switch thousands of times per second. When you want bright light, the robot varies the time so the switch is in the 'on' mode most of the time, and 'off' only briefly. Whereas when you want to dim the light, the robot arm puts the switch in 'off' most of the time and 'on' only briefly."
Give me a nice candle.
Why do we use anonymity for that? What's gained and lost by that?
There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it, especially in battery-powered devices such as phones or laptops. Pulse modulation is common.
Human vision has a pretty limited response speed, so it seems pretty unlikely that PWM at a reasonable speed (hundreds of hertz to tens of kilohertz) can be directly perceived. That said, it can produce a stroboscopic effect, which makes motion look weird and may be disorienting in some situations. So I don't have a problem believing that it can cause headaches in predisposed individuals.
You can dim your laptop screen in a darkened room and wave your hand in front. Chances are, you're gonna see some ghost images.
Other than adjusting the frequency, pulse modulation can be "smoothed" in a couple of ways. White LEDs that contain phosphor will have an afterglow effect. Adding capacitors or inductors can help too, although it increases the overall cost. But that doesn't make the display "PWM-free", it just makes it flicker less.
Eh, they use what they can get away with. Nobody is out there policing flicker rates. Especially when you add a dimmer into the mix, there's a lot of room between good and bad, and when you're at the hardware store buying bulbs, there's not much to indicate which bulbs are terrible.
Lots of people don't seem to notice, so the terrible ones don't get returned often enough to get unstocked, and anyway, when you come back for more in 6 months, everything is different even if it has the same SKU.
Not only flickering, but lots of other information about internationally available brands, including cheap Chinese stuff: CRI, real power use, etc.
Use your favorite online translator.
Actually, Energy Star and California's Title 24 have flicker standards. They may not go as far as some people like, but you can look for these certifications to know that a bulb at least meets a certain minimum standard.
> You can't perceive that
I very easily can. I had to get rid of an otherwise good monitor a few years ago before I knew it used PWM to control the backlight (and before I even knew PWM was used at all for this functionality — I only had experience with CCFL backlight before that).
It was really annoying to look at, like looking directly at cheap fluorescent lighting. Miraculously, setting brightness to 100% fixed the issue.
By googling around, I found that it used PWM with a modulation frequency of 240 Hz, with a duty cycle of 100% at full brightness, which explained everything.
I can also easily perceive flickering of one of my flashlights, the only one that uses PWM at a frequency of a few hundred hertz. Other flashlights either run at multiple kHz, or don't use PWM at all, and either one is much easier on the eyes.
Some of us really do perceive this stuff, which can be hard to believe for some reason.
Next time you see a high refresh screen, move the cursor around rapidly. It's very easy to tell.
In gaming situations, what they perceive may not be the actual "flicker" of frames but the input-to-display latency, which is a very different thing to notice.
The jumps from 30-40-60-72-144 are all pretty noticeable, but 144-240 is already very minimal and 240-360+ is pretty much quackery.
My partner and I both notice the difference between cheap LEDs and expensive ones (hue). We both cannot pin it down.
You may not believe that people that can see 120->240Hz flicker exist, but we do. In this era of frequently-cheap-ass LED lighting, it's a goddamn curse.
It's really bad for me as I work in an LED and LASER facility. I handle ALL the PWM stuff while everyone else handles the simple led/resistor/connector board assemblies. EVERYTHING FLICKERS.
> EVERYTHING FLICKERS.
I absolutely could not handle that. My sincerest condolences.
I presume this is why you think that?
> ...between cheap LEDs and expensive ones (hue)...
If so, they're referring to the Hue brand of bulbs, rather than the color property. More evidence for the fact that they're talking about flicker is that they quoted this to indicate that they were replying to it:
> > Nah, LED lighting generally uses at least 200 Hz at a minimum. Some up to kHz. You can't perceive that.
Within the pulse modulation case, though, there are two important subcases. You can PWM a load that consists basically of just the LED itself, which acts as a resistive load, and will flash on and off at high rate (potentially too fast to be noticeable, as you say). But you can also PWM an LED load with an inductor added, converting the system into a (potentially open loop) buck converter. And this allows you to choose both the brightness ripple and the PWM frequency, not just have 100% ripple. Taking ripple down to 5%, or 1%, or less, is perfectly straightforward… but inductors are expensive, and large, so shortcuts are taken.
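A sketch of that trade-off, with made-up component values (and ignoring that the LED's stiff V-I curve makes this only a first approximation):

```python
# Peak-to-peak inductor current ripple in an ideal buck stage driving an
# LED string: delta_i = v_out * (1 - duty) / (L * f_sw).
v_in  = 12.0     # V input (assumed)
v_led = 6.0      # V string forward drop, ~ output voltage (assumed)
f_sw  = 250e3    # Hz switching frequency
L     = 330e-6   # H inductor
i_avg = 0.35     # A average LED current

duty = v_led / v_in
delta_i = v_led * (1 - duty) / (L * f_sw)

print(f"duty {duty:.0%}, ripple {delta_i*1e3:.0f} mApp ({delta_i/i_avg:.0%} of average)")
# duty 50%, ripple 36 mApp (10% of average); a bigger L or higher f_sw buys
# lower ripple, at exactly the cost described above.
```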
There's a third way: a switched-mode power supply with regulated output current. This is used in most better-designed flashlights (which doesn't always correlate to price) and can be used by anything else that needs to drive an LED as well.
The article doesn't discuss what technique should be used for "constant current reduction"; it probably shouldn't be a linear regulator where efficiency is a priority.
PWM is less annoying if the frequency is very high (several kHz), though I'll leave it to people who research the topic to speak to health effects.
I wrote more here: https://news.ycombinator.com/item?id=44312224
I (and it seems others too) am very interested in this topic. I would appreciate it if you could write an article with "less confusion" so I can save it in my tumblog for future reference.
Much more rewarding too, because "we really don't know very much about this yet" is hard to expand to a full click-worthy essay and less likely to move product.
That effectively lowers the frequency of the LCD.
The other reason is that LED brightness and color are quite non-linear with current, so PWM gives a much more straightforward dim than adjusting the current smoothly.
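A toy example of the difference (the analog curve shape below is fabricated, standing in for a real lumens-vs-current characterization):

```python
# PWM: duty maps linearly to average output, so half brightness is duty=0.5.
def duty_for(target):
    return target

# Analog dimming: you need the LED's measured output-vs-current curve.
# Pretend relative output ~ (i / i_max) ** 0.8 (made-up curve shape).
def current_for(target, i_max=0.35):
    return i_max * target ** (1 / 0.8)

print(duty_for(0.5))               # 0.5, no calibration needed
print(round(current_for(0.5), 3))  # ~0.147 A, not the naive 0.175 A
```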
I had to get rid of a Samsung TV (120 Hz backlight; replaced it with a linearly dimmed Sony, nicer anyway) due to this, and I can only use modern phones at 100% brightness, which disables PWM.
When I moved into my current residence I couldn't figure out why my eyes were always sore. I realized my landlord put in cheap LED can lights. I swapped them out for nicer ones, pain gone. People need to stop being cheap AF.
I've had to drastically change my devices after learning PWM was causing my vision issues and eye pain. My partner has no issues using those same devices.
Basically like how Apple laptop fans work, but for temporal modulation of signal instead of spatial modulation of fan blade gaps.
I make power drivers. I have this ultra-tiny one, with output scope captures. It produces ~825 kHz PWM output, single-digit mV and mA ripple, and 94+% efficiency depending on input voltage (output is input minus 0.2 V).
I can induce saccades in my eyes at will and at high speeds. Couple that with waving my hand in front of my face as I do it, add in human vision persistence, and I can get artifacting that reveals flicker even at that high a PWM rate. Only direct battery power fails to induce that artifacting effect in my vision when I do that combination of movement.
No, no it isn't. We can keep units within half a percent of starting output all day long.
Voltage control is done explicitly on laser diodes, to boot. And those are WAY MORE FINICKY than an LED.
On what, exactly? How can you possibly guarantee output using purely voltage control of an LED? LEDs (and laser diodes) are fundamentally current controlled devices. You need current feedback to set the output brightness operating points.
> Voltage control is done explicitly on laser diodes, to boot. And those are WAY MORE FINICKY than an LED.
Maybe if you don't care about the output power of the laser diode. Just not practical, and will change output power at the same voltage as temperature changes.
They don't tend to emit much heat when underdriven using voltage control because they can't pass high amounts of current at low voltage. Run an LED from 2.4-2.7V in discrete hundredths of a volt steps. You can get almost nothing to getting close to drawing 50mA doing discrete steps like this. This is how we also characterize and bin LEDs. As long as the LED is on a thermal mass, it isn't going anywhere near thermal runaway.
I've done this for almost 2 decades, now.
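For readers following along, the exponential I-V curve both sides are invoking looks roughly like this (Shockley diode equation; the parameters are fabricated, chosen only to land near the 2.4-2.7 V range mentioned above):

```python
# I = I_s * (exp(V / (n * V_T)) - 1); I_s and n below are made up, picked so
# the curve passes through plausible white-LED territory.
import math

I_s, n, V_T = 7e-24, 2.0, 0.02585   # A saturation current, ideality, thermal voltage

def led_current(v):
    return I_s * (math.exp(v / (n * V_T)) - 1)

for v in (2.40, 2.50, 2.60, 2.70):
    print(f"{v:.2f} V -> {led_current(v)*1e3:8.1f} mA")
# Each 0.1 V step multiplies current by exp(0.1/0.0517) ~ 7x, which is why
# voltage control needs millivolt-level precision and tight thermal control.
```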
No LED used for lighting in households is operated this way. You run them hot, because they can take it and you aren't wasting costly LED die area.
> Run an LED from 2.4-2.7V in discrete hundredths of a volt steps. You can get almost nothing to getting close to drawing 50mA doing discrete steps like this. This is how we also characterize and bin LEDs.
Discrete hundredths of volt steps? Maybe for characterization you can do this, but you expect residential LED bulbs to use circuitry able to precisely output in 1mV steps? Not happening.
I'll concede the point you can likely do characterization of LEDs using voltage control, but when you get to actually wanting to drive them for real usage in a lighting environment (i.e., not for some indicating function), voltage control isn't going to be useful. You don't have that precision, you don't have that thermal mass, and you certainly don't have the ability to drive them that weakly.
I literally work in this field, manufacturing thousands of LED boards of all types every single day.
Yes, they absolutely do. Especially under-cabinet lighting - it's deliberately underdriven via voltage because the thin strips of aluminum are basically a backing support strip and nothing more.
Let's grab what under my cabinets, for example. 4 LEDs in series. Okay, assuming 3V that's a 12V chain - the included wall-wart power supply is a 10.5V output supply at 200mA.
Please stop.
Under cabinet lighting and more generic cosmetic lighting tends to use constant voltage for the input supply, but is regulating its current elsewhere, within the cabinet module or LED itself. This is literally how the most common example of RGB LED strips (WS2812s) work. They have current limiting circuitry within the LED module itself. Getting the most brightness isn’t the most important thing, unlike in actual bulbs.
> Let's grab what under my cabinets, for example. 4 LEDs in series. Okay, assuming 3V that's a 12V chain - the included wall-wart power supply is a 10.5V output supply at 200mA.
Nobody is driving a standard white LED at 2.6V for lighting. Please, show me a datasheet of a white LED that has a reasonable lumen output at 2.6V forward voltage drop. That also doesn’t include any voltage drop allowance in any wiring, boards, or any circuitry, which can be significant in low voltage applications. Especially when cabinet lights tend to be daisy chained together. I’d guess your cabinet lighting is actually driving a string of 3 white LEDs in series.
It just throws me right off the argument in an article when the fine print notes that a cited study confounds the thing the author cares about ("sensitivity to flicker") with a much simpler and better-understood explanation (CO₂ poisoning).
https://www.notebookcheck.net/Apple-iPhone-16-Pro-smartphone...
The frequency of 239 Hz is relatively low, so sensitive users will likely notice flickering and experience eyestrain at the stated brightness setting and below.
There are reports that some users are still sensitive to PWM at 500 Hz and above, so be aware.
I always check notebookcheck.net for PWM stats.
For reference, the regular iPhone 16:
Screen flickering / PWM detected: 60 Hz, amplitude 25.75%, secondary frequency 487 Hz.
Do you have a source for this claim that 239 Hz is low enough to be noticeable by some measurable fraction of people? People report being sensitive to all kinds of things that end up repeatedly failing to reproduce empirically when it's put to the test (e.g. WiFi and MSG), so that there's a PWM sensitivity subreddit is not the evidence that TFA thinks it is.
The source that TFA links to backing up the idea that between 5% and 20% of people are sensitive to PWM flickering is a Wikipedia article which links to a Scientific American article which does not contain the cited numbers, and even if it did the study it discusses was researching the significantly slower 100 Hz flickering of fluorescent bulbs.
They mention in the Results section:
> For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate.
The study actually demonstrates that perception of flicker for regular PWM does in fact trail off at about 65 Hz and is only perceptible when they create the high-frequency edge by alternating left/right instead of alternating the whole image at once.
It looks like the situation they're trying to recreate is techniques like frame rate control/temporal dithering [0], and since this article is now 10 years old, it's unclear if the "modern" displays that they're talking about are now obsolete or if they actually did become the displays that we're dealing with today. From what I can find OLED displays do not tend to use temporal dithering and neither do nicer LCDs: it looks like a trick employed by cheap LCDs to avoid cleaner methods of representing color.
It's an interesting study, but I don't think it redeems TFA, which isn't about the risks of temporal dithering but instead claims harms for PWM in the general case, which the study you linked shows is not perceived above 65 Hz without additional display trickery.
Basically the claim is that when the flickering frequency varies, the requirement for a flicker-free frequency is much higher.
> Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays.
> In contrast, modern display designs include a sequence of coded fields which are intended to be perceived as one frame. This coded content is not a sequence of natural images that each appears similar to the preceding frame. The coded content contains unnatural sequences such as an image being followed by its inverse.
What's unclear to me 10 years down the road is if the type of display they're worried about is common now or obsolete. "Modern" in 2015 could be the same as what we have today, or the problems the study identified could have been fixed already by displays that we would call "modern" from our reference frame.
I don't know enough about display tech to comment on that, but they're very clear that if your display is showing frames in sequence without any weird trickery that the research method that gets you a 65 Hz refresh rate is a valid way to test for visible flickering.
EDIT: Here's another quote that makes the contrast that they're setting out even more clear:
> The light output of modern displays may at no point of time actually resemble a natural scene. Instead, the codes rely on the fact that at a high enough frame rate human perception integrates the incoming light, such that an image and its negative in rapid succession are perceived as a grey field. This paper explores these new coded displays, as opposed to the traditional sort which show only a sequence of nearly identical images.
It's possible that this is actually a thing that modern displays have been doing this whole time and I didn't even know it, but it's also possible that this was some combination of cutting-edge tech and cost-saving techniques that you mostly don't need to worry about with a (to us) modern OLED.
> The work presented here attempts to clarify “the rate at which human perception cannot distinguish between modulated light and a stable field”
Otherwise, they would have tested the dithering directly on the full image. Here they are testing a simpler model: varying flicker raises the flicker-free frequency requirement (due to eye movements). This would apply to dithering, but potentially to other situations as well.
They repeatedly say that the goal is to measure the effect of flickering in these non-traditional displays and repeatedly say that for displays that do not do the display trickery they're concerned about the traditional measurement methods are sufficient.
You're correct that they do demonstrate that the study shows that the human eye can identify flickering at high framerates under certain conditions, but it also explicitly shows that under normal conditions of one-frame-after-another with blank frames in between for PWM dimming the flickering is unnoticeable after 65 Hz. They go out of their way to prove that before proceeding with the test of the more complicated display which they say was meant to emulate something like a 3D display or similar.
So... yes. Potentially other situations could trigger the same visibility (I'd be very concerned about VR glasses after reading this), but that's a presumption, not something demonstrated by the study. The study as performed explicitly shows that regular PWM is not perceptible as flicker above the traditionally established range of frame rates, and the authors repeatedly say that the traditional measurement methods are entirely "appropriate" for traditional displays that render plain-image frames in sequence.
EDIT: Just to put this quote down again, because it makes the authors' point abundantly clear:
> The light output of modern displays may at no point of time actually resemble a natural scene. Instead, the codes rely on the fact that at a high enough frame rate human perception integrates the incoming light, such that an image and its negative in rapid succession are perceived as a grey field. This paper explores these new coded displays, as opposed to the traditional sort which show only a sequence of nearly identical images.
They explicitly call out that the paper does not apply to traditional displays that show a sequence of nearly identical images.
I mean, they are not even using a screen during the study, they are using a projector. How are you going to even make the claim that this is display technology specific when it is not using a display?!
I started to write out another comment but it ended up just being a repeat of what I wrote above. Since we're going in circles I think I'm going to leave it here. Read the study, or at least read the extracts that I put above. They don't really leave room for ambiguity.
I think if we continue talking, we will keep running in circles. So let’s drop the details on research: it is there, we can both read it. Here is what I was trying to convey since the beginning:
- If you think the (original) article is an ad, with the writing not up to scientific standard: sure, I am ambivalent about the article itself
- If you think the gist of the article and their recommendation is wrong, I mildly disagree with you
- If you think LED flickering affecting people is in the same ballpark as concerns about WiFi or GMOs, I violently disagree with you.
LEDs are new, so research on high-frequency flicker is not too numerous, but the few studies that exist generally point to a higher perception threshold than previously thought. As for the health effects, I believe that part is more extrapolation than researched (since those can only come after the more generic research on perception). So the final assessment is: how bad was the article in presenting information the way it did?
I know you’re looking for large sample size data, but PWM sensitivity absolutely exists, and I wish it didn’t. The way my eyes hurt in less than a minute while looking at an OLED phone (when I can handle an LCD phone for hours just fine) is too “obvious”. This occurs even on screens I didn’t know were OLED till I got a headache, btw.
(I’m also insanely sensitive to noticing flicker and strobe lights - I say a light flickers, everyone disagrees, I pull out the 240fps mode on my phone… and I’ve never been proven wrong till now.)
If you see the strobe effect, return the bulb and buy another one.
Initially I thought it might be related to the alternator.
I still don't know why I perceive these headlights as having an annoying flicker. I'd love it if some (informed) commenter could clear it up for me. Am I imagining it?
I also believe some people are just more affected by flicker than others. Some get headaches or migraines from working under PWM light, others don't even notice.
I'm not a mechanic, but I believe these car lights are capable of achieving some pretty high brightness (necessary for fog lights etc.) but are dimmed under normal conditions, leading to PWM effects you also see in cheap dimmable bulbs. It's especially noticeable for me on those "fancy" lights that try to evade blinding other cars (and end up blinding bikes and pedestrians) and those animating blinker/brake light setups.
I also just hate hate hate seeing the flicker in my peripheral vision.
Isn't it far more likely that any problem with the appearance of something under LED light is due to the light's peculiar spectrum?
See also:
You can get a pretty good idea of frequency, depth of flicker, and if the LED’s colors are flickering in sync from this, and I can confirm that Philips LEDs, specifically the EyeComfort series, are good.
If it really is thousands, I don't think you have a problem.
The 1981 film Looker, written and directed by Michael Crichton, features a trope: the Looker gun, which induces a trance in its targets via flashes of light, such that they are unaware of the passage of many hours of time.
Now you can turn off LEDs one at a time, have 1/10th dimming, and no PWM.
The same could be done with LCD backlighting or edge lighting on displays. Additional complexity, to be sure, but no power loss.
OLED is, well... OLED. Not sure what to do there.
You could of course turn on/off leds in an exponential fashion, but that would result in an impractically large light to be able to dim properly, and with increased cost (much cheaper to assemble fewer more powerful leds than many smaller ones).
Then maintain a 1/10 lux range with the combinations. Note I'm not doing the math, just showing how simple it is to work around. It's all napkin math.
The cost isn't a biggie, if it's for a target market and shares the rest of the assembly.
They flicker at 100 Hz due to the rectification (or 120 Hz in 60 Hz countries, of course).
Dimmable bulbs use higher frequencies.
One of the bulbs recently burned out, and I picked up a replacement at Menards. Even though it was just a basic Sylvania, the new one clearly has a rectifier circuit as it does not exhibit any flickering that I can detect.
So anecdotally at least, the cheap bulbs without rectifiers seem to be going away from the big box stores (although I’m sure you can still get them with unpronounceable all-caps names from Amazon).
The quick test is to wave your hand quickly in front of the light.
It is done because, like most crappy things in the world, it saves somebody, somewhere, a few cents on the dollar.
Most people would not be able to tell the CRI impact of DC dimming vs PWM. Many do not visibly notice the difference. (I unfortunately do, and you won’t believe how many expensive Mercedes and similar cars flicker).
But high frequency PWM is slightly more difficult and expensive, and DC dimming might need a few more capacitors or inductors… so let’s save a buck there, shall we?
At least my house has little RFI. My neighbors on the other hand…
The gamma curves got a bit messed up, but when it's that dim it's not like I expect stellar color accuracy anyway.
I started looking into it; these poor people are paying hundreds of dollars for "flicker measurement" devices that cannot reliably tell you how the light source you're measuring is controlled.
Incandescents have analog inertia in the filament which smooths the light output from the AC sine wave. This smoothing is not 100%, but I've never met anyone who can detect it without equipment.
A photocell and an oscilloscope will show the smoothed line-frequency wave (I wouldn't call it a "flicker"). The wave delta is relatively higher in the perceptual range as the voltage is lowered to approach the minimum "glow-activation" threshold of the filament -- i.e., the fluctuation is more noticeable when the bulb is dimmed to nearly off.
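A crude first-order model of that inertia (the thermal time constant is a guess; real filaments vary):

```python
# Filament power ripples at twice the line frequency; temperature low-passes
# it with time constant tau, so the light ripple is a few percent, not 100%.
import math

f_line, tau, dt = 50.0, 0.040, 1e-4   # Hz mains, s time constant (assumed), s step

temp, lo, hi = 0.5, 1.0, 0.0
for step in range(int(0.5 / dt)):                        # simulate 0.5 s
    p = math.sin(2 * math.pi * f_line * step * dt) ** 2  # 100 Hz power ripple
    temp += (p - temp) * (dt / tau)                      # first-order lag
    if step * dt > 0.4:                                  # skip the settling
        lo, hi = min(lo, temp), max(hi, temp)

print(f"temperature ripple ~{(hi - lo) / 0.5:.0%} peak-to-peak")
# ~8% here (light output exaggerates it somewhat), versus 100% for hard PWM.
```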
Incidentally, I found some LEDs to be extremely annoying, but the flicker would sometimes just disappear on its own or after turning the light switch off and on. What could cause this?
It was a godsend. In the end I went with a bunch of Ikea bulbs and couldn't be happier. Absolutely zero flicker in my testing, good color output.
https://www.amazon.fr/dp/B073QS6K3C?ref=ppx_pop_dt_b_product...
Anyone care to weigh in on this?
Thanks for the clarification.
Based on my personal experience, I think "health risk" is an overstatement: bad PWM can be uncomfortable (Geneva Airport had particularly egregious lights that started flickering in your peripheral vision), but I doubt there are any long-term effects of it.
Reading further down, a few other comments [1][2] have stated this better than me.
[1]: https://news.ycombinator.com/item?id=44313661 [2]: https://news.ycombinator.com/item?id=44312224
> I doubt there are any long-term effects of it.
I would have thought the same, but it seems to be a common experience that once someone becomes PWM sensitive it actually sticks with them.
I've been a techy my whole life; the iPhone 12 mini seemed to be the device that triggered my PWM sensitivity and since then I have been extremely sensitive to any device with PWM.
Although I have tried to keep PWM devices out of my life, I can still quickly tell when the TV in a lobby or the airplane entertainment display has PWM and there's not much you can do about it.
Also, if you take a photovoltaic cell and hook it up to an audio jack, you can turn the unseen flicker in light into sound.
Not sensitive to this, thankfully, so apart from making me act like a diva and pissing me off, it doesn't affect me. But I sure wish I understood the EE side of it all so that I could properly avoid all these lights, at least in my own home.
A few months ago I went through most of the bulbs in my house and replaced nearly all of them with LIFX bulbs. I had spent quite some time trying to figure out which bulbs would have the least flicker and knew from my more DIY setups[0] that PWM frequency is the cause.
I deal with Migraine somewhat regularly and PWM flicker/strobe lights amplify the pain when I'm dealing with one.
Nearly every smart bulb I've grabbed incorporates such a miserably slow PWM setting that dimming the bulb to 1% results in lighting that's reduced by only about 25%. It becomes clear when you set it to 1% that the manufacturer couldn't limit length of the "off" cycle further or the bulb would begin resembling a strobe light.
I haven't tested all of the more expensive variants, but I also had a really hard time finding any "from the manufacturer" information about the PWM frequencies. I've also never encountered an incandescent drop-in that uses anything other than PWM (I wasn't even aware that there are fixtures that do that).
[0] Experiments? Magic-smoke generators? Sometimes-almost-house-fires? I'm no electrical engineer.
PWM is awful. I can tell within seconds of seeing a screen if it has PWM and usually I start to get eye issues within a few minutes.
Not all PWM has issues: I have an LG Plasma TV and it has PWM, but the way the plasma glows compared to the rigid on/off PWM cycling on modern iPhones means the Plasma display causes no issues. Same with Samsung Note9, which uses OLED with a soft PWM ... no issues
What I chose is an ESP32 controller attached to WS2812B LEDs. It turns out these operate at a PWM of nearly 20 kHz, and my low-key tests confirm this. Even at the lowest dim level I can't detect any flicker when I move the LED quickly or move something quickly in front of it.
It's amazing to me that you can get off-the-shelf hardware with WLED installed that works at 20 kHz with these cheap RGB LEDs for less than leading brands like Philips Hue!
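For anyone wanting to replicate it without WLED, a MicroPython sketch along these lines works (the pin number and pixel count are assumptions; the high-frequency PWM happens inside each WS2812B, so the controller just sends 8-bit levels):

```python
# MicroPython on an ESP32 driving a WS2812B strip as dimmable warm-ish light.
import machine, neopixel

NUM_PIXELS = 60
strip = neopixel.NeoPixel(machine.Pin(4), NUM_PIXELS)  # data pin is an assumption

def set_warm_white(level):
    """level 0..1; the warm mix below is a guess, tune to taste."""
    r, g, b = int(255 * level), int(160 * level), int(60 * level)
    strip.fill((r, g, b))
    strip.write()

set_warm_white(0.05)  # deeply dimmed, no visible flicker from the chip's PWM
```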
LEDs do have uses, such as many indicator lights (although they should not make them blue unless the other colours are already used; blue indicator lights are too common) and some kinds of displays. I think LED is not very good for general lighting, Christmas lights, etc.
The second image is just interference with the camera chip's frequency, usually eliminated by a mechanical shutter in photography.
mtalantikite•7mo ago
I've got a couple bulbs from Waveform Lighting and they don't flicker, but I totally can tell the reds are off.
I really hate the LED transition. My building replaced all the outdoor lights with them, and now it's just too bright to sit on my stoop at night like used to be so common here in Brooklyn. My backyard neighbor put in an LED floodlight and now I have to buy blackout curtains. I drive rarely, but the oncoming headlights are blinding when I do. It's pretty depressing if I think about it too much.
igor47•7mo ago
I wonder if there's room to at least engage with the neighbor to talk about friendlier light options? You might also be able to engage with these folks to see if there are efforts to improve the lighting in New York: https://darksky.org/
PaulHoule•7mo ago
You could probably still reduce the flicker by either increasing the switching frequency or putting some kind of filter network between the switch and the load.
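Rough math for the filter option (values illustrative; a real driver would use an LC filter rather than burn power in a series resistor):

```python
# A first-order low-pass attenuates the PWM fundamental by 1/sqrt(1+(f/fc)^2).
import math

def attenuation(f_pwm_hz, r_ohm, c_farad):
    fc = 1 / (2 * math.pi * r_ohm * c_farad)   # filter corner frequency
    return 1 / math.sqrt(1 + (f_pwm_hz / fc) ** 2)

print(f"{attenuation(1000, 100, 10e-6):.2f}")
# ~0.16: a 100 ohm / 10 uF filter knocks a 1 kHz PWM fundamental down ~6x.
```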
mtalantikite•7mo ago
But I've also never lived in a house that has dimmers (they've all been old homes in the northeastern US) and I never use overhead lighting, so it's not something I need or would miss.
card_zero•7mo ago
https://en.wikipedia.org/wiki/LED_tube#History
throwaway290•7mo ago
I googled for G4 LED tube PWM and got products that say they are G4 LED tubes that use PWM.
Pretty sure 100% of LED products sold anywhere use PWM if you don't use them at full brightness. I sometimes walk around lighting stores with a slo-mo camera and see PWM in every price bracket.
Kubuxu•7mo ago
card_zero•7mo ago
Actually that's not true, my first thought was "just use a layer of phosphor excited by the LEDs", but fluorescent tubes do that and people used to make the same complaints about flicker, so.
Looks like "flicker index" is a useful(?) search term, anyway.
leoedin•7mo ago
Apparently Philips Hue uses 500-1000Hz. I wonder if there's manufacturers that use a much higher rate.
leakycap•7mo ago
On an old iPhone with basic slow-mo recording capabilities, typical Hue bulbs don't "blink" when the video plays back, but the PWM-dimmed iPhone in the same video recording was blinking/flashing like crazy.
~~
Another example of the PWM details mattering: I can't use any iPhone with OLED (anything from the X to current), but I am able to use a Note9 which has OLED with DC-style PWM.
leoedin•7mo ago
PWM at low duty cycles tends to be much more noticeable. But that’s where higher frequencies should solve the problem.
leakycap•7mo ago
Some PWM implementations ramp the brightness up and down slightly (easier on eyes), while other manufacturers flip the switch on and off harshly (like strobing)
A shorter dark time between the screen being lit up means a shorter pulse duration, and the pulse duration and depth are more important than the Hz.
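Putting numbers on that (frequencies are ones mentioned elsewhere in this thread):

```python
# Dark interval per PWM cycle at a given frequency and duty cycle.
def dark_ms(freq_hz, duty):
    return (1000.0 / freq_hz) * (1 - duty)

for freq, duty in [(240, 0.10), (240, 0.90), (2160, 0.10), (20000, 0.10)]:
    print(f"{freq:>5} Hz @ {duty:.0%} duty -> dark {dark_ms(freq, duty):.3f} ms per cycle")
# 240 Hz at 10% duty leaves ~3.750 ms of darkness per cycle;
# 20 kHz at the same duty leaves only ~0.045 ms.
```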
Saris•7mo ago
Obviously it costs more, but I wish manufacturers would just do it.
davrosthedalek•7mo ago
Saris•7mo ago
bayindirh•7mo ago
Similarly, I prefer a Class A amplifier if I have the space, but I won't open that can of worms here.
Workaccount2•7mo ago
JKCalhoun•7mo ago
genewitch•7mo ago
Medicine for depression, anxiety, insomnia...
It's nearly a closed loop; something I intuitively realized shortly after 2001/09/11. By the end of that year I decided I would no longer have a "Television" attached to CATV/SAT/ANT service.
I'm not sure if I am correct; I haven't really dedicated a lot of time to getting the exact numbers or talking to psychologists and sociologists and the like. But two people I know had "breakdowns" (grippy sock) in the last month, and both of them always have true crime on TV in the background or listen to true crime podcasts. Shortly after that happened I was listening to the Moe Facts podcast, where Moe used the term "trauma based entertainment" and something clicked. Moe didn't mention "it's because of pharma ads"; that's my own input after having worked for the largest television "broadcast" company in the world, just long enough to see the advertiser "dinner".
RiverCrochet•7mo ago
> It's nearly a closed loop; something I intuitively realized shortly after 2001/09/11. By the end of that year I decided I would no longer have a "Television" attached to CATV/SAT/ANT service.
Curiously this is about the same time I decided to give up on TV and radio as well.
ToDougie•7mo ago
shakna•7mo ago
Over-75-year-olds are the largest free-to-air (FTA) cohort.
[0] https://www.acma.gov.au/publications/2024-12/report/communic...
nottorp•7mo ago
genewitch•7mo ago
Movies are another one, and lots of people watch movies. If I go on Hulu or Netflix and start tallying the genres (either TBE or not-TBE), what do we figure it will be?
The person I heard use the phrase "Trauma Based Entertainment" used it to describe movies that "we were sat down to watch when we were 9-12." Unfortunately the podcast I mentioned isn't super-advanced on the backend, so I am unsure how to share clips at this point. But I've heard the claim "young women as a demographic listen to true crime" repeated as a truism before. I know the women close to me listened to this sort of content in the past or currently. I'm not trying to generalize this to the entire cohort.
Also, I only think, myself, that TBE/true crime/etc. is harmful; I'm not a sociologist or psychologist.
BoxOfRain•7mo ago
rescbr•7mo ago
oakwhiz•7mo ago
bayindirh•7mo ago
They are nice.
CincinnatiMan•7mo ago
bayindirh•7mo ago
yx827ha•7mo ago
mousethatroared•7mo ago
Also, you can buy high-wattage lights, and three-way bulbs have lower-wattage settings.
Finally, outdoor and appliance incandescent lamps are very inefficient, but last forever.
Zak•7mo ago
How do the reds look to you?
I looked at the photometric reports from a couple Waveform models on their website and the R9 (saturated red rendering) was in the 90s for both with tint almost exactly on the blackbody line. The 2700K did have a bit worse R9 than the 4000K so I could imagine it doesn't look exactly like an incandescent.
mtalantikite•7mo ago
I did at one point randomly put the LED in different configurations when I first got it and my wife was able to pick out which lamp had the LED in it every time. They just have a different feel, even if the temperature rating is around the same as the incandescent and the R9 was the highest of the LEDs I evaluated. At least these Waveform LEDs don't give me migraines though.
Zak•7mo ago
catlifeonmars•7mo ago
Zak•7mo ago
camgunz•7mo ago
[0]: https://optimizeyourbiology.com/smart-light-database
Zak•7mo ago
There are seven extended samples for CRI (R9-R15) not included in the average. LEDs often do particularly poorly on R9, a measure of saturated red rendering. LED sources with high R9 usually advertise it separately.
Tint, or blackbody deviation (Duv), is also important to the look of light and is listed on the chart, but not for every model. These numbers are very small, but important: anything outside of +/- 0.006 is not white light according to ANSI. +0.006 looks very green, and -0.006 looks very pink. Interestingly, after acclimating for a few minutes, most people think very pink looks better and more natural than neutral white[0]. Most people do not like green tint.
[0] https://www.energystar.gov/sites/default/files/asset/documen...
winrid•7mo ago
dotancohen•7mo ago
jalk•7mo ago
Workaccount2•7mo ago
dotancohen•7mo ago
bondarchuk•7mo ago
What is this world coming to, that you have to buy some top-range lamps just to see the inside of your own home in true colour...
bayindirh•7mo ago
I have to pay 3x the price for a CRI>90 LED w.r.t. a CRI>80 one. At least the price difference brings better light quality regardless of CRI (soft start, dimmability, even less flicker, better light distribution). On the other hand, I'm happy that I can get halogen bulbs if I really want to.
The problem comes from losing past frames of reference. We say "we're at 99% of benefit parity with the previous generation", but these 1% losses compound every generation, and now we live a more efficient, but arguably less comfortable, life.
A couple of Technology Connections (this guy is nuts when it comes to LEDs, in a good way) videos on the subject:
https://www.youtube.com/watch?v=qSFNufruSKw
https://www.youtube.com/watch?v=tbvVnOxb1AI
zzo38computer•7mo ago
(Unfortunately, other people where I live like to turn on the light even in the day time and that bothers me.)
bayindirh•7mo ago
It'd provide you nice warm light, and will allow you to flood the space with bright light if the need arises. Neither of them is expensive. Halogen bulbs are also CRI 100, so their color rendering is no different from incandescent bulbs.
Turning on lights when you have ample sun is not a wise choice, I agree.
floatrock•7mo ago
Average lifespan of an incandescent bulb is about 1,000 hours. For a typical 60 watt bulb, that means it burns 60 kWh in electricity over the course of its life. At $0.20/kWh, that means an incandescent is going to cost you $12 in electricity over its lifetime.
A Philips Ultra-Definition 4-pack of 60W-equivalent is $11.53 on Amazon today, or $2.88/bulb. That $3 bulb is actually 8W. So over those same 1,000 hours, that's 8 kWh, or $1.60 in electricity costs. So the $3 bulb saves you $10 in lifetime electricity costs vs. one incandescent.
But those bulbs are rated for 15,000 hours. Let's assume they all lie and deflate that by 1/3 (maybe a power surge will hit a few years in). That single $3 bulb still saves you 10 x $10 = $100 in electricity costs vs. incandescents over its useful life. A bit more if you pay California electricity rates, a bit less if you live near some hydro co-op. But the difference is large enough that the effect is true no matter where you are.
So yeah, top-range lamps give better results than the cheapo stuff, but top range isn't that much more expensive, and the lifetime savings of going to LED are hard to ignore -- op-ex vs. cap-ex if you will.
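The same arithmetic as a reusable calculation (same assumptions as above, plus a guessed $0.50 per incandescent bulb, which the comment doesn't price):

```python
# Lifetime cost = bulb purchase(s) + energy, over a common 10,000 h window
# (the 15,000 h LED rating deflated by a third, per the comment above).
def lifetime_cost(bulb_price, watts, hours=10000, kwh_price=0.20):
    return bulb_price + watts / 1000 * hours * kwh_price

led = lifetime_cost(2.88, 8)        # one $2.88 LED bulb
inc = lifetime_cost(0.50 * 10, 60)  # ten $0.50 incandescents (price assumed)

print(f"LED: ${led:.2f} vs incandescent: ${inc:.2f} over 10,000 h")
# LED: $18.88 vs incandescent: $125.00 -- bulb prices are noise next to energy.
```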
hulitu•7mo ago
and they last 1000 hours. Technology has evolved. Also the methods to take your money.
boomskats•7mo ago
ryandrake•7mo ago
mtalantikite•7mo ago
But I also live in a small NYC apartment, so I don't have your typical suburban house with 20+ light fixtures to deal with, I only have 6.
zzo38computer•7mo ago
BenjiWiebe•7mo ago
dzhiurgis•7mo ago
veltas•7mo ago
I drive a shallow car with old lights, and once I was blocked on a street by a much taller car sitting in front of me with very bright LED lights, and I couldn't see a thing because of the glare. I was unable to manoeuvre out the way because of this. They sat there for a minute or so stubbornly refusing to move for me before finally moving out the way.
Loughla•7mo ago
ryandrake•7mo ago
BoxOfRain•7mo ago
What I miss are the old low-pressure sodium street lights that used to be ubiquitous in the UK. Not everyone's cup of tea but they were highly efficient (outperforming LEDs for a surprisingly long time) and had this cool property of being so monochromatic they ate the colours out of everything. This made them useful for astronomers because their light was easily filtered, and reduced their impact on wildlife relative to harsh blueish LEDs. The main reason I like them is aesthetic though, they made night look like night rather than a poor approximation of day.
Thankfully my local area have given up trying to use the really harsh white they put in initially, and have at least starting putting in warmer LEDs.
xattt•7mo ago
It was so monochromatic that we thought we lost our shuttle bus stickers that were stuck to our shirts, and would have to walk around instead of being able to hop-on/hop-off. What a relief it was to emerge in daylight.
albrewer•7mo ago
The wavelength is so specific that it can have all kinds of cool applications:
https://youtu.be/UQuIVsNzqDk?si=R4VUDCfC6zcHd4XC
danudey•7mo ago
junon•7mo ago
zzo38computer•7mo ago
christophilus•7mo ago
jollyllama•7mo ago
kstrauser•7mo ago
jollyllama•7mo ago
On a long enough timeline, everything is disposable. And what is "disposal", really? How many LED bulbs are actually getting properly recycled, and isn't it true that the materials in incandescent bulbs are less harmful, relatively speaking, than those in LEDs?
leakycap•7mo ago
I don't like LED bulbs, but I think they clearly win the disposal/economical argument against incandescent in every way. Unfortunately they blink and have poor color reproduction in many versions.
jollyllama•7mo ago
leakycap•7mo ago
Don't forget the costs & emissions related to manufacturing and transporting all twenty of those incandescent bulbs.
As much as I like the old bulbs, they're unlikely to "win" in this question unless you are wanting to ignore the major lifespan difference.
Economically speaking, one can go to Dollar Tree and spend $1.25 and get a two pack of LED bulbs that will save 38 other bulbs from the manufacturing stream and landfill. Seems obvious?
mtalantikite•7mo ago
As someone that has never owned a car, is vegetarian, and walks/subways everywhere, I kinda feel like what's the big deal if I use a few incandescent bulbs every year in my small NYC apartment?
leakycap•7mo ago
Incandescents can be bad in lots of categories but if they make you happy, that's a huge plus. Some categories are much more important than others.
I'm lazy and prefer not to change bulbs multiple times a year... but the warm light is lovely when I do use a classic bulb. I live in a warm environment and hate the heat they add, but in your climate perhaps it is nice much of the year.
BenjiWiebe•7mo ago
However, if you're like my parents, they buy lots of very cheap LED bulbs which fail quickly and flicker a lot, because since they fail so often, nice ones are too expensive.
I buy bulbs that cost a bit more up front but have less flicker (undetectable to me when waving a hand under them), higher CRI, and last longer.
mtalantikite•7mo ago
However, I think I'm probably out of luck -- LEDs are cheaper and they likely don't break as much in shipping. So even if I personally find LEDs to be worse than incandescents -- they don't render reds properly, so even something simple like skin tones don't have the depth they once did, plus they give me migraines -- I likely won't be finding them on shelves anywhere near me ever again.
vitaflo•7mo ago
For bulbs tho I found this site that tests tons of them for flicker.
https://flickeralliance.org/
It was a godsend and I was able to get some Ikea bulbs with zero flicker and they’ve been great. So at least my house isn’t a flickering mess. Now I just gotta figure out a phone that’s not garbage.
prmph•7mo ago
vitaflo•7mo ago
bayindirh•7mo ago
Asking because I don't have a Phantom at home.
vitaflo•7mo ago
bmacho•7mo ago
bayindirh•7mo ago
Thicker bars mean a lower PWM frequency, hence lower-quality light/brightness control.
leakycap•7mo ago
vitaflo•7mo ago
leakycap•7mo ago
leakycap•7mo ago
The MacBook Air displays with the notch do not have PWM and do not seem to bother people. The 14/16" Pro models seem to be quite bad for most people (I had to return my new 14" model, it was rough).
The first PWM MacBook I bought was the 16" MacBook Pro from 2019 (the last Intel model). I'd had a 2018 MBP 15" and couldn't figure out why I just couldn't stand looking at the new 16". I thought I had a bad display but ended up learning about PWM.
cycomanic•7mo ago
vitaflo•7mo ago
Strangely, though, there is something about PWM flicker, especially the kind that is deep cycle (basically 100% on then 100% off), that is super bad for me. I can look at an old CRT fine because it's not completely on and off as it does its scanlines. But PWM is like flicking a light switch rapidly, and it gives me the worst headaches.
orloffm•7mo ago
yx827ha•7mo ago
yx827ha•7mo ago
mdip•7mo ago
Incidentally, I went with LIFX -- I had purchased their bulbs back when they were the only realistic option besides Philips Hue for smart RGBW bulbs[0]. Still seems those two brands produce the most flicker-free variety.
[0] LIFX was a handful of lumens brighter at the time and didn't have a hub requirement
Melatonic•7mo ago
hedora•7mo ago
Hyperikon used to make great, somewhat cheap bulbs, but went under during the pandemic.
These days, I just get cheap 3000K, 90+CRI ones at the hardware store, and they’re fine.
Watch out for idiocy like “smart” bulbs though. For instance, they have ones that change color temp if you rapidly flip the light switch on and off and back on.
Also, make sure they are dimmer compatible. The ones that are not flicker badly (even at full brightness on our dimmers) and burn out after a few years.
W3zzy•7mo ago