These LED flickers actually trigger ocular migraines for me. I tried switching to LEDs when the incandescent ban hit the US, and ended up with a Philips Hue system. I had 4 migraines in 3 days and had to send them back. I purchased as many incandescent bulbs as I could find, but they were nearly impossible to find at that point.
I've got a couple bulbs from Waveform Lighting and they don't flicker, but I totally can tell the reds are off.
I really hate the LED transition. My building replaced all the outdoor lights with them, and now it's just too bright to sit on my stoop at night like used to be so common here in Brooklyn. My backyard neighbor put in an LED floodlight and now I have to buy blackout curtains. I drive rarely, but the oncoming headlights are blinding when I do. It's pretty depressing if I think about it too much.
>My building replaced all the outdoor lights with them, and now it's just too bright to sit on my stoop at night like used to be so common here in Brooklyn.
What I miss are the old low-pressure sodium street lights that used to be ubiquitous in the UK. Not everyone's cup of tea but they were highly efficient (outperforming LEDs for a surprisingly long time) and had this cool property of being so monochromatic they ate the colours out of everything. This made them useful for astronomers because their light was easily filtered, and reduced their impact on wildlife relative to harsh blueish LEDs. The main reason I like them is aesthetic though, they made night look like night rather than a poor approximation of day.
Thankfully my local area has given up trying to use the really harsh white they put in initially, and has at least started putting in warmer LEDs.
Sodium light story: my family went to Niagara Falls and took a tour of the tunnels behind the falls. The tunnels were lit by sodium lights that were not augmented by phosphors.
It was so monochromatic that we thought we lost our shuttle bus stickers that were stuck to our shirts, and would have to walk around instead of being able to hop-on/hop-off. What a relief it was to emerge in daylight.
I knew what this link was before I clicked it, and it's a must-watch for anyone who's interested in basically any aspect of film making, cinematography, or light science in general. Extremely cool topic.
I did not know those things about the sodium lights, but I did think they were better than the LEDs, which are too bright and have other problems. However, I think they should not put so much light outside at night anyway.
I have your problem. Philips makes no-flicker LEDs: they don't use PWM like the Hue system, and they have good capacitors, so no 60 Hz flicker. They're the "Ultra Definition Eye Comfort" models.
On older LEDs you could replace the caps, but newer ones use SMT caps that require more than just a soldering iron. It makes a huge difference, and also eliminates the slight delay when switching the lamp on (though I don't recall experiencing this with recent LED lamps, even cheap ones).
I searched for images of "LED bulb teardown"; the SMT components are not that tiny, so it seems quite possible to solder them with a regular iron. Am I missing something?
Kinda, but you're also not buying a mini space heater.
Average lifespan of an incandescent bulb is about 1,000 hours. For a typical 60-watt bulb, that means it burns 60 kWh of electricity over the course of its life. At $0.20/kWh, an incandescent is going to cost you $12 in electricity over its lifetime.
A Philips Ultra-Definition 4-pack of 60W-equivalent is $11.53 on Amazon today, or $2.88/bulb. That $3 bulb is actually 8W. So over those same 1,000 hours, that's 8 kWh, or $1.60 in electricity costs. So the $3 bulb saves you $10 in lifetime electricity costs vs. one incandescent.
But those bulbs are rated for 15,000 hours. Let's assume they all lie and deflate that by 1/3 (maybe a power surge will hit a few years in). That single $3 bulb still saves you 10 x $10 = $100 in electricity costs vs. incandescents over its useful life. A bit more if you pay California electricity rates, a bit less if you live near some hydro co-op. But the difference is large enough that the conclusion holds no matter where you are.
So yeah, top-range lamps give better results than the cheapo stuff, but top range isn't that much more expensive, and the lifetime savings of going to LED are hard to ignore -- op-ex vs. cap-ex if you will.
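If you want to sanity-check that arithmetic, here it is as a quick Python script (the incandescent bulb price is my own assumption; everything else is the numbers above):

    # Back-of-envelope lifetime cost, rates and lifetimes as above.
    # The incandescent bulb price is my assumption; adjust to taste.
    RATE = 0.20            # $/kWh
    HOURS = 10_000         # deflated LED lifespan (15,000 rated * 2/3)

    def lifetime_cost(watts, bulb_price, bulb_life_hours, total_hours=HOURS):
        """Electricity plus replacement bulbs over total_hours of light."""
        bulbs = total_hours / bulb_life_hours
        energy_kwh = watts * total_hours / 1000
        return bulbs * bulb_price + energy_kwh * RATE

    incandescent = lifetime_cost(watts=60, bulb_price=1.50, bulb_life_hours=1_000)
    led = lifetime_cost(watts=8, bulb_price=2.88, bulb_life_hours=10_000)
    print(f"incandescent: ${incandescent:.2f}, LED: ${led:.2f}")
    # incandescent: $135.00, LED: $18.88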
Personally, I'd pay a lot more in electricity costs to have light that has full spectrum output. The Waveform lights I bought are about $40/bulb, and they're nicer than the Philips I tried, but they're still not as nice as a regular full spectrum incandescent.
But I also live in a small NYC apartment, so I don't have your typical suburban house with 20+ light fixtures to deal with, I only have 6.
Incandescent has other advantages. For example, in winter, when it is cold and also dark, the heat can be beneficial. In summer you should not need the light so much, since there is already light. Either way you should not need to use the light too often, and if you do not use it too often then you save energy that way too, and the bulbs do not need to be replaced as often.
You're suggesting that LED light bulbs need replacing every year, which hasn't been my experience (like, at all). I switched over to LED bulbs 10 or so years ago and haven't had to replace a single one yet.
I’ve got outdoor LED lights that fail constantly. So often that I keep dozens of them in storage to replace them as they die. Much less reliable than the incandescents they replaced. In fact, I have a string of about 50 sockets; about half are still incandescents that have survived for 10+ years, and the other half are LEDs that I have to keep replacing. Sadly, whenever an incandescent light goes, I have to replace it with the crappy LED version, so eventually it will be 100% crap.
I really hate modern technology sometimes. I have nothing against being more energy efficient, running cooler, lasting longer, but we're losing some great things along the road.
I have to pay 3x the price for a CRI>90 LED w.r.t. a CRI>80 one. At least the price difference brings better light quality regardless of CRI (soft start, dimmability, even less flicker, better light distribution). On the other hand, I'm happy that I can get halogen bulbs if I really want to.
The problem comes from losing past frames of reference. We say "we're at 99% of benefit parity with the previous generation", but these 1% losses compound with every generation, and now we live a more efficient, but arguably less comfortable, life.
A couple of Technology Connections (this guy is nuts when it comes to LEDs, in a good way) videos on the subject:
I would rather buy an incandescent light (even if I have to pay 3x or 5x) which is not as bright as the LED (forty watts or possibly even lower should be sufficient; I have a few 40W incandescent lights and they are good enough), and then not turn it on in the daytime when it is light outside.
(Unfortunately, other people where I live like to turn on the light even in the daytime, and that bothers me.)
You can buy an E27 halogen bulb around 50 watts (which would be around 100W incandescent) and pair it with a universal dimmer.
It'd provide you nice warm light, and will allow you to flood the space with bright light if the need arises. Neither of them is expensive. Halogen bulbs are also CRI 100, so their color rendering is no different from incandescent bulbs.
Turning on lights when you have ample sun is not a wise choice, I agree.
It was a godsend and I was able to get some Ikea bulbs with zero flicker and they’ve been great. So at least my house isn’t a flickering mess. Now I just gotta figure out a phone that’s not garbage.
How did you survive office buildings, airports, department stores, etc. prior to LEDs? I am by no means sensitive to flicker, but fluorescent tubes, which were used in pretty much every office building and large department store before LEDs became common, used to flicker like crazy in comparison. And there was always at least one that was broken and flickered on and off very slowly (including the characteristic sound).
Not well. But at least I had windows at work to counteract it and I could turn the lights off. I also was a contractor for 15 years which meant working from my home mostly.
Strangely though, there is something about PWM flicker, especially the kind that is deep-cycle (basically 100% on then 100% off), that is super bad for me. I can look at an old CRT fine because it's not completely on and off as it draws its scanlines. But PWM is like flicking a light switch rapidly, and it gives me the worst headaches.
Thanks for the link! I bought a Moto G Power 5G - 2024. It has an LCD screen. I've been using this line of phone for years. It's nothing fancy, but at least I can actually look at it.
In my testing no, the MacBooks are fine. It's the phones that are a problem. The easiest way to tell is to get a camera app where you can set the shutter speed to 1/10,000 s and point it at a screen. If there are black lines across the screen, there is a PWM issue. The thicker and darker the black lines, the worse it is.
Due to rolling shutter, you take a “progressive” photo. As a result, if the screen flickers during that time, you see this change in light intensity as horizontal bars.
Thicker bars means a lower PWM frequency, hence lower quality light/brightness control.
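If you also know (or can look up) your sensor's row readout time, you can turn bar spacing into a rough frequency estimate. A sketch, with made-up numbers:

    # Rough PWM frequency estimate from rolling-shutter banding.
    # One light+dark band pair spans one PWM period's worth of sensor rows.
    # row_readout_s is sensor-specific; the values below are made up.
    def pwm_freq_hz(band_spacing_rows, row_readout_s):
        return 1.0 / (band_spacing_rows * row_readout_s)

    # e.g. bands repeating every 90 rows, ~10 microsecond row readout:
    print(f"{pwm_freq_hz(90, 10e-6):.0f} Hz")  # ~1111 Hz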
I think it varies by model. The 14" & 16" MacBook Pros with the miniLED give me and many others PWM issues. On the other hand, the MacBook Air models with the notch don't seem to bother most people.
The MacBook Air displays with the notch do not have PWM and do not seem to bother people. The 14/16" Pro models seem to be quite bad for most people (I had to return my new 14" model, it was rough).
The first PWM MacBook I bought was the 16" MacBook Pro from 2019 (the last Intel model). I'd had a 2018 MBP 15" and couldn't figure out why I just couldn't stand looking at the new 16". I thought I had a bad display but ended up learning about PWM.
That sucks; I feel your pain. I, too, strongly dislike overly bright lighting.
I wonder if there's room to at least engage with the neighbor to talk about friendlier light options? You might also be able to engage with these folks to see if there are efforts to improve the lighting in New York: https://darksky.org/
> I drive rarely, but the oncoming headlights are blinding when I do.
I drive a low car with old lights, and once I was blocked on a street by a much taller car sitting in front of me with very bright LED lights, and I couldn't see a thing because of the glare. I was unable to manoeuvre out of the way because of this. They sat there for a minute or so, stubbornly refusing to move, before finally getting out of the way.
It's super common where I live for teenagers who drive jacked up trucks to replace their headlights with super bright led lights. They don't adjust the angle of the beam, so they're just like brights all the time. It's miserable.
Also, more people seem to be driving with their bright lights on 100% of the time. I once rode as a passenger at night with an ex-coworker driving and I noticed he used his brights the whole time, even when there were oncoming cars. I asked him why and he looked at me like I was stupid and said “because they’re brighter and let me see better.” When I pointed out that they blind other drivers he just shrugged and said “fuck em, not my problem.”
My take is that PWM dimmers are dramatically more energy efficient than the old rheostat dimmers people used to use. If you operate a transistor in a digital mode where it is either on or off it is close to 100% efficient, but if you operate it in a 50% power mode you have to send 50% of the power to a load and the other 50% to a resistor. Thus CMOS logic eradicated bipolar, switching power supplies replaced linear power supplies, a Class D amplifier can be a fraction the size of a Class A amplifier, etc.
You could probably still reduce the flicker by either increasing the switching frequency or putting some kind of filter network between the switch and the load.
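To put rough numbers on the efficiency argument, here's an idealized sketch (all values assumed, and it ignores real-world switching losses):

    # Idealized comparison, values assumed.
    V_SUPPLY = 6.0   # volts available
    V_LED = 3.0      # LED forward voltage at the chosen current
    I_LED = 0.35     # amps

    # Linear (pass-transistor) dimming: the regulator drops the excess
    # voltage at the full LED current and burns it as heat.
    p_led = V_LED * I_LED                  # 1.05 W doing useful work
    p_wasted = (V_SUPPLY - V_LED) * I_LED  # 1.05 W of heat

    # Ideal PWM: the switch is fully on (no voltage drop) or fully off
    # (no current), so it dissipates ~nothing between switching instants.
    print(f"linear efficiency: {p_led / (p_led + p_wasted):.0%}")  # 50%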
Old dimmers are triac based, with the potentiometer simply setting the trigger voltage, not doing the actual dimming. These were in fact very efficient.
For sure, they're definitely way more efficient. They just unfortunately give me migraines. I'd be open to trying some that have a filter network or some other smoothing on the flicker.
But I've also never lived in a house that has dimmers (they've all been old homes in the north eastern US) and I never use overhead lighting, so it's not something I need or would miss.
Have you ever tested various PWM frequencies? 50/60 Hz is very noticeable - but what if the PWM is switching at 1000 Hz? 5 kHz? There is presumably a rate at which it is imperceptible to you?
Apparently Philips Hue uses 500-1000 Hz. I wonder if there are manufacturers that use a much higher rate.
Beyond the Hz, the depth of the modulation matters. I am sensitive to poor PWM implementation, but Hue bulbs luckily don't bother me.
On an old iPhone with basic slow-mo recording capabilities, typical Hue bulbs don't "blink" when the video plays back, but the PWM-dimmed iPhone in the same video recording was blinking/flashing like crazy.
Another example of the PWM details mattering: I can't use any iPhone with OLED (anything from the X to current), but I am able to use a Note9 which has OLED with DC-style PWM.
I was referring to the extent of brightness variation in the flicker -- while many focus on a higher frequency (Hz) to reduce eyestrain, the key factor is how the screen behaves during the off cycle
Some PWM implementations ramp the brightness up and down slightly (easier on eyes), while other manufacturers flip the switch on and off harshly (like strobing)
A shorter dark interval between lit periods means a shorter pulse duration, and the pulse duration and depth are more important than the Hz
Aside from that Wikipedia article, where 1 source is not available and the other one is in Finnish, there's pretty much nothing online.
I googled for G4 LED tube PWM and got products that say they are G4 LED tubes that use PWM.
Pretty sure 100% of LED products sold anywhere use PWM if you don't run them at full brightness. I sometimes walk around lighting stores with a slo-mo camera and see PWM in every price bracket.
It is always PWM under the hood, the question is, how much was spent (or not) on the filtering network out of the PWM. Is it closer to buck converter or is it straight up flicker at the output.
Since these things have lots of LEDs, my first thought was to put a range of different tiny delays on them to induce destructive interference, so that the off parts of one LED's flicker are the on parts of another, to smooth out the overall output.
Actually that's not true, my first thought was "just use a layer of phosphor excited by the LEDs", but fluorescent tubes do that and people used to make the same complaints about flicker, so.
Looks like "flicker index" is a useful(?) search term, anyway.
Personally, I don't care for more energy efficiency if my head is hurting after 30 minutes under that light. I can really see all the flickering when I blink or move my head.
Similarly, I prefer a Class A amplifier if I have the space, but I won't open that can of worms here.
Lights for high speed cameras use really good filtering on their PWM switching, or just linear power supplies. It would be nice to have a premium bulb that has longer life and much less flickering.
Philips' Eye Comfort series has virtually no flicker. I personally only use this series. They come in CRI>80 and CRI>90 variants; the latter is also dimmable and comes with soft on/off.
Right there with you. I just dropped an amount of money I'm unwilling to admit on replacement bulbs throughout -- only this time I was replacing WiFi RGBW LED bulbs in rooms that had lower-end bulbs (almost everything on the market).
Incidentally, I went with LIFX -- I had purchased their bulbs back when they were the only realistic option besides Philips Hue for smart RGBW bulbs[0]. Still seems those two brands produce the most flicker-free variety.
[0] LIFX was a handful of lumens brighter at the time and didn't have a hub requirement
Well, no one is forcing you to buy incandescents, but I certainly would be happy to buy them if they're available.
However, I think I'm probably out of luck -- LEDs are cheaper and they likely don't break as much in shipping. So even if I personally find LEDs to be worse than incandescents -- they don't render reds properly, so even something simple like skin tones don't have the depth they once did, plus they give me migraines -- I likely won't be finding them on shelves anywhere near me ever again.
On a long enough timeline, everything is disposable. And what is "disposal", really? How many LED bulbs are actually getting properly recycled, and isn't it true that the materials in incandescent bulbs are less harmful, relatively speaking, than those in LEDs?
I have never heard of anyone recycling an incandescent bulb, but recycling bulbs became a big deal when CFLs came out, and most people seem to be in the habit in my area. Stores have bulb take-backs for CFL/LED in the entryway, for example.
I don't like LED bulbs, but I think they clearly win the disposal/economical argument against incandescent in every way. Unfortunately they blink and have poor color reproduction in many versions.
Then you should probably reframe your question as: approximately how many incandescent bulbs would a person buy during the typical lifetime of an LED bulb? On the low end, it's 20.
Don't forget the costs & emissions related to manufacturing and transporting all twenty of those incandescent bulbs.
As much as I like the old bulbs, they're unlikely to "win" in this question unless you are wanting to ignore the major lifespan difference.
Economically speaking, one can go to Dollar Tree and spend $1.25 and get a two pack of LED bulbs that will save 38 other bulbs from the manufacturing stream and landfill. Seems obvious?
I looked at the photometric reports from a couple Waveform models on their website and the R9 (saturated red rendering) was in the 90s for both with tint almost exactly on the blackbody line. The 2700K did have a bit worse R9 than the 4000K so I could imagine it doesn't look exactly like an incandescent.
I mean I hate to be like "vibes", but, kinda vibes. There's just something about the light coming out of the Waveform LED I have in one lamp in my living room versus the incandescent I have on the other side of the room. Definitely not a scientific take!
I did at one point randomly put the LED in different configurations when I first got it and my wife was able to pick out which lamp had the LED in it every time. They just have a different feel, even if the temperature rating is around the same as the incandescent and the R9 was the highest of the LEDs I evaluated. At least these Waveform LEDs don't give me migraines though.
"Vibes" are fair. I just put flashlights with two incandescent-like LEDs (Nichia 519A and Nichia B35A in 2700K) and I can see a slight difference in how they render colors even though the spectrophotometer says all the major metrics are within a couple points of each other.
Looking closely at the measurements, when the B35A has an advantage on individual CRI samples, it's usually a larger gap than when the 519A has an advantage. They're both in the 90s for R1-14, and it takes a keen eye to tell the difference.
I recently learned about the Color Rendering Index, which sounds like pseudoscience but apparently is not. Here's a handy table I used for buying lights; again, the domain sounds grifty, but it's a searchable table :shrug: [0].
CRI is absolutely real. It's an old and relatively simplistic metric with several potential successors. The chart you linked uses one: TM30, which is based on the average of 99 different colors instead of CRI's 8.
There are seven extended samples for CRI (R9-R15) not included in the average. LEDs often do particularly poorly on R9, a measure of saturated red rendering. LED sources with high R9 usually advertise it separately.
Tint, or blackbody deviation (Duv), is also important to the look of light and listed on the chart, but not for every model. These numbers are very small, but important: anything outside of +/- 0.006 is not white light according to ANSI. +0.006 looks very green, and -0.006 looks very pink. Interestingly, after acclimating for a few minutes, most people think very pink looks better and more natural than neutral white[0]. Most people do not like green tint.
I don’t have the migraines, but everything else you described is spot on. Sitting outside at night is much less enjoyable due to neighborhood LED lighting. If I could, I’d shoot them all out.
If you're savvy with manufacturing, make yourself a left-handed Edison thread (I can't find them anywhere). Left-handed incandescent lightbulbs are still legal.
Also, you can buy high-wattage lights, and three-way bulbs have lower-wattage settings.
Finally, outdoor and appliance incandescent lamps are very inefficient, but last forever.
I wonder to what degree their effects go beyond migraines. Perhaps irritability? Mental confusion? Anxiety? I'm spitballing here, but it does seem like our world has become a place of more anxiety and irritation. I would love for it to be something we could take control of.
I doubt it's the light bulbs. I posited the other day, by assembling a few different ideas, that Trauma-Based Entertainment is to blame for this. Something like two-thirds of all television programming is law-enforcement adjacent. True crime is super popular on TV: Law & Order, NCIS, FBI this-and-that. And what's one of the largest advertising cohorts?
Medicine for depression, anxiety, insomnia...
It's nearly a closed loop; something I intuitively realized shortly after 2001/09/11. By the end of that year I decided I would no longer have a "Television" attached to CATV/SAT/ANT service.
I'm not sure if I am correct; I haven't really dedicated a lot of time to getting the exact numbers or talking to psychologists and sociologists and the like. But two people I know had "breakdowns" (grippy sock) in the last month, and both of them always have true crime on TV in the background or listen to true crime podcasts. Shortly after that happened I was listening to the Moe Facts podcast, where Moe used the term "trauma based entertainment" and something clicked. Moe didn't mention "it's because of pharma ads" - that's my own input, after having worked for the largest television "broadcast" company in the world just long enough to see the advertiser "dinner".
The only ones watching traditional OTA TV anymore are elders. That advertising cohort is why OTA TV ads are filled with pharmaceuticals and "you may be entitled to financial compensation" type ads, at least where I'm at. Traditional TV has been dying since Youtube and broadband. MTV plays Ridiculousness constantly because no one is actually watching it.
> It's nearly a closed loop; something I intuitively realized shortly after 2001/09/11. By the end of that year I decided I would no longer have a "Television" attached to CATV/SAT/ANT service.
Curiously this is about the same time I decided to give up on TV and radio as well.
It's definitely long lost its crown as the main way to watch video, but linear TV does still have a role. Apparently there's still the odd broadcast in the UK that means the national grid has to work to keep the frequency stable when everyone goes to put their kettles on in the ad breaks.
Live sports. Latency is so much lower on OTA television that you can tell who is watching a football match on UHF, cable or multicast IPTV, satellite and through unicast internet.
I don't know the content breakdown of online videos, but I know that creators like Audit the Audit are in the algo, and that's trauma-based entertainment as well: the "here's the story and police interview of so-and-so" videos, plus the news stations that have a YouTube presence.
Movies are another one, and lots of people watch movies. If I go on Hulu or Netflix and start tallying the genres (either TBE or not-TBE), what do we figure it will be?
The person I heard use the phrase "Trauma Based Entertainment" used it to describe movies that "we were sat down to watch when we were 9-12." Unfortunately the podcast I mentioned isn't super advanced on the backend, so I am unsure how to share clips at this point. But I've heard the claim "young women as a demographic listen to true crime" repeated as a truism. I know the women close to me listened to this sort of content in the past or currently. I'm not trying to generalize this to the entire cohort.
Also, I only think, myself, that TBE/true crime/etc. is harmful; I'm not a sociologist or psychologist.
There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it, especially in battery-powered devices such as phones or laptops. Pulse modulation is common.
Human vision has a pretty limited response speed, so it seems pretty unlikely that PWM at a reasonable speed (hundreds of hertz to tens of kilohertz) can be directly perceived. That said, it can produce a stroboscopic effect, which makes motion look weird and may be disorienting in some situations. So I don't have a problem believing that it can cause headaches in predisposed individuals.
You can dim your laptop screen in a darkened room and wave your hand in front. Chances are, you're gonna see some ghost images.
Other than adjusting the frequency, pulse modulation can be "smoothed" in a couple of ways. White LEDs that contain phosphor will have an afterglow effect. Adding capacitors or inductors can help too, although it increases the overall cost. But that doesn't make the display "PWM-free", it just makes it flicker less.
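Here's a toy model of the afterglow effect, treating the phosphor as a first-order low-pass filter; the 1 ms time constant is made up, real phosphors vary a lot:

    import numpy as np

    # Phosphor afterglow modeled as a first-order low-pass on the drive
    # waveform. The 1 ms time constant is illustrative; real phosphors vary.
    fs = 1_000_000                     # simulation rate, Hz
    t = np.arange(0, 0.02, 1 / fs)     # 20 ms window
    drive = ((t * 500) % 1.0) < 0.5    # 500 Hz, 50% duty PWM

    tau = 1e-3
    alpha = (1 / fs) / tau
    light = np.zeros_like(t)
    for i in range(1, len(t)):         # exponential smoothing
        light[i] = light[i - 1] + alpha * (drive[i] - light[i - 1])

    steady = light[len(light) // 2:]   # skip the startup transient
    depth = (steady.max() - steady.min()) / steady.max()
    print(f"modulation depth: {depth:.0%}")  # ~63%, vs. 100% for the raw drive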
The article is worse than confused. It's a marketing piece written in a way that sounds vaguely science-y but with only a tenuous basis in real research.
> I'm willing to entertain the idea that LED flicker is actually problematic, but I wish essays like this would be honest about the degree of confidence we have given the current state of the evidence.
I am (and it seems others are too) very interested in this topic. I would appreciate it if you could write an article with "less confusion" so I can save it in my tumblog for future reference.
Yep, it's obvious that a lot of people are interested because junk articles like this usually get penalized harder. That interest is what this piece is preying on. Unfortunately it's easier to write a piece that tells people what they want to hear and spreads FUD than it is to write a piece that corrects the misinformation.
Much more rewarding too, because "we really don't know very much about this yet" is hard to expand to a full click-worthy essay and less likely to move product.
I think it’s pretty common for people to be able to perceive PWM flicker in their peripheral vision that they can’t when looking directly at the source. I encounter this fairly regularly myself.
Nah, LED lighting generally uses at least 200 Hz at a minimum. Some up to kHz. You can't perceive that. Older stuff or cheap quality might be using un-rectified AC/DC which you can see. Like cheap Xmas lights.
I very easily can. I had to get rid of an otherwise good monitor a few years ago before I knew it used PWM to control the backlight (and before I even knew PWM was used at all for this functionality — I only had experience with CCFL backlight before that).
It was really annoying to look at, like looking directly at cheap fluorescent lighting. Miraculously, setting brightness to 100% fixed the issue.
By googling around, I found that it used PWM with a modulation frequency of 240 Hz, with a duty cycle of 100% at full brightness, which explained everything.
I can also easily perceive the flickering of one of my flashlights, the only one that uses PWM at a frequency of a few hundred hertz. Other flashlights either run at multiple kHz or don't use PWM at all, and either one is much easier on the eyes.
Some of us really do perceive this stuff, which can be hard to believe for some reason.
Like back when CRTs were mainstream you'd have those computer labs with monitors set to 60-85Hz and most people wouldn't notice, but some would. I definitely did, I couldn't stand looking at a CRT set to less than 100 Hz for more than an hour.
Absolutely, I also had massive issues with them, ending with red eyes and headaches within an hour of use. Getting my first LCD monitor (with a CCFL backlight) was out of this world.
In gaming situations what they perceive may not be the actual "flicker" of frames but the input->to->display latency, which is a very different thing to notice.
Hue bulbs have pretty bad CRI, actually. They only claim >80, which almost any LED bulb is capable of these days. A good LED bulb (including those made by Philips these days) has a CRI >95.
Depends on the bulb. For those of us with "fast" eyes, some LED bulbs that are just fine for others are a subtly flickering infuriation generator.
You may not believe that people that can see 120->240Hz flicker exist, but we do. In this era of frequently-cheap-ass LED lighting, it's a goddamn curse.
> ...between cheap LEDs and expensive ones (hue)...
If so, they're referring to the Hue brand of bulbs, rather than the color property. More evidence for the fact that they're talking about flicker is that they quoted this to indicate that they were replying to it:
> > Nah, LED lighting generally uses at least 200 Hz at a minimum. Some up to kHz. You can't perceive that.
I had to shop around a LOT before I could find LED headlight bulbs with just a fan and a bare-ass resistor for current control so I wouldn't see that forsaken flicker.
It's really bad for me as I work in an LED and LASER facility. I handle ALL the PWM stuff while everyone else handles the simple led/resistor/connector board assemblies. EVERYTHING FLICKERS.
My thanks to you for shopping around for non-shit headlights. For a long, long while it seems like every third car had strobing headlights. (Now, the primary problem is INTENSELY bright and poorly-aimed headlights. I'm not sure which is worse, to be honest... but both SHOULD be super illegal.)
> EVERYTHING FLICKERS.
I absolutely could not handle that. My sincerest condolences.
> Nah, LED lighting generally uses at least 200 Hz at a minimum.
Eh, they use what they can get away with. Nobody is out there policing flicker rates. Especially when you add a dimmer into the mix, there's a lot of room between good and bad, and when you're at the hardware store buying bulbs, there's not much to indicate which bulbs are terrible.
Lots of people don't seem to notice, so the terrible ones don't get returned often enough to get unstocked, and anyway, when you come back for more in 6 months, everything is different even if it has the same sku.
Actually, Energy Star and California's Title 24 have flicker standards. They may not go as far as some people like, but you can look for these certifications to know that a bulb at least meets a certain minimum standard.
There's a third way: a switched-mode power supply with regulated output current. This is used in most better-designed flashlights (which doesn't always correlate to price) and can be used by anything else that needs to drive an LED as well.
The article doesn't discuss what technique should be used for "constant current reduction"; it probably shouldn't be a linear regulator where efficiency is a priority.
PWM is less annoying if the frequency is very high (several kHz), though I'll leave it to people who research the topic to speak to health effects.
In the world of LEDs, a switched-mode power supply with regulated output current is called a "constant current driver". I assume that's what this calls "DC dimming".
A linear regulator might also reasonably be described as "DC dimming" or "constant current". The article is only concerned with flicker so it doesn't discuss efficiency.
It causes physical pain for many. You can't always see the flicker but your eye muscles react faster than your perception, and they get sore.
I had to get rid of a Samsung TV (120 Hz backlight; replaced it with a linearly dimmed Sony, nicer anyway) due to this, and I can only use modern phones at 100% brightness, which disables PWM.
When I moved into my current residence I couldn't figure out why my eyes were always sore. I realized my landlord put in cheap LED can lights. I swapped them out for nicer ones, pain gone. People need to stop being cheap AF.
It makes me wonder how differently we all perceive the world around us, and if that impacts our decisions in more ways than we realize.
I've had to drastically change my devices after learning PWM was causing my vision issues and eye pain. My partner has no issues using those same devices.
> There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it. Pulse modulation is common.
Within the pulse modulation case, though, there are two important subcases. You can PWM a load that consists basically of just the LED itself, which acts as a resistive load, and will flash on and off at high rate (potentially too fast to be noticeable, as you say). But you can also PWM an LED load with an inductor added, converting the system into a (potentially open loop) buck converter. And this allows you to choose both the brightness ripple and the PWM frequency, not just have 100% ripple. Taking ripple down to 5%, or 1%, or less, is perfectly straightforward… but inductors are expensive, and large, so shortcuts are taken.
Yes, but you need to at some point decide it’s worth adding an inductor to the BOM — trace inductance isn’t enough at any sane switching frequency (and FET switching losses kick in at the insane frequencies). And BOM item costs more than no BOM item.
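The standard buck ripple formula makes the trade-off concrete. A sketch, with an assumed operating point:

    # Standard buck-converter inductor ripple: dI = V_out * (1 - D) / (L * f_sw).
    # Operating point below is assumed: a 3 V LED load at 0.5 A average, 50% duty.
    def ripple_pct(v_out, duty, L, f_sw, i_avg):
        return 100 * v_out * (1 - duty) / (L * f_sw) / i_avg

    for L, f in [(470e-6, 50e3), (47e-6, 50e3), (47e-6, 500e3)]:
        print(f"L={L*1e6:.0f} uH, f={f/1e3:.0f} kHz: "
              f"{ripple_pct(3.0, 0.5, L, f, 0.5):.0f}% ripple")
    # Shrink the inductor 10x and the ripple grows 10x (>100% means the
    # current actually goes discontinuous, i.e. full on/off flicker);
    # raising the switching frequency 10x buys it back.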
>There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it, especially in battery-powered devices such as phones or laptops. Pulse modulation is common.
The other reason is LED brightness and color is quite non-linear with current, so PWM gives a much more straightforward dim than adjusting a current smoothly.
No; there are two main methods to change LED brightness: changing the duty cycle, or changing the current. PWM is one way to change the duty cycle, and linear regulation is one way to change the current. There are other ways to do either.
I wonder if it’s possible to have randomized sub-cycle interval variation so as to “spread out the spectrum” of the PWM signal while preserving the same resulting integrated brightness over the observable timescale.
Basically like how Apple laptop fans work, but for temporal modulation of signal instead of spatial modulation of fan blade gaps.
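It should be, at least in principle: as long as the on-fraction of each jittered period is held constant, the average brightness is unchanged. A sketch, all values illustrative:

    import random

    # Jitter each PWM period while keeping the on-fraction constant, so the
    # average brightness is preserved but the flicker energy is spread over
    # a band of frequencies instead of one sharp tone.
    def dithered_pwm(duty, base_period_us=100.0, jitter=0.3, cycles=1000, seed=0):
        rng = random.Random(seed)
        on_total, t = 0.0, 0.0
        for _ in range(cycles):
            period = base_period_us * (1 + rng.uniform(-jitter, jitter))
            on_total += duty * period  # on-time scales with the jittered period
            t += period
        return on_total / t

    print(f"average duty: {dithered_pwm(0.25):.3f}")  # stays 0.250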
There are other ways to dim LEDs. Voltage stepping to control how much current can actually flow through the diode is one method (and this is how we test LEDs where I work, a 3V LED gets tested at ~2.4-2.6V. It will barely draw a couple milliamps despite the power supply set to allow an amp of draw.) The LED will light up, enough to see that it is working well, but not eye-searingly bright.
Voltage control of LEDs is extremely unreliable. The amount of current consumed at a specific voltage will vary wildly depending on temperature, process, etc. Nobody does voltage control for LEDs unless you do not care about consistent brightness at all.
> No, no it isn't. We can keep units within half a percent of starting output all day long.
On what, exactly? How can you possibly guarantee output using purely voltage control of an LED? LEDs (and laser diodes) are fundamentally current controlled devices. You need current feedback to set the output brightness operating points.
> Voltage control is done explicitly on laser diodes, to boot. And those are WAY MORE FINICKY than an LED.
Maybe if you don't care about the output power of the laser diode. Just not practical, and will change output power at the same voltage as temperature changes.
"Human vision has a pretty limited response speed, so it seems pretty unlikely that PWM at a reasonable speed (hundreds of hertz to tens of kilohertz) can be directly perceived."
I make power drivers. I have this ultra-tiny one, with output scope captures. It produces ~825 kHz PWM output, single-digit mV and mA ripple, and 94+% efficiency depending on input voltage (output is input minus 0.2 V).
I can induce saccades in my eyes at will and at high speeds. Couple that with waving my hand in front of my face as I do it, add in human vision persistence, and I can get artifacting that reveals flicker even at that high a PWM rate. Only direct battery power fails to induce that artifacting effect in my vision when I do that combination of movements.
Isn't this article conflating PWM flicker with cheap AC rectifiers causing LEDs to flicker because they periodically, at 50 Hz, get under the voltage they need to emit any light? I can't see why light fixtures in buildings—except modern ones where the lights are actually dimmable—would even have any sort of PWM.
Yes, it absolutely is. Those are candle-style bulbs that use filament LEDs; they normally just use a basic rectifier and a regulator chip that does basic linear regulation of the peak of the sine wave.
They flicker at 100 Hz due to the rectification (or 120 Hz in 60 Hz countries, of course).
There is a pendant light in my entryway that came outfitted with LED filament bulbs when I bought my home. They flicker noticeably, and I would have replaced them all if it were in my living room.
One of the bulbs recently burned out, and I picked up a replacement at Menards. Even though it was just a basic Sylvania, the new one clearly has a well-filtered driver circuit, as it does not exhibit any flickering that I can detect.
So anecdotally at least, the cheap bulbs without rectifiers seem to be going away from the big box stores (although I’m sure you can still get them with unpronounceable all-caps names from Amazon).
> I can't see why light fixtures in buildings—except modern ones where the lights are actually dimmable—would even have any sort of PWM.
It is done because, like most crappy things in the world, it saves somebody, somewhere, a few cents on the dollar.
Most people would not be able to tell the CRI impact of DC dimming vs PWM. Many do not visibly notice the difference. (I unfortunately do, and you won’t believe how many expensive Mercedes and similar cars flicker).
But high frequency PWM is slightly more difficult and expensive, and DC dimming might need a few more capacitors or inductors… so let’s save a buck there, shall we?
I've wondered about PWM flicker when I started trying to figure out why so many modern car headlights seem like they are strobing to me.
Initially I thought it might be related to the alternator.
I still don't know why I perceive these headlights as having an annoying flicker. I'd love it if some (informed) commenter could clear it up for me. Am I imagining it?
Car headlights seem to really cheap out on the PWM flicker. Even the 2-euro LEDs I buy at the discount store seem less flickery than the lights of some luxury cars. I thought it could be that people are buying the cheapest replacement bulbs they can get their hands on, but then I saw the same thing on a new BMW.
I also believe some people are just more affected by flicker than others. Some get headaches or migraines from working under PWM light, others don't even notice.
I'm not a mechanic, but I believe these car lights are capable of achieving some pretty high brightness (necessary for fog lights etc.) but are dimmed under normal conditions, leading to PWM effects you also see in cheap dimmable bulbs. It's especially noticeable for me on those "fancy" lights that try to evade blinding other cars (and end up blinding bikes and pedestrians) and those animating blinker/brake light setups.
This is a timely article for me since I'm building an indoor light display and the client specifically mentioned flicker. It's supposed to be a cosy warm light setup (that animates to show the movement of our planets, each represented by a lantern in the building). They were so concerned about flicker they suggested using incandescent but I'd really like to use leds for the obvious reasons (power consumption, fire risk, lifetime, color choice).
What I chose is an ESP32 controller attached to WS2812B LEDs. It turns out these operate at a PWM of nearly 20 kHz, and my low-key tests confirm this. Even at the lowest dim level I can't detect any flicker when I move the LED quickly or move something quickly in front of it.
It's amazing to me that you can get off-the-shelf hardware with WLED installed that works at 20 kHz with these cheap RGB LEDs for less than leading brands like Philips Hue!
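For anyone curious, the bare-MicroPython version of this kind of setup looks something like the sketch below (pin number, pixel count, and the warm-white mix are my placeholders; WLED does far more, and the WS2812B's own internal driver performs the actual high-frequency PWM, the MCU just ships color data):

    # MicroPython on an ESP32 driving WS2812B pixels. The MCU only sends
    # 8-bit color data; the WS2812B's internal driver does the actual
    # high-frequency PWM. Pin number and pixel count are placeholders.
    from machine import Pin
    import neopixel

    NUM_PIXELS = 30
    strip = neopixel.NeoPixel(Pin(5), NUM_PIXELS)

    def warm_white(brightness):
        """Scale an incandescent-ish RGB mix (my approximation) by 0..1."""
        r, g, b = 255, 147, 41
        return (int(r * brightness), int(g * brightness), int(b * brightness))

    strip.fill(warm_white(0.1))  # low, cosy level
    strip.write()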
I can tell you that lights strobing exacerbate my migraines. Even 120 hertz from fluorescent lights will affect me. I have mitigated this in the past by adding incandescent lights in my office, or demanding to work near a window. LED lamps are no good, as another commenter posted, even the simplest ones strobe. Incandescent bulbs grow harder to find as time goes on. Progress?
The simplest LED sources running from AC mains power strobe at mains frequency, which is very visible and very annoying.
Fancy LED sources don't strobe at all. I'm using an LED panel intended for videography as a room light; any flickering could show up as scanlines in video, so most lights intended for that purpose are flicker-free.
The simplest ones always strobe at line frequency or double it (due to cheaping out on the power supply). That strobe is visible. Simple is bad with LED lights.
Find some not-too-cheap dimmable warm-colored bulbs. They won't be cheap, but they might contain both a high-frequency driver and phosphor afterglow, and my guess is you will not notice anything.
I can sometimes tell when a lamp is PWM'd if I look at it out of the corner of my eye. I suspect it may just be the cheaper, lower-frequency ones, but I can often see the flickering: a particular lamp I have does it on low brightness, and there's a particular shop near me where I can see it too.
As a flicker-sensitive person: the sad part of it is that to do this properly you need to have your LEDs on a proper inverter, so for most scenarios getting rid of the flicker means "get expensive light fixtures _and_ rewire their supply _and_ you can't use your existing AC mains anymore, nor can you use switches". The PWM is a cheap way to do dimming given the AC input of the grid. And it will be especially prevalent when you do want LEDs but you don't want to "do anything special" to make them work well
And effectively useless for dimming in the upper half of the intensity range.
You could of course turn LEDs on/off in an exponential fashion, but that would result in an impractically large light in order to dim properly, and with increased cost (it's much cheaper to assemble fewer, more powerful LEDs than many smaller ones).
I think there are many problems with LED general-purpose lighting. They are too bright, the colours are wrong, and there are other issues. I prefer to have windows so we can use the light from the sun outside, and when that won't do, to use incandescent lights which are not too bright. However, not everyone will do that, and they put too much light outside at night too.
LED does have uses, such as many indicator lights (although they should not make them blue unless the other colours are already used; blue indicator lights are too common) and some kinds of displays. I think LED is not very good for general lighting, Christmas lights, etc.
Utility frequency varies depending on location. That's why we have the NTSC and PAL standards (60 vs 50 Hz) for flicker-free video under various artificial light conditions.
The second image is just interference with the camera chip's sampling frequency, usually eliminated by a mechanical shutter in photography.
They list the PWM frequency in the bulb specifications? That's news to me.
A few months ago I went through most of the bulbs in my house and replaced nearly all of them with LIFX bulbs. I had spent quite some time trying to figure out which bulbs would have the least flicker and knew from my more DIY setups[0] that PWM frequency is the cause.
I deal with Migraine somewhat regularly and PWM flicker/strobe lights amplify the pain when I'm dealing with one.
Nearly every smart bulb I've grabbed incorporates such a miserably slow PWM setting that dimming the bulb to 1% results in lighting that's reduced by only about 25%. It becomes clear when you set it to 1% that the manufacturer couldn't extend the "off" cycle any further or the bulb would begin resembling a strobe light.
I haven't tested all of the more expensive variants, but I also had a really hard time finding any "from the manufacturer" information about PWM frequencies. I've also never encountered an incandescent drop-in that uses anything other than PWM (I wasn't even aware there were fixtures that do otherwise).
[0] Experiments? Magic-smoke generators? Sometimes-almost-house-fires? I'm no electrical engineer.
I thought this article was a parody of the people who think they're being poisoned by wifi until I read the comments here.
I started looking into it; these poor people are paying hundreds of dollars for "flicker measurement" devices that cannot reliably tell you how the light source you're measuring is controlled.
Even incandescent running on AC has flicker. It's funny when that's used as the gold standard. LED running on DC has less flicker than normal incandescent.
An LED running on continuous DC has no flicker, but PWM is not continuous DC, it's a square wave of some frequency.
Incandescents have analog inertia in the filament which smooths the light output from the AC sine wave. This smoothing is not 100%, but I've never met anyone who can detect it without equipment.
A photocell and an oscilloscope will show the smoothed line-frequency wave (I wouldn't call it a "flicker"). The wave delta is relatively higher in the perceptual range as the voltage is lowered to approach the minimum "glow-activation" threshold of the filament -- i.e. the fluctuation is more noticeable when the bulb is dimmed to nearly off.
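A crude model of that filament smoothing, treating the filament as a first-order thermal lag on the sin^2 power waveform (the 100 ms time constant is a guessed ballpark, and real light output rises much more steeply than linearly with temperature):

    import numpy as np

    # Electrical power in the filament follows sin^2 of the mains; light
    # output tracks filament temperature, crudely modeled as a first-order
    # lag with an assumed ~100 ms time constant.
    fs, f_mains = 100_000, 60
    t = np.arange(0, 0.5, 1 / fs)
    power = np.sin(2 * np.pi * f_mains * t) ** 2  # 120 Hz power ripple

    tau = 0.1
    alpha = (1 / fs) / tau
    temp = np.zeros_like(t)
    for i in range(1, len(t)):
        temp[i] = temp[i - 1] + alpha * (power[i] - temp[i - 1])

    def pp_over_mean(x):
        return 100 * (x.max() - x.min()) / x.mean()

    steady = temp[len(temp) // 2:]
    print(f"power ripple: {pp_over_mean(power):.0f}%, "
          f"filament-side ripple: {pp_over_mean(steady):.1f}%")
    # ~200% in, a few percent out: the filament is the filter.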
I don't know about the health risks and harms, but it sure as hell is annoying if nothing else. I don't think it's due to PWM specifically, but the light in my fridge strobes at the mains frequency, so it "samples" my arm as I move it around in there when picking stuff out. But 50 Hz is extremely low - so it looks like as if my arm's movement "stuttered". Super jarring.
Not sensitive to this, thankfully, so apart from making me act like a diva and pissing me off, it doesn't affect me. But I sure wish I understood the EE side of it all so that I could properly avoid all these lights, at least in my own home.
I wonder if some of the problem may be "beating": the PWM frequency of two lights may be too high to affect anyone, but if they are different, then where they overlap you will see a pulse at the difference between the frequencies. Surely that would be a very obvious problem though.
I have a phone camera app where you can set the shutter speed to 1/10,000 s. It becomes super obvious then, and you can actually somewhat compare how fast the flicker is that way.
Entirely anecdotal and just personal experience but I get eye strain and headaches from "flickery" LEDs. Cheap shitty room lights. Replace them with good bulbs (Philips Hue) which strobe at a much faster rate and hey less eye strain and less headaches.
I also just hate hate hate seeing the flicker in my peripheral vision.
It's difficult for PWM-sensitive Mac users right now, as the majority of Apple devices for years have had rough PWM and of course there is no hardware alternative.
I'm stuck on a MacBook from years ago because the only current MacBook I can buy is the Air line, which I'll probably buy soon to replace my aging 2018 MacBook.
No currently for sale iPhone is PWM-free. The iPhone 11 (non-Pro) was the last mainstream device Apple made with a PWM-free backlight. The SE 3 (2022) was also PWM-free, but is no longer available from Apple beyond what stock is still around.
I have LEDs in my home office. The "temperature" and this flicker were driving me bonkers. Fortunately no headache. Now I have them all pointed away to reflect off the wall or ceiling, or placed behind diffusers. Much less bothersome.
Lightbulbs are cheap & eyes/brains are not... home office? It may be worth taking care of yourself and saving the cheap bulbs for places where you don't spend most of your waking hours.
The 1981 film Looker, written and directed by Michael Crichton, features a trope: the Looker gun, which induces a trance in its targets via flashes of light, such that they are unaware of the passage of many hours of time.
Suspicious of "DC dimming". If you can just lower the current to an LED to dim it, everyone would. Someone will know better than me, but I believe there is a kind of threshold voltage for the (solid-state) LED.
I am not aware of LED bulbs (and here I am talking about home lighting, not phones or laptops) that dim by shutting down some of the (multiple) LEDs.
Most home lighting bulbs appear to have several LED elements. A circuit could enable dimming by simply shutting some of them off — running the rest full-on. 50% dim would of course shut half the LEDs off. No PWM required.
DC dimming LEDs is relatively easy, and somewhat common. The problem is that it's expensive compared to PWM dimming. It requires more expensive current-adjustable circuitry.
Additionally, bulbs used in regular household fixtures basically need a way to convert TRIAC-chopped 50/60 Hz AC into constant current... which makes things even more expensive. Smart bulbs that are supplied constant, non-chopped AC can do it more easily, but it's still expensive to do DC dimming.
I guess there is some threshold below which the LED turns off so the voltage/current -> light function needs to be set accordingly.
When I was in high school we were messing around with liquid nitrogen and overvolting LEDs, and noticed the odd effect that the color of the LED changes if you overvolt it. It was years before I found out why.
Voltage, yes. Current, no not really. You can drive extremely low currents and still get photon emissions from LEDs. That said, it's highly non-linear, so you basically need to assign set points. Doubling the current won't double the lumen output.
You can just lower the current. Not everyone does because it generally requires more expensive components, e.g. inductors. There is a threshold voltage ("forward voltage") needed for LEDs to turn on but there's no threshold for minimum radiant flux. LEDs are actually more efficient at low current (although this might be counteracted by greater losses in the power supply).
It takes one MOSFET to turn an LED on/off from an MCU GPIO, but if you want DC dimming, you now have to either add more passive components or turn to a special IC, both of which cost more.
You can dim LEDs that are running on DC (it requires more than a potentiometer, I guess; probably a buck circuit controlled by a pot) or on AC; I have scant idea how the AC ones work, although variacs have existed for a really long time. But you have to buy special LED bulbs that can handle being on a dimming circuit.
This is different than a bulb like Hue etc. that can dim itself through whatever mechanism.
Traditional dimmers used TRIACs. Those don't dim LEDs well; they make very visible flicker. TRIACs turn the AC off for part of the waveform, essentially a very slow version of PWM. With an incandescent filament that flicker isn't as noticeable, since the filament takes some time to cool down and stop glowing, which visibly smooths the flicker; it just stabilizes around a lower temperature. With LEDs, the turn-off is nearly instant, so you visibly see the flicker at the AC mains frequency.
There are two ways to dim an LED: supply less current at the same voltage, or PWM dim it with a fast enough switching speed that you don't notice the flicker (this being slower than it needs to be is what the article is about). A current source is pretty easy to build, and doesn't flicker, but it does dissipate all the excess energy as heat. That's not what you want inside the dimmer switch in your wall, it can be quite a lot of heat and would be a fire hazard in such a confined area. It does work for things like photography lamps which can have exterior heat sinking.
> but it does dissipate all the excess energy as heat.
No. That's only true for a linear regulator, which is just one, very terrible, implementation of a current source that's only used for very low power applications. Linear regulators are never used for things like room illumination.
The alternative, and what's used for all commercially available DC LED drivers (plentiful and cheap), is to just use a regular AC->DC switching supply in current mode (current for feedback rather than voltage feedback). The only flicker is the ripple left in the filtered output.
Why aren't these used? Because most dimmer switches use tech from the incandescent age and just chop off parts of the AC sine wave, so the bulbs are designed around the switches you can buy in the store. Why do dimmer switches chop? Because that's what the bulbs you can buy at the store expect, and some are damaged if not dimmed the way they expect.
You can buy in wall DC dimmer switches from any LED supply store, but they require DC lighting, also only found at LED supply stores. It's entirely a very recent momentum problem, that's slowly going away.
Linear regulators are in fact used for room lighting, and efficiency can be reasonably good. Typical design is AC input -> bridge rectifier -> passive low-pass filter -> long string of LEDs with a single linear regulator in series. Voltage drop across the regulator is much lower than across the string of LEDs so there's not a whole lot of heat generated.
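Rough arithmetic for the topology described above, with all values assumed:

    # Rectified mains -> long series LED string -> small linear regulator.
    # Efficiency is roughly string voltage over total voltage, so the
    # regulator's drop barely matters. All values assumed.
    V_PER_LED = 3.0
    N_LEDS = 50
    V_REGULATOR = 12.0   # headroom burned in the linear pass element

    v_string = N_LEDS * V_PER_LED
    print(f"efficiency ~ {v_string / (v_string + V_REGULATOR):.0%}")  # ~93%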
You can dim LEDs running on AC by converting to DC and then adjusting the current limit of the switching power supply. No flicker, but more expensive components.
We often talk about screen time and eye strain, but we rarely mention the quality of ambient light. Low-CRI, high-flicker LED lights may not cause eye strain immediately, but they can wear on your eyes over time.
"To understand why PWM bulbs have so much flicker, imagine them being controlled by a robot arm flicking the on/off switch thousands of times per second. When you want bright light, the robot varies the time so the switch is in the 'on' mode most of the time, and 'off' only briefly. Whereas when you want to dim the light, the robot arm puts the switch in 'off' most of the time and 'on' only briefly."
It's entirely fine if the rate is high enough, but lowering the frequency of the PWM and using smaller inductors (or even no inductor at all) is a prime way to make the bulbs cheaper.
This is the reverse, actually: you can use much smaller inductors the higher the switching frequency. That's why GaN chargers are so much smaller, for example.
Smaller relative to what the frequency requires: you can cheap out both by using lower-frequency components _and_ by using a smaller inductor than you should for that lower frequency (or, again, by not using one at all, because the frequency is still higher than the eye can directly perceive and you think that's all that matters).
Wouldn't it suppress the flicker if you're recording at 60 fps and the lights flicker at a multiple of that (e.g. 120 Hz in the USA)? At least that's my experience recording CRT monitors: you set the camera to a multiple of the refresh rate and the flicker is gone from the video.
I used this method to check some of the lights in my house a few years ago.
The slow-mo video mode on my phone used a rolling shutter which captures one row of pixels at a time, meaning you could see the flicker in part of the video, even when it’s a multiple of 60 Hz or above 240 Hz. The flicker and camera frequencies also aren’t exactly synced up, so you can see the dimmed parts move across the screen.
You can get a pretty good idea of frequency, depth of flicker, and if the LED’s colors are flickering in sync from this, and I can confirm that Philips LEDs, specifically the EyeComfort series, are good.
> *Up to 15–50% slower decision-making in offices with high flicker (and high CO₂)
It just throws me right off the argument in an article when the fine print notes that a cited study confounds the thing the author cares about ("sensitivity to flicker") with a much simpler and better-understood explanation (CO₂ poisoning)
I tried my best but once it starts citing different sources as providing hard numbers and then not linking to the sources... and of course they're selling something. Might need a citation on that claim that iPhones don't use PWM.
I had to return an iPhone because the OLEDs they use flicker so badly. The only iPhones that don’t flicker are the SEs, because they use old-school LCD screens. But of course they got rid of the SE. So now I’m stuck on this old SE3 until I can find a different phone that doesn’t flicker, because as of now ALL iPhones flicker.
The frequency of 239 Hz is relatively low, so sensitive users will likely notice flickering and experience eyestrain at the stated brightness setting and below.
There are reports that some users are still sensitive to PWM at 500 Hz and above, so be aware.
> The frequency of 239 Hz is relatively low, so sensitive users will likely notice flickering and experience eyestrain at the stated brightness setting and below.
Do you have a source for this claim that 239 Hz is low enough to be noticeable by some measurable fraction of people? People report being sensitive to all kinds of things that end up repeatedly failing to reproduce empirically when it's put to the test (e.g. WiFi and MSG), so that there's a PWM sensitivity subreddit is not the evidence that TFA thinks it is.
The source that TFA links to backing up the idea that between 5% and 20% of people are sensitive to PWM flickering is a Wikipedia article which links to a Scientific American article which does not contain the cited numbers, and even if it did the study it discusses was researching the significantly slower 100 Hz flickering of fluorescent bulbs.
They mention this in the Results section:
> For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate.
Yes, that's an interesting source that at least shows that our eyes can perceive things at those high frequencies, but I'm not sold that it generalizes.
The study actually demonstrates that perception of flicker for regular PWM does in fact trail off at about 65 Hz and is only perceptible when they create the high-frequency edge by alternating left/right instead of alternating the whole image at once.
It looks like the situation they're trying to recreate is techniques like frame rate control/temporal dithering [0], and since this article is now 10 years old, it's unclear if the "modern" displays that they're talking about are now obsolete or if they actually did become the displays that we're dealing with today. From what I can find OLED displays do not tend to use temporal dithering and neither do nicer LCDs: it looks like a trick employed by cheap LCDs to avoid cleaner methods of representing color.
It's an interesting study, but I don't think it redeems TFA, which isn't about the risks of temporal dithering but instead claims harms for PWM in the general case, which the study you linked shows is not perceived above 65 Hz without additional display trickery.
What they are trying to do is recreate a situation more similar to the actual light sources in computer screens and TVs (flicker rates that vary across different pixels/areas). They are saying that the commonly reported 65 Hz threshold was measured on light sources that flicker uniformly, which is not the case for actual screens. It is not about dithering.
Basically the claim is that when the flicker frequency varies across the image, the frequency required to appear flicker-free is much higher.
No, they're specifically contrasting two types of displays and identify that the traditional way of measuring flicker effect does work for the traditional displays, regardless of image complexity:
> Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays.
> In contrast, modern display designs include a sequence of coded fields which are intended to be perceived as one frame. This coded content is not a sequence of natural images that each appears similar to the preceding frame. The coded content contains unnatural sequences such as an image being followed by its inverse.
What's unclear to me 10 years down the road is if the type of display they're worried about is common now or obsolete. "Modern" in 2015 could be the same as what we have today, or the problems the study identified could have been fixed already by displays that we would call "modern" from our reference frame.
I don't know enough about display tech to comment on that, but they're very clear that if your display is showing frames in sequence without any weird trickery that the research method that gets you a 65 Hz refresh rate is a valid way to test for visible flickering.
EDIT: Here's another quote that makes the contrast that they're setting out even more clear:
> The light output of modern displays may at no point of time actually resemble a natural scene. Instead, the codes rely on the fact that at a high enough frame rate human perception integrates the incoming light, such that an image and its negative in rapid succession are perceived as a grey field. This paper explores these new coded displays, as opposed to the traditional sort which show only a sequence of nearly identical images.
It's possible that this is actually a thing that modern displays have been doing this whole time and I didn't even know it, but it's also possible that this was some combination of cutting-edge tech and cost-saving techniques that you mostly don't need to worry about with a (to us) modern OLED.
That is just the motivation; the experiment itself is much more general and is not tied to display technology:
> The work presented here attempts to clarify “the rate at which human perception cannot distinguish between modulated light and a stable field”
Otherwise, they would have tested the dithering directly on the full image. Here they are testing a simpler model: varying flicker raises the flicker-free frequency requirement (due to eye movements). This would apply to dithering, but potentially to other situations as well.
You can't say "that is just the motivation", because the motivation is what dictated the terms of the experiment. I read the whole study: the contrast between the two types of displays permeates the whole thing.
They repeatedly say that the goal is to measure the effect of flickering in these non-traditional displays and repeatedly say that for displays that do not do the display trickery they're concerned about the traditional measurement methods are sufficient.
You're correct that they do demonstrate that the study shows that the human eye can identify flickering at high framerates under certain conditions, but it also explicitly shows that under normal conditions of one-frame-after-another with blank frames in between for PWM dimming the flickering is unnoticeable after 65 Hz. They go out of their way to prove that before proceeding with the test of the more complicated display which they say was meant to emulate something like a 3D display or similar.
So... yes. Potentially other situations could trigger the same visibility (I'd be very concerned about VR glasses after reading this), but that's a presumption, not something demonstrated by the study. The study as performed explicitly shows that regular PWM is not perceptible as flicker above the traditionally established range of frame rates, and the authors repeatedly say that the traditional measurement methods are entirely "appropriate" for traditional displays that render plain-image frames in sequence.
EDIT: Just to put this quote down again, because it makes the authors' point abundantly clear:
> The light output of modern displays may at no point of time actually resemble a natural scene. Instead, the codes rely on the fact that at a high enough frame rate human perception integrates the incoming light, such that an image and its negative in rapid succession are perceived as a grey field. This paper explores these new coded displays, as opposed to the traditional sort which show only a sequence of nearly identical images.
They explicitly call out that the paper does not apply to traditional displays that show a sequence of nearly identical images.
The paper's motivation is to explore the new coded displays, and they do that by exploring one aspect they care about. That aspect is very specifically well-defined, and if you want to show whether a given display has the same effect or not, then you need to look into it. But at no point is the experiment itself tied to any particular display tech.
I mean, they are not even using a screen during the study, they are using a projector. How are you going to even make the claim that this is display technology specific when it is not using a display?!
Did you actually read the study? I assumed you did and so I read every word so I could engage with you on it, but it's really feeling like you skimmed it looking for it to prove what you thought it would prove. It's not even all that long, and it's worth reading in full to understand what they're saying.
I started to write out another comment but it ended up just being a repeat of what I wrote above. Since we're going in circles I think I'm going to leave it here. Read the study, or at least read the extracts that I put above. They don't really leave room for ambiguity.
Edit: I dropped the points on the details, just to focus on the main point. Rest assured that I read the paper, I was arguing in good faith, and after a bit more thinking I understand your criticism of my interpretation. I don’t think the criticism that the research cannot generalize is warranted, considering the experimental design, but we aren’t going to agree on that. The difference in our thinking seems to be the probability of a similar effect showing up in daily life. I know the projector was emulating the coded display, but my point is that it was reasonably easy to do, and the same setup could conceivably show up in other ways. Not to mention that the researchers specifically said all the displays in their office had the effect, so it is common among displays themselves.
I think if we continue talking, we will keep running in circles. So let’s drop the details of the research: it is there, we can both read it. Here is what I have been trying to convey since the beginning:
- If you think the (original) article is an ad, with the writing not up to scientific standards: sure, I am ambivalent about the article itself
- If you think the gist of the article and their recommendation is wrong, I mildly disagree with you
- If you think LED flicker affecting people is in the same ballpark as the concerns about WiFi or GMOs, I violently disagree with you.
LEDs are new, so research on high-frequency flicker is not too numerous, but the few studies that exist generally point to a higher perception threshold than previously thought. As for the health effects, I believe that part is more extrapolation than research (since those studies can only come after the more generic research on perception). So the final assessment is: how bad was the article in presenting the information the way it did?
> People report being sensitive to all kinds of things that end up repeatedly failing to reproduce empirically when it's put to the test (e.g. WiFi and MSG), so that there's a PWM sensitivity subreddit is not the evidence that TFA thinks it is.
I know you’re looking for large sample size data, but PWM sensitivity absolutely exists, and I wish it didn’t. The way my eyes hurt in less than a minute while looking at an OLED phone (when I can handle an LCD phone for hours just fine) is too “obvious”. This occurs even on screens I didn’t know were OLED till I got a headache, btw.
(I’m also insanely sensitive to noticing flicker and strobe lights - I say a light flickers, everyone disagrees, I pull out the 240fps mode on my phone… and I’ve never been proven wrong yet.)
I used a 240 Hz PWM-dimmed monitor for a couple of years and I adjusted, but when I switched to a flicker-free one, it was very noticeable and bothersome to use the old one. Even though it's not perceptible when looking at a fixed point, when moving one's eyes around the after-images are easy to see. Even 1000 Hz PWM-dimmed LED strips are easy to notice when looking around the room. The light is basically being panned across one's retina like a panoramic camera/oscilloscope, logging its brightness versus time.
Even if the iPhone was flicker free, holding the iPhone itself throws all that out the window with all the addictive colors and notifications and badges
With the lofty claims of "health risks", I was disappointed to find no sources linked at the bottom of this article (correct me if I'm wrong).
Based on my personal experience, I think "health risk" is an overstatement: bad PWM can be uncomfortable (Geneva Airport had particularly egregious lights that started flickering in your peripheral vision), but I doubt there are any long-term effects of it.
Reading further down, a few other comments [1][2] have stated this better than me.
I'm interested in this topic and stay on top of most threads/discussions about PWM.
> I doubt there are any long-term effects of it.
I would have thought the same, but it seems to be a common experience that once someone becomes PWM sensitive it actually sticks with them.
I've been a techy my whole life; the iPhone 12 mini seemed to be the device that triggered my PWM sensitivity and since then I have been extremely sensitive to any device with PWM.
Although I have tried to keep PWM devices out of my life, I can still quickly tell when the TV in a lobby or the airplane entertainment display has PWM and there's not much you can do about it.
Can anyone recommend flicker-free LEDs (E14) to order from Amazon.de?
Incidentally, I found some LEDs to be extremely annoying, but the flicker would sometimes just disappear on its own or after turning the light switch off and on. What could cause this?
I would certainly agree that finding LED bulbs that you like and/or that don't bother you can take some work (especially if you want to put them on a dimmer, in which case you may also need to replace your dimmer). However, I am skeptical that subtle PWM flickering is unavoidable. For the chateau example, wouldn't it be better to choose bulbs with fewer lumens and run them at 100%?
I wonder about this too. If I have a dimmer and a LED bulb, does putting the dimmer all the way up still use PWM? I have a hunch that it still does, but would love to be proven wrong.
We had these fluorescents in our computer lab at school. They were light yet dark. On yet off. Crazy. Some weird color or flickering frequency. If you sat there for a couple of hours you would start to stink. Like, a weird stink. Some speculated that it did something to your glands.
The health stuff seems more like woo to me, but subjectively speaking: for a while I had a OnePlus phone with linear dimming, and it was easily the most pleasant to read at its lowest brightness setting, while simultaneously being able to go even dimmer than any other phone I've ever used.
The gamma curves got a bit messed up, but when it's that dim it's not like I expect stellar color accuracy anyway.
There is a strong and widespread tendency to view anything artificial as highly dangerous. I understand this perspective, but on the other hand, we have science and reasoned arguments.
I've seen this repeated many times but never seen any evidence for it. At typical PWM frequencies the perceived brightness is just the average brightness of the wave. I believe this myth arose from people driving low-brightness indicator LEDs using PWM for increased efficiency over simple current-limiting resistor circuits. People saw the energy savings from less waste heat in the resistor and somehow confused it with something happening in the eye.
I don't think this is true. The typical uneven LED spectrum causes poor color rendering accuracy, but human color perception is highly inconsistent anyway. Think of the blue/white dress people were arguing about ( https://en.wikipedia.org/wiki/The_dress )
PWM lights generate a lot of radio frequency interference. Nobody seems to care except for us ham radio operators who can’t enjoy the radio anymore. (It’s also a reason we lost AM radios in cars)
I know that people anecdotally report complaints about flicker and it's plausible to me that there could be an effect, but the way this piece is written reminds me distinctly of similar essays about WiFi sickness, MSG, and GMOs.
It identifies a "health risk", describes the mechanism in terms that sound very convincing, assigns numbers to its cause and effects, provides a table grading health risks of various products, all without linking to a single scientific study demonstrating that the effect is anything other than nocebo. The closest they come is a image of a table that refers to a few institutions that apparently did a study related to PWM (leaving it an exercise to the reader to find the studies they're supposedly referencing) and a link to a Wikipedia page which links to a Scientific American article which says:
> In 1989, my colleagues and I compared fluorescent lighting that flickered 100 times a second with lights that appeared the same but didn’t flicker. We found that office workers were half as likely on average to experience headaches under the non-flickering lights. No similar study has yet been performed for LED lights. But because LED flickering is even more pronounced, with the light dimming by 100% rather than the roughly 35% of fluorescent lamps, there’s a chance that LEDs could be even more likely to cause headaches.
I'm willing to entertain the idea that LED flicker is actually problematic, but I wish essays like this would be honest about the degree of confidence we have given the current state of the evidence. This piece instead takes it as a given that there's a problem, to the point where they confidently label devices on a scale of Low to Extremely High health risks.
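For reference, the "percent flicker" / modulation-depth number behind the 100%-vs-~35% comparison in that quote is simple arithmetic. A sketch; the 0.48 floor for fluorescents is my own illustrative pick to land near the quoted ~35%:

    def modulation_pct(l_max, l_min):
        """Percent flicker: (Lmax - Lmin) / (Lmax + Lmin) * 100."""
        return (l_max - l_min) / (l_max + l_min) * 100

    print(modulation_pct(1.0, 0.0))    # 100.0 -- a PWM LED that turns fully off
    print(modulation_pct(1.0, 0.48))   # ~35   -- fluorescent-style dip (assumed floor)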
There doesn't need to be a health risk for it to be annoying. I personally dislike PWM and I'll continue to personally dislike it even if it's proven safe. Fortunately it's easy to find non-flickering LED lights.
I am not questioning that certain types of flickering are harmful, so the fact that there's an IEEE standard for how to safely use PWM does not contradict what I said.
What I'm asking for is for articles like this that cite numbers and provide tables purporting to quantify the degree of harm caused by various devices to point to where they're getting their numbers from or, if they can't do that, stop making up numbers and assigning things to "harm" scales that they invented themselves based on vibes.
Either there's a study showing that 246 Hz flickering poses "Extremely High" health risks or there isn't.
Was it an astronomically high health risk to watch a TV set that flickers at 60 Hz, or movies that flicker at 48 or 72 Hz? (Film is 24 frames per second, but you'd perceive a lot of flicker at that rate, so the projector shutter has 2 or 3 blades.)
Can you please cite the page number where this definition exists? When I search for "extreme" in the standard that the other commenter links to I don't turn anything up, so I'm unclear where that classification is defined.
That does not define the scale that they're using. That's a typical hazard analysis risk matrix which has two axes which can be converted into a 4-point scale (Low, Medium, Serious, High). Importantly, to do a risk assessment in the style of IEEE 1789's you have to identify the specific Hazards that you're analyzing, which TFA does not claim to be doing in that table, instead speaking vaguely of "health risks". IEEE 1789 does not provide a mechanism for evaluating "health risks" without specifying exactly which risks are being evaluated.
You can see on page 27 how this is meant to be used: it should produce a per-hazard matrix.
You might be thinking of Figure 18 on page 29, which does identify Low-risk and No-effect regions by Modulation % and Frequency, but that also does not claim to identify high-risk regions, it just identifies the regions we can be highly confident are safe. And importantly, as a sibling comment notes, TFA's table actually contradicts the line on Figure 18, labeling several devices as higher than Low even when they're squarely within the Low-Risk and No-Effect zones.
They list the 'Xiaomi 15 Ultra' as having a 'Moderately High' health risk, and cite it as having a 2.16 kHz PWM frequency at 30-75% modulation depth.
The IEEE article has recommended practices that state:
8.1.2.3 Example 3: PWM dimming
Using Figure 20, the recommended practice for PWM dimming at 100% modulation depth is that the frequency satisfies f > 1.25 kHz. This can also be derived using Recommended Practice 1 and solving 100% = 0.08×fFlicker. This level of flicker could help minimize the visual distractions such as the phantom array effects.
Seems like even at 100% mod depth, >1.25 kHz is just fine.
Also, the article does not seem to make any distinction for modulation at reduced brightness, which the IEEE document specifically calls out as something that is unlikely to cause issues. E.g., movie theaters using film all flicker at 48 Hz and nobody complains about that.
Sure, PWM light can cause health risks for some people, in some contexts. But taking research out of context is bad science.
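If you want to sanity-check a device against the practice quoted above, the low-risk line reduces to a one-liner. This is my own sketch of just the Mod% <= 0.08 x f rule quoted there, not the whole standard, which has more cases:

    def max_low_risk_modulation_pct(freq_hz):
        """Highest modulation depth (%) the IEEE 1789 low-risk line allows
        at a given PWM frequency (capped at 100%)."""
        return min(100.0, 0.08 * freq_hz)

    for f in (120, 240, 1250, 2160):
        print(f"{f:5d} Hz -> up to {max_low_risk_modulation_pct(f):5.1f}% depth is low-risk")
    # At 1.25 kHz even 100% depth clears the line; the 2.16 kHz / 30-75% depth
    # phone cited above sits comfortably inside it.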
Do you genuinely believe the Pixel 7 and 8 Pro have an "extremely high health risk", in the context of what a lay person would understand?
Edit: I specify 'lay-person' because clearly this is an introductory blog post (or advertisement for Daylight Computer). If they want to use a more specific definition of health risk, then they better define it.
The “very/moderately high” comes from the standard itself, which quantifies it. In that context, it is about the probability of having issues, while the effect (mild to catastrophic) is another axis. Considering that they stick to the “official” wording, and seeing the criticism, I am not even sure they could change to something more “lay-person” friendly and be acceptable to all the critics.
The standard also links to the relevant research in its discussion.
Please read it, instead of just randomly throwing things out and hoping they support your argument.
You can't just point people at a 60-page paywalled standard and say "the supporting evidence to my claim is somewhere in here, I pinky promise". You are the one making assertions, it's on you to prove that the standard actually does reflect the text of TFA. I'm not going to read the whole standard because I'm not the one making the argument and I can't be bothered doing the research needed to refute every piece of nonsense science that shows up on the internet. What I can do is point out when someone is making unsourced claims and insist that they provide their sources if they want to be taken seriously.
Cite the exact page number and quote that you claim justifies the assertion that 246 Hz PWM carries an "extremely high" health risk. Then we can talk.
Look, they sourced their claims (quite literally: they show how they calculate, and from which standard). And linking to the correct document is literally how scientific citation works -- I replied with the page number above anyway.
If you want to redo the numbers and check whether they fit the definition, please feel free to do so, but you will need to put some work in (since the flicker Hz -> risk shown in the article is a computed value, you need to find the modulation value and plug it in too)
I understand your fight and your idea, I am just saying that in this specific instance, this is not a fight worth fighting. The article is generally correct, and if you want to complain about the writing style or about it being an ad, that's up to you. But this is not the same situation as the GMO stuff.
> Look, they sourced their claims (quite literally, they put how they calculate, from which standard).
No, they said that IEEE 1789 also uses Modulation % (which they've renamed Flicker %) to calculate risks. That is pointedly not the same thing as claiming that they used IEEE 1789's formulas.
You're reading their copy generously, but that doesn't usually pay with marketing copy. Articles like this always like to wave in the general direction of official-sounding sources while carefully refraining from actually claiming that they got their numbers from anywhere in particular.
> This made them useful for astronomers because their light was easily filtered
The wavelength is so specific that it can have all kinds of cool applications:
https://youtu.be/UQuIVsNzqDk?si=R4VUDCfC6zcHd4XC
I already know this is the Mary Poppins Corridor Crew video before clicking. Super cool video.
Really the worst part is that the bulbs are usually glued/ultrasonically welded together, so you kind of have to destroy one to open it up.
>EyeComfort LEDs have a high Colour Rendering Index, meaning that your home's furnishings appear in high definition and true colour.
What is this world coming to, that you have to buy some top-range lamps just to see the inside of your own home in true colour...
But those bulbs are rated for 15,000 hours. Let's assume they all lie and deflate that by 1/3 to 10,000 hours (maybe a power surge will hit a few years in). That single $3 bulb still saves you 10 x $10 = $100 in electricity costs vs incandescents over its useful life. A bit more if you pay California electricity rates, a bit less if you live near some hydro co-op. But the difference is large enough that the effect holds no matter where you are.
So yeah, top-range lamps give better results than the cheapo stuff, but top range isn't that much more expensive, and the lifetime savings of going to LED are hard to ignore -- op-ex vs. cap-ex if you will.
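Same arithmetic as a toy calculator, using the deflated 10,000-hour LED lifetime from above (prices and rates are assumptions):

    def lifetime_cost(watts, hours, bulb_price, price_per_kwh=0.20):
        """Purchase price plus electricity cost over the stated hours."""
        return bulb_price + watts * hours / 1000 * price_per_kwh

    hours = 10_000  # deflated LED lifetime; ~10 incandescent lifetimes
    inc = lifetime_cost(60, hours, bulb_price=10 * 1.00)  # ten $1 bulbs (assumed price)
    led = lifetime_cost(8, hours, bulb_price=2.88)
    print(f"incandescent: ${inc:.2f}  LED: ${led:.2f}  saved: ${inc - led:.2f}")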
Personally, I'd pay a lot more in electricity costs to have light that has full spectrum output. The Waveform lights I bought are about $40/bulb, and they're nicer than the Philips I tried, but they're still not as nice as a regular full spectrum incandescent.
But I also live in a small NYC apartment, so I don't have your typical suburban house with 20+ light fixtures to deal with, I only have 6.
Incandescents have other advantages. For example, in winter, when it is cold and also dark, the heat can be beneficial. In summer you should not need the light so much, since there is already light. Either way you should not need to use the light too often, and if you do not use it too often then you save energy that way too, and it does not need to be replaced as often.
> But those bulbs are rated for 15,000 hours
and they last 1000 hours. Technology has evolved. Also the methods to take your money.
You're suggesting that LED light bulbs need replacing every year, which hasn't been my experience (like, at all). I switched over to LED bulbs 10 or so years ago and haven't had to replace a single one yet.
I’ve got outdoor LED lights that fail constantly. So often that I keep dozens of them in storage to replace them as they die. Much less reliable than the incandescents they replaced. In fact, I have a string of about 50 sockets; about half are still incandescents that have survived for 10+ years, and the other half are LEDs that I have to keep replacing. Sadly, whenever an incandescent goes, I have to replace it with the crappy LED version, so eventually it will be 100% crap.
I really hate modern technology sometimes. I have nothing against being more energy efficient, running cooler, lasting longer, but we're losing some great things along the road.
I have to pay 3x the price for a CRI>90 LED w.r.t. a CRI>80 one. At least the price difference brings better light quality regardless of CRI (soft start, dimmability, even less flicker, better light distribution). On the other hand, I'm happy that I can get halogen bulbs if I really want to.
The problem comes from losing past frames of reference. We say "we're at 99% of benefit parity with the previous generation", but these 1% losses compound every generation, and now we live a more efficient, but arguably less comfortable, life.
A couple of Technology Connections (this guy is nuts when it comes to LEDs, in a good way) videos on the subject:
https://www.youtube.com/watch?v=qSFNufruSKw
https://www.youtube.com/watch?v=tbvVnOxb1AI
I would rather buy an incandescent light (even if I have to pay 3x or 5x) which is not as bright as the LED (forty watts or possibly even lower should be sufficient; I have a few 40W incandescent lights and they are good enough), and then not turn it on in the daytime when it is light outside.
(Unfortunately, other people where I live like to turn on the light even in the day time and that bothers me.)
You can buy an E27 halogen bulb around 50 watts (which would be around 100W incandescent) and pair it with a universal dimmer.
It'd provide nice warm light, and would allow you to flood the space with bright light if the need arises. Neither is expensive. Halogen bulbs are also CRI 100, so their color rendering is no different from incandescent bulbs.
Turning on lights when you have ample sun is not a wise choice, I agree.
You don't, and you probably don't want to. Daylight for true color; at nighttime you want amber LEDs.
I have the same issues. I actually had to return an iPhone because the OLED screens they use are so bad and gave me migraines.
For bulbs tho I found this site that tests tons of them for flicker.
https://flickeralliance.org/
It was a godsend and I was able to get some Ikea bulbs with zero flicker and they’ve been great. So at least my house isn’t a flickering mess. Now I just gotta figure out a phone that’s not garbage.
How did you survive office buildings, airports, department stores, etc. prior to LEDs? I am by no means sensitive to flicker, but the fluorescent tubes used in pretty much every office building and large department store before LEDs became common used to flicker like crazy in comparison. And there was always at least one that was broken and flickered on and off very slowly (complete with the characteristic sound).
Not well. But at least I had windows at work to counteract it and I could turn the lights off. I also was a contractor for 15 years which meant working from my home mostly.
Strangely though, there is something about PWM flicker, especially the deep-cycle kind (basically 100% on, then 100% off), that is super bad for me. I can look at an old CRT fine because it's not completely on and off as it draws its scanlines. But PWM is like flicking a light switch rapidly, and it gives me the worst headaches.
Thanks for the link! I bought a Moto G Power 5G (2024). It has an LCD screen. I've been using this line of phones for years. It's nothing fancy, but at least I can actually look at it.
If you are sensitive, I would recommend getting an Opple Light Master. It's a tiny and cheap flicker meter.
Does this also affect Macbooks?
In my testing no, the MacBooks are fine. It’s the phones that are a problem. Easiest way to tell is get a camera app you can set the shutter speed to 10,000 fps and point it at a screen. If there are black lines across the screen there is a PWM issue. The thicker and darker the black lines the worse it is.
10,000fps, or 1/10,000 of a second?
Asking because I don't have a Phantom at home.
The latter. But it’s the preview you’re looking at. You’re not actually taking a photo.
A camera app can't set a phone camera to 10,000 fps. So the question is what does it do?
Due to rolling shutter, you take a “progressive” photo. As a result, if the screen flickers during that time, you see this change in light intensity as horizontal bars.
Thicker bars means a lower PWM frequency, hence lower quality light/brightness control.
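A rough rule for turning the picture into numbers (the readout time is a guess; real sensors vary):

    readout_ms = 16.0   # assumed time for the rolling shutter to scan the sensor
    for pwm_hz in (240, 960, 2160):
        bands = pwm_hz * readout_ms / 1000   # dark/bright cycles captured per frame
        print(f"{pwm_hz:5d} Hz PWM -> ~{bands:.0f} bands (fewer, thicker bands = lower Hz)")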
I think it varies by model. The 14" & 16" MacBook Pros with the miniLED give me and many others PWM issues. On the other hand, the MacBook Air models with the notch don't seem to bother most people.
I have an Air and agree these are great for people sensitive to PWM like myself.
Yes, some MacBook models are affected.
The MacBook Air displays with the notch do not have PWM and do not seem to bother people. The 14/16" Pro models seem to be quite bad for most people (I had to return my new 14" model, it was rough).
The first PWM MacBook I bought was the 16" MacBook Pro from 2019 (the last Intel model). I'd had a 2018 MBP 15" and couldn't figure out why I just couldn't stand looking at the new 16". I thought I had a bad display but ended up learning about PWM.
That sucks; I feel your pain. I, too, strongly dislike overly bright lighting.
I wonder if there's room to at least engage with the neighbor to talk about friendlier light options? You might also be able to engage with these folks to see if there are efforts to improve the lighting in New York: https://darksky.org/
> I drive rarely, but the oncoming headlights are blinding when I do.
I drive a low car with old lights, and once I was blocked on a street by a much taller car sitting in front of me with very bright LED lights, and I couldn't see a thing because of the glare. I was unable to manoeuvre out of the way because of this. They sat there for a minute or so, stubbornly refusing to move, before finally getting out of the way.
It's super common where I live for teenagers who drive jacked-up trucks to replace their headlights with super-bright LEDs. They don't adjust the angle of the beam, so it's like they have their brights on all the time. It's miserable.
Also, more people seem to be driving with their bright lights on 100% of the time. I once rode as a passenger at night with an ex-coworker driving and I noticed he used his brights the whole time, even when there were oncoming cars. I asked him why and he looked at me like I was stupid and said “because they’re brighter and let me see better.” When I pointed out that they blind other drivers he just shrugged and said “fuck em, not my problem.”
My take is that PWM dimmers are dramatically more energy efficient than the old rheostat dimmers people used to use. If you operate a transistor in a digital mode where it is either on or off it is close to 100% efficient, but if you operate it in a 50% power mode you have to send 50% of the power to a load and the other 50% to a resistor. Thus CMOS logic eradicated bipolar, switching power supplies replaced linear power supplies, a Class D amplifier can be a fraction the size of a Class A amplifier, etc.
You could probably still reduce the flicker by either increasing the switching frequency or putting some kind of filter network between the switch and the load.
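Crude illustration of that trade-off. Idealized worst case where the output cap alone carries the LED current for each off half-period (real converters do better, but the scaling is the point; all values assumed):

    i_led = 0.35      # LED current in amps
    c_filt = 100e-6   # output filter capacitance in farads
    for f_hz in (200, 1_000, 20_000):
        t_off = 0.5 / f_hz             # off half-period at 50% duty
        dv = i_led * t_off / c_filt    # ripple from V = I*t/C on the cap
        print(f"{f_hz:6d} Hz -> ~{dv:5.2f} V of ripple on the output")

Raise the frequency and the same capacitor buys you orders of magnitude less flicker, which is why cheap low-frequency designs flicker so visibly.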
Old dimmers are triac based, with the potentiometer simply setting the trigger voltage, not doing the actual dimming. These were in fact very efficient.
For sure, they're definitely way more efficient. They just unfortunately give me migraines. I'd be open to trying some that have a filter network or some other smoothing on the flicker.
But I've also never lived in a house that has dimmers (they've all been old homes in the north eastern US) and I never use overhead lighting, so it's not something I need or would miss.
Have you ever tested various PWM frequencies? 50/60 Hz is very noticeable - but what if the PWM is switching at 1000 Hz? 5 kHz? There is presumably a rate at which it is imperceptible to you?
Apparently Philips Hue uses 500-1000Hz. I wonder if there's manufacturers that use a much higher rate.
Beyond the Hz, the depth of the modulation matters. I am sensitive to poor PWM implementation, but Hue bulbs luckily don't bother me.
On an old iPhone with basic slow-mo recording capabilities, typical Hue bulbs don't "blink" when the video plays back, but the PWM-dimmed iPhone in the same video recording was blinking/flashing like crazy.
~~
Another example of the PWM details mattering: I can't use any iPhone with OLED (anything from the X to current), but I am able to use a Note9 which has OLED with DC-style PWM.
What do you mean by “the depth of the modulation”? The relative brightness?
PWM at low duty cycles tends to be much more noticeable. But that’s where higher frequencies should solve the problem.
I was referring to the extent of brightness variation in the flicker -- while many focus on a higher frequency (Hz) to reduce eyestrain, the key factor is how the screen behaves during the off cycle.
Some PWM implementations ramp the brightness up and down slightly (easier on the eyes), while other manufacturers flip the switch on and off harshly (like strobing).
The shorter the time the screen spends dark between being lit, the shorter the pulse duration, and the pulse duration and depth matter more than the Hz.
Apparently fourth-generation LED tube lights are designed not to flicker.
https://en.wikipedia.org/wiki/LED_tube#History
Aside from that Wikipedia article, where 1 source is not available and the other one is in Finnish, there's pretty much nothing online.
I googled for G4 LED tube PWM and got products that say they are G4 LED tubes that use PWM.
Pretty sure 100% of LED products sold anywhere use PWM if you don't run them at full brightness. I sometimes walk around lighting stores with a slo-mo camera and see PWM in every price bracket.
It is always PWM under the hood; the question is how much was spent (or not) on the filtering network after the PWM. Is it closer to a buck converter, or is it straight-up flicker at the output?
Since these things have lots of LEDs, my first thought was to put a range of different tiny delays on them to induce destructive interference, so that the off parts of one LED's flicker are the on parts of another, to smooth out the overall output.
Actually that's not true, my first thought was "just use a layer of phosphor excited by the LEDs", but fluorescent tubes do that and people used to make the same complaints about flicker, so.
Looks like "flicker index" is a useful(?) search term, anyway.
You can also eliminate flicker by not using PWM, and instead using a high switching frequency DC power supply with a clean stable output current.
Obviously it costs more, but I wish manufacturers would just do it.
Conceptually, that's not far away from PWM + filter, except maybe for the closed loop.
Yeah that's true, LEDs do need a closed loop for current regulation in a high efficiency setup, so might as well use it for dimming too.
Personally, I don't care for more energy efficiency if my head is hurting after 30 minutes under that light. I can really see all the flickering when I blink or move my head.
Similarly, I prefer a Class A amplifier if I have the space, but I won't open that can of worms here.
Lights for high speed cameras use really good filtering on their PWM switching, or just linear power supplies. It would be nice to have a premium bulb that has longer life and much less flickering.
Philips' Eye Comfort series has virtually no flicker. I personally only use this series. They come in CRI>80 and CRI>90 variants. The latter is also dimmable and comes with soft on/off.
They are nice.
Soraa brand bulbs are a tier above Philips if anyone is interested in premium lighting.
I tried their healthy bulbs years ago and they were terrible. I've avoided the brand since then. I think they have since discontinued the healthy line.
Unfortunately we don’t have them here. However, there are some good offerings from OSRAM.
Right there with you. I just dropped an amount of money I'm unwilling to admit on replacement bulbs throughout -- only this time I was replacing WiFi RGBW LED bulbs in rooms that had lower-end bulbs (almost everything on the market).
Incidentally, I went with LIFX -- I had purchased their bulbs back when they were the only realistic option besides Philips Hue for smart RGBW bulbs[0]. Still seems those two brands produce the most flicker-free variety.
[0] LIFX was a handful of lumens brighter at the time and didn't have a hub requirement
There is hope: https://www.congress.gov/bill/119th-congress/house-bill/3341
That’s horrid. I can’t imagine going back to crappy incandescent bulbs today. We don’t need to import more disposable junk.
Well, no one is forcing you to buy incandescents, but I certainly would be happy to buy them if they're available.
However, I think I'm probably out of luck -- LEDs are cheaper and they likely don't break as much in shipping. So even if I personally find LEDs to be worse than incandescents -- they don't render reds properly, so even something simple like skin tones don't have the depth they once did, plus they give me migraines -- I likely won't be finding them on shelves anywhere near me ever again.
> disposable junk
On a long enough timeline, everything is disposable. And what is "disposal", really? How many LED bulbs are actually getting properly recycled, and isn't it true that the materials in incandescent bulbs are less harmful, relatively speaking, than those in LEDs?
I have never heard of anyone recycling an incandescent bulb, but recycling bulbs became a big deal when CFLs came out, and most people seem to be in the habit in my area. Stores have bulb take-backs for CFL/LED in the entryway, for example.
I don't like LED bulbs, but I think they clearly win the disposal/economical argument against incandescent in every way. Unfortunately they blink and have poor color reproduction in many versions.
I guess what I'm saying is, in the case that it's not recycled, are you better off with an incandescent in a landfill, or an LED in a landfill?
Then you should probably reframe your question around the approximate number of incandescent bulbs a person would buy during the typical lifetime of an LED bulb. On the low end, it's 20.
Don't forget the costs & emissions related to manufacturing and transporting all twenty of those incandescent bulbs.
As much as I like the old bulbs, they're unlikely to "win" in this question unless you are wanting to ignore the major lifespan difference.
Economically speaking, one can go to Dollar Tree and spend $1.25 and get a two pack of LED bulbs that will save 38 other bulbs from the manufacturing stream and landfill. Seems obvious?
> I totally can tell the reds are off.
How do the reds look to you?
I looked at the photometric reports from a couple Waveform models on their website and the R9 (saturated red rendering) was in the 90s for both with tint almost exactly on the blackbody line. The 2700K did have a bit worse R9 than the 4000K so I could imagine it doesn't look exactly like an incandescent.
I mean I hate to be like "vibes", but, kinda vibes. There's just something about the light coming out of the Waveform LED I have in one lamp in my living room versus the incandescent I have on the other side of the room. Definitely not a scientific take!
I did at one point randomly put the LED in different configurations when I first got it and my wife was able to pick out which lamp had the LED in it every time. They just have a different feel, even if the temperature rating is around the same as the incandescent and the R9 was the highest of the LEDs I evaluated. At least these Waveform LEDs don't give me migraines though.
"Vibes" are fair. I just put flashlights with two incandescent-like LEDs (Nichia 519A and Nichia B35A in 2700K) and I can see a slight difference in how they render colors even though the spectrophotometer says all the major metrics are within a couple points of each other.
What is the most logical explanation for the difference then?
Looking closely at the measurements, when the B35A has an advantage on individual CRI samples, it's usually a larger gap than when the 519A has an advantage. They're both in the 90s for R1-14, and it takes a keen eye to tell the difference.
I recently learned about the Color Rendering Index, which sounds like pseudoscience but apparently is not. Here's a handy table I used for buying lights; again, the domain sounds grifty, but it's a searchable table :shrug: [0].
[0]: https://optimizeyourbiology.com/smart-light-database
CRI is absolutely real. It's an old and relatively simplistic metric with several potential successors. The chart you linked uses one: TM30, which is based on the average of 99 different colors instead of CRI's 8.
There are seven extended samples for CRI (R9-R15) not included in the average. LEDs often do particularly poorly on R9, a measure of saturated red rendering. LED sources with high R9 usually advertise it separately.
Tint, or blackbody deviation (Duv), is also important to the look of light and is listed on the chart, though not for every model. These numbers are very small but important: anything outside of +/- 0.006 is not white light according to ANSI. +0.006 looks very green, and -0.006 looks very pink. Interestingly, after acclimating for a few minutes, most people think very pink looks better and more natural than neutral white[0]. Most people do not like green tint.
[0] https://www.energystar.gov/sites/default/files/asset/documen...
You need high CRI lighting - that's the key to proper colors.
I don’t have the migraines, but everything else you described is spot on. Sitting outside at night is much less enjoyable due to neighborhood LED lighting. If I could, I’d shoot them all out.
If you're savvy with manufacturing, make yourself a left-handed Edison thread (I can't find them anywhere). Left-handed incandescent light bulbs are still legal.
Also, you can buy high-wattage lights, and three-way bulbs have lower-wattage settings.
Finally, outdoor and appliance incandescent lamps are very inefficient, but they last forever.
I wonder to what degree their effects go beyond migraines. Perhaps irritability? Mental confusion? Anxiety? I'm spitballing here, but it does seem like our world has somehow become a place of more anxiety and irritation... I would love for it to be something we could take control of.
i doubt it's the light bulbs. I posited the other day, by assembling a few different ideas, that Trauma Based Entertainment is to blame for this. Something like 2/3rds of all television programming is law-enforcement adjacent. True Crime is super popular on TV: Law & Order, NCIS, FBI this-and-that. And what's one of the largest advertising cohorts?
Medicine for depression, anxiety, insomnia...
it's nearly a closed loop; something i intuitively realized shortly after 2001/09/11 - by the end of that year i decided i would no longer have a "Television" attached to CATV/SAT/ANT service.
I'm not sure if i am correct, i haven't really dedicated a lot of time to getting the exact numbers, talking to psychologists and sociologists and the like. But two people i know had "breakdowns" (grippy sock) in the last month and both of them always have true crime on TV in the background or listen to true crime podcasts. Shortly after that happened i was listening to the moe facts podcast where Moe used the term "trauma based entertainment" and something clicked - Moe didn't mention "it's because of pharma ads" - that's my own input after having worked for the largest television "broadcast" company in the world, just long enough to see the advertiser "dinner".
The only ones watching traditional OTA TV anymore are elders. That advertising cohort is why OTA TV ads are filled with pharmaceuticals and "you may be entitled to financial compensation" type ads, at least where I'm at. Traditional TV has been dying since Youtube and broadband. MTV plays Ridiculousness constantly because no one is actually watching it.
> it's nearly a closed loop; something i intuitively realized shortly after 2001/09/11 - by the end of that year i decided i would no longer have a "Television" attached to CATV/SAT/ANT service.
Curiously this is about the same time I decided to give up on TV and radio as well.
It's definitely long lost its crown as the main way to watch video, but linear TV does still have a role. Apparently there's still the odd broadcast in the UK that means the national grid has to work to keep the frequency stable when everyone goes to put their kettles on in the ad breaks.
Live sports. Latency is so much lower on OTA television that you can tell who is watching a football match on UHF, cable or multicast IPTV, satellite and through unicast internet.
... but the content on streaming services isn't much different.
I don't know the content breakdown of online videos, but i know that creators like audit the audit are in the algo; and that's trauma based entertainment as well. the "here's the story and police interview of so-and-so", plus the news stations that have youtube presence.
Movies are another one, and lots of people watch movies. If i go on hulu or netflix and start tallying the genres (either TBE or not-TBE), what do we figure it will be?
The person i heard use the phrase "Trauma Based Entertainment" used it to describe movies that "we were sat down to watch when we were 9-12." Unfortunately the podcast i mentioned isn't super-advanced on the backend so i am unsure how to share clips at this point. But i've heard before the claim "young women as a demographic listen to true crime" repeated as a truism. I know the women close to me listened to this sort of content in the past or currently. I'm not trying to generalize this to the entire cohort.
also i only think, myself, that it's harmful, TBE/true crime/etc; i'm not a sociologist or psychologist.
Utter nonsense.
Verifiable data: Nielsen data has to be paid for, but Australia's is free and open. [0]
Over-75s are the largest FTA cohort.
[0] https://www.acma.gov.au/publications/2024-12/report/communic...
I think this article is pretty confused.
There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it, especially in battery-powered devices such as phones or laptops. Pulse modulation is common.
Human vision has a pretty limited response speed, so it seems pretty unlikely that PWM at a reasonable speed (hundreds of hertz to tens of kilohertz) can be directly perceived. That said, it can produce a stroboscopic effect, which makes motion look weird and may be disorienting in some situations. So I don't have a problem believing that it can cause headaches in predisposed individuals.
You can dim your laptop screen in a darkened room and wave your hand in front. Chances are, you're gonna see some ghost images.
Other than adjusting the frequency, pulse modulation can be "smoothed" in a couple of ways. White LEDs that contain phosphor will have an afterglow effect. Adding capacitors or inductors can help too, although it increases the overall cost. But that doesn't make the display "PWM-free", it just makes it flicker less.
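Both smoothing mechanisms (phosphor afterglow, an output cap) behave roughly like a single-pole low-pass, which makes the "flickers less, not PWM-free" point easy to see. The time constants below are assumptions:

    import math

    def ripple_pp(pwm_hz, tau_s):
        """Steady-state peak-to-peak ripple (0..1) of a 50%-duty square wave
        after a single-pole low-pass with time constant tau."""
        x = math.exp(-0.5 / (pwm_hz * tau_s))
        return (1 - x) / (1 + x)

    for tau_ms in (0.1, 1.0, 10.0):   # assumed afterglow / RC time constants
        print(f"tau {tau_ms:4.1f} ms -> {ripple_pp(240, tau_ms / 1000):4.0%} ripple at 240 Hz")
    # Longer afterglow knocks the ripple down, but it never reaches zero.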
The article is worse than confused. It's a marketing piece written in a way that sounds vaguely science-y but with only a tenuous basis in real research.
I wrote more here: https://news.ycombinator.com/item?id=44312224
> I'm willing to entertain the idea that LED flicker is actually problematic, but I wish essays like this would be honest about the degree of confidence we have given the current state of the evidence.
I am (and it seems others are too) very interested in this topic. I would appreciate it if you could write an article with "less confusion" so I can save it in my tumblog for future reference.
Yep, it's obvious that a lot of people are interested because junk articles like this usually get penalized harder. That interest is what this piece is preying on. Unfortunately it's easier to write a piece that tells people what they want to hear and spreads FUD than it is to write a piece that corrects the misinformation.
Much more rewarding too, because "we really don't know very much about this yet" is hard to expand to a full click-worthy essay and less likely to move product.
I think it’s pretty common for people to be able to perceive PWM flicker in their peripheral vision that they can’t when looking directly at the source. I encounter this fairly regularly myself.
Nah, LED lighting generally uses at least 200 Hz at a minimum. Some up to kHz. You can't perceive that. Older stuff or cheap quality might be using un-rectified AC/DC which you can see. Like cheap Xmas lights.
> at least 200 Hz at a minimum
> You can't perceive that
I very easily can. I had to get rid of an otherwise good monitor a few years ago before I knew it used PWM to control the backlight (and before I even knew PWM was used at all for this functionality — I only had experience with CCFL backlight before that).
It was really annoying to look at, like looking directly at cheap fluorescent lighting. Miraculously, setting brightness to 100% fixed the issue.
By googling around, I found that it used PWM with a modulation frequency of 240 Hz, with a duty cycle of 100% at full brightness, which explained everything.
I can also easily perceive flickering in one of my flashlights, the only one that uses PWM at a frequency of a few hundred hertz. Other flashlights either run at multiple kHz, or don't use PWM at all, and either one is much easier on the eyes.
Some of us really do perceive this stuff, which can be hard to believe for some reason.
Like back when CRTs were mainstream you'd have those computer labs with monitors set to 60-85Hz and most people wouldn't notice, but some would. I definitely did, I couldn't stand looking at a CRT set to less than 100 Hz for more than an hour.
Absolutely, I also had massive issues with them, ending up with red eyes and headaches within an hour of use. Getting my first LCD monitor (with a CCFL backlight) was out of this world.
Considering that people buy 360-500 Hz screens for noticeable improvements in video games, people can definitely perceive that.
Next time you see a high refresh screen, move the cursor around rapidly. It's very easy to tell.
That's 2~3ms per frame.
In gaming situations what they perceive may not be the actual "flicker" of frames but the input->to->display latency, which is a very different thing to notice.
But people don't get noticeable improvements from that.
The jumps from 30-40-60-72-144 are all pretty noticeable, but 144-240 is already very minimal and 240-360+ is pretty much quackery.
> Nah, LED lighting generally uses at least 200 Hz at a minimum. Some up to kHz. You can't perceive that.
My partner and I both notice the difference between cheap LEDs and expensive ones (Hue). Neither of us can pin down what it is.
This is due to different CRI (basically, how even light output is distributed across wavelengths), not due to PWM.
Hue bulbs actually have pretty bad CRI. They only claim >80, which almost any LED bulb is capable of these days. A good LED bulb (including those made by Philips these days) has a CRI >95.
Depends on the bulb. For those of us with "fast" eyes, some LED bulbs that are just fine for others are a subtly flickering infuriation generator.
You may not believe that people that can see 120->240Hz flicker exist, but we do. In this era of frequently-cheap-ass LED lighting, it's a goddamn curse.
The OP was specifically talking about hue. Hue does not change due to PWM.
> The OP was specifically talking about hue.
I presume this is why you think that?
> ...between cheap LEDs and expensive ones (hue)...
If so, they're referring to the Hue brand of bulbs, rather than the color property. More evidence for the fact that they're talking about flicker is that they quoted this to indicate that they were replying to it:
> > Nah, LED lighting generally uses at least 200 Hz at a minimum. Some up to kHz. You can't perceive that.
I had to shop around a LOT before I could find LED headlight bulbs with just a fan and a bare-ass resistor for current control so I wouldn't see that forsaken flicker.
It's really bad for me as I work in an LED and LASER facility. I handle ALL the PWM stuff while everyone else handles the simple led/resistor/connector board assemblies. EVERYTHING FLICKERS.
My thanks to you for shopping around for non-shit headlights. For a long, long while it seems like every third car had strobing headlights. (Now, the primary problem is INTENSELY bright and poorly-aimed headlights. I'm not sure which is worse, to be honest... but both SHOULD be super illegal.)
> EVERYTHING FLICKERS.
I absolutely could not handle that. My sincerest condolences.
> Nah, LED lighting generally uses at least 200 Hz at a minimum.
Eh, they use what they can get away with. Nobody is out there policing flicker rates. Especially when you add a dimmer into the mix, there's a lot of room between good and bad, and when you're at the hardware store buying bulbs, there's not much to indicate which bulbs are terrible.
Lots of people don't seem to notice, so the terrible ones don't get returned often enough to get unstocked, and anyway, when you come back for more in 6 months, everything is different even if it has the same SKU.
> there's not much to indicate which bulbs are terrible
https://lamptest.ru
Not only flickering, but lots of other information about internationally available brands, including cheap Chinese stuff: CRI, real power use, etc.
Use your favorite online translator.
> Nobody is out there policing flicker rates.
Actually, Energy Star and California's Title 24 have flicker standards. They may not go as far as some people like, but you can look for these certifications to know that a bulb at least meets a certain minimum standard.
200 Hz PWM on lighting is very noticeable.
I can see it with one lamp I own, and in one shop that I've noticed that is using LED strip lights
just shake your hand between the lights and your eye(s).
Hi there, flashlight nerd here.
There's a third way: a switched-mode power supply with regulated output current. This is used in most better-designed flashlights (which doesn't always correlate to price) and can be used by anything else that needs to drive an LED as well.
The article doesn't discuss what technique should be used for "constant current reduction"; it probably shouldn't be a linear regulator where efficiency is a priority.
PWM is less annoying if the frequency is very high (several kHz), though I'll leave it to people who research the topic to speak to health effects.
In the world of LEDs, a switched-mode power supply with regulated output current is called a "constant current driver". I assume that's what this calls "DC dimming".
A linear regulator might also reasonably be described as "DC dimming" or "constant current". The article is only concerned with flicker so it doesn't discuss efficiency.
It could be, but I doubt you’ll find any commercial LED driver that is a linear regulator, so I don’t think that’s what they mean.
It causes physical pain for many. You can't always see the flicker but your eye muscles react faster than your perception, and they get sore.
I had to get rid of a Samsung TV (120 Hz backlight; replaced it with a linearly dimmed Sony, nicer anyway...) due to this, and I can only use modern phones at 100% brightness, which disables PWM.
When I moved into my current residence I couldn't figure out why my eyes were always sore. I realized my landlord put in cheap LED can lights. I swapped them out for nicer ones, pain gone. People need to stop being cheap AF.
It makes me wonder how differently we all perceive the world around us, and if that impacts our decisions in more ways than we realize.
I've had to drastically change my devices after learning PWM was causing my vision issues and eye pain. My partner has no issues using those same devices.
> There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it. Pulse modulation is common.
Within the pulse modulation case, though, there are two important subcases. You can PWM a load that consists basically of just the LED itself, which acts as a resistive load, and will flash on and off at high rate (potentially too fast to be noticeable, as you say). But you can also PWM an LED load with an inductor added, converting the system into a (potentially open loop) buck converter. And this allows you to choose both the brightness ripple and the PWM frequency, not just have 100% ripple. Taking ripple down to 5%, or 1%, or less, is perfectly straightforward… but inductors are expensive, and large, so shortcuts are taken.
Higher PWM frequency will let you use smaller inductors.
Yes, but you need to at some point decide it’s worth adding an inductor to the BOM — trace inductance isn’t enough at any sane switching frequency (and FET switching losses kick in at the insane frequencies). And BOM item costs more than no BOM item.
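For a feel of the numbers, here's a napkin sketch (all component values are my assumptions, not from the thread) using the standard buck-converter ripple relation, peak-to-peak ripple = (Vin - Vout) x D / (L x f):

```python
# Napkin sketch, all values assumed: inductor ripple in an open-loop
# buck LED driver.
V_IN = 12.0       # supply voltage, V
V_LED = 9.0       # forward drop of the LED string, V
F_SW = 50_000.0   # switching frequency, Hz
L = 470e-6        # inductance, H
I_AVG = 0.35      # average LED current, A

duty = V_LED / V_IN                           # ideal buck duty cycle
ripple = (V_IN - V_LED) * duty / (L * F_SW)   # peak-to-peak ripple, A
print(f"duty cycle:       {duty:.2f}")        # 0.75
print(f"ripple current:   {ripple * 1000:.0f} mA p-p")   # ~96 mA
print(f"ripple / average: {100 * ripple / I_AVG:.0f}%")  # ~27%
# Doubling F_SW or doubling L halves the ripple; that's exactly the
# inductor-cost vs. switching-frequency trade-off described above.
```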
>There are two ways to dim LEDs: linear regulation and some sort of pulse modulation. Linear regulation is wasteful and you're pretty unlikely to encounter it, especially in battery-powered devices such as phones or laptops. Pulse modulation is common.
The other reason is LED brightness and color is quite non-linear with current, so PWM gives a much more straightforward dim than adjusting a current smoothly.
No; there are two main methods to change LED brightness: changing the duty cycle, or changing the current. PWM is one way to change the duty cycle, and linear regulation is one way to change the current. There are other ways to change either.
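Since LED output and perceived brightness are so non-linear (as the comment above notes), firmware that dims via duty cycle usually goes through a gamma lookup table. A tiny sketch; the 2.2 exponent and the 10-bit PWM register are illustrative assumptions, not anything from the thread:

```python
# Gamma-correction sketch; exponent and PWM range are assumed.
GAMMA = 2.2
PWM_MAX = 1023   # e.g. a 10-bit PWM duty register

gamma_table = [round(PWM_MAX * (i / 255) ** GAMMA) for i in range(256)]

# "Half brightness" as requested (128/255) maps to ~22% duty, not 50%:
print(gamma_table[128], "/", PWM_MAX)   # about 225 / 1023
```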
I wonder if it’s possible to have randomized sub-cycle interval variation so as to “spread out the spectrum” of the PWM signal while preserving the same resulting integrated brightness over the observable timescale.
Basically like how Apple laptop fans work, but for temporal modulation of signal instead of spatial modulation of fan blade gaps.
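For what it's worth, that kind of spread-spectrum dimming is easy to sketch. A toy version (all numbers assumed): jitter each cycle's period while holding the duty cycle fixed, which smears the PWM energy across a band of frequencies but leaves the long-run integrated brightness unchanged.

```python
# Toy spread-spectrum PWM sketch; all values assumed.
import random

BASE_PERIOD = 1 / 1000     # nominal 1 kHz
JITTER = 0.3               # +/-30% period variation
DUTY = 0.4

on_time, total = 0.0, 0.0
for _ in range(100_000):
    period = BASE_PERIOD * (1 + random.uniform(-JITTER, JITTER))
    on_time += period * DUTY   # duty is applied per cycle, so the
    total += period            # long-run average brightness is unchanged
print(f"integrated brightness: {on_time / total:.4f} (target {DUTY})")
```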
Even if your PWM is faster than your eye's response, it's trivial to see how one is still affected: namely, the stroboscopic (stop-motion) effect.
That effectively lowers the frequency of the LCD.
There are other ways to dim LEDs. Voltage stepping to control how much current can actually flow through the diode is one method (and this is how we test LEDs where I work, a 3V LED gets tested at ~2.4-2.6V. It will barely draw a couple milliamps despite the power supply set to allow an amp of draw.) The LED will light up, enough to see that it is working well, but not eye-searingly bright.
Voltage control of LEDs is extremely unreliable. The amount of current consumed at a specific voltage will vary wildly depending on temperature, process, etc. Nobody does voltage control for LEDs unless you do not care about consistent brightness at all.
"Voltage control of LEDs is extremely unreliable."
No, no it isn't. We can keep units within half a percent of starting output all day long.
Voltage control is done explicitly on laser diodes, to boot. And those are WAY MORE FINICKY than an LED.
> No, no it isn't. We can keep units within half a percent of starting output all day long.
On what, exactly? How can you possibly guarantee output using purely voltage control of an LED? LEDs (and laser diodes) are fundamentally current controlled devices. You need current feedback to set the output brightness operating points.
> Voltage control is done explicitly on laser diodes, to boot. And those are WAY MORE FINICKY than an LED.
Maybe if you don't care about the output power of the laser diode. Just not practical, and will change output power at the same voltage as temperature changes.
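A quick back-of-envelope with the ideal diode equation shows the scale of the problem (the saturation current and ideality factor below are assumed ballpark values, not measured): current is exponential in voltage, so a few tens of millivolts of supply or process variation swing the current dramatically, and temperature shifts the forward voltage on top of that.

```python
import math

def led_current(v, i_s=1e-18, n=2.0, t_kelvin=300.0):
    """Ideal diode equation; i_s and n are assumed ballpark values."""
    vt = 8.617e-5 * t_kelvin          # thermal voltage kT/q, volts
    return i_s * (math.exp(v / (n * vt)) - 1)

i1 = led_current(2.90)
i2 = led_current(2.95)                # just 50 mV higher
print(f"+50 mV multiplies the current by ~{i2 / i1:.1f}x")  # ~2.6x
```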
I personally doubt PWM is visible at all at high enough speeds (kilohertz+) even in peripheral vision. I would love to see a good study on this.
I can't see PWM flickering even at 30Hz. But it gives me bloodshot, twitching eyes even at 20kHz.
"Human vision has a pretty limited response speed, so it seems pretty unlikely that PWM at a reasonable speed (hundreds of hertz to tens of kilohertz) can be directly perceived."
I make power drivers. I have this ultra-tiny one, with output scope captures. It produces ~825 kHz PWM output, single-digit mV and mA ripple, and 94+% efficiency depending on input voltage (output is input minus 0.2 V).
I can induce saccades in my eyes at will and at high speeds. Couple that with waving my hand in front of my face as I do it, add in human visual persistence, and I can get artifacting that reveals flicker even at that high a PWM rate. Only direct battery power fails to induce that artifacting effect in my vision when I do that combination of movements.
PWM is not to be confused with Switch Mode Power Supplies in the general case
Isn't this article conflating PWM flicker with cheap AC rectifiers causing LEDs to flicker because they periodically, at 50 Hz, get under the voltage they need to emit any light? I can't see why light fixtures in buildings—except modern ones where the lights are actually dimmable—would even have any sort of PWM.
Yes, it absolutely is. Those are candle-style bulbs that use filament LEDs; they normally just use a basic rectifier and a regulator chip that does basic linear regulation of the peak of the sine wave.
They flicker at 100 Hz due to the rectification (or 120 Hz in 60 Hz countries, of course).
Dimmable bulbs use higher frequencies.
There is a pendant light in my entryway that came outfitted with LED filament bulbs when I bought my home. They flicker noticeably, and I would have replaced them all if it were in my living room.
One of the bulbs recently burned out, and I picked up a replacement at Menards. Even though it was just a basic Sylvania, the new one clearly has a rectifier circuit as it does not exhibit any flickering that I can detect.
So anecdotally at least, the cheap bulbs without rectifiers seem to be going away from the big box stores (although I’m sure you can still get them with unpronounceable all-caps names from Amazon).
I’ve only recently seen mains flicker from speciality bulbs, small filament style ones and odd fittings with low production rates.
The quick test is to wave your hand quickly in front of the light.
> I can't see why light fixtures in buildings—except modern ones where the lights are actually dimmable—would even have any sort of PWM.
It is done because, like most crappy things in the world, it saves somebody, somewhere, a few cents on the dollar.
Most people would not be able to tell the CRI impact of DC dimming vs PWM. Many do not visibly notice the difference. (I unfortunately do, and you won’t believe how many expensive Mercedes and similar cars flicker).
But high frequency PWM is slightly more difficult and expensive, and DC dimming might need a few more capacitors or inductors… so let’s save a buck there, shall we?
I've wondered about PWM flicker when I started trying to figure out why so many modern car headlights seem like they are strobing to me.
Initially I thought it might be related to the alternator.
I still don't know why I perceive these headlights as having an annoying flicker. I'd love it if some (informed) commenter could clear it up for me. Am I imagining it?
Car headlights seem to really cheap out on the PWM flicker. Even the 2-euro LEDs I buy at the discount store seem less flickery than the lights of some luxury cars. I thought it could be that people were buying the cheapest replacement bulbs they could get their hands on, but then I saw the same thing on a new BMW.
I also believe some people are just more affected by flicker than others. Some get headaches or migraines from working under PWM light, others don't even notice.
I'm not a mechanic, but I believe these car lights are capable of achieving some pretty high brightness (necessary for fog lights etc.) but are dimmed under normal conditions, leading to PWM effects you also see in cheap dimmable bulbs. It's especially noticeable for me on those "fancy" lights that try to evade blinding other cars (and end up blinding bikes and pedestrians) and those animating blinker/brake light setups.
This is a timely article for me since I'm building an indoor light display and the client specifically mentioned flicker. It's supposed to be a cosy warm light setup (that animates to show the movement of our planets, each represented by a lantern in the building). They were so concerned about flicker they suggested using incandescent but I'd really like to use leds for the obvious reasons (power consumption, fire risk, lifetime, color choice).
What I chose is an ESP32 controller attached to WS2812B LEDs. It turns out these operate at a PWM frequency of nearly 20 kHz, and my low-key tests confirm this. Even at the lowest dim level I can't detect any flicker when I move the LED quickly or move something quickly in front of it.
It's amazing to me that you can get off-the-shelf hardware with WLED installed that works at 20 kHz with these cheap RGB LEDs for less than leading brands like Philips Hue!
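As a rough idea of how simple such a setup is in code, here's a minimal MicroPython sketch (the pin number, pixel count, and warm-white RGB values are my assumptions). The WS2812B's integrated driver does the high-frequency PWM itself; the firmware only scales 8-bit colour values.

```python
# MicroPython on an ESP32; pin, count and colours are assumed.
import machine
import neopixel

PIN, N = 5, 30                              # data pin and pixel count
strip = neopixel.NeoPixel(machine.Pin(PIN), N)

def warm_white(level):
    """Scale a candle-ish RGB value by level (0.0 to 1.0)."""
    r, g, b = 255, 147, 41                  # assumed warm tint
    return (int(r * level), int(g * level), int(b * level))

strip.fill(warm_white(0.1))                 # low, cosy brightness
strip.write()                               # push colours out to the strip
```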
I can tell you that lights strobing exacerbate my migraines. Even 120 hertz from fluorescent lights will affect me. I have mitigated this in the past by adding incandescent lights in my office, or demanding to work near a window. LED lamps are no good, as another commenter posted, even the simplest ones strobe. Incandescent bulbs grow harder to find as time goes on. Progress?
> even the simplest ones strobe
The simplest LED sources running from AC mains power strobe at mains frequency, which is very visible and very annoying.
Fancy LED sources don't strobe at all. I'm using an LED panel intended for videography as a room light; any flickering could show up as scanlines in video, so most lights intended for that purpose are flicker-free.
> even the simplest ones strobe
The simplest ones always strobe at line frequency or double it (due to cheaping out on the power supply). Those have visible strobe. Simple is bad with LED light.
Find some not-too-cheap dimmable warm-colored bulbs. They won't be cheap, but they might contain both a high-frequency driver and phosphor afterglow, and my guess is you will not notice anything.
I can sometimes tell when a lamp is PWM'd if I look at it out of the corner of my eye. I suspect it may just be the cheaper, lower-frequency ones, but I can often see the flickering: a particular lamp I have when it's on low brightness, for example, and in a particular shop near me.
As a flicker-sensitive person: the sad part of it is that to do this properly you need to have your LEDs on a proper inverter, so for most scenarios getting rid of the flicker means "get expensive light fixtures _and_ rewire their supply _and_ you can't use your existing AC mains anymore, nor can you use switches". The PWM is a cheap way to do dimming given the AC input of the grid. And it will be especially prevalent when you do want LEDs but you don't want to "do anything special" to make them work well
Let's take a bulb, and instead of one LED we have 10 less powerful LEDs in one package.
Now you can turn LEDs off one at a time, get dimming in 1/10th steps, and no PWM.
The same could be done with LCD backlighting or edge lighting on displays. Additional complexity, to be sure, but no power loss.
OLED is, well... OLED. Not sure what to do there.
Except that perceived illumination is log, not linear ;)
All the better for low light situations then.
And effectively useless for dimming in the upper half of the intensity range.
You could of course turn on/off leds in an exponential fashion, but that would result in an impractically large light to be able to dim properly, and with increased cost (much cheaper to assemble fewer more powerful leds than many smaller ones).
I'd just wire some in quads, and some in pairs to save on complexity.
Then maintain a 1/10 lux range with the combinations. Note I'm not doing the math, just showing how simple it is to work around. It's all just napkin.
The cost isn't a biggie, if it's for a target market and shares the rest of the assembly.
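Running that napkin anyway, as a sketch (splitting the 10 LEDs into switchable groups of 1, 2, 3, and 4 is my assumption): every level from 1/10 to 10/10 is reachable with just four switches and no PWM.

```python
from itertools import combinations

groups = [1, 2, 3, 4]   # LEDs per switchable group, 10 LEDs total (assumed)
levels = sorted({sum(c) for n in range(1, len(groups) + 1)
                 for c in combinations(groups, n)})
print([f"{lvl}/10" for lvl in levels])   # ['1/10', '2/10', ..., '10/10']
```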
I think there are many problems with LED general-purpose lighting. They are too bright, the colours are wrong, and there are other issues. I prefer to have windows so we can get light from the sun outside, and when that won't do, to use incandescent lights which are not too bright. However, not everyone will do that, and they put too much light outside at night too.
LED does have uses, such as many indicator lights (although they should not make them blue (unless you have already used the other colours); but blue indicator lights are too common), and for some kinds of displays. I think LED is not very good for general lighting, Christmas lights, etc.
Utility frequency varies depending on location. That's why we have the NTSC and PAL standards (60 vs 50 Hz) for flicker-free video under various artificial light conditions.
The second image is just interference with the camera chip's sampling frequency. Usually eliminated by a mechanical shutter in photography.
They list the PWM frequency in the bulb specifications? That's news to me.
A few months ago I went through most of the bulbs in my house and replaced nearly all of them with LIFX bulbs. I had spent quite some time trying to figure out which bulbs would have the least flicker and knew from my more DIY setups[0] that PWM frequency is the cause.
I deal with Migraine somewhat regularly and PWM flicker/strobe lights amplify the pain when I'm dealing with one.
Nearly every smart bulb I've grabbed incorporates such a miserably slow PWM setting that dimming the bulb to 1% results in lighting that's reduced by only about 25%. It becomes clear when you set it to 1% that the manufacturer couldn't limit length of the "off" cycle further or the bulb would begin resembling a strobe light.
I haven't tested all of the more expensive variants, but I also had a really hard time finding any "from the manufacturer" information about PWM frequencies. I've also never encountered an incandescent drop-in that uses anything other than PWM (I wasn't even aware that there are fixtures that do otherwise).
[0] Experiments? Magic-smoke generators? Sometimes-almost-house-fires? I'm no electrical engineer.
I flagged because this is a submarine ad, but it was still interesting tbh.
I thought this article was a parody of the people who think they're being poisoned by wifi until I read the comments here.
I started looking into it; these poor people are paying hundreds of dollars for "flicker measurement" devices that cannot reliably tell you how the light source you're measuring is controlled.
Even incandescent running on AC has flicker. It's funny when that's used as the gold standard. LED running on DC has less flicker than normal incandescent.
An LED running on continuous DC has no flicker, but PWM is not continuous DC, it's a square wave of some frequency.
Incandescents have analog inertia in the filament which smooths the light output from the AC sine wave. This smoothing is not 100%, but I've never met anyone who can detect it without equipment.
A photocell and an oscilloscope will show the smoothed line-frequency wave (I wouldn't call it a "flicker"). The wave delta is relatively higher in the perceptual range as the voltage is lowered to approach the minimum "glow-activation" threshold of the filament -- i.e. the fluctuation is more noticeable when the bulb is dimmed to nearly off.
I don't know about the health risks and harms, but it sure as hell is annoying if nothing else. I don't think it's due to PWM specifically, but the light in my fridge strobes at the mains frequency, so it "samples" my arm as I move it around in there when picking stuff out. And 50 Hz is extremely low, so it looks as if my arm's movement "stuttered". Super jarring.
Not sensitive to this thankfully, so apart from making me act a diva and pissing me off, it doesn't affect me, but I sure wish I understood the EE side of it all so that I could properly avoid all these lights, at least in my own home.
I wonder if some of the problem may be "beating": the PWM frequency of two lights may be too high to affect anyone, but if they are different, then where they overlap you will see a pulse at the difference between the frequencies. Surely that would be a very obvious problem though.
An easy way to see PWM flicker (and distinguish cheap LED bulbs from better ones) is to wave your open hand in front of them.
If you see the strobe effect, return the bulb and buy another one.
I have a phone camera app where you can set the shutter speed as fast as 1/10,000 s. The flicker becomes super obvious then, and you can actually somewhat compare how fast it is that way.
Entirely anecdotal and just personal experience but I get eye strain and headaches from "flickery" LEDs. Cheap shitty room lights. Replace them with good bulbs (Philips Hue) which strobe at a much faster rate and hey less eye strain and less headaches.
I also just hate hate hate seeing the flicker in my peripheral vision.
I'm still using a 2018 MBP & iPhone SE 3 because newer Apple devices make my eyes hurt for the rest of the day after a few hours of use.
PWM is awful. I can tell within seconds of seeing a screen if it has PWM and usually I start to get eye issues within a few minutes.
Would you care to elaborate? How can you tell? Which eye issues? Thanks.
> The light wasn’t steady; the LEDs were flickering, pulsing on and off thousands of times per second to create the dimmed effect.
If it really is thousands, I don't think you have a problem.
Do computer screens flicker and release this bad light?
Yes, many do.
It's difficult for PWM-sensitive Mac users right now, as the majority of Apple devices for years have had rough PWM and of course there is no hardware alternative.
I'm stuck on a MacBook from years ago because the only current MacBook I can buy is the Air line, which I'll probably buy soon to replace my aging 2018 MacBook.
No currently for sale iPhone is PWM-free. The iPhone 11 (non-Pro) was the last mainstream device Apple made with a PWM-free backlight. The SE 3 (2022) was also PWM-free, but is no longer available from Apple beyond what stock is still around.
Some do, some don't. Sites like notebookcheck.net typically mention in their reviews whether a given laptop screen exhibits PWM.
Yes, Notebookcheck regularly measures PWM in displays: https://www.notebookcheck.net/PWM-Ranking-Notebooks-Smartpho...
For OLED I remember reading that PWM dimming is necessary because DC dimming causes shifts in color/whitepoint.
In the article they rank some smartphones by how much they flicker for dimming, I assume it's the same when computer screens dim?
I have LEDs in my home office. The "temperature" and this flicker were driving me bonkers. Fortunately no headache. Now I have them all pointed away, to reflect off the wall or ceiling, or behind diffusers. Much less bothersome.
Light bulbs are cheap and eyes/brains are not... home office? It may be worth taking care of yourself and saving the cheap bulbs for places where you don't spend most of your waking hours.
Fun tangent.
The 1981 film Looker, written and directed by Michael Crichton, features a trope: the Looker gun, which induces a trance in its targets via flashes of light, such that they are unaware of the passage of many hours of time.
https://en.wikipedia.org/wiki/Looker
I'm suspicious of "DC dimming". If you could just lower the current to an LED to dim it, everyone would. Someone will know better than me, but I believe there is a kind of threshold voltage for the (solid-state) LED.
I am not aware of LED bulbs (and here I am talking about home lighting, not phones or laptops) that dim by shutting down some of the (multiple) LEDs.
Most home lighting bulbs appear to have several LED elements. A circuit could enable dimming by simply shutting some of them off — running the rest full-on. 50% dim would of course shut half the LEDs off. No PWM required.
DC dimming LEDs is relatively easy, and somewhat common. The problem is that it's expensive compared to PWM dimming. It requires more expensive current-adjustable circuitry.
Additionally, for bulbs that are used in regular household fixtures, they basically need a way to convert TRIAC chopped 50/60Hz AC into constant current... which makes things even more expensive. Smart bulbs that are supplied a constant non-chopped AC can do it easier, but it's still expensive to do DC dimming.
I guess there is some threshold below which the LED turns off so the voltage/current -> light function needs to be set accordingly.
When I was in high school we were messing around with liquid nitrogen and overvolting LEDs and noticed the odd effect that the color of the LED would change if you overvolt it. It was years before I found out why
https://www.reddit.com/r/AskElectronics/comments/v28qbh/why_...
https://spectrum.ieee.org/a-definitive-explanation-for-led-d...
Voltage, yes. Current, no not really. You can drive extremely low currents and still get photon emissions from LEDs. That said, it's highly non-linear, so you basically need to assign set points. Doubling the current won't double the lumen output.
You can just lower the current. Not everyone does because it generally requires more expensive components, e.g. inductors. There is a threshold voltage ("forward voltage") needed for LEDs to turn on but there's no threshold for minimum radiant flux. LEDs are actually more efficient at low current (although this might be counteracted by greater losses in the power supply).
it's expensive, and in some way, less efficient.
It takes one MOSFET to turn an LED on/off from an MCU GPIO, but if you want to do DC dimming, you now have to either add more passive components or turn to a special IC; both cost more.
You can in fact dim leds. You can see a lot of controllers that are just that at various parts suppliers.
You can dim LEDs that are running on DC (it requires more than a potentiometer, I guess; probably a buck circuit controlled by a pot) or on AC. I have scant idea how the AC ones work, although variacs have existed for a long time; but you have to buy special LED bulbs that can handle being on a dimming circuit.
This is different from a bulb like Hue etc. that has the ability to dim itself through whatever mechanism.
Traditional dimmers used TRIACs. Those don't dim LEDs well; they make very visible flicker. TRIACs turn the AC off for part of the waveform, essentially a very slow version of PWM. With an incandescent filament that flicker isn't as noticeable since the filament takes some time to cool down and stop glowing, which visibly smooths the flicker; it just stabilizes around a lower temperature. With LEDs, the turn-off is nearly instant. You visibly see the flicker at the AC mains frequency.
There are two ways to dim an LED: supply less current at the same voltage, or PWM dim it with a fast enough switching speed that you don't notice the flicker (this being slower than it needs to be is what the article is about). A current source is pretty easy to build, and doesn't flicker, but it does dissipate all the excess energy as heat. That's not what you want inside the dimmer switch in your wall, it can be quite a lot of heat and would be a fire hazard in such a confined area. It does work for things like photography lamps which can have exterior heat sinking.
> but it does dissipate all the excess energy as heat.
No. That's only true for a linear regulator, which is just one, very terrible, implementation of a current source that's only used for very low power applications. Linear regulators are never used for things like room illumination.
The alternative, and what's used for all commercially available DC LED drivers (plentiful and cheap), is to just use a regular AC->DC switching supply in current mode (current feedback rather than voltage feedback). The only flicker is the ripple left in the filtered output.
Why aren't these used? Because most dimmer switches use tech from incandescent age, and just chop off parts of the AC sine wave, so the bulbs are designed around the switches you can buy in the store. Why do dimmer switches chop? Because that's what the bulbs you can buy at the store expect, sometimes damaging them if not dimmed as they expect.
You can buy in wall DC dimmer switches from any LED supply store, but they require DC lighting, also only found at LED supply stores. It's entirely a very recent momentum problem, that's slowly going away.
Linear regulators are in fact used for room lighting, and efficiency can be reasonably good. Typical design is AC input -> bridge rectifier -> passive low-pass filter -> long string of LEDs with a single linear regulator in series. Voltage drop across the regulator is much lower than across the string of LEDs so there's not a whole lot of heat generated.
They're never used for dimming, since that requires a voltage drop, which is the context here.
You can dim LEDs running on AC by converting to DC and then adjusting the current limit of the switching power supply. No flicker, but more expensive components.
Apparently dogs have a far higher flicker fusion threshold. So a light that might seem OK to you is like a disco strobe for your pooch.
Also, if you take a photovoltaic cell and hook it up to an audio jack, you can turn the unseen flicker in light into sound.
We often talk about screen time and eye strain, but we rarely mention the quality of ambient light. Low-CRI, high-flicker LED lights may not cause eye strain immediately, but they can wear on your eyes over time.
I had no idea about this:
"To understand why PWM bulbs have so much flicker, imagine them being controlled by a robot arm flicking the on/off switch thousands of times per second. When you want bright light, the robot varies the time so the switch is in the 'on' mode most of the time, and 'off' only briefly. Whereas when you want to dim the light, the robot arm puts the switch in 'off' most of the time and 'on' only briefly."
It's entirely fine if the rate is high enough, but lowering the frequency of the PWM and using smaller inductors (or even no inductor at all) is a prime way to make the bulbs cheaper.
This is the reverse, actually: you can use much smaller inductors the higher the switching frequency goes. That's why GaN chargers are so much smaller, for example.
Smaller relative to what the frequency requires: you can cheap out both by using lower-frequency components _and_ by using a smaller inductor than you should for that lower frequency (or, again, by not using one at all, because the frequency is still higher than the eye can directly perceive and you think that's all that matters).
Then congrats, you don't have the problem because to those of us who can notice it, PWM working this way is pretty obvious from first principles.
Can't we low-pass filter PWM ambient lights, to make them time-continuous?
I guess not at the circuit level because the higher peak signals are necessary to drive the light emission?
Anyone care to weigh in on this?
In the meantime, it occurred to me that PWM could stimulate a light-emitting phosphor for continuous-time ambient light applications, CRT-style.
You totally can, but the target quantity is average current, not average voltage. Since the LED is a diode, that's not the same.
"higher peak current signals", that is.
Thanks for the clarification.
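To put rough numbers on the filtering idea discussed above, here's a sketch (the PWM frequency and component values are all assumed): put the LC corner well below the switching frequency, and the second-order roll-off crushes the ripple while passing the brightness-setting average current.

```python
import math

F_PWM = 20_000.0   # PWM fundamental, Hz (assumed)
L = 100e-6         # inductance, H (assumed)
C = 100e-6         # capacitance, F (assumed)

f_c = 1 / (2 * math.pi * math.sqrt(L * C))      # LC corner frequency
print(f"filter corner: {f_c:.0f} Hz")           # ~1592 Hz
# Second-order filter: ripple at F_PWM is reduced by roughly (f_c / F_PWM)^2.
print(f"ripple at {F_PWM:.0f} Hz: ~{(f_c / F_PWM) ** 2:.4f}x of unfiltered")
```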
Use your phone's slow-motion video to check flicker. Philips makes some of the better LED bulbs out there.
Wouldn't the recording hide the flicker if you're recording at 60 fps and the lights flicker at a multiple of that (e.g. 120 Hz in the USA)? At least that's my experience recording CRT monitors: you set the camera to a multiple of the refresh rate and the flicker is gone from the video.
I used this method to check some of the lights in my house a few years ago. The slow-mo video mode on my phone used a rolling shutter which captures one row of pixels at a time, meaning you could see the flicker in part of the video, even when it’s a multiple of 60 Hz or above 240 Hz. The flicker and camera frequencies also aren’t exactly synced up, so you can see the dimmed parts move across the screen.
You can get a pretty good idea of frequency, depth of flicker, and if the LED’s colors are flickering in sync from this, and I can confirm that Philips LEDs, specifically the EyeComfort series, are good.
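If you want to go beyond eyeballing the footage, a rough flicker-frequency estimate can be pulled from the frames themselves. A hypothetical sketch (the file name is made up, and it assumes numpy plus a frame reader such as imageio are available; note that a 240 fps clip can only resolve flicker up to 120 Hz):

```python
import numpy as np
import imageio.v3 as iio   # assumed available; any video frame reader works

FPS = 240                                    # slow-mo capture rate (assumed)
frames = iio.imread("lamp_slowmo.mp4")       # hypothetical clip
brightness = frames.reshape(len(frames), -1).mean(axis=1)
brightness = brightness - brightness.mean()  # remove the DC component

spectrum = np.abs(np.fft.rfft(brightness))
freqs = np.fft.rfftfreq(len(brightness), d=1 / FPS)
print(f"dominant flicker component: {freqs[spectrum.argmax()]:.0f} Hz")
```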
> *Up to 15–50% slower decision-making in offices with high flicker (and high CO₂)
Just throws me right off the argument in an article when the fine print notes that a cited study is confounding the thing the author cares about ("sensitivity to flicker") with a much simpler and better-understood explanation (CO₂ poisoning).
I tried my best but once it starts citing different sources as providing hard numbers and then not linking to the sources... and of course they're selling something. Might need a citation on that claim that iPhones don't use PWM.
I had to return an iPhone because the OLEDs they use flicker so badly. The only iPhones that don't flicker are the SEs, because they use old-school LCD screens. But of course they got rid of the SE. So now I'm stuck on this old SE 3 until I can find a different phone that doesn't flicker, because as of now ALL iPhones flicker.
iPhone 16 Pro: screen flickering / PWM detected at 239 Hz, amplitude 15%.
https://www.notebookcheck.net/Apple-iPhone-16-Pro-smartphone...
The frequency of 239 Hz is relatively low, so sensitive users will likely notice flickering and experience eyestrain at the stated brightness setting and below.
There are reports that some users are still sensitive to PWM at 500 Hz and above, so be aware.
I always check notebookcheck.net for PWM stats.
For reference, the regular iPhone 16:
Screen flickering / PWM detected at 60 Hz, amplitude 25.75%, secondary frequency 487 Hz.
> The frequency of 239 Hz is relatively low, so sensitive users will likely notice flickering and experience eyestrain at the stated brightness setting and below.
Do you have a source for this claim that 239 Hz is low enough to be noticeable by some measurable fraction of people? People report being sensitive to all kinds of things that end up repeatedly failing to reproduce empirically when it's put to the test (e.g. WiFi and MSG), so that there's a PWM sensitivity subreddit is not the evidence that TFA thinks it is.
The source that TFA links to backing up the idea that between 5% and 20% of people are sensitive to PWM flickering is a Wikipedia article which links to a Scientific American article which does not contain the cited numbers, and even if it did the study it discusses was researching the significantly slower 100 Hz flickering of fluorescent bulbs.
Here is one: https://www.nature.com/articles/srep07861
They mention this in the Results section:
> For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate.
Yes, that's an interesting source that at least shows that our eyes can perceive things at those high frequencies, but I'm not sold that it generalizes.
The study actually demonstrates that perception of flicker for regular PWM does in fact trail off at about 65 Hz and is only perceptible when they create the high-frequency edge by alternating left/right instead of alternating the whole image at once.
It looks like the situation they're trying to recreate is techniques like frame rate control/temporal dithering [0], and since this article is now 10 years old, it's unclear if the "modern" displays that they're talking about are now obsolete or if they actually did become the displays that we're dealing with today. From what I can find OLED displays do not tend to use temporal dithering and neither do nicer LCDs: it looks like a trick employed by cheap LCDs to avoid cleaner methods of representing color.
It's an interesting study, but I don't think it redeems TFA, which isn't about the risks of temporal dithering but instead claims harms for PWM in the general case, which the study you linked shows is not perceived above 65 Hz without additional display trickery.
[0] https://en.wikipedia.org/wiki/Frame_rate_control
What they are trying to do is recreate a situation closer to actual light sources in computer screens and TVs (varying flicker rates from different pixels/areas). They are saying that the commonly reported threshold of 65 Hz was tested on light sources with uniform flicker, which is not the case for actual screens. It is not about dithering.
Basically, the claim is that when there are varying flicker frequencies, the requirement for a flicker-free frequency is much higher.
No, they're specifically contrasting two types of displays and identifying that the traditional way of measuring the flicker effect does work for the traditional displays, regardless of image complexity:
> Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays.
> In contrast, modern display designs include a sequence of coded fields which are intended to be perceived as one frame. This coded content is not a sequence of natural images that each appears similar to the preceding frame. The coded content contains unnatural sequences such as an image being followed by its inverse.
What's unclear to me 10 years down the road is if the type of display they're worried about is common now or obsolete. "Modern" in 2015 could be the same as what we have today, or the problems the study identified could have been fixed already by displays that we would call "modern" from our reference frame.
I don't know enough about display tech to comment on that, but they're very clear that if your display is showing frames in sequence without any weird trickery that the research method that gets you a 65 Hz refresh rate is a valid way to test for visible flickering.
EDIT: Here's another quote that makes the contrast that they're setting out even more clear:
> The light output of modern displays may at no point of time actually resemble a natural scene. Instead, the codes rely on the fact that at a high enough frame rate human perception integrates the incoming light, such that an image and its negative in rapid succession are perceived as a grey field. This paper explores these new coded displays, as opposed to the traditional sort which show only a sequence of nearly identical images.
It's possible that this is actually a thing that modern displays have been doing this whole time and I didn't even know it, but it's also possible that this was some combination of cutting-edge tech and cost-saving techniques that you mostly don't need to worry about with a (to us) modern OLED.
That is just the motivation, the experiment is much more general and is not related to display technology:
> The work presented here attempts to clarify “the rate at which human perception cannot distinguish between modulated light and a stable field
Otherwise, they would have tested the dithering directly for the full image. Here they are testing a simpler model: varying flicker causes higher flicker-free requirements (due to eye movements). This would apply to dithering, but potentially to other situations as well.
You can't say "that is just the motivation", because the motivation is what dictated the terms of the experiment. I read the whole study: the contrast between the two types of displays permeates the whole thing.
They repeatedly say that the goal is to measure the effect of flickering in these non-traditional displays and repeatedly say that for displays that do not do the display trickery they're concerned about the traditional measurement methods are sufficient.
You're correct that they do demonstrate that the study shows that the human eye can identify flickering at high framerates under certain conditions, but it also explicitly shows that under normal conditions of one-frame-after-another with blank frames in between for PWM dimming the flickering is unnoticeable after 65 Hz. They go out of their way to prove that before proceeding with the test of the more complicated display which they say was meant to emulate something like a 3D display or similar.
So... yes. Potentially other situations could trigger the same visibility (I'd be very concerned about VR glasses after reading this), but that's a presumption, not something demonstrated by the study. The study as performed explicitly shows that regular PWM is not perceptible as flicker above the traditionally established range of frame rates, and the authors repeatedly say that the traditional measurement methods are entirely "appropriate" for traditional displays that render plain-image frames in sequence.
EDIT: Just to put this quote down again, because it makes the authors' point abundantly clear:
> The light output of modern displays may at no point of time actually resemble a natural scene. Instead, the codes rely on the fact that at a high enough frame rate human perception integrates the incoming light, such that an image and its negative in rapid succession are perceived as a grey field. This paper explores these new coded displays, as opposed to the traditional sort which show only a sequence of nearly identical images.
They explicitly call out that the paper does not apply to traditional displays that show a sequence of nearly identical images.
The paper's motivation is to explore the new coded display, and they are doing that by exploring an aspect that they care about. That aspect is very specifically well-defined, and if you want to show whether a display has the same effect or not, then we need to look into it. But at no point is the experiment itself relating to any kind of display tech.
I mean, they are not even using a screen during the study, they are using a projector. How are you going to even make the claim that this is display technology specific when it is not using a display?!
Did you actually read the study? I assumed you did and so I read every word so I could engage with you on it, but it's really feeling like you skimmed it looking for it to prove what you thought it would prove. It's not even all that long, and it's worth reading in full to understand what they're saying.
I started to write out another comment but it ended up just being a repeat of what I wrote above. Since we're going in circles I think I'm going to leave it here. Read the study, or at least read the extracts that I put above. They don't really leave room for ambiguity.
Edit: I dropped the points on the details, just to focus on the main point. Rest assured that I read the paper, I was arguing in good faith, and after a bit more thinking I understand your criticism of my interpretation. I don't think the criticism that the research can't be generalized is warranted, considering the experimental design, but we aren't going to agree on that. The difference in our thinking seems to be the probability of a similar effect showing up in daily life. I know the projector was emulating the coded display, but my point is that it was reasonably easy to do, and the same setup could conceivably show up in different ways. Not to mention that the researchers specifically said all the displays in their office had the effect, so it is common within displays themselves.
I think if we continue talking, we will keep running in circles. So let’s drop the details on research: it is there, we can both read it. Here is what I was trying to convey since the beginning:
- If you think the (original) article is an ad, with writing not up to scientific standards: sure, I am ambivalent about the article itself
- If you think the gist of the article and its recommendations is wrong, I mildly disagree with you
- If you think LED flicker affecting people is in the same ballpark as concerns about WiFi or GMOs, I violently disagree with you.
LEDs are new, so research on high-frequency flicker is not too numerous, but the few studies that exist generally point to a higher perception threshold than previously thought. As for the health effects, I believe that part is more extrapolation than research (since it can only come after the more generic research on perception). So the final assessment is: how bad was the article in presenting information the way it did?
> People report being sensitive to all kinds of things that end up repeatedly failing to reproduce empirically when it's put to the test (e.g. WiFi and MSG), so that there's a PWM sensitivity subreddit is not the evidence that TFA thinks it is.
I know you’re looking for large sample size data, but PWM sensitivity absolutely exists, and I wish it didn’t. The way my eyes hurt in less than a minute while looking at an OLED phone (when I can handle an LCD phone for hours just fine) is too “obvious”. This occurs even on screens I didn’t know were OLED till I got a headache, btw.
(I’m also insanely sensitive to noticing flicker and strobe lights - I say a light flickers, everyone disagrees, I pull out the 240fps mode on my phone… and I’ve never been proven wrong till now.)
I used a 240 Hz PWM-dimmed monitor for a couple of years and I adjusted, but when I switched to a flicker-free one, it was very noticeable and bothersome to use the old one. Even though it's not perceptible when looking at a fixed point, when moving one's eyes around the after-images are easy to see. Even 1000 Hz PWM-dimmed LED strips are easy to notice when looking around the room. The light is basically being panned across one's retina like a panoramic camera/oscilloscope, logging its brightness versus time.
Even if the iPhone was flicker free, holding the iPhone itself throws all that out the window with all the addictive colors and notifications and badges
thanks for saving me time
You can also spot houses and cars with high-dollar alarms and IR cameras. Under night vision they are very obvious. It's like looking at a light show.
With the lofty claims of "health risks", I was disappointed to find no sources linked at the bottom of this article (correct me if I'm wrong).
Based on my personal experience, I think "health risk" is an overstatement: bad PWM can be uncomfortable (Geneva Airport had particularly egregious lights that started flickering in your peripheral vision), but I doubt there are any long-term effects of it.
Reading further down, a few other comments [1][2] have stated this better than me.
[1]: https://news.ycombinator.com/item?id=44313661 [2]: https://news.ycombinator.com/item?id=44312224
I'm interested in this topic and stay on top of most threads/discussions about PWM.
> I doubt there are any long-term effects of it.
I would have thought the same, but it seems to be a common experience that once someone becomes PWM sensitive it actually sticks with them.
I've been a techy my whole life; the iPhone 12 mini seemed to be the device that triggered my PWM sensitivity and since then I have been extremely sensitive to any device with PWM.
Although I have tried to keep PWM devices out of my life, I can still quickly tell when the TV in a lobby or the airplane entertainment display has PWM and there's not much you can do about it.
Can anyone recommend flicker-free LEDs (E14) to order from Amazon.de?
Incidentally, I found some LEDs to be extremely annoying, but the flicker would sometimes just disappear on its own or after turning the light switch off and on. What could cause this?
I have massive issues with PWM and used this site to decide on what bulbs to buy.
https://flickeralliance.org/
It was a godsend. In the end I went with a bunch of Ikea bulbs and couldn't be happier. Absolutely zero flicker in my testing, and good color output.
After searching for a while, I found an Osram light bulb that I could get behind:
https://www.amazon.fr/dp/B073QS6K3C?ref=ppx_pop_dt_b_product...
Random guess here but maybe some phase interaction with electrical grid frequency
Is PWM a kind of generic term now? There is also PDM (pulse density modulation), which might be a better way to modulate an LED.
It's better to filter your PWMs or use an inductance based chopper.
I would certainly agree that finding LED bulbs that you like and/or don't bother you can take some work (especially if you want to put them on a dimmer, in which case you may also need to replace your dimmer). However, I am skeptical that subtle PWM flickering is unavoidable. For the chateau example, it would be better to choose bulbs with fewer lumens and run them at 100%?
I wonder about this too. If I have a dimmer and a LED bulb, does putting the dimmer all the way up still use PWM? I have a hunch that it still does, but would love to be proven wrong.
We had these fluorescents in our computer lab at school. They were light yet dark. On yet off. Crazy. Some weird color or flickering frequency. If you sat there for a couple of hours you would start to stink. Like, a weird stink. Some speculated that it did something to your glands.
Give me a nice candle.
The health stuff seems more like woo to me, but subjectively speaking: for a while I had a OnePlus phone with linear dimming, and it was easily the most pleasant to read at its lowest brightness setting, while simultaneously being able to go even dimmer than any other phone I've ever used.
The gamma curves got a bit messed up, but when it's that dim I don't expect stellar color accuracy anyway.
There is a strong and widespread tendency to view anything artificial as highly dangerous. I understand this perspective, but on the other hand, we have science and reasoned arguments.
Science is informed by listening to the experience of other humans and doing research.
PWM sensitivity is real and has nothing to do with someone's belief system.
"Perceiveved brightness"?
And perceiveved brightness is equal to the peak of the PWM wave?
That image, courtesy of the Daylight Computer Company, is consuming too much of my attention.
I've seen this repeated many times but never seen any evidence for it. At typical PWM frequencies the perceived brightness is just the average brightness of the wave. I believe this myth arose from people driving low-brightness indicator LEDs using PWM for increased efficiency compared to simple current-limiting resistor circuits. People saw the energy savings from less waste heat in the resistor and somehow confused it with something happening in the eye.
I'm surprised this hasn't been said yet.
Isn't it far more likely that any problem with the appearance of something under LED light is due to the light's peculiar spectrum?
I don't think this is true. The typical uneven LED spectrum causes poor color rendering accuracy, but human color perception is highly inconsistent anyway. Think of the blue/white dress people were arguing about ( https://en.wikipedia.org/wiki/The_dress )
See also:
https://en.wikipedia.org/wiki/Color_constancy
It can be both, but for people sensitive to PWM, it’s the PWM.
PWM lights generate a lot of radio frequency interference. Nobody seems to care except for us ham radio operators who can’t enjoy the radio anymore. (It’s also a reason we lost AM radios in cars)
I'm a ham radio op and incredibly sensitive to PWM lighting. Imagine how I feel!
At least my house has little RFI. My neighbors on the other hand…
I know that people anecdotally report complaints about flicker and it's plausible to me that there could be an effect, but the way this piece is written reminds me distinctly of similar essays about WiFi sickness, MSG, and GMOs.
It identifies a "health risk", describes the mechanism in terms that sound very convincing, assigns numbers to its cause and effects, and provides a table grading the health risks of various products, all without linking to a single scientific study demonstrating that the effect is anything other than nocebo. The closest they come is an image of a table that refers to a few institutions that apparently did studies related to PWM (leaving it as an exercise for the reader to find the studies they're supposedly referencing) and a link to a Wikipedia page which links to a Scientific American article which says:
> In 1989, my colleagues and I compared fluorescent lighting that flickered 100 times a second with lights that appeared the same but didn’t flicker. We found that office workers were half as likely on average to experience headaches under the non-flickering lights. No similar study has yet been performed for LED lights. But because LED flickering is even more pronounced, with the light dimming by 100% rather than the roughly 35% of fluorescent lamps, there’s a chance that LEDs could be even more likely to cause headaches.
I'm willing to entertain the idea that LED flicker is actually problematic, but I wish essays like this would be honest about the degree of confidence we have given the current state of the evidence. This piece instead takes it as a given that there's a problem, to the point where they confidently label devices on a scale of Low to Extremely High health risks.
There doesn't need to be a health risk for it to be annoying. I personally dislike PWM and I'll continue to personally dislike it even if it's proven safe. Fortunately it's easy to find non-flickering LED lights.
If the article said "I find PWM annoying" I wouldn't have commented like I did.
IEEE Recommended Practices for Modulating Current in High-Brightness LEDs for Mitigating Health Risks to Viewers : https://standards.ieee.org/ieee/1789/4479/
There is nothing anecdotal about flickering LED light causing health risks.
I am not questioning that certain types of flickering are harmful, so that there's an IEEE standard for how to safely use PWM does not contradict what I said.
What I'm asking for is for articles like this that cite numbers and provide tables purporting to quantify the degree of harm caused by various devices to point to where they're getting their numbers from or, if they can't do that, stop making up numbers and assigning things to "harm" scales that they invented themselves based on vibes.
Either there's a study showing that 246 Hz flickering poses "Extremely High" health risks or there isn't.
Was it an astronomically high health risk to watch a TV set that flickers at 60 Hz, or movies that flicker at 48 or 72 Hz? (Film runs at 24 frames per second, but you'd perceive a lot of flicker at that rate, so the shutter has 2 or 3 blades.)
See my comment on the other reply.
> Either there's a study showing that 246 Hz flickering poses "Extremely High" health risks or there isn't.
They calculated it using the definition from the standard.
Can you please cite the page number where this definition exists? When I search for "extreme" in the standard that the other commenter links to I don't turn anything up, so I'm unclear where that classification is defined.
Pages 31 and 32 (by the printed page number); in the PDF it's page 42.
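If it helps, here's my reading of how the computation presumably goes (an assumption on my part, not a quote from the standard): take the Recommended Practice 1 low-risk boundary quoted elsewhere in this thread, Mod% ≤ 0.08 × f. At 246 Hz that works out to 0.08 × 246 ≈ 19.7% modulation, so a display doing full-depth (100%) PWM at 246 Hz exceeds the low-risk line by roughly 5×, which would push it into the matrix's higher bands.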
That does not define the scale that they're using. That's a typical hazard-analysis risk matrix, which has two axes that can be collapsed into a 4-point scale (Low, Medium, Serious, High). Importantly, to do a risk assessment in the style of IEEE 1789 you have to identify the specific hazards you're analyzing, which TFA does not claim to be doing in that table, instead speaking vaguely of "health risks". IEEE 1789 does not provide a mechanism for evaluating "health risks" without specifying exactly which risks are being evaluated.
You can see on page 27 how this is meant to be used: it should produce a per-hazard matrix.
You might be thinking of Figure 18 on page 29, which does identify Low-risk and No-effect regions by Modulation % and Frequency, but that also does not claim to identify high-risk regions, it just identifies the regions we can be highly confident are safe. And importantly, as a sibling comment notes, TFA's table actually contradicts the line on Figure 18, labeling several devices as higher than Low even when they're squarely within the Low-Risk and No-Effect zones.
The article contradicts the IEEE paper.
They list the 'Xiaomi 15 Ultra' as having a 'Moderately High' health risk, and cite it as having a 2.16 kHz PWM frequency at 30-75% modulation depth.
The IEEE article has recommended practices that state:
8.1.2.3 Example 3: PWM dimming
Using Figure 20, the recommended practice for PWM dimming at 100% modulation depth is that the frequency satisfies f > 1.25 kHz. This can also be derived using Recommended Practice 1 and solving 100% = 0.08 × f_Flicker. This level of flicker could help minimize the visual distractions such as the phantom array effects.
Seems like even at 100% mod depth, >1.25 kHz is just fine.
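If anyone wants to sanity-check other devices against this, here's a minimal sketch of the low-risk test as I read the quoted passage. I'm only encoding what's quoted above (Mod% ≤ 0.08 × f, no restriction above 1.25 kHz) and omitting the standard's stricter limits below 90 Hz:

    # Low-risk check per the quoted Recommended Practice 1:
    # Mod% <= 0.08 * f, with no restriction above 1.25 kHz.
    # NOTE: this omits the standard's stricter limits below 90 Hz.
    def is_low_risk(freq_hz: float, mod_percent: float) -> bool:
        if freq_hz > 1250:
            return True
        return mod_percent <= 0.08 * freq_hz

    print(is_low_risk(2160, 75))   # True: the Xiaomi numbers clear the bar easily
    print(is_low_risk(246, 100))   # False: at 246 Hz the limit is ~19.7%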
Also, the article does not seem to distinguish modulation that only occurs at reduced brightness, which the IEEE document specifically calls out as unlikely to cause issues. E.g., movie theaters using film all flicker at 48 Hz and nobody complains about that.
Here is a non-paywalled link: https://www.lisungroup.com/wp-content/uploads/2020/02/IEEE-2...
Sure, PWM light can cause health risks for some people, in some contexts. But taking research out of context is bad science.
Do you genuinely believe the Pixel 7 and 8 Pro have an "extremely high health risk", in the context of what a lay person would understand?
Edit: I specify 'lay-person' because clearly this is an introductory blog post (or advertisement for Daylight Computer). If they want to use a more specific definition of health risk, then they better define it.
The “very/moderately high” wording comes from the standard itself, where it is quantified. In that context, it refers to the probability of having issues, while the effect (mild to catastrophic) is another axis. Considering that they stick to the “official” wording and still draw this criticism, I am not sure they could switch to more lay-person-friendly terms and be acceptable to all the critics.
The standard also links to the underlying research in its discussion.
Please read it, instead of just randomly throwing things out hoping they support your argument.
You can't just point people at a 60-page paywalled standard and say "the supporting evidence to my claim is somewhere in here, I pinky promise". You are the one making assertions, it's on you to prove that the standard actually does reflect the text of TFA. I'm not going to read the whole standard because I'm not the one making the argument and I can't be bothered doing the research needed to refute every piece of nonsense science that shows up on the internet. What I can do is point out when someone is making unsourced claims and insist that they provide their sources if they want to be taken seriously.
Cite the exact page number and quote that you claim justifies the assertion that 246 Hz PWM carries an "extremely high" health risk. Then we can talk.
Look, they sourced their claims (quite literally, they show how they calculate, and from which standard). And linking to the correct document is literally how scientific citation works; I already gave you the page numbers above.
If you want to redo the numbers and check whether they fit the definition, please feel free to do so, but you will need to put some work in (since the Hz-to-risk figures shown in the article are computed values, you need to find the modulation value and plug it in too).
I understand your fight and your idea; I am just saying that in this specific instance, this is not a fight worth fighting. The article is generally correct, and if you want to complain about the writing style or about it being an ad, that's up to you. But this is not the same situation as the GMO stuff.
> Look, they sourced their claims (quite literally, they put how they calculate, from which standard).
No, they said that IEEE 1789 also uses Modulation % (which they've renamed Flicker %) to calculate risks. That is pointedly not the same thing as claiming that they used IEEE 1789's formulas.
You're reading their copy generously, but that generosity doesn't usually pay off with marketing copy. Articles like this always like to wave in the general direction of official-sounding sources while carefully refraining from actually claiming that they got their numbers from anywhere in particular.
Hey, it got flagged.
Why do we use anonymity for that? What's gained and lost by that?