Generally I see efficiency quoted as lumens per watt, but what is the actual typical efficiency of LEDs in terms of electrical energy in to optical energy out? What sort of conversions apply?
Answer
To make things clear let's define what we are talking about.
There are two terms which are mixed up pretty often:
- Luminous efficiency:
The luminous efficiency is a dimensionless quantity derived from the luminous efficacy: it is simply the quotient of the luminous efficacy of the source and the maximum possible luminous efficacy of radiation.
- Luminous efficacy:
This is the value you see more often. It usually has the unit lumen per watt (lm/W) and gives the luminous flux per unit power, a useful quantity for judging how much light we get for a given power.
With this we have to be a bit careful as well, because the power can be either the radiant flux of the source or the electrical power it draws. The former case is called the luminous efficacy of radiation, the latter the luminous efficacy of a source (or overall luminous efficacy).
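The relation between the two terms can be written as a one-line quotient. A minimal sketch (the function name and signature are mine, not from any standard library):

```python
# Luminous efficiency: luminous efficacy of the source divided by the
# maximum possible luminous efficacy of radiation for its spectrum.
def luminous_efficiency(source_efficacy_lm_w: float,
                        max_radiation_efficacy_lm_w: float) -> float:
    """Dimensionless quotient of two efficacies, both given in lm/W."""
    return source_efficacy_lm_w / max_radiation_efficacy_lm_w

# A perfect monochromatic 555 nm source reaches the absolute ceiling:
print(luminous_efficiency(683.0, 683.0))  # 1.0
```

Note that the denominator depends on the spectrum: 683 lm/W applies only to monochromatic green light, while broadband white light has a lower ceiling.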
Now a problem arises: we cannot see all colors equally well, and lumens are weighted based on the response of our eye:
(Figure: the eye's spectral response, the photopic luminosity function, peaking at 555 nm.)
So with this you can derive some upper bounds (based on the redefinition of the unit candela). These are values of the luminous efficacy of radiation:
- Green light at 555 nm: 683 lm/W
- Maximum for CRI=95 at 5800 K: 310 lm/W (based on truncated black body radiators)
- Maximum for CRI=95 at 2800 K: 370 lm/W
For more see here.
If you lower the color rendering index (CRI), you can achieve higher values, but never more than 683 lm/W.
So how efficient are LEDs?
Here we are talking about values of the luminous efficacy of a source.
Well, there is an efficiency race. Cree posted a press release about a laboratory LED reaching 303 lm/W at 5150 K. The CRI was not mentioned; I guess it is lower than 95, but based on the data above that corresponds to a luminous efficiency of roughly 80% to 90%.
Of course, your average commercially available LED does less: 100 lm/W comes out around 25% to 30%, and the 200 lm/W chips announced recently (as of August 2017) reach 50% to 60%.
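The arithmetic behind those percentages can be sketched by dividing the source efficacy by a plausible ceiling for its spectrum. The 310 to 370 lm/W ceilings below are the CRI-95 bounds quoted earlier; treating them as the range for an arbitrary white LED is an assumption for illustration:

```python
# CRI-95 ceilings from the bounds above: ~310 lm/W (cool white, ~5800 K)
# to ~370 lm/W (warm white, ~2800 K). Real ceilings depend on the exact
# spectrum and CRI; these are assumptions for the sketch.
CEILING_LOW_LM_W = 310.0
CEILING_HIGH_LM_W = 370.0

def efficiency_range(source_lm_w: float) -> tuple:
    """(low, high) luminous-efficiency bounds for a white-light source."""
    return (source_lm_w / CEILING_HIGH_LM_W, source_lm_w / CEILING_LOW_LM_W)

for lm_w in (100.0, 200.0, 303.0):
    low, high = efficiency_range(lm_w)
    print(f"{lm_w:.0f} lm/W: {low:.0%} to {high:.0%}")
```

The results land slightly above the 25% to 30% and 50% to 60% figures quoted above, presumably because lower-CRI white LEDs have ceilings somewhat higher than the CRI-95 bounds used here.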
Note that the above is for photopic vision (day-vision), things change with scotopic vision, but that's usually not so interesting.
If you really want to get into the guts of it, you would have to take the spectrum of the LED, work out the theoretical maximum for that spectrum (based on the weighting curve), and then calculate the quotient.
As each and every LED has a different spectrum, this data is hard to come by.
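That spectrum-weighted calculation can be sketched numerically. The sketch below uses a Gaussian stand-in for the photopic luminosity function V(λ) (the real CIE curve is tabulated, not Gaussian) and a completely made-up white-LED spectrum of a blue pump plus a phosphor hump, so the numbers are illustrative only:

```python
import math

def gauss(x: float, mu: float, sigma: float) -> float:
    """Unnormalized Gaussian bump."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

wavelengths = range(380, 781)  # visible range in nm, 1 nm steps

# Rough Gaussian stand-in for the CIE photopic curve V(lambda),
# peaking at 555 nm (assumption; the real curve is tabulated).
V = [gauss(wl, 555.0, 45.0) for wl in wavelengths]

# Hypothetical white-LED spectrum: blue pump at 450 nm plus a broad
# phosphor hump around 560 nm (made-up shape).
S = [gauss(wl, 450.0, 10.0) + 1.8 * gauss(wl, 560.0, 50.0)
     for wl in wavelengths]

# Luminous efficacy of radiation: 683 lm/W times the V-weighted
# fraction of the total radiant power.
ler = 683.0 * sum(s * v for s, v in zip(S, V)) / sum(S)
print(f"luminous efficacy of radiation: {ler:.0f} lm/W")
```

The made-up spectrum here is fairly narrow, so its ceiling comes out above the CRI-95 bounds quoted earlier; that is consistent with the note that lowering the CRI allows higher values, up to but never beyond 683 lm/W.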
I hope I haven't made a mistake here, because I always find the topic a bit confusing no matter how many times I revisit it.