grim
03/31/2004, 09:13 AM
Why are people so 'K-obsessed'? Two bulbs that the manufacturer labels with the same Kelvin rating can have very different spectra, and the spectrum, really, is where it's at.
So when are we going to get to the point where we're comparing spectrum against spectrum instead of comparing (or obsessing over) Kelvin ratings?
(If you haven't spent time in the computer field, the analogy below probably won't make much sense to you.)
It's kind of like the old computer days, when MIPS (millions of instructions per second) was king as far as performance measurement went, until people realized that MIPS figures were meaningless and the joke backronym "Meaningless Indicator of Processor Speed" stuck. Or like the GHz number attached to a CPU today: so many factors affect real performance that GHz is a nearly meaningless indicator, and in most cases it's impossible to compare MHz or GHz values across different CPU architectures, let alone different manufacturers.
So, with that, let's just get off the Kelvin bandwagon. It's a nice shorthand for getting a very rough idea of a bulb's color, but when I see two bulbs with the same Kelvin rating side by side that look radically different, I know the time has come to abandon that measurement and move on.
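Here's a quick way to see why, as a Python sketch using McCamy's well-known approximation for correlated color temperature (the two chromaticity points below are invented for illustration, not measured bulbs):

# Minimal sketch: two visibly different chromaticities, one Kelvin rating.
# McCamy's 1992 approximation maps CIE 1931 (x, y) to a CCT in kelvin.

def mccamy_cct(x: float, y: float) -> float:
    """Approximate correlated color temperature from CIE 1931 (x, y)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Hypothetical lamps: one above the blackbody locus (greenish tint),
# one below it (pinkish tint). Side by side they look nothing alike.
for name, (x, y) in [("greenish", (0.3085, 0.3600)),
                     ("pinkish",  (0.3166, 0.3000))]:
    print(f"{name}: ~{mccamy_cct(x, y):.0f} K")

Both print roughly 6500 K, because the K rating only records the nearest point on the blackbody curve and throws the rest of the spectrum away.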
Unfortunately, there is no easy way to talk about spectrum without a graph or other picture (unless you like looking at tables of numbers), and I suppose it's difficult for a non-techie to digest, so therein lies the problem.
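That said, even a plain-text table can get the rough shape of a spectrum across. A throwaway Python sketch (the per-band numbers are invented, not measurements of any real bulb):

# Crude ASCII bars of relative spectral power per 50 nm band.
BANDS = [400, 450, 500, 550, 600, 650]  # wavelength in nm

lamps = {
    "lamp A (spiky phosphor peaks)": [3, 9, 2, 8, 4, 1],
    "lamp B (smoother output)":      [4, 6, 5, 5, 4, 3],
}

for name, spd in lamps.items():
    print(name)
    for nm, power in zip(BANDS, spd):
        print(f"  {nm} nm | {'#' * power}")

Two lamps like these could easily carry the same K label while lighting a tank completely differently.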
Everyone knew that 10K was best; well, then 20K was best; then someone found that 6.5K was best; and now something between 12K and 15K is best. So is anything really the best, or is nothing the best? Or maybe it's a little more complicated than just looking at the "meaningless indicator of lighting performance" that the K rating has turned into.
Let's be a little smarter and throw the "-K-" into the same trashcan as "watts-per-gallon".
jb