Posts Tagged optical bench
The measurement of light is complicated by a variety of units and concepts that are not used in other fields. For example, the ‘light level’ could be measured in units matched to the sensitivity of our eyes (lux), or as a power level (watts); but that's confounded by the wavelength (nanometers, sometimes angstroms), you need to think in steradians, etendue must be conserved … you get the idea.
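To make the lux-versus-watts distinction concrete, here's a small sketch (not a calibrated conversion): for a monochromatic source, luminous flux is radiant power weighted by the photopic eye response V(λ) and scaled by 683 lm/W at the 555 nm peak. The Gaussian V(λ) below is a rough textbook approximation, not the CIE table data.

```python
import math

def v_lambda(nm):
    """Rough Gaussian approximation of the CIE photopic
    luminosity function V(lambda); peaks near 555 nm."""
    um = nm / 1000.0
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

def watts_to_lumens(power_w, nm):
    """Luminous flux of a monochromatic source:
    683 lm/W at the photopic peak, weighted by V(lambda)."""
    return 683.0 * v_lambda(nm) * power_w

print(watts_to_lumens(1.0, 555))  # close to 683 lm
print(watts_to_lumens(1.0, 650))  # much less; the eye is weak in the deep red
```

The same watt of optical power ‘counts’ very differently in lux depending on where it sits in the spectrum, which is exactly why the two unit systems coexist.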
We’ve written about some of these issues in earlier posts, but this is one big, complete reference manual – a kind of ‘everything you wanted to know about light, but were afraid to ask’ – and it’s from NIST. They call it a ‘Self-Study Manual’ and it’s a clearly written tutorial on optical radiometry.
And it’s a free download. Enjoy. The test is Tuesday.
The official title is The Self-Study Manual on Optical Radiation Measurements, edited by Fred Nicodemus.
For both a clinical test microscope and a home-theater HDTV projection display, the light from the source must be quite uniform.
To test some non-imaging illumination optics, we set up our digital camera and wrestled with its RAW data files. Most cameras can ‘see’ some infrared, so we can also test the output pattern of a remote control, among other things.
It’s easy to confuse the units of LED light output. Steradians, luminous intensity, etc.
Here’s a link to an application note that explains these well, written by C. Richard Duda of UDT (now part of OSI Inc.). Apertures, intentional and otherwise, are discussed, along with typical test configurations.
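As a minimal sketch of how those units fit together (the aperture size and distance below are hypothetical, just for illustration): luminous intensity in candela is luminous flux per steradian, and a small detector aperture at a known distance subtends a solid angle of roughly area over distance squared.

```python
import math

def solid_angle_sr(aperture_radius_m, distance_m):
    """Solid angle subtended by a small circular aperture,
    using the small-angle approximation: area / distance^2."""
    area = math.pi * aperture_radius_m ** 2
    return area / distance_m ** 2

def luminous_intensity_cd(lumens_through_aperture, omega_sr):
    """Luminous intensity = luminous flux per steradian (cd = lm/sr)."""
    return lumens_through_aperture / omega_sr

# Hypothetical test setup: 1 cm diameter aperture, 50 cm from the LED
omega = solid_angle_sr(0.005, 0.5)
print(luminous_intensity_cd(0.01, omega))  # 0.01 lm collected -> roughly 32 cd
```

Move the same detector twice as far away and the solid angle drops by four, which is why intensity (cd) rather than collected flux (lm) is the natural figure for a narrow-beam LED.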
Please tell us if the link gets broken!
Lately we’ve been able to use our digital camera to perform some nice measurements, through the help of a program called ImageJ.
It’s free and open-source, was developed at NIH, has a ton of features and plug-ins, and you can write scripting macros, etc. It was developed so that the scientific community would have an open standard for processing images. (Without an open standard for image number crunching, there’s no good way to independently reproduce an experiment that makes heavy use of images and image processing.)
You can read about it here at Wikipedia:
It’s available here:
We were turned on to this image analysis program by a couple of our clients. We recommend it. Today the cool thing was to separate the RGB channels, which let us ‘see’ an IR LED without being confused by the camera’s ‘grey scale’ clipping algorithm. Very nice.
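ImageJ does this with a click (Image ▸ Color ▸ Split Channels); here’s a sketch of the same idea in Python with NumPy, using a synthetic frame in place of a real camera image, since near-IR typically leaks most strongly into the red channel on consumer sensors.

```python
import numpy as np

def split_channels(img):
    """Split an HxWx3 RGB array into separate channel arrays,
    like ImageJ's Image > Color > Split Channels."""
    return img[..., 0], img[..., 1], img[..., 2]

# Synthetic stand-in for a camera frame: a bright "IR LED" blob
# that shows up mostly in the red channel.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
frame[28:36, 28:36] = (220, 60, 40)

r, g, b = split_channels(frame)
print("red peak:", int(r.max()), "green peak:", int(g.max()))
```

Inspecting the red channel alone avoids the camera’s grey-scale mixing, so the LED’s spatial pattern stands out instead of being washed into the luminance signal.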
This tech note was motivated by the question: how does the response of our eyes differ from the response of a CCD camera sensor? Using the data of a particular Hamamatsu CCD camera as an example, we compared how silicon ‘sees’ to the photopic eye response, and compared both to a Planck black-body curve of a light at a particular color temperature.
We don’t know what those lumps are in that CCD response curve; maybe some strange reflection interference? If you know, tell us!
Color temperature is based upon the idea of a Planck black-body radiator. Here’s a Tech Note that shows how our eyes respond to the Planck black-body radiator. For a lamp filament at a certain ‘color temperature’ there’s a curve of how our eyes respond to the lamp. Pete put this into a MathCAD model, and there’s a PDF here that shows off a few nice graphs.