Source: Optical System Design
CHAPTER 1

Basic Optics and Optical System Specifications
This chapter will discuss what a lens or mirror system does and how we specify an optical system. You will find that properly and completely specifying a lens system early in the design cycle is essential to designing a good system.
The Purpose of an Imaging Optical System
The purpose of virtually all image-forming optical systems is to resolve a specified minimum-sized object over a desired field of view. The field of view is expressed as the spatial or angular extent in object space, and the minimum-sized object is the smallest resolution element which is required to identify or otherwise understand the image. The word “spatial” as used here simply refers to the linear extent of the field of view in the plane of the object. The field of view can be expressed as an angle or alternatively as a lateral size at a specified distance. For example, the field of view might be expressed as 10 × 10°, or alternatively as 350 × 350 m at a distance of 2 km, both of which mean the same thing.
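As a quick check on that equivalence, the linear extent follows from simple trigonometry. The short sketch below is our own illustration (the function name and values are chosen to match the book's example, not taken from it); it converts a full field angle to the linear extent it subtends at a given distance:

import math

def angular_to_linear_fov(full_angle_deg: float, distance_m: float) -> float:
    """Linear extent subtended by a full field-of-view angle at a distance."""
    return 2.0 * distance_m * math.tan(math.radians(full_angle_deg) / 2.0)

# A 10-degree full field at 2 km covers very nearly 350 m:
print(angular_to_linear_fov(10.0, 2000.0))  # ~349.9

For angles this small, the small-angle approximation (extent ≈ angle in radians × distance) gives essentially the same answer: 0.1745 rad × 2000 m ≈ 349 m.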
A good example of a resolution element is the dot pattern in a dot matrix printer. The capital letter E has three horizontal bars, and hence five vertical resolution elements are required to resolve the letter. Horizontally, we would require three resolution elements. Thus, the minimum number of resolution elements required to resolve capital letters is in the vicinity of five vertical by three horizontal. Figure 1.1 is an example of this. Note that the capital letter B and the number 8 cannot be distinguished in a 3 × 5 matrix, but the 5 × 7 matrix of dots will do just fine. This applies to telescopes, microscopes, infrared systems, camera lenses, and any other form of image-forming optics. The generally accepted guideline is that approximately three resolution elements or 1.5 line pairs over the object's spatial extent are required to acquire an object. Approximately eight resolution elements or four line pairs are required to recognize the object, and 14 resolution elements or seven line pairs are required to identify the object.
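These acquire/recognize/identify counts are easy to fold into an early sanity check. The following minimal sketch simply encodes the guideline quoted above; the function names, threshold table, and sample target are our own illustration, not from the text:

# Resolution elements required across an object's spatial extent,
# per the guideline above (1 line pair = 2 resolution elements).
TASK_ELEMENTS = {"acquire": 3, "recognize": 8, "identify": 14}

def elements_across_object(object_size_m: float, resolution_m: float) -> float:
    """How many resolution elements the optics place across the object."""
    return object_size_m / resolution_m

def achievable_tasks(object_size_m: float, resolution_m: float) -> list[str]:
    n = elements_across_object(object_size_m, resolution_m)
    return [task for task, needed in TASK_ELEMENTS.items() if n >= needed]

# A 2.3-m target imaged with 0.25-m resolution spans ~9 elements:
print(achievable_tasks(2.3, 0.25))  # ['acquire', 'recognize']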
There is an important rule of thumb which says that this smallest desired resolution element should be matched in size to the minimum detector element, or pixel, in a pixelated charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) type sensor. While not rigorous, this is an excellent guideline to follow for an optimum match between the optics and the sensor. This will become especially clear when we learn about the Nyquist frequency in Chap. 21, where we show a digital camera design example. In addition, the aperture of the system and the transmittance of the optics must be sufficient for the desired sensitivity of the sensor or detector. The detector can be the human eye, a CCD chip, or film in your 35-mm camera. If we do not have enough photons to record the imagery, then what good is the imagery?
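A minimal sketch of that matching exercise, using the small-angle approximation: one pixel of pitch p behind a lens of focal length f subtends p/f radians, so its object-space footprint at range R is R·p/f. All numbers below are hypothetical:

def pixel_footprint_m(pixel_pitch_um: float, focal_length_mm: float,
                      range_m: float) -> float:
    """Object-space size of one detector pixel (small-angle approximation)."""
    ifov_rad = (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)
    return ifov_rad * range_m

# A 5-um pixel behind a 100-mm lens subtends 50 urad, i.e., 0.1 m per
# pixel at 2 km -- so a 0.1-m resolution element would be matched
# one-to-one to the pixel, per the rule of thumb above.
print(pixel_footprint_m(5.0, 100.0, 2000.0))  # 0.1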
Figure 1.1 Illustration of Number of Resolution Elements Required to Resolve or Distinguish Alphanumerics
The preceding parameters relate to the optical system performance. In addition, the design form or configuration of the optical system must be capable of meeting this required level of performance. For example, most of us will agree that we simply cannot use a single magnifying glass element to perform optical microlithography where submicron line-width imagery is required, or even lenses designed for 35-mm photography for that matter. The form or configuration of the system includes the number of lens or mirror elements along with their relative position and shape within the system. We discuss design configurations in detail in Chap. 8.
Furthermore, we often encounter special requirements, such as cold stop efficiency in infrared systems, in scanning systems, and in others. These will be addressed later in this book.
Finally, the system design must be producible, meet defined packaging and environmental requirements, meet weight and cost guidelines, and satisfy other system specifications.
How to Specify Your Optical System: Basic Parameters
Figure 1.2 Typical Specifications

Consider the lens shown in Fig. 1.2, where light from infinity enters the lens over its clear aperture diameter. If we follow the solid ray, we see that it is redirected by each of the lens element groups and components until it comes to focus at the image. If we now extend this ray backwards from the image towards the front of the system, as if it were not bent or refracted by the lens groups, it intersects the entering ray at a distance from the image called the focal length. The final imaging cone reaching the image at its center is defined by its ƒ/number, or ƒ/#, where

ƒ/number = focal length / clear aperture diameter
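A one-line numerical illustration of this relation (the values are hypothetical, not from the text):

def f_number(focal_length_mm: float, clear_aperture_mm: float) -> float:
    """f/# = focal length / clear aperture diameter (same units)."""
    return focal_length_mm / clear_aperture_mm

# A 100-mm focal length lens with a 25-mm clear aperture is an f/4 lens:
print(f_number(100.0, 25.0))  # 4.0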
You may come across two other similar terms, effective focal length and equivalent focal length, both of which are often abbreviated EFL. The effective focal length is simply the focal length of a lens or a group of lenses. Equivalent focal length is very much the same; it is the overall focal length of a group of lens elements, some or all of which may be separated from one another.
The lens is used over a full field of view, which is expressed as an angle, or alternatively as a linear distance on the object plane. It is important to express the total or full field of view rather than a subset of the field of view. This is an extremely critical point to remember. For example, assume we have a CCD camera lens covering a sensor with a 3 × 4 × 5 aspect ratio. We could specify the horizontal field of view, which is often done in video technology and cinematography. However, if we do this, we would be ignoring the full diagonal of the field of view. If you do specify a field of view less than the full or total field, you absolutely must indicate this. For example, it is quite appropriate to specify the field of view as ±10°. This means, of course, that the total or full diagonal field of view is 20°. Above all, do not simply say “field of view 10°,” as the designer will be forced to guess what you really mean!
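The 3 × 4 × 5 aspect ratio makes the arithmetic convenient: the diagonal is 5/4 of the horizontal dimension, so specifying only the horizontal field understates the required coverage by 25%. A minimal sketch, with an invented sensor size and focal length chosen to give a 20° full diagonal field:

import math

def full_diagonal_fov_deg(sensor_w_mm: float, sensor_h_mm: float,
                          focal_length_mm: float) -> float:
    """Full diagonal field of view of a sensor behind a lens."""
    half_diag = math.hypot(sensor_w_mm, sensor_h_mm) / 2.0
    return 2.0 * math.degrees(math.atan(half_diag / focal_length_mm))

# A 4.8 x 3.6 mm sensor (3 x 4 x 5 aspect, 6-mm diagonal) behind a 17-mm lens:
print(full_diagonal_fov_deg(4.8, 3.6, 17.0))  # ~20 degrees full diagonal field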
System specifications should include a defined spectral range or wavelength band over which the system will be used. A visible system, for example, generally covers the spectral range from approximately 450 to 650 nm. It is important to specify from three to five specific wavelengths and their corresponding relative weights or importance factors for each wavelength. If your sensor has little sensitivity, say, in the blue, then the image quality or performance of the optics can be more degraded in the blue without perceptible performance degradation. In effect, the spectral weights represent an importance factor across the wavelength band where the sensor is responsive. If we have a net spectral sensitivity curve, as in Fig. 1.3, we first select five representative wavelengths distributed over the band, λ1 = 450 nm through λ5 = 650 nm, as shown. The circular data points represent the relative sensitivity at the specific wavelengths, and the relative weights are now the normalized area or integral within each band from band 1 through band 5, respectively. Note that the weights are not the ordinate of the curve at each wavelength, as you might first expect, but rather the integral within each band. Table 1.1 shows the data for this example.
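A minimal sketch of this band-integral weighting, substituting an invented triangular sensitivity curve for the Fig. 1.3 data (the curve, grid, and band edges are placeholders, not the book's values):

import numpy as np

wavelengths_nm = np.linspace(450.0, 650.0, 201)              # 1-nm grid
sensitivity = 1.0 - np.abs(wavelengths_nm - 550.0) / 100.0   # invented curve peaking at 550 nm

band_edges = np.linspace(450.0, 650.0, 6)                    # five equal bands, 1 through 5
weights = []
for lo, hi in zip(band_edges[:-1], band_edges[1:]):
    in_band = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
    # Each weight is the area under the sensitivity curve within its band...
    weights.append(np.trapz(sensitivity[in_band], wavelengths_nm[in_band]))

weights = np.array(weights) / max(weights)                   # ...normalized to the largest
print(np.round(weights, 2))                                  # [0.22 0.67 1.   0.67 0.22]

The point the code makes is the same as the text's: each weight is the normalized area within its band, not the height of the curve at the sample wavelength.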
Even if your spectral band is narrow, you must work with its bandwidth and derive the relative weightings. You may find some cases where you think the spectral characteristics suggest a monochromatic situation, but in reality there is a finite bandwidth. Pressure-broadened spectral lines emitted by high-pressure arc lamps exhibit this characteristic. Designing such a system monochromatically could produce a disastrous result. In most cases, laser-based systems only need to be designed at the specific laser wavelength.
System packaging constraints are important to set at the outset of a design effort, if at all possible. These include length, diameter, weight, distance or clearance from the last surface to the image, and the location and space available for other system components.