simEye: computer-based simulation of visual perception under various eye defects using Zernike polynomials
Wolfgang Fink, Daniel Micol
Abstract
We describe a computer eye model that allows for aspheric surfaces and a three-dimensional computer-based ray-tracing technique to simulate optical properties of the human eye and visual perception under various eye defects. Eye surfaces, such as the cornea, eye lens, and retina, are modeled or approximated by a set of Zernike polynomials that are fitted to input data for the respective surfaces. A ray-tracing procedure propagates light rays using Snell's law of refraction from an input object (e.g., digital image) through the eye under investigation (i.e., eye with defects to be modeled) to form a retinal image that is upside down and left-right inverted. To obtain a first-order realistic visual perception without having to model or simulate the retina and the visual cortex, this retinal image is then back-propagated through an emmetropic eye (e.g., Gullstrand exact schematic eye model with no additional eye defects) to an output screen of the same dimensions and at the same distance from the eye as the input object. Visual perception under instances of emmetropia, regular astigmatism, irregular astigmatism, and (central symmetric) keratoconus is simulated and depicted. In addition to still images, the computer ray-tracing tool presented here (simEye) permits the production of animated movies. These developments may have scientific and educational value. This tool may facilitate the education and training of both the public, for example, patients before undergoing eye surgery, and those in the medical field, such as students and professionals. Moreover, simEye may be used as a scientific research tool to investigate optical lens systems in general and the visual perception under a variety of eye conditions and surgical procedures such as cataract surgery and laser assisted in situ keratomileusis (LASIK) in particular.

1. Introduction

As light from objects enters the eye (Fig. 1), it undergoes refraction, governed by Snell’s law, at the transition between the outer air and the anterior surface of the cornea. The light front undergoes further refraction as it passes through the posterior surface of the cornea and enters the anterior chamber of the eye. The light front then passes through the pupillary opening of the iris to enter the eye lens or crystalline lens. This is followed by several additional refractions taking place within the eye lens itself. Upon exiting the eye lens, the light front travels through the vitreous cavity, ultimately striking the retina of the eye (thereby forming an upside down, left-right inverted, and warped retinal image), where it is received by the photoreceptors—the rods and cones—and converted to electrochemical signals. These electrochemical signals are subsequently processed and compressed by the neural network cascade of the retina, at which point the “retinal image” no longer exists as an image, but rather as a spatiotemporal neural spike pattern. This spatiotemporal neural spike pattern is then transmitted via the optic nerve to the visual cortex, where it is further processed at multiple levels and merged with other sensory inputs, ultimately leading to what we would call “visual perception” in a rather abstract (virtual) manner [see, for example, chapters 1 to 5 in Ref. 1 for an interesting (popular) description of the concept of visual perception (cognition)]. Therefore, we would like to emphasize that visual perception is profoundly different from retinal images [and point-spread functions (PSF) for that matter, which are a useful metric for optical engineering], and that we do not attempt to simulate the retinal processing cascade, let alone the visual cortex, in this work. After all, even a successful simulation of these processing schemes would yield only a spatiotemporal neural spike pattern, but not a visible image that would resemble “visual perception.” In contrast, the visual perception simulation technique discussed here is admittedly not biologically motivated. However, it is a rather straightforward technique that yields results people can relate to, judging from their own personal experience (e.g., subjects who have one emmetropic eye and one eye with a certain defect can confirm the visual experience by viewing the original image source and the simulated outcome with the respective other eye).

Fig. 1 Schematic view of normal human eye.

Computer ray-tracing permits simulations of the optical properties of the human eye (e.g., Refs. 2, 3, 4, 5, 6, 7) and of visual perception under various eye defects.4 Three-dimensional (3D) scenes or two-dimensional patterns of point sources, including digital photographs (images), serve as “light-giving” input objects for a ray-tracing simulation. In ophthalmic ray-tracing, the path of the light rays is calculated between these input objects and the retina of a computer eye model (e.g., Gullstrand’s exact schematic eye model,8, 9 Fig. 2, or more elaborate eye models, e.g., Refs. 10, 11, 12, 13, 14) using Snell’s law of refraction. The image formed on the retina is upside down, left-right inverted, and warped due to the curvature of the retina. To obtain a first-order visual perception without having to simulate the neural function (processing) of the retina and the visual cortex, Fink et al.4 devised a back-projection method for the retinal image through an idealized eye (i.e., an emmetropic eye with only minor aberrations but no additional eye defects) to an output screen of the same dimensions and at the same distance from the eye as the input object (Fig. 3).

Fig. 2 Gullstrand exact schematic eye model (iris added) consisting of six refractive spherical surfaces (anterior and posterior cornea and four crystalline lens surfaces) and one nonrefractive spherical surface (retina) (Refs. 8, 9).

Fig. 3 Schematic view of the 3D ray-tracing technique (Ref. 4) used in the visual perception simulation environment simEye.

We have previously used Gullstrand’s exact schematic eye model with improved parameters (Figs. 2 and 3),4 with six spherical refractive surfaces, a spherical retina, and an added iris (for determining the degree of both on-axis and off-axis line-of-sight aberrations to be simulated), to study visual perception under various eye conditions such as myopia (nearsightedness), hyperopia (farsightedness), cataract caused by microvacuoles, dislocated intraocular lens after cataract surgery, and refractive scotomata (visual field defects) caused by the usage of correction lenses in automated perimetry (a visual field testing method).4, 15, 16, 17, 18 While a qualitatively valuable, analytically calculable, and successful test environment for studying visual perception, Gullstrand’s exact schematic eye model also has its limitations, predominantly because of the sphericity of its surfaces and the resulting very limited customizability.

To obtain more realistic and quantitative results, aspheric surfaces must be considered (e.g., Refs. 10, 11, 12). Ray tracing with aspheric surfaces can still be analytically calculable: Langenbucher et al.,19 for example, report on an algebraic method for ray tracing through the optical system of an eye with aspheric surfaces. Their method is restricted to second-order surfaces (quadric surfaces). In Sec. 2, we introduce a new ophthalmic ray-tracing tool for the simulation of visual perception, termed simEye,20 and discuss its underlying mathematical framework, which uses Zernike polynomials2, 21, 22, 23 to extend the ray-tracing process to include “arbitrary,” nonspherical surfaces. In Sec. 3, we give examples of simulations, obtained with simEye, of visual perception under instances of emmetropia, regular astigmatism, irregular astigmatism, and (central symmetric) keratoconus—eye conditions that are characterized by aspheric corneal surfaces.

2. Methods

2.1. Surface Modeling

Zernike polynomials are a set of orthogonal polynomials used in geometrical optics for representing different kinds of eye aberrations.2 More recently, their use for modeling or fitting of the refractive surfaces responsible for the aberrations has been proposed (e.g., Ref. 23). In cylindrical coordinates, given by ρ (radius), θ (polar angle), and z (height), the Zernike polynomials can be defined as functions of ρ and θ:2

Eq. 1

$$Z_n^m(\rho,\theta) = R_n^m(\rho)\cos(m\theta), \qquad Z_n^{-m}(\rho,\theta) = R_n^m(\rho)\sin(m\theta),$$

with

Eq. 2

$$R_n^m(\rho) = \sum_{s=0}^{(n-m)/2} \frac{(-1)^s\,(n-s)!}{s!\left(\frac{n+m}{2}-s\right)!\left(\frac{n-m}{2}-s\right)!}\,\rho^{\,n-2s}.$$
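As a concrete illustration of Eqs. 1 and 2, the following is a minimal C sketch (not part of the simEye package; function names are illustrative) that evaluates the radial polynomial and the even and odd Zernike terms for given indices n, m and coordinates ρ, θ.

```c
/* Minimal sketch: radial polynomial of Eq. 2 and Zernike terms of Eq. 1. */
#include <math.h>

static double factorial(int k)
{
    double f = 1.0;
    for (int i = 2; i <= k; ++i) f *= i;
    return f;
}

/* Radial polynomial R_n^m(rho), Eq. 2; requires n >= m >= 0 and n - m even. */
double zernike_radial(int n, int m, double rho)
{
    double r = 0.0;
    for (int s = 0; s <= (n - m) / 2; ++s) {
        double term = (s % 2 ? -1.0 : 1.0) * factorial(n - s) /
                      (factorial(s) * factorial((n + m) / 2 - s) *
                                      factorial((n - m) / 2 - s));
        r += term * pow(rho, (double)(n - 2 * s));
    }
    return r;
}

/* Even and odd Zernike terms of Eq. 1. */
double zernike_even(int n, int m, double rho, double theta)
{
    return zernike_radial(n, m, rho) * cos(m * theta);
}

double zernike_odd(int n, int m, double rho, double theta)
{
    return zernike_radial(n, m, rho) * sin(m * theta);
}
```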
We assume that cylindrical coordinates are provided for a set of points on the surface to be fitted, and hence, each point on this surface is specified by a radius ρ, an angle θ, and a height z. In the case of eye surfaces, such surface data can either originate from biometric measurements performed on real eyes or from model eyes. The goal is to obtain a surface that best approximates the surface defined by the input points. This is done by calculating the height S(ρ,θ) from the Zernike polynomial fit with ρ and θ as input parameters

$$S(\rho_i,\theta_i) \approx \sum_{k=0}^{P} C_k Z_k(\rho_i,\theta_i), \qquad i = 1,\ldots,N,$$

where N is the number of input points, P+1 is the number of polynomials (from index 0 to index P), S(ρi,θi) is the height of the input point at (ρi,θi), and Ck is the coefficient for the k’th Zernike polynomial. We perform a least-squares minimization of the average distance between the surface to be fitted and the fitting surface created by the Zernike polynomials (e.g., Ref. 22), expressed as the mean squared error

$$\Delta \equiv \frac{1}{N}\sum_{i=1}^{N}\left[S(\rho_i,\theta_i) - \sum_{k=0}^{P} C_k Z_k(\rho_i,\theta_i)\right]^2.$$

The optimal fit (minimal Δ) is obtained when all P+1 partial derivatives ∂Δ/∂Cl vanish:

$$\frac{\partial\Delta}{\partial C_l} = -\frac{2}{N}\sum_{i=1}^{N}\left[S(\rho_i,\theta_i) - \sum_{k=0}^{P} C_k Z_k(\rho_i,\theta_i)\right] Z_l(\rho_i,\theta_i) = 0.$$

From this system of equations, we can extract the values of all the coefficients Ck for the Zernike polynomials via matrix inversion and multiplication (e.g., Refs. 22, 23, 24, 25).
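Setting the partial derivatives to zero leads to the linear normal equations M c = b, with M_lk = Σ_i Z_k(ρ_i,θ_i) Z_l(ρ_i,θ_i) and b_l = Σ_i S(ρ_i,θ_i) Z_l(ρ_i,θ_i). The C sketch below solves this system; it is an illustration rather than the actual simEye routine, and it assumes a hypothetical zernike_eval(k, ρ, θ) that returns the k’th polynomial of a chosen single-index ordering.

```c
/* Sketch: least-squares Zernike coefficients via the normal equations,
 * solved with plain Gauss-Jordan elimination (illustrative, not simEye code). */
#include <math.h>
#include <stdlib.h>

extern double zernike_eval(int k, double rho, double theta);  /* assumed helper */

/* Solve M x = b in place (M is n x n, row-major); returns 0 on success,
 * with the solution left in b. */
static int solve_linear(double *M, double *b, int n)
{
    for (int col = 0; col < n; ++col) {
        int piv = col;                                   /* partial pivoting */
        for (int r = col + 1; r < n; ++r)
            if (fabs(M[r * n + col]) > fabs(M[piv * n + col])) piv = r;
        if (fabs(M[piv * n + col]) < 1e-12) return -1;   /* (near-)singular  */
        for (int c = 0; c < n; ++c) {
            double t = M[col * n + c];
            M[col * n + c] = M[piv * n + c];
            M[piv * n + c] = t;
        }
        double tb = b[col]; b[col] = b[piv]; b[piv] = tb;
        for (int r = 0; r < n; ++r) {                    /* eliminate column */
            if (r == col) continue;
            double f = M[r * n + col] / M[col * n + col];
            for (int c = col; c < n; ++c) M[r * n + c] -= f * M[col * n + c];
            b[r] -= f * b[col];
        }
    }
    for (int i = 0; i < n; ++i) b[i] /= M[i * n + i];
    return 0;
}

/* Least-squares coefficients C_0..C_P for N surface points (rho, theta, S). */
int fit_zernike(const double *rho, const double *theta, const double *S,
                int N, int P, double *C)
{
    int n = P + 1, status;
    double *M = calloc((size_t)n * n, sizeof(double));
    double *b = calloc((size_t)n, sizeof(double));
    if (!M || !b) { free(M); free(b); return -1; }

    for (int i = 0; i < N; ++i)                          /* build normal equations */
        for (int l = 0; l < n; ++l) {
            double Zl = zernike_eval(l, rho[i], theta[i]);
            b[l] += S[i] * Zl;
            for (int k = 0; k < n; ++k)
                M[l * n + k] += zernike_eval(k, rho[i], theta[i]) * Zl;
        }

    status = solve_linear(M, b, n);
    if (status == 0)
        for (int k = 0; k < n; ++k) C[k] = b[k];
    free(M); free(b);
    return status;
}
```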

This mathematical formulation has been implemented as a software package, termed simEye, in standard American National Standards Institute C and runs on UNIX platforms, such as Linux and Mac OS X. The surface input data are arranged in three columns (one column for each coordinate) that define the surface in cylindrical coordinates. simEye then attempts to fit the given surface starting with the user-specified number of Zernike polynomials. The measure chosen for evaluating the quality of the surface fit is the root-mean-squared error (RMSE) defined as (other error measures may be applied as well)

$$\mathrm{RMSE} \equiv \sqrt{\Delta} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left[S(\rho_i,\theta_i) - \sum_{k=0}^{P} C_k Z_k(\rho_i,\theta_i)\right]^2}.$$
If the RMSE between the given surface and the calculated fitting surface exceeds a maximum accuracy error prespecified by the user, the number of polynomials will be incremented by one and the software will try to fit the surface with the new, extended set of polynomials. This loop is repeated until the RMSE reaches the user-defined accuracy threshold, and at this point, the coefficients for each polynomial for the best surface fit are returned to the user. It should be cautioned that there is a possibility of noise fitting (overfitting) if the accuracy threshold is chosen too aggressively. This manifests itself in numerical instabilities and inefficient convergence behavior of the fitting procedure. In general, the accuracy threshold should be governed by the magnitude of the visual effect to be studied [e.g., visual effect of laser assisted in situ keratomileusis (LASIK) ablation pattern versus visual effect of astigmatism].
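A hedged sketch of the accuracy-driven loop just described, reusing the hypothetical fit_zernike() and zernike_eval() helpers from the previous sketch; the upper bound P_max and the console output are illustrative additions.

```c
/* Sketch of the adaptive fitting loop: add one polynomial at a time until
 * the RMSE of the fit drops below the user-defined accuracy threshold. */
#include <math.h>
#include <stdio.h>

extern double zernike_eval(int k, double rho, double theta);
extern int fit_zernike(const double *rho, const double *theta, const double *S,
                       int N, int P, double *C);

static double fit_rmse(const double *rho, const double *theta, const double *S,
                       int N, int P, const double *C)
{
    double delta = 0.0;
    for (int i = 0; i < N; ++i) {
        double fit = 0.0;
        for (int k = 0; k <= P; ++k) fit += C[k] * zernike_eval(k, rho[i], theta[i]);
        delta += (S[i] - fit) * (S[i] - fit);
    }
    return sqrt(delta / N);
}

/* Returns the number of polynomials used, or -1 if the threshold was not met. */
int fit_until_accurate(const double *rho, const double *theta, const double *S,
                       int N, int P_start, int P_max, double rmse_threshold,
                       double *C /* at least P_max + 1 entries */)
{
    for (int P = P_start; P <= P_max; ++P) {
        if (fit_zernike(rho, theta, S, N, P, C) != 0) return -1;
        double rmse = fit_rmse(rho, theta, S, N, P, C);
        printf("%d polynomials, RMSE = %g\n", P + 1, rmse);
        if (rmse <= rmse_threshold) return P + 1;   /* accuracy reached */
    }
    return -1;   /* not reached; lowering the threshold further risks overfitting */
}
```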

2.2. Computer Ray-Tracing

In the ray-tracing procedure used here (for more details see Ref. 4), light rays are propagated from an input object (e.g., digital image) through the eye under investigation (i.e., the eye with modeled eye defects), taking Snell’s law of refraction into account, to form a warped retinal image (due to the retinal curvature) that is upside down and left-right inverted. This is apparently not how we visually experience (perceive) the world. To obtain a first-order realistic visual perception without having to simulate the neural function (processing) of the retina and the visual cortex as mentioned in Sec. 1, this retinal image is then back-propagated through a defect-free, idealized eye (e.g., Gullstrand’s exact schematic eye with no additional eye defects, Fig. 2) to an output screen of the same dimensions and at the same distance from the eye as the input object (Fig. 3).4, 15 This ray-tracing procedure unwarps the retinal image, flips it right-side up and, from a visual perception perspective, projects the image to where it originates (i.e., “we see things where they are”).

With the geometric-optics ray tracing described above, it is not sufficient to simply flip the retinal image both vertically and horizontally in order to obtain a first-order simulation of visual perception, for the following reasons: (1) the retinal image is warped due to the curvature of the retina (eyeball), in contrast to our experience of visual perception; (2) while the source image is, for computer-simulation purposes, a digital image and hence consists of well-defined, discrete, and equidistant individual pixels, the computer-simulated retinal image is characterized by a subpixel resolution and therefore no longer adheres to well-defined, equidistant individual pixels. A horizontal and vertical flip operation on the retinal image would only be possible if one were to “average out” the subpixel resolution in order to arrive at a pixel-based (i.e., grid-based) retinal image. Such an averaging procedure would produce artifacts and would, in addition, not remove the warping of the retinal image; in other words, it would not produce a realistic experience of visual perception.

Because light rays could fail to hit the output image due to refraction while propagating through both the eye with modeled defects and the defect-free eye for back-propagation, the path of the light rays is reversed for practical purposes, making the output image the starting point for the ray-tracing procedure (Fig. 3).4, 15 This means that the light rays are propagated from the pixels of the output image through Gullstrand’s exact schematic eye (i.e., defect-free eye) and subsequently through the eye with the modeled eye defects toward the input source image. This guarantees that all pixels of the output image will be filled with color information from the input source image, thereby reducing void light ray calculations.
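The reversed ray path described above can be summarized in a schematic sketch (all types and helpers, such as Ray, trace_through_eye(), and sample_source_image(), are hypothetical; this illustrates the control flow, not the simEye source):

```c
/* Schematic sketch of the reversed ray-tracing loop: rays start at the output
 * pixels, pass through the defect-free back-projection eye, then through the
 * eye with modeled defects, and finally sample the input source image. */
typedef struct { double origin[3]; double dir[3]; } Ray;
typedef struct { double r, g, b; } Color;

extern Ray   make_ray_from_output_pixel(int px, int py, int sample);
extern int   trace_through_eye(const void *eye, Ray *ray);  /* refracts in place; 0 = ok */
extern Color sample_source_image(const Ray *ray);

void render_output_image(const void *emmetropic_eye, const void *defect_eye,
                         int width, int height, int rays_per_pixel,
                         Color *output /* width * height entries */)
{
    for (int py = 0; py < height; ++py)
        for (int px = 0; px < width; ++px) {
            Color acc = {0.0, 0.0, 0.0};
            int hits = 0;
            for (int s = 0; s < rays_per_pixel; ++s) {
                Ray ray = make_ray_from_output_pixel(px, py, s);
                /* back-projection eye first, then the eye under investigation */
                if (trace_through_eye(emmetropic_eye, &ray) != 0) continue;
                if (trace_through_eye(defect_eye, &ray) != 0) continue;
                Color c = sample_source_image(&ray);
                acc.r += c.r; acc.g += c.g; acc.b += c.b;
                ++hits;
            }
            Color *out = &output[py * width + px];
            if (hits > 0) { out->r = acc.r / hits; out->g = acc.g / hits; out->b = acc.b / hits; }
            else          { out->r = out->g = out->b = 0.0; }
        }
}
```

Averaging the per-pixel samples is an assumption made here for illustration; the paper only states that each output pixel is filled with color information from the source image.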

To describe the trajectory of an individual light ray, several mathematical and optical calculations must be performed for each eye surface along its path. These can be summarized as

  • Calculation of the intersection point between the light ray and the corresponding eye surface.

  • Calculation of the corresponding surface normal at the intersection point.

  • Calculation of the direction of the refracted light ray by applying Snell’s law of refraction.

2.2.1. Calculation of the intersection point between the light ray and the corresponding eye surface

A light ray is described in a linear algebraic, parameterized form with a point of origin P0 and a (normalized) direction vector d multiplied by a scalar parameter λ

$$\mathbf{g}(\lambda) = \mathbf{P}_0 + \lambda\,\mathbf{d}.$$
Different values of λ define all points along the trajectory of the light ray. Recalling the mathematical formulation of the eye surfaces, a surface based on Zernike polynomials is defined as follows:
$$S(\rho,\theta) = \sum_{i=0}^{P} C_i Z_i(\rho,\theta),$$
or in vector form
$$\mathbf{S}(\rho,\theta) = \begin{pmatrix} \rho\cos\theta \\ \rho\sin\theta \\ \sum_{i=0}^{P} C_i Z_i(\rho,\theta) \end{pmatrix}.$$
To determine the intersection point between the light ray and the eye surface, the following expression must be solved for (λ,ρ,θ) (details are in Ref. 26):
$$\mathbf{g}(\lambda) = \mathbf{S}(\rho,\theta) \;\Longleftrightarrow\; \mathbf{g}(\lambda) - \mathbf{S}(\rho,\theta) = \mathbf{0}.$$
This can be numerically accomplished with the 3D Newton-Raphson method.25, 26
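The intersection step can be sketched as follows in C (an illustration, not the routine of Ref. 26): the residual F(λ,ρ,θ) = g(λ) − S(ρ,θ) is driven to zero with Newton-Raphson iterations, where surface_point() and its partial derivatives are assumed to be evaluated from the fitted Zernike expansion, and the 3×3 Newton step is solved with Cramer’s rule.

```c
/* Sketch of the 3D Newton-Raphson intersection of a ray with a fitted surface. */
#include <math.h>

extern void surface_point (double rho, double theta, double out[3]);  /* S(rho,theta) */
extern void surface_drho  (double rho, double theta, double out[3]);  /* dS/drho      */
extern void surface_dtheta(double rho, double theta, double out[3]);  /* dS/dtheta    */

static double det3(const double J[3][3])
{
    return J[0][0] * (J[1][1] * J[2][2] - J[1][2] * J[2][1])
         - J[0][1] * (J[1][0] * J[2][2] - J[1][2] * J[2][0])
         + J[0][2] * (J[1][0] * J[2][1] - J[1][1] * J[2][0]);
}

/* Returns 0 on convergence; lambda, rho, theta are updated in place. */
int intersect_ray_surface(const double P0[3], const double d[3],
                          double *lambda, double *rho, double *theta)
{
    for (int it = 0; it < 50; ++it) {
        double S[3], Sr[3], St[3], F[3], J[3][3];
        surface_point(*rho, *theta, S);
        surface_drho(*rho, *theta, Sr);
        surface_dtheta(*rho, *theta, St);
        for (int i = 0; i < 3; ++i) {
            F[i]    = P0[i] + (*lambda) * d[i] - S[i];   /* residual g - S */
            J[i][0] = d[i];                              /* dF/dlambda     */
            J[i][1] = -Sr[i];                            /* dF/drho        */
            J[i][2] = -St[i];                            /* dF/dtheta      */
        }
        if (fabs(F[0]) + fabs(F[1]) + fabs(F[2]) < 1e-9) return 0;
        double D = det3(J);
        if (fabs(D) < 1e-15) return -1;                  /* singular Jacobian */
        double Jc[3][3], delta[3];                       /* Cramer's rule     */
        for (int c = 0; c < 3; ++c) {
            for (int i = 0; i < 3; ++i)
                for (int j = 0; j < 3; ++j) Jc[i][j] = (j == c) ? F[i] : J[i][j];
            delta[c] = det3(Jc) / D;
        }
        *lambda -= delta[0];  *rho -= delta[1];  *theta -= delta[2];
    }
    return -1;                                           /* no convergence */
}
```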

2.2.2. Calculation of the corresponding surface normal at the intersection point

With the intersection point determined, we proceed to calculate the normal of the corresponding surface at this point. This is necessary to obtain the new direction vector of the refracted light ray in the next step.

The surface normal is defined as follows:

$$\text{surface normal} \equiv \frac{\dfrac{\partial \mathbf{S}}{\partial \rho} \times \dfrac{\partial \mathbf{S}}{\partial \theta}}{\left\lVert \dfrac{\partial \mathbf{S}}{\partial \rho} \times \dfrac{\partial \mathbf{S}}{\partial \theta} \right\rVert},$$
where ∂S/∂ρ is the partial derivative of the fitting surface S with respect to ρ, and ∂S/∂θ is the partial derivative of the fitting surface S with respect to θ. These partial derivatives can be obtained analytically by differentiating Eqs. 1, 2 with respect to ρ and θ.
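A short C sketch of this step (illustrative; the partial derivatives are assumed to come from the analytic differentiation of the Zernike fit mentioned above):

```c
/* Sketch: unit surface normal as the normalized cross product of the
 * partial derivatives dS/drho and dS/dtheta at the intersection point. */
#include <math.h>

extern void surface_drho  (double rho, double theta, double out[3]);
extern void surface_dtheta(double rho, double theta, double out[3]);

void surface_normal(double rho, double theta, double n[3])
{
    double a[3], b[3];
    surface_drho(rho, theta, a);
    surface_dtheta(rho, theta, b);
    n[0] = a[1] * b[2] - a[2] * b[1];   /* cross product a x b */
    n[1] = a[2] * b[0] - a[0] * b[2];
    n[2] = a[0] * b[1] - a[1] * b[0];
    double len = sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
    n[0] /= len;  n[1] /= len;  n[2] /= len;   /* normalize */
}
```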

2.2.3. Calculation of the direction of the refracted light ray by applying Snell’s law of refraction

To determine the direction of the refracted light ray, we apply Snell’s law of refraction

$$\frac{\sin\alpha}{\sin\beta} = \frac{n_2}{n_1},$$
where α is the angle between the original light ray (before refraction) and the surface normal at the intersection point, β is the angle between the refracted light ray and the normal, and n1 and n2 are the refraction indices on either side of the refracting surface. Since the surface normal is already known, one needs only to calculate the new refracted angle and from that the new direction vector for the light ray (details are in Ref. 26). Once the new direction vector is obtained, it only remains to replace the previous one with the new one and to set the point of origin of the light ray as the intersection point with the last surface.
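For illustration, the refracted direction can be computed with the standard vector form of Snell’s law shown below; this is a common formulation and not necessarily the exact procedure of Ref. 26. The incident direction d and the surface normal are assumed to be unit vectors, with the normal oriented toward the incident side.

```c
/* Sketch: refracted unit direction t from incident unit direction d, unit
 * surface normal (pointing against d), and refractive indices n1 -> n2. */
#include <math.h>

/* Returns 0 on success, -1 on total internal reflection. */
int refract(const double d[3], const double normal[3],
            double n1, double n2, double t[3])
{
    double eta   = n1 / n2;
    double cos_i = -(d[0] * normal[0] + d[1] * normal[1] + d[2] * normal[2]);
    double k     = 1.0 - eta * eta * (1.0 - cos_i * cos_i); /* cos^2(beta) */
    if (k < 0.0) return -1;                                 /* total internal reflection */
    double cos_t = sqrt(k);
    for (int i = 0; i < 3; ++i)
        t[i] = eta * d[i] + (eta * cos_i - cos_t) * normal[i];
    return 0;
}
```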

3. Results

We have performed computer ray-tracing simulations, using simEye,20 of the visual perception under instances of the following four eye conditions:

  • 1. emmetropia (normal vision)

  • 2. regular astigmatism

  • 3. irregular astigmatism

  • 4. central symmetric keratoconus.

All four eye conditions above are characterized by aspheric corneal surfaces.

To create an eye model to be used with the simEye ray-tracing procedure, we have, without loss of generality, rebuilt Gullstrand’s exact schematic eye model (Fig. 2). We have fitted its spherical refractive surfaces (with the exception of the anterior corneal surface, which is to be modeled according to the respective eye condition) and the spherical retina with respective sets of Zernike polynomials, both for distance viewing (see Table 1 for surface parameters8) and for maximal accommodation (see Table 2 for surface parameters8). We would like to emphasize that any of these surfaces can be replaced by surfaces that are fitted to more elaborate and realistic eye models (e.g., Refs. 10, 11, 12, 13), to otherwise modeled data, or to actual biometric data27 (see also Sec. 4).
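For concreteness, the following minimal sketch (not simEye code) shows how one of the spherical Gullstrand surfaces from Tables 1 and 2 could be sampled into the three-column (ρ, θ, z) input format described in Sec. 2.1; the aperture radius, the sampling counts, and the convention that z is measured from the surface vertex are assumptions made for this illustration.

```c
/* Sketch: sample a spherical surface of radius R into (rho, theta, z) triples
 * as input for the Zernike surface fit; z is the sag from the surface vertex. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    const double R = 7.7;         /* e.g., anterior corneal radius (mm), Table 1 */
    const double rho_max = 5.0;   /* assumed aperture radius (mm)                */
    const int n_rho = 50, n_theta = 72;

    for (int i = 1; i <= n_rho; ++i) {
        double rho = rho_max * i / n_rho;
        double z = R - sqrt(R * R - rho * rho);    /* spherical sag */
        for (int j = 0; j < n_theta; ++j) {
            double theta = 2.0 * M_PI * j / n_theta;
            printf("%f %f %f\n", rho, theta, z);   /* three-column output */
        }
    }
    return 0;
}
```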

Table 1

Gullstrand exact schematic eye parameters for focus at infinity (⩾5 m) (Ref. 8).

          Position (mm)   Radius (mm)   Refractive index
Cornea    0               7.7           1.376
          0.5             6.8           1.336
Lens      3.6             10.0          1.385
          4.146           7.911         1.406
          6.565           -5.76         1.385
          7.2             -6.0          1.336
Retina    24.0            -11.5

Table 2

Gullstrand exact schematic eye parameters for maximal accommodation (10.23 cm) (Ref. 8).

          Position (mm)   Radius (mm)   Refractive index
Cornea    0               7.7           1.376
          0.5             6.8           1.336
Lens      3.2             5.33          1.385
          3.8725          2.655         1.406
          6.5725          -2.655        1.385
          7.2             -5.33         1.336
Retina    24.0            -11.5

It is important to note that, in the following, the correct viewing distance for the simulated visual perceptions depicted in Figs. 4, 5, 6, 7, 8 is a few centimeters from the picture plane with one eye covered (i.e., monocular viewing). The reason for this is that in the following simulations (see Table 3) the source image has a height of 12 m and a width of 12 m and is viewed from a distance of 5 m (distance viewing) with the eye to be simulated (i.e., the eye with the eye defect). Therefore, we simulate a visual field of about 50 deg radially, which allows for both on-axis and off-axis line-of-sight aberrations. The dimension of all simulated images is 500×500 pixels, including the original image source (Fig. 4, top). The simulation results for maximal accommodation are not shown. The diameter of the pupil is user-adjustable in simEye and was set to 4 mm for all depicted simulation results (see Table 3).
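For reference, the quoted field radius follows directly from this viewing geometry (a consistency check, not an additional result): the source image extends 6 m from the line of sight in each direction at a viewing distance of 5 m, so

$$\theta_{\max} = \arctan\!\left(\frac{6\,\mathrm{m}}{5\,\mathrm{m}}\right) \approx 50.2\ \mathrm{deg}.$$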

  • 1. (a) Emmetropic (normal) visual perception with spherical cornea (Fig. 4, bottom): We used the following expression to generate a spherical anterior corneal surface (the sag expressions for all four corneal models are also sketched in code after this list):

    $$z = r_0 - \sqrt{r_0^2 - \rho^2} \quad \text{for } 0 \le \rho \le r_0,$$

    with r0 = 7.7 (Tables 1, 2). The center of the simulated perception is clear as opposed to the periphery (Fig. 4, bottom), compared to the original image source (Fig. 4, top) used in the simEye simulation. The apparent blurriness in the periphery is, in this case, the result of the assumed sphericity of the cornea, which leads to spherical aberration. However, it resembles the naturally occurring blurry perception in our peripheral vision due to the reduced retinal receptor density. This can be further demonstrated by approaching Fig. 4, bottom, with one eye covered. As one gets closer to the image, the peripheral blurriness seems to disappear.

    (b) Emmetropic visual perception with aspheric cornea (Fig. 5): We used the following expression to generate the anterior corneal surface (after Ref. 23):

    $$z = \frac{r_0}{p} - \sqrt{\frac{r_0^2}{p^2} - \frac{\rho^2}{p}} \quad \text{for } 0 \le \rho \le r_0,$$

    with r0 = 7.7 (Tables 1, 2) and p = 0.3. Because of the asphericity of the anterior corneal surface (Fig. 5, top), the peripheral simulated perception is improved (Fig. 5, bottom), that is, it is less blurry, compared to the peripheral simulated perception with a spherical cornea (Fig. 4, bottom).

  • 2. Visual perception under one instance of regular astigmatism (Fig. 6): We used the following expression to generate the anterior corneal surface (from Ref. 23):

    $$z = \frac{r_a}{p} - \sqrt{\frac{r_a^2}{p^2} - \frac{\rho^2}{p}}, \quad \text{with} \quad r_a = \frac{1}{\dfrac{1}{r_h} + \left(\dfrac{1}{r_v} - \dfrac{1}{r_h}\right)\sin^2\theta},$$

    and p = 0.3, rh = 7.7 (Tables 1, 2), and rv = 5.0. Because the vertical radius of curvature, rv, of the anterior corneal surface is significantly shorter than the horizontal one, rh, an instance of regular astigmatism is introduced (Fig. 6, top), which manifests itself as an arc-like, structural image distortion along the vertical (y axis) with the horizontal x axis being the symmetry axis (Fig. 6, bottom).

  • 3. Visual perception under one instance of irregular astigmatism (Fig. 7): We used the following expression to generate the anterior corneal surface (from Ref. 23):

    $$z = \frac{r_a}{p} - \sqrt{\frac{r_a^2}{p^2} - \frac{\rho^2}{p}}, \quad \text{with} \quad r_a = \begin{cases} \dfrac{1}{\dfrac{1}{r_h} + \left(\dfrac{1}{r_v} - \dfrac{1}{r_h}\right)\sin^2\theta} & \text{for } 0 \le \theta < \pi \\[3ex] \dfrac{1}{\dfrac{1}{r_v} + \left(\dfrac{1}{r_h} - \dfrac{1}{r_v}\right)\sin^2\theta} & \text{for } \pi \le \theta < 2\pi \end{cases}$$

    and p = 0.3, rh = 7.7 (Tables 1, 2), and rv = 5.0. The anterior corneal surface for this particular instance of irregular astigmatism was obtained by applying the above formula to the nasal half of the cornea and the swapped set of radii of curvature to the temporal half. The resulting point cloud of input data was subsequently fitted with a set of 51 Zernike polynomials, resulting in the anterior corneal surface depicted in the top part of Fig. 7. The resulting visual perception exhibits five distinct areas (Fig. 7, bottom): the upper left and right are characterized by a more arc-like, structural image distortion akin to the visual perception under regular astigmatism, whereas the lower left and right are characterized by a more Gaussian-type, fuzzy blur without any apparent structure to it. In the image center, a relatively undistorted viewing channel is visible.

  • 4. Visual perception under one instance of (central symmetric) keratoconus (Fig. 8): We used the following expression to generate the anterior corneal surface (from Ref. 23):

    $$z = \begin{cases} r_0 - \sqrt{r_0^2 - \rho^2} & \text{for } 0 \le \rho \le \rho_1 \\[1ex] r_0 - \sqrt{r_0^2 - \rho^2} + \dfrac{a}{2}\left\{1 - \cos\!\left[2\pi\left(\dfrac{\rho - \rho_1}{\rho_2 - \rho_1}\right)\right]\right\} & \text{for } \rho_1 < \rho < \rho_2 \\[1ex] r_0 - \sqrt{r_0^2 - \rho^2} & \text{for } \rho_2 \le \rho \le r_0 \end{cases}$$

    with a = 0.009, ρ1 = 1.5, ρ2 = 3.0, and r0 = 7.7 (Tables 1, 2). The visual perception under this particular instance of central symmetric keratoconus, that is, keratoconus symmetrically centered around the optical axis, exhibits three zones of varying degrees of distortion (Fig. 8, bottom): In the central region, image blurriness paired with slight image enlargement is exhibited because of the increased central corneal thickness (0 to 1 mm radially from the optical axis, Fig. 8, top) due to the central symmetric keratoconus compared to the “normal” central corneal thickness marked as a black line in Fig. 8, top. This central region is surrounded by an annular region (1 to 3 mm radially from the optical axis, Fig. 8, top) of image blurriness because of the reduced corneal thickness due to the central symmetric keratoconus compared to the normal corneal thickness in that region marked as a black line in Fig. 8, top. Surrounding this annular region is a peripheral region where the regular image blurriness due to the spherical aberration of the anterior corneal surface is exhibited (compare to Fig. 4, bottom).

Fig. 4 (Top) Original input/source image (500×500 pixels) for the simEye ray-tracing procedure; (bottom) emmetropic (normal) visual perception with spherical cornea.

Fig. 5 (Top) Aspheric (physiologically more realistic) cornea (gray) and spherical cornea (black); (bottom) emmetropic (normal) visual perception with aspheric cornea.

Fig. 6 (Top) Regular astigmatic cornea (gray) and spherical cornea (black); (bottom) visual perception under regular astigmatism.

Fig. 7 (Top) Irregular astigmatic cornea (gray) and spherical cornea (black); (bottom) visual perception under irregular astigmatism.

Fig. 8 (Top) Keratoconic cornea (gray) and spherical cornea (black); (bottom) visual perception under central symmetric keratoconus.

Table 3

Parameters and data for surface fitting (number of Zernike polynomials and RMSE of fit) and ray tracing with simEye (remaining columns). The computation times are based on an Apple PowerMac G5 Dual 2 GHz with 8 GB of RAM running Mac OS X Tiger. Only one CPU was used per ray-tracing simulation with simEye.

                                                Zernike        RMSE     Output picture    Distance from   Light rays   Run time per
Simulation                                      polynomials    of fit   dimensions (m)    the eye (m)     per pixel    still image (min)
Spherical emmetropic visual perception          4              0.0000   12.0 × 12.0       5.0             500          333
Aspherical emmetropic visual perception         16             0.0001   12.0 × 12.0       5.0             500          330
Visual perception under regular astigmatism     41             0.0010   12.0 × 12.0       5.0             500          370
Visual perception under irregular astigmatism   51             0.1600   12.0 × 12.0       5.0             500          382
Visual perception under keratoconus             49             0.0150   12.0 × 12.0       5.0             500          337

Table 3 summarizes the parameters, data, and results for surface fitting and ray tracing with simEye for all the eye conditions simulated above.

4. Discussion

The computer ray-tracing tool simEye presented here permits simulations of the optical properties of the human eye (both on-axis and off-axis line-of-sight aberrations). Further, it allows for a first-order approximation of the visual perception under various eye defects without the need for simulating the neural processing of the retina and the visual cortex. This is accomplished by back-projecting through an idealized eye the retinal image of an object or scene produced by an eye with a certain eye defect. Obviously, the choice of an unimpaired Gullstrand exact schematic eye model as the back-projecting idealized eye introduces spherical aberrations in addition to the visual effects produced by the eye with eye defects. However, simulations using a Gullstrand eye with an aspheric cornea [see case 1(b) in Sec. 3] as the back-projecting idealized eye show that this is a minor effect that does not impact the overall visual perception, in particular in the central visual field. A more sophisticated eye model with reduced spherical aberration could be employed to further reduce this side effect.

simEye, in contrast to earlier ray-tracing simulations,4, 15, 16, 17, 18 permits the introduction of arbitrary surfaces (e.g., aspheric surfaces), represented or fitted by a set of Zernike polynomials. Furthermore, the usefulness of Zernike polynomials for fitting actual surface data, in addition to wave-front data (optical aberrations),21, 22 is demonstrated, in agreement with Carvalho’s findings.23

It should be pointed out that, although the corneal surfaces considered in this study are mathematically modeled (i.e., somewhat artificial), the presented Zernike-based surface-fitting framework is equally applicable to realistic, biometrically measured data of any surface within the eye under investigation (e.g., Ref. 27). Furthermore, surfaces that do not exhibit symmetries with respect to the optical axis, such as tilted or laterally dislocated surfaces, can be fitted as well by expanding the surface-fitting framework to incorporate rotation and translation matrices (e.g., Ref. 24) that operate on the sets of Zernike polynomials used for the fits. For example, this would allow for the simulation of off-center, asymmetric keratoconus conditions. Moreover, additional surfaces can be introduced, necessary for more elaborate and realistic eye models (e.g., Refs. 11, 12). For example, this would allow for the simulation of a multishell crystalline lens with varying refractive index.

The Stiles-Crawford effect28 is currently not considered in simEye, since the neural function of the retinal receptors is not modeled. The Stiles-Crawford effect describes the angular dependence of retinal sensitivity. Light rays that enter the pupil near its center (axial light), and are thus nearly parallel to the retinal receptors, are more effective than oblique rays that enter the pupil near its margins (off-axis light). Therefore, light passing through the periphery of the pupil is less efficient at stimulating vision than light passing near the center of the pupil (i.e., axial light forms sharper images than off-axis light), which effectively increases the depth of focus. Since the Stiles-Crawford effect is not modeled, the spherical and aspherical aberrations exhibited in the periphery (i.e., at high eccentricities) of the simulated visual perceptions may be overestimated. One feasible way to mitigate the Stiles-Crawford effect within simEye is to choose a small pupil diameter for the simulations, because the Stiles-Crawford effect, like most aberrations, is more pronounced for larger pupils. Similarly, the contrast sensitivity function (CSF), sometimes also called visual acuity, is currently not considered in simEye. The CSF describes how sensitive we are to the various spatial frequencies of visual stimuli. If the spatial frequency of a visual stimulus is too high, we will no longer be able to recognize the stimulus pattern because of the limited number of photoreceptors in the retina. Since the density of photoreceptors in the retina drops exponentially toward higher eccentricities, the effect of the CSF, akin to the Stiles-Crawford effect, can be mitigated in simEye again by using a small pupil size for the simulations. Furthermore, chromatic aberration of the human eye is currently not considered in simEye.

The ray-tracing procedure outlined here is ideally suited for parallel processing on a cluster computer (or on computers accessible via the Internet, akin to, e.g., SETI@home, http://setiathome.ssl.berkeley.edu/), since each light ray can be calculated independently of the others. Thus, an almost perfectly linear speedup with the number of available central processing units (CPUs) can be expected for each still image to be generated with simEye. The same holds true for the generation of ray-traced movies,20 where each frame of a movie can be calculated independently of the others. Using simEye as a front end, we have been able to create an optimized software package that allows for the generation of animated movies at a rate of about 0.5 Hz, that is, one movie frame every 2 s, enabling near real-time performance.
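As an illustration of this embarrassingly parallel structure, the per-pixel loop could, for example, be distributed over the available CPUs of a shared-memory machine with OpenMP, as sketched below; trace_pixel() is a hypothetical helper that performs the reversed ray trace for one pixel, and this is not the parallelization scheme actually used for the movie generation described above.

```c
/* Sketch (OpenMP): every light ray is independent, so the pixel loop can be
 * split across CPUs, giving the near-linear speedup mentioned in the text. */
typedef struct { double r, g, b; } Color;

extern Color trace_pixel(const void *emmetropic_eye, const void *defect_eye,
                         int px, int py, int rays_per_pixel);

void render_parallel(const void *emmetropic_eye, const void *defect_eye,
                     int width, int height, int rays_per_pixel, Color *output)
{
    #pragma omp parallel for schedule(dynamic)
    for (int py = 0; py < height; ++py)
        for (int px = 0; px < width; ++px)
            output[py * width + px] =
                trace_pixel(emmetropic_eye, defect_eye, px, py, rays_per_pixel);
}
```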

simEye may have a wide range of applications in science, optics, and education. This tool may help educate (train) both the lay public, for example, patients before undergoing eye surgery, and medical personnel, such as medical students and professionals. Moreover, simEye may help simulate and investigate optical lens systems, such as cameras, telescopes, microscopes, and robotic vision systems. Furthermore, it may help study the visual perception through multifocal intraocular lenses and through intracorneal lenses. Finally, simEye may be used as a scientific research tool to investigate the visual perception under a variety of eye conditions, in addition to the ones presented here, and after various ophthalmic surgical procedures such as cataract surgery and LASIK (e.g., Refs. 29, 30).

Acknowledgments

The authors would like to thank Vincent McKoy for a critical review of the manuscript and Wayne Waller for invaluable discussions about the complexity and concept of visual perception. One of the authors (WF) would like to thank Erich W. Schmid for having introduced him to the field of ray tracing in ophthalmology. One of the authors (DM) would like to acknowledge the support of the Summer Undergraduate Research Fellowship (SURF) program at Caltech. This research and the fellowship were supported by National Science Foundation Grant No. EEC-0310723.

References

1. J. Hawkins and S. Blakeslee, On Intelligence (2004).
2. M. Born and E. Wolf, Principles of Optics, 5th ed., Pergamon Press, Oxford (1975).
3. J. E. Greivenkamp, J. Schwiegerling, and J. M. Miller, "Visual acuity modeling using optical raytracing of schematic eyes," Am. J. Ophthalmol. 120(2), 227–240 (1995).
4. W. Fink, A. Frohn, U. Schiefer, E. W. Schmid, and N. Wendelstein, "A ray tracer for ophthalmological applications," Ger. J. Ophthalmol. 5, 118–125 (1996).
5. I. Escudero-Sanz and R. Navarro, "Off-axis aberrations of a wide-angle schematic eye model," J. Opt. Soc. Am. A 16, 1881–1891 (1999).
6. J. Y. Huang and D. Moore, "Computer simulated human eye modeling with GRIN incorporated in the crystalline lens," J. Vision 4(11), 58a (2004).
7. R. Navarro, L. González, and J. L. Hernández, "Optics of the average normal cornea from general and canonical representations of its surface topography," J. Opt. Soc. Am. A 23, 219–232 (2006).
8. A. Gullstrand, Handbuch der Physiologischen Optik, 3rd ed., Vol. 1, p. 226, L. Voss, Hamburg and Leipzig (1909).
9. W. Trendelenburg, M. Monjé, I. Schmidt, and E. Schütz, Der Gesichtssinn, 2nd ed., Springer, Berlin, Göttingen, Heidelberg (1961).
10. W. Lotmar, "Theoretical eye model with aspherics," J. Opt. Soc. Am. 61, 1522–1529 (1971).
11. R. Navarro, J. Santamaria, and J. Bescos, "Accommodation-dependent model of the human eye with aspherics," J. Opt. Soc. Am. A 2, 1273–1281 (1985).
12. H.-L. Liou and N. A. Brennan, "Anatomically accurate, finite model eye for optical modeling," J. Opt. Soc. Am. A 14, 1684–1695 (1997).
13. A. Popiolek-Masajada and H. Kasprzak, "Model of the optical system of the human eye during accommodation," Ophthalmic Physiol. Opt. 22(3), 201 (2002).
14. M. F. Deering, "Perception: A photon accurate model of the human eye," 649–658 (2005).
15. W. Fink, A. Frohn, U. Schiefer, E. W. Schmid, N. Wendelstein, and E. Zrenner, "Visuelle Wahrnehmung bei hohen Ametropien—Computergestützte Simulation mittels strahlenoptischer Rechnungen," Klin. Monatsbl. Augenheilkd. 208, 472–476 (1996).
16. W. Fink, U. Schiefer, and E. W. Schmid, "Effect of dislocated and tilted correction glasses on perimetric outcome—A simulation using ray-tracing," 201–204 (1997).
17. W. Fink, A. Frohn, and E. W. Schmid, "Dislocation of intraocular lens analysed by means of ray tracing," Invest. Ophthalmol. Visual Sci. 37(3), 770 (1996).
18. W. Fink, "Project Eyemovie: Motion visualization of eye defects," Invest. Ophthalmol. Visual Sci. 42(4), S705 (2001); see also Project "Eyemovie," http://www.eyemovie.org.
19. A. Langenbucher, A. Viestenz, A. Viestenz, H. Brünner, and B. Seitz, "Ray tracing through a schematic eye containing second-order (quadric) surfaces using 4×4 matrix notation," Ophthalmic Physiol. Opt. 26(2), 180–188 (2006).
20. D. Micol and W. Fink, "SIMEYE: Computer-based simulation of visual perception under various eye defects," Invest. Ophthalmol. Visual Sci. 47, 578 (2006); see also http://autonomy.caltech.edu/biomedicine/project_simeye.html.
21. D. Malacara, J. M. Carpio-Valadéz, and J. J. Sánchez-Mondragón, "Wavefront fitting with discrete orthogonal polynomials in a unit radius circle," Opt. Eng. 29, 672 (1990). https://doi.org/10.1117/1.2168920
22. J. L. Rayces, "Least-squares fitting of orthogonal polynomials for the wave-aberration function," Appl. Opt. 31, 2223–2228 (1992).
23. L. A. Carvalho, "Accuracy of Zernike polynomials in characterizing optical aberrations and the corneal surface of the eye," Invest. Ophthalmol. Visual Sci. 46(6), 1915–1926 (2005).
24. D. F. Rogers and J. A. Adams, Mathematical Elements for Computer Graphics, 2nd ed., McGraw-Hill, New York (1990).
25. W. H. Press, B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling, Numerical Recipes in C: The Art of Scientific Computing, pp. 286–289, Cambridge University Press, Cambridge (1991).
26. W. Fink, "Refractive correction method for digital charge-coupled device-recorded Scheimpflug photographs by means of ray tracing," J. Biomed. Opt. 10(2), 024003 (2005). https://doi.org/10.1117/1.1899683
27. K. D. Singh, N. S. Logan, and B. Gilmartin, "Three-dimensional modeling of the human eye based on magnetic resonance imaging," Invest. Ophthalmol. Visual Sci. 47(6), 2272–2279 (2006).
28. W. S. Stiles and B. H. Crawford, "The luminous efficiency of rays entering the eye pupil at different points," Proc. R. Soc. London, Ser. B 112, 428–450 (1933).
29. D. Ortiz, J. M. Saiz, J. M. Gonzalez, J. N. Fernandez del Cotero, and F. Moreno, "Geometric ray tracing for design of customized ablation in laser in situ keratomileusis," J. Refract. Surg. 18(3 Suppl.), S327–S331 (2002).
30. D. Ortiz, J. M. Saiz, J. M. Gonzalez, J. I. Velarde, J. N. Fernandez del Cotero, and F. Moreno, "Optimization of an individualized LASIK surgery: Geometric ray tracing model," Arch. Soc. Esp. Oftalmol. 78(8), 443–449 (2003).
©(2006) Society of Photo-Optical Instrumentation Engineers (SPIE)
Wolfgang Fink and Daniel Micol "simEye: computer-based simulation of visual perception under various eye defects using Zernike polynomials," Journal of Biomedical Optics 11(5), 054011 (1 September 2006). https://doi.org/10.1117/1.2357734
Published: 1 September 2006
Keywords: Eye, Visualization, Computer simulations, Eye models, Monochromatic aberrations, Cornea, Zernike polynomials
