Simulations of multifocal vision often provide only a stylized rendering of blurry and sharp regions. This study creates simulated scenes with realistic blur for several types of correction, including multifocal (MF) lenses.
A series of optotypes with the same angular subtense but varying distance is often used to illustrate MF lens performance. Here, for a more realistic rendering, a technique is developed in which a 3D scene consisting of multiple objects (a cell phone, a computer screen, and the outdoors), each at a distinct distance, is blurred. Each object is convolved with a depth-adjusted point spread function (PSF), and the three objects are recombined to form the simulated scene. Occlusion of background objects by the foreground is taken into account, and pupil size is controlled. Simulated images were produced for distance, near, monovision, ring-type bifocal, and two types of MF lens corrections.
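The per-layer blur step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the disc-shaped geometric PSF, the pixels-per-diopter scaling, and the function names (`defocus_psf`, `blur_layer`) are all assumptions; a real simulation would derive the PSF from the measured or modeled wavefront of each correction.

```python
import numpy as np
from scipy.signal import fftconvolve

def defocus_psf(defocus_diopters, pupil_mm=3.0, size=15):
    """Toy geometric-blur PSF: a uniform disc whose radius grows with
    defocus and pupil size. A wavefront-derived PSF would replace this."""
    # Hypothetical scaling: blur-circle radius in pixels per diopter of defocus.
    radius = max(0.5, abs(defocus_diopters) * pupil_mm)
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    psf = (x**2 + y**2 <= radius**2).astype(float)
    return psf / psf.sum()  # normalize so the PSF conserves image energy

def blur_layer(layer, object_diopters, focus_diopters, pupil_mm=3.0):
    """Convolve one depth layer with the PSF for its residual defocus,
    i.e. the difference between the object vergence and the eye's focus."""
    psf = defocus_psf(object_diopters - focus_diopters, pupil_mm)
    return fftconvolve(layer, psf, mode="same")
```

For an MF lens, each depth layer would be blurred with the PSF of the full multifocal pupil rather than a single-focus disc, which is what produces the contrast effects noted in the conclusions.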
Assumptions regarding the scene composition are used. In principle, each object point in the scene has a unique PSF that must be computed and added independently to the final simulation. By assuming several separate depth layers within the scene, within each of which the PSF is nearly constant, the simulation can be greatly accelerated. Occlusion of background objects by foreground objects is handled by blurring the deepest layer first, masking it with the nearer layers, and then repeating for the next deepest layer. Images are presented for the range of vision correction modalities.
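The back-to-front occlusion procedure above can be sketched as a compositing loop. This is a hedged sketch under stated assumptions: the layer representation (image plus binary occupancy mask) and the function name `composite_scene` are illustrative, and the blurred mask is used as a soft alpha so that a defocused foreground partially reveals the background at its edges.

```python
import numpy as np
from scipy.signal import fftconvolve

def composite_scene(layers, psfs):
    """Back-to-front compositing of pre-segmented depth layers.

    layers: list of (image, mask) pairs ordered far to near; mask is 1
            where the object occupies a pixel, 0 elsewhere.
    psfs:   matching list of normalized PSFs, one per layer.
    """
    scene = np.zeros_like(layers[0][0])
    for (img, mask), psf in zip(layers, psfs):
        # Blur only the pixels this layer owns, and blur its mask too.
        blurred = fftconvolve(img * mask, psf, mode="same")
        blurred_mask = np.clip(fftconvolve(mask.astype(float), psf, mode="same"), 0, 1)
        # The nearer layer occludes what is behind it; the blurred mask
        # gives soft edges where a defocused foreground spreads.
        scene = scene * (1.0 - blurred_mask) + blurred
    return scene
```

Processing the deepest layer first means each nearer layer simply overwrites (occludes) the accumulated scene wherever its blurred mask is nonzero, matching the masking order described in the text.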
3D scene simulation is a useful tool for analyzing the performance of various types of presbyopic correction. The scenes provide realistic comparisons of common visual situations, such as reading a cell phone or computer screen and viewing distant objects. The simulations illustrate the benefits of MF lenses, such as improved near vision, and demonstrate the contrast effects that occur with these lenses. They are also useful for patient education and for comparing different designs.