PySilSub: An open-source Python toolbox for implementing the method of silent substitution in vision and nonvisual photoreception research.
J Vis; 23(7): 10, 2023 Jul 03.
Article in English | MEDLINE | ID: mdl-37450287
ABSTRACT
The normal human retina contains several classes of photosensitive cell: rods for low-light vision, three cone classes for daylight vision, and intrinsically photosensitive retinal ganglion cells (ipRGCs) expressing melanopsin for non-image-forming functions, including pupil control, melatonin suppression, and circadian photoentrainment. The spectral sensitivities of the photoreceptors overlap significantly, which means that most lights will stimulate all photoreceptors to varying degrees. The method of silent substitution is a powerful tool for stimulating individual photoreceptor classes selectively and has found much use in research and clinical settings. The main hardware requirement for silent substitution is a spectrally calibrated light stimulation system with at least as many primaries as there are photoreceptors under consideration. Device settings that will produce lights to selectively stimulate the photoreceptor(s) of interest can be found using a variety of analytic and algorithmic approaches. Here we present PySilSub (https://github.com/PySilentSubstitution/pysilsub), a novel Python package for silent substitution featuring flexible support for individual colorimetric observer models (including human and mouse observers), multiprimary stimulation devices, and solving silent substitution problems with linear algebra and constrained numerical optimization. The toolbox is registered with the Python Package Index and includes example data sets from various multiprimary systems. We hope that PySilSub will facilitate the application of silent substitution in research and clinical settings.
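The linear-algebra formulation mentioned in the abstract can be sketched as follows. This is not the PySilSub API, just an illustration of the underlying idea: with as many primaries as photoreceptor classes, the change in primary weights that modulates one photoreceptor while silencing the others is the solution of a square linear system. The matrix values below are made up for illustration.

```python
import numpy as np

# Hypothetical setup: 4 primaries and 4 photoreceptor classes
# (e.g., S, M, L cones and melanopsin). Entry A[i, j] is the excitation
# of photoreceptor i produced by one unit of primary j; real values would
# come from the device's spectral calibration and observer model.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(4, 4))

# Target: modulate melanopsin (index 3) while leaving the three cone
# classes (indices 0-2) unchanged, i.e., "silent".
target = np.array([0.0, 0.0, 0.0, 1.0])

# Solve A @ delta_w = target for the primary-weight modulation.
delta_w = np.linalg.solve(A, target)

# Verify: cones see (numerically) zero change, melanopsin a unit change.
print(np.allclose(A @ delta_w, target))  # True
```

When there are more primaries than photoreceptors, or when device gamut limits apply, the problem is instead posed as a constrained numerical optimization, which is the second solving approach the abstract names.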
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Main subject: Color Vision / Light
Limits: Animals / Humans
Language: English
Journal: J Vis
Journal subject: Ophthalmology
Year: 2023
Document type: Article
Country of affiliation: United Kingdom