Results 1 - 5 of 5
1.
PLoS Comput Biol; 18(7): e1010202, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35834439

ABSTRACT

Science students increasingly need programming and data science skills to be competitive in the modern workforce. However, at our university (San Francisco State University), until recently almost no biology, biochemistry, or chemistry students (hereafter bio/chem students) completed a minor in computer science. To change this, a new minor in computing applications, informally known as the Promoting Inclusivity in Computing (PINC) minor, was established in 2016. Here, we present the lessons we learned from our experience as a set of 10 rules. The first 3 rules focus on setting up the program so that it appeals to students in biology, chemistry, and biochemistry. Rules 4 through 8 focus on how the program's classes are taught to keep them engaging for our students and to provide the support students need. The last 2 rules are about what happens "behind the scenes" of running a program that involves many people from several departments.


Subjects
Students, Humans, San Francisco, Universities, Workforce
2.
J Technol Pers Disabil; 11: 192-208, 2023 May.
Article in English | MEDLINE | ID: mdl-38516032

ABSTRACT

The You Described, We Archived dataset (YuWA) is a collaboration between San Francisco State University and The Smith-Kettlewell Eye Research Institute. It includes audio description (AD) data collected worldwide from 2013 to 2022 through YouDescribe, an accessibility tool for adding audio descriptions to YouTube videos. YouDescribe, a web-based audio description tool with a companion iOS viewing app, has a community of 12,000+ average annual visitors and approximately 3,000 volunteer describers, and has produced over 5,500 audio-described YouTube videos. Blind and visually impaired (BVI) viewers request videos, which are saved to a wish list; volunteer describers then select a video, write a script, record audio clips, and edit clip placement to create an audio description. The AD tracks are stored separately, posted for public view at https://youdescribe.org/, and played together with the YouTube video. The YuWA audio description data, paired with describer and viewer metadata and the collection timeline, supports a wide range of research applications, including artificial intelligence, machine learning, sociolinguistics, audio description, video understanding, video retrieval, and video-language grounding tasks.
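A minimal sketch of how described clips might be lined up against a video timeline for playback, assuming a hypothetical clip schema (start time, duration, script text, inline vs. pause-and-play placement); the actual YuWA/YouDescribe data format is not reproduced here.

    # Hypothetical clip schema for illustration only; not the real YuWA format.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ADClip:
        start_time: float   # seconds into the YouTube video where the clip plays
        duration: float     # length of the recorded description clip, in seconds
        text: str           # describer's script for this clip
        inline: bool        # True: play over the video; False: pause the video first

    def build_timeline(clips: List[ADClip]) -> List[ADClip]:
        """Sort clips by placement so a player can step through them in order."""
        return sorted(clips, key=lambda c: c.start_time)

    if __name__ == "__main__":
        clips = [
            ADClip(12.5, 3.0, "A hiker crests a ridge at sunrise.", inline=True),
            ADClip(4.0, 6.5, "Title card: trail safety basics.", inline=False),
        ]
        for clip in build_timeline(clips):
            mode = "inline" if clip.inline else "pause-and-play"
            print(f"{clip.start_time:7.1f}s  ({mode})  {clip.text}")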

3.
Article in English | MEDLINE | ID: mdl-38545917

ABSTRACT

How well a caption fits an image can be difficult to assess because caption quality is subjective. What is a good caption? We investigate this problem by focusing on image-caption ratings and by generating high-quality datasets from human feedback with gamification. We validate the datasets by showing a higher level of inter-rater agreement and by using them to train custom machine learning models to predict new ratings. Our approach outperforms previous metrics: the resulting datasets are more easily learned and are of higher quality than other currently available datasets for image-caption rating.
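As an illustration of the inter-rater agreement idea, the sketch below scores agreement as the mean pairwise Pearson correlation across raters on synthetic rating data; the paper's actual agreement statistic, models, and dataset are not reproduced here.

    # Synthetic example: higher rater consistency yields higher mean pairwise correlation.
    import numpy as np
    from itertools import combinations

    def mean_pairwise_correlation(ratings: np.ndarray) -> float:
        """ratings: (n_raters, n_items) matrix of scores for the same items."""
        corrs = [
            np.corrcoef(ratings[a], ratings[b])[0, 1]
            for a, b in combinations(range(ratings.shape[0]), 2)
        ]
        return float(np.mean(corrs))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        true_quality = rng.uniform(1, 5, size=50)          # latent caption quality (made up)
        noisy = lambda sigma: true_quality + rng.normal(0, sigma, size=50)
        careful_raters = np.stack([noisy(0.3) for _ in range(4)])
        casual_raters = np.stack([noisy(1.5) for _ in range(4)])
        print("agreement (careful):", round(mean_pairwise_correlation(careful_raters), 3))
        print("agreement (casual): ", round(mean_pairwise_correlation(casual_raters), 3))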

4.
IEEE Trans Vis Comput Graph; 14(2): 302-12, 2008.
Article in English | MEDLINE | ID: mdl-18192711

ABSTRACT

We present a novel approach for latency-tolerant delivery of visualization and rendering results in which the client-side display frame rate is independent of source dataset size, image size, visualization technique, or rendering complexity. Our approach delivers pre-rendered, multiresolution images to a remote user as they navigate through different viewpoints and different visualization or rendering parameters. We employ demand-driven, tiled, multiresolution image streaming and prefetching to make efficient use of available bandwidth while providing the maximum resolution the user can perceive from a given viewpoint. Since image data is the only input to our system, our approach is generally applicable to any visualization or graphics rendering application capable of generating image files in an ordered fashion. In our implementation, an ordinary web server provides on-demand images to a remote custom client application, which uses client-pull to obtain and cache only those images required to fulfill the interaction needs. The main contributions of this work are: (1) an architecture for latency-tolerant, remote delivery of precomputed imagery suitable for use with any visualization or rendering application capable of producing images in an ordered fashion; (2) a performance study showing the impact of diverse network environments and different tunable system parameters on end-to-end system performance in terms of deliverable frames per second.
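The client-pull pattern described above can be sketched roughly as follows, assuming a hypothetical tile URL scheme (view/level/row/column) and a placeholder server address; the paper's actual implementation, URL layout, and prefetch policy differ.

    # Sketch only: BASE_URL is a placeholder, so calling fetch_tile requires a real server.
    from functools import lru_cache
    from urllib.request import urlopen

    BASE_URL = "http://example.org/prerendered"   # placeholder image server

    @lru_cache(maxsize=512)                       # simple client-side tile cache
    def fetch_tile(view: int, level: int, row: int, col: int) -> bytes:
        """Pull one pre-rendered tile on demand; cached tiles are never re-requested."""
        url = f"{BASE_URL}/view{view:04d}/L{level}/{row}_{col}.jpg"
        with urlopen(url) as resp:
            return resp.read()

    def visible_tiles(level: int, rows: range, cols: range):
        """Enumerate only the tiles needed to fill the current viewport."""
        return [(level, r, c) for r in rows for c in cols]

    def render_viewpoint(view: int, level: int, rows: range, cols: range):
        # Fetch what the viewport needs now...
        tiles = [fetch_tile(view, lvl, r, c) for lvl, r, c in visible_tiles(level, rows, cols)]
        # ...and prefetch the next-coarser level as a cheap guess at an upcoming zoom-out.
        if level > 0:
            coarse_rows = range(rows.start // 2, rows.stop // 2 + 1)
            coarse_cols = range(cols.start // 2, cols.stop // 2 + 1)
            for lvl, r, c in visible_tiles(level - 1, coarse_rows, coarse_cols):
                fetch_tile(view, lvl, r, c)
        return tiles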

5.
Invert Neurosci; 13(1): 45-55, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23007685

ABSTRACT

We have developed a machine vision-based method for automatically tracking deformations in the body wall to monitor ecdysis behaviors in the hornworm, Manduca sexta. The method uses naturally occurring features on the animal's body (spiracles) and is highly accurate (>95% tracking success). Moreover, it is robust to unanticipated changes in the animal's position and in lighting, and if tracking of specific features is lost, it can be reestablished within a few cycles without input from the user. We have paired our tracking technique with electromyography and have also compared our in vivo results to fictive motor patterns recorded from isolated nerve cords. We found no major difference in the cycle periods of contractions during naturally occurring ecdysis compared with ecdysis initiated prematurely by injection of the peptide ecdysis-triggering hormone, and we confirmed that the ecdysis period in vivo is statistically similar to that of the fictive motor pattern.
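As a rough illustration of the cycle-period measurement, the sketch below estimates inter-contraction intervals from a tracked feature's displacement trace via peak detection on a synthetic signal; the paper's actual spiracle-tracking pipeline and analysis are not reproduced.

    # Synthetic displacement trace stands in for real tracking output.
    import numpy as np
    from scipy.signal import find_peaks

    def cycle_periods(displacement: np.ndarray, fs: float) -> np.ndarray:
        """Return inter-peak intervals (seconds) of a roughly periodic displacement signal."""
        peaks, _ = find_peaks(displacement, distance=int(0.5 * fs))  # peaks >= 0.5 s apart
        return np.diff(peaks) / fs

    if __name__ == "__main__":
        fs = 30.0                                  # assumed video frame rate, Hz
        t = np.arange(0, 60, 1 / fs)
        trace = np.sin(2 * np.pi * t / 4.2) + 0.1 * np.random.default_rng(1).normal(size=t.size)
        periods = cycle_periods(trace, fs)
        print(f"mean cycle period: {periods.mean():.2f} s over {periods.size} cycles")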


Subjects
Manduca/physiology, Molting/physiology, Video Recording, Animals, Insect Hormones/physiology