Towards Generating Authentic Human-Removed Pictures in Crowded Places Using a Few-Second Video.
Lee, Juhwan; Lee, Euihyeok; Kang, Seungwoo.
Affiliation
  • Lee J; Lululab Inc., Seoul 06054, Republic of Korea.
  • Lee E; Department of Computer Science and Engineering, Graduate School, Korea University of Technology and Education, Cheonan 31253, Republic of Korea.
  • Kang S; School of Computer Science and Engineering, Korea University of Technology and Education, Cheonan 31253, Republic of Korea.
Sensors (Basel) ; 24(11)2024 May 28.
Article em En | MEDLINE | ID: mdl-38894277
ABSTRACT
When visiting famous and iconic landmarks, we may want to take photos of them. However, such sites are usually crowded, and capturing the landmark alone, without people, can be challenging. This paper aims to automatically remove people from a picture and produce a natural image of the landmark alone. To this end, it presents Thanos, a system for generating authentic human-removed images in crowded places. It is designed to produce high-quality images at reasonable computational cost using short video clips of a few seconds. For this purpose, a multi-frame-based recovery region minimization method is proposed. The key idea is to aggregate information partially available across multiple image frames so as to minimize the area that must be restored. The evaluation results show that the proposed method outperforms alternatives: it achieves lower Fréchet Inception Distance (FID) scores with comparable processing latency. Images produced by Thanos also achieve a lower FID score than those of existing applications: Thanos scores 242.8, while Retouch-photos and the Samsung object eraser score 249.4 and 271.2, respectively.
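The core idea of recovery region minimization can be illustrated with a short sketch. This is not the paper's implementation; it assumes frames are already aligned and that per-frame person masks come from some external detector (both assumptions, not details from the abstract). For each pixel, if any frame shows it person-free, that frame's value is used directly; only pixels occluded in every frame are left for inpainting, which is exactly the region the method tries to minimize.

```python
import numpy as np

def composite_background(frames, person_masks):
    """Aggregate person-free pixels across aligned video frames.

    frames: list of HxWx3 uint8 arrays (assumed pre-aligned).
    person_masks: list of HxW bool arrays, True where a person is detected.
    Returns (composite, residual): the composited image and a bool mask of
    pixels occluded in every frame, i.e. the region still needing inpainting.
    """
    h, w, _ = frames[0].shape
    composite = np.zeros((h, w, 3), dtype=np.uint8)
    filled = np.zeros((h, w), dtype=bool)
    for frame, mask in zip(frames, person_masks):
        # person-free pixels in this frame that are not yet filled
        take = ~mask & ~filled
        composite[take] = frame[take]
        filled |= take
    residual = ~filled  # occluded in all frames -> must be restored
    return composite, residual

# Tiny synthetic example: two 2x2 frames, a person covering pixel (0,0)
# in both frames and pixel (1,1) only in the second frame.
f0 = np.full((2, 2, 3), 10, dtype=np.uint8)
f1 = np.full((2, 2, 3), 20, dtype=np.uint8)
m0 = np.array([[True, False], [False, False]])
m1 = np.array([[True, False], [False, True]])
comp, residual = composite_background([f0, f1], [m0, m1])
```

Only pixel (0,0) remains in `residual`; the other three pixels are recovered directly from the frames, so a generative model would only have to restore a single pixel's worth of area.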
Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Sensors (Basel) Year of publication: 2024 Document type: Article
