Generating Curated Photo Collection using Shot Type Pattern

Dongwann Kang, Yangmi Lim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Smartphones are widely used to take photos in everyday life, so the need to organize the photos that pile up on them is rapidly increasing. To address this, recent smartphones offer curated collections of user-taken photos based on significant people, places, and events, but these mostly generate simple sequences of photos with background music. In this paper, we propose an automatic method for generating curated photo collections that utilizes the shot type patterns commonly used in commercial film making. With it, users can enjoy photo collections organized in a professional cinematographic manner, which also provides an improved experience from an aesthetic point of view.
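The paper itself does not include implementation details, but the core idea of the abstract and keywords (classifying photos by cinematographic shot type with a CNN, then ordering them according to a shot type pattern) can be illustrated with a minimal sketch. In the sketch below, the shot type labels are assumed to come from a CNN image classifier trained elsewhere, and the label names ("long", "medium", "close-up") and the example pattern are hypothetical choices for illustration, not the paper's actual classes or patterns.

```python
# Illustrative sketch (not the paper's implementation): arrange photos whose
# shot types were already predicted by a CNN classifier into a repeating
# cinematographic shot-type pattern.

from collections import defaultdict, deque
from typing import Dict, List


def arrange_by_pattern(shot_labels: Dict[str, str],
                       pattern: List[str]) -> List[str]:
    """Order photos so they follow the given shot-type pattern,
    cycling through the pattern until every photo is placed."""
    # Group photo file names by their predicted shot type.
    buckets = defaultdict(deque)
    for photo, shot in shot_labels.items():
        buckets[shot].append(photo)

    sequence = []
    remaining = len(shot_labels)
    while remaining > 0:
        placed = 0
        for shot in pattern:
            if buckets[shot]:
                sequence.append(buckets[shot].popleft())
                placed += 1
                remaining -= 1
        if placed == 0:
            # Photos whose shot type is not in the pattern: append at the end.
            for queue in buckets.values():
                sequence.extend(queue)
            break
    return sequence


if __name__ == "__main__":
    # Hypothetical predictions, e.g. produced by a fine-tuned CNN classifier.
    predictions = {
        "beach.jpg": "long",
        "family.jpg": "medium",
        "smile.jpg": "close-up",
        "street.jpg": "long",
        "portrait.jpg": "close-up",
    }
    # One common film-making pattern: establish the scene, then move closer.
    print(arrange_by_pattern(predictions, ["long", "medium", "close-up"]))
```

The cycling loop simply interleaves the available shot types in the pattern's order, which mirrors the abstract's goal of producing a sequence that feels cinematographic rather than a flat chronological slideshow.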

Original language: English
Title of host publication: ICTC 2022 - 13th International Conference on Information and Communication Technology Convergence
Subtitle of host publication: Accelerating Digital Transformation with ICT Innovation
Publisher: IEEE Computer Society
Pages: 2173-2175
Number of pages: 3
ISBN (Electronic): 9781665499392
DOIs
State: Published - 2022
Event: 13th International Conference on Information and Communication Technology Convergence, ICTC 2022 - Jeju Island, Korea, Republic of
Duration: 19 Oct 2022 - 21 Oct 2022

Publication series

Name: International Conference on ICT Convergence
Volume: 2022-October
ISSN (Print): 2162-1233
ISSN (Electronic): 2162-1241

Conference

Conference: 13th International Conference on Information and Communication Technology Convergence, ICTC 2022
Country/Territory: Korea, Republic of
City: Jeju Island
Period: 19/10/22 - 21/10/22

Keywords

  • cinematographic shot pattern
  • convolutional neural networks
  • image classification
  • photo curation
