LLM-enhanced idea generation: data-driven morphological analysis with LDA and NuNER

  • Jihyun Park
  • Youngjung Geum

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Technology opportunity discovery (TOD) plays a critical role in firms’ success, leading to extensive research on methodologies for identifying promising technologies. Morphological analysis has been regarded as a prominent method for this purpose, as it systematically derives innovative ideas through creative combinations. However, most previous studies have relied on subjective approaches. Although some analytical and data-driven approaches have been attempted, limited research has addressed how to systematically extract relevant information from large-scale data and how to construct a data-driven morphological matrix using advanced methods such as large language models (LLMs). In response, this study proposes a data-driven approach to morphological analysis for discovering technological opportunities by leveraging LLM-based models to support decision making. Specifically, Latent Dirichlet Allocation (LDA) is used for dimension extraction, and NuNER is applied for value extraction. To evaluate the effectiveness of the proposed framework, a case study was conducted in the context of smart TVs. The results demonstrate that a systematic morphological matrix can be constructed and utilized based on patent data. This approach enables companies to explore innovative ideas through various combinations within the morphological matrix, thereby facilitating the discovery of technological opportunities.
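The core idea-generation step described above, combining values across the dimensions of a morphological matrix, can be sketched as follows. This is a minimal illustration only: the dimensions and values below are hypothetical smart-TV examples standing in for the paper's LDA-derived dimensions and NuNER-extracted values, not outputs of the actual pipeline.

```python
from itertools import product

# Hypothetical morphological matrix for a smart-TV case. In the paper's
# framework, dimensions would come from LDA topic modeling of patent data
# and values from NuNER-based entity extraction; here they are illustrative.
morphological_matrix = {
    "display": ["OLED", "micro-LED", "rollable panel"],
    "interaction": ["voice control", "gesture recognition", "remote app"],
    "content": ["streaming", "cloud gaming", "broadcast"],
}

def generate_ideas(matrix):
    """Enumerate every cross-dimensional value combination as a candidate idea."""
    dims = list(matrix)
    for values in product(*(matrix[d] for d in dims)):
        yield dict(zip(dims, values))

ideas = list(generate_ideas(morphological_matrix))
# 3 dimensions with 3 values each yield 3 * 3 * 3 = 27 candidate combinations.
```

Each generated combination (e.g. a rollable-panel TV with gesture recognition for cloud gaming) is one candidate technological opportunity; in practice such candidates would then be screened for novelty and feasibility.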

Original language: English
Article number: 111426
Journal: Computers and Industrial Engineering
Volume: 208
DOIs
State: Published - Oct 2025

Keywords

  • LDA
  • Morphology analysis
  • Natural-language processing
  • NuNER
  • Technology opportunity discovery
