In recent years, optical music recognition (OMR) has been developed extensively, particularly for mobile devices that require fast processing to recognize and play the notes in captured sheet-music images live. However, most techniques developed thus far have focused on instrumental playback and have neglected lyric extraction, which is time consuming and affects the accuracy of OMR tools. Lyric text adds complexity to the page layout, particularly when lyrics touch or overlap musical symbols, in which case separating them from each other is very difficult. In addition, distortion in captured musical images makes the lyric lines curved or skewed, which further complicates lyric extraction. This paper proposes a new approach in which lyrics are detected and extracted quickly and effectively. First, to resolve the distortion problem, the image is rectified using information from the stave lines and bar lines. Then, using a frequency-count method and projection-based heuristic rules, the lyric areas are extracted, cases where symbols touch the lyrics are resolved, and most of the musical notation is preserved even when lyrics and notes overlap. Our algorithm demonstrated short processing times and remarkable accuracy on two test datasets of printed Korean musical scores: the first set included three hundred scanned musical images; the second had two hundred musical images captured by a digital camera.
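The paper's exact method is not reproduced here, but the projection-based step the abstract describes can be illustrated with a minimal, generic sketch: compute a horizontal projection (ink-pixel count per row) of a binarized score image and group dense rows into bands, which can then be classified as staves or lyric lines. All function names and the toy image below are our own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def horizontal_projection(binary_img):
    """Count foreground (ink) pixels in each row of a binarized image."""
    return binary_img.sum(axis=1)

def find_bands(projection, threshold):
    """Group consecutive rows whose ink count exceeds `threshold`.

    Returns (start_row, end_row) intervals; in a score image the dense
    bands correspond to staves and lyric lines, which a later stage could
    separate (e.g. staves contain long horizontal runs, lyrics do not).
    """
    bands, start = [], None
    for row, count in enumerate(projection):
        if count > threshold and start is None:
            start = row
        elif count <= threshold and start is not None:
            bands.append((start, row - 1))
            start = None
    if start is not None:
        bands.append((start, len(projection) - 1))
    return bands

# Toy 12x8 binary "image": rows 2-4 and 8-9 carry ink, mimicking two bands.
img = np.zeros((12, 8), dtype=int)
img[2:5, :] = 1
img[8:10, 2:6] = 1
print(find_bands(horizontal_projection(img), threshold=0))  # [(2, 4), (8, 9)]
```

A real pipeline would binarize the rectified image first and pick the threshold from the projection histogram rather than using a fixed zero.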
Cong Minh DINH
Chonnam National University
Hyung Jeong YANG
Chonnam National University
Guee Sang LEE
Chonnam National University
Soo Hyung KIM
Chonnam National University
Cong Minh DINH, Hyung Jeong YANG, Guee Sang LEE, Soo Hyung KIM, "Fast Lyric Area Extraction from Images of Printed Korean Music Scores" in IEICE Transactions on Information and Systems,
vol. E99-D, no. 6, pp. 1576-1584, June 2016, doi: 10.1587/transinf.2015EDP7296.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2015EDP7296/_p
@ARTICLE{e99-d_6_1576,
author={Cong Minh DINH and Hyung Jeong YANG and Guee Sang LEE and Soo Hyung KIM},
journal={IEICE Transactions on Information and Systems},
title={Fast Lyric Area Extraction from Images of Printed Korean Music Scores},
year={2016},
volume={E99-D},
number={6},
pages={1576-1584},
keywords={},
doi={10.1587/transinf.2015EDP7296},
ISSN={1745-1361},
month={June},}
TY - JOUR
TI - Fast Lyric Area Extraction from Images of Printed Korean Music Scores
T2 - IEICE Transactions on Information and Systems
SP - 1576
EP - 1584
AU - Cong Minh DINH
AU - Hyung Jeong YANG
AU - Guee Sang LEE
AU - Soo Hyung KIM
PY - 2016
DO - 10.1587/transinf.2015EDP7296
JO - IEICE Transactions on Information and Systems
SN - 1745-1361
VL - E99-D
IS - 6
JA - IEICE Transactions on Information and Systems
Y1 - June 2016
ER -