Score Visualizations: A Visual Exploration Tool for Sheet Music
Laney Light
Claire Arthur
llight7@gatech.edu
claire.arthur@gatech.edu
Georgia Institute of Technology
Atlanta, Georgia, USA
Figure 1: Sample visualizations.
ABSTRACT
Musicians often use online repositories to find sheet music. However, it can be challenging and time consuming to find pieces with the desired instrumentation, mood, tempo, difficulty, and other features. In addition, score previews and audio recordings are not always available before purchasing the sheet music. Because of these challenges, users may have to visit multiple sites to gather all the information they need prior to purchasing a score, and/or gravitate towards composers they are already familiar with or those highlighted in search results, potentially creating bias towards well-known composers. Our user research suggests that the text-dependent filtering tools on existing websites may be under-utilized. In this study we explored a visual-based approach to create a more intuitive and user-friendly experience.
In this paper we introduce a web-based interactive tool to explore a repository of scores based on features of the music. We created a visualization for each piece representing multiple high-level features (tempo, mode, note density, time signature, instrumentation, and difficulty level) in a single image. Each musical feature, calculated from the symbolic score representation using music21, maps to a different visual element (color, shape, icons, and pattern density). Users interact with the tool by selecting from a set of visualizations, each representing one piece. For the selected piece, users can view and download the score, listen to a MIDI version, and listen to an audio recording in an embedded Spotify player.
We discuss the tool's relevance for users browsing digital music collections as well as possible issues with implementing such a tool at scale. We also discuss the challenges that arise from comparing pieces that vary widely in terms of instrumentation and style.
CCS CONCEPTS
• Information systems → Data management systems; • Human-centered computing → Interaction design; Visualization.
KEYWORDS
visualization, interfaces, digital music repositories, search tools
1 INTRODUCTION
Musicians, music educators, and conductors often rely on online databases to obtain sheet music. These online sources became even more important in 2020, when traditional sources for printed sheet music such as brick-and-mortar music stores and publisher booths at conferences suddenly became more limited because of the COVID-19 pandemic. There are several free or low-cost sheet music sites such as MuseScore¹ and IMSLP², as well as publishers' websites and sheet music distributors such as J.W. Pepper³. However, searching through these websites to find an appropriate piece can be challenging. Each site has its own search and filtering features and may contain thousands of scores, and finding a score with the desired instrumentation, style, tempo, and other features can be difficult and time consuming.
There are several challenges contributing to this problem. First, the user may experience a "cold start" problem. When a search returns thousands of scores, figuring out where to start can be overwhelming. Second, it can be difficult to find information about a piece before purchasing or downloading the sheet music.
¹ https://musescore.com/
² https://imslp.org
³ https://www.jwpepper.com
On some sites, score previews may not be available before purchasing.
Even if a preview is available, it can take a considerable amount of time to review a score to determine if it fits the user's needs. Third, recommendation and similarity comparison tools, while common in the audio domain, are not widely available for scores.
Because of these challenges of searching for scores, users may gravitate toward composers they are already familiar with. Some sites such as MuseScore highlight the most popular composers/artists, influencing users' browsing choices. This could introduce bias, with popular composers continuing to be promoted while lesser-known composers are buried in the search results. If the user finds a piece they like, it can be difficult to find a similar piece by a different composer because of the lack of similarity and recommendation tools. The text-based filtering options available on most sites include features such as time period, instrumentation, and sometimes difficulty level. However, our user research suggests these filters may be under-utilized. In addition, there are few filtering options available for other features of the music such as tempo, time signature, and mood.
To avoid reliance on text-based filters and allow the user to focus on musical features when exploring scores, we approached the problem from a visual perspective. From a user experience perspective, visuals are often preferred over excessive text. Our goal is to allow the user to compare features of pieces at a glance, quickly determining whether a piece is of interest to them and meets their needs.
To test this visual-based approach, we created a visualization of each piece from a repository of scores representing classical music written for a variety of instruments. Aiming for a simple, abstract visual style, we mapped high-level features of scores such as tempo, time signature, and instrumentation to visual design elements such as color, shape, and icons. We then created an interactive tool to navigate a corpus of scores using these visualizations.
Potential use cases for this tool include: 1) categorization and
organization of score databases, 2) improving the user experience
of searching for scores, and 3) promoting lesser-known composers
by focusing on the features of a score rather than the popularity of
the composer.
2 BACKGROUND
2.1 Music Data Visualization
In a recent survey paper, Khulusi et al. [5] described the different ways in which visualizations have been used for musical data, focusing on "non-traditional" abstract representations that have been used for specific tasks or research questions. Some of the open problems the authors described are 1) the need for interdisciplinary collaborations with graphic art and visual design and 2) a lack of "collection-wide analysis tools for musical scores," as most corpus tools tend to focus on audio data. These open problems are areas we explore with this project.
Many design approaches have been used to visualize scores with abstract colors and shapes. However, most of these tend to be complex and display individual notes or phrases. For example, Martin Wattenberg's Arc Diagram [15] uses concentric arcs to illustrate patterns. The piano-roll-style visualizations of Stephen Malinowski's Animation Machine⁴ use a video format to display the visualizations synchronized to the audio. Nicholas Rougeux's visualizations created for OpenScore⁵ display every note in a piece in a static image, with lower notes in the center and higher pitches at the edges of the circle. These images convey detailed information about a piece, whereas a simpler visualization of higher-level features would be more effective for the purpose of browsing a database and comparing multiple pieces.
Several visualizations or visual-based interfaces have been designed for organizing and categorizing music, but they have generally focused on audio features. For example, MusicRainbow [8] allows the user to scroll through a collection of music displayed as a circular rainbow, where tracks with similar audio features are grouped together on the color wheel. Similarly, Zhu and Lu's [4] interface allows users to navigate a database of audio tracks using a scatter plot of perceived tempo (based on average tempo and other features such as average onset duration) and timbre (bright to dark). These tools share our goal of interactive, visual navigation of a database of music. However, to our knowledge, no similar methods have been applied to scores.
2.2 Musical and Visual Features
Although there has been extensive research on audio similarity, Schedl et al. [10] discussed the importance of often-overlooked user-centric methods in music information retrieval, noting that human perception of similarity is influenced by a wide range of individual factors. When humans are asked to rate audio similarity subjectively, the highly variable results suggest that people focus on different features when assessing similarity, as evidenced by Flexer's [3] findings from the MIREX study. This indicates that it is important to include multiple musical features in our visualizations, to account for the multiple ways in which users may perceive pieces to be similar; for example, some may focus mainly on tempo, while others focus more on instrumentation.
Several studies have suggested some natural perceptual mappings of visualizations to musical or audio features, and these mappings are sometimes tied to emotional associations. For example, participants who were asked to draw visual representations of musical stimuli often used visual height to represent frequency and line thickness to represent amplitude (Küssner [6]). Sievers et al. [11] demonstrated that the spectral centroid of audio stimuli was associated with the characteristics of shapes and with emotional arousal (high-centroid sounds were associated with spiky shapes and high-arousal emotions such as anger, and low-centroid sounds with rounded shapes and low-arousal emotions). Color has also been shown to have associations with musical features. For example, Palmer et al. [7] showed that faster music in a major key was associated with more saturated, lighter, yellower colors, and slower music in a minor key was associated with less saturated, darker, bluer colors. These associations were mediated by the participants' emotional associations with the music. These findings suggest that color could be employed as a meaningful representation of features that convey the emotion of a piece, such as the tempo and mode.
⁴ https://musanim.com
⁵ Example: https://musescore.com/openscore/scores/5733014
Several prior studies influenced our decisions regarding the general appearance and style of our visualizations. For example, Verschaffel et al. [14] noted that children use a wide variety of visual elements such as color, brightness, words, and pictographs to represent music, and we sought to incorporate several of these elements into our images. Tan & Kelly [13], in one of the few studies to examine visual representations of complete musical compositions, found that most participants chose to use abstract visualizations rather than pictorial images (70% vs. 30%). Musically trained participants (the target users for our tool) were more likely to use abstract forms than musically untrained participants. Taken together, these findings suggest that a suitable design approach for score visualization could incorporate abstract shapes and colors, using a variety of visual design elements to represent different features. Rather than showing detailed changes over time in a continuous left-to-right stream (the style used by many of the participants reported by Tan & Kelly), we decided a global notation approach (as described by Verschaffel) was more appropriate for our purposes. To aid in quick similarity comparison between pieces, we chose to use a single overall image for each piece.
In the studies mentioned above, while certain representations were found to be more common than others, there was considerable variability and subjectivity between participants. Although we did not discover any "universal" mappings in the literature, we used the more consistent of these findings as guidelines for creating our visualizations. We chose to use an abstract visual style, with a single image representing the entire piece, as opposed to showing changes over time. We also chose to incorporate icons to represent certain features. The studies cited above also influenced our color palette selection, for example using brighter yellow colors to indicate a happy mood and darker blue colors to indicate a sad mood.
2.3 Diculty Calculation
One of the most important features for selecting a piece is the difficulty level, which must be consistent with the musicians' skill level. However, difficulty levels are generally assigned by publishers or expert reviewers after a manual review process, and there is no single set of guidelines for difficulty levels across countries, states, or ensemble types. Andrews [1] notes that there is no consensus across publishers on the criteria for different grade levels. Based on a synthesis of guidelines from different publishers along with input from educators and conductors, Andrews developed a Music Complexity Chart. This chart contains criteria for classifying music as easy, medium, or hard for concert band, orchestra, or chamber ensemble. The level-specific criteria include categories such as range, rhythm, and piece duration. We drew from this chart to select features that are important for assessing difficulty. However, the guidelines are not complete for our purposes, as they are subject to interpretation by human reviewers, and they do not extend to solo pieces or other instrumentation such as piano or choral music.
To our knowledge, the research on assessing difficulty computationally from symbolic score data is very limited. Chiu & Chen [2] and Sébastien et al. [12] each proposed methods for calculating the difficulty level of piano music, in MIDI and MusicXML formats, respectively. Both studies included some features that were piano-specific, such as fingering, hand displacement, and hand stretch. Polyphony, or the number of simultaneous voices, is not relevant for monophonic instruments and would not have the same meaning in terms of difficulty for larger ensembles. However, both studies also included features that could be extended to other instruments. For example, Chiu defined an average playing speed based on the tempo and note durations, and Sébastien defined a similar measure based on the tempo and the shortest significant note value. Both studies incorporated a measure of the relative frequency of accidentals. Other measures included the length of the piece.
Sawruk and Walls [9] presented a system for searching digital sheet music developed for the J.W. Pepper website. Although this tool uses only instrument part ranges as a proxy for difficulty, it is notable for its use of visual-based methods for finding sheet music and its use of symbolic scores to extract that difficulty information. Easy, medium, and hard total ranges and tessituras were defined for each instrument. Users select from visual presentations of these ranges to filter by instrument-specific difficulty, allowing customization of difficulty by part.
3 USER RESEARCH
To nd out more about users’ experiences searching for music, we
conducted 10 semi-structured interviews. Participants included am-
ateur and professional musicians, composers and arrangers, a music
therapist, and several music educators (band and orchestra conduc-
tors, a high school band director, a state band chair, and a piano
teacher). We asked about topics including experiences with existing
websites, challenges and frustrations, and important criteria for
selecting music.
We analyzed the results using an affinity mapping approach, grouping important points into several key themes as shown in Figure 2. We found that participants use a variety of tools to find sheet music, including distributors, free websites, libraries, and composers' and publishers' websites. They reported having limited time to spend searching, but many use an inefficient process, visiting multiple websites to find the information they are looking for. One reason for this is the inability to view the score and/or listen to a recording before purchasing or downloading, prompting participants to search other sites for that information. Other downsides of existing sites included the inability to search or filter on features such as tempo or length, and free sites not being visually appealing or user friendly. If participants were considering a particular piece or composer and wanted to find something similar, they might go to the composer's website or listen to Spotify playlists of an ensemble that had performed the piece. Participants did not report using the filtering options on sheet music websites very often. Instead, they tended to get ideas for pieces from sources such as YouTube, then visit multiple sites to find out where they could download or purchase the piece.
Figure 2: User Interview Affinity Map. Our interview analysis revealed common themes, with notable findings indicated in bold font.
4 METHODOLOGY
4.1 Data Processing
Figure 3 provides an overview of our data processing steps. Our data source is a classical corpus of Kern files in symbolic format, representing the medieval through modern eras (approximately 1150 through 1950). Although we used Kern files for this analysis, the process could easily be expanded to accommodate other symbolic formats such as MusicXML. We sampled one work per composer, for a total of 58 pieces. If there was any missing or incomplete information such as instrumentation or tempo, we edited the Kern files manually to add those fields. We created a list of works in .csv format, including the specific file names to include and documentation of any edits made to the original files. We also searched for the pieces in Spotify and inserted the Spotify ID for each piece when available. Using the Verovio Humdrum Viewer, we created one-page previews of the scores in .png format.
We used music21, a Python toolkit for computer-aided musicology, to calculate a set of features for each piece and export them to a JSON file. Finally, we used JavaScript and HTML to create the visualizations and the interactive tool. We used p5.js, a JavaScript library for creative coding, to draw the visualizations on the screen, and MIDI.js to generate audio from the MIDI files.
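To make this step concrete, the following minimal sketch shows how music21 can read a Kern file and export a few of the features described in the next section to JSON. The file name and JSON field names are illustrative, and the actual pipeline computes more features and handles tempo changes; this is a sketch, not the exact code used in the tool.

    # Minimal sketch of the feature-extraction step (illustrative names).
    import json
    from music21 import converter

    score = converter.parse('example.krn')  # parse a symbolic (Kern) score

    # Values at the beginning of the piece
    ts = score.recurse().getElementsByClass('TimeSignature').first()
    key = score.analyze('key')              # estimated key; .mode is 'major'/'minor'
    mm = score.recurse().getElementsByClass('MetronomeMark').first()
    bpm = mm.getQuarterBPM() if mm is not None else 120  # fallback if unmarked

    # Note density: notes per second, here assuming one tempo throughout
    # (the tool's version accounts for local tempo changes)
    notes = score.flatten().notes
    seconds = score.duration.quarterLength * 60.0 / bpm
    features = {
        'timeSignature': ts.ratioString if ts is not None else None,
        'mode': key.mode,
        'tempo': bpm,
        'noteDensity': len(notes) / seconds,
        'instruments': [p.getInstrument().instrumentName for p in score.parts],
    }

    with open('features.json', 'w') as f:
        json.dump(features, f)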
4.2 Features
From our user interviews and literature review, we identified a set of musical features important for comparing and selecting scores: instrumentation, tempo, estimated note density, time signature, mood, and difficulty level. We used music21 to calculate features for each piece, as described in Table 1. We identified the tempo, time signature, and mode at the beginning of the piece. Although this is an oversimplification because these features can change during a piece, using a single value from the start of the piece allows for a simpler visualization. We categorized the instruments present in the scores into instrument families (woodwinds, piano or harpsichord, brass, strings, and vocals).
Figure 3: Data Flow. The diagram illustrates the data processing steps, from the source files to feature calculation to the final interactive tool.
Next, we calculated a composite difficulty measure from 9 feature components. These captured information from the following categories: speed (tempo, note density, and proportion of notes lasting <= 0.25 seconds); key signature (number of sharps or flats, proportion of accidentals); meter (indicators for compound meter and meter changes); part-specific range; and the total duration of the piece in minutes. To create the composite measure, we first scaled each component to the range [0, 1]. Then we summed the scaled components for each piece. Finally, we categorized each piece on a five-point rating scale based on the quintiles of the summed score.
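A minimal sketch of this scale-sum-quantize computation, assuming the nine raw component values have already been extracted for every piece (function and variable names are illustrative):

    import numpy as np

    def difficulty_ratings(components):
        # components: (n_pieces, 9) array of raw difficulty component values
        x = np.asarray(components, dtype=float)
        # 1) scale each component column to the range [0, 1]
        lo, hi = x.min(axis=0), x.max(axis=0)
        scaled = (x - lo) / np.where(hi > lo, hi - lo, 1.0)
        # 2) sum the scaled components into one composite score per piece
        composite = scaled.sum(axis=1)
        # 3) cut the composite score at its quintiles to get a 1-5 rating
        cuts = np.quantile(composite, [0.2, 0.4, 0.6, 0.8])
        return np.digitize(composite, cuts) + 1

Note that, as with the tool itself, the components are weighted equally and the rating is relative to the corpus (quintile-based), points we return to in the Discussion.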
4.3 Visual Mappings
Informed by prior research on visual representations of music, as described in the Background, we mapped a different visual element to each musical feature. We selected four color palettes to represent mood, as defined by the tempo and mode, roughly corresponding to a valence/arousal model (Figure 4). The color brightness and saturation indicate tempo (arousal), with bright, saturated colors indicating faster tempos of >= 100 BPM. The color hue corresponds to the valence commonly associated with major vs. minor mode. For example, "happy" (fast, major-key) pieces are bright yellow and green, and "sad/melancholy" (slow, minor-key) pieces are less saturated blue and grey.
The shapes used in the background fill represent the time signature, as shown in Figure 5 (triangles for pieces with a time signature numerator of 3 or 6, and squares for pieces in 2 or 4). The density of the background fill corresponds to the average note density (number of notes per second), as shown in Figure 6. Icons represent instrument families (Figure 7), and a five-point scale indicates the difficulty level (Figure 8).
Figure 4: Mood. Color palettes represent mood, defined by tempo and mode.
Figure 5: Time Signature. Shapes in the background pattern represent the time signature.
Figure 6: Note density. The density of the background pattern fill corresponds to the note density (average number of notes per second, taking local tempo into account).
5 INTERACTIVE FUNCTIONALITY
The tool displays visualizations for seven randomly selected scores from the database, as shown in Figure 9. One larger image in the center is the currently "selected" piece, and six smaller images around the edges represent other randomly selected scores. The composer name and piece title are displayed for the current selection only, a design decision intended to focus the user's attention on the piece's features as opposed to the composer's name. To pick a different piece, the user clicks one of the smaller images, which becomes the new "current selection", and the smaller images are re-populated with other pieces from the corpus.
Table 1: Music Features

Feature | Difficulty measure component | Definition
Time signature | — | Time signature at the beginning of the piece
Mode | — | Major vs. minor mode at the beginning of the piece
Instrument family | — | Instrument families present in the score (e.g., woodwinds, vocals)
Duration | Yes | Total length of the piece in minutes
Accidentals | Yes | Proportion of all notes in the piece that are accidentals
Sharps or flats | Yes | Number of sharps or number of flats in the initial key signature
Range | Yes | Distance between lowest and highest pitch in semitones for each part
Tempo | Yes | Initial tempo marking (beats per minute)
Compound meter | Yes | 0/1 indicator for whether the time signature is compound meter
Meter changes | Yes | 0/1 indicator for whether there are any meter changes
Note density | Yes | Average number of notes per second, taking local tempo into account
"Fast" notes | Yes | Proportion of all notes that have a duration <= 0.25 seconds
Figure 7: Instrumentation. Instrument families are represented using icons.
Figure 8: Difficulty level. A five-point scale illustrates difficulty level relative to other pieces in the corpus.
Figure 9: Interactivity design. This screenshot of the web-based tool shows the layout of the visualizations, buttons to view the score and listen to audio, and the drop-down to control randomization options.
A drop-down gives the user control over the randomization options. Changing the drop-down from the default "Show random pieces" to "Show pieces similar to current selection" displays the pieces that are most similar to the current selection. Similarity is determined by a point-based system, as described in Table 2. Each piece in the data set is compared to the current selection, and the tool displays the six pieces with the highest number of similarity points.
The user can view a one-page preview of the score by clicking the "view score" button. Additional options are then presented to close the preview or download it as a .jpg file. A "Play MIDI" button enables play and pause functionality for the MIDI version of the piece. A "Listen on Spotify" button opens an embedded Spotify player if the piece is available on Spotify. If the user is logged into a Spotify account, they can listen to the entire audio recording; otherwise, they will hear a 30-second preview.
Table 2: Similarity Measure

Condition | Similarity points
Same difficulty score | 2
Difficulty within ±1 point | 1
Same instrumentation | 2
At least one instrument in common | 1
Tempo in same range (>=100 vs. <100 BPM) | 1
Note density within ±2 notes per second | 1
Same mode (major vs. minor) | 1
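The scoring rules in Table 2 translate directly into a small function. A minimal sketch, assuming each piece is represented as a dict of the features from Table 1 (the field names are illustrative):

    def similarity_points(a, b):
        # Score candidate piece b against current selection a (Table 2 rules)
        pts = 0
        if a['difficulty'] == b['difficulty']:
            pts += 2                                   # same difficulty score
        elif abs(a['difficulty'] - b['difficulty']) <= 1:
            pts += 1                                   # difficulty within 1 point
        if set(a['instruments']) == set(b['instruments']):
            pts += 2                                   # same instrumentation
        elif set(a['instruments']) & set(b['instruments']):
            pts += 1                                   # at least one in common
        if (a['tempo'] >= 100) == (b['tempo'] >= 100):
            pts += 1                                   # same tempo range
        if abs(a['noteDensity'] - b['noteDensity']) <= 2:
            pts += 1                                   # similar note density
        if a['mode'] == b['mode']:
            pts += 1                                   # same mode
        return pts

    # The tool would then display the six highest-scoring pieces, e.g.:
    # neighbors = sorted(others, key=lambda p: similarity_points(current, p),
    #                    reverse=True)[:6]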
6 USER TESTING
We administered a follow-up survey to our interview participants, as well as to several email lists and a discussion board for music researchers. This survey asked participants to test out the prototype⁶ and provide feedback on how usable and useful they found the tool. Preliminary results from six participants indicate that users found the visual representations intuitive, although there may be some confusion over the meaning of the background pattern fill. Users liked being able to control the randomization options, and they found the design effective overall. Some technical challenges were reported, including cross-browser compatibility (the tool works well in Chrome and Edge but not Firefox) and some issues with the MIDI playback. Users suggested extensions and improvements such as adding search and filtering options.
In addition to the survey, we received feedback through a discussion board from two music researchers, who suggested some tweaks to the visualizations such as outlining the shapes in black to make them more visible. Another suggestion was an alternate layout allowing the user to view and explore a larger number of scores at a time, something that was also suggested by one of the survey respondents.
We also solicited feedback on the color palettes used to represent mood from one (non-musician) individual with red-green color blindness. We learned that while some of the mappings were meaningful (e.g., using red to represent "angry/intense" made sense), the less saturated colors such as the "sad/melancholy" blue and grey were harder to distinguish and did not carry as much symbolic meaning. The figure explaining the color palettes (Figure 4) was also slightly confusing, leading to the expectation that the colors would be presented as dots on an x-y axis.
7 DISCUSSION
The prototype we developed serves as a proof of concept for using visualizations to explore a database of sheet music, which to our knowledge has not been done before. Our preliminary user feedback suggests that users liked the concept and were generally able to use the tool to explore the repository of scores. It addresses several issues we identified in our background research, including 1) making score previews, MIDI, and audio playback easily accessible within the tool, and 2) enabling the user to compare pieces by features such as time signature, mood, and difficulty level. Rather than using text-based filters and/or relying on composer name recognition, this tool focuses on the features of the piece in a visual representation.
⁶ https://llight.github.io/classical-music-viz/
There are several limitations to this work. First, our dataset was a limited set of 58 classical pieces, with no pieces from the past 70 years represented. To see whether this visual approach could help address the issue of bias, we would need more diversity in terms of time period, genre, instrumentation, and composers (gender, race/ethnicity, and country), as well as a mixture of well-known and lesser-known composers. Second, the visuals capture information about time signature, key, etc. at the beginning of the piece, but these features can change over time. Third, there is a tradeoff between more information and visual simplicity: features that might be useful to add to the visualizations could also result in visual clutter.
Finally, we found that measuring difficulty computationally is a very challenging task. We developed a difficulty measure based on our literature review and on features identified as important during our interviews, but our measure has limitations. It has not been validated to see whether the ratings are consistent with expert review. It is also dependent on the pieces in the dataset, since the thresholds for the levels were based on quintiles of the data as opposed to fixed thresholds. We weight all components equally, but some factors could be more important than others. The difficulty factors may not be consistently meaningful across instruments (e.g., a three-octave range would be much easier for piano than for vocals), and instrument-specific difficulty factors such as fingerings are not included.
There are several challenges to implementing this project on a larger scale. The biggest obstacle is obtaining pieces in symbolic format, as we are currently limited to pieces that are not under copyright. Some manual review and editing of the Kern files was needed to add missing information, a process which could potentially be streamlined. Finding the pieces in Spotify was also a manual process. Many of the pieces contained multiple metadata fields in addition to the title, such as opus and movement numbers. Song titles in Spotify could contain any combination of these fields and could be in English or in the original language of the composition. As a result, identifying the correct songs and extracting their IDs required some trial and error, listening to the recordings while reviewing the scores.
To address some of the issues identified during testing, our future work would include addressing cross-browser and mobile compatibility. We would also further develop the interface to include features such as instrument filters and search capability. We would explore different color schemes or other visual representations of mood to ensure they are meaningful for people with color blindness. Finally, we would expand our dataset to include a more diverse set of pieces in terms of time period, instrumentation, and composer background.
ACKNOWLEDGMENTS
To Cole Anderson, Michelle Ramirez, Ruvinee Senadheera, Duo-Wei Yang, and Dr. Anne Sullivan for assistance with user interviews and user interface wireframes.
REFERENCES
[1] B. W. Andrews. 2020. Identifying the Characteristics of Levels of Difficulty in Educational Music: The Music Complexity Chart (MC2). The Canadian Music Educator 62, 1 (2020), 17.
[2] Shih-Chuan Chiu and Ming-Syan Chen. 2012. A Study on Difficulty Level Recognition of Piano Sheet Music. In 2012 IEEE International Symposium on Multimedia. IEEE, Irvine, CA, USA, 17–23. https://doi.org/10.1109/ISM.2012.11
[3] Arthur Flexer. 2014. On Inter-Rater Agreement in Audio Music Similarity. In 15th International Society for Music Information Retrieval Conference (ISMIR 2014). ISMIR, Taipei, Taiwan, 6.
[4] Jiajun Zhu and Lie Lu. 2005. Perceptual Visualization of a Music Collection. In 2005 IEEE International Conference on Multimedia and Expo. IEEE, Amsterdam, The Netherlands, 1058–1061. https://doi.org/10.1109/ICME.2005.1521607
[5] R. Khulusi, J. Kusnick, C. Meinecke, C. Gillmann, J. Focht, and S. Jänicke. 2020. A Survey on Visualizations for Musical Data. Computer Graphics Forum 39, 6 (March 2020), 82–110. https://doi.org/10.1111/cgf.13905
[6] Mats B. Küssner. 2013. Music and Shape. Literary and Linguistic Computing 28, 3 (2013), 8. https://doi.org/10.1093/llc/fqs071
[7] S. E. Palmer, K. B. Schloss, Z. Xu, and L. R. Prado-León. 2013. Music-color associations are mediated by emotion. Proceedings of the National Academy of Sciences 110, 22 (May 2013), 8836–8841. https://doi.org/10.1073/pnas.1212562110
[8] Elias Pampalk and Masataka Goto. 2006. MusicRainbow: A New User Interface to Discover Artists Using Audio-based Similarity and Web-based Labeling. In Proceedings of the International Society for Music Information Retrieval Conference. ISMIR, Victoria, BC, Canada, 4.
[9] Jeremy Sawruk and Jacob Walls. 2020. Personalized Sheet Music Search. In 7th International Conference on Digital Libraries for Musicology. ACM, Montréal, QC, Canada, 45–47. https://doi.org/10.1145/3424911.3425516
[10] Markus Schedl, Arthur Flexer, and Julián Urbano. 2013. The neglected user in music information retrieval research. Journal of Intelligent Information Systems 41, 3 (Dec. 2013), 523–539. https://doi.org/10.1007/s10844-013-0247-6
[11] Beau Sievers, Caitlyn Lee, William Haslett, and Thalia Wheatley. 2019. A multi-sensory code for emotional arousal. Proceedings of the Royal Society B: Biological Sciences 286, 20190513 (2019), 10. https://doi.org/10.1098/rspb.2019.0513
[12] Véronique Sébastien, Henri Ralambondrainy, Olivier Sébastien, and Noël Conruyt. 2012. Score Analyzer: Automatically Determining Scores Difficulty Level for Instrumental e-Learning. In Proceedings of the 13th International Society for Music Information Retrieval Conference (ISMIR). ISMIR, Porto, Portugal, 7. https://doi.org/10.5281/zenodo.1416518
[13] Siu-Lan Tan and Megan E. Kelly. 2004. Graphic Representations of Short Musical Compositions. Psychology of Music 32, 2 (2004), 191–212.
[14] Lieven Verschaffel, Mark Reybrouck, Christine Jans, and Wim Van Dooren. 2010. Children's Criteria for Representational Adequacy in the Perception of Simple Sonic Stimuli. Cognition and Instruction 28, 4 (Oct. 2010), 475–502. https://doi.org/10.1080/07370008.2010.511571
[15] Martin Wattenberg. 2001. The Shape of Song. http://turbulence.org/project/the-shape-of-song/