Show simple item record

dc.contributor.author: Dadman, Shayan
dc.contributor.author: Bremdal, Bernt Arild
dc.contributor.author: Bang, Børre
dc.contributor.author: Dalmo, Rune
dc.date.accessioned: 2023-01-04T12:34:54Z
dc.date.available: 2023-01-04T12:34:54Z
dc.date.issued: 2022-11-30
dc.description.abstract: Music generation using deep learning has received considerable attention in recent years. Researchers have developed various generative models capable of imitating musical conventions, comprehending musical corpora, and generating new samples based on the learning outcome. Although the samples generated by these models are persuasive, they often lack musical structure and creativity. For instance, a vanilla end-to-end approach, which deals with all levels of music representation at once, does not offer human-level control and interaction during the learning process, leading to constrained results. Indeed, music creation is a recurrent process in which a musician follows certain principles, reusing or adapting various musical features. Moreover, a musical piece adheres to a musical style, which breaks down into the distinct concepts of timbre style, performance style, and composition style, together with the coherency between these aspects. Here, we study and analyze the current advances in music generation using deep learning models through different criteria. We discuss the shortcomings and limitations of these models regarding interactivity and adaptability. Finally, we outline potential future research directions, addressing multi-agent systems and reinforcement learning algorithms to alleviate these shortcomings and limitations. [en_US]
dc.identifier.citation: Dadman, Bremdal, Bang, Dalmo. Toward Interactive Music Generation: A Position Paper. IEEE Access. 2022 [en_US]
dc.identifier.cristinID: FRIDAID 2090539
dc.identifier.doi: 10.1109/ACCESS.2022.3225689
dc.identifier.issn: 2169-3536
dc.identifier.uri: https://hdl.handle.net/10037/28026
dc.language.iso: eng [en_US]
dc.publisher: Institute of Electrical and Electronics Engineers [en_US]
dc.relation.journal: IEEE Access
dc.rights.accessRights: openAccess [en_US]
dc.rights.holder: Copyright 2022 The Author(s) [en_US]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0 [en_US]
dc.rights: Attribution 4.0 International (CC BY 4.0) [en_US]
dc.title: Toward Interactive Music Generation: A Position Paper [en_US]
dc.type.version: publishedVersion [en_US]
dc.type: Journal article [en_US]
dc.type: Tidsskriftartikkel [en_US]
dc.type: Peer reviewed [en_US]


