{"id":2834,"date":"2024-05-13T14:45:28","date_gmt":"2024-05-13T21:45:28","guid":{"rendered":"https:\/\/www.artsjournal.com\/diacritical\/?p=2834"},"modified":"2024-05-19T18:42:10","modified_gmt":"2024-05-20T01:42:10","slug":"classical-music-has-lost-a-generation-blame-the-metadata-in-part","status":"publish","type":"post","link":"https:\/\/www.artsjournal.com\/diacritical\/2024\/05\/classical-music-has-lost-a-generation-blame-the-metadata-in-part.html","title":{"rendered":"Classical Music has Lost a Generation. Blame the Metadata (in part)"},"content":{"rendered":"\n<figure class=\"wp-block-image size-full\"><a href=\"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/gasteig-empty-1.jpg?ssl=1\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"817\" height=\"468\" src=\"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/gasteig-empty-1.jpg?resize=817%2C468&#038;ssl=1\" alt=\"\" class=\"wp-image-2838\" srcset=\"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/gasteig-empty-1.jpg?w=817&amp;ssl=1 817w, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/gasteig-empty-1.jpg?resize=300%2C172&amp;ssl=1 300w, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/gasteig-empty-1.jpg?resize=768%2C440&amp;ssl=1 768w\" sizes=\"auto, (max-width: 817px) 100vw, 817px\" \/><\/a><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Classical music has lost a generation&#8217;s worth of music lovers beginning in the late-90s with the rise of file-sharing and Napster. A significant part of the reason might be: metadata.<\/p>\n\n\n\n<p>Metadata are the tags that travel with every audio recorded track. For a piece of music or a recording to be found, it needs to be tagged. 
Metadata come (mostly) in three varieties:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Descriptive &#8211; nuts-and-bolts listings including artist, composer, title, date, etc.<\/li>\n\n\n\n<li>Ownership\/Rights &#8211; who owns the music, licenses, royalty splits, etc.<\/li>\n\n\n\n<li>Recommendation &#8211; genre, mood, mix, use, etc.<\/li>\n<\/ol>\n\n\n\n<p>Each track travels with hundreds of tags, allowing it to be found and attributed. But the commercial music industry has complained for years about the inadequacies of the metadata tagging system, which is rife with errors and omissions. This matters because it affects how music is found, who gets paid for it, and even whether it is likely to find an audience and become popular.<\/p>\n\n\n\n<p>Classical music has been at a particular disadvantage with metadata because its classifications are so different from pop music&#8217;s. Search &#8220;Taylor Swift&#8221; and you get a list of the songs she performs. Type in &#8220;Dvorak&#8221; and a list of pieces comes up with random orchestras, conductors and performers, movements of sonatas and symphonies separated from one another, and little means of sorting performances of the same pieces.<\/p>\n\n\n\n<p>Simply finding classical music is tough. But also consider how pop listening habits have been transformed by Spotify&#8217;s and Pandora&#8217;s streaming algorithms, which let you state a preference and have music effortlessly fed to you, and you have some idea of how classical music has been left behind. <\/p>\n\n\n\n<p>It wasn&#8217;t until 2015, when the classical music streaming service Idagio launched (followed in 2018 by Primephonic, which was acquired by Apple and relaunched as Apple Classical), that classical music fans finally had a useful digital music discovery platform. Both services are a huge step forward from platforms like Spotify and Pandora, which are a disaster for finding and listening to classical music. 
But in the meantime, from the mid-1990s to 2015, a generation of potential regular listeners was essentially impeded from exploring by a series of digital speedbumps.<\/p>\n\n\n\n<p>Think of it this way: You can publish a blog or put products up for sale on your website. Technically it can be found if you know exactly where to look. But if your site isn&#8217;t designed and optimized with metadata in ways that Google likes, it&#8217;s vanishingly unlikely you&#8217;ll be found when your site is listed on page 463 of Google&#8217;s search results. In other words, you&#8217;re pretty much invisible. <\/p>\n\n\n\n<p>This also happens with social media algorithms. You can have 10,000 followers and publish something. But if the content isn&#8217;t &#8220;optimized&#8221; for what the algorithm is programmed to reward, even your closest 10,000 followers &#8212; people who&#8217;ve previously expressed an interest in seeing what you have to say &#8212; will never be shown it. <\/p>\n\n\n\n<p>One way I think AI will liberate musicians and music lovers from metadata hell is by replacing the current band-aid metadata system. As I wrote in <a href=\"https:\/\/www.artsjournal.com\/diacritical\/2024\/04\/the-essential-ai-translating-what-we-see-hear-and-experience.html\">a previous post<\/a>: <\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p class=\"has-text-color has-link-color wp-elements-7ec239075304d26b7b4766d35621dfee\" style=\"color:#38537d\"><em>To an AI model, a picture is data, sound and music are data, as is traditional spoken or written language. That data is translatable, interchangeable, and, most importantly, linkable and actionable. 
That means that video, music, sound, movement, image can interact in common language.<\/em><\/p>\n<\/blockquote>\n\n\n\n<p>If the AI is looking at the actual sound data rather than depending only on tags, music can be directly compared to other music (rather than to generic descriptors). Watermarks (perhaps another term for metadata, only not removable) could identify artist and production data, but also &#8212; incorporating blockchains &#8212; calculate plays, payments, splits and ownership. Perhaps most important, the &#8220;recommendation&#8221; tagging, which is now so imperfect, could accurately compare like with like.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"alignleft size-full is-resized\"><a href=\"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/metadata.jpg?ssl=1\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"619\" src=\"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/metadata.jpg?resize=1000%2C619&#038;ssl=1\" alt=\"\" class=\"wp-image-2837\" style=\"width:400px\" srcset=\"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/metadata.jpg?w=1000&amp;ssl=1 1000w, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/metadata.jpg?resize=300%2C186&amp;ssl=1 300w, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/05\/metadata.jpg?resize=768%2C475&amp;ssl=1 768w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/a><\/figure>\n<\/div>\n\n\n<p>It is this third category that could change the way we find and listen to music. If data is just data and interchangeable, the &#8220;creation&#8221; data (artist, ownership, payment rights, etc.) will be able to interact with the &#8220;listener&#8221; data that could tell us how the track is being used, remixed, built on, etc. 
That kind of data not only informs listeners&#8217; choices but also potentially opens a dialogue with the creators, helping them make new work. Audiences have always played a role in the creative process; AI suggests the potential for a new level of co-creation and collaboration.<\/p>\n\n\n\n<p>I spent part of my weekend playing with AI music-creation apps &#8212; there are now hundreds of them. Most give you a few parameters you can adjust, and the results are pretty unsatisfying. But one, <a href=\"https:\/\/www.udio.com\/\">Udio<\/a>, lets you interact with it via voice prompts as you would with ChatGPT. You can describe an idea, a scenario, a mood, genre, instrumentation, style &#8212; pretty much any description you can imagine &#8212; and the AI will create music in response. <\/p>\n\n\n\n<p>You can endlessly tweak the description to modify the music, querying the AI in conversation. I tried creating an orchestral fanfare and it came up with some interesting ideas. I even gave it some technical direction, asking for eight first violins rather than 16 and for more vibrato at the top of a long phrase. It&#8217;s still clunky, but it&#8217;s easy to imagine that in the future composers might find this a more powerful way to get the music from their brains to a finished product than traditional notation.<\/p>\n\n\n\n<p>We&#8217;ve been stuck in a crude technology era ruled by the notion that because something technically works (for example, all music has metadata), it also works functionally (music is easily found). Social media platforms and metadata structures have proved over and over that this is not true. 
It&#8217;s entirely possible we&#8217;ll come to think of our crudely imperfect metadata tagging system of text descriptors as the Stone Age of content discovery, as AI-enabled, dynamically comparative metadata across not just music but images, video and everything else becomes the new standard.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Classical music has lost a generation&#8217;s worth of music lovers, beginning in the late 1990s with the rise of file-sharing and Napster. A significant part of the reason might be: metadata. Metadata are the tags that travel with every recorded audio track. For a piece of music or a recording to be found, it needs to [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":2838,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_genesis_hide_title":false,"_genesis_hide_breadcrumbs":false,"_genesis_hide_singular_image":false,"_genesis_hide_footer_widgets":false,"_genesis_custom_body_class":"","_genesis_custom_post_class":"","_genesis_layout":"","advanced_seo_description":"","jetpack_seo_html_title":"","jetpack_seo_noindex":false,"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[15],"tags":[],"class_list":{"0":"post-2834","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-arts-tech","8":"entry"},"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\
/uploads\/2024\/05\/gasteig-empty-1.jpg?fit=817%2C468&ssl=1","jetpack_shortlink":"https:\/\/wp.me\/p4ePZm-JI","jetpack_sharing_enabled":true,"jetpack-related-posts":[{"id":1123,"url":"https:\/\/www.artsjournal.com\/diacritical\/2016\/09\/why-music-and-the-concert-experience-are-on-the-front-lines-of-virtual-reality.html","url_meta":{"origin":2834,"position":0},"title":"Why Music And The Concert Experience Are On The Front Lines Of Virtual Reality","author":"Douglas McLennan","date":"September 14, 2016","format":false,"excerpt":"Following on my post from yesterday about anticipating the kinds of experiences people will want from concerts comes\u00a0this article from Wired about virtual reality and music. Evidently creating content for virtual reality is proving to be a challenge and music is so far the best showcase for VR. Outside of\u2026","rel":"","context":"In &quot;arts &amp; tech&quot;","block_context":{"text":"arts &amp; tech","link":"https:\/\/www.artsjournal.com\/diacritical\/category\/arts-tech"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2016\/09\/Coachella_2014_Week_2_Day_2_-_Sahara_Tent.jpg?fit=800%2C360&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2016\/09\/Coachella_2014_Week_2_Day_2_-_Sahara_Tent.jpg?fit=800%2C360&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2016\/09\/Coachella_2014_Week_2_Day_2_-_Sahara_Tent.jpg?fit=800%2C360&ssl=1&resize=525%2C300 1.5x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2016\/09\/Coachella_2014_Week_2_Day_2_-_Sahara_Tent.jpg?fit=800%2C360&ssl=1&resize=700%2C400 
2x"},"classes":[]},{"id":3361,"url":"https:\/\/www.artsjournal.com\/diacritical\/2026\/04\/from-messages-to-conversations-ai-agents-are-changing-how-we-find-culture.html","url_meta":{"origin":2834,"position":1},"title":"From Messages to Conversations: AI Agents are Changing how we Find Culture","author":"Douglas McLennan","date":"April 7, 2026","format":false,"excerpt":"The first audience for your art is becoming a machine. The question isn't just how to optimize for that machine, it's what you give it to say, and whether what it says is worth a conversation.","rel":"","context":"In &quot;arts and AI&quot;","block_context":{"text":"arts and AI","link":"https:\/\/www.artsjournal.com\/diacritical\/category\/arts-and-ai"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2026\/04\/sunriseforever-robot-6688548_1920-1.jpg?fit=1000%2C590&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2026\/04\/sunriseforever-robot-6688548_1920-1.jpg?fit=1000%2C590&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2026\/04\/sunriseforever-robot-6688548_1920-1.jpg?fit=1000%2C590&ssl=1&resize=525%2C300 1.5x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2026\/04\/sunriseforever-robot-6688548_1920-1.jpg?fit=1000%2C590&ssl=1&resize=700%2C400 2x"},"classes":[]},{"id":2816,"url":"https:\/\/www.artsjournal.com\/diacritical\/2024\/04\/the-essential-ai-translating-what-we-see-hear-and-experience.html","url_meta":{"origin":2834,"position":2},"title":"The Essential AI: Translating the Art of What We See, Hear and Experience","author":"Douglas McLennan","date":"April 29, 2024","format":false,"excerpt":"To an AI model, a picture is data, sound and music are data, as is traditional spoken or written language. 
That data is translatable, interchangeable, and, most importantly, linkable and actionable. That means that video, music, sound, movement, image can interact in common language.","rel":"","context":"In &quot;arts and AI&quot;","block_context":{"text":"arts and AI","link":"https:\/\/www.artsjournal.com\/diacritical\/category\/arts-and-ai"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/04\/ai-generated-8578467_1280-1.jpg?fit=1000%2C565&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/04\/ai-generated-8578467_1280-1.jpg?fit=1000%2C565&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/04\/ai-generated-8578467_1280-1.jpg?fit=1000%2C565&ssl=1&resize=525%2C300 1.5x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/04\/ai-generated-8578467_1280-1.jpg?fit=1000%2C565&ssl=1&resize=700%2C400 2x"},"classes":[]},{"id":2793,"url":"https:\/\/www.artsjournal.com\/diacritical\/2024\/03\/a-framework-for-thinking-about-disruption-of-the-arts-by-ai.html","url_meta":{"origin":2834,"position":3},"title":"A Framework for Thinking about Disruption of the Arts by AI","author":"Douglas McLennan","date":"March 30, 2024","format":false,"excerpt":"What would a strategy for the arts sector be for anticipating artificial intelligence, if consensus seems to be it will change everything?","rel":"","context":"In &quot;arts &amp; tech&quot;","block_context":{"text":"arts &amp; 
tech","link":"https:\/\/www.artsjournal.com\/diacritical\/category\/arts-tech"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/03\/workshop-4863393_1280-1.jpg?fit=1000%2C579&ssl=1&resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/03\/workshop-4863393_1280-1.jpg?fit=1000%2C579&ssl=1&resize=350%2C200 1x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/03\/workshop-4863393_1280-1.jpg?fit=1000%2C579&ssl=1&resize=525%2C300 1.5x, https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wordpress\/wp-content\/uploads\/2024\/03\/workshop-4863393_1280-1.jpg?fit=1000%2C579&ssl=1&resize=700%2C400 2x"},"classes":[]},{"id":165,"url":"https:\/\/www.artsjournal.com\/diacritical\/2011\/08\/the-classical-music-critic-goes-extinct.html","url_meta":{"origin":2834,"position":4},"title":"The Classical Music Critic Goes Extinct","author":"Douglas McLennan","date":"August 23, 2011","format":false,"excerpt":"Seems important to note the passing of music criticism as a legitimate job in Canada. John Terauds, for six years staff classical music critic of the Toronto Star, was reassigned this week to the paper\u2019s business section. He was the last full-time classical music critic at a Canadian newspaper. 
The\u2026","rel":"","context":"In &quot;arts journalism&quot;","block_context":{"text":"arts journalism","link":"https:\/\/www.artsjournal.com\/diacritical\/category\/arts-journalism"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/wp\/wp-content\/uploads\/2011\/08\/eacae9164a65af69f68bb9fe5451.jpeg?resize=350%2C200","width":350,"height":200},"classes":[]},{"id":101,"url":"https:\/\/www.artsjournal.com\/diacritical\/2009\/05\/how_perfection_killed_classica.html","url_meta":{"origin":2834,"position":5},"title":"Is Perfection Killing Classical Music?","author":"Douglas McLennan","date":"May 3, 2009","format":false,"excerpt":"Not literally, of course, at least not yet. The ability to edit and fix recordings has long conditioned audiences to expect that the music we hear should be perfect. Has it changed the way performers play in live concert? The role of recordings in the music business has changed. Once,\u2026","rel":"","context":"With 9 comments","block_context":{"text":"With 9 
comments","link":"https:\/\/www.artsjournal.com\/diacritical\/2009\/05\/how_perfection_killed_classica.html#comments"},"img":{"alt_text":"recording.jpg","src":"https:\/\/i0.wp.com\/www.artsjournal.com\/diacritical\/recording.jpg?resize=350%2C200","width":350,"height":200},"classes":[]}],"jetpack_likes_enabled":true,"_links":{"self":[{"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/posts\/2834","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/comments?post=2834"}],"version-history":[{"count":4,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/posts\/2834\/revisions"}],"predecessor-version":[{"id":2840,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/posts\/2834\/revisions\/2840"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/media\/2838"}],"wp:attachment":[{"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/media?parent=2834"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/categories?post=2834"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.artsjournal.com\/diacritical\/wp-json\/wp\/v2\/tags?post=2834"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}