Dr. Jérôme Nika

Researcher at Ircam and electronic musician

Jérôme Nika is a researcher in interactive and generative technologies for musical creation in Ircam’s ISMM team, and an electronic musician. His research is structured around the metaphor of “musical memory”, from its modeling and learning to its mobilization in a performance context, and has led to the development of several software environments dedicated to interactive musical composition and improvisation. At the crossroads of scientific research and creation, this work of designing computer models and instruments for musical creation on stage or in the studio has given rise to over 60 artistic productions in which he is involved as an electronic musician, computer music designer or scientific advisor: in jazz and improvised music (Steve Lehman, Orchestre National de Jazz, Bernard Lubat, Benoît Delbecq, Rémi Fox), contemporary music (Pascal Dusapin, Ensemble Modern, Marta Gentilucci, Alexandros Markeas) and contemporary art (Le Fresnoy). Latest production: creation of the generative electronics for the Orchestre National de Jazz album Ex Machina (Pi Recordings / L’Autre Distribution) by Steve Lehman and Frédéric Maurin, released in September 2023.

Composing human-machine musical interaction

On April 17, 2024 at 05:00 PM (UTC+1)

The new generative digital lutherie is capable, for example, of listening to a voice and reacting in real time to create an accompaniment, producing novel effects, or using audio recorded live to generate new melodic lines. While the controls are increasingly high-level and the degrees of freedom left to the machine seem to be growing, the purpose of these novel software instruments is to create new practices, not to recreate 'credible' music over and over again. From this point of view, technology in general and AI in particular are not ends but means to invent an electronic lutherie that enables new practices and encourages formalization and reflexivity in the human creative process. The lecture will be illustrated by demonstrations and excerpts from concerts and artistic productions using the Dicy2 environment created by Jérôme Nika.
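
To make the "musical memory" metaphor concrete, here is a minimal, purely illustrative sketch in Python: labelled segments captured from a live input are stored in a memory, and a target "scenario" of labels is then used to navigate that memory and produce a new sequence. The class and function names, the label format, and the continuity heuristic are assumptions made for this illustration only, not Dicy2's actual architecture or API.

# Toy sketch (not Dicy2's actual API): a "musical memory" agent that stores
# labelled audio segments captured live and, given a "scenario" (a target
# sequence of labels), navigates the memory to output a new sequence.
import random
from dataclasses import dataclass

@dataclass
class Segment:
    label: str      # e.g. a chord symbol or descriptor extracted from live audio
    audio_id: int   # placeholder index of the recorded slice in a buffer

class MemoryAgent:
    def __init__(self):
        self.memory: list[Segment] = []

    def listen(self, label: str, audio_id: int) -> None:
        """Append a segment captured from the live input to the memory."""
        self.memory.append(Segment(label, audio_id))

    def generate(self, scenario: list[str]) -> list[Segment]:
        """Follow the scenario, preferring contiguous jumps in the memory."""
        output, last_idx = [], None
        for target in scenario:
            candidates = [i for i, s in enumerate(self.memory) if s.label == target]
            if not candidates:
                continue  # no matching material learned yet: skip this step
            if last_idx is not None and last_idx + 1 in candidates:
                idx = last_idx + 1               # continuity: keep playing the phrase
            else:
                idx = random.choice(candidates)  # otherwise jump elsewhere in memory
            output.append(self.memory[idx])
            last_idx = idx
        return output

# Usage: feed live-captured segments, then request a line over a chord scenario.
agent = MemoryAgent()
for i, lab in enumerate(["Dm7", "G7", "Cmaj7", "Dm7", "G7"]):
    agent.listen(lab, audio_id=i)
print(agent.generate(["Dm7", "G7", "Cmaj7"]))

In practice, musicians interact with Dicy2 through its Max and Ableton Live environments rather than code like this; the sketch only exposes the underlying idea of guiding a learned memory with a scenario.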

Useful links

Ask your questions to Jérôme Nika!


How does the new generative digital lutherie influence the creative collaboration between humans and machines in music composition?

by Charlotte Bettermann  


Are Dicy2 and its outputs more on the autonomous side, or does the tool still require a large amount of human input and control? From the videos, it doesn't look like something a beginner could pick up and use straight away, but how technical is it really, and how many musicians do you think would be able to use it?

by Anastasia Shulman  


What was the creative process like for developing the generative electronics for the Orchestre National de Jazz album “Ex Machina”?

by Timo Warendorf  


Can Dicy2 recognize when a musician is accidentally playing out of key and ignore this 'mistake'? Or might the system think that the key has been changed?

by Mercedes Neuwirth  


How do interactive and generative technologies, like those you develop, impact the creative process for musicians?

by Kathrin Liesens  


Could you share some more insights into the collaboration process between you as a researcher and musicians in the creation of generative electronic music?

by Anna-Lena Scheunemann  


I don't know if I understood Dicy2 correctly, but is it possible for Dicy2 to react to more than one input, e.g. an orchestra? And if so, is there a limit to the number of inputs (instruments, voices, …)?

by Eva Madl  


When controlling co-improvisation agents on stage, do you identify as a musician, a conductor, or a different role?

by Téo Sanchez  


In practice, how difficult is it for musicians to integrate the Dicy2 tool into their creative process? Did they need a lot of time to understand how to make exactly what they want, or is the process "natural"?

by Antonin Couton  


In order to interact with expert musicians and reflect them, I think a musical interface and a user interface are important. How does DYCI2 provide these interfaces?

by Daeun Jeong  


Do you think AI and ML tools will lower the barriers to entry for music making? And by doing so, will they lower the average quality of music, or will the average quality actually increase by enabling creative people who lack resources?

by Julius Greppmair  


How complex and innovative can Dicy2's compositions be? Are they comparable to the work of great artists in these genres, or is it just a starting point where the person using Dicy2 has to put in effort to make it as good? Where are the limits?

by Alisa Valentiner  


Do you think the high technical complexity of tools like Dicy2 is holding back AI music tool sets from being more commonly adopted by musicians without a computer science background?

by Michael Buchholz  


How can you make sure that the outcomes are new creations, rather than 'credible' music based on a dataset of existing music? Which data is used to train these AI systems?

by Samuel Pucher  


How are modern digital instruments of generative music technology typically employed in musical creation, especially within the domains of live performances and studio recordings?

by Annika Grauer  


When a melody is played, how does Dicy2 decide which chords and changes it will use, and which note will be the tonic, dominant, etc.? Did you train it with common progressions from different genres (e.g. the II-V-I in jazz), or does it choose randomly and accompany in a somewhat atonal way?

by Julia Fröhlich  


Can Dicy2 detect errors in the music? And can Dicy2's music effectively convey emotions to the audience?

by Junghyun Lee  


How does the generative digital lutherie react to a singer's rapid changes in pitch, such as from the note F to the note D? Is there a noticeable delay in the algorithm, or is it fast enough to ensure seamless transitions?

by Niklas Evmenenko  


How can the collaborative process between musicians and researchers be improved when using tools like Dicy2? Are there ways to bridge the gap between musical and technological expertise more effectively?

by Janis Reisenauer  


How do you approach the ethical considerations in the development and use of generative technologies in music, particularly in terms of copyright and intellectual property?

by Katharina Grünwald  


How do you ensure that the musical AI tools you've developed, such as Dicy2 for Max and Dicy2 for Ableton Live, enhance human creativity rather than replace it? And what methods did you use in your research to answer this question?

by Isabel  


In your opinion, how does the introduction of AI and interactive technologies in live performances impact audience perception and engagement with the music? Is there the possibility for the audience and the performing musician to co-enhance or even co-create music during the live performance?

by Vlad Panait  


If Dicy2 is capable of not recreating credible music over and over again, is making something ‘new’ the priority of Dicy2?
Will the process of gathering resources become more important in the future for those using generative AI? I am curious about what would help in developing such abilities.

by Eunwoo Kim  


In your bio you mentioned involvement in over 60 artistic productions. Could you highlight a particularly memorable project where the integration of technology significantly influenced the creative process?

by Xhenis Metolli (Jenny)  


What personal qualities or skills do you believe have been most important in your work as a researcher and musician?

by Lea Zeilbeck  


What strategies do you use to ensure that the generated music remains coherent and aesthetically pleasing, while also pushing the boundaries of traditional composition?

by Jannik Vieler  


Do you think that eventually, with tools like Dicy2 and others, we will reach a kind of barrier where there are no new melodies left to create?

by Daniel Wiesenfeld  


Does Dicy2 keep a memory of its productions over time (a few weeks or months later), or does the model always work locally, generating only from the input provided during each use of the tool?

by Antonin Couton  


How do you expect Dicy2 to change musical interaction in the world? What is the ideal outcome in the end? Will Dicy2 generate real-time interactive music during live shows?

by LiHsuan Chang  


In terms of usability, how accessible is Dicy2 for musicians who might not be technically proficient with AI and machine learning technologies?

by Viktoria Grigoriev  


When a collaborator learns about Dicy2 for the first time, how do you guide them in interacting with it? Dicy2 seems difficult for beginners or for people encountering it for the first time.

by Kuan Hua Chen  


In the case of Dicy2, but also in general, what is your take on copyright and AI? If someone uses audio created by Dicy2 in a product and sells it, is that their own "creation"? Is it the developer's "creation"?

by Eduard Krasnov  


Could music technologies like Dicy2 help improve access to music education and encourage participation from people with different backgrounds and musical abilities?

by Lena Ertl  


How is the integration of tools like generative digital lutherie being received in the music world?

by Sarah Obermaier  


How do you ensure that the output of dicy2 remains expressive and engaging for the audience, and does not become repetitive or predictable?

by Melanie Bauer  

