Reimagining Annotation for Multimodal Cultural Heritage

7-9 February 2024
The conference will be hosted at Rennes 2 University and the MSHB (Maison des Sciences de l'Homme en Bretagne) in Rennes, France. It will be possible to attend the conference online.

https://reimagining-amch.sciencesconf.org

Rapid advances in digital technology are constantly expanding the ways in which we can annotate multimodal documents, be they texts, images, videos, sounds, web pages or code. Cultural heritage institutions have embarked on ambitious campaigns to digitize their collections; at the same time, born-digital heritage is joining archival collections. The valorisation of digital heritage, especially audiovisual documents and multimodal corpora, is becoming a major issue for both cultural institutions and researchers. One of the answers consists of creating annotation interfaces or automating annotation through computational techniques, for close and/or distant viewing analysis. Reimagining annotation for multimodal cultural heritage makes for an exciting and stimulating landscape, but also engenders a host of epistemological questions. How can we engage with and organize the informational hierarchies that emerge from these methods? What are the affordances and limits of close and distant reading methods, and how can we articulate these two approaches? Faced with a multiplication of approaches and interfaces, how can we consolidate research and resources to encourage cumulative, collaborative work? What becomes of the document's ontology, notably in the context of time-based media and the analysis of creative processes, when it integrates a network of annotations, readings and decompositions?

This conference seeks to interrogate these questions across three primary axes:

**Axis 1: Tools.** We wish to interrogate the tools available for the annotation of multimodal data and cultural heritage. What is the state of the art, and what tools are available to researchers? What issues do developers face when dealing with multimodal data, and especially audiovisual data? How do developers overcome the friction between powerful computational methods and users?

**Axis 2: Methods.** We wish to interrogate the methodologies for engaging with multimodal data and cultural heritage. How has annotation in the digital humanities developed with the emergence of computational techniques? What new approaches do they allow for? How will the field develop from an epistemological point of view?

**Axis 3: Projects.** We wish to shine a light on projects that have interrogated these first two axes in academia and the GLAM sector.

We encourage researchers and cultural professionals from a wide range of fields who work along these axes to contribute: digital humanities and GLAM professionals, the performing arts, theater, cinema, music, visual art, history, video games, conservation and archival professionals, as well as cultural heritage institutions.

The conference will be hosted at Rennes 2 University and the MSHB (Maison des Sciences de l'Homme en Bretagne) in Rennes, France. It will be possible to attend the conference online. Prior to the conference, participants will have the opportunity to attend a day-long workshop showcasing a new tool based on IIIF for the annotation of multimodal corpora. More information about the workshop will be communicated soon.

## Calendar

- 11/04/2023: publication of the call.
- 15/09/2023: abstract submission deadline.
- 23/10/2023: notification of accepted abstracts.

## Submission

- Papers: abstract of 500 words.
- Posters: abstract of 500 words.

## Publication

All the papers will be recorded before or during the conference and edited with MemoRekall-IIIF.
They will be published online as an open-source multimodal book for the conference.
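
For readers unfamiliar with IIIF-based annotation, the sketch below is a hedged illustration, not the conference's or MemoRekall-IIIF's actual tooling, of the kind of record such tools typically produce: a W3C Web Annotation attaching a textual note to a time segment of a IIIF Canvas. All identifiers and URLs are hypothetical placeholders.

```python
# Minimal sketch of a W3C Web Annotation targeting a time segment of a
# IIIF Canvas (e.g. a video in a IIIF Presentation manifest).
# All URLs and identifiers below are hypothetical placeholders.
import json

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "id": "https://example.org/annotations/1",  # hypothetical annotation id
    "type": "Annotation",
    "motivation": "commenting",
    "body": {
        "type": "TextualBody",
        "value": "Performer enters stage left.",
        "format": "text/plain",
        "language": "en",
    },
    "target": {
        # hypothetical IIIF Canvas representing time-based media
        "source": "https://example.org/iiif/manifest/canvas/1",
        "selector": {
            "type": "FragmentSelector",
            "conformsTo": "http://www.w3.org/TR/media-frags/",
            "value": "t=12.5,30",  # seconds 12.5 to 30 of the media
        },
    },
}

# Serialize the annotation as JSON-LD for storage or exchange.
print(json.dumps(annotation, indent=2))
```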
Scientific disciplines: Archaeology and Prehistory - Art and Art History - Education - History - Literature - Cultural Heritage and Museology - Music, Musicology and Performing Arts - Methods and Statistics
