Populating the Smart Musical Instruments Ontology with Data

@inproceedings{Turchet2020PopulatingTS,
  title={Populating the Smart Musical Instruments Ontology with Data},
  author={Luca Turchet and Guixia Zhu and Paolo Bouquet},
  booktitle={2020 27th Conference of Open Innovations Association (FRUCT)},
  year={2020},
  pages={260-267}
}
One of the main instances of Musical Things within the Internet of Musical Things (IoMusT) paradigm is the emerging family of Smart Musical Instruments (SMIs). This is a category of musical instruments encompassing sensors, actuators, embedded intelligence, and wireless connectivity to local networks and to the Internet. Recently, an ontology to represent this domain has been proposed: the Smart Musical Instruments Ontology. However, a database gathering SMI instances was missing. Such a…
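To make the idea of populating the ontology concrete, the sketch below shows how a single SMI instance could be added to an RDF graph with Python's rdflib and serialized as Turtle for loading into a triple store. This is only an illustration under assumptions: the namespace URIs and the term names (SmartMusicalInstrument, hasSensor, hasConnectivity) are placeholders and do not necessarily match the vocabulary of the Smart Musical Instruments Ontology or the authors' actual population pipeline.

# Minimal sketch of ontology population with rdflib; all namespace URIs and
# term names below are illustrative placeholders, not the published vocabulary.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

SMI = Namespace("https://example.org/smi#")        # assumed ontology namespace
DATA = Namespace("https://example.org/smi-data/")  # assumed namespace for instance data

g = Graph()
g.bind("smi", SMI)
g.bind("data", DATA)

# Describe one hypothetical smart guitar instance with a sensor and a wireless link.
guitar = DATA["smart-guitar-001"]
g.add((guitar, RDF.type, SMI.SmartMusicalInstrument))
g.add((guitar, RDFS.label, Literal("Example smart guitar")))
g.add((guitar, SMI.hasSensor, DATA["piezo-pickup-01"]))
g.add((guitar, SMI.hasConnectivity, Literal("Wi-Fi")))

# Serialize the populated graph as Turtle, ready to be loaded into a triple store.
print(g.serialize(format="turtle"))

The same pattern scales to a batch of instruments read from, for example, a list of product specifications, which is one plausible way a database of SMI instances could be assembled.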