Department of Computational Perception
Johannes Kepler Universität Linz




ARTISTIC APPLICATIONS



We are also interested in applying methods from Artificial Intelligence, Pattern Recognition, Signal Processing, Web Retrieval, and related fields to the domains of creativity and interactive digital arts. If you have exciting and creative ideas for applying computational intelligence in an artistic context, do not hesitate to contact us!

Music Production Technology and Creative Tools

In the context of music production and creation, in the studio as well as live on stage, a number of music understanding tasks need to be performed to support the composer or performer, ranging from search and annotation tasks to beat synchronization to automatic variation and modification. Music information retrieval methods can be used in these processes to facilitate workflow and artistic expression.
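Beat tracking is a typical building block for several of these tasks, for instance beat synchronization or recommending samples based on beat structure. Purely as an illustration, a minimal sketch using the open-source librosa library (the audio filename is a placeholder, not part of any project described here) could look like this:

# Minimal beat-tracking sketch using librosa (illustrative only;
# "loop.wav" is a placeholder filename).
import librosa

y, sr = librosa.load("loop.wav")                       # audio signal and sample rate
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

print("Estimated tempo: %.1f BPM" % float(tempo))
print("First beats (seconds):", beat_times[:8])

The estimated beat positions could then drive, for example, time-stretching a sample so that it locks to the tempo of the current track.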

Example projects:
  • Real-time music listening and improvisation
  • Rhythm and melody generation based on generative models
  • Learning "artistic style" and rewriting music
  • Recommending samples based on beat structure
  • Building new music interfaces, e.g., as VSTs and/or by building upon Native Instruments hardware and software (Maschine, Traktor, Komplete)
  • Cover version, sample, and remix detection
  • Studying the effects of composers and remixers on perceived audio similarity and listening behavior
Contact: Peter Knees


Image credit: Native Instruments

sound/tracks

When travelling on a train and looking out of the window, the fleeting impressions of the moving scenery and the composition of the passing objects generate a piece of visual music with its own tempo and rhythm, its own colours and harmonies. The sound/tracks project aims to capture these visual impressions and translate them into a musical composition in real time. The view out of the window is captured with a camera and rendered as instantaneously played-back piano music. The passing scenery can be considered the score of a musical composition, which is interpreted depending on outside conditions such as weather and lighting, the speed of the train, and the quality of the camera. Thus, every journey produces a unique composition.

More details, background, and example videos can be found on the sound/tracks project page.
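The actual mapping used in sound/tracks is not reproduced here; purely as an illustration of the general idea, the sketch below assumes OpenCV for frame capture and a naive, hypothetical mapping from the brightness of vertical image strips to MIDI pitches of a pentatonic scale:

# Hypothetical sketch: map the brightness profile of camera frames to
# piano-like MIDI note numbers. This is not the actual sound/tracks
# algorithm, just an illustration of one possible image-to-music mapping.
import cv2
import numpy as np

PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic, MIDI note numbers

def frame_to_notes(frame, n_strips=8):
    """Reduce a frame to n_strips brightness values and map each to a pitch."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    strips = np.array_split(gray, n_strips, axis=1)   # vertical image strips
    notes = []
    for strip in strips:
        b = float(strip.mean()) / 255.0               # brightness in [0, 1]
        degree = PENTATONIC[int(b * (len(PENTATONIC) - 1))]
        octave_shift = 12 if b > 0.8 else 0           # very bright strips go up an octave
        notes.append(degree + octave_shift)
    return notes

cap = cv2.VideoCapture(0)                             # camera index 0 is an assumption
ok, frame = cap.read()
if ok:
    print("Notes for this frame:", frame_to_notes(frame))
cap.release()

A real implementation would run continuously, schedule the notes rhythmically, and send them to a synthesizer (e.g., via MIDI); the mapping itself is the creative core of the project.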

Example projects:
  • Implementing sound/tracks on iPhone, iPad, or Android devices
  • Experimenting with different image-to-music transformations
  • Online optical flow estimation (see the sketch after this list)
  • Object/scenery reconstruction from recorded train journeys
  • Generating sounds from moving objects
  • Real-time music playlist generation based on surroundings
Contact: Peter Knees

sound/tracks - Tunnel near Mallnitz-Obervellach
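For the optical-flow project listed above, one possible starting point is OpenCV's dense Farnebäck flow between consecutive frames of a recorded journey; the video filename below is a placeholder:

# Minimal sketch of dense optical flow between consecutive video frames
# using OpenCV's Farneback method; "journey.mp4" is a placeholder filename.
import cv2
import numpy as np

cap = cv2.VideoCapture("journey.mp4")
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow between the previous and the current frame.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # The mean horizontal flow roughly reflects how fast the scenery passes by.
    print("mean horizontal flow: %.2f px/frame" % float(np.mean(flow[..., 0])))
    prev_gray = gray

cap.release()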

Weltassoziator

The Weltassoziator (German for "world associator") is intended as a space that allows visitors to observe machine cognition and association. Aspects of the world's knowledge are projected into the space, interact with each other, and organize themselves autonomously. The presence of a visitor and the visitor's current focus of attention change the priorities in concept association and initiate an interaction between visitor and machine. This stimulates a joint association process of human and artificial intelligence and ultimately leads to an interwoven stream of thoughts.

Example projects:
  • Capturing body movements
  • Determining a user's center of attention
  • Visualisation of "concept interaction" for large display projection
  • Efficient retrieval and caching of arbitrary multimedia objects from the Web
Contact: Andreas Arzt
