In a study of human perception of music in relation to different representations of video graphics, this project explores real-time automatic synchronization between audio and image, with the aim of making the connection between the two feel tighter and more consistent. The link is established using audio signal processing techniques that automatically extract data from the music, which is then mapped onto visual objects. The visual elements are driven by features obtained through various Music Information Retrieval (MIR) techniques. By visualizing music, one can stimulate viewers to recognize musical patterns and perceive features they might otherwise miss.
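The feature-extraction-to-visuals pipeline described above can be illustrated with a minimal sketch. The descriptors (RMS energy and spectral centroid), the sample rate, and the mapping to a shape's size and hue are all illustrative assumptions, not the project's actual implementation; real MIR systems typically use richer features (onsets, beats, chroma, etc.).

```python
import numpy as np

SR = 22050  # assumed sample rate for this sketch


def extract_features(frame: np.ndarray) -> dict:
    """Compute two simple descriptors for one audio frame:
    RMS energy (a loudness proxy) and spectral centroid (a brightness proxy)."""
    rms = float(np.sqrt(np.mean(frame ** 2)))
    # Hann window reduces spectral leakage before the FFT
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SR)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {"rms": rms, "centroid": centroid}


def map_to_visuals(features: dict) -> dict:
    """Map audio descriptors to hypothetical visual parameters:
    louder frames -> a larger shape, brighter spectra -> a higher hue value."""
    size = min(1.0, features["rms"] * 5.0)            # normalized radius in [0, 1]
    hue = min(1.0, features["centroid"] / (SR / 2))   # 0 = dark timbre, 1 = bright
    return {"size": size, "hue": hue}


# Demo: one frame of a 440 Hz tone stands in for live audio input.
t = np.arange(2048) / SR
frame = 0.1 * np.sin(2 * np.pi * 440.0 * t)
features = extract_features(frame)
visuals = map_to_visuals(features)
```

In a real-time system this loop would run once per audio buffer, feeding `visuals` to the rendering layer each frame.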