PIANOS by AGF: visualized by AAU VFX Mograph

This project was created in 2010 as an audio/motion visualization challenge.

School: Academy of Art University, San Francisco
Department: Computer Arts, VFX
Class: Compositing for Motion Graphics 2
Instructor: Colin Evoy Sebestyen

Here is the lesson plan and challenge presented to the students:


This project will visualize "Pianos", an audio composition by AGF. The piece will be broken into sections, each handed off to a different designer. You need to abstract what is being said – the only allowable graphic structure is within a 5 x 5 grid framework. How can you use grids to abstract the concepts of your section? You cannot use any pictorial representations – no images, only abstractions that represent the ideas of each stanza through rhythm, timing, pacing and feel.

How can you use the restriction of line, symbol and shape within the grid? You must use only greyscale; no colors are allowed. The temporal movement of the graphics can be animated to reflect the audio's intonation or message, or be generated from the audio's frequency-level data. For example, if the stanza references nature, you could execute an abstraction of the movement of leaves. If there is stress in the vocal, your animation could reflect that intonation. The lines, symbols and shapes created inside the grid can be derived from the following techniques: shape layers, a mosaic effect, or mask animation. All shapes have to reference and be based on the grid.
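As an illustrative aside (not part of the original brief): driving animation from audio frequency levels usually means slicing the audio into one chunk per video frame, taking an FFT, and collapsing the spectrum into a few band levels. The sketch below does this in Python with NumPy; the function name, band count, and normalisation are assumptions, not anything specified by the assignment.

```python
import numpy as np

def band_levels(samples, sample_rate, fps=24, bands=5):
    """Split audio into per-video-frame FFT magnitude levels for `bands` bands.

    Returns an array of shape (frames, bands) normalised to 0..1.
    """
    hop = sample_rate // fps                  # audio samples per video frame
    frames = len(samples) // hop
    levels = np.zeros((frames, bands))
    for f in range(frames):
        chunk = samples[f * hop:(f + 1) * hop]
        mag = np.abs(np.fft.rfft(chunk))      # magnitude spectrum of this frame
        split = np.array_split(mag, bands)    # low..high frequency bands
        levels[f] = [s.mean() for s in split]
    peak = levels.max()
    return levels / peak if peak > 0 else levels

# Example: one second of a 440 Hz tone at a 48 kHz sample rate
sr = 48000
t = np.linspace(0, 1, sr, endpoint=False)
levels = band_levels(np.sin(2 * np.pi * 440 * t), sr)
print(levels.shape)  # (24, 5): one row of 5 band levels per video frame
```

Each row of the result could then drive, for example, the greyscale value of one grid column per frame.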


Lines can have any width or can mix thicks and thins. They can be attached from point to point in any fashion you like, but must be attached inside the grid in some way.

Planes can be used, turning the grid into a small icon or a modular series of squares. Each square can act as a pixel, with simple binary boolean operations filling in each square.
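As a hypothetical sketch of the pixel idea above: two 5 x 5 boolean patterns can be combined square by square with a binary boolean operation to produce a small icon. The patterns and the choice of XOR here are invented for illustration, not taken from the brief.

```python
GRID = 5  # the 5 x 5 framework from the brief

# Two simple boolean patterns on the grid
cross = [[x == GRID // 2 or y == GRID // 2 for x in range(GRID)] for y in range(GRID)]
frame = [[x in (0, GRID - 1) or y in (0, GRID - 1) for x in range(GRID)] for y in range(GRID)]

def combine(a, b, op):
    """Fill each square from a binary boolean operation on two patterns."""
    return [[op(a[y][x], b[y][x]) for x in range(GRID)] for y in range(GRID)]

icon = combine(cross, frame, lambda p, q: p ^ q)   # XOR: cross and border, minus overlap

for row in icon:
    print("".join("#" if cell else "." for cell in row))
# prints:
# ##.##
# #.#.#
# .###.
# #.#.#
# ##.##
```

Swapping XOR for AND or OR gives a different modular icon from the same two patterns, which is the kind of rule-based variation the assignment asks for.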

Shapes can be created using the grid as a guide. Each point inside the grid needs to act as the edges of the shape with each shape intersecting and layering with one another. Curves can be utilized, but they must visibly reference and accentuate the underlying grid system.


Execution of your shape animations inside AE should match the length of your audio segment; however, each series of shapes can be reused or looped depending on your abstraction concept.

You are required to create three animation sequences based on your given audio segment. These three animations must all work within the same concept. For example, if you create points and lines based on 45-degree angles, you need to preserve this rule of modularity through all three animations.

Limiting these basic textures to simple shapes and planar abstractions will pay off when we move the textures to the third dimension. Your shape animations will work as transparency channels (masks) or luminance channels (brightness) when we map them to our primitives.

Render your texture files out as 800 x 800 square image sequences, along with Quicktime movie preview versions.


When creating your materials inside C4D, you may map your Quicktime textures to the color, luminance, or alpha channels – or a combination of all three. As long as your animations adhere to the same systems, the look of your materials should be visually consistent.

Using the power of cloner objects, effectors, and the CS_tool camera rigs we have explored so far in this course, you are required to animate a section of the poem in C4D. Your geometry should be limited to these simple primitives: sphere, cube, pyramid, or cylinder. How these shapes are replicated in space is up to you. You are welcome to clone cloner objects and mix and match the different modes in whatever way best illuminates your concept of the text.

You can manipulate the clones however you like using effectors – for example, random value mappings of the scale property. How the clones are arrayed in 3D space needs to reflect your concepts of your black and white grid animations. This means a system of rules that you carry through from 2D to 3D. What does a grid mean when taken to the volume space of XYZ?
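Conceptually, a grid cloner plus a random effector amounts to generating XYZ grid positions and attaching a random value to a property of each clone. The Python sketch below is only an analogy for what C4D does internally; the spacing, clone counts, and scale range are made-up values for illustration.

```python
import random

random.seed(7)       # fixed seed so the "effector" is repeatable

SPACING = 100.0      # distance between clones, analogous to a cloner's size setting

# 3 x 3 x 3 grid of clones in XYZ, each with a random scale value
clones = [
    {
        "position": (x * SPACING, y * SPACING, z * SPACING),
        "scale": random.uniform(0.5, 1.5),   # random effector on the scale property
    }
    for x in range(3) for y in range(3) for z in range(3)
]

print(len(clones))                                     # 27 clones in the XYZ volume
print(all(0.5 <= c["scale"] <= 1.5 for c in clones))   # True
```

The rule carried from 2D to 3D is the point: the same grid spacing and the same bounded random range can govern both the flat texture animations and the clone array.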

The lighting setups – with the exception of those of you who wish to experiment with the HDRI lighting kit – and the non-luminance properties of the material options on your clones will be locked to the example file. This setup is to ensure that our animations have some consistency from designer to designer. However, camera animation, primitive type, clone number, and arrangement are all up to you.


To provide opportunities for variety, you are allowed two variables in your piece. The use of these variables is optional; including them will bring you further away from your initial 2D grid animations and add technical complexity to your piece.

Variable 1 – Type:

Introducing one piece of type will heavily influence how your materials and abstractions will be perceived. If you do choose a word, choose it carefully. You as the designer can pick the one word that represents the whole stanza or concept to your audience. Some solutions using type might be: mapping the clones to the word using the spline effector, setting up clones to interact with the word glyphs, or adding effector objects to influence the type.

Your typeface is limited to DF Gusto Solid, which is based on a 5 x 5 grid.

Lay your word out in Adobe Illustrator and then import into C4D as spline objects. You can then extrude the splines using the Extrude Nurbs generator.

Variable 2 – One Color:

You may choose an emphasis color. Some ideas include: creating a light object with color properties, introducing an additional material, combining an area light object and primitive with a material tag used as a clone, or color treatment in post utilizing AE.


• Utilizing AE, all texture animation pieces must be 800 x 800, 24FPS, saved out as TIFF image sequences along with Quicktime versions for preview in class.
• You will be working 1280 x 720, 24FPS for final output.
• Utilize the production workflow of Illustrator to C4D to .aec file format as covered in module 2.
• Tag objects in your Object Manager with any needed compositing tags or external compositing tags.
• All lighting on this piece is set with the included C4D file; alternatively, you may experiment with Grey Scale Gorilla's HDRI Lighting Kit.
• Use the provided materials.
• You may experiment with as much post compositing and production as we have covered in Motion 1.

Thank you!


VILLA by Malevo

When we were invited to create a visual identity video for the REYKJAVIK VISUAL MUSIC PUNTOy RAYA FESTIVAL 2014, we wanted a real match with the eclectic spirit of both organisations and their mutual devotion to abstract film and live cinema.
The RVM & PyR Festival is a challenging invitation to go back to basics and explore the endless creative possibilities of the dot and the line through various spheres of art, music, science and thought.
This video is fundamentally based on the idea of creating error and distortion over a clean, abstract, black-and-white dot-and-line motion graphics work.
We created the distortion by introducing a series of digital glitches into the images, frame by frame, losing quality by filming the projection of the video on a wall and by printing the video frames and scanning them again.
Throughout this process the images lost quality over and over again, but at the same time the distortion created colour and texture, carrying the original geometric abstraction into an abstraction of distortion. The idea explores the beauty of randomness and its expressionist power.
VILLA is a feast of pure dots & lines, colour, motion & sound.

VILLA means error in Icelandic.

Festival at HARPA, concert hall and conference centre in Reykjavik, Iceland, from January 30th to February 2nd, 2014.

Program highlights include: the PyR 2014 Official Competition, with around 100 dots-and-lines short films from 46 countries; the PyR Academy, with workshops and masterclasses; premiere live cinema pieces with Anna Thorvaldsdottir, Hugi Gudmundsson, Sigurdur Gudjonsson and Bret Battey; live cinema competitions; and live sets with Ryoji Ikeda, Ryoichi Kurokawa…

More info:

Direction, art concept & design by
Guillermo Daldovo

Error & edition by
Bernard Arce

Sonia Figuera

"Tao Ki" by RSantos SupercineXcene

DROMOS – An immersive performance by Maotik and Fraction

Dromos is a live audiovisual performance created by composer Fraction and digital artist Maotik, designed for an immersive environment and produced in the Satosphere at the Société des Arts Technologiques for the MUTEK festival in 2013.
All Visuals by Maotik
More Info on maotik.com/dromos
contact m@maotik.com
Video footage: Sebastien Roy
Technical: SATO Team

LHC CERN Event Data Visualization and Sound Generator by JH Park

The LHC Sound Generator makes musical sound.
This small program generates a special musical sound from LHC CERN event data. The process works like a spectrum music generator.
The frequency axis (Y-axis) is arranged by the curvature of each particle path, and the X-axis is the distance from the middle point.
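As a hedged illustration of the mapping that description implies (the function name, frequency range, and curvature range below are assumptions, not taken from the actual source code): each particle path's curvature can be interpolated logarithmically onto a frequency axis, while its distance from the middle point positions the event along X.

```python
def curvature_to_frequency(curvature, f_min=110.0, f_max=3520.0, c_max=1.0):
    """Map a path curvature in [0, c_max] onto a log-spaced frequency axis."""
    c = min(max(curvature, 0.0), c_max) / c_max   # clamp and normalise to 0..1
    return f_min * (f_max / f_min) ** c           # log interpolation f_min..f_max

# Hypothetical track data: curvature drives pitch, distance drives X position
tracks = [
    {"curvature": 0.0, "distance": 12.0},   # straight path -> lowest pitch
    {"curvature": 0.5, "distance": 40.0},
    {"curvature": 1.0, "distance": 95.0},   # tightest curve -> highest pitch
]

for t in tracks:
    freq = curvature_to_frequency(t["curvature"])
    print(f"distance {t['distance']:5.1f} -> {freq:7.1f} Hz")
# 110.0 Hz, ~622.3 Hz, 3520.0 Hz respectively
```

A log mapping keeps equal curvature steps sounding like equal musical intervals, which is one plausible reading of "spectrum music generator".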

Mac version (Yosemite) download link: jeonghopark.de/media/lhc_data_music.app.zip

Source code: github.com/jeonghopark/LHC_CERN_Music

The "Big Data" files are available at the linked CERN Open Data website.

This file has 100 events. After opening the file, just press the space bar to play all events in order.

This program is made with openFrameworks, refreq and ofUnzip.