Gilles Grenon was born in Chicoutimi and currently lives in Montreal. He holds a master’s degree in communication from the University of Quebec in Montreal and completed a specialized graduate studies program (D.E.S.S.) in arts, creation and technologies at the University of Montreal. He is currently completing a master’s degree in music, composition and sound creation option, at the Faculty of Music of the University of Montreal. He is mainly interested in interactive digital arts.

Gilles Grenon presented a collaborative performance with visual artist Mancy Rezaei at the Sonopixel Festival on April 23, 2024. He also performed at the Ultrasons Festival on April 22, 2022, and his animated film “Yi Ching, the book of transformations” was selected for the Dérapages event as part of the Sommets du cinéma d’animation international festival in Montreal on May 12, 2022.


Inverted Palimpsest

Salle Serge-Garant (Montreal), April 23, 2024

A palimpsest is a manuscript on which a new text has been written after the original text was erased. The idea of Inverted Palimpsest was to reverse this notion: starting from technological art and returning, in a metaphorical context, to a kind of revelation or reminiscence of the original, where something hidden or erased is brought to light.

The three-body problem

Salle Claude-Champagne (Montreal), May 12, 2022

The three-body problem is mentioned in Liu Cixin’s science fiction trilogy. It represents the impossibility of predicting the movements of three celestial bodies revolving around one another under the effect of their mutual attraction.

Three cubes hang from a mobile. Their faces bear ArUco markers, which serve as targets for a Unity program that detects the markers’ XYZ spatial coordinates. Unity transmits the data over OSC to Max MSP, which normalizes it before relaying it to Reaper. The XYZ data finally drives the oscillators and filters of TAL-NoiseMaker tracks.
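The normalization step that Max MSP performs can be sketched as a pure mapping from a marker’s tracked coordinates to 0..1 synthesis parameters. A minimal sketch: the tracked bounds and the parameter names (osc_pitch, filter_cutoff, resonance) are illustrative assumptions, not the actual patch.

```python
def normalize(value, lo, hi):
    """Clamp a raw coordinate to [lo, hi], then scale it into the 0..1
    range expected by a synthesizer parameter."""
    if hi == lo:
        return 0.0
    v = min(max(value, lo), hi)
    return (v - lo) / (hi - lo)

def marker_to_params(x, y, z, bounds):
    """Map one ArUco marker's XYZ position to named synth parameters.
    `bounds` gives the tracked volume per axis, e.g. {"x": (0, 2), ...}.
    The parameter names here are hypothetical placeholders."""
    return {
        "osc_pitch": normalize(x, *bounds["x"]),
        "filter_cutoff": normalize(y, *bounds["y"]),
        "resonance": normalize(z, *bounds["z"]),
    }
```

In the actual piece these values travel over OSC rather than being returned from a function; the sketch only shows the scaling logic.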




The fixed radio telescope of the Canadian Hydrogen Intensity Mapping Experiment (CHIME) records Fast Radio Bursts detected while measuring hydrogen in and beyond our galaxy. Fast Radio Bursts are transient radio pulses, lasting from under a millisecond to a few seconds, that emit as much energy as the Sun does in a few days. The phenomenon is still unexplained today.

This musical exploration exploits data from CHIME’s Catalog 1, recorded over a year and compressed into an accelerated duration of six minutes. Through a mapping process, the frequencies of the Fast Radio Bursts, picked up between 400 and 800 MHz, are recalibrated and processed by a polyphonic algorithm in Max MSP. The transformed data shapes different timbres and sounds, mainly through frequency modulation synthesis, pioneered in 1967 by John Chowning. The original sky positions where the Fast Radio Bursts were detected are then mapped, in the Max MSP algorithm, to left/right stereo placement.
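The mapping described above can be sketched as three small functions. This is a hedged sketch: the 100–2000 Hz audible target band, the linear laws and the longitude-to-pan rule are illustrative assumptions; only the 400–800 MHz band, the one-year span and the six-minute duration come from the text.

```python
RADIO_LO, RADIO_HI = 400e6, 800e6        # CHIME detection band, from the text
AUDIBLE_LO, AUDIBLE_HI = 100.0, 2000.0   # assumed audible target band (Hz)

YEAR_S = 365.25 * 24 * 3600              # one year of detections
PIECE_S = 6 * 60                         # compressed to a six-minute piece

def map_frequency(f_radio):
    """Linearly recalibrate a detection frequency into the audible band."""
    t = (f_radio - RADIO_LO) / (RADIO_HI - RADIO_LO)
    return AUDIBLE_LO + t * (AUDIBLE_HI - AUDIBLE_LO)

def map_time(t_detection_s):
    """Compress a detection timestamp (seconds into the year) to piece time."""
    return t_detection_s / YEAR_S * PIECE_S

def map_pan(sky_longitude_deg):
    """Fold a sky longitude onto a -1 (left) .. +1 (right) stereo position.
    The folding rule is an assumption for illustration."""
    return (sky_longitude_deg % 360.0) / 180.0 - 1.0
```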

Reference: arXiv:2106.04352 [astro-ph.HE]


Off all horizons




Musical study on a visual plot from an unknown source

On the run…

Experimental movies

Yi Ching, The Book of Changes

The Yi Jing, which appeared 3,500 years ago, was a divination tool using wooden sticks. Over time, it became the basis of a philosophical and moral thought that attempted to understand the world and its transformations. These transformations were represented by hexagrams, which depicted the unfolding of events according to dynamic categories.

Displaying the Yin Yang symbols, the trigrams and the hexagrams required the creation of 485 images. These images were generated with Python code written for this project using OpenCV and NumPy, then edited in Premiere Pro. The musical pieces were generated with artificial intelligence algorithms: Amper Music, from a New York company specializing in music generation.
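The project’s actual generator used OpenCV and NumPy; as a dependency-free illustration of the same idea, a trigram can be rendered as a small binary raster, where a yang line is a solid bar and a yin line is a bar broken in the middle. A sketch, not the project’s code:

```python
def trigram_bitmap(lines, width=64, bar_h=6, gap_h=6):
    """Render a trigram as a 2D list of 0 (background) / 1 (ink).
    `lines` is a 3-tuple read bottom-to-top: 1 = solid yang line,
    0 = broken yin line (a gap in the middle third of the bar)."""
    height = 3 * bar_h + 2 * gap_h
    img = [[0] * width for _ in range(height)]
    for i, line in enumerate(reversed(lines)):   # draw the top line first
        y0 = i * (bar_h + gap_h)
        for y in range(y0, y0 + bar_h):
            for x in range(width):
                # a yin line leaves the middle third of the bar empty
                if line == 1 or x < width // 3 or x >= 2 * width // 3:
                    img[y][x] = 1
    return img
```

The same raster could be written out with OpenCV’s image I/O, or scaled up for video frames, as in the project.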

4500 images in AI

This project is based on an artificial intelligence style-transfer algorithm that takes reference images (here, “texture” images of Montreal) and applies their “statistical” characteristics to the content of other images, transforming the latter into the style of the references. “Style” models are thus produced: artificial neural networks built from several layers of pixel analysis, nineteen layers in the case of the VGG19 model used for this work. One pass of processing is applied per image, with a greater or lesser impact depending on the number of iterations performed.
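The “statistical” characteristics matched in this kind of style transfer are typically the Gram matrices of a network layer’s feature maps: channel-by-channel correlations that capture texture while discarding spatial layout. A minimal, dependency-free sketch of that computation (the real work happens on VGG19 activations, not hand-built lists):

```python
def gram_matrix(features):
    """Compute the Gram matrix of a feature map.
    `features` is a list of C channels, each flattened to N activations;
    entry (i, j) is the mean product of channels i and j — the texture
    signature that style transfer tries to match between images."""
    C = len(features)
    N = len(features[0])
    g = [[0.0] * C for _ in range(C)]
    for i in range(C):
        for j in range(C):
            g[i][j] = sum(features[i][k] * features[j][k] for k in range(N)) / N
    return g
```

During optimization, the style loss penalizes the difference between the Gram matrices of the reference image and of the image being transformed.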

My living room

I propose to bring the viewer to see an imaginary space through the eyes of a stereo camera and to recreate this space. In the first part of the video, I superimpose the images of 2D reality on those of a 3D digitization of space transposed into pixels in color-coded distance calculations. I quickly add another visual layer where polygon edges are generated to reconstruct 3D space. In a second part, I suggest a “nebulous” state, referring to a kind of cosmogony representing a state of reconstruction of data acquired from antagonistic forces, organization and disorder, to recreate a small world, my imaginary living room , with an appearance of reality, but an altered reality.

In the manner of…

This project is based on an artificial intelligence algorithm built on style transfer. The images to be modified come from a one-minute video of a gutter during a rainfall, cut into 1,800 images. I then used six images of well-known paintings, edited in 10-second cycles and re-exported into 1,800 “style” images:
• Guernica by Pablo Picasso
• Composition VIII by Wassily Kandinsky
• Water Lilies and Japanese Bridge by Claude Monet
• The Scream by Edvard Munch
• The Starry Night by Vincent van Gogh
• The Great Wave off Kanagawa by Hokusai
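Assuming a 30 fps source (1,800 frames over one minute) and 10-second style cycles, the frame-to-style assignment reduces to integer division — a sketch of the scheduling, not the project’s pipeline:

```python
FPS = 30        # 1,800 frames over one minute
CYCLE_S = 10    # each painting's style lasts ten seconds
N_STYLES = 6    # the six paintings listed above

def style_for_frame(frame_index):
    """Return which of the six style models (0..5, in list order)
    applies to a given source frame."""
    return (frame_index // (FPS * CYCLE_S)) % N_STYLES
```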


Braccio ++ (Arduino, Max MSP)

The project consists of generating the movements of an Arduino Braccio++ robotic arm. This project lies between engineering and art. The drawing gesture is the most striking aspect of the project, along with the “mechanical” versus “artisanal” duality that results from this movement. The gesture depends on a motorized mechanical impulse, yet it recalls the human gesture only through its analogy with human joints (shoulder, elbow, wrist, etc.) and its creative dimension. The idea is to contrast the mechanical aspect, with its industrial-type motor noises, with the noise of the brush rubbing on the paper.

The Braccio++ uses a brush to ink a paper with random movements coded in an Arduino Nano RP2040 microcontroller. The movement list is sent from Max MSP to the Arduino over an OSC connection. Piezo microphones capture the noises of the motors and of the brush on the paper through a Zoom H5 in audio-interface mode. The collected noises are processed in Max MSP with simple echo and reverberation effects.
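The random movement list could be generated along these lines. A hedged sketch: the joint names, angle ranges and OSC address layout are illustrative assumptions, not the actual Max MSP patch or Arduino firmware.

```python
import random

# Approximate joint ranges for a Braccio-class arm, in degrees.
# Assumption: exact limits depend on the model and mounting.
JOINT_RANGES = {
    "base": (0, 180), "shoulder": (15, 165), "elbow": (0, 180),
    "wrist_ver": (0, 180), "wrist_rot": (0, 180), "gripper": (10, 73),
}

def random_pose(rng=random):
    """One random pose, as (address, angle) pairs in the shape of the
    OSC messages Max MSP could send to the Arduino Nano RP2040."""
    return [("/braccio/" + joint, rng.randint(lo, hi))
            for joint, (lo, hi) in JOINT_RANGES.items()]
```

A sequence of such poses, interpolated by the firmware, would produce the random brush strokes described above.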

Hypermediacity (Pure data and Processing)

This project is inspired by a text by Katherine Hayles (“The Transformation of Narrative and the Materiality of Hypertext”, 2001), which it represents by splitting it into fragments that appear randomly on a screen and descend until they disappear at its bottom edge. At that moment, a letter chosen at random in each disappearing sentence triggers the sending of coordinates from Processing to Pure Data over the OSC protocol, where a sound event is produced. I return to the representation of events, occurrences in the time and space of “hypermediacity”.
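The trigger logic — a random letter from each disappearing sentence producing an OSC message — can be sketched outside Processing as follows; the address and argument layout are assumptions for illustration only.

```python
import random

def trigger_for_sentence(sentence, y_bottom, rng=random):
    """When a text fragment reaches the bottom of the screen, pick one
    letter at random and build the (hypothetical) OSC message that
    Processing would send to Pure Data: the letter's index, its
    character code and the screen position drive the synthesis."""
    letters = [(i, ch) for i, ch in enumerate(sentence) if ch.isalpha()]
    if not letters:
        return None   # nothing to sonify in this fragment
    i, ch = rng.choice(letters)
    return ("/fragment/letter", [i, ord(ch), y_bottom])
```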

Montreal crime (Unity and Max MSP)

The purpose of this research-creation project is to question the use of data/metadata and their visualization by presenting them in a representation diverted from the usual infographics. The representation of statistics connected to the concept of an open city is a pretext for offering spectators an experience that emphasizes the importance of rendering. It is an exploration of the open data made available to citizens under the concept of the smart city, examining the representation of the data, their categorization, their scrambling and their positioning in the territory.

A data file lists criminal acts and their categories, including their date and their location in latitude and longitude coordinates. Another file contains the coordinates of the polygons of the police neighborhood station districts. The Greenwich meridian and the equator are the zero points of longitude and latitude, in the real world as in the program of this project. The first type of object, blue spheres, represents the criminal acts, appearing at their geographic coordinates over the days and at a random altitude; the second type, red rectangular parallelepipeds, traces the geographical perimeter of the sectors of the police neighborhood stations.
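Placing a record in the scene then amounts to a simple coordinate mapping. A sketch under stated assumptions: an equirectangular mapping with the Greenwich meridian and equator at the origin, an arbitrary scene scale, and Unity-style axes (X east, Z north, Y up); the actual Unity project may differ.

```python
import random

def crime_to_scene(lat_deg, lon_deg, scale=100.0, max_alt=10.0, rng=random):
    """Map a crime record's latitude/longitude to scene XYZ.
    X follows longitude, Z follows latitude (both zeroed at
    Greenwich/equator), and Y is a random altitude used only
    to separate the blue spheres visually."""
    x = lon_deg / 180.0 * scale
    z = lat_deg / 90.0 * scale
    y = rng.uniform(0.0, max_alt)
    return (x, y, z)
```

With Montreal at roughly 45.5° N, 73.6° W, every sphere lands in the same quadrant of the scene, west of the origin and north of the equator line.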

Carousel (Processing)

The project interprets the relationship between a spatial location and temporal occurrences. It evokes the light cone, a notion of special relativity explained by the mathematician Hermann Minkowski at the beginning of the twentieth century, which represents the past and possible future regions of space-time for an observer at a specific time and place.

What these dimensional evolutions propose is to find, at repeated intervals, the photographs intact, but oscillating between the reality of the landscape elements present in them and effects of superimposition, cuts and crisscrossed geometric shearing. We then become spectators of landscape transformations caused by nature (weather conditions, sunlight, reflections on the ground, on rain or snow) and by the urban human presence (motor vehicles, runners, walkers, the rebuilding of the kiosk), independent of the photographer’s action. The eye is returned to the centre of the work. The everydayness of the images then explodes through their manipulation by computer code. We witness the reappropriation of the cosmic dimension of our daily lives.

Carousel project, photographs

A sample of photographs taken between 2015 and 2019 on Olmsted Road in Mount Royal Park.