
Gene Kogan's Neural Synthesis is a video artwork that explores and expands upon the technique popularly known as 'deep dream': an iterative process that optimises the pixels of an image to produce a desired activation state in a trained convolutional network. The piece experiments primarily with the dynamics of feedback-generated 'deep dream' video, in which each frame of the film is initialised from the previous one. Its novel aesthetics come from 'gating' (or 'masking') the pixel gradients of multiple channels and mixing them according to predetermined masking patterns, while simultaneously distorting the input canvas.
A number of the strategies used to create the artwork are directly inspired by Google's original Deep Dream implementation, particularly the work of Mike Tyka, who first experimented with feedback, canvas distortion (zooming) and mixing the gradients of multiple channels.
The trained network used is Google's Inception network, of 'Inceptionism' fame. The workflow for generating the artwork is under continuous development: planned improvements include a more generalised canvas distortion function and better masking derived from source images.
As part of the ART AI Festival 2019 artwork trail, Gene Kogan's Neural Synthesis is on display at Leicester Central Library, Bishop Street, Leicester, LE1 6AA.


Neural Zoo is an exploration of the ways creativity works: the combination of known elements into a new, previously unseen element. The creator in this case is the algorithm itself but with a human artist as its muse. Isn’t all art made by humans an execution or reshaping of data absorbed through biological neurons? How can we continue to inspire machine learning algorithms to create art for us, emotional humans?
