Steve DiPaola

This paper presents an approach to automatic video game level design consisting of a computational model of player enjoyment and a generative system based on evolutionary computing. The model estimates the entertainment value of game levels according to the presence of “rhythm groups,” which are defined as alternating periods of high and low …
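The “rhythm groups” idea (alternating periods of high and low challenge) can be illustrated with a toy scoring function. This is a minimal sketch only, not the paper’s actual enjoyment model; the function name, threshold, and encoding of levels as intensity sequences are all assumptions made for illustration.

```python
# Toy sketch (not the paper's model): score a level, encoded as a sequence
# of per-segment challenge intensities in [0, 1], by how often it alternates
# between high- and low-intensity segments.

def rhythm_group_score(intensities, threshold=0.5):
    """Fraction of adjacent segment pairs that switch between high and low
    challenge; more alternation suggests stronger rhythm structure."""
    labels = ['high' if x >= threshold else 'low' for x in intensities]
    transitions = sum(1 for a, b in zip(labels, labels[1:]) if a != b)
    return transitions / max(len(labels) - 1, 1)

flat = [0.9, 0.9, 0.9, 0.9]          # constant challenge, no alternation
rhythmic = [0.9, 0.2, 0.8, 0.1]      # alternates every segment
print(rhythm_group_score(flat))      # 0.0
print(rhythm_group_score(rhythmic))  # 1.0
```

In an evolutionary setting, a score like this could serve as one term of a fitness function over candidate levels.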
This paper describes the conceptual and implementation shift from a creative research-based evolutionary system to a real-world evolutionary system for professional designers. The initial system, DarwinsGaze, is a Creative Genetic Programming system based on creative cognition theories. It generated artwork that tens of thousands of viewers perceived as human-created …
We overview our interdisciplinary work building parameterized knowledge domains and their authoring tools, which allow for expression systems that move through a space of painterly portraiture. With new computational systems it is possible to conceptually dance, compose and paint in higher-level conceptual spaces. We are interested in building art systems …
What visual cues do human viewers use to assign personality characteristics to animated characters? While most facial animation systems associate facial actions with limited emotional states or speech content, the present paper explores the above question by relating the perception of personality to a wide variety of facial actions (e.g., head …)
This interdisciplinary paper hypothesizes that Rembrandt developed new painterly techniques — novel to the early modern period — in order to engage and direct the gaze of the observer. Though these methods were not based on scientific evidence at the time, we show that they are nonetheless consistent with a contemporary understanding of human vision. Here …
We propose a method to extract the emotional data from a piece of music and then use that data, via a remapping algorithm, to automatically animate an emotional 3D face sequence. The method is based on studies of the emotional aspect of music and our parametric behavioral head model for face animation. We address the issue of affective communication …
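The remapping step (music emotion estimate to facial animation parameters) can be sketched with a simple valence/arousal mapping. This is purely illustrative: the parameter names and the linear mapping below are assumptions, not the controls of the authors’ behavioral head model.

```python
# Illustrative sketch only: remap a music emotion estimate, expressed as
# (valence, arousal) each in [-1, 1], to a few normalized facial controls.
# Control names here are hypothetical, not the authors' actual parameters.

def emotion_to_face(valence, arousal):
    """Return a dict of facial controls in [0, 1] from a valence/arousal pair."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        'smile':        clamp((valence + 1) / 2),    # happier -> bigger smile
        'brow_raise':   clamp((arousal + 1) / 2),    # more excited -> raised brows
        'eye_openness': clamp(0.5 + 0.5 * arousal),  # arousal widens the eyes
    }

print(emotion_to_face(0.8, 0.4))
```

A per-frame animation could call a mapping like this on a sliding emotion estimate extracted from the audio.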
With the aid of new technologies, integrated design approaches are becoming increasingly incorporated into exhibit design in museums, aquaria and science centres. These settings share many similar design constraints that need to be addressed when designing multimedia interactives as exhibits. The use of adaptive systems and techniques can overcome many of …
We will describe a visual development system for exploring face space, both in terms of facial types and animated expressions. Imagine an n-dimensional space describing every humanoid face, where each dimension represents a different facial characteristic. Within this continuous space, it would be possible to traverse a path from any face to any other face …
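The core face-space idea, that each face is a point in an n-dimensional parameter space and a path between two faces can be sampled continuously, can be sketched with simple linear interpolation. The dimension names below are invented for illustration and are not the system’s actual facial characteristics.

```python
# Minimal sketch of face-space traversal: each face is a point in an
# n-dimensional parameter space, and in-between faces along a path are
# produced by linear interpolation. Dimension names are hypothetical.

def lerp_face(face_a, face_b, t):
    """Blend two faces: t=0 gives face_a, t=1 gives face_b."""
    return {k: (1 - t) * face_a[k] + t * face_b[k] for k in face_a}

face_a = {'jaw_width': 0.2, 'eye_spacing': 0.8, 'nose_length': 0.5}
face_b = {'jaw_width': 0.9, 'eye_spacing': 0.3, 'nose_length': 0.6}

# Sample five faces along the straight-line path from face_a to face_b.
for i in range(5):
    print(lerp_face(face_a, face_b, i / 4))
```

A richer system would traverse curved paths or restrict movement to a learned manifold of plausible faces, but the interpolation above captures the basic notion of moving through the space.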
This paper proposes the Face Multimedia Object (FMO) and iFACE, a framework for implementing the face object within multimedia systems. FMO encapsulates all the functionality and data required for face animation; iFACE implements FMO and provides the necessary interfaces for a variety of applications to access FMO services.