The making of Toy Story [computer animation]
  • Markus Henne, Hal Hickel, Ewan Johnson, Sonoko Konishi
  • COMPCON '96. Technologies for the Information Superhighway Digest of Papers
  • 1996
Toy Story is the first full-length feature film produced entirely using the technology of computer animation. The main characters, Sheriff Woody and Space Ranger Buzz Lightyear, are toys that come to life when humans aren't around. Their story is one of rivalry, challenges, teamwork and redemption. Making this film required four years of effort, from writing the story and script, to illustrated storyboards, through modeling, animation, lighting, rendering, and filming. This paper examines…
A Multimodal Interface for Virtual Character Animation Based on Live Performance and Natural Language Processing
A multimodal animation system that combines performance- and NLP-based methods: it recognizes natural-language commands issued by the performer, extracts scene data from a text description, and creates live animations in which pre-recorded character actions can be blended with the performer's motion to increase naturalness.
Production of Character Animation in a Home Robot: A Case Study of LOVOT
Practical methods for developing many behaviors in a single robotic agent are introduced, the result of collaborative effort by a group of people with diverse professional backgrounds.
Character Animation: An Automated Gait Cycle for 3D Characters Using Mathematical Equations
This paper derives mathematical equations describing the gait cycle and tests them on several 3D characters in the Maya program, auto-generating realistic gait cycles to demonstrate their validity.
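The idea of driving a walk from closed-form equations can be sketched with simple sinusoids over the gait cycle. This is a minimal illustration only: the amplitudes, offsets, and curve shapes below are assumptions, not the equations derived in the paper.

```python
import math

def gait_angles(t, period=1.0):
    """Hip and knee angles (degrees) for one leg at time t,
    modelled as sinusoids over the gait cycle. The constants
    are illustrative, not fitted values from the paper."""
    phase = 2 * math.pi * (t % period) / period
    hip = 30 * math.sin(phase)                  # leg swings forward/back
    knee = 35 * (1 - math.cos(phase)) / 2 + 5   # knee flexes most mid-swing
    return hip, knee

def opposite_leg(t, period=1.0):
    """The other leg follows the same curves, shifted by half a cycle."""
    return gait_angles(t + period / 2, period)
```

In a tool like Maya, such functions would be evaluated per frame and written onto the characters' joint rotations to auto-generate the cycle.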
Rendering computer animations on a network of workstations
  • Timothy D. Davis, E. Davis
  • Computer Science
    Proceedings of the First Merged International Parallel Processing Symposium and Symposium on Parallel and Distributed Processing
  • 1998
This system provides a powerful and general method for obtaining high-quality animations with a significant reduction in computation and overall processing time, and presents several techniques for partitioning the data across the workstation processors.
Exploiting frame coherence with the temporal depth buffer in a distributed computing environment
  • T. Davis, E. Davis
  • Computer Science
    Proceedings 1999 IEEE Parallel Visualization and Graphics Symposium (Cat. No.99EX381)
  • 1999
The t-buffer is a conceptually simple data structure that stores knowledge about coherent pixel values across the duration of an animation; it can potentially avoid recomputing a significant number of redundant pixel values and thus substantially reduce overall computing time.
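The frame-coherence idea can be sketched as a cache of pixel values tagged with an expiry frame. This is a simplified illustration, not the paper's t-buffer: `trace` and its `valid_until` return value stand in for the real renderer's coherence analysis.

```python
def render_animation(width, height, n_frames, trace):
    """trace(f, x, y) -> (color, valid_until): the renderer returns a
    pixel value plus the last frame through which it is known to stay
    unchanged. The t-buffer caches these so coherent pixels are reused
    instead of re-traced on later frames."""
    t_buffer = {}                      # (x, y) -> (color, valid_until)
    frames = []
    for f in range(n_frames):
        img = {}
        for y in range(height):
            for x in range(width):
                cached = t_buffer.get((x, y))
                if cached is not None and f <= cached[1]:
                    img[(x, y)] = cached[0]      # reuse coherent value
                else:
                    color, valid_until = trace(f, x, y)
                    t_buffer[(x, y)] = (color, valid_until)
                    img[(x, y)] = color
        frames.append(img)
    return frames
```

For a static region every pixel is traced once and then served from the cache, which is where the reduction in overall computing time comes from.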
Heritage Reproduction in the Age of High-Resolution Scanning: A Critical Evaluation of Digital Infilling Methods for Historic Preservation
High-definition digital scanning has established itself as a useful tool for documenting cultural heritage in the twenty-first century. Proponents of surveying technology are hailing the use of…


Principles of traditional animation applied to 3D computer animation
The basic principles of traditional 2D hand-drawn animation, how those principles evolved, and their application to 3D computer animation are described.
The menv modelling and animation environment
This paper describes Menv, an environment for the development of interactive three-dimensional modelling and animation systems. Menv supports both interactive and procedural techniques. Systems built…
Compositing digital images
The case for four-channel pictures is presented, demonstrating that a matte component can be computed similarly to the color channels, and guidelines for the generation of elements and the arithmetic for their arbitrary compositing are discussed.
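The four-channel idea described above underlies the familiar "over" operator on premultiplied RGBA colors, where the matte (alpha) channel is carried and combined alongside the color channels. A minimal sketch (the function name and example values are ours):

```python
def over(fg, bg):
    """Composite premultiplied RGBA tuples (r, g, b, a), each in 0..1:
    out = fg + (1 - fg.alpha) * bg, applied to every channel,
    alpha included — the matte composites like a color channel."""
    fa = fg[3]
    return tuple(f + (1 - fa) * b for f, b in zip(fg, bg))

# A half-transparent red element over an opaque blue background.
red = (0.5, 0.0, 0.0, 0.5)    # premultiplied: r already scaled by alpha
blue = (0.0, 0.0, 1.0, 1.0)
result = over(red, blue)      # blends to purple with full coverage
```

Because the colors are premultiplied, the same arithmetic serves every channel, which is what makes arbitrary chains of compositing operations practical.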
The Reyes image rendering architecture
An architecture is presented for fast high-quality rendering of complex images that uses micropolygons to minimize paging and to support models that contain arbitrarily many primitives.
Approximate and probabilistic algorithms for shading and rendering structured particle systems
Detail enhances the visual richness and realism of computer-generated images. Our stochastic modelling approach, called particle systems, builds complex pictures from sets of simple, volume-filling…
Distributed ray tracing
Motion blur and depth of field calculations can be integrated with the visible surface calculations, avoiding the problems found in previous methods.
Rendering antialiased shadows with depth maps
A solution to the aliasing problem for shadow algorithms that use depth maps, based on a new filtering technique called percentage-closer filtering, which provides soft shadow boundaries that resemble penumbrae.
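The key move in percentage-closer filtering is to compare each nearby depth-map sample against the surface depth first, then average the binary results, rather than filtering the depths themselves. A minimal sketch, assuming a plain 2D depth array; the box kernel and bias value are illustrative choices, not the paper's:

```python
def pcf_shadow(depth_map, u, v, surface_depth, radius=1, bias=0.005):
    """Percentage-closer filtering over a (2*radius+1)^2 box of texels.
    Each neighbouring depth-map sample is compared against the surface
    depth; the averaged pass/fail results give a soft 0..1 lit factor
    instead of a hard in/out shadow edge."""
    hits = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x = min(max(u + dx, 0), len(depth_map[0]) - 1)  # clamp to map
            y = min(max(v + dy, 0), len(depth_map) - 1)
            total += 1
            if surface_depth - bias <= depth_map[y][x]:
                hits += 1                 # this sample sees the light
    return hits / total  # 0.0 = fully shadowed, 1.0 = fully lit
```

Near a shadow boundary some samples pass and some fail, so the returned fraction ramps smoothly between 0 and 1, which is what produces the penumbra-like edges.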