Wednesday, March 31, 2010

Computer generated motion graphics


The term motion graphics originated with video editing in computing, perhaps to keep pace with newer technology. Before computers were widely available, motion graphics were costly and time consuming, limiting their use to only high budget film and TV projects. With the reduced cost of producing motion graphics on a computer, the discipline has seen more widespread use. With the availability of desktop programs such as Adobe After Effects, Discreet Combustion, and Apple Motion, motion graphics have become increasingly accessible.

The term "motion graphics" was popularized by Trish and Chris Meyer's book about the use of Adobe After Effects, titled "Creating Motion Graphics". This marked the beginning of desktop applications that specialized in video production but were neither editing nor 3D programs. These new programs collected special effects, compositing, and color correction toolsets, and sat primarily between the edit and 3D stages of the production process. This "in-between" role, and the resulting style of animation, is why motion graphics is sometimes referred to as "2.5D".

Motion graphics continue to evolve as an art form with the incorporation of sweeping camera paths and 3D elements. Maxon's CINEMA 4D is known for its ease of use, plugins such as MoGraph, and integration with Adobe After Effects. Despite their relative complexity, Autodesk's Maya and 3D Studio Max are also widely used for the animation and design of motion graphics. Maya, traditionally used for high-end special effects and character animation, has the advantage of an extremely robust feature set and a wide-ranging user base. 3D Studio Max has many of the advanced features of Maya and uses a node-based particle system generator similar to Cinema 4D's Thinking Particles plugin. There are also open-source packages that are gaining features and adopters for use in motion graphics workflows; Blender, with its node editor, is becoming more and more powerful.

Many motion graphics animators learn several 3D graphics packages so they can draw on each program's strengths. Although many trends in motion graphics tend to be based on a specific software's capabilities, the software is only a tool the designer uses while bringing the vision to life.

Borrowing heavily from techniques such as collage and pastiche, motion graphics has also begun to integrate many traditional animation techniques, including stop-motion animation, cel animation, or a combination of both.

Motion graphics

Motion graphics are graphics that use video and/or animation technology to create the illusion of motion or a transforming appearance. These motion graphics are usually combined with audio for use in multimedia projects. Motion graphics are usually displayed via electronic media technology, but may be displayed via manually powered technology (e.g. thaumatrope, phenakistoscope, stroboscope, zoetrope, praxinoscope, flip book) as well. The term is useful for distinguishing still graphics from graphics with a transforming appearance over time, without over-specifying the form.

Motion graphics versus film

Motion graphics include animations, movies, etc. The term "motion graphics" has the potential for less ambiguity than the use of the term "film" to describe moving pictures in the 21st century. "Film" is also used to describe photographic film (the 20th century medium of choice for recording motion), the process of recording footage, and the industry it most serves. However, digital video recording and digital projection have the potential to make photographic film obsolete. The term "capture" is often used instead of "film" as a verb to describe the process of recording footage, perhaps due to the term's compatibility with digital video and motion capture technology. "The motion picture industry" is the formal term for what used to be called the "film industry".


Tuesday, March 30, 2010

3D motion tracking

Effective use of motion tracking can transform your live-action footage. Motion graphics pro JJ Johnstone reveals how to use motion tracking when undertaking complex 3D compositing in After Effects and Cinema 4D.

Motion tracking, or match-moving, is the term used to describe the simulation of live-action camera moves and perspective inside compositing software such as After Effects, Combustion, Shake or Flame for 2D, and Boujou, PF Track, SynthEyes or Matchmover for 3D. It’s commonly used to apply special effects to feature films and commercials, but it’s becoming more common within type and character animation in motion graphics.

ImarisTrack

Discover the Meaning of Motion

ImarisTrack is the most powerful commercially available tracking program that rises to the challenge of monitoring temporal changes in biological systems (2D and 3D images over time). Based on a choice of multiple sophisticated automatic tracking algorithms, the ability to manually edit and correct tracks if needed, and the ability to work on extremely large and complex data sets, ImarisTrack allows researchers to answer even the most demanding live-cell imaging questions.


Thursday, March 25, 2010

'Get AAAGR Before Your Friend Does'



" We Provide Appropriate Platform
For Your Endless Imagination."


Hurry! Admissions Open
Get Your Lucky "Coupon"


Join Us
Make Your Life
ColoRFul

For more information visit -

www.3D-BunkaR.com

Wednesday, March 24, 2010

Motion capture, motion tracking, or mocap are terms used to describe the process of recording movement and translating that movement onto a digital model. It is used in military, entertainment, sports, and medical applications. In filmmaking it refers to recording the actions of human actors and using that information to animate digital character models in 2D or 3D computer animation. When it includes face and fingers and captures subtle expressions, it is often referred to as performance capture.

Optical systems use data captured from image sensors to triangulate the 3D position of a subject using two or more cameras calibrated to provide overlapping projections. Data acquisition is traditionally implemented using special markers attached to an actor; however, more recent systems are able to generate accurate data by tracking surface features identified dynamically for each particular subject. Tracking a large number of performers, or expanding the capture area, is accomplished by adding more cameras. These systems produce data with three degrees of freedom for each marker, so rotational information must be inferred from the relative orientation of three or more markers; for instance, shoulder, elbow, and wrist markers providing the angle of the elbow.
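The triangulation step described above can be sketched with linear (DLT) triangulation: each calibrated camera contributes two linear constraints on the marker's homogeneous 3D position. The projection matrices and pixel coordinates below are toy values for illustration only, not data from any real system.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two calibrated cameras.

    P1, P2   -- 3x4 camera projection matrices (from calibration)
    uv1, uv2 -- (u, v) pixel coordinates of the same marker in each view
    Returns the marker's 3D position in world coordinates.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Least-squares solution: the right singular vector of A with the
    # smallest singular value spans the (approximate) null space.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy cameras with overlapping views of the point (1, 2, 10):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])            # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1], [0], [0]])])  # shifted 1 unit on x
point = np.array([1.0, 2.0, 10.0, 1.0])
uv1 = (P1 @ point)[:2] / (P1 @ point)[2]
uv2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate(P1, P2, uv1, uv2))  # recovers approximately (1, 2, 10)
```

Real systems run this over many cameras and markers per frame and add lens-distortion correction, but the core geometry is the same.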

Passive markers

A dancer wearing a suit used in an optical motion capture system
Several markers are placed at specific points on an actor's face during facial optical motion capture

Passive optical systems use markers coated with a retroreflective material to reflect light, generated near the camera's lens, back toward the camera. The camera's threshold can be adjusted so that only the bright reflective markers are sampled, ignoring skin and fabric.

The centroid of the marker is estimated as a position within the two-dimensional captured image. The grayscale value of each pixel can be used to provide sub-pixel accuracy by finding the centroid of the Gaussian.
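That intensity-weighted centroid can be sketched in a few lines: using each pixel's grayscale value as a weight recovers the blob centre to sub-pixel accuracy. The Gaussian patch below is synthetic, purely for illustration.

```python
import numpy as np

def subpixel_centroid(patch):
    """Intensity-weighted centroid of a grayscale patch around a marker blob.

    Weighting each pixel coordinate by its brightness locates the centre
    of a Gaussian-shaped marker image to sub-pixel accuracy.
    Returns (row, col) in patch coordinates.
    """
    patch = patch.astype(float)
    rows, cols = np.indices(patch.shape)
    total = patch.sum()
    return (rows * patch).sum() / total, (cols * patch).sum() / total

# A tiny synthetic Gaussian blob centred between pixels at (2.5, 1.5):
r, c = np.indices((6, 6))
blob = np.exp(-((r - 2.5) ** 2 + (c - 1.5) ** 2) / 2.0)
print(subpixel_centroid(blob))  # close to (2.5, 1.5)
```

The estimate is exact when the blob is symmetric within the patch and drifts slightly when the Gaussian's tails are clipped at the patch border.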

An object with markers attached at known positions is used to calibrate the cameras and obtain their positions, and the lens distortion of each camera is measured. Provided that at least two calibrated cameras can see a marker, a three-dimensional fix can be obtained. Typically a system will consist of around 6 to 24 cameras; systems with over three hundred cameras exist to try to reduce marker swap. Extra cameras are required for full coverage around the capture subject and for multiple subjects.

Vendors provide constraint software to reduce problems from marker swapping, since all markers appear identical. Unlike active marker systems and magnetic systems, passive systems do not require the user to wear wires or electronic equipment. Instead, hundreds of rubber balls coated with reflective tape, which needs to be replaced periodically, are attached to the performer. The markers are usually attached directly to the skin (as in biomechanics) or velcroed to a performer wearing a full-body spandex/lycra suit designed specifically for motion capture. This type of system can capture large numbers of markers at frame rates as high as 2,000 fps. The frame rate for a given system is often a balance between resolution and speed: a 4-megapixel system normally runs at 370 hertz but can reduce the resolution to 0.3 megapixels and then run at 2,000 hertz. Typical systems cost around $100,000 for a 4-megapixel, 360-hertz system and $50,000 for a 0.3-megapixel, 120-hertz system.

Active marker

Active optical systems triangulate positions by illuminating one LED at a time very quickly, or multiple LEDs with software to identify them by their relative positions, somewhat akin to celestial navigation. Rather than reflecting back light generated externally, the markers themselves are powered to emit their own light. Since the inverse-square law delivers one quarter of the power at twice the distance, this can increase the distances and volume available for capture.
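The inverse-square point is simple enough to state as arithmetic; this tiny sketch just makes the 1/4-at-double-distance claim concrete.

```python
def relative_power(distance_ratio):
    """Relative received power under the inverse-square law.

    Doubling the distance to a light source quarters the received power,
    which is why self-emitting (active) markers can be captured over a
    larger volume than retroreflective markers lit from the camera,
    whose light must make the round trip.
    """
    return 1.0 / distance_ratio ** 2

print(relative_power(2))  # 0.25 -- one quarter of the power at twice the distance
print(relative_power(3))  # about 0.111 -- one ninth at three times the distance
```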

An episode of the TV series Stargate SG-1 was produced using an active optical system for the VFX. The actor had to walk around props that would have made motion capture difficult for other, non-active optical systems.

ILM used active markers in Van Helsing to allow capture of the harpies on very large sets. The power to each marker can be provided sequentially, in phase with the capture system, providing a unique identification of each marker for a given capture frame, at a cost to the resultant frame rate. The ability to identify each marker in this manner is useful in real-time applications. The alternative method of identifying markers is to do it algorithmically, which requires extra processing of the data.

Time modulated active marker

A high-resolution active marker system with 3,600 × 3,600 resolution at 480 hertz providing real time submillimeter positions.

Active marker systems can be further refined by strobing one marker on at a time, or by tracking multiple markers over time and modulating the amplitude or pulse width to provide marker IDs. 12-megapixel spatial resolution modulated systems show more subtle movements than 4-megapixel optical systems, by having both higher spatial and temporal resolution. Directors can see the actor's performance in real time and watch the results on the mocap-driven CG character. The unique marker IDs reduce turnaround by eliminating marker swapping and providing much cleaner data than other technologies. LEDs with onboard processing and radio synchronization allow motion capture outdoors in direct sunlight, while capturing at 480 frames per second thanks to a high-speed electronic shutter. Computer processing of modulated IDs allows less hand cleanup, or filtered results, for lower operational costs. This higher accuracy and resolution requires more processing than passive technologies, but the additional processing is done at the camera to improve resolution via subpixel or centroid processing, providing both high resolution and high speed. These motion capture systems are typically under $50,000 for an eight-camera, 12-megapixel spatial resolution, 480-hertz system with one actor.
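One way to picture the time-modulated ID scheme: if each marker strobes a unique on/off pattern across successive frames, thresholding its brightness in each frame turns the sequence back into a binary ID, eliminating marker swap. This is a simplified illustration of the principle, not any vendor's actual protocol.

```python
def decode_marker_id(samples):
    """Recover a marker's binary ID from per-frame brightness samples.

    In a time-modulated system each marker strobes a unique on/off
    pattern; thresholding the brightness seen in each capture frame
    turns the sequence back into the marker's ID, so markers never
    need to be distinguished by position alone.
    """
    marker_id = 0
    for s in samples:
        marker_id = (marker_id << 1) | (1 if s > 0.5 else 0)
    return marker_id

# A marker blinking bright, dark, bright, bright carries ID 0b1011 = 11.
print(decode_marker_id([0.9, 0.1, 0.8, 0.95]))  # 11
```

With n frames per ID cycle this distinguishes 2**n markers, which is the frame-rate cost the article mentions: some temporal resolution is traded for unambiguous identification.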

IR sensors can compute their location when lit by mobile multi-LED emitters, e.g. in a moving car. With an ID per marker, these sensor tags can be worn under clothing and tracked at 500 Hz in broad daylight.

Semi-passive imperceptible marker

One can reverse the traditional approach based on high-speed cameras. Systems such as Prakash use inexpensive multi-LED high-speed projectors. The specially built multi-LED IR projectors optically encode the space. Instead of retroreflective or active light-emitting diode (LED) markers, the system uses photosensitive marker tags to decode the optical signals. By attaching tags with photosensors to scene points, the tags can compute not only the location of each point, but also their own orientation, incident illumination, and reflectance.
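The "optically encode the space" idea can be sketched as follows: if the projector flashes a sequence of Gray-coded stripe patterns, a tag's photosensor records one bit per pattern and can decode which projector column it sits in, with no camera involved. This is a simplified illustration of the principle, not the actual Prakash implementation.

```python
def gray_to_binary(gray):
    """Convert a Gray-coded integer to plain binary."""
    binary = gray
    while gray:
        gray >>= 1
        binary ^= gray
    return binary

def decode_position(bit_sequence):
    """Decode a tag's projector column from its photosensor samples.

    The projector flashes one Gray-coded stripe pattern per frame;
    the tag's photosensor contributes one bit per pattern, and the
    assembled Gray code identifies the column illuminating the tag.
    Gray codes are used so adjacent columns differ by only one bit,
    limiting decoding errors at stripe boundaries.
    """
    gray = 0
    for bit in bit_sequence:
        gray = (gray << 1) | bit
    return gray_to_binary(gray)

# A tag that saw the patterns on, on, off (Gray code 110) sits in column 4.
print(decode_position([1, 1, 0]))  # 4
```

Because each tag decodes its own position locally, the system scales to any number of tags without the bandwidth of a high-speed image stream, which is the advantage the article describes.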

These tracking tags work in natural lighting conditions and can be imperceptibly embedded in attire or other objects. The system supports an unlimited number of tags in a scene, with each tag uniquely identified to eliminate marker reacquisition issues. Since the system eliminates a high-speed camera and the corresponding high-speed image stream, it requires significantly lower data bandwidth. The tags also provide incident illumination data, which can be used to match scene lighting when inserting synthetic elements. The technique appears ideal for on-set motion capture or real-time broadcasting of virtual sets, but has yet to be proven.


Monday, March 22, 2010

Superfad Seattle brings more than just eye candy to this surreal exploration of live action and vfx for Sony Bravia HDTV. Unfolding in three parts, "Birth of Color," "Explosion of Color," and "Release of Color," the piece takes us on a dreamlike journey, with each section visually manifesting Sony's global brand message of "make.believe".

Drawing from the theatrical world of fashion photography, Superfad chose spherical objects to represent the dot in “make.believe” and serve as a thread that runs throughout the piece.

As an extra bonus, we’re including both the final piece and a behind-the-scenes video in HD. Also check out the process frames and style boards sent from directors Will Hyde and Carlos Stevens.


IBM’s recent campaign, exposing the data that flows through our world and keeps it moving, has produced Data Anthem and Data Baby. For these, IBM called upon two names that have become synonymous with beautifying data: James Frost of Zoo Films (partnered with The Mill) and Motion Theory, respectively.

Yes, we know the extraneous use of numbers, particles, etc. has run its course as a stylistic flourish. However, this immaculately executed "data" and its aesthetic components play second fiddle to this campaign’s concept.

Sunday, March 21, 2010

Motion 4

3D effects in record time.

Motion 4 icon

Just drag and drop to send particles exploding through space. Swing cameras around an object with breathtaking ease. Apply new, realistic shadows and reflections with a click, and animate credit rolls in seconds. With Motion 4, it’s easier than ever to create astonishing 2D and 3D motion graphics.

Intuitive real-time design environment

Start animating right away with the easy-to-use tools and intuitive interface in Motion 4. You can see the results in real time as you work and adjust animation settings on the fly. Instantly set an object in motion by dragging a behavior to the Canvas, then use the Keyframe Editor to make precise adjustments.

Effortless 3D graphics

Create motion graphics in 3D space without learning a brand-new interface. The integrated 3D multiplane environment in Motion 4 is a natural extension of familiar 2D tools. All of the Motion features, such as particles, text behaviors, motion paths, and Replicator, work in the 3D environment.

Easy-to-use text and titling tools

Make the letters in a word, phrase, or sentence tumble onto the screen by adjusting a single character. Create instant sequences of numbers that count up or count down. Build editable lower thirds and other titles for use in Final Cut Pro, with regular text or animated LiveFonts. Working with text has never been easier.

RealD builds 3D branding with RabbitHoles interactive displays @ ShoWest

ShoWest Logo

RealD, the world’s leading supplier of 3D projection systems in movie theatres, commissioned RabbitHoles Media to design and craft 3D branding displays for the ShoWest Conference & Exhibition in Las Vegas, NV.

The illuminated “jewel cases” feature over-sized replicas of RealD’s signature glasses, with RabbitHoles 3D animated holograms acting as their lenses and containing an interactive experience with the techno-dog from RealD’s in-theatre promo trailer.

ShoWest, the world’s largest exhibitor conference, is being held at the Paris & Bally Hotels from March 15–18. The RealD 3D displays are lit up in both RealD’s domestic and international sales suites and in ShoWest’s “Hall of Posters”, where upcoming movie attractions are promoted leading into the conference registration and presentation area.

RealD uses 3D Animated Holograms for Promotional Display during CES 2010 Trade Show

Holographic 3D Glasses - RealD Trade show stand Global leader in 3D projection technology for cinema and home, RealD, commissioned RabbitHoles Media to create a branded promotional display for use during CES 2010. The portable display features a giant replica of their trademark 3D glasses atop a glowing blue & white prism; the lenses of the giant 3D glasses are RabbitHoles 3D animated holograms that reveal an interactive holographic scene from RealD’s branding trailer, complete with 3D robot dog at play with RealD’s animating logo in a serene 3D environment. The display will be revealed at the 3D Film Festival (3DFF) special event at Planet Hollywood opening night of the CES 2010 Trade Show in Las Vegas, NV.

RealD Awards AVATAR 3D Motion Holograms to Jim Cameron & Jon Landau for Innovation in 3D @ ShoWest

Avatar Hologram, Showest 3d awards

During Coca-Cola’s Final Night Ceremony at the ShoWest Conference & Exhibition, RealD’s CEO Michael Lewis presented their Innovation in 3D award to Jim Cameron and Jon Landau of Lightstorm Entertainment for AVATAR. The commissioned RabbitHoles holographic awards are the first 3-dimensional imagery ever printed of AVATAR and publicly displayed, created by Weta Digital in collaboration with RabbitHoles Media. The 3D print depicts AVATAR’s Neytiri, crouched in the Pandora jungle, bow drawn, as an ethereal Sprite alights onto the tip of her arrow. As viewers interact with the print, they experience true-3D perspectives on the Sprite’s action, Neytiri’s awestruck expression, and the rich jungle environment as it reveals around her in nearly 180-degrees. Following the ceremony, attendees flowed to the front of the room to shake hands, interact with the AVATAR holograms, and capture photos and videos of the experience (keep an eye out on YouTube!).

Several thousand people were in attendance, including studio executives from 20th Century Fox, Disney/Pixar, Warner Bros, Paramount/Dreamworks, Universal, Sony, as well as independent production houses Summit, Laika, and Legendary. Other ShoWest award winners present included actors Sam Worthington (Avatar / Clash of the Titans / Terminator Salvation), Katherine Heigl (Knocked Up / 27 Dresses / The Ugly Truth), Amanda Seyfried (Mamma Mia! / Dear John / Letters to Juliet) and Sarah Jessica Parker, Kristin Davis, and Cynthia Nixon of Sex and the City, as well as directors Jay Roach (Meet the Parents / Austin Powers / Borat) and Todd Phillips (The Hangover / Old School / Due Date), and producer Jerry Bruckheimer (The Rock). The event also hosted worldwide theatre exhibitors AMC Theatres, Regal, Cinemark, and Rave Motion Pictures among many others, as well as technology and service companies like Deluxe Digital, RealD, Dolby, IMAX, and Technicolor that cater to the movie exhibition industry, and especially to the emergence of the digital 3D movie-going experience.