Studio 1 – Project Journey – Week 12

Color Corrections

 

Color correction is a vital part of postproduction. It refers to the process of altering and correcting each individual clip so that its color temperature matches across multiple shots. Color is fundamental in design and visual storytelling, as it conveys far more than the dialogue alone. Color correction balances the colors of the footage, making whites actually appear white and blacks appear black.

The goal of doing so is to match the video footage to a standard set by the visualizer and to the colors as viewed by the human eye.

If a shot is filmed outdoors, sun exposure and the time spent in the sun change the color temperature of the footage and can make one clip brighter than the rest. This is why color correction is so important: it makes your shots seamless, so the video looks like it was all filmed at the same time.

Color correction can be done using primary and secondary tools. It has been used in many action movies, such as Transformers and Black Hawk Down, and in many horror movies, such as The Ring and Saw, to make the footage look more natural and closer to the way the human eye sees the scene.
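The white-balancing idea can be illustrated with the classic “gray world” assumption: the average color of a scene should come out neutral, so scaling each channel toward that average acts as a crude automatic correction. A minimal Python sketch (the function name and the plain pixel-list representation are invented for illustration, not taken from any editing tool):

```python
def gray_world_balance(pixels):
    """Scale each channel so the image's average color becomes neutral gray.

    pixels: a list of (r, g, b) tuples with channel values in 0..255.
    """
    n = len(pixels)
    # Per-channel averages over the whole frame.
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(avg) / 3                            # target neutral level
    gains = [gray / a if a else 1.0 for a in avg]  # per-channel correction
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]
```

Running this on a warm-tinted frame pulls the red cast back toward neutral, which is exactly the “make whites white” step described above; real tools expose the same idea through temperature and tint controls.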


References

Understanding Color Correction vs. Color Grading for Post Production. (2015). Retrieved December 9, 2016, from https://www.motionelements.com/blog/articles/understanding-color-correction-vs-color-grading-for-post-production

How to colour-correct your 3D renders. (2013). Retrieved December 10, 2016, from http://www.creativebloq.com/audiovisual/colour-correct-3d-renders-9134621

 


Studio 1 – Project Journey – Week 11

Dr Strange Animation

 

Doctor Strange is an American superhero film based on the Marvel Comics character of the same name. Strange is an unusually gifted neurosurgeon who once cared only about money and becomes a sorcerer with many superpowers.

As an animator, what makes Doctor Strange stand out from both the Marvel universe and many other blockbuster competitors is its spectacular, trippy visual effects. It’s a journey through time, space, the astral plane and something called the ‘mirror dimension’, in which reality folds according to the will of the movie’s heroes and villains. Space and landscapes become increasingly distorted: the roads and skyscrapers of New York and its surrounds are constantly folded and warped, ultimately creating a kaleidoscopic array of buildings that become ever harder to escape.

Marvel Studios got ILM to make those insane bending buildings in Doctor Strange. The final visual effects were produced by Industrial Light & Magic (ILM), the visual effects studio behind motion pictures such as Star Wars, Iron Man, The Matrix, Jurassic Park and Pirates of the Caribbean, to name a few. ILM has been a constant innovator in visual effects, especially digital effects.


References:

How ILM Made Those Insane Bending Buildings in ‘Doctor Strange’. (2016). Retrieved December 2, 2016, from http://www.cartoonbrew.com/vfx/ilm-made-insane-bending-buildings-doctor-strange-144770.html

Brady, R. (2016). The Architecture of “Doctor Strange” Retrieved December 2, 2016, from http://architizer.com/blog/the-architecture-of-doctor-strange/

Suderman, P. (2016). The many inspirations for Doctor Strange’s trippy visuals, from Steve Ditko to The Matrix. Retrieved December 2, 2016, from http://www.vox.com/culture/2016/11/7/13513134/doctor-strange-marvel-visual-effects

Studio 1 – Project Journey – Week 10

Skybox in Games

A skybox is a panoramic texture drawn behind all objects in a game to represent a background at great distance, e.g. the sky.

A skybox helps set the mood and atmosphere of the world you’ve built.

Understanding skyboxes

A skybox is split into six textures representing the six directions visible along the main axes (up, down, forward, backward, left and right), which together offer a panoramic view. Once the skybox is generated, the texture images fit together seamlessly at the edges to give a continuous surrounding image that can be viewed from “inside” in any direction. The panorama sits behind all other objects in the scene and rotates to match the orientation of the camera. A skybox is an easy way to add realism to a scene while putting minimal load on the graphics hardware.
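The six-direction lookup can be sketched as a tiny function that decides which face a view direction samples, based on the dominant axis of the direction vector. This is a simplified version of what a cubemap sampler does in hardware; the face names follow the six directions listed above:

```python
def cubemap_face(x, y, z):
    """Return which of the six skybox faces a view direction (x, y, z) hits.

    The face is chosen by the dominant axis of the direction vector.
    """
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "right" if x > 0 else "left"
    if ay >= ax and ay >= az:
        return "up" if y > 0 else "down"
    return "forward" if z > 0 else "backward"
```

Because every direction maps onto exactly one face (with ties on the edges), the six textures cover the whole sphere of view, which is why the seams can be made invisible.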

 

Skyboxes consist of six panels that fold together to form a seamless scene in every direction. Adding one can be as simple as drag-and-drop. Here’s how easy it is to add one to a game in Roblox Studio:

  1. Take the models (linked below) from the catalog
  2. In Studio, click Insert > My Models
  3. Select the skybox you want to apply and click it to improve the look of your game

Skybox in Unity

The Standard Assets package in Unity comes with a number of high-quality skyboxes (menu: Assets > Import Package > Skyboxes).


The skybox is a material using one of the shaders from the RenderFX submenu. If you choose the Skybox shader, you will see an inspector like the following, with six samplers for the textures:


The Skybox Cubed shader requires the textures to be added to a cubemap asset (menu: Assets > Create > Cubemap).


 

References:

Unity – Using Skyboxes. (n.d.). Retrieved November 25, 2016, from https://unity3d.com/learn/tutorials/topics/graphics/using-skyboxes

These High-Res Skyboxes Make Games Beautiful — Fast. (n.d.). Retrieved November 25, 2016, from http://blog.roblox.com/2014/04/these-high-res-skyboxes-make-games-beautiful-fast/

Studio 1 – Project Journey – Week 9

MassFX for 3ds Max enables you to add realistic physics simulations to your projects. The plug-in supports 3ds Max-specific workflows, using modifiers and helpers to annotate the simulation aspects of your scene.

MassFX uses rigid bodies: objects that do not change shape during the simulation. A rigid body can be one of three types:

  • Kinematic: Kinematic objects are animated using standard methods. They can also be stationary objects. A Kinematic object cannot be affected by dynamic objects in the simulation, but can affect them. A Kinematic object can switch over to Dynamic status during the simulation.
  • Dynamic: The motion of Dynamic objects is controlled by the simulation. They are subject to gravity and to forces that result from being struck by other objects in the simulation.
  • Static: Static objects are like Kinematic objects but cannot be animated. However, they can be concave, unlike Dynamic and Kinematic objects.
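The three rigid-body behaviours can be illustrated with a toy one-axis simulation. This is only a conceptual sketch, not MassFX’s actual API; the `Body` class and its fields are invented here:

```python
GRAVITY = -9.8  # m/s^2 along the y axis

class Body:
    def __init__(self, kind, y=0.0, vy=0.0):
        assert kind in ("dynamic", "kinematic", "static")
        self.kind, self.y, self.vy = kind, y, vy

    def step(self, dt, keyframed_y=None):
        if self.kind == "dynamic":
            # Fully simulated: gravity (and, in a real engine, impacts) apply.
            self.vy += GRAVITY * dt
            self.y += self.vy * dt
        elif self.kind == "kinematic":
            # Follows its keyframed animation; forces are ignored,
            # but it can still push dynamic bodies around.
            if keyframed_y is not None:
                self.y = keyframed_y
        # "static": never moves at all.
```

Stepping a dynamic body makes it fall, a kinematic body simply tracks its animation, and a static body stays put — the same division of labour MassFX uses at scene scale.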

MassFX additional features:

  • The MassFX Visualizer displays simulation factors such as contact points and object velocities. This feature is key for debugging simulations.
  • MassFX Explorer, a special version of Scene Explorer, works with MassFX simulations.

Use constraints (e.g. a hinged door) to let objects restrict each other’s motion.


References:

MassFX. (2016). Retrieved November 19, 2016, from https://knowledge.autodesk.com/support/3ds-max/learn-explore/caas/CloudHelp/cloudhelp/2016/ENU/3DSMax/files/GUID-3A3E8929-A7A4-4BA8-80F2-8B32AAA7BC7B-htm.html

3ds Max Help. (n.d.). Retrieved November 19, 2016, from http://docs.autodesk.com/3DSMAX/15/ENU/3ds-Max-Help/index.html?url=files%2FGUID-6C05B7A0-EA13-4A91-AF85-9BF900103948.htm%2CtopicNumber

Studio 1 – Project Journey – Week 8

Face Morphs:

Morphing can be used to change the shape of any 3D model. It is also used for lip sync and facial expression on a 3D character. The modifier provides many channels for morph targets and materials.

The Morpher modifier is used to change the shape of a patch, mesh, or NURBS model. You can also morph World Space FFDs and shapes (splines). The Morpher modifier supports material morphing as well.
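Under the hood, a morph channel is essentially a weighted offset from the base mesh toward a target mesh with the same vertex ordering. A minimal sketch of that blend (the vertex lists and channel names are invented; the weights correspond to a Morpher channel’s percentage divided by 100):

```python
def blend_morphs(base, targets, weights):
    """Blend a base mesh toward one or more morph targets.

    base: list of (x, y, z) vertex positions of the at-rest head.
    targets: dict of name -> vertex list, same ordering as base.
    weights: dict of name -> channel weight, usually in 0..1.
    Each target contributes its offset from the base, scaled by its weight.
    """
    out = [list(v) for v in base]
    for name, w in weights.items():
        for i, (bv, tv) in enumerate(zip(base, targets[name])):
            for c in range(3):
                out[i][c] += w * (tv[c] - bv[c])
    return [tuple(v) for v in out]
```

Animating the weights over time (half a smile, fully open jaw, and so on) is what produces lip sync and facial expression from a fixed set of targets.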

Facial Animation

Create the character’s head in an “at rest” pose to achieve lip sync and facial animation. The head can be a patch, mesh, or NURBS model. Copy the head and modify the copies to create the facial-expression and lip-sync targets, then select the “at rest” head and apply the Morpher modifier.

Morph Targets for Speech animation

Speech animation uses nine mouth-shape targets. You might want to create extra morph targets to cover additional mouth shapes in case your character speaks an alien dialect.

Include cheek, nostril, and chin-jaw movement when creating mouth-position targets. Examine your own face in a mirror. You can also put a finger on your face while mouthing the phonemes to establish the direction and extent of cheek motion.

Expression: Morph Targets

For a character, create as many expression targets as necessary. Sadness, joy, evil and surprise can each have their own targets. Depending on the character’s personality, some targets, such as a happy target, may not be necessary.

 

References

Morpher Modifier. (2014). Retrieved November 9, 2016, from https://knowledge.autodesk.com/support/3ds-max/learn-explore/caas/CloudHelp/cloudhelp/2015/ENU/3DSMax/files/GUID-506247E2-1F5D-4857-998E-8256FD88626D-htm.html

Create funny face animations. Morph them ALL! (n.d.). Retrieved November 9, 2016, from http://www.facemorpher.com/

Studio 1 – Project Journey – Week 7

 

CAT Rigs

CAT (Character Animation Toolkit) is a character-animation plug-in for 3ds Max. CAT facilitates character rigging, non-linear animation, layering animation, motion capture import, muscle simulation and more.

The CATRig is the hierarchy that defines the CAT skeletal animation system. It is a flexible character rig designed to let you create the characters you want without having to write scripts, and it adds speed and sophistication to rigging.

The CATRig keeps its structure as generic as possible, enabled by CAT’s modular composition design. This is a key feature that makes it a flexible tool: you can add and remove different rig elements to get the exact skeleton your character needs.

Each rig has its own procedural walk-cycle system, layered animation system, and pose/clip system. Each rig element also combines geometry with special capabilities specific to its function.


Animating with CATrig

CAT’s FK/IK rig-manipulation system lets you push and pull the rig parts into your desired pose, whether in IK or FK. For walk-cycle sequences, CATMotion lets you create a customized walk cycle and direct the character around the scene, with no need to place individual footsteps.
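FK posing boils down to accumulating each bone’s rotation down the chain and placing every joint at the tip of its parent. A toy 2D forward-kinematics sketch of that idea (nothing CAT-specific; the function is invented for illustration):

```python
import math

def fk_positions(bone_lengths, angles):
    """Forward kinematics for a 2D bone chain.

    Each angle (radians) is relative to the parent bone; returns the
    (x, y) position of every joint, starting from the root at the origin.
    """
    x = y = total = 0.0
    points = [(0.0, 0.0)]
    for length, angle in zip(bone_lengths, angles):
        total += angle                  # rotations accumulate down the chain
        x += length * math.cos(total)
        y += length * math.sin(total)
        points.append((x, y))
    return points
```

IK is the inverse problem — solving for the angles given a desired end position — which is why being able to switch between the two, as CAT allows, is so convenient for posing.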

 

Animation is created in CAT’s nonlinear animation (NLA) system, made possible by the Layer Manager. One of the key advantages of CAT’s NLA system is that you work directly in an animation layer; you do not need to go back out and tweak the source animation elsewhere.

References:

3ds Max Help. (n.d.). Retrieved November 4, 2016, from http://docs.autodesk.com/3DSMAX/16/ENU/3ds-Max-Help/index.html?url=files%2FGUID-BB87B15F-7A2C-4C6F-AADF-3A5F2962549E.htm%2CtopicNumber

3ds Max Help. (n.d.). Retrieved November 4, 2016, from http://docs.autodesk.com/3DSMAX/16/ENU/3ds-Max-Help/index.html?url=files%2FGUID-EA1D6D09-A2CD-4204-8093-A7AE5EC5E333.htm%2CtopicNumber

Studio 1 – Project Journey – Week 6

Hair and Fur in Tangled, Brave and Zootopia

Pixar’s animators and their technological counterparts have long looked for ways to make the animated world look like the real world. Pixar developed an entirely new hair simulation software known as Taz, used for movies like Zootopia, Tangled and Brave, to name a few.

For Brave and Tangled, the hair required much greater hair-to-hair collision, meaning it needed to look more flowy and full. The hair was modeled using a series of masses and springs to avoid tangling or stiffness. With the new software, the hairs could be dealt with as one group and the hair simulation could be multi-threaded, which in turn solved the daunting task of making simulated hair look real.
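The masses-and-springs model can be sketched in a few lines: each strand is a chain of point masses connected by springs, the root is pinned to the scalp, and gravity plus spring forces are integrated every frame. This is a deliberately simplified 2D sketch, not Taz’s actual solver:

```python
def step_strand(points, velocities, rest_len, k, damping, gravity, dt):
    """One explicit-integration step for a hair strand of masses and springs.

    points/velocities: lists of [x, y]; point 0 is pinned to the scalp.
    Each spring pulls neighbouring masses toward their rest length.
    """
    forces = [[0.0, gravity] for _ in points]
    for i in range(len(points) - 1):
        dx = points[i + 1][0] - points[i][0]
        dy = points[i + 1][1] - points[i][1]
        dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
        f = k * (dist - rest_len)          # Hooke's law along the spring
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx
        forces[i][1] += fy
        forces[i + 1][0] -= fx
        forces[i + 1][1] -= fy
    for i in range(1, len(points)):        # root (i = 0) stays pinned
        for c in range(2):
            velocities[i][c] = (velocities[i][c] + forces[i][c] * dt) * damping
            points[i][c] += velocities[i][c] * dt
    return points, velocities
```

Hair-to-hair collision, which Brave’s curls demanded, would add forces between strands on top of this per-strand loop — that coupling is what made the simulation worth multi-threading.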


Dealing with fur was another challenging task: the animators had to create verisimilitude for an animal-only cast in the movie Zootopia. The fur technology makes the animals look realistic and believable. Disney’s team of engineers introduced iGroom, a fur-controlling tool that had never been used before and helps shape about 2.5 million hairs. The software gave animators tons of flexibility: they could play with the fur, give it texture and fluffiness, brush it, shape it and shade it, pushing the fur around until they found the form they wanted.


References

Brave New Hair. (2014). Retrieved October 30, 2016, from https://www.fxguide.com/featured/brave-new-hair/

Lalwani, M. (2016). Fur technology makes Zootopia’s bunnies believable. Retrieved November 4, 2016, from https://www.engadget.com/2016/03/04/fur-technology-makes-zootopias-bunnies-believable/

Studio 1 – Project Journey – Week 5

Color Scripts

There’s a science to choosing color schemes in a movie: they make the movie visually more attractive, provide psychological cues, and guide how designers use complementary pairs in the movie’s art design. Ralph Eggleston from Pixar introduced color scripts; he suggests that a color script provides a definite color palette for a movie, defining the lighting and color scheme for Pixar’s films.

A color script serves a functional purpose in animation: it gives the director every visual cue it can, from the start of the movie to the finish, on screen. A color script is an early attempt to map out the color, emotion and moods of the film.

Having a color script will not make or break your animation, but it can definitely help the studio evolve new ideas and figure out different approaches in the early stages of storytelling. The first task of any animator is to set the mood for the project. A color script is not about making a pretty piece of art; it evolves throughout the early stages of the film, hand in hand with the story development.

It is often best to start from a traditional predefined color scheme. Analogous, complementary and monochromatic color schemes are just a few of the traditional schemes available as a starting point for designers.
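These traditional schemes are just rotations around the color wheel, which is easy to sketch with Python’s standard `colorsys` module (the function names here are invented; channels are in 0..1):

```python
import colorsys

def complementary(rgb):
    """The complementary color: same lightness/saturation, hue + 180 degrees."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    return colorsys.hls_to_rgb((h + 0.5) % 1.0, l, s)

def analogous(rgb, spread=1 / 12):
    """The two neighbouring hues (plus/minus 30 degrees by default)."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    return (colorsys.hls_to_rgb((h - spread) % 1.0, l, s),
            colorsys.hls_to_rgb((h + spread) % 1.0, l, s))
```

Complementary pairs like red/cyan are exactly the kind of pairing art designers lean on for contrast between characters and environments.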

 


References

Colour Script. (n.d.). Retrieved October 22, 2016, from http://pixar-animation.weebly.com/colour-script.html

Creating a Color Script – Mike Cushny. (2011). Retrieved October 22, 2016, from http://www.mikecushny.com/color-script/

Studio 1 – Project Journey – Week 4

We are now at the beginning of texturing our created assets. It is an exciting phase that we are all looking forward to. I did a little research on some tips & tricks to keep in mind while we texture & the top 3 were:

  1. Material definition – a good way to judge whether a texture is working is to check whether the material definition is clear. One should be able to tell what kind of material it is just by looking at the texture, even when it is not on the product. Example: the difference between a wooden texture and a plastic one should be noticeable just from the amount of specularity in the material.
  2. Base material can be re-used – if one is creating a set of props with similar materials, creating a good base material texture ensures it can be re-used, with variations added on top. E.g. in our project we have multiple trees as well as wooden baskets and barrels, so once we have the base wooden texture created we can re-use it and just create variations to speed up the workflow.
  3. Beauty is in the subtle details – the beauty of a texture lies in the depth that can be added to it, and one way to add depth is to put subtle details into it. E.g. cracks in a wooden plank, moss in certain corners of a rock, etc. These small details, not obvious at first glance, are what make the texture more interesting.
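Point 2 can be illustrated with a trivial sketch: derive prop variations from one base material color instead of authoring each from scratch. The color values and prop names below are made up for the example:

```python
def tint(base, factor):
    """Derive a variation of a base color by scaling its brightness.

    base: (r, g, b) in 0..255; factor > 1 lightens, factor < 1 darkens.
    """
    return tuple(max(0, min(255, round(c * factor))) for c in base)

# One base wood color, re-used with variations for several props.
BASE_WOOD = (133, 94, 66)
variants = {
    "tree_bark": tint(BASE_WOOD, 0.8),           # darker
    "barrel": tint(BASE_WOOD, 1.0),              # the base as-is
    "sun_bleached_plank": tint(BASE_WOOD, 1.2),  # lighter
}
```

Real texture variation involves far more than a brightness tint (overlays, dirt masks, normal detail), but the workflow idea is the same: one base asset, many cheap derivatives.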

 

References:

The Top Ten Tips of Texturing | CGSociety. (n.d.). Retrieved October 14, 2016, from http://www.cgsociety.org/index.php/cgsfeatures/cgsfeaturespecial/the_top_ten_tips_of_texturing

3D Texturing Tips | Tips to Push Your Textures to the Next Level. (2015). Retrieved October 14, 2016, from http://blog.digitaltutors.com/tips-to-push-your-textures-to-the-next-level/

 

 

Studio 1 – Project Journey – Week 3

Another week’s gone by and it was an OK week. We had a few successes in terms of modelling, but more redo work to correct mistakes and bring the quality further up to what is expected. It gave me some time to explore and read about some of the tools and technologies that interest me.

This week I was reading about “Holoportation”. This is a fairly new technology that was introduced by Microsoft earlier this year. It is a new way of 3D capturing that allows high-quality 3D models of people to be reconstructed, compressed and transmitted anywhere in the world in real time. When combined with mixed-reality displays such as HoloLens, Microsoft’s augmented reality headset, and a special camera-rig setup, it allows users to see, hear and interact with remote participants in 3D as if they were actually present in the same physical space. The senses of smell and touch cannot be transmitted, and with the current state of the technology the hologram of the person appears only within the field of view. Hence, the person being holoported could appear to be standing inside a couch or a table, or even floating in mid-air.

Technically, the camera setup captures very high-quality detail from every angle, and custom software stitches the views together to generate a fully formed 3D model, which is then transmitted (holoported) anywhere in the world. However, this process generates a huge load of data, and currently most video streaming codecs are not 3D friendly. Hence, compressing gigabytes of data into megabytes for fast transfer is a big part of making this technology a success.
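One simple idea behind compressing that kind of data is quantization: snapping point coordinates to a coarse grid so they can be stored as small integers, with duplicate points collapsing away. A toy sketch of the concept (not Microsoft’s actual pipeline):

```python
def quantize_points(points, step=0.01):
    """Crude point-cloud compression: snap coordinates to a grid of
    size `step` (metres) and drop points that collapse onto the same cell.

    points: list of (x, y, z) floats in metres.
    """
    seen = set()
    out = []
    for x, y, z in points:
        cell = (round(x / step), round(y / step), round(z / step))
        if cell not in seen:
            seen.add(cell)
            out.append(cell)
    return out

def dequantize(cells, step=0.01):
    """Recover approximate positions from grid cells."""
    return [(i * step, j * step, k * step) for i, j, k in cells]
```

Trading a centimetre of precision for integer coordinates and fewer points is exactly the kind of lossy bargain a real-time 3D codec has to strike, just at far larger scale.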

Currently the spatial problems are something that still needs to be worked on, but we are fast getting there.

Microsoft hopes to eventually make this a consumer product. It could turn 2D video calls into something a lot more tangible and real. Microsoft also hopes that using this technology to communicate and interact with remote users will become as natural as face-to-face communication, which could cut down on expensive business travel significantly.

Star Wars-style holoportation may not be such a far-fetched reality now.

 

References:

  1. Holoportation – Microsoft Research. (n.d.). Retrieved October 7, 2016, from https://www.microsoft.com/en-us/research/project/holoportation-3/
  2. Microsoft introduces the world to ‘holoportation’. (2016). Retrieved October 07, 2016, from http://www.techradar.com/news/wearables/microsoft-introduces-the-world-to-holoportation–1317809
  3. How Microsoft Conjured Up Real-Life Star Wars Holograms. (n.d.). Retrieved October 07, 2016, from https://www.wired.com/2016/04/microsoft-holoportation-star-wars-hologram/