We all know Unity’s game engine is good for, well, games. Unity Technologies has been around for a long time now. Founded in Copenhagen, Denmark, in 2004, the company’s goal was simply to help people make games. But as Unity grew more popular, it became clear to the company that it could do more. It had to do more.
But what else is there for Unity? Could mastering Unity’s many features lead you to previously unrelated careers? Is there room for even more opportunities with Unity in the future?
Augmented Reality, Virtual Reality, and Mixed Reality
It should be no surprise that Unity Technologies has developed tools for all three reality types; each is closely related to video games. In fact, some have suggested that reality-altering technology is the future of gaming.
Half-Life: Alyx, released in 2020 by Valve, earned near-universal acclaim. Sony is gearing up to release the PSVR2. A 2021 article by The Washington Post noted serious financial growth in the virtual reality market. By now, almost anyone even somewhat familiar with entertainment technology has heard of at least one reality mode.
But what does each reality mode even mean?
Augmented reality overlays virtual objects on the real world. A good example is Pokémon Go.
Virtual reality, on the other hand, does not show reality at all; it replaces your surroundings with a fully computer-generated environment.
Lastly, mixed reality, as seen in Microsoft’s HoloLens or interactive holograms at events, sits between augmented and virtual: real and virtual objects interact with each other.
None of these technologies are restricted to games, though. Some VR headsets can detect facial movements, such as smiles or winks. While this capability is still used predominantly by the gaming world, whether for the blink detection in Through Your Eyes (which you should absolutely play) or simply to emote to other players, some are looking to it for assisting people with disabilities. On an Oculus VR forum thread from 2018, user ppdedios asked the community whether those who lack certain muscle movements could use facial detection, blinking an eye or lifting a corner of the mouth, to call for help. While the idea received mixed responses, it points to a new opportunity for anyone who can program for VR, AR, or MR.
Architecture, Architectural Engineering, and Interior Design
With the reality modes listed above, interior designers are finding an easier way to communicate ideas to clients. By overlaying potential changes on a room’s existing dimensions and features, clients can quickly decide whether the designer should move on to actually changing the living space. Similarly, architects and architectural engineers can test both form and function and present the results to clients.
Animation
We all know Unity and Cinemachine can assist with animation. But do you know just how much they can assist?
While Walt Disney Animation Studios and Disney-Pixar have an abundance of tools, from Autodesk Maya to RenderMan to Presto, Unity has caught their attention in recent years. As shown on Unity’s “Made With” page, Disney used Unity for Baymax Dreams, a short inspired by the 2014 animated film Big Hero 6 and directed by Bee Movie‘s Simon J. Smith.
In a behind-the-scenes video, Big Hero 6: The Series executive producer Bob Schooley states that the premise of Baymax Dreams was “something we wouldn’t do in the series probably, and [using Unity for this short] really feels like it’s taking Baymax to some new place.” Fellow executive producer Mark McCorkle adds, “It’s a show about technology and that’s why [incorporating Unity] really was a marvelous fit.”
And sure, most 3D animation is still created primarily in software such as 3ds Max, Cinema 4D, and the programs listed above, because those are dedicated 3D production tools. But Unity is catching up. At the moment, Unity is most useful for polishing animations and applying post-production effects after importing animation files from another program. Yet Unity had far fewer animation tools years ago, and now it has several. What is stopping Unity—whether via the community, the Asset Store, or the company itself—from getting even closer to dedicated animation software?
Simulations and More
Unity is now being used for simulations, many of which train people to save lives. Medical training sims, for instance, are an excellent way to portray the consequences of poor execution. Most medical dummies can’t bleed out, change oxygen levels, or show physical signs of distress; if they could, those dummies would likely cost far more than a simulation program. Specific treatments, operations, and procedures can be practiced thoroughly in VR, making the trainee feel as if they are a surgeon hovering over the incision site. Emergency response teams can practice triaging victims in mass casualty incidents. Search-and-rescue sims in Unity can help teams rehearse protocol, potentially speeding up a search. And safety training simulations for working with machinery or corrosive chemicals are slowly replacing dull, unmemorable e-learning videos.
With the ability to create simulations, you and other Unity experts could find yourselves developing software for crash test studies that cause no material damage. Or your software might generate scenarios for insurance companies to analyze. You might build physics simulations and physically accurate animation calculators as an alternative to stunt doubles in films. You could create touch-screen menus that guide travelers to their attractions.
There are plenty of possibilities. As Dean Takahashi says about Unity in a 2018 article, “’creation engine’ might be a better name for it.”