Today’s game developers reveal a new level of reality in game design
If you ever visited an arcade in the early '90s, you no doubt witnessed dozens of gamers huddled around NBA JAM, a two-on-two basketball game featuring actual NBA players. The popular phrase "He's on fire" was born in NBA JAM as the arcade blockbuster fused authentic-looking NBA players and mascots with over-the-top dunks, half-court shots and breaking backboards.
Twenty-five years ago, we believed video games had reached the pinnacle of player realism.
Boy were we wrong.
Modern video games are so realistic that most people can't tell the difference between real life and the games. Today's games feature a player's actual voice and likeness, complete with tattoos, facial expressions, mannerisms and moves.
But have you ever wondered how video game developers do it? How do they capture each nuance so perfectly that the result rivals reality?
You might think these expert developers watch a lot of film and then meticulously hand-craft the movements of each player. While there is a high degree of digital development, the foundation of the player movement goes much deeper. Studios behind popular sports games rely on an intensive motion-capture (mocap) process conducted in a technologically advanced visual studio.
Take-Two Interactive, the company behind the famed NBA 2K franchise, motion-captures each NBA athlete for its games. Each player is fitted with a skin-tight suit covered head to toe with golf-ball-sized markers that serve as data points. The player is asked to perform shots, tricks, celebrations and other real-life in-game movements. These movements are captured by almost 150 cameras that simultaneously track every marker on the player's body. Each camera takes 128 pictures per second to ensure no nuance is missed.
The software attached to the cameras then triangulates, much as GPS does: cameras viewing the player from different angles each record a marker's position, and by combining those views the software computes the 3D location of each marker, and therefore each body part. No matter which camera angle or view the video game uses, the player's action stays swift and accurate.
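To make the triangulation idea concrete, here is a minimal sketch of the standard linear (DLT) method for recovering a 3D marker position from its 2D image in two calibrated cameras. This is an illustrative textbook technique, not the studio's actual software; the camera setup and the marker position are invented for the example.

```python
import numpy as np

def triangulate(projections, observations):
    """Linear (DLT) triangulation: recover one 3D point from its 2D
    projections in several calibrated cameras.

    projections  : list of 3x4 camera projection matrices
    observations : list of (u, v) image coordinates, one per camera
    """
    rows = []
    for P, (u, v) in zip(projections, observations):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]·X) = P[0]·X, and likewise for v.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.array(rows)
    # The point is the null vector of A, i.e. the right singular
    # vector for the smallest singular value (last row of Vt).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Hypothetical rig: two unit-focal-length cameras, the second
# shifted one unit along the x-axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

marker = np.array([1.0, 2.0, 10.0])  # "true" marker position
obs1 = (marker[0] / marker[2], marker[1] / marker[2])
obs2 = ((marker[0] - 1.0) / marker[2], marker[1] / marker[2])

recovered = triangulate([P1, P2], [obs1, obs2])
print(recovered)  # ≈ [ 1.  2. 10.]
```

A real mocap pipeline does this for every marker in every frame, with dozens of cameras contributing constraints instead of two, which is what makes the reconstruction robust when some cameras lose sight of a marker.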
After motion-capturing the player, the developers overlay real film of the player performing the movements to make sure each move is as realistic as possible. The players then undergo digital body scans to capture skin tone, tattoos, facial hair, stretch marks and every other distinguishing feature. And because today's games feature in-game and post-game interviews, athletes are also sent to an audio studio and asked several questions, which they answer just as they would in a real post-game interview.
The digital scans are placed on top of each athlete's motion-capture performance and paired with the recorded audio. The result is a photo-realistic digital character that performs and sounds like its real-life counterpart. The digital renders are so realistic that tattoo artists are now suing game developers, alleging that the game companies don't own the copyrights to the players' tattoos.
They say that art imitates life. That statement couldn’t be more true than with modern sports games. As technology improves and game developers find new ways to create even more realistic games each year, I can’t help but wonder, what’s next?