What is VTubing?

This article explains the term VTubing and the benefits of pursuing VTubing as a career instead of regular YouTubing.

Origins of VTubing

VTubers originally come from Japan. The term stands for "virtual YouTuber," although most VTubers are no longer found exclusively on YouTube. VTubers create content with a digitally generated virtual avatar, often heavily inspired by anime. It is estimated that there were over 10,000 active VTubers at the beginning of 2020, and the number of active VTubers has grown exponentially over the last two years. VTubers use characters individually designed by artists and bring them to life with programs such as Live2D. This allows both streamers and YouTubers to hide their real identity behind an avatar.

While the main audience for VTubing currently sits outside of Japan, among fans of Japanese culture and anime, there is real potential to grow beyond this origin.

VTubing technology

The technology behind VTubing usually involves facial and gesture recognition in combination with animation software. VTubers often have to invest in the creation of a full-body avatar and some hardware to make the technology work. This includes 3D animation software that allows the avatar to move naturally. Motion capture systems and some smaller components, such as webcams or cell phones, can also be used. Some VTubers also use additional software to change their voice. This allows streamers to disguise their identity even further and give the avatar its own voice. There are complex and cost-intensive setups, but there are also ways to keep costs low. If you are interested in this technology, please contact us to arrange an individual consultation.

Advantages of VTubing

  • The advantages lie chiefly in the creation of content. The content belongs to the streamer, because it is their own "identity" and personality that appeals to viewers, not a separate brand.
  • VTubers often separate their personal identity from the avatar they play. This means they can have millions of views and viewers online and still go unnoticed in real life. For example, they can go out in public, create content under a different avatar, and attend fan conventions without attracting the attention other creators are exposed to.
  • A content creator can control the avatar together with other people. For example, one VTuber performs the facial expressions and gestures while another provides the voice. As long as the character is carefully crafted and its styling decisions and personality traits are defined, other actors can also take the reins. This reduces the burden on a single VTuber and may even mean that the avatar's IP could be sold.
  • If Ninja were to sell his brand to another streamer, it would be almost impossible for the new owner to maintain or extend the IP. This problem does not exist in the world of VTubing, a world that is in its early stages and will continue to grow in the coming years.


Well-known VTubers

The VTuber community on Twitch is growing steadily and creators like Pokimane are even streaming with their own animated 3D models. But one animated streamer that is quickly gaining traction on the platform, currently with just over 200,000 followers, is CodeMiko.

Behind CodeMiko's colorful model is the creator, known on stream as "The Technician", who often pops up at the end of Miko's streams to talk to the community as a streamer normally would. The Technician uses an Xsens suit and Unreal Engine to power Miko. In her Twitch "About" section, she even reveals that the development was done entirely by her and that Miko was 100% modeled and rigged by her. The streamer went viral on Twitter in November after showing a side-by-side clip of the motion-capture process, which left the audience in awe of her technical setup and how well her movements translated to the 3D model.

What is needed for VTubing

For successful VTubing, various hardware and software elements are required. Below are the ones that have been tested and used by Studio Merkas.

Unreal Engine 4.26

Unreal Engine is a 3D game engine by Epic. Its application scope now extends far beyond just "gaming." The engine is used not only for movies but also for real-time applications such as visualizations or VR applications. In relation to the VStreaming project, Unreal Engine provides an excellent opportunity to create a world that delivers impressive visual results "out of the box." With the right hardware and blueprint setup, 3D avatars can be controlled via live motion capturing and face tracking, and streamed directly through OBS.

Unity

Unity is a 2D/3D game engine that is now also used for various real-time applications. With Unity, 3D models created in software like Blender are converted into the VRM format. This format allows the avatar to be used in software such as VSeeFace, Luppet, and others. To convert an avatar from Blender to VRM, the UniVRM plugin and Unity 2019.4 LTS or a higher Unity version are required.
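Under the hood, a VRM 0.x file (the generation UniVRM exports for Unity 2019.4) is a glTF 2.0 binary (GLB) container whose JSON chunk declares a "VRM" extension. As a minimal, stdlib-only sketch, not a full parser (the byte layout follows the glTF spec; the helper name and the in-memory demo payload are our own), you could inspect one like this:

```python
import json
import struct

def read_glb_json(data: bytes) -> dict:
    """Parse the JSON chunk of a glTF binary (GLB) container.
    VRM 0.x avatars exported via UniVRM are GLB files whose JSON
    declares a "VRM" entry under "extensions"."""
    magic, version, length = struct.unpack_from("<4sII", data, 0)
    assert magic == b"glTF", "not a GLB/VRM file"
    chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
    assert chunk_type == b"JSON", "first chunk must be JSON"
    return json.loads(data[20:20 + chunk_len])

# Build a tiny in-memory GLB for demonstration; a real .vrm would be
# read with open(path, "rb").read().
payload = json.dumps({"asset": {"version": "2.0"},
                      "extensions": {"VRM": {}}}).encode()
payload += b" " * (-len(payload) % 4)                 # chunks are 4-byte aligned
glb = struct.pack("<4sII", b"glTF", 2, 12 + 8 + len(payload))
glb += struct.pack("<I4s", len(payload), b"JSON") + payload

doc = read_glb_json(glb)
print("VRM" in doc.get("extensions", {}))             # prints True for a VRM 0.x file
```

Newer VRM 1.0 files use the "VRMC_vrm" extension key instead, so a robust check would look for either.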

VSeeFace

VSeeFace is a free, highly configurable facial and hand-tracking VRM and VSFAvatar avatar puppeteering program. The software is designed for virtual YouTubers, with a focus on robust tracking and high image quality. VSeeFace offers similar functionality to Luppet, 3tene, Wakaru, and other similar programs. The software runs on Windows 8 and higher versions. VSeeFace can also send, receive, and combine tracking data via the VMC protocol, enabling perfect iPhone sync support through Waidayo. Facial recognition, including gaze, blinking, eyebrow, and mouth tracking, is done using a standard webcam. An optional gesture control requires a Leap Motion device.
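The VMC protocol mentioned above is OSC (Open Sound Control) sent over UDP. As a minimal sketch using only the Python standard library (the address `/VMC/Ext/Bone/Pos` with one string plus seven float arguments follows the published VMC spec; port 39539 is the protocol's conventional receiver port, and the helper names are our own), a single bone-position message could be assembled and sent like this:

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_pos(bone: str, pos, rot) -> bytes:
    """Build a /VMC/Ext/Bone/Pos OSC message: one string (bone name)
    and seven float32s (position xyz + rotation quaternion xyzw)."""
    msg = osc_string("/VMC/Ext/Bone/Pos")
    msg += osc_string(",sfffffff")            # OSC type tag string
    msg += osc_string(bone)
    msg += struct.pack(">7f", *pos, *rot)     # OSC floats are big-endian
    return msg

# Send one frame of head-bone data to a local VMC receiver such as VSeeFace
# (adjust host and port to your setup).
packet = vmc_bone_pos("Head", (0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0))
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 39539))
sock.close()
```

In practice a sender streams such messages every frame, and receivers like VSeeFace merge them with their own tracking data; libraries such as python-osc can replace the hand-rolled encoder.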

Hardware

For VTubing, at least a webcam is required to control the character. To achieve better quality in VStreaming, more expensive hardware must be used. For very accurate face tracking, an iPhone X or a newer model is needed. A Leap Motion controller or HTC Vive trackers help make hands and arms visible and movable in the stream. If money is no object, the customer can also opt for a mocap suit, which tracks the entire body.

You can find more information on this topic in the article “8 tips for your VTubing success”.

Questions & Wishes

We hope you like our article and would like to invite you to share your thoughts and questions on the topic with us. If you have any questions, comments or feedback on the content of this article, please don't hesitate to let us know in the comments section. We're always happy to hear from our readers and engage in meaningful discussions about game development.
Just ask us anything you want to know and we'll do our best to provide the answers you're looking for. Thank you for your support and we look forward to hearing from you!
