This article explains the term VTubing and the benefits of becoming a VTuber rather than a regular YouTuber.
VTubers originally come from Japan. The term stands for "virtual YouTuber," although most VTubers are no longer found exclusively on YouTube. VTubers create content with a digitally generated virtual avatar, often heavily inspired by anime. It is estimated that there were over 10,000 active VTubers at the beginning of 2020, and their number has grown exponentially over the last two years. VTubers use characters individually designed by artists and bring them to life with programs such as Live2D. This allows both streamers and YouTubers to hide their real identity and be represented by an avatar.
While the main audience for VTubing currently consists of fans of Japanese culture and anime outside of Japan, there is real potential to grow beyond this origin.
The technology behind VTubing usually involves facial and gesture recognition in combination with animation software. VTubers often have to invest in the creation of a full-body avatar and some hardware to make the technology work. This includes 3D animation software that allows the avatar to move naturally. Motion capture systems and some smaller components, such as webcams or cell phones, can also be used. Some VTubers also use additional software to change their voice. This allows streamers to disguise their identity even further and give the avatar its own voice. There are complex and cost-intensive setups, but there are also ways to keep costs low. If you are interested in this technology, please contact us to arrange an individual consultation.
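The facial-recognition step mentioned above can be illustrated with a small sketch. A common technique for driving an avatar's blink is the eye aspect ratio (EAR): given six 2D eye landmarks from a face-tracking library, the ratio of vertical to horizontal eye openness drops sharply when the eye closes. The landmark coordinates and threshold below are made-up illustration values, not output from any specific tracker:

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six 2D eye landmarks.

    `eye` is a list of six (x, y) points in the usual landmark order:
    left corner, two upper-lid points, right corner, two lower-lid points.
    EAR drops toward 0 as the eye closes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

# Hypothetical landmarks for an open and a nearly closed eye.
open_eye = [(0, 0), (2, 3), (4, 3), (6, 0), (4, -3), (2, -3)]
closed_eye = [(0, 0), (2, 0.4), (4, 0.4), (6, 0), (4, -0.4), (2, -0.4)]

BLINK_THRESHOLD = 0.2  # illustrative cutoff; tuned per face and camera
print(eye_aspect_ratio(open_eye) < BLINK_THRESHOLD)    # False (eye open)
print(eye_aspect_ratio(closed_eye) < BLINK_THRESHOLD)  # True (blink)
```

In a real setup, the landmarks would come from the webcam tracker each frame, and the boolean would drive the avatar's blink blendshape.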
The VTuber community on Twitch is growing steadily and creators like Pokimane are even streaming with their own animated 3D models. But one animated streamer that is quickly gaining traction on the platform, currently with just over 200,000 followers, is CodeMiko.
Behind CodeMiko's colorful model is the creator, known on stream as "The Technician", who often pops up at the end of Miko's streams to talk to the community as a streamer normally would. The Technician uses an Xsens suit and the Unreal Engine to power Miko. In her Twitch "About" section, she even reveals that the development was done entirely by her and that Miko was 100% modeled and rigged by her. The streamer went viral on Twitter in November after showing a side-by-side clip of the motion-capture process, which left the audience in awe of her technical setup and how well her movements translated to the 3D model.
Successful VTubing requires various hardware and software components. Below are the ones that have been tested and used by Studio Merkas.
Unreal Engine is a 3D game engine by Epic. Its application scope now extends far beyond just "gaming." The engine is used not only for movies but also for real-time applications such as visualizations or VR applications. In relation to the VStreaming project, Unreal Engine provides an excellent opportunity to create a world that delivers impressive visual results "out of the box." With the right hardware and blueprint setup, 3D avatars can be controlled via live motion capturing and face tracking, and streamed directly through OBS.
Unity is a 2D/3D game engine. The Unity engine is now also used for various real-time applications. With Unity, 3D models created in software like Blender are converted into the VRM format. This format allows the avatar to be used as a VR model in various software such as VSeeFace, Luppet, or others. To convert the avatar from Blender to VRM, the UniVRM plugin and Unity 2019.4 LTS or a higher Unity version are required.
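For context on what the UniVRM conversion actually produces: a VRM file is a binary glTF (.glb) container whose JSON chunk carries VRM-specific extensions. As a hedged illustration of that container layout (this is not the UniVRM workflow itself, and the JSON content below is a made-up minimal example), the sketch parses a GLB header and checks for a VRM extension:

```python
import json
import struct

def read_glb_json(data: bytes) -> dict:
    """Parse the JSON chunk of a binary glTF (.glb) container.

    VRM avatars use this container format, storing their metadata in
    glTF extensions inside the JSON chunk.
    """
    magic, version, length = struct.unpack_from("<4sII", data, 0)
    if magic != b"glTF":
        raise ValueError("not a glTF binary container")
    chunk_length, chunk_type = struct.unpack_from("<II", data, 12)
    if chunk_type != 0x4E4F534A:  # ASCII "JSON"
        raise ValueError("first chunk must be JSON")
    return json.loads(data[20:20 + chunk_length])

# Build a minimal in-memory .glb for demonstration (hypothetical content).
payload = json.dumps({"asset": {"version": "2.0"},
                      "extensionsUsed": ["VRM"]}).encode()
payload += b" " * (-len(payload) % 4)  # chunks are 4-byte aligned
glb = (struct.pack("<4sII", b"glTF", 2, 12 + 8 + len(payload))
       + struct.pack("<II", len(payload), 0x4E4F534A)
       + payload)

doc = read_glb_json(glb)
print("VRM" in doc.get("extensionsUsed", []))  # True
```

Knowing the container is plain glTF explains why VRM avatars are portable across tools like VSeeFace and Luppet: any glTF-aware application can open them.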
VSeeFace is a free, highly configurable facial and hand-tracking VRM and VSFAvatar avatar puppeteering program. The software is designed for virtual YouTubers, with a focus on robust tracking and high image quality. VSeeFace offers similar functionality to Luppet, 3tene, Wakaru, and other similar programs. The software runs on Windows 8 and higher versions. VSeeFace can also send, receive, and combine tracking data via the VMC protocol, enabling perfect iPhone sync support through Waidayo. Facial recognition, including gaze, blinking, eyebrow, and mouth tracking, is done using a standard webcam. An optional gesture control requires a Leap Motion device.
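The VMC protocol mentioned above exchanges tracking data as OSC (Open Sound Control) messages over UDP, for example a bone transform as an address string followed by a bone name and seven floats (position plus rotation quaternion). As a rough sketch of what such a packet looks like on the wire (the bone name and values are made up, and real applications use a dedicated OSC library rather than hand-encoding):

```python
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8, null-terminated, padded to 4 bytes."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_pos(bone: str, pos, rot) -> bytes:
    """Encode a VMC-style `/VMC/Ext/Bone/Pos` message as an OSC packet.

    `pos` is (x, y, z) and `rot` a quaternion (x, y, z, w);
    floats are big-endian per the OSC spec.
    """
    msg = osc_string("/VMC/Ext/Bone/Pos")
    msg += osc_string(",sfffffff")  # type tags: one string, seven floats
    msg += osc_string(bone)
    msg += struct.pack(">7f", *pos, *rot)
    return msg

packet = vmc_bone_pos("Head", (0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0))
print(len(packet) % 4 == 0)  # OSC packets stay 4-byte aligned: True
# The packet would then be sent over UDP to the receiving application,
# e.g. sock.sendto(packet, ("127.0.0.1", port)) - the port depends on
# how the sender and receiver are configured.
```

This is why VSeeFace can combine tracking sources such as Waidayo on an iPhone: each source just emits streams of messages like this, and the receiver merges them.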
For VTubing, at least a webcam is required to control the character. To achieve better quality in VStreaming, more expensive hardware must be used. For very accurate face tracking, an iPhone X or a later model is needed. A Leap Motion controller or HTC Vive trackers help make hands and arms visible and movable in the stream. If money is no object, the customer can also opt for a mocap suit, which tracks the entire body.
You can find more information on this topic in the article “8 tips for your VTubing success”.
We hope you like our article and would like to invite you to share your thoughts and questions on the topic with us. If you have any questions, comments or feedback on the content of this article, please don't hesitate to let us know in the comments section. We're always happy to hear from our readers and engage in meaningful discussions about game development.
Just ask us anything you want to know and we'll do our best to provide the answers you're looking for. Thank you for your support and we look forward to hearing from you!