How VTube Studio’s integration with NVIDIA makes VTubing more accessible

VTubing can be simple, but at the higher end it becomes an expensive and demanding process. VTube Studio and NVIDIA are looking to change that with new technology designed to make the space more accessible to everyone.

Getting into VTubing can be as simple as grabbing a couple of PNG images, hooking them up to a Discord bot, and broadcasting them on Twitch. However, it can also be a complex mix of Live2D or 3D models, with individually rigged body parts animated to track the streamer’s every movement.

This upper echelon of VTubing can be quite difficult to reach. It requires high-quality hardware: either an iPhone or iPad for face tracking, or a computer powerful enough to track the same expressions from a webcam. Either way, a sizeable investment in equipment is needed for things to run smoothly.

VTube Studio is one such face tracking program for VTubing, and one of the most popular. The PC application lets users load their model, then use a webcam or smartphone to track their movements and mirror them on the model.

Denchi, the developer behind VTube Studio, knows how resource-intensive the medium can be. Although it is primarily a PC application, many users prefer to run the tracking on their smartphones and send the data to their computer to reduce CPU and GPU usage.
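As a rough illustration of that phone-to-PC split, here is a minimal sketch of what the sending side might look like: the phone does the face tracking and streams a small packet of parameters to the desktop over the local network every frame. The address, port, parameter names, and JSON layout are all assumptions for illustration, not VTube Studio’s actual protocol.

```python
# Hypothetical sketch: a phone-side process streams face tracking
# parameters to the PC over UDP, so the heavy tracking work never
# touches the desktop's CPU or GPU. Address, port, and payload layout
# are illustrative assumptions, not a real protocol.
import json
import socket
import time

PC_ADDRESS = ("192.168.1.50", 21412)  # assumed LAN IP/port of the desktop app


def send_tracking_frames(get_face_params, fps=30):
    """Send one small JSON packet per frame with the tracked parameters."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame_interval = 1.0 / fps
    while True:
        params = get_face_params()  # values produced by the phone's face tracker
        packet = json.dumps({"timestamp": time.time(), "params": params})
        sock.sendto(packet.encode("utf-8"), PC_ADDRESS)
        time.sleep(frame_interval)


def fake_face_params():
    """Stand-in for a real tracker: a neutral face with a slight smile."""
    return {"MouthSmile": 0.4, "MouthOpen": 0.0, "EyeOpenLeft": 1.0,
            "EyeOpenRight": 1.0, "FaceAngleX": 0.0, "FaceAngleY": 0.0}


if __name__ == "__main__":
    send_tracking_frames(fake_face_params)
```

The desktop side only has to parse a few numbers per frame and apply them to the model, which is why offloading tracking to the phone is such a popular way to save resources.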

“Over the past few years, I have tried almost all face tracking systems, but they are often unstable, extremely experimental or prohibitively expensive,” they told Dexerto.

“Most people now use either webcam face tracking or iOS face tracking. The existing webcam face tracking in VTube Studio, an open-source library called OpenSeeFace, is already really impressive, especially when you consider it was made from scratch by one person.

“But both webcam tracking and iOS tracking have their own problems. Webcam tracking is relatively resource-intensive and not as accurate as iOS tracking, while iOS tracking is very accurate and tracks more facial features, but users need an expensive iPhone or iPad to use it.”

However, that barrier to entry is about to get even lower thanks to a new collaboration between VTube Studio and NVIDIA. NVIDIA Broadcast’s new face tracking feature reduces the GPU load for VTubers who want to keep everything running on their PC, and the Live2D program is one of the first to take advantage of it.

It has been “optimized to run most of the AI code for face tracking… on its high-performance tensor cores, which are on all RTX series cards” – the same cores that help keep your AAA games running smoothly on PC can now also handle face tracking.

Tracking also looks smoother without much of a performance hit, and could even outperform what’s currently on the market, Denchi claims.

“The impact on performance will be minimal, and tracking can be run even in the most demanding games,” they continued. “NVIDIA’s face tracking accuracy is also very high, approaching the quality of current iOS tracking, and in some aspects, perhaps even surpassing it.”

The feature isn’t limited to VTube Studio either: any developer who wants to run face tracking on NVIDIA GPUs can use it. That opens up plenty of room for development in the VTubing space, which could lower the barrier to entry even further.
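To give a sense of what the receiving application does with tracking data, whether it comes from a webcam, an iPhone, or NVIDIA’s GPU-accelerated tracker, here is a minimal, hypothetical sketch: raw tracker values are smoothed and mapped onto a model’s animation parameters. The parameter names and ranges are invented for illustration and are not VTube Studio’s or NVIDIA’s actual API.

```python
# Hypothetical sketch of the application side: take raw tracker outputs
# (0..1 blendshape-style values) and map them onto model parameters,
# with a little smoothing so the avatar doesn't jitter frame to frame.
from dataclasses import dataclass, field

# Illustrative mapping from tracker outputs to model parameter ranges.
PARAM_RANGES = {
    "MouthOpen":   (0.0, 1.0),
    "MouthSmile":  (-1.0, 1.0),
    "EyeOpenLeft": (0.0, 1.0),
    "FaceAngleX":  (-30.0, 30.0),  # degrees of head yaw
}


@dataclass
class ParameterMapper:
    smoothing: float = 0.3          # 0 = no smoothing, 1 = frozen
    _state: dict = field(default_factory=dict)

    def update(self, tracked: dict) -> dict:
        """Map normalized tracker values into model parameter ranges."""
        out = {}
        for name, value in tracked.items():
            lo, hi = PARAM_RANGES.get(name, (0.0, 1.0))
            target = lo + max(0.0, min(1.0, value)) * (hi - lo)
            prev = self._state.get(name, target)
            smoothed = prev * self.smoothing + target * (1 - self.smoothing)
            self._state[name] = smoothed
            out[name] = smoothed
        return out


mapper = ParameterMapper()
# One frame of (made-up) tracker output, normalized to 0..1:
print(mapper.update({"MouthOpen": 0.2, "MouthSmile": 0.9, "FaceAngleX": 0.75}))
```

Whatever produces the input values, the application-side work stays the same, which is why a GPU-accelerated tracker can slot in without developers rebuilding their apps.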

It’s also a space NVIDIA is keen to establish itself in. Gerardo Delgado Cabrera, a product manager at NVIDIA Studio working on the new broadcast features, said it’s part of a long-term plan to “optimize” the VTubing space.

“As part of NVIDIA Studio, we work with all the best creative applications, as well as with new ones,” he told Dexerto. “And one of the hottest areas of development in live streaming is VTubing.

“A few months ago, we contacted all the leading VTubing apps and started working with them to help optimize their applications. In fact, improvements have already shipped through NVIDIA Studio drivers for optimization and stability.”

NVIDIA Broadcast face tracking launches in October, alongside a VTube Studio update that supports it. That will help the roughly 30% of users who have RTX GPUs. The update will be completely free, and NVIDIA is working with the VTubing community to keep adding new features and updates.

This includes a new tool in NVIDIA’s augmented reality SDK called Face Expression Estimation, which “helps animate facial meshes that can better convey emotions,” Delgado said.

This is a big technical leap for the VTubing space, but in the end it’s only a small part of the experience. There is still plenty of room for VTubers to grow, and Denchi plans to keep exploring that with VTube Studio.

“I think tracking will definitely improve, but I also think it’s important to remember that tracking is only one aspect of VTubing. Personally, most of the VTubers I regularly watch have very simple tracking and often quite simple models.

“After all, VTubers aren’t that much different from regular streamers. People watch VTubers because they like their personalities and streaming content. While a good tracking setup can be useful, nothing can replace a fun personality and interesting stream content.

“This is what I want to focus on with VTube Studio. Most of the features I plan to add in the future are focused on improving viewer interaction and collaboration with other VTubers. That’s what I personally enjoy the most, and what, in my opinion, sets VTubers apart from regular streamers.”
