More Than Watermarks: Proactive 3D Moderation

3D art is gaining popularity as more artists and creators turn to digital mediums for expression. However, as the popularity of 3D art increases, so does the issue of theft. We once viewed this as a solely human problem, but generative AI has now risen to the top of a long list of threats to the integrity of intellectual property (IP). As design software becomes more sophisticated, giving users quicker, easier ways to produce 3D work, it becomes increasingly difficult to keep track of everything entering the platforms that make up our virtual ecosystem.

Watermarks have been around for what feels like forever, yet they remain the reigning response to art theft. This isn’t because watermarks are especially effective, but because creators simply have few other tools available to protect their work. A watermark is typically a small, unobtrusive piece of text or graphics overlaid on top of an image; at times, a watermark is instead embedded within a 3D model’s geometry. The idea is that the watermark makes the content unusable, prevents resale, and enforces rightful ownership.
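To make the geometry-level idea concrete, here is a deliberately minimal sketch of embedding a bit string in a mesh by nudging vertex z-coordinates. The function names and the epsilon value are invented for this illustration; real schemes are far more sophisticated, and this naive version already hints at the fragility discussed below, since extraction requires the pristine original and any remeshing or transform destroys the mark.

```python
EPS = 1e-4  # perturbation magnitude: small enough to be visually imperceptible

def embed_watermark(vertices, bits):
    """Return a copy of `vertices` with `bits` encoded along the z-axis.

    Each bit nudges one vertex: +EPS for a 1, -EPS for a 0.
    Assumes the mesh has at least len(bits) vertices.
    """
    out = [list(v) for v in vertices]
    for i, bit in enumerate(bits):
        out[i][2] += EPS if bit else -EPS
    return out

def extract_watermark(original, marked, n_bits):
    """Recover the bits by comparing marked z-values against the original mesh."""
    return [1 if marked[i][2] > original[i][2] else 0 for i in range(n_bits)]
```

Note that this is a non-blind scheme: whoever verifies ownership must hold the untouched original, and a thief who simply re-exports or decimates the mesh erases the signal entirely — one reason geometry watermarks alone fall short.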

Getty Images watermarks have surfaced in images generated by DALL-E 2

However, watermarks are not effective at protecting 3D art. For one thing, they can be easily removed using image editing software. Additionally, many 3D assets, such as models and textures, are used in the context of larger projects like video games or films. In these cases, the watermark may not be visible, making it easy for thieves to use the assets without detection.

The unauthorized use of 3D models, textures, and other digital assets created by artists and designers is rampant. It ranges from using assets in video games, films, and other media without permission or proper attribution, to re-selling or distributing them without the creator’s consent. Whether an asset is ripped off and injected into a foreign game, or a stolen 3D model ends up as the centerpiece of an NFT scam, bad actors are profiting from the lack of moderation in the commercial 3D art space.

No Solutions in Sight

So, what can 3D artists and creators do to protect their work? The answer falls short of expectations. While there are many steps creators can take to keep their work a little safer, once an asset is uploaded online it enters a digital wild west, vulnerable to theft, manipulation and copyright infringement. The correct approach would be to target the problem at a platform level, closely controlling the ingestion pipeline in real-time, but human moderation teams just can’t meet the scale at which 3D content is currently being uploaded.     

The truth is that we need a better system altogether, one that can target problematic content before it enters a platform, blocking stolen and inappropriate assets from ever surfacing. Proactive 3D content moderation isn’t here yet – but it will be, and it will change the game. As users, brands and businesses all over the globe lean into the digital age and the adoption of 3D, increasing the accuracy and efficiency of asset moderation is critical to keeping IP secure. This is where Secur3D comes in.

Building Toward the Future of 3D Moderation

Our team has spent decades in the 3D space watching art theft and copyright disputes snowball while moderation and resolution practices remain archaic and unchanged. With no solutions in sight, we began building one. Our approach is for every platform that wants to take back the time and revenue lost to asset theft, and to the endless cycles of draining moderation practices that fall short of their goal.

If you are interested in the opportunity to simplify content moderation once and for all, we want to hear from you. Say hello at [email protected].
