The dangers of deepfakes lurking in the Metaverse

When you’re in the metaverse, you’re usually represented as a blocky or cartoonish avatar, or as a disembodied floating torso and a pair of hands. Neither looks remotely like you.

But what happens when things get much more real?

A number of companies are developing ways for you to create hyper-realistic representations of yourself for the metaverse, complete with your face, your voice and even the way you move. One is Metaphysic, a deepfake (or synthetic media) company founded by Chris Ume, creator of the Deep Tom Cruise videos that took TikTok by storm last year.

The videos appeared to show the Hollywood actor doing things like eating a lollipop and playing golf. The footage was actually of a different actor, with Tom Cruise’s face so cleverly transposed on top that it’s hard to tell it’s not real.

Now Metaphysic wants to put this technology in everyone’s hands so they can create their own hyper-realistic avatars. The startup recently raised $7.5m from investors including Winklevoss Capital and YouTuber Logan Paul to help fund the effort.

“Anyone can come and create their own hyper-realistic synthetic avatar,” said Tom Graham, CEO and co-founder of Metaphysic.

But that’s not all: with Metaphysic, you can also securely store your avatar (as a non-fungible token, or NFT), so that you retain ownership of your own image and, crucially, of the biometric data used to create it.

Other companies are doing this too: one, based in Romania, offers a service where people can make an NFT of their own face or voice.

Part of it is having fun – who wouldn’t want to create a mini-me of their own? Or see what they would look like dressed as Lady Gaga?

But there is also a serious side. If we don’t find ways to secure our identities in the metaverse from the start, as Metaphysic aims to do, the result could be a terrible loss of control over our own images and biometric data.

A quick dive into the deepfake dilemma

No one knows this better than Henry Ajder, a researcher who has spent years studying the malicious use of synthetic media. A 2019 survey he conducted with Karen Hao of MIT Technology Review found that 96% of all synthetic media online at the time was pornographic, mostly created by bots that could swap one person’s face onto another’s body.

This was at a time when deepfake technology was still in its infancy. Since then it has become exponentially easier to use, and the results far more realistic.

“The future will be synthesized and the challenges ahead will not be softened”

“It used to take 150 CGI artists and $250m to create a really convincing set of deepfake effects for a movie. Now one person can do it with a few thousand dollars and a few GPUs,” says Graham.

And deepfake videos keep popping up everywhere. At the outset of the Russian invasion of Ukraine, a clunky deepfake of President Volodymyr Zelensky supposedly surrendering showed how such media can be weaponised for political purposes. That video was relatively crude, but experts warn the next ones may not be so easy to spot.

You can’t ban the technology, Ajder says: “If you ban synthetic media, you ban all Instagram filters, you ban computational photography on your smartphone camera, you ban Jurassic Park dinosaurs. It’s not going away; the future will be synthesized and the challenges ahead will not be softened.”

Set a good example

All you could do, Ajder reasoned at the time, was try to build a big enough industry around legitimate, ethical deepfake technology to establish best practices, and maybe, just maybe, counterbalance the nefarious uses.

Ajder teamed up with Ume and Graham, who were founding Metaphysic at the time, to advise on how to steer things in an ethical direction.

“I came to this from the perspective of understanding how the technology can be used maliciously, but I’ve also seen an explosion of really interesting creative and commercial uses, and the need for a more nuanced conversation about synthetic media as a technology,” he told Sifted last year when the company launched. “In the right hands, used responsibly, it could be the future of creative expression. We need to make sure we set a good example.”

Metaphysic set out to model how media companies could use synthetic media responsibly. For example, the company has helped famous actors rent out their likenesses to advertising agencies for campaigns, with everything done with permission and within agreed-upon limits.

“There are pretty obvious use cases that we explicitly consider bad: non-consensual misuse of images in the context of pornography, deceptive political content, cybersecurity issues involving fraud,” Ajder said.

Another synthetic media company, D-ID, which has collaborated with Warner Brothers on several film projects, has also lobbied for the industry to draw up a code of ethics.

But now it’s everyone’s problem

As deepfake technology moves beyond advertising and film projects and becomes available to everyone, companies like Metaphysic think they need to go further.

The team doesn’t just want to make sure the film and advertising industries use synthetic media ethically; they want everyone to be able to create and secure their own avatar. They are offering the service free of charge; users only need to spend about $20 to mint the NFT.

Faces can be manipulated in all kinds of ways on the platform

Of course, that doesn’t stop someone from stealing your face to make revenge porn if they really want to. But the company wants people to understand what can be done with their images, and to give them ways to control how their face and voice are used.

“We want individual users to feel like they have more control over who they are and not have to worry about sending all their data to an untrustworthy company and what might happen to it in the future,” Graham says.

“It really is a matter of consent. We want to level the playing field a bit and create a paradigm where that is the norm.”

Maija Palmer is Sifted’s innovation editor. She covers deep technology and business innovation and tweets from @maijapalmer
