Dropping a link at the bottom to a blog post of mine from last year, after watching the pretty unimaginative Snoop/Eminem VMA video.
Remember Paula Abdul's "Opposites Attract" video from 1989? This was barely a step up from that. Seriously, Paula's video was far more innovative.
With all that cash they could have made an experience, not a video. What if your avatar could jump in at any point? Copy the dance moves? Interact with the environment? What if AI could make Snoop or Slim react?
This is just an animated video, not the metaverse. The team at Stage11, for example, are doing much bigger things and heading in the right direction.
Remember the Kinect from Microsoft? Released in 2010, it was a body-tracking camera you could buy for the Xbox 360. In the decade since, I'm pretty sure we could easily have had a home MoCap system on the Xbox Series X if Microsoft really wanted to give power to creators again.
Even that could have powered a better experience than the Bored Ape performance last night (emphasis on bored).
Imagine sitting at home watching an immersive music video, then deciding to jump into the action and have fun.
The power of Web3 and the metaverse could have produced something far greater.
A music video within the metaverse would no longer be bound by traditional 2D formatting, and neither would the spectator. Combining Unreal Engine with Epic's MetaHumans would mean you could place yourself as a viewer entirely within the volumetric environment of the music video and watch it from any angle, even walk, fly, or move around at will while it plays out.
I keep hearing people bleat about "but it's still early" and yet others were building music events in Second Life 15 years ago.
Technology is not early - our imagination is. And maybe that's the bigger problem to address.
If you want to read a bit more, check out my previous blog post from a year ago on Movies, Music and the Metaverse.