
News

Everdome Announces First-Ever Metaverse Soundtrack

The interplanetary metaverse project has revealed the opening single from the world’s first metaverse-specific soundtrack.


Created in tandem with composer Wojciech Urbański, a single called “Machine Phoenix” is released today, forming part of the first-ever metaverse-specific soundtrack.

The project, titled the “Everdome Original Metaverse Soundtrack”, will provide a unique atmosphere and background ambiance for Everdome’s hyper-realistic landscape, which depicts a future Mars settlement and is built using blockchain and VR technology.

Wojciech Urbański is a gifted composer with a portfolio of award-winning tracks that have been used by the likes of Netflix and Canal+. Wojciech’s Spotify community currently comprises over 600,000 monthly listeners, while overall, his tracks have been played more than 50 million times. The Everdome project hopes to use the composer’s talents to add drama to its hyper-realistic storytelling and visuals.

“Creating a musical illustration of the cosmos and space exploration is probably a dream held by every composer. This collaboration is for me a great artistic opportunity, as the setting of an immersive metaverse world permits the use of a very wide range of sounds and tools,” says Wojciech Urbański.

Metaverse users will hear Wojciech’s work as they launch from Everdome’s virtual Hatta spaceport to the Mars Cycler, situated in low Earth orbit, with the musical collaboration evoking an atmosphere similar to Harry Gregson-Williams’ soundtrack for “The Martian” or Hans Zimmer’s score for “Dune”.

Also Read: Bedu Has Built A Metaverse Of The UAE’s Planned Mars Trip

As well as partnering with Wojciech Urbański, Everdome is utilizing the talents of Michał Fojcik, a sound supervisor and designer who has worked on blockbusters including X-Men: Dark Phoenix and Solo: A Star Wars Story.

“Building a sonic background for a metaverse experience is a uniquely challenging task. In the absence of true forms of touch or smell, the visual and the sonic take on huge levels of importance if the experience is to be considered truly hyper-realistic,” says Michał Fojcik, sound supervisor and designer.

Both artists will be dedicated to creating a soundscape that significantly enhances the quality of Everdome’s storytelling, and the first single from their collaboration, “Machine Phoenix”, is available on all major music distribution platforms, including Spotify, Anghami, and YouTube, from today.


News

Adobe Teases New AI Editing Tools And Updates In Premiere Pro

The video editing app will be enhanced with a generative extend tool, text-to-video, improved timeline waveforms, and more.


After launching the generative AI model Firefly last year, Adobe is now showcasing how the technology will be used in upcoming versions of the editing app Premiere Pro. In an early sneak peek, the company demonstrated several new features, including Object Addition and Removal, Generative Extend, and Text to Video.

The first new feature, Generative Extend, targets a common video editing problem by using AI to “Seamlessly add frames to make clips longer, so it’s easier to perfectly time edits and add smooth transitions”.

Meanwhile, Premiere Pro’s Object Addition & Removal tool will leverage Firefly’s generative AI to “Simply select and track objects, then replace them. Remove unwanted items, change an actor’s wardrobe or quickly add set dressings such as a painting or photorealistic flowers on a desk,” Adobe states.

Adobe also showcased another new feature that can automatically generate new film clips from a text prompt. To use the content creation tool, editors can “Simply type text into a prompt or upload reference images. These clips can be used to ideate and create storyboards, or to create B-roll for augmenting live action footage,” Adobe explained. The company appears to be commercializing this particular feature remarkably quickly, considering that generative AI video tools only emerged a few months ago.

Also Read: UGREEN Unveils Nexode RG 65W Charger For Middle East

The new additions to Premiere Pro will arrive later this year, but Adobe is also introducing smaller improvements to the editing app in May. The changes include interactive fade handles for easier transitions, an Essential Sound badge that uses AI to “automatically tag audio clips as dialogue, music, sound effects or ambience, and add a new icon so editors get one-click, instant access to the right controls for the job”, along with effect badges and a new look for waveforms in the timeline.
