The latest AI News. Learn about LLMs, Gen AI and get ready for the rollout of AGI. Wes Roth covers the latest happenings in the world of OpenAI, Google, Anthropic, NVIDIA and Open Source AI.
My Links 🔗
➡️ Subscribe: https://www.youtube.com/@WesRoth?sub_confirmation=1
➡️ Twitter: https://x.com/WesRothMoney
➡️ AI Newsletter: https://natural20.beehiiv.com/subscribe
#ai #openai #llm
It’s amazing how fast this runs! I think the speed comes from using consistent seeds for each frame, so it’s not recreating everything from scratch, which helps keep things looking smooth and consistent. Plus, it seems like they might be using interpolation and temporal consistency, which help blend frames together naturally. And running on a TPU really boosts performance, especially with these optimized diffusion techniques.
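The consistent-seed idea above can be sketched in a few lines: if each frame's starting diffusion noise is derived deterministically from a fixed base seed plus the frame index, re-generation is reproducible and successive frames start from stable latents. This is a minimal illustration, not Oasis's actual implementation; `frame_noise` and its parameters are hypothetical.

```python
import numpy as np

def frame_noise(seed: int, frame_idx: int, shape=(64, 64)) -> np.ndarray:
    # Derive a per-frame RNG from a fixed base seed plus the frame index,
    # so the same frame index always yields the same starting latent noise.
    rng = np.random.default_rng(np.random.SeedSequence([seed, frame_idx]))
    return rng.standard_normal(shape)

# The same (seed, frame) pair reproduces the noise exactly, which helps
# successive denoised frames stay visually consistent with each other.
a = frame_noise(1234, 7)
b = frame_noise(1234, 7)
assert np.array_equal(a, b)
```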
If we live in a simulation, I bet quantum physics not being deterministic is just the AI that simulates us not loading unnecessary data. Kind of like not rendering all chunks of a Minecraft world at all times.
i made it to that one part
i love all of you, internet people.
Can’t wait till the actual Oasis becomes a reality
It’s both very impressive and also underwhelming. As nifty as these videos are, the reality is there is no consistency in the world. A tree is there in one moment, then gone next time you look.
There is no English Audio Track for this video on my Samsung TV.
❤❤❤❤❤
But you know, like we said about the Chinese, learning and copying is one thing; coming up with novel ideas (games, in this instance) is another.
So yes, while it might be fantastic, we are far from "new" games running on LLMs. (Also, I think the ESG departments will have a heart attack looking at the CO2 footprint from all the gamers in the world having to run this 😅🫣)
It looks like someone who watches nonstop Minecraft videos (without ever actually playing it) having a long nightmare about it.
Why are you making voice over videos? By now I'm invested in your facial expressions as you bring us news like this.
No object permanence. The model should be learning how to generate the actual state and map of the game from the visuals, not just generating some diffusion-based image. Kinda silly.
This way you could create endless games whose scripts are written on the fly by o1, with development quality not inferior to ordinary games.
Love how they gave this AI program the same name as the software platform on which the interactive real-time virtual world in "Ready Player One" is built.
Was this video AI edited?
Really not a fan of this new video style, complete with the one-word captions popping up as you speak. It just feels like filler rather than placing emphasis.
And the weird background is always there, with only a small window showing the content.
Feels off.
Great video! Minecraft video in the background is incredibly distracting though. 😂
So, where are we headed in the end, Mr. Roth? Is this to teach ourselves that we are merely characters in a simulation of life, with no chance of peering beyond our universe? So many questions… in time, I suppose.
Yeah… I do think we will use this tech for… architecting and whatever you said… or mainly to generate the Brazzers videos we all dream of, playing out the way we want with the faces of whoever we want… I don't think you've checked the stable diffusion models people have on Civit AI lately… Anyway, yeah, sure, if it makes you feel good, we will create virtual sets or architecture or whatever, yeah yeah, that's definitely what it will be used for.
IMAGINE THAT AI BUT TRAINED WITH IRL FOOTAGE.
I tried it out. The biggest obstacle to the current version being used as an engine is object impermanence. I think it needs metadata available relating to what's not currently visible. This isn't easy for training from videos, but I'd think would be relatively possible for synthetic training data. For example an "off screen" permanent inventory and 360 fish eye camera. For larger scale coherence I could imagine a form of RAG. If we could estimate the position and direction of view, you could throw in frames from earlier in the gameplay that look in that direction as extra context for what should be visible.
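The RAG idea in the comment above can be sketched as a simple viewpoint-keyed retrieval: given an estimated position and view direction, pull earlier frames that looked roughly the same way and feed them to the model as extra context. This is a hypothetical illustration; `StoredFrame`, `retrieve_context`, and the scoring thresholds are all made up for the sketch.

```python
import math
from dataclasses import dataclass

@dataclass
class StoredFrame:
    position: tuple   # (x, y, z) estimated world position (hypothetical)
    yaw: float        # estimated view direction in degrees
    frame_id: int     # handle to the stored image

def retrieve_context(frames, pos, yaw, max_dist=16.0, max_angle=45.0, k=4):
    """Return up to k earlier frames whose viewpoint roughly matches the
    current estimated position and view direction, best matches first."""
    def angle_diff(a, b):
        # Smallest absolute difference between two angles, in degrees.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    def score(f):
        # Crude combined score: closer and better-aligned frames rank higher.
        return math.dist(f.position, pos) + angle_diff(f.yaw, yaw) / 10.0

    candidates = [f for f in frames
                  if math.dist(f.position, pos) <= max_dist
                  and angle_diff(f.yaw, yaw) <= max_angle]
    return sorted(candidates, key=score)[:k]
```

A retrieval step like this would run before each generated frame, so the diffusion model conditions on what the player previously saw in that direction rather than hallucinating the scene anew.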
Looks like a weird android's dream. Not playable. No practical application.
Make this your mantra: what a world we live in. 😅
Now convince me we aren't living in a simulation.
It's not a dream videogame, it's a game video dream.
This will create a Ready Player One metaverse, as long as we have a prompt to describe it.
If you can "simulate Minecraft", why not use photorealistic game footage to train the AI and get a photorealistic game?
🖤🔥
Why would we need an OS or even apps later on?
I see this as a dream machine for AGI to work out how the physical world works, so training can be ongoing instead of upfront only. As an AGI system interacts with the real world every day, it could take all those interactions and dream about all the other possible outcomes that didn't happen, so it would be trained on how to react in the future. The AI would do this during downtime, or offload the dream process and integrate the dreams at a later time.
The next 5 years are gonna be absolute insanity, let alone the next 10. I for one can't wait for all the advancements everywhere. Ethics and morality don't concern me, as those are just human constructs we will forever fight over, arguing who's "right" and "wrong" when in fact no one is either, objectively. Life will keep going no matter what happens, and I'm excited to see all that comes from it.
If this was trained on game world state as well as visuals, it would naturally be able to interact with actual long-term game state. Keep working from there and you could move toward an AI that can work in any style and with any state, and then you have a real-time game generator.
Do Androids Dream of Electric Sheep?
Oasis is ridiculous, it's like gaming on shrooms. Absolutely nonsensical but funny.
Physical rendering and infinite time.
Why is the video's audio a German AI-generated voice?
I have a game IRL, just no ability to develop it digitally, so I'm eagerly tracking this stuff. Hopefully the barrier to entry diminishes.
"Researchers made sure that landscapes remained consistent over time." And that was demonstrated in this video? I saw the world shift and change throughout the video.