Humans Are Living in a Fake Universe. The Metaverse Has Proof!
The metaverse can prove that this universe is fake! What should we do now?
Our universe is a ridiculous place. The strangest things we know of happen here, and the strangest of them all is the concept of time. Don’t get me wrong, the metaverse is powerful. The persistence with which humans living in a fake universe are building it, especially Meta Platforms’ vested interest in developing the metaverse, has been astonishing. But time is even stranger than the world’s most famous tech company renaming itself after a word that literally means “self-reference.” Time is the opposite of self-reference. If it exists in a concrete and physical form, we may live in a simulated universe: our own custom-made layer of the metaverse. This may sound strange, but it’s actually fairly intuitive.
In this scenario, someone or something created a simulated reality for some reason and put us in it. This reality is made up of discrete pieces of space-time. From the researchers’ point of view, this space-time is the foundation of our universe. It is the bits that make up our data. All of this raises the question: what if humans are living in a fake universe? What if time is just a measurement and we live in a base reality? If that is true, we will need to know what reality really consists of.
This is where physical concepts such as string theory and dark matter come into play. All of them are theoretical attempts to describe the universe in intuitively reproducible terms. What if time, too, comes in discrete, individual pieces? Suffice it to say that there is no empirical definition of time that satisfies our desire to pin down its place in our universe. We need to consider the concept of time from a more measurable frame of reference. Imagine a one-second video of a dandelion swaying in the wind. One second is a very short time, but it is enough for the eyes and brain to capture all the movement and form an accurate picture of what the clip is meant to convey.
Scientists believe that the human eye can perceive about 30–60 FPS. Experiments have shown that some people can perceive movement at up to 75 FPS. If the universe is composed of discrete fragments of space-time, it should in principle have a maximum frame rate, and scientists have speculated about what it might be. Unfortunately, there is currently no way to measure the frame rate at which base reality runs. We can debate the measurements of phenomena such as the speed of light and the size of the Planck units, but none of these perceived extremes necessarily represents the true limits of the universe. Whatever the case, we have to work with assumptions made from a limited perspective.
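As a rough back-of-the-envelope illustration (my own, not a claim from any measurement), suppose we speculatively treat the Planck time as the shortest meaningful "frame" of reality. The Planck time itself is a standard physical constant; the frame-rate interpretation is pure speculation, but the arithmetic shows how absurdly far such a limit would sit above human perception:

```python
import math

# Speculative back-of-the-envelope: if base reality rendered one "frame"
# per Planck time, what would its frame rate be?
PLANCK_TIME_S = 5.391e-44  # Planck time in seconds (CODATA value, rounded)

frames_per_second = 1 / PLANCK_TIME_S
print(f"Implied maximum 'frame rate': {frames_per_second:.2e} FPS")  # ~1.85e43 FPS

# Compare with the perceptual figures discussed above (all illustrative)
for label, fps in [("typical human eye", 60),
                   ("trained perception", 75),
                   ("gaming display", 120)]:
    ratio = frames_per_second / fps
    print(f"{label} ({fps} FPS): base reality would run ~{ratio:.1e}x faster")
```

Even the fastest display humans buy today would be dozens of orders of magnitude short of such a limit, which is the point: nothing we can perceive tells us where the true ceiling lies.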
What does this have to do with the Metaverse?
We are aquarium fish, trying to understand our position relative to the outside world. From our point of view, the universe follows at least two different sets of rules (Newtonian physics and quantum physics). But what if we can only see a small part of the big picture? Have you seen an early Unreal Engine 5 demo? It already looks almost photorealistic. In another 30 years, it may be impossible to distinguish VR from reality without some buffer to indicate which one you are perceiving. Millions of gamers currently pay premium prices for displays and graphics cards that run games at frame rates above 120 FPS and refresh rates above 120 Hz, even though those rates exceed what the human eye and brain can actually register. Someone probably demonstrated some secondary benefit to higher frame rates that made these systems easy to market to enthusiastic gamers.
As we continue to push the boundaries of FPS and refresh rates, we will eventually develop systems that can display graphics at resolutions and frame rates that humans cannot perceive. That is almost like transcribing an entire music album into notes just to extract its frequencies. But such systems could help teach AI to see nuances at quantum scales that humans could never see, even if they could shrink themselves down. It is a possibility. One line in particular caught my interest: “And we’re far from it.” How far is “so far”?
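The album-to-notes analogy can be made concrete: pulling a dominant frequency out of a sampled signal is a standard Fourier-analysis exercise. A minimal sketch in pure Python follows; the 50 Hz tone, sample rate, and the naive O(n²) DFT are my own illustrative choices, not anything from the article:

```python
import math

# Synthesize one second of a 50 Hz sine tone, sampled at 800 Hz.
SAMPLE_RATE = 800
signal = [math.sin(2 * math.pi * 50 * n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]

def dominant_frequency(samples, rate):
    """Return the frequency (Hz) of the largest DFT bin (naive O(n^2) DFT)."""
    n = len(samples)
    best_freq, best_mag = 0.0, 0.0
    for k in range(1, n // 2):  # skip DC, ignore the mirrored upper half
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_freq, best_mag = k * rate / n, mag
    return best_freq

print(dominant_frequency(signal, SAMPLE_RATE))  # -> 50.0
```

A real system would use an FFT library rather than this quadratic loop, but the point stands: once a signal is sampled finely enough, its constituent frequencies fall out mechanically, whether or not any human could hear the difference.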
There is no evidence that it can be done.