Tau 21, 03 / Apr 15, 19 04:03 UTC
This is probably the project of my life, and it is also the dream of many others. Today, Iris Technologies is glad to announce the technology of the century: the Iris device, which will turn science fiction into reality. Aincrad and the NerveGear are closer than you ever thought.

So how does that "magic" happen? With a good amount of financial backing, we will build a server capable of managing tons of data. We will call this server Retina from now on.

Now, let's imagine two things: an apple and a light source, some of whose rays hit that apple. Let's look at what happens at the surface, on a subatomic level. The photons collide with the electromagnetic fields of the apple's atoms, and each collision changes the photon's direction (calculated on Retina with the classic Vector3 reflection method, just as traditional ray tracing does; see the reflection sketch at the end of this post), its power (it loses some energy), and its wavelength. Now imagine huge numbers of photons colliding across the whole surface of that apple, producing those three changes constantly.

Next, let's add a photon receptor: an eye, which captures all the photons reflected off the apple. Those photons strike the eye, and the collisions generate a series of electrical impulses that the optic nerve carries to a brain, in this case the player's brain. But where is that brain? Unfortunately, the brain sits outside Retina's virtual world and all its fancy photons, so how can we build a bridge between reality and Retina? Conversions!

It's known that, depending on the properties a photon carries, our brain produces a different output image. That difference is what we call color, and fortunately colors are consistent, which is the key that lets us process them. So let's encode them into a color system a machine can interpret, such as the old-but-gold RGBA (see the color-encoding sketch below). We create a virtual entity that never writes to the photons' properties but instead acts purely as a reader, extracting the information of every photon that collides with the virtual Iris.

Once the RGBA output is computed, we send it from Retina to the Iris device linked to that eye's ID. Multi-player sessions involve multiple eyes, so the ID identifies which packed RGBA data corresponds to which eye and lets Retina route each frame to the Iris device holding that ID (see the packet sketch below).

But wait: the obvious next step would be rasterizing the output on a screen, and we don't want that. We want to send the data directly to the user's brain, allowing total immersion. So let's decode the RGBA back into photon properties. Unfortunately, at this point the brain has no idea what a photon is because, as I already explained, brains only work with electrical impulses. There is also another problem which is, at the same time, the solution: the real eye. The real eye constantly sends the brain data from real-life photons, so we need to intercept that signal to keep our poor brain from mixing the two streams of impulses; in other words, let's overwrite it. Fortunately for us, there are already ways to generate an electromagnetic wave with a given power and direction, so we send our impulses to a point on the optic nerve, cutting off the real eye's signal and giving the virtual light's signal a clear path to our beloved brain and... VOILÀ! We would see the virtual apple as if it had come through our real eye.
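Here is the reflection step made concrete: a minimal Python sketch of how a server like Retina could bounce a photon off a surface using the standard mirror-reflection formula r = d - 2(d·n)n, the same rule a Vector3 reflect method implements. The Photon record, its field names, and the absorption factor are illustrative assumptions on my part, not an actual Retina API.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

def dot(a: Vec3, b: Vec3) -> float:
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def reflect(d: Vec3, n: Vec3) -> Vec3:
    # Mirror reflection around a unit surface normal: r = d - 2(d.n)n
    k = 2.0 * dot(d, n)
    return (d[0] - k * n[0], d[1] - k * n[1], d[2] - k * n[2])

@dataclass
class Photon:             # hypothetical stand-in for Retina's photon record
    direction: Vec3       # unit travel direction
    energy: float         # "power": a fraction is lost on every bounce
    wavelength_nm: float  # what the brain will later read as color

def bounce(p: Photon, normal: Vec3, absorption: float = 0.2) -> Photon:
    # One surface collision: new direction, reduced energy; the wavelength
    # shift described above is left out of this simplified sketch.
    return Photon(reflect(p.direction, normal),
                  p.energy * (1.0 - absorption),
                  p.wavelength_nm)
```

For example, bounce(Photon((0.0, -1.0, 0.0), 1.0, 620.0), (0.0, 1.0, 0.0)) sends a downward red photon back upward with 80% of its energy.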
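And here is one way to do the photon-to-color conversion described above. The piecewise-linear visible-spectrum approximation below is a well-known technique (often credited to Dan Bruton's spectrum code); I'm assuming Retina would use something along these lines, not quoting its actual encoder.

```python
def wavelength_to_rgba(wl_nm: float) -> tuple[float, float, float, float]:
    # Map a visible wavelength in nanometres to an approximate RGBA value
    # using a piecewise-linear fit of the visible spectrum.
    if 380 <= wl_nm < 440:
        r, g, b = (440 - wl_nm) / (440 - 380), 0.0, 1.0
    elif 440 <= wl_nm < 490:
        r, g, b = 0.0, (wl_nm - 440) / (490 - 440), 1.0
    elif 490 <= wl_nm < 510:
        r, g, b = 0.0, 1.0, (510 - wl_nm) / (510 - 490)
    elif 510 <= wl_nm < 580:
        r, g, b = (wl_nm - 510) / (580 - 510), 1.0, 0.0
    elif 580 <= wl_nm < 645:
        r, g, b = 1.0, (645 - wl_nm) / (645 - 580), 0.0
    elif 645 <= wl_nm <= 780:
        r, g, b = 1.0, 0.0, 0.0
    else:
        r, g, b = 0.0, 0.0, 0.0  # outside the visible range: black
    return (r, g, b, 1.0)        # alpha fixed at fully opaque
```

The photon's remaining energy could then scale those channels down, so dimmer photons produce darker pixels.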
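Finally, the per-eye routing. Below is a tiny sketch of what packing an RGBA frame together with its eye ID might look like; the 12-byte header layout and the function names are made up for illustration, since nothing above specifies a wire format.

```python
import struct

# Hypothetical wire format: 4-byte eye id + 8-byte frame counter (big-endian),
# followed by the raw RGBA bytes for that eye's frame.
HEADER = struct.Struct(">IQ")

def pack_frame(eye_id: int, frame: int, rgba: bytes) -> bytes:
    # Retina side: tag one eye's RGBA frame so it reaches the right Iris device.
    return HEADER.pack(eye_id, frame) + rgba

def unpack_frame(packet: bytes) -> tuple[int, int, bytes]:
    # Iris device side: recover the eye id, frame counter, and pixel payload.
    eye_id, frame = HEADER.unpack_from(packet)
    return eye_id, frame, packet[HEADER.size:]
```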