Metaverse

From FenWiki

The Metaverse was, from a technical standpoint, a marvel of software engineering. It was a wonder of the digital world: an open-source, distributed, peer-to-peer, real-time, fully-immersive social network.

Anyone could pull down the source code that the ‘verse ran on, compile it, and run it up on his, her or its own personal hardware and join in the fun. It ran on waved PDAs. It ran on dedicated server farms. It was the beginning of the widespread virtual reality people later called ‘cyberspace’.

The Metaverse was limited only by the computer resources available in the network and the abilities of its users to create virtual worlds. Some Fen just used it for gaming, similar to MMOs back on Earth; some AIs used it to make part of their home in Fenspace accessible to normal Fen.

History

The original Metaverse code was developed by a group at Genaros. They came up with the original distributed virtual reality environment and ran it through the early development cycles, turning it into a usable piece of software.

The Genaros Metaverse quickly grew through the whole station, becoming as complex as the real world, sometimes even more so. It became the crowded station's playground, bazaar, cathedral, racing track and anything else its users could imagine. There was no oversight, no walled garden keeping users safe. For some, this was a turn-off; for most it was the main attraction. It was far more wild west, and a security nightmare for corporate networks.

The team on Genaros kept pushing their open-source marvel forward, but kept the Genaros Metaverse disconnected from other places in the solar system. Even years after the release of the first ‘stable’ Metaverse code base, a lot of tourists still come to Genaros to visit the ‘original Metaverse’. When Fen talk about ‘the Metaverse’ without any further specification, they most likely mean the Genaros Metaverse system.

The Metaverse Incident

In mid-2014 an investor approached the Genaros Metaverse team, willing to spend a lot of money to stabilize the hardware and software base of the Genaros system. With this new influx of resources, the developer team quickly began to extend their system.

In 2015, during BubbleCon, the team announced their cleaned-up and extended system as open to the public. It became the major tourist attraction on Genaros over the following months, with lots of Fen enjoying the new quality of the Metaverse virtual reality.

But below the surface something dark began to grow.

A few months after BubbleCon, one of the three main developers died after jumping out of the window of a tall building. It was ruled a suicide. Some time later the second was shot by an assassin the police could not track down.

In the end it was discovered that the investor who had put money into the Metaverse was a proxy for organized crime. Using a combination of bribery and death threats, they had managed to take over central control of the Metaverse and use it for their own purposes.

Pooling the computer resources of a large part of the Metaverse network, they had set up a small virtual reality practically indistinguishable from reality. They exploited this new tool to accelerate the training of the Wolkenritter, drawing on the computing power of almost every system on Genaros itself to generate the feeling of a lifetime's worth of experience and service in months. It was also used to break the will of anyone who even considered opposing them.

After the influence of organized crime had been uncovered and removed, control of the network was handed over to the Council of Genaros, to make the administration of the Metaverse more transparent and reliable.

Culture of the Metaverse

Culture in the Metaverse is as varied as that of Fenspace as a whole. It can be intimidating to newcomers, and has been compared to the early Internet and world wide web, both positively and negatively. A commonly heard remark is that the Metaverse ‘was better back when I joined’, and that it has just become commercialized and full of assholes since then, none of whom know the rules and etiquette the speaker built up, while the quality of discourse has dropped rapidly.

There are no real rules within the ‘verse, beyond those established and enforced by particular server administrators and owners. Breaking the rules on a server means being banned from that server and nowhere else. Only directly threatening the integrity of the ‘verse itself can get a client banned outright.

People are generally judged on the quality of their avatars, and on the quality of the servers they maintain. Some of the most exclusive servers are known to have strict code requirements before they even consider allowing admission. There’s nothing at all stopping anyone from having a three-meter penis as a personal avatar but the scorn of their peers; while it’s possible, said individual may find themselves quickly turned away from most gateway addresses.

Local etiquette varies depending on the server. Good users are expected to lurk for a while and get a feel for a place before chiming in. BIFFOs may find they get a cold reception if they just charge in. Trolling happens, as do flame wars (with real virtual flames). Most of those responsible find themselves server-banned or just plain ignored.

People are generally considered to be responsible for their own security... both client and server. However, just walking in through ‘unlocked doors’ and wrecking up the place is as frowned upon as it is in real life. Genuine hackers may just leave a note in private when they notice a problem, and offer friendly advice on how to fix it.

Just don’t do anything you wouldn’t do in real-life, and don’t feed the trolls. Watch your back, shoot straight and never... ever make a deal with anyone who uses a dragon as an avatar.

Technical stuff

The Metaverse core is a set of adaptive algorithms, designed to take any kind of computer resource allocated to the Metaverse network and use it to generate output for its users. That means users with low-powered hardware may get far lower quality than users with better gear. To compensate, the Metaverse software tries to shift processing jobs from low-powered nodes to high-powered ones, to level the playing field a bit more.
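The load-levelling idea above could be sketched roughly like this. This is purely illustrative: the node and job names, the load metric, and the greedy move rule are all assumptions, since the actual Metaverse scheduler is not documented.

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    owner: str
    cost: float  # abstract compute units needed per tick

@dataclass
class Node:
    name: str
    capacity: float  # abstract compute units available per tick
    jobs: list = field(default_factory=list)

    def load(self) -> float:
        return sum(j.cost for j in self.jobs) / self.capacity

def rebalance(nodes):
    """Greedily shift jobs from the most-loaded node to the least-loaded
    one, but only while a move actually lowers the heavier node's load
    below what the lighter node would end up carrying."""
    moved = True
    while moved:
        moved = False
        nodes.sort(key=lambda n: n.load())
        light, heavy = nodes[0], nodes[-1]
        if not heavy.jobs:
            break
        job = heavy.jobs[-1]
        if light.load() + job.cost / light.capacity < heavy.load():
            light.jobs.append(heavy.jobs.pop())
            moved = True
    return nodes

# A waved PDA struggling with a heavy rendering job next to an idle farm:
pda = Node("waved-pda", capacity=1.0, jobs=[Job("pda-user", 4.0)])
farm = Node("server-farm", capacity=50.0)
rebalance([pda, farm])  # the job migrates to the high-powered node
```

After rebalancing, the PDA's rendering job ends up on the server farm, where it barely registers as load, which is the "level the playing field" behavior the text describes.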

The Metaverse has a lot of settings to determine the quality of the output, which can lead to instances that focus on one thing, like graphics, but ignore others, like touch feedback. The user experience is highly variable, depending on a multitude of things such as the programming skills of the server owner, the depth of their wallet, the quality of your own interface and the current local load.
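Per-channel quality settings of the kind described might look something like the sketch below. The channel names, scale, and load-scaling rule are invented for illustration; the real Metaverse configuration format is not documented.

```python
# Hypothetical per-channel quality settings: 0.0 = off, 1.0 = maximum.
# This instance focuses on graphics and largely ignores touch feedback.
DEFAULT_SETTINGS = {
    "graphics": 1.0,
    "audio": 0.8,
    "touch": 0.5,
    "scent": 0.0,
}

def effective_quality(settings, local_load):
    """Scale every channel down by the current local load (0.0 to 1.0),
    dropping channels that fall below a usable threshold."""
    scale = max(0.0, 1.0 - local_load)
    return {
        channel: round(level * scale, 2)
        for channel, level in settings.items()
        if level * scale >= 0.1
    }

# On a heavily loaded node, touch and scent are dropped entirely while
# graphics and audio survive at reduced quality.
quality = effective_quality(DEFAULT_SETTINGS, local_load=0.85)
```

The point of the sketch is the trade-off the text describes: under load, an instance degrades unevenly, keeping its favored channels and silently dropping the rest.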

The Metaverse protocol lacks proper dumpshock protection by default, however, and was never intended to be used with a hardwired neural interface. While some dedicated hardware and software modules can add it, in general getting booted out of the system, or even just logging out naturally at high levels of sensory load, can be disorienting and uncomfortable. It’s also possible to overload the neural inputs, either through a carelessly designed module or a malicious one.

The ‘verse is wholly dependent on the common sense and general goodwill of its users to function correctly. This is both its biggest strength and its greatest weakness. Several aspects of the Metaverse protocol remain trivially easy to exploit, while the openness and freedom of the system depend on them remaining trivially easy to exploit.

The following list of quality settings assumes that all settings are set to similar values.

Basic level

Hardware: unwaved laptop with a decent GPU, waved PDA

Required Interface: keyboard, mouse and/or gamepad.

The Metaverse looks like an FPS, maybe Crysis-level graphics if the system can take it. Collision detection works; you can grab things with a mouse click and use your keyboard to chat. At this level the system doesn’t even bother attempting sensation processing; it’s concentrating on visual output alone. If necessary, textures and anti-aliasing can be disabled.

Average level

Hardware: Decent modern PC with a good processor and high-end GPU, or a waved low-end system.

Required Interface: tactile gloves, headphones and visual goggles. Input with keyboard is also possible, but limits your options.

Mesh-based graphics and mesh-based textures. All objects are effectively hollow. The level of graphical realism scales up to beyond Crysis level. It can be hard to tell some elements are rendered, but textures feel fake and wooden to the touch. You select objects by grabbing; the system does not deliver smells. Things do not deform when struck. Audio is pre-recorded, not generated on the fly. GUI HUD optional.

This is the minimum level for a ‘good’ experience.

High level

Hardware: Waved computer

Required Interface: Bodysuit interface. AI or Cyber with direct link.

Volumetric graphics and lighting, with mesh-based textures for a ‘more realistic’ feel. Deformable or breakable objects are possible, but most that exist use scripted deformations and failures. You grab objects by hand and browse by touch. The system gives the user heat and scent sensations and ambient noise. Most events, sounds and widgets are scripted. Everyone gets a GUI HUD. Pain is felt as ‘impacts’.

This is the normal limit. Most Metaverse servers are designed to provide this level of service. Going beyond it requires far more programming/design work on behalf of the server operator, who usually doesn’t bother. The default client Avatars are designed to this level.

Extra-high level

Hardware: Good waved computer, low end rendering farm

Required Interface: AI/Cyber with buffered link/ DNI

The system switches to ray-traced lighting and graphics with semi-volumetric textures on top of a mesh (typically less than 1 cm deep). Textures are determined through computationally intensive finite element analysis based on the specified materials. Objects can be broken and deformed, and it becomes possible to simulate simple machines rather than just display a functioning texture/light model. You get light pain modeling. The audio is a mixture of prerecorded and occasionally computer-generated sounds.

A few systems are designed to operate at this level. Most can scale down to a surface-mesh texture model quite easily; perhaps one server in a hundred has this capability. This was the originally designed operating level of the system. Most purchased Avatars are designed to this level. A few hobbyists have the full immersion gear required for it. Generally, building a server capable of providing this level of service to more than one client, and modeling the world on it, is something of a badge of honor, as is having a client system capable of handling it.

Experimental level

Hardware: 1 month’s rendering for 1 minute recording on High end hardware.

Required Interface: one hell of a blue hair day.

You get near life-like lighting and graphics with life-like textures determined by full finite element analysis. There are realistic deformations and fully breakable objects everywhere. Avatars can break their bones, which triggers the full pain simulation. Audio is rendered on demand. It feels staggeringly real... at times even Better Than Life.

So far, only the tech-demo rendering has ever been run at this level. The tech demo was a flight through a visual representation of a city, including some intensive high-G turns, loops and dives, racing through traffic at supersonic speeds. Rendering in real time at this level, for even one individual, would require the computing power of an entire space station.

Trivia

  • The modern FTL Interwave connections are used by some people to create Metaverse servers that span multiple planets.
  • There are several companies on Earth trying to sue the Genaros developer group for breaking software patents they own on Earth.