
Dream Uplink Operational System (DUOS)

This article is a work in progress; it is currently not approved as canon.

Developed in YE36 and released in YE38, DUOS, or the Dream Uplink Operational System, is a brain-computer display technology used to provide feedback to the user. It augments existing brain-computer interface systems as software, working by altering the user's spatial and comprehension cognition and circumventing the use of iconography or language in most cases to supplement an existing control scheme.

It is a neural interface suite, developed by the Lazarus Consortium in YE-37 for general consumer use and sold internationally for both military and non-military purposes - allowing nations and companies to forgo developing their own neural systems, saving money and letting them focus on the platform and armaments themselves.

In radically simplified terms, human awareness of the environment is not a function of the senses but of the mind. Raw information enters the mind from the senses and is scrutinized, assessed and distilled (filtering), and only the useful information is fed onward into the subject's cognition. A dream, for example, is a case where, in the absence of meaningful raw data, the brain relies on feedback loops and internal inputs to propagate a “dream experience” of reality.

DUOS works by circumventing the user's own senses and filtering, populating this experience of reality with accurate, statistically derived assessments of the environment, taken from sensors that are often many times more capable than the user's senses and able to survey much larger areas at higher speeds and greater scope.
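
As a purely illustrative sketch of that idea (DUOS is fictional and publishes no internals, so every name, field and structure below is hypothetical), the flow can be pictured as sensor readings being fused into confidence-weighted assessments that stand in for raw sense data:

  from dataclasses import dataclass

  # Purely hypothetical data model, for illustration only.
  @dataclass
  class Assessment:
      subject: str       # what the claim is about, e.g. "contact-07"
      quantity: str      # e.g. "range_m", "closing_speed", "density"
      value: float       # fused best estimate
      confidence: float  # 0.0 - 1.0, carried along so doubt is felt as doubt

  def fuse(readings):
      """Toy confidence-weighted fusion of several sensor readings of one quantity."""
      weight = sum(r["confidence"] for r in readings)
      value = sum(r["value"] * r["confidence"] for r in readings) / weight
      return Assessment(readings[0]["subject"], readings[0]["quantity"],
                        value, weight / len(readings))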

The benefit here is that, although humanoids evolved to deal with a macroscopic world (the smallest object a human can innately comprehend is generally a grain of rice and the largest a city block; the scale of a sun, for example, is incomprehensible to a person without resorting to numbers and measurement), DUOS allows a pilot to innately comprehend much larger objects, higher velocities and even information beyond their own senses, with no special learning.
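
To make the scale argument concrete, a hypothetical logarithmic remapping like the one below could compress any real-world size into the grain-of-rice-to-city-block band described above (rough figures of 5 mm and 100 m assumed); the bounds and constants are illustrative only, not anything DUOS specifies:

  import math

  # Assumed bounds of innate comprehension, from the article's own examples.
  RICE_M = 0.005        # a grain of rice, roughly 5 mm
  CITY_BLOCK_M = 100.0  # a city block, roughly 100 m

  def to_innate_scale(size_m, world_min=1e-6, world_max=1e12):
      """Map a real size in metres onto the comprehensible band, logarithmically."""
      t = (math.log10(size_m) - math.log10(world_min)) / \
          (math.log10(world_max) - math.log10(world_min))
      return RICE_M * (CITY_BLOCK_M / RICE_M) ** t

  # The Sun's ~1.39e9 m diameter maps to roughly 21 m here, a size a pilot can
  # picture directly instead of only as a number.
  print(round(to_innate_scale(1.39e9), 1))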

This scale comprehension, when combined with machine-level filtration of the senses - which suffers from no limits of attention span or focus, given there is plenty of processing power to go around - results in a user who is better informed, informed more quickly, less overwhelmed and less likely to fall into the pitfalls of cognitive bias.
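
That filtration step could be pictured as a salience ranking: everything the machine holds gets scored for relevance, and only the most salient few items are pushed into the user's awareness. The sketch below is, again, purely illustrative, with hypothetical fields and weightings:

  from collections import namedtuple

  # Minimal stand-in for a fused assessment; the fields are hypothetical.
  Contact = namedtuple("Contact", "label distance_m speed_ms confidence")

  def filter_for_user(contacts, limit=12):
      """Toy salience filter: score everything held, surface only what matters most.

      Unlike a human observer, the machine scores every contact it knows about;
      the limit only caps how much is pushed into awareness at any one moment.
      """
      def salience(c):
          # Hypothetical weighting: nearer, faster, more certain -> more salient.
          return c.confidence * (c.speed_ms + 1.0) / max(c.distance_m, 1.0)

      return sorted(contacts, key=salience, reverse=True)[:limit]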

OOC Notes

So a fighter pilot in a cockpit gets all of their information predigested by the computer. They don't actually experience what's happening outside most of the time; they're told things, shown displayed info and so on. Well, DUOS is all about applying this computer-predigested mentality to a person's real senses, forgoing the need for bulky consoles, dials and systems and instead applying it to the mental image a person has of what's happening around them: their instinctive urges, gut reactions, feelings and semblances.

It's like a blend of augmented reality, dreaming and a cockpit HUD system that's dumping information into a pilot's brain. You know how a person can't mentally wrap their head around how big the sun is or how big orbits are, because we deal with macroscopic stuff in our world like tables, chairs, people, hills, roads and animals? Well, DUOS lets a person experience those big phenomena AS macroscopic phenomena and zoom in and out like some kind of third-person, three-dimensional telescope world-map on steroids. This even applies during combat in the first person, showing stuff like how dense objects are, how likely something is to be somewhere, and point-cloud-mapping objects so pilots can see straight through them or super-zoom a distant object up close to explore its features and shape without losing that big peripheral vision - like an overlay.

Trippy and kind of badass.

osakanone 2016/04/03 18:19
