Thursday, July 7, 2016

Put Your Glasses On, You'll Hear Better

Since last week's post, I've been doing a lot of reading about game engines and the Entity-Component-System pattern, and talking to a few people who actually designed RTS engines from the ground up and made kick-ass RTS games with them (Command & Conquer and Grey Goo, for example). The comments you guys gave me, both here and on Facebook, were very helpful and allowed me to shift my perspective a bit and start building up the required Systems and Components for my prototype.

That's the first milestone, really: a working prototype - a Vertical Slice. So, by next week I'm hoping to define the MVP: the Minimum Viable Prototype (yes, I know...). It's fairly well defined conceptually, but not yet architecturally.

However, in the meantime, I'm going to present a small "diversion".

There's an old joke, a bit of banter, telling someone to put their glasses on so maybe they'd hear better. It always gets a polite laugh in meetings, when said to someone who isn't paying attention, for example. But does that joke hide some truth?

You know how when someone loses a sense, the others compensate? When someone loses their sight, their hearing compensates, or becomes more attuned. This happens consciously - obviously, that person will actively try to compensate and make sense of their environment with their other senses: touch, sound, etc.

The Israel Children's Museum in Holon has an exhibit called Dialogue In the Dark, aimed at exactly that:
In this extraordinary exhibit, we can't see anything, but we discover a whole world and maybe, especially, ourselves.
During the tour, blind guides lead the visitors through dark but designed spaces: nature, a noisy pedestrian crossing, a port, a market, and a pub.
Blind and vision-impaired individuals take an active part in opening the visitors' eyes in the darkness, demonstrating that their world is not poorer, but simply different.
The exhibit seeks to lead the visitors to have a unique experience of themselves, to introduce them to the rich sensory world hidden within each and every person, and to create an unbiased encounter between blind and sighted people.

The Attentional Resources Theory in Cognitive Psychology suggests that this also happens unconsciously. There are several competing theories on this matter, so far without a definitive answer as to which is the most encompassing and accurate. But the gist of these theories is that there is a limited supply of Energy that allows our cognitive processes to function - to process the data cascades from the various senses and modalities and reason those torrents into actionable information for our use. An important thing to note here, without going into the finer details of these theories, is that this limited resource is not constant; different situations can allow for different ad hoc pools of said Energy - perhaps similar to how bursts of adrenaline can enhance strength or stamina.

However, while Dialogue In the Dark deals with people who are already blind, I'd like to ask: what about people in the process of becoming blind?

Say someone has glaucoma. Or just a really strong glasses prescription they aren't wearing. That person hasn't lost the sense; in fact, that sense has become a heavier burden on the energy store. This may result in slower reasoning, slower distinction between visual cues and - since these parts of the cognitive process are shared, are central to our overall awareness, and may become a bottleneck for other modalities and peripheral processes - essentially, worse hearing.
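The bottleneck argument above can be sketched as a toy model: one shared pool of attentional "energy", with each sense drawing from it. The function name and all the numbers below are made up purely for illustration - this is a sketch of the shared-pool idea, not any actual theory's formula:

```python
# Toy model of limited attentional capacity shared across senses.
# All names and values are hypothetical, for illustration only.

def hearing_quality(total_capacity: float,
                    visual_cost: float,
                    auditory_demand: float) -> float:
    """Fraction of auditory demand that can be serviced after
    vision has taken its share of the shared pool."""
    leftover = max(0.0, total_capacity - visual_cost)
    return min(1.0, leftover / auditory_demand)

CAPACITY = 10.0        # arbitrary units of attentional energy
AUDITORY_DEMAND = 4.0  # what full-quality hearing would need

# Glasses on: vision is cheap to process, hearing gets what it needs.
with_glasses = hearing_quality(CAPACITY, visual_cost=5.0,
                               auditory_demand=AUDITORY_DEMAND)

# Glasses off: the degraded input makes vision more expensive,
# leaving too little in the pool for hearing.
without_glasses = hearing_quality(CAPACITY, visual_cost=8.0,
                                  auditory_demand=AUDITORY_DEMAND)

print(with_glasses)     # 1.0 - full auditory processing
print(without_glasses)  # 0.5 - "worse hearing"
```

The point of the sketch: nothing about the ears changed between the two calls - only how much of the shared pool the (impaired) visual channel consumed.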

So... if you put your glasses on, you may actually hear better, or at least perceive and process it better.

So what?

I'm experimenting in Unity, as part of the learning process, with building a simple AR or VR experience that will try to convey these modal exchanges. I've actually already built two Scenes for this - one a basic Google VR and Vuforia AR scene utilizing a mobile phone's main camera; the other just a Google VR scene, set in the urban map I have set up for my RTS project.

This isn't a game, just an experience. Something bite-sized that will show diminished or impaired vision and try to simulate some form of induced deafness, with a few "in game" options, like putting your glasses on or closing your eyes - or maybe meditation.
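One way those "in game" options could drive the simulation is as a simple table mapping each option to the audio-visual degradation parameters the scene applies. Everything here - option names, parameter names, values - is hypothetical, just to show the shape of the idea:

```python
# Hypothetical mapping from "in game" options to the degradation
# parameters a scene might apply: how blurred the view is, and how
# muffled the audio is (both 0.0 = unaffected, 1.0 = fully degraded).
OPTIONS = {
    "glasses_off": {"blur": 0.8, "audio_muffle": 0.6},
    "glasses_on":  {"blur": 0.1, "audio_muffle": 0.1},
    # Closing your eyes removes the costly visual channel entirely,
    # so in this sketch hearing comes back to full clarity.
    "eyes_closed": {"blur": 1.0, "audio_muffle": 0.0},
    "meditation":  {"blur": 1.0, "audio_muffle": 0.0},
}

def apply_option(name: str) -> tuple[float, float]:
    """Return the (blur, audio_muffle) pair for a chosen option."""
    params = OPTIONS[name]
    return params["blur"], params["audio_muffle"]
```

The interesting design choice is in the "eyes_closed" row: unlike "glasses_off", shutting the impaired channel down completely is modeled as freeing the pool for hearing.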

The idea is to make it a quick learning experience.

You may notice that in the beginning of this blog post I described this endeavor as a "diversion", with quotation marks. It's because the two - this experience and the RTS - are related. Closely even.
I don't mean to make an RTS to simulate how vision impaired people play one, of course.
The relation comes from the design of the Machine faction, the Smart Agent-enhanced faction. These same ideas, which stem from the Attentional Resources Theory, may very well be the chink in the AI army's armor.

We should all keep that in mind, for the coming Singularity Apocalypse - Processing Power, that's the Achilles' Heel of Skynet.

More to come....


  1. To keep some order, I'm going to post here some of the stuff I find relevant to the post.

  2. Okay, so this is currently broken - in Unity 5.whatever.
    Which kinda sucks, because parallel sound processing and visualization are already working, more or less.