SIGGRAPH came back to Colorado, to the Colorado Convention Center, for its 50th anniversary conference; the original SIGGRAPH conference was held in Boulder in 1974.
The first SIGGRAPH keynote was a session called Beyond the Illusion of Life, presented by Mark Sagar, co-founder and former Chief Science Officer of Soul Machines.
The theme of the session was that AI needs an embodiment to achieve a true breakthrough. Without embodiment, AI is just another secluded machine function, and interacting with it will always be divorced from human existence and, as such, much harder than interacting with other people.
As an example of embodied AI, Mark presented BabyX, a virtual 12-24 month old infant.
BabyX shows how creating a digital embodiment of a human can lead to faster, easier, and more inherently natural human-machine interactions. This is because we, as humans, have evolved to interact with other humans and do this much better and faster than we can interact with machines, chatbots, and other digital simulacra.
With BabyX, they have created an emulation rather than an animation or simulation of a human.
BabyX
BabyX is a virtual infant that interacts with a virtual screen AND with real people on the other side of that screen, simulating a real infant sitting in front of a screen under adult supervision.
BabyX interacts with people in real time using verbal cues, virtual screen images, and virtual hands and fingers.
BabyX appears to actually learn from and interact with different people in real time.
If you check out their video (linked above), you can see just how close the emulation can get.
BabyX’s emulation is based on a digital cognitive architecture that mimics the real brain, including a memory and learning system, a motor control system, a visual system, etc.
All these systems are distinct computational modules that, in unison, represent the “virtual connectome” of BabyX’s brain emulation. Each of these cognitive systems can be swapped in or out whenever better versions become available.
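To make the modular idea concrete, here is a minimal sketch of how swappable cognitive modules composed into a "virtual connectome" might be structured. This is purely illustrative and assumes nothing about Soul Machines' actual implementation; all class and method names are hypothetical.

```python
class CognitiveModule:
    """Interface for one subsystem of the (hypothetical) virtual connectome."""
    name = "base"

    def step(self, state: dict) -> dict:
        """Consume the shared state dict, return this module's contribution."""
        raise NotImplementedError

class VisualSystem(CognitiveModule):
    name = "vision"

    def step(self, state):
        # Pass along whatever the "screen" currently shows as a percept.
        return {"seen": state.get("screen")}

class MemorySystem(CognitiveModule):
    name = "memory"

    def __init__(self):
        self.trace = []  # a deliberately simple episodic memory

    def step(self, state):
        self.trace.append(state.get("seen"))
        return {"recalled": self.trace[-1]}

class Connectome:
    """Runs registered modules in order; any module can be hot-swapped."""
    def __init__(self):
        self.modules = {}  # name -> module; dict preserves insertion order

    def register(self, module: CognitiveModule):
        # Registering under an existing name replaces the older version,
        # which is the "swap in a better module" idea from the talk.
        self.modules[module.name] = module

    def step(self, stimulus: dict) -> dict:
        state = dict(stimulus)
        for module in self.modules.values():
            state.update(module.step(state))
        return state

# Illustrative use: vision feeds memory within a single update tick.
brain = Connectome()
brain.register(VisualSystem())
brain.register(MemorySystem())
state = brain.step({"screen": "red ball"})
# state["recalled"] is now "red ball"
```

The point of the sketch is the decoupling: each subsystem only touches the shared state dict, so a better memory or vision module can replace the old one without changing the rest of the architecture.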

This cognitive architecture was designed to digitally reconstruct the key components of the brain of an 18-24 month old infant.
As a result, BabyX learns through interactions with its environment, by talking with people, and by viewing a screen. They can even simulate hormonal activity, with the end result being the ability to express emotion in real time.
With such a cognitive architecture, one could simulate real (virtual) humans interacting with another person on the other side of a virtual screen.
Soul Machines “virtual” assistants
Soul Machines has taken the BabyX research and created AI avatars used as customer support agents, educational assistants, and in any commercial activity that depends on humans interacting with machines via screens.
It’s unclear just how much of the BabyX cognitive architecture and simulation has made its way into Soul Machines’ avatars, but they do show similar interactions with a virtual screen and with humans, as well as emotional expression.
Soul Machines is in the market of supplying these digital avatars so that companies can provide a better, more human-like experience when interacting with AI.

In any case, BabyX was the first time I saw the true embodiment of an AI that uses a cognitive architecture as it is understood today.
AGI?
One can’t help but think that this is a better, or at least potentially a more correct, way to create human-level artificial intelligence or AGI. BabyX uses a digital emulation of human memory & learning, behavior, attention, etc. to construct a machine entity that acts and interacts much as a human would.
With this sort of emulation, one could imagine training a digital emulation of a human and, after 20 years or so, ending up with a digital human with human levels of intelligence.
And, of course, once we have re-created human-level intelligence, the (industry) view is that all we need do is focus it on improving (machine) learning algorithms and maybe (machine) learning hardware, then let it loose to learn all there is to know in the universe, and somewhere along the way we will have created artificial super intelligence or ASI.
Thankfully, it turns out that BabyX’s long-term memory has been constrained to be temporary and limited, so we aren’t able to see how a TeenX would actually behave (thank the powers that be).
Sagar mentioned some of the ethical issues in letting BabyX have an indefinite, permanent long-term memory.
I’m thinking this won’t stop others from taking this approach on.
Which, in the end, scares the heck out of me.
~~~~
Comments?