Giuseppe De Lauri


The Avatarization Process

Design Tools: Premiere, Sketch, MakeHuman, Python, IBM Watson, Dialogflow, Go

The Avatarization Process is my thesis project for the M.S. program Integrated Digital Media at NYU. My prototype consists of five steps that transform data collected from social networks into a visual avatar.

 
 
 

_the project

Our world is immersed in smoothing and filtering technology. Avatarization lets users envision what they would physically look like based on their social media content, as well as how they would be imagined by commercial Artificial Intelligence systems. My prototype also has value for marketing, as avatars can be used in customer service or, more generally, to attract consumers to products and platforms. Think about Amazon’s Alexa. What if her voice were tailor-made just for you? Or, in the future, what if she appeared as a hologram that looked exactly like how you present yourself on social media, filters and all? This type of avatar visually encompasses how we want to be seen and designs an idealized digital appearance for us. Avatarization can be used to shape how we engage with technology and vice versa, forever transforming the dynamics of Human-Machine Interaction.

 

_the process

 
 
 

_step 1: gathering data

Using the software tools NCapture and NVivo, I gathered data from my testers' Facebook and Twitter accounts, then organized that information into topics and categories.
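As a rough illustration of this coding step, here is a minimal Python sketch that buckets posts into topic categories by keyword matching. The topic names and keywords are hypothetical, and NVivo performs this kind of classification interactively rather than with a script like this.

```python
from collections import defaultdict

# Hypothetical topic keywords; stand-ins for the categories built in NVivo.
TOPICS = {
    "work": {"office", "project", "deadline", "meeting"},
    "travel": {"trip", "flight", "beach", "vacation"},
    "food": {"dinner", "recipe", "pizza", "coffee"},
}

def categorize(posts):
    """Assign each post to every topic whose keywords it mentions."""
    buckets = defaultdict(list)
    for post in posts:
        words = {w.strip("?!.,") for w in post.lower().split()}
        for topic, keywords in TOPICS.items():
            if words & keywords:
                buckets[topic].append(post)
    return dict(buckets)

posts = ["Great pizza for dinner!", "Another deadline at the office"]
print(categorize(posts))
# {'food': ['Great pizza for dinner!'], 'work': ['Another deadline at the office']}
```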


_step 2: data analysis

I imported the information into IBM Watson Personality Insights, a cloud service for Natural Language Processing. It analyzes text and draws connections among categories of words and meanings.


IBM Watson’s output is a detailed personality summary, mapped in a sunburst chart that highlights the Big Five traits (Agreeableness, Conscientiousness, Extraversion, Openness, Neuroticism).

Tester No. 1 Sunburst Chart

Tester No. 2 Sunburst Chart

Tester No. 3 Sunburst Chart
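For readers curious about the shape of Watson's response, the following Python sketch extracts the Big Five percentiles from a trimmed-down Personality Insights profile. The numbers here are invented, and real responses carry many additional fields (facets, needs, values).

```python
import json

# A trimmed, invented example of a Personality Insights profile response.
profile_json = """
{
  "personality": [
    {"trait_id": "big5_openness", "name": "Openness", "percentile": 0.83},
    {"trait_id": "big5_conscientiousness", "name": "Conscientiousness", "percentile": 0.41},
    {"trait_id": "big5_extraversion", "name": "Extraversion", "percentile": 0.67},
    {"trait_id": "big5_agreeableness", "name": "Agreeableness", "percentile": 0.74},
    {"trait_id": "big5_neuroticism", "name": "Emotional range", "percentile": 0.29}
  ]
}
"""

def big5_percentiles(profile):
    """Map each Big Five trait name to its percentile score."""
    return {t["name"]: t["percentile"] for t in profile["personality"]}

scores = big5_percentiles(json.loads(profile_json))
print(scores["Openness"])  # 0.83
```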

 

_step 3: avatar modeling

In order to transform these data and the corresponding personality traits into visual avatars, I used a combination of three programs: a Qt interface into which I entered the data percentages, a Python script that translated those inputs into modeling commands, and MakeHuman as the avatar modeler.
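A minimal sketch of the translation idea, assuming an arbitrary pairing between personality traits and MakeHuman-style modifier names: the pairing and the rescaling below are illustrative, not the mapping actually used in the thesis.

```python
# Hypothetical trait-to-slider pairing; MakeHuman exposes named modifiers
# whose values typically range from -1 to 1.
TRAIT_TO_MODIFIER = {
    "Extraversion": "macrodetails-height/Height",
    "Neuroticism": "macrodetails-universal/Muscle",
    "Openness": "macrodetails/Age",
}

def to_modifier_values(percentiles):
    """Rescale each percentile from [0, 1] to a [-1, 1] slider value."""
    return {
        TRAIT_TO_MODIFIER[trait]: round(2 * p - 1, 3)
        for trait, p in percentiles.items()
        if trait in TRAIT_TO_MODIFIER
    }

print(to_modifier_values({"Extraversion": 0.67, "Openness": 0.25}))
# {'macrodetails-height/Height': 0.34, 'macrodetails/Age': -0.5}
```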

 
 
 

_step 4: chatbot

I had created the body of the avatar; now I had to create its mind, a basic Artificial Intelligence.


I built the chatbot with Dialogflow, a Google-owned Natural Language Processing platform. I trained it on the data from Step 1 (posts, tweets, comments) and Step 2 (the personality summary), creating a JSON database with more than 50 questions and at least three answers per question. When you ask the chatbot a question, Dialogflow picks the answer from the JSON database based on relevance.
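The selection logic can be sketched as follows. The questions, answers, and word-overlap scoring here are illustrative stand-ins for the real database and for Dialogflow's NLP-based intent matching.

```python
import json
import random

# Miniature, invented version of the Q&A database: each entry has one
# question and several candidate answers.
qa_db = json.loads("""
[
  {"question": "are you happy",
   "answers": ["I feel blessed.", "Life is good.", "Absolutely."]},
  {"question": "what do you do for work",
   "answers": ["I design things.", "I am a designer.", "Design, mostly."]}
]
""")

def answer(user_question, db):
    """Pick the entry whose question shares the most words with the input,
    then return one of its stored answers at random. (Dialogflow ranks
    intents with NLP rather than simple word overlap.)"""
    words = {w.strip("?!.,") for w in user_question.lower().split()}
    best = max(db, key=lambda e: len(words & set(e["question"].split())))
    return random.choice(best["answers"])

print(answer("Are you happy?", qa_db))
```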

 

_step 5: double interview

I wanted to compare the real person with the avatar created out of her social media data. First, I interviewed my tester in person, asking 20 random questions. Next, I assembled the digital avatar by combining the body from MakeHuman with the mind from Dialogflow. Lastly, I questioned the avatar, recorded the answers, and edited both interviews together in Premiere CC.

 
 
 

_conclusion

Humans think. Avatars know.

It may seem like a trivial conclusion, but it’s crucial. The way humans think is unpredictable and susceptible to change with every passing moment. The AI, by contrast, only follows the traces the user leaves on social media. Over time (in this case, my tester’s Facebook and Twitter between 2017 and 2018), the real user may have changed her mind from what she posted. This is particularly evident in the last question, “Are you happy?” While the tester thought for a moment and then replied, “At the moment, no,” the avatar answered that she was blessed.

 

_applications

Avatarization has both commercial and ethical potential.

AI systems can employ users’ avatars on their platforms and in their feeds, literally portraying each user’s ideal appearance, which attracts more business. The development of digital avatars dramatically transforms Human-Machine Interaction: AI systems come to engage directly with the way we engage with digital media.

Constructed avatars can also show users the impression their digital footprint makes on others. Our social media says something about us to our followers; the avatar makes that impression visible.
