baaria chaudhary

mixed reality artist | storyteller | postcard maker

ai in vr

If a human posed as an A.I. bot and behaved like an A.I. in virtual reality, at what point would the player begin to suspect that the A.I. is actually human?

To answer the question above, we created a little 'experiment' to test whether a person could distinguish human movement and human responses from A.I.-generated ones. The premise: an A.I. interviewer could give companies and universities (anything that requires an interview process) a better way to screen applicants. The user submits a job application and resume before the experience, and during the interview the A.I. uses the data from the resume to tailor the experience to its current user. At the same time, we capture data from the headsets and position tracking, and conduct a survey at the end of the experience.
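Below is a minimal sketch of how the headset and position-tracking capture could be wired up in Unity. The component name, CSV layout, and file location are assumptions for illustration, not the project's actual code.

using System.IO;
using UnityEngine;

// Hypothetical example: samples the tracked headset's position and rotation
// every frame and appends it to a CSV file for later analysis.
public class HeadsetLogger : MonoBehaviour
{
    public Transform headset;        // assign the VR camera / center eye anchor in the Inspector
    private StreamWriter writer;

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, "headset_log.csv");
        writer = new StreamWriter(path, true);
        writer.WriteLine("time,pos_x,pos_y,pos_z,rot_x,rot_y,rot_z");
    }

    void Update()
    {
        Vector3 p = headset.position;
        Vector3 r = headset.eulerAngles;
        writer.WriteLine(string.Format("{0},{1},{2},{3},{4},{5},{6}",
            Time.time, p.x, p.y, p.z, r.x, r.y, r.z));
    }

    void OnDestroy()
    {
        if (writer != null) writer.Close();
    }
}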

  • collaborators: marjorie wang, christian grewell, guillermo carrasquero
  • tools: unity3d, axis neuron, ableton live, oculus rift
  • made for: fun :)
  • project date: december 2016

level 1
Designing and testing the office scene in Unity.
level 2
The human puts on the Axis Neuron suit, which feeds motion capture data to the 3D-animated robot in the office scene the user sees in VR. The human sits in another room, and their voice is distorted through Ableton Live to make it sound more like an A.I. bot.
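As a rough illustration of the retargeting idea, the sketch below copies joint rotations from a mocap feed onto the robot's bone Transforms each frame. The data source is abstracted into a placeholder method; the actual project used the Axis Neuron streaming toolchain, and all names here are hypothetical.

using UnityEngine;

// Illustrative only: applies joint rotations received from a mocap suit
// to the robot rig so the robot mirrors the human's movements in VR.
public class RobotRetarget : MonoBehaviour
{
    public Transform[] robotBones;   // robot rig bones, in the same order as the suit's joints

    void Update()
    {
        Quaternion[] jointRotations = GetLatestRotations();   // hypothetical mocap feed
        int count = Mathf.Min(robotBones.Length, jointRotations.Length);
        for (int i = 0; i < count; i++)
        {
            robotBones[i].localRotation = jointRotations[i];
        }
    }

    // Placeholder: in the real setup this data would arrive from the
    // Axis Neuron software over the network; here it returns identity rotations.
    Quaternion[] GetLatestRotations()
    {
        Quaternion[] rotations = new Quaternion[robotBones.Length];
        for (int i = 0; i < rotations.Length; i++) rotations[i] = Quaternion.identity;
        return rotations;
    }
}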
level 3
In the first user test, the participant actually believed the human was more A.I.-like than the actual A.I.