Lauren McCarthy


Lauren McCarthy is an American artist and computer programmer. McCarthy creates artworks using a variety of media and techniques, including performance, artificial intelligence, and programmed computer-based interaction. She created p5.js, an open-source JavaScript library that adapts the Processing programming environment for the web.

Education

McCarthy graduated from MIT with a BS in Computer Science and a BS in Art and Design. At MIT she studied technology's impact on physical interactions with her work Tools For Improved Social Interactions, for which she made an Anti-Daydreaming Device, a Happiness Hat, and a Body Contact Training Suit out of knitted, wearable material. The devices included sensors to monitor the wearer and deliver uncomfortable stimuli if the user was not doing what the piece was designed to achieve; for example, if the user did not smile widely enough while wearing the Happiness Hat, a spike would poke the back of their neck. For her thesis at MIT, McCarthy focused on the similarities between virtual and physical interactions by comparing gym culture and social networking culture.
McCarthy received her MFA from UCLA in 2011 and has been an assistant professor there since 2016.

Artificial intelligence projects

McCarthy often creates works that humanize the roles that smart devices like Amazon Alexa or Google Home take on. The idea for most of these projects is rooted in McCarthy's social anxiety: getting to know people, and the small talk necessary to build connections, is stressful for her. She has stated that she felt jealous of how Amazon Alexa automatically takes on an intimate place in people's lives.
In 2017, for her work LAUREN, she installed cameras, microphones, and speakers in her apartment, then interacted with visitors by performing the role of an assistive technology similar to Amazon Alexa. The roles were reversed in her project SOMEONE, in which visitors had 24-hour access to and control of McCarthy's home.
In her collaborative work Waking Agents, visitors are prompted to lie down and use "smart" pillows that can hold conversations, play music, ask the user's name, tell stories, and act as an overall guiding intelligence. The users were unaware that the "smart" pillows they were conversing with were actually human performers whose voices were disguised to sound like AI assistants.
McCarthy collaborated with David Leonard on the project I.A. Suzie to examine how artificial intelligence is used as a caretaking device and how the user forms a relationship with the device. For this project, McCarthy and Leonard acted as a smart home device in the home of Mary Ann, an 80-year-old woman living in North Carolina. For an entire week they kept 24-hour watch over Mary Ann and had the ability to speak with her, control the lights, and operate the appliances.

Social media projects

McCarthy has explored social media in projects that aim to connect with others and meet new people with the help of technology. She wished there were a computer program that could scour social media profiles and automatically make friends for her in real life, and she decided to do this manually in her work Friend Crawl, a project she live-streamed on the internet. For 10 hours a day over a week, McCarthy looked at more than 1,000 social media profiles, spending about five minutes per profile.

Another project she live-streamed was her 2013 work Social Turkers. McCarthy wanted to explore what including an unbiased third party would do to a social situation and whether it could provide her with helpful instruction. To do this, she employed Amazon Mechanical Turk workers to comment on OkCupid dates that she secretly recorded and live-streamed. McCarthy met her husband through this project; he had been watching one of the live streams. On the website McCarthy made for the project, she posted 16 public logs ranging from January 4 to January 30. These logs include her personal thoughts on how the dates went as well as the transcripts she received from the Turk workers.
McCarthy helped create Social Soul, a large installation for the TED Conference made with Delta Air Lines and MKG. She and her partner Kyle McDonald worked to bring to life the Twitter pages of participants, TED presenters, and attendees by streaming the social media profiles in an immersive 360-degree environment, in which the viewer is surrounded by monitors, mirrors, and sounds all relating to an individual's feed. The project used custom algorithms to match the viewer with other attendees by showing them a stranger's social feed. Once viewers left the simulation, they received a tweet connecting them to the person the algorithm had matched them with, so that after streaming another person's social media feed they could connect with that individual in person.
In Follower, a 2016 work, users could use an app to voluntarily request that a person follow them around New York for an entire day, without knowing the identity of the follower. McCarthy collaborated with Kyle McDonald again on How We Act Together, a work that encourages viewers to follow computer-generated prompts to interact with a video persona by nodding, screaming, greeting, or making eye contact with the projection.