After graduating from McGill, Dutt headed the marketing department of a Y Combinator startup. She then took over the internal operations of Notman House, Montreal's Google for Entrepreneurs tech hub, where she fostered the local startup community by supporting and promoting innovative ventures and initiatives. Drawing on her background in economics and innovation, Dutt co-founded Botler AI in 2017 to make the legal system more accessible through artificial intelligence.
Prior to co-founding Botler, Dutt faced a stalker during a terrifying, months-long ordeal. The man showed up at her workplace every day, tracked her location through social media, and even followed her to her home. Though fearful, Dutt found herself making excuses, thinking "It’s all in my head" or "I don't know if something is really wrong or if I'm too sensitive". She did not know what her rights were, what she should do, or whether the man’s actions were illegal. The experience left her feeling trapped, and she struggled to call it what it was: stalking, or criminal harassment under Canadian law. Months later, after the Harvey Weinstein sexual abuse allegations and the ensuing spread of the #MeToo movement, Dutt researched the relevant legal codes and learned that what had happened to her was a crime. She gained confidence from learning there was a legal basis for what she had felt and that her discomfort had been justified. Dutt realized that sexual harassment was a far bigger issue than she had imagined and found herself angered, thinking “How many people think they can do this and get away with it?".
Botler AI
In December 2017, motivated to take action by her personal experiences, Dutt led Botler AI to launch a free tool to help survivors of sexual harassment determine whether their rights had been violated. The tool was intended as an impartial resource to empower the average person through information and education, without fear of judgment. Dutt's premise was that, unlike humans, a robot has no prejudice of race, gender, sexual orientation or socio-economic background, and would never ask “What were you wearing?” or “How many drinks did you have?”; it could therefore serve as an emotion- and judgment-free neutral resource for complainants. The artificial intelligence system, which used deep learning, was trained on over 300,000 court documents from Canada and the United States. Natural language processing was used to determine whether an incident described by the user could be classified as sexual harassment. Based on their jurisdiction, the user was provided with a summary of the relevant legal codes and a detailed report of the incident, which could be handed over to the relevant authorities, from HR to the police, if desired. The goal was not to tell the user whether they could win a case in court, but to empower them with confidence grounded in legal doctrine. Dutt stressed, “Once people have the information then it’s up to them what they want to do with it... maybe they feel comfortable to approach somebody like HR…or maybe it makes them feel better that it’s not just in my head, and I have the right to stand up to my abuser because I have rights in this situation.” Dutt also commented, “This is just the first step”, and revealed plans to expand Botler to connect users with resources appropriate to their situation, including legal representation.
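The classification step described above can be illustrated with a toy example. The following is a minimal sketch only, not Botler AI's actual system (which reportedly used deep learning and natural language processing trained on court documents): a simple Naive Bayes text classifier trained on a handful of hypothetical labeled incident descriptions, showing the general idea of deciding which legal category a free-text description most resembles.

```python
import math
from collections import Counter

def tokenize(text):
    # Crude whitespace tokenizer; a real system would use proper NLP.
    return text.lower().split()

class NaiveBayesClassifier:
    """Toy bag-of-words Naive Bayes classifier with Laplace smoothing."""

    def __init__(self):
        self.word_counts = {}          # label -> Counter of word frequencies
        self.label_counts = Counter()  # label -> number of training examples
        self.vocab = set()

    def train(self, examples):
        # examples: list of (text, label) pairs
        for text, label in examples:
            self.label_counts[label] += 1
            counts = self.word_counts.setdefault(label, Counter())
            for word in tokenize(text):
                counts[word] += 1
                self.vocab.add(word)

    def predict(self, text):
        total = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label, n in self.label_counts.items():
            # log prior + log likelihood of each word, Laplace-smoothed
            score = math.log(n / total)
            counts = self.word_counts[label]
            denom = sum(counts.values()) + len(self.vocab)
            for word in tokenize(text):
                score += math.log((counts[word] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Hypothetical, hand-written training data for illustration only.
clf = NaiveBayesClassifier()
clf.train([
    ("he followed me home repeatedly", "harassment"),
    ("he sent unwanted messages every day", "harassment"),
    ("we discussed the quarterly budget", "not_harassment"),
    ("the meeting covered project deadlines", "not_harassment"),
])
print(clf.predict("he followed me every day"))  # → harassment
```

A production system would replace the toy tokenizer and Naive Bayes model with a deep-learning pipeline trained on a large legal corpus, but the input/output contract is the same: free-text description in, a classification out.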