Find out more about the teams of young Canadians who took home a collective $20,000 in cash prizes to further pursue their business ideas in the Hack Against Hate Challenge
There’s been a sharp rise in hate crimes across Canada in recent years, and it’s time to put a stop to it. In fact, police-reported hate crimes in Canada have reached their highest level since 2009.
As Canadians, we tend to think our country doesn’t have these sorts of issues – but we must recognize the shortcomings in our systems and work together to develop solutions that mitigate hate, making our world a safer, more inclusive, and happier place to live.
In the wake of this rise in hate crimes, the DMZ and Penny Appeal Canada teamed up to launch Hack Against Hate. The four-day national competition ran from November 23rd to 26th and challenged young Canadians to brainstorm and build a prototype for a digital solution that combats hate crimes. At the end of the competition, a panel of judges picked the top four teams to each receive $5,000 in cash prizes.
The hackathon kicked off with 40+ teams. Each team went through professional training and mentorship on building and pitching a tech solution. Participants received hands-on support to ideate and build innovative anti-hate tech solutions and took part in expert-led workshops on design thinking, product development, UX/UI, customer discovery, pitching, and more.
Last Friday, the DMZ and Penny Appeal Canada held the finals, where the winning teams presented their solutions. The finals were open to the public and featured speakers from the DMZ and Penny Appeal Canada, as well as keynote speaker Nabeela Ixtabalan, Executive Vice President of People and Corporate Affairs for Walmart Canada.
The DMZ awarded $20,000 in funding to help the teams kick-start their solutions. While all of our winning teams were made up of high school students, their solutions to put a stop to hate crimes were anything but juvenile.
Check out the winning teams!
PROtectABot
Team Members: Arya Peruma, Harshul Gupta, and Peter Lee
PROtectABot is an AI-powered bot that filters hatred and educates users on harmful content on social networking platforms.
“Discord is a very popular social networking app that has over 150 million monthly users. However, it does not have built-in or external systems to prevent hatred from spreading,” highlighted Arya.
Harshul explained how Discord played a large role in the deadly 2017 Charlottesville rally, as it was used to coordinate logistics and encourage violence ahead of the event. “Though at the time Discord cracked down on hate crimes, there is no real-time personalized moderation in Discord, which is exactly what we were hoping to tackle with this project.”
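To make that gap concrete, here is a minimal sketch (not PROtectABot’s actual code) of what real-time, bot-based moderation on Discord can look like, using the discord.py library. The flag_hate_speech() function is a hypothetical stand-in for the team’s AI classifier.

    # Minimal sketch of a real-time Discord moderation bot (discord.py 2.x).
    # flag_hate_speech() is a hypothetical placeholder for an AI classifier.
    import discord

    intents = discord.Intents.default()
    intents.message_content = True  # the bot needs permission to read message text
    client = discord.Client(intents=intents)

    def flag_hate_speech(text: str) -> bool:
        """Placeholder check; a real bot would call a trained model here."""
        blocked_terms = {"example_slur"}  # stand-in word list
        return any(term in text.lower() for term in blocked_terms)

    @client.event
    async def on_message(message: discord.Message):
        if message.author == client.user:  # ignore the bot's own messages
            return
        if flag_hate_speech(message.content):
            await message.delete()  # remove the harmful message in real time
            await message.channel.send(
                f"{message.author.mention}, that message was removed because it "
                "contained harmful language."  # the educational follow-up
            )

    client.run("YOUR_BOT_TOKEN")  # placeholder bot token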
Pridtect
Team Members: Harsehaj Dhami and Samantha Ouyang
Pridtect is a solution working to ensure pride parades are safe spaces. The app uses safe-zone mapping and distress signalling.
Harsehaj highlighted the rise in hate crimes at pride parades, and how some members of the LGBTQ+ community are left feeling scared to attend. “So many different people from different backgrounds come together to unite for the pride they have for themselves. But it can be dangerous. Hate crimes at pride parades are at an all-time high.”
“There is no tangible solution currently that is working to improve safety at pride parades. But we want to change that with our app. Parade-goers and organizers will now be able to obtain the utmost safety.”
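As an illustration only (not Pridtect’s actual implementation), safe-zone mapping can be as simple as routing a distress signal to the nearest pre-vetted safe location along the parade route. The coordinates and helpers below are hypothetical.

    # Hypothetical sketch: route a distress signal to the nearest safe zone.
    from math import radians, sin, cos, asin, sqrt

    # Placeholder safe zones along a parade route: (name, latitude, longitude)
    SAFE_ZONES = [
        ("Community centre", 43.6532, -79.3832),
        ("Volunteer tent", 43.6500, -79.3900),
        ("First-aid station", 43.6480, -79.3790),
    ]

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    def nearest_safe_zone(lat, lon):
        """Return the safe zone closest to a distress signal's location."""
        return min(SAFE_ZONES, key=lambda zone: haversine_km(lat, lon, zone[1], zone[2]))

    # A parade-goer triggers a distress signal from their phone's GPS position
    print(nearest_safe_zone(43.6510, -79.3820))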
Specula
Team Members: Adam Omarali, Eamonn Lay, Colin Hill, and Navid Farkhondehpay
Specula is working to make people aware of racial biases before they post on social media platforms, reducing harmful psychological effects on others.
“Race is one of the biggest biases that lead people to commit hate crimes, and physical hate crimes are way more prevalent than online crimes,” explained Adam.
Adam also spoke to how a lot of physical hate crimes today are actually driven by psychological bias. “Our explicit and implicit biases are shaped by the media. They impact how we view things. At some point, if you can express hate, these biases can come out in physical crimes.”
Unhate
Team Members: Gabriel Bernal, Ryan Chan, Aryan Jha, and Yelim Kim
Unhate is an AI tool that helps detect hate speech online and can be integrated into consumer apps and educational services.
Gabriel spoke to the rise of hate speech and its unfortunate prevalence online around the world.
“The internet was supposed to be something that would connect the world, but instead it’s leading some people to their death. This is exactly why we felt compelled to solve this problem.”
Unhate’s AI models are trained on more than 100,000 labelled real tweets, which allows the tool to detect hate speech with a high degree of accuracy.
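As a rough illustration of that approach (not Unhate’s actual pipeline), a hate speech classifier can be trained on labelled tweets with off-the-shelf tools such as scikit-learn. The tiny inline dataset below is a placeholder for the 100,000 real tweets.

    # Hypothetical sketch: train a simple hate speech classifier on labelled tweets.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Placeholder data; Unhate trains on 100,000+ categorized real tweets.
    tweets = [
        "You people deserve to suffer",      # hateful example
        "Hope everyone has a great day",     # benign example
        "Go back to where you came from",    # hateful example
        "Congrats on the new job!",          # benign example
    ]
    labels = [1, 0, 1, 0]  # 1 = hate speech, 0 = not hate speech

    # TF-IDF features feeding a logistic regression classifier
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(tweets, labels)

    print(model.predict(["Hope everyone has a wonderful day"]))  # should predict [0]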