Canada’s hate laws raise free speech fears
Cheryl Bowman, The Rural Alberta Report
October 12, 2025

Canadian Politics
Two major federal bills now before Parliament are raising alarms about how far the government can go in regulating speech and public expression — and how easily those powers could be turned against Canadians.
The first, Bill C-9, known as the Combatting Hate Act, was introduced in September 2025 and is currently before a House of Commons committee. It proposes several changes to the Criminal Code designed to strengthen hate-crime laws. The second, Bill C-63, the Online Harms Act, targets harmful content on digital platforms and expands penalties for hate-related offences.
Together, the two bills mark a sharp increase in the government’s role in deciding what constitutes hateful or dangerous expression. Supporters say they are long overdue updates to protect vulnerable communities. Critics say they risk creating a regime where unpopular speech is criminalized and online debate tightly controlled.
The Combatting Hate Act now before Parliament would create a new offence for wilfully promoting hatred by publicly displaying symbols linked to terrorist entities or recognized hate groups, including the Nazi swastika. It would establish “hate-motivated crime” as a distinct offence to ensure harsher sentences for acts driven by hatred. It would also make it a crime to obstruct or intimidate people entering places of worship, schools or community centres tied to identifiable groups.
One of the most controversial elements removes the requirement for the Attorney General’s consent to prosecute hate-propaganda offences, giving police and Crown attorneys the power to pursue charges independently. Civil liberties groups argue that this change could open the door to politically motivated prosecutions or the criminalization of peaceful protest. The bill passed second reading on October 1 and remains under committee review.
Bill C-63, tabled earlier this year, would create a digital safety commission with sweeping powers to oversee social media companies and compel them to monitor, report and remove harmful content. The legislation revives a repealed provision of the Canadian Human Rights Act, the former Section 13, to handle online hate complaints and introduces heavier penalties for criminal hate speech. Platforms could face fines of up to six per cent of global revenues if found non-compliant, while users convicted of hateful conduct could face new criminal sanctions or preventive peace bonds.
The model closely resembles the United Kingdom’s Online Safety Act, which became law in October 2023. That law gave the U.K. communications regulator, Ofcom, authority to demand the removal of content deemed harmful and to fine companies up to ten per cent of their worldwide turnover. British authorities have justified the act as a means to curb harassment and extremist propaganda, but the rollout has shown how quickly regulation can spill into censorship.
Since the U.K. law came into force, platforms have been accused of over-removing content to avoid penalties.
Police made more than 12,000 arrests in 2023 under Britain’s long-standing communications offences for online messages considered “grossly offensive” or distressing, an average of more than 30 a day. Even legal speech has been caught in the net, prompting warnings from journalists and free-speech advocates that the line between protection and suppression is blurring. Tech companies such as X, formerly Twitter, have said the law pressures them to silence political and controversial voices.
Digital rights groups in Canada worry the same patterns will follow here. They say the broad wording in both C-9 and C-63 leaves room for subjective interpretation of what constitutes hate or harm. The removal of the Attorney General’s oversight, combined with new regulatory penalties for platforms, could create a system where authorities and corporations alike err on the side of suppression.
Government officials maintain that the proposed laws are about protection, not control, and that Canadians’ Charter rights will remain intact. But as the British example shows, laws written to safeguard the public can quickly evolve into mechanisms of restraint once enforcement begins.
As Parliament studies both bills, the debate is no longer just about whether hate and harm should be addressed — few dispute that they should — but about how much power the state should hold to decide what Canadians are allowed to say, display and share.