Monday, December 14, 2020

Time Is Almost Up To Make Your Canadian Website Accessible


If your organization has a Canadian website (.ca), you should know that Ontario has a web accessibility law that may soon require remediation of your website. 

Under the Accessibility for Ontarians with Disabilities Act (AODA), all private organizations with more than 50 employees (for profit and non-profit) and all public-sector organizations are required to make their websites accessible.

The AODA has been around for a while, and it required all new websites built after 2014 to be WCAG 2.0 Level A compliant. The law did not immediately require remediation of old content, but by January 1, 2021, all web content posted after January 1, 2012 must meet WCAG 2.0 Level AA compliance standards. (Content created before 2012 does not have to be remediated.)

There is no right to sue under the AODA, so don't expect a flood of lawsuits like we've seen in the U.S. There are administrative fines that are small for first offenders but hefty for repeat offenders (up to CA$15,000 per day for repeated, serious noncompliance), so ignore the requirement at your peril.


Sunday, December 6, 2020

Beware of Dark Patterns in Online UI/UX


Have you ever found yourself spending countless hours unintentionally falling down an internet rabbit hole? Have you noticed it is often remarkably easy to sign up for a free trial, but found the experience of unsubscribing Kafkaesque?  The manipulation tactics behind these experiences are known as "dark patterns," and they are emerging as a new area of legal regulation.

[Image: Relativity, by M.C. Escher, 1953]

Technology companies now employ specialized professionals to manage the user interface and user experience (UI/UX), and in many ways the tactics they employ can be quite helpful. But when UI/UX engineering goes awry, as millions have recently learned thanks to the popular documentary film The Social Dilemma, users can suffer.

The Electronic Frontier Foundation describes dark patterns as the opposite of Privacy by Design, and worries the patterns will have dire consequences for personal privacy. When technology is designed to manipulate, obfuscate, or misdirect, one of the first harms the user is likely to experience is a loss of control over their personal information. Financial and other harms are also likely.

Though common on social platforms, dark patterns are also found outside of social media. I am particularly interested in the retail sector these days, and a recent Princeton University study examined more than 10,000 retail sector websites and found that more than 1,200 (about 12%) used dark patterns!

The US Federal Trade Commission recently defined dark patterns as “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.” In a recent enforcement action against Age of Learning for its use of dark patterns in connection with an online learning tool for kids, the FTC claimed Age of Learning made it unduly difficult for users to cancel recurring subscription fees after initially promising “Easy Cancellation.”

Similar concerns exist in Europe.  France's data protection authority, CNIL, has also published a report on dark patterns.

Although Section 5 of the FTC Act likely gives the FTC the power to address egregious dark patterns that are fundamentally unfair or deceptive, a legislative response may be coming. In 2019, a bill was introduced in the US Senate to address these practices, the "Deceptive Experiences To Online Users Reduction (DETOUR) Act." As of today, it's still in committee.

Technology professionals should be aware of dark patterns and take steps to avoid them when designing consumer-facing technologies. (You don't want to end up in a regulator's crosshairs, nor on Harry Brignull's Dark Patterns Hall of Shame!)