I’ve previously written about dark patterns because, on their face, they represent an ethical problem in technology. Just because you can make it difficult for a customer to close a pop-up, for example, doesn’t mean you should. And, as we know by now, technologists take no oath to behave ethically (quite the opposite, given the proliferation of the “move fast and break shit” ethos), and the government has neglected to regulate.
Until now, that is. California has a new law on the books to address dark patterns, one that complements the California Consumer Privacy Act.
However, time will tell whether the regulation, like the CAN-SPAM Act and the Do Not Call Registry, will lack the teeth for meaningful enforcement. Wired contends that specificity about what will be covered is still lacking, which leads me to believe it will be difficult for the law to impose real consequences on offenders:
California’s first-in-the-nation status on regulating dark patterns comes with a caveat. It’s not clear exactly which dark patterns will become illegal when the new law takes full effect in 2023; the rules are to be determined by a new California Privacy Protection Agency that won’t start operating until later this year.
The more I learn about the human condition, the more essential governance seems to become. We need rules and codes of conduct to help us distinguish what’s merely pushing the envelope from what’s just evil. We will need to watch what happens in California carefully, as it will serve as a litmus test for whether the rest of the country can make progress.