I’ve previously written about dark patterns because, on their face, they represent an ethical problem in technology. Just because you can make it difficult for a customer to close a pop-up, for example, doesn’t mean you should. And, as we know by now, technologists take no oath to behave ethically (quite the opposite, given the proliferation of the “move fast and break shit” ethos), and the government has neglected to regulate.
Sometimes things are just coincidence; other times, especially on the internet, someone explicitly went out of their way to make them happen. You may notice this when you casually browse a retail site for a pair of shoes and then, through the magic of something called "retargeting," keep seeing advertisements for that same pair of shoes. At this point, we all see it coming, so it no longer comes as a surprise.
Everyone is tracking us everywhere — and sometimes we willingly let them by volunteering information about ourselves (i.e., what we all do on Facebook day in and day out). This is tolerable as long as everyone understands the bargain: when the product you are using is free, YOU are the product — information about you is sold so that stuff can be sold back to you.
What I find far more disturbing is a trend toward dark patterns in product design itself. I define a dark pattern as a design that takes you, the user, somewhere you don't want to go. It intentionally leads you toward something you may not want — and the endgame is usually something profitable for the product but not so great for the consumer.