In Foreman’s article, he claims that big data and prediction algorithms dehumanize us: “We become nothing more than a ball of probabilistic mechanisms to be manipulated with carefully designed inputs that lead to anticipated outputs.” In short, we are nothing but finely tuned mathematical models; companies provide a certain product with the right advertising (input), and we buy it all up (output). Personally, I think this is silly. Humans are not something that can be simplified and quantified. As technology advances and we collect more data, we are better able to predict what will make a consumer buy a product. But that’s all it is: a prediction. The choice still comes down to the individual. We may choose to act rationally and follow our previous patterns, or we may choose to act on impulse and do something unprecedented. That’s the great thing about being human: we’re variable.

Foreman argues that by predicting what we will buy and learning how to manipulate us through advertising, companies will strip us of our freedom of choice. Once they have hacked our decision-making process, it will be all over for us: “The meaning of our lives will decrease.” But even he admits that individually tailored advertising may make us happier. Corporations will know what we want before we even know we want it, and it will be right there waiting for us. There will be no searching for your next book; your Kindle will already know and have it loaded. As long as we still have the freedom to choose whether to read that book or some other book, I see mostly benefits.
On the other hand, when we move beyond advertising, big data and prediction algorithms can get ugly. In Marwick’s piece, she mentions how large data-gathering corporations have been tricked into selling sensitive data to hackers and crime rings, who can piece that data together to exploit you and your bank account. Big data can be a good thing, but we have to be responsible with it.