Learning your life
If you are a “regular” person living and working in modern civilization, it is likely that your life has a few recurring patterns:
- on weekdays, you travel from home to work in the morning, and back in the evening; perhaps punctuated with a school or daycare drop-off and pick-up.
- on weekends, you either mostly stay at home or frequent a small set of haunts (beaches, restaurants, etc.).
- infrequently, you break out of this pattern by leaving the orbit of work-home to go on a vacation.
This is easy to figure out by observing someone for a short period of time. Of course, your phone already does this.
The above model of your life could be described in a few kilobytes. It is a zeroth-order machine learning problem. Is that depressing? Does that mean your whole life has a few kilobytes worth of complexity?
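To make the "few kilobytes" claim a little more concrete, here is a minimal sketch of such a zeroth-order model (Python, with made-up observations and location labels; not anything your phone or any particular product actually runs): count which location shows up most often in each (day-of-week, hour) slot, and predict that.

```python
# A zeroth-order "model of your life": a lookup table from
# (day-of-week, hour) to the most frequently observed location.
# Observations and labels below are hypothetical.
from collections import Counter, defaultdict
import sys

# Hypothetical location observations, e.g. from a phone: (day, hour, place).
observations = [
    (0, 8, "commute"), (0, 9, "work"), (0, 18, "commute"), (0, 19, "home"),
    (5, 11, "beach"), (6, 13, "restaurant"), (6, 20, "home"),
]

# Count how often each place appears in each (day, hour) slot.
counts = defaultdict(Counter)
for day, hour, place in observations:
    counts[(day, hour)][place] += 1

# The whole "model": for every slot, remember only the most common place.
model = {slot: counter.most_common(1)[0][0] for slot, counter in counts.items()}

def predict(day, hour, default="home"):
    """Zeroth-order prediction: look up the slot, fall back to a default."""
    return model.get((day, hour), default)

print(predict(0, 9))  # -> "work"
# 7 days x 24 hours x a short location label is on the order of a few kilobytes.
print(sys.getsizeof(model), "bytes of dict overhead (order of magnitude only)")
```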
Free will can now be defined as the ability to surprise a prediction model.
— Vivek Haldar (@vivekhaldar) May 14, 2014
John Foreman, in his poignantly named piece “Data Privacy, Machine Learning and the Destruction of Mysterious Humanity”, takes a view of the field from the inside, and it sounds like that of a physicist working on the Manhattan Project worrying about the impact of what they’re building.
Our past data betrays our future actions, and rather than put us in a police state, corporations have realized that if they say just the right thing, we’ll put the chains on ourselves… Yet this loss of our internal selves to the control of another is the promise of AI in the hands of the private sector. In the hands of machine learning models, we become nothing more than a ball of probabilistic mechanisms to be manipulated with carefully designed inputs that lead to anticipated outputs… The promise of better machine learning is not to bring machines up to the level of humans but to bring humans down to the level of machines.
That is a dim, if plausible, view.
It is based on a pessimistic view of humans as always giving in to nudges to behave as our worst selves. I hope that we are better than that. We are already surrounded by advertising for junk food, and yet a substantial fraction of us manage to ignore that and eat healthy. I hope machine predictions that try to nudge us face a similar fate: predictions that look down on us will be indistinguishable from low-quality ones. The quality and tone of predictions will become an important part of the brand identity of the company emitting them. Do you really want to be the company known for peddling fatty, salty food, legal addictive substances, and predatory loans?
A tangential point: naming matters. Statistical inference, which is what machine learning really is, is the less scary name: “statistical inference” conjures images of bean-counters with taped glasses, while “machine learning” conjures an inevitable army of terminators.
Circling back to the question I posed above: is your life really describable by a few kilobytes in a predictive model? No. A model is a probabilistic abstraction. You, and your life, are not a sequence of probabilities, in much the same way that hiking a trail is a far richer, “realer” experience than tracing your way through a detailed trail map.