"Today, the data we create in our comings and goings is mostly separate from the information we post on Facebook, Twitter, or Google. This division is temporary. The incentives for removing the barriers are already larger than the reasons for keeping them in place.

"For instance, after centuries of trying to use math to figure out who should be with whom, only very recently have we developed the means to measure, on a day-by-day, second-by-second basis, the billions of bits that make up a relationship: the small information exchanges, conversations, and moments of happiness, awe, and disappointment. We can use this data to make better decisions, to avoid affairs of the heart that are doomed from the start, and to strengthen those that we want to keep.

"What are some of the dangers of our new predictive age? Activist and author Eli Pariser wrote of some of them in his book The Filter Bubble. The title refers to a type of 'informational determinism,' the inevitable result of too much Web personalization. The filter bubble is a state in which 'what you've clicked on in the past determines what you see next — a Web history you're doomed to repeat. You can get stuck in a static, ever-narrowing version of yourself — an endless you-loop.'

"Google and Facebook are perhaps the two most obvious examples of companies using your data to better predict your behavior. But you can always opt out of using Facebook, as millions already have. And while cutting Google out of your life isn't as easy as it was a decade ago, there are ways to use Google anonymously. An arguably more pernicious threat is posed by systems and companies that are using our information to make predictions about us without our even knowing.

"As we become participants in systems, networks, and communities where data collection plays a role, and as we encounter more apps, programs, and platforms that need our data to run, predictability improves as privacy vanishes, a consequence of computers making record keeping and record sharing easier and cheaper.

"If I can impart one piece of advice through this book, it's this: we will not win by shaking our fists in the air at technology. A better solution is to familiarize ourselves with how these tools work; to understand how they can be used legitimately in the service of public and consumer empowerment, better living, learning, and loving; and also to understand how they can be abused.

"I'm not yet entirely comfortable in this world. I, too, grow cold at the thought of robots peering down at me, anticipating my location in the next few seconds, few minutes, perhaps years from now, or of cops in patrol cars looking at me with a narrowed eye, seeing a 10 percent probability of my bouncing a check or an 80 percent chance of my committing a parking violation in the next hour. All I do know is that my discomfort won't stop the winds that are revealing my intentions, purchases, illnesses, hopes, and fears to the world, showing my life more clearly for what it is. These capabilities emerging from Silicon Valley and Washington, from labs, offices, and garages, are what the military refers to as 'force multipliers,' like mustard gas during World War I or night-vision goggles during Desert Storm. The thing about force multipliers is that once they're out of the box, they don't go back in."