The reality is, technology has opened up all kinds of innovative ways for government to expand its iron grip, advance its social engineering projects, and intrude into our lives. And like it or not, this is just the tip of the iceberg.
And it is not just the government that wants to keep an eye on you. Corporations want a piece of that action, too.
Surveillance exists in every form imaginable, from the cameras on police cars to the monitoring equipment appearing on our city streets. Satellites spy on your property, and cellphone towers keep track of your every move. According to Liat Clark, it is a battle we can't win, so join in. Embrace the madness.
Never mind that the Constitution does not allow warrantless searches or unreasonable searches and seizures. Never mind that our rights may be kind of important to us, and that we tend not to take kindly to society intruding on our privacy.
Liat Clark says it is too late. So, how does the technology writer propose we deal with it?
According to Clark's article, "It's already too late to stop the ubiquitous tracking and monitoring of the public through biometrics," says Peter Waggett, Programme Leader at IBM's Emerging Technology Group. "We need to stop worrying about prevention, and start working out how to make the most of data garnered from that kind of surveillance."
"We're fighting the wrong battle when we ask should we stop people being observed. That is not going to be feasible. We need to understand how to use that data better," urged Waggett, who was speaking as part of a Nesta panel debate on what biometrics mean for the future of privacy.
"I've been working in biometrics for 20 years, and it's reaching a tipping point where it's going to be impossible not to understand where people are and what they are doing. Everything will be monitored. It's part of the reason why when we put together the definition of biometrics it included biological and behavioural characteristics -- it can be anything."
To back up his point, Waggett identified a few of the futures once portrayed in science fiction movies that are now a reality. Minority Report is generally the go-to film for these kinds of comparisons. But it's the commercial aspects of the film Waggett flagged up, rather than the gesture technology. In the film, the protagonist walks into a shop where an advertisement immediately pops up and draws on his past preferences to offer up some suggestions. "The one thing they got wrong is you won't recognise you're being scanned -- the flashing red light in the film is for effect, but all that's now feasible."
It is a perfect example of how we need to be aware, now more than ever, of what data we are giving up, and, for companies, how best that data can be used without infringing on customer privacy and potentially threatening that relationship.
"We're fighting the wrong battle when we ask should we stop people being observed. That is not going to be feasible. We need to understand how to use that data better," urged Waggett, who was speaking as part of a Nesta panel debate on what biometrics mean for the future of privacy.
"I've been working in biometrics for 20 years, and it's reaching a tipping point where it's going to be impossible not to understand where people are and what they are doing. Everything will be monitored. It's part of the reason why when we put together the definition of biometrics it included biological and behavioural characteristics -- it can be anything."
To back up his point, Waggett identified a few of the futures once portrayed in science fiction movies, now a reality. Minority Report is generally the go to film for these kinds of comparisons. But it's the commercial aspects of the film Waggett flagged up, rather than the gesture technology. In the film, the protagonist walks into a shop where an advertisement immediately pops up and draws on his past preferences to offer up some suggestions. "The one thing they got wrong is you won't recognise you're being scanned -- the flashing red light in the film is for effect, but all that's now feasible."
It is a perfect example of how we need to be aware, now more than ever, of what data we are giving up, and, for companies, how best that data can be used without infringing on customer privacy and potentially threatening that relationship.
Clark goes on to tell us about mannequins (EyeSee mannequins) in stores that gather age, sex and racial data on retail customers using facial recognition, so that stores can tailor their marketing accordingly. Another system uses beacon technology with smartphones to automatically alert customers to product details via an app. When a customer comes within 100 meters of one of the mannequins, they receive an alert about the available content, including details on the items the mannequin is wearing and links to purchase them straight from the shop's website. It operates 24/7, so a passerby can buy an item while window-shopping, without ever entering the store.
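For readers who want a sense of how that proximity alert works under the hood, here is a minimal, hypothetical sketch in Python. Clark's article does not describe any implementation; the distance estimate (a standard log-distance path-loss calculation from Bluetooth signal strength) and every name, value, and threshold below are my own illustrative assumptions, not the retailer's actual code.

```python
from typing import Optional

def estimate_distance_m(rssi_dbm: float, tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough distance estimate (metres) from a beacon's received signal
    strength, using the standard log-distance path-loss model."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def maybe_alert(rssi_dbm: float, threshold_m: float = 100.0) -> Optional[str]:
    """Return a promotional alert if the shopper appears to be within range
    of the display; otherwise return None."""
    distance = estimate_distance_m(rssi_dbm)
    if distance <= threshold_m:
        return (f"~{distance:.0f} m away: see the items this mannequin is "
                "wearing and buy them from the shop's website.")
    return None

if __name__ == "__main__":
    # Simulated signal-strength readings as a shopper walks toward the storefront.
    for rssi in (-110.0, -90.0, -70.0):
        alert = maybe_alert(rssi)
        print(f"RSSI {rssi:.0f} dBm -> {alert or 'out of range, no alert'}")
```

The point of the sketch is simply that no camera or check-in is needed: a phone quietly reporting signal strength is enough to decide you are "close enough" to be advertised to.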
As this technology gains traction, customers may have the option to opt out, but they will become more willing to open up their data as the "rewards" grow increasingly attractive. Already retailers have the option of asking customers to sign in to the app with Facebook or Google+, as with most apps, which could potentially open up a whole realm of analytics options depending on the user's privacy settings. Add a camera to that mannequin, and using Facebook's facial recognition tools it could soon be asking you -- by name -- how you feel today, or pointing out that your clothes are looking particularly shabby.
"The pressure to ID people is becoming more and important with things like the internet of things," points Waggett. If we are to securely make the most of those future networks, we're going to have to free up more of our biometrics. "Google Glass wants to block facial recognition to stop people using invasive technology, but I think a lot of these things can be used for good."
STOP!
Did you read that part of Clark's article? The individual being quoted, Peter Waggett, Programme Leader at IBM's Emerging Technology Group, says that he is willing to accept the evil of monitoring and surveillance because "a lot of these things can be used for good."
That is how they get you. Good intentions.
The common good.
For the good of the community.
Darkness always appears as an angel of light.
"Biometric systems are becoming much more accurate and ubiquitous," said Waggett. "It is impossible not to be identifiable by some kind of signal you're leaving behind. Accuracy is going up almost exponentially and we are dealing with concerns about privacy and how we map that.
"But trying to stop this would be fighting the wrong battle. The information is out of the bottle already -- we have to deal with the issues surrounding it now. Embrace the challenge of what we've got, embrace understanding it and focus on what we can do with that new data."
"Biometric systems are becoming much more accurate and ubiquitous," said Waggett. "It is impossible not to be identifiable by some kind of signal you're leaving behind. Accuracy is going up almost exponentially and we are dealing with concerns about privacy and how we map that.
"But trying to stop this would be fighting the wrong battle. The information is out of the bottle already -- we have to deal with the issues surrounding it now. Embrace the challenge of what we've got, embrace understanding it and focus on what we can do with that new data."
In other words, they argue, "It's going to happen, whether you like it or not. So quit fighting it, and embrace the change."
I can remember a great many tyrannies that said the same thing, as they weaseled their way into power.
-- Political Pistachio Conservative News and Commentary
Get ready to have your biometrics tracked 24/7 - Wired, U.K.
I hope that there are emerging technologies to block the "ubiquitous" scanning for facial and behavioral recognition. I am already perplexed that computers, smartphones and televisions don't have "hard" off switches so people can ensure that hackers cannot use them to spy on us when they are turned off. How simple a solution is that? And yet, they aren't being offered. I should think there would be a ready market for such a simple thing. I know the fact that my current laptop did NOT come with a built-in camera was a selling point for me. I can connect a camera if I want to. I wish I had the same simple choice for the microphone.
There are also wallets now being made with a simple, fine stainless steel mesh that acts as a Faraday cage, so others cannot use scanning tech to read our credit cards.
Next up, I'd like to see some personal jamming tech so that I can protect my privacy from drones or other publicly installed surveillance units that might be near my home. I'd also like to see hats or makeup that block facial recognition tech. (I prefer hats to makeup, personally. I know there is makeup out there that will do it, but I hesitate to put huge black slashes across large sections of my face, since that would attract notice when the point is to avoid it. I'm not a spy or anything, I just value my right to privacy.)