image: TV by Anuar Zhumaev http://thenounproject.com/term/tv/99975/, CC-BY 3.0
This recent Daily Beast news article has particular relevance to me as we’ve recently bought one of these Samsung tellies (although nowhere near as humongous as the one in this picture from the equivalent BBC article!).
The idea of “smart devices” is quite a cool one, and appears quite benign. Look, my telly is remembering the stuff I watch and making life easier for me.
Except the TV isn’t the thing that’s smart. The “smartness” of the device isn’t contained within the machine; it comes from the data centre that holds all the information we give it. The data about my watching habits is collected via that frightfully useful internet connection (on a TV! Imagine!), which transfers the information back to the TV manufacturer or another third party. They can then provide the services they think I will find most appealing, encourage me to use the device more, and gain another product they can sell: the data about my family’s behaviour.
Now, I’m not going to be sending my TV back. I’m not shocked or surprised by the news story. I might turn off voice recognition, which I never used anyway. It’s a really cool TV, especially after the little bucket of visual fuzz we had previously.
But this goes back to my last post about being critical about the technology we use. Do we as a society understand enough about how our data is collected and exploited? The fact that the news article has gained such a lot of traction today suggests not.
TVs have been a part of our living rooms for many decades, and we’ve got used to their status as receivers of information. We now have to adjust to the idea that they are computers, and we should think about them as such.
As we grow accustomed to a world where more of our devices are connected (wifi kettle, anyone?), understanding this becomes an ever more important part of digital literacy.