Monday, May 26, 2014

Detangler: Start

I want to build a system that helps and encourages people to introspect. It should be designed to aid them in understanding their own goals and motivations. It should be fun and easy to use, and have emergent, anonymous social properties that form positive feedback cycles. Kind of like a directed journaling process.

Some friends and I have been talking about this for a while. It's time to start fleshing out the idea and giving serious thought to a prototype implementation.

It's called the "Detangler", because "tangles" is our little loosely-grounded symbol for the symbols that people attach meaning to: the little bits that make you up, small pieces all the way up to huge aggregates. Puzzle pieces, tree branches and roots, whatever metaphor you choose. A tangle can be as big as your personal goal of living a healthy and happy life, and you can break that all the way down to choosing to eat a salad today for lunch because you want to be healthy later instead of fat and sassy from greasy food right now.

Thursday, May 15, 2014

Collaborative photo maps

Flickr's map site is a pretty cool use of publicly posted information. When people geotag their photos, the photos show up on the map, and you can explore the globe and click on photo dots to see what people have explored.

Here's a park in the Adirondacks in New York.

But there's an additional feature I didn't notice until today: you can actually search by the tags as well as location, adding another axis to your adventures.

Now we can look at architecture in Seoul while sitting on our butts.
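If you'd rather poke at the same kind of data programmatically, Flickr also exposes a public search API. Something like the rough sketch below should work, assuming you have your own API key; the Seoul coordinates are just approximate values I'm plugging in for illustration.

```python
import requests

# Hypothetical values: substitute your own Flickr API key.
params = {
    "method": "flickr.photos.search",
    "api_key": "YOUR_API_KEY",
    "tags": "architecture",
    "lat": 37.55,          # approximate latitude for central Seoul
    "lon": 126.99,         # approximate longitude for central Seoul
    "radius": 10,          # search radius in kilometers
    "has_geo": 1,          # only geotagged photos
    "format": "json",
    "nojsoncallback": 1,
}

resp = requests.get("https://api.flickr.com/services/rest/", params=params)
for photo in resp.json()["photos"]["photo"]:
    print(photo["title"])
```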

As long as people realize that their travels might be tracked by posting these photos, I guess this is a pretty slick use of collaborative data. For more sort-of-related photographic time-wastage, check out Microsoft Photosynth.

Old ways

Old people have a hard time with technology. Supposedly, this is just the way of things. Is that pattern due to technology being poorly designed for the elderly? (Certainly true, some of the time.) Is it that the elderly do not want to learn or switch to assistive technology that is designed for their needs? Or is it more due to the fact that the worlds they grew up in were so much less flexible and fungible than the technology-infused crazy place we currently live in? Will the younger generations of today have similar problems when they are elderly, with the new technology of tomorrow? Some researchers think so, but I have my doubts.

Sure, technology is always evolving rapidly, but the kids of today are (hopefully) learning how manipulable information and computing are. Maybe they don't know it in those words, but I feel like someone who grows up with these systems gets a general intuition about them, and can plug into new things in a generic, always-learning way. Perhaps it's simply the fact that computing really is, in some ways, ubiquitous (at least for us privileged first-worlders).

Is it true that elderly people learn slower? Not necessarily. Perhaps this is an ageist response, but I feel like many elderly folks are not stopped short by bad design or by impairments to hearing and vision; when they approach a new technology, instead of getting interested and curious, they just get grumpy. If we can dodge the grumpiness as we catapult into the future, I suspect that we can all grow old gracefully, remembering that to stay inflexible is to fall dangerously behind.

Of course, this is all much more complex than a flippant blog post can delve into. I reserve the right to shake my fist and yell "Damn kids! Get off my cyber-lawn!" like every generation does when they are confounded by their descendants, feeling a little guilty perhaps that we didn't build ourselves (and, by extension, later generations as well) a better way.

Podcast episode 1

Well, okay. I'm not really a podcast person, but I edited a little of my band's music into the intro and outro of a conversation I had with some friends about security and ethics: encryption, hacking, and what data you put on the internet about yourself. There's even a moral at the end, I guess?


Pareidolica Podcast (6mb, 6min)

The void stares back

Did you know that it is apparently legal and totally accepted for billboards in public places to do face tracking and recognition? Oh, no worries, they don't look you up on social networking sites and display something more to your tastes. Yet. They only use the face tracking to measure how long you are "engaged" and what gender, age, and ethnic group you are.

That's not scary at all.

They claim that no images are recorded and "no uniquely identifiable data are extracted". We all know how well large anonymized datasets have worked out in the past, so I'm sure it's totally fine like the advertisers say.
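Part of what makes this unsettling is how low the technical bar is. Here's a toy sketch, nothing like whatever the billboard vendors actually run, that uses the stock OpenCV face detector and a webcam to time how long a face stays "engaged" in front of the camera.

```python
import time
import cv2

# Haar cascade face detector shipped with the opencv-python package
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)   # default webcam
engaged_since = None        # when a face first appeared

while True:                 # stop with Ctrl-C
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    if len(faces) > 0 and engaged_since is None:
        engaged_since = time.time()   # someone started "engaging"
    elif len(faces) == 0 and engaged_since is not None:
        print(f"engagement lasted {time.time() - engaged_since:.1f}s")
        engaged_since = None          # they looked away or left

cap.release()
```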

Advertising leads some technological curves, by its nature. Advertisers want to penetrate the most minds, and not in an obvious way; they want you to believe you wanted that object all along, that the desire arose inside you. "If you're watching it, it's for you." They want to improve their tools so that we don't (or even can't) notice them at work, re-architecting our inner desires.

The advertising corporations are watching you from public places. If I sat across from that "active billboard" in the subway and wrote down a list of who made eye contact with me and what "demographics" they fit in, I would seem like a psychopath. But then, nobody ever said advertisers weren't psychopathic...

Wednesday, May 14, 2014

Bubbles

Filter bubbles are one thing, where the algorithms that build the walls you live in might prevent you from seeing outside of your one-way mirror. Facebook can be a filter bubble, and Google or Bing can be a different kind of search bubble. But what about the nature of news and bias in the first place? If you subscribe to News Magazine X, you are getting fed their bias. This is the way it always has been, but the internet allows people to carve further, deeper niches.

In a way this is similar to how like-minded communities collaborated before the internet: people got together, organized around what they agreed upon, and boom, collaboration developed. You had zine culture. You had think tanks. You had obscure groups with obscure goals, editing and sending newsletters.

The problem with the internet search and news filter bubbles is that they are somewhat illusory, in a sense that's hard to quantify. Both meatspace news and internet news feed you a line, and to see things outside that, you have to work at it. But the internet version of news, possibly due to its speed and seeming all-encompassing-ness, deludes us. I look at a single perspective on the news (pick one) and I can think: "Oh, I have amassed all the viewpoints on this news!"

It's just not in our nature to constantly be deconstructing bias in the news, or always trying to aggregate all sides of an issue. Is it because we're tribal? Destined to form small groups and stick with them, unable to collaborate more broadly?

How do we engage in a more meaningful sense across these divides? It seems that technological tools could help bring us together, rather than keep us in our little filter bubbles. But how?

Friday, May 9, 2014

Unstable feedback loops



Figure 1. The major disadvantage of modern technological society.

We have an addiction problem, folks. We like shiny things. We like new things. And we especially love shiny new things. Nothing is more exciting than a shiny new technological toy. But we don't understand what the app permissions mean as we load a new thing onto our phones. We don't understand how our privacy is slowly being eaten away by big data and the corruption of the NSA. We willingly put our lives online, mash them up, shake them around, and let big corporations, whose only interest is making money off our flaccid eyeballs, see what they can do with the data.

I am no Luddite. I'm not saying that the problem is that we can't go back to better times, starving in the woods huddled in unheated cabins hoping bears don't eat our faces. And I'm not saying that the bad part is our ignorance. The bad part is that we can't see that we're caught in a trap, one whose slow-acting poison is mixed from capitalism, laziness, and cynicism. We get to the bottom of the cycle and shrug our collective shoulders: oh well, guess Facebook is going to store my data for all time and use it for whatever fun and nefarious purposes they can come up with! Nothing we can do!

Instead of stopping, standing up, and wondering how it could be different, we slog onward and just keep adding more poisons to our plate. An ad-driven internet? Sure, that's pretty nasty. How about we kill net neutrality so that censorship and class divisions can really kick into high gear on the internet? Sounds great. Let's let the NSA weaken our infrastructure for its own needs, leaving it weak against their attacks and everyone else's. Groovy. Nothing we can do about that now!

I am no cynic, either. I think there are certainly solutions to these problems. We're just not going to find them by frantically flailing around, refreshing Facebook in a narcissistic haze. We're going to have to study things closely, adjust our uses of technology carefully, and stop assuming that the system has our best interests in mind.

The system just wants to keep careening in the direction it's going. We can sit and watch it and wait if we want, because who knows? Maybe it will careen into an interesting place that's good for everyone. Now, that's the kind of optimism we can't afford.

Birds


What do you see?

The birds are crawling from the depths of your mind. All that ink I splashed on paper and blew on grew into birdlike forms of its own accord.

The wings of technology can lift society up, or they can distract us with flapping wings.

Big Data questions

The White House just released its Big Data report, concluding a 90-day review that drew on input from the President's Council of Advisors on Science and Technology as well as a survey of the public. The review grapples with issues like discrimination, privacy, and power asymmetry, all of which feed back into ethics. The results show that the American public is very concerned about transparency and oversight of practices involving their data; as the report shows, even in the area where people were least worried (collection of location data), a majority were still "very much" concerned.

I certainly fall on the extremely concerned end of the spectrum. Many of these issues involve knock-on effects which are difficult for the average person to understand, such as how similarity analysis can de-anonymize data. These problems go back a long way; I remember the furor around the "anonymous" search logs that AOL Research leaked in 2006. These days, researchers have gone so far as to rigorously prove that supposedly anonymous datasets with certain properties can be reliably de-anonymized.
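For a feel of why "anonymized" is such a slippery promise, here's a toy sketch of a linkage attack, which is the general flavor of those de-anonymization results (not any researcher's actual method; all the names and numbers are invented). The idea is to match a stripped record against public side information by similarity on the few attributes they share.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity over the keys two sparse records have in common."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[k] * b[k] for k in shared)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# "Anonymized" release: the user ID is stripped, but the ratings remain.
anonymous_record = {"movie_17": 5, "movie_42": 1, "movie_99": 4}

# Public side information, e.g. reviews posted under real names (made up).
public_profiles = {
    "alice": {"movie_17": 5, "movie_99": 4},
    "bob":   {"movie_03": 2, "movie_42": 5},
}

# A handful of overlapping data points is often enough to re-identify someone.
best = max(public_profiles,
           key=lambda name: cosine(anonymous_record, public_profiles[name]))
print("most likely identity:", best)
```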

It's also interesting to see that the report focuses on how broken our current expectations of privacy are, and raises concerns that long-running assumptions about how personal information is used in the marketplace will be overrun by the possibilities of big data analytics. As a non-expert, I don't feel informed enough to weigh in on its recommended policy framework for privacy and big data, but it's clear that some big ethical questions about information science are being addressed and formulated into law for the first time, filling various policy vacuums.

There are many complex issues involved. The push-and-pull between law enforcement and our private lives is a strange and obvious ethical quandary; but the report also touches on smaller, trickier questions, such as how we should evaluate government use of commercial data to ensure consistency with our national values and civil liberties. It is clear that the intersection between information technology and ethics is only going to grow more tangled as more of our lives become integrated with data-driven systems.

Here's a Danah Boyd talk from 2012 - she was one of the researchers involved, and has been thinking about privacy and big data for a good long while:


Seeing things

Pareidolia is the very human perceptual problem of experiencing things that are not there: feeling your cell phone buzz in your pocket when you're not carrying it, seeing faces in clouds and trees, hearing music in the babbling of a stream. It is the sensory subset of a psychological effect called apophenia: seeing connections in random data.

We're very good at this, and I am interested in how we see patterns in the developing, technological world around us: we assign meaning to things that may not actually be there. We feel like Facebook is connecting us to our friends similarly to how a face-to-face chat might, but is that really so? What if the Facebook feed is more like random data that we are happily judging as meaningful interaction, carefully curated by algorithms to make us keep coming back for more?

Pareidolica, with a c, is a made-up word for this idea I have that we humans get addicted to pattern-seeking to our detriment, without stepping back to wonder what's really going on inside our minds. What are the ethical impacts of the technology around us? How can we use the patterns it makes to our mutual benefit, without getting trapped in little dopamine addiction loops, refreshing the page, looking for those tiny faces to pop up and give us a burst of fake reality?

While I'm making this blog for a class assignment, who knows -- I might keep it going after. I can use it to cross-post any interesting essays that come out of my flexible option schooling, or any of my strange fictions.