• That moment when you think “you know, I should put this little script I cobbled together up on github, someone might find this useful.” Then you do a little search and find 18 other similar tools in varying states of disrepair…

      (in this case, a more customizable xdg-open replacement)

  • Privacy Inequality

    The idea of privacy as a modern form of inequality has been rattling around in my head for a while now, and I wanted to jot down some thoughts, particularly in light of the recent rise of Mastodon.

    Typically, when people talk about inequality, they are focused on the obvious forms of socioeconomic inequality that result in advantages being conferred to some groups and withheld from others. The most obvious example is economic inequality–the recognition that economic benefits accrue primarily to the wealthy. But there’s a wide range of other forms of inequality out there, most of which are incredibly old and are structural in nature. For example, zoning laws frequently allow polluting industries to be built up next to minority communities, resulting in increasing environmental inequality. Jobs occupied by those lower on the economic ladder are more likely to be subject to unsafe workplaces, resulting in health inequality. And these same communities are the least likely to have the political and economic power to change these circumstances, an example of political inequality.

    In the world of software and technology, we’ve seen the rise of surveillance capitalism, defined as the “widespread collection and commodification of personal data by corporations.” In this new world, individuals are, either unknowingly or voluntarily, subject to vast data collection operations which scoop up, collect, and connect these datasets. These datasets are then fed into systems designed to derive additional data about individuals–data about their economic and political interests, personal relationships, consumption patterns, and so forth.

    Today, this massive apparatus is used to deliver hyper-targeted messages intended to influence purchasing decisions, voting decisions, and so forth (though just how effective these techniques are is the subject of significant debate).

    However, the uses of these data are vast, and they’ll soon be used (and in some cases are already being used) to influence things like hiring decisions, insurance rates, loan approvals, and so forth. The result is that one poor choice, one incorrectly interpreted data point, one broken or biased algorithm, could result in individuals being denied access to critical social and economic infrastructure.

    Until and unless governments catch up, these trends will only continue. That means individuals have to protect themselves.

    Unfortunately, protecting one’s privacy requires knowledge, skills, and resources that are often the domain of a select few. As a result, privacy itself is increasingly becoming a mark of privilege.

    Continue reading...