Available May 27 on iTunes and in stores as part of Zeds Dead’s “Somewhere Else” EP.

Somewhere Else EP:

01 Collapse [ft. Memorecks]
02 Hadouken
03 Lost You [ft. Twin Shadow and D’Angelo Lacy]
04 Bustamove
05 Stoned Capone [ft. Omar LinX and Big Gigantic]
06 Where Are You Now [ft. Dirtyphonics and Bright Lights]
07 Dead Price [ft. Sean Price]
08 Blink [ft. Perry Farrell]

© Mad Decent Protocol, LLC

  • The balanced scorecard supplemented traditional financial measures with criteria that measured performance from three additional perspectives—those of customers, internal business processes, and learning and growth.
  • Most companies’ operational and management control systems are built around financial measures and targets, which bear little relation to the company’s progress in achieving long-term strategic objectives. Thus the emphasis most companies place on short-term financial measures leaves a gap between the development of a strategy and its implementation.
  • The scorecard introduced four new management processes that, separately and in combination, contribute to linking long-term strategic objectives with short-term actions: translating the vision, communicating and linking, business planning, and feedback and learning.

  • Another way to put it: For-profit “sharing” represents by far the fastest-growing source of un- and under-regulated commercial activity in the country. Calling it the modern equivalent of an ancient tribal custom is a rather ingenious rationale for keeping it that way. After all, if you’re a regulator, it’s easy to crack down on the commercial use of improperly zoned and insured property. But what kind of knuckle-dragger would crack down on making friends?
  • If I said my startup was part of the “loving economy,” there’s a good chance I could find a poll showing that people are pro-love, but it wouldn’t establish anything about my company.
  • it’s helpful to separate the services that more or less existed before the sharing economy from those whose offerings are relatively novel.
  • The same goes for a variety of other sharing-economy services that are simply cheaper or more convenient versions of services we’re accustomed to using.
  • The tendency of some sharing economy companies to undermine worker protections and bid down the value of labor is the phenomenon’s most alarming feature.
  • The second sharing-economy category consists of companies that offer a familiar service with a twist—say, a more customized product.
  • Airbnb doesn’t release data that would indicate how much its users value the social aspect of the service. But, while there is probably a core group who are genuinely out to bond with their hosts, the company’s own moves suggest its users care much more about the authenticity of their surroundings than befriending hosts.
  • The most exotic part of the sharing economy—companies that broker experiences it would otherwise be difficult, if not impossible, to have.
  • (Feastly) suggests that users enjoy the social dimension of their meal, not just the gastronomic.
  • But even here, in the most share-y corner of the sharing economy, it’s still unlikely that community and connectedness are the primary motivation (GrubWithUs)
  • In December, the Columbia Journalism Review’s Ryan Chittum suggested an interesting thought experiment: Replace “sharing economy” with “sublet economy,” and see how you feel about it then. “Subletting,” he wrote, is both a more accurate and complicated term: “It raises questions about the rights of neighbors and of owners not to have their building turned into a hotel—not to mention the ability of the government to tax these transactions.”
  • Brad Burnham: “What we’re talking about is the natural tendency of capitalism to consistently find a more efficient way of delivering something,” he says. “It’s information technology lowering transaction costs and revealing assets that can be utilized.”

Backpack, a Y Combinator-backed startup, is a peer-to-peer marketplace that connects shoppers with travelers, letting consumers buy overseas products at a discount. Shoppers gain access to foreign goods by paying travelers headed to their country a fee to purchase and deliver the items.

  • people whose confidence is more closely tied to the strength of their romantic relationship—or those with higher levels of relationship-contingent self-esteem, in psych-speak—are more likely to use the social networking site to broadcast their happiness.
  • It makes sense that relationship-contingent self-esteem, or RCSE, which has previously been linked to lower overall self-esteem and higher social anxiety, could lead someone to seek validation by systematically “liking” each of their partner’s status updates or insisting on making things Facebook official
  • Introverts were also more likely than extroverts to use Facebook both to show off their relationships and to keep tabs on their partners’ activity, even though past research has shown that extroverts tend to be more active on Facebook and to have larger online networks
  • Relationship-contingent self-esteem didn’t necessarily mean the relationship itself was lacking; in fact, the same people who posted couple-y items more frequently also tended to be more satisfied with their partners than those who did not. In other words—to the sure disappointment of many a cynic—when it comes to romance, the oversharers may not be trying to compensate for anything.

The passages worth keeping:

  • Speaking earlier this year, Jim Farley, a senior Ford executive, acknowledged that “we know everyone who breaks the law, we know when you’re doing it. We have GPS in your car, so we know what you’re doing. By the way, we don’t supply that data to anyone.” That last bit didn’t sound very reassuring and Farley retracted his remarks. As both cars and roads get “smart,” they promise nearly perfect, real-time law enforcement. Instead of waiting for drivers to break the law, authorities can simply prevent the crime.
  • Thanks to sensors and internet connectivity, the most banal everyday objects have acquired tremendous power to regulate behaviour.
  • In this context, Google’s latest plan to push its Android operating system on to smart watches, smart cars, smart thermostats and, one suspects, smart everything, looks rather ominous. In the near future, Google will be the middleman standing between you and your fridge, you and your car, you and your rubbish bin, allowing the National Security Agency to satisfy its data addiction in bulk and via a single window.
  • This “smartification” of everyday life follows a familiar pattern: there’s primary data – a list of what’s in your smart fridge and your bin – and metadata – a log of how often you open either of these things or when they communicate with one another.
  • In addition to making our lives more efficient, this smart world also presents us with an exciting political choice. If so much of our everyday behaviour is already captured, analysed and nudged, why stick with unempirical approaches to regulation? Why rely on laws when one has sensors and feedback mechanisms? If policy interventions are to be – to use the buzzwords of the day – “evidence-based” and “results-oriented,” technology is here to help.
  • This new type of governance has a name: algorithmic regulation. In as much as Silicon Valley has a political programme, this is it. Tim O’Reilly […] has been its most enthusiastic promoter. In a recent essay that lays out his reasoning, O’Reilly makes an intriguing case for the virtues of algorithmic regulation – a case that deserves close scrutiny both for what it promises policymakers and the simplistic assumptions it makes about politics, democracy and power.
  • To see algorithmic regulation at work, look no further than the spam filter in your email. Instead of confining itself to a narrow definition of spam, the email filter has its users teach it. Even Google can’t write rules to cover all the ingenious innovations of professional spammers. What it can do, though, is teach the system what makes a good rule and spot when it’s time to find another rule for finding a good rule – and so on.
  • O’Reilly draws broader philosophical lessons from such technologies, arguing that they work because they rely on “a deep understanding of the desired outcome” (spam is bad!) and periodically check if the algorithms are actually working as expected (are too many legitimate emails ending up marked as spam?).
  • The principle behind “algorithmic regulation” would be familiar to the founders of cybernetics – a discipline whose very name means “the science of governance”.
  • This principle, which allows the system to maintain its stability by constantly learning and adapting itself to the changing circumstances, is what the British psychiatrist Ross Ashby, one of the founding fathers of cybernetics, called “ultrastability”.
  • This is no trivial departure from how the usual technical systems, with their rigid, if-then rules, operate: suddenly, there’s no need to develop procedures for governing every contingency, for – or so one hopes – algorithms and real-time, immediate feedback can do a better job than inflexible rules out of touch with reality.
  • Such systems, however, are toothless against the real culprits of tax evasion – the super-rich families who profit from various offshoring schemes or simply write outrageous tax exemptions into the law. Algorithmic regulation is perfect for enforcing the austerity agenda while leaving those responsible for the fiscal crisis off the hook.
  • With his belief that algorithmic regulation is based on “a deep understanding of the desired outcome”, O’Reilly cunningly disconnects the means of doing politics from its ends. But the how of politics is as important as the what of politics – in fact, the former often shapes the latter. Everybody agrees that education, health, and security are all “desired outcomes”, but how do we achieve them?
  • Today, when the presumed choice is between the digital and the analog or between the dynamic feedback and the static law, that ideological clarity is gone – as if the very choice of how to achieve those “desired outcomes” was apolitical and didn’t force us to choose between different and often incompatible visions of communal living.
  • By assuming that the utopian world of infinite feedback loops is so efficient that it transcends politics, the proponents of algorithmic regulation fall into the same trap as the technocrats of the past.
  • while Singapore’s leaders might believe that they, too, have transcended politics, it doesn’t mean that their regime cannot be assessed outside the linguistic swamp of efficiency and innovation – by using political, not economic benchmarks.
  • As Silicon Valley keeps corrupting our language with its endless glorification of disruption and efficiency – concepts at odds with the vocabulary of democracy – our ability to question the “how” of politics is weakened. Silicon Valley’s default answer to the how of politics is what I call solutionism: problems are to be dealt with via apps, sensors, and feedback loops – all provided by startups.
  • The intelligence services embraced solutionism before other government agencies. Thus, they reduced the topic of terrorism from a subject that had some connection to history and foreign policy to an informational problem of identifying emerging terrorist threats via constant surveillance.
  • the Italian philosopher Giorgio Agamben discussed an epochal transformation in the idea of government, “whereby the traditional hierarchical relation between causes and effects is inverted, so that, instead of governing the causes – a difficult and expensive undertaking – governments simply try to govern the effects”. 
  • this shift is emblematic of modernity. It also explains why the liberalisation of the economy can co-exist with the growing proliferation of control – by means of soap dispensers and remotely managed cars – into everyday life. “If government aims for the effects and not the causes, it will be obliged to extend and multiply control. Causes demand to be known, while effects can only be checked and controlled.
  • Consider how Fred Wilson, an influential US venture capitalist, frames the subject. “Health… is the opposite side of healthcare,” he said at a conference in Paris last December. “It’s what keeps you out of the healthcare system in the first place.” Thus, we are invited to start using self-tracking apps and data-sharing platforms and monitor our vital indicators, symptoms and discrepancies on our own.
  • O’Reilly: “You know the way that advertising turned out to be the native business model for the internet?” he wondered at a recent conference. “I think that insurance is going to be the native business model for the internet of things.”
  • accepting such tracking systems is framed as an extra benefit that can save us some money. But when do we reach a point where not using them is seen as a deviation – or, worse, an act of concealment – that ought to be punished with higher premiums?
  • The unstated assumption of most such reports is that the unhealthy are not only a burden to society but that they deserve to be punished (fiscally for now) for failing to be responsible. For what else could possibly explain their health problems but their personal failings? It’s certainly not the power of food companies or class-based differences or various political and economic injustices. […] those injustices would still be nowhere to be seen, for they are not the kind of stuff that can be measured with a sensor. The devil doesn’t wear data. Social injustices are much harder to track than the everyday lives of the individuals whose lives they affect
  • algorithmic regulation offers us a good-old technocratic utopia of politics without politics. Disagreement and conflict, under this model, are seen as unfortunate byproducts of the analog era – to be solved through data collection – and not as inevitable results of economic or ideological conflicts
  • Silicon Valley wants to rid us of government institutions. Its dream state is not the small government of libertarians – a small state, after all, needs neither fancy gadgets nor massive servers to process the data – but the data-obsessed and data-obese state of behavioural economists.
  • the algorithmic regulation lobby advances in more clandestine ways. They create innocuous non-profit organisations like Code for America which then co-opt the state – under the guise of encouraging talented hackers to tackle civic problems. Such initiatives aim to reprogramme the state and make it feedback-friendly, crowding out other means of doing politics.
  • Cash-strapped governments welcome such colonisation by technologists – especially if it helps to identify and clean up datasets that can be profitably sold to companies who need such data for advertising purposes. […] after all state assets have been privatised, data is the next target. For O’Reilly, open data is “a key enabler of the measurement revolution”
  • To compare the welfare state with the algorithmic state on those grounds is misleading.
  • Silicon Valley’s offer is clear: thanks to ubiquitous feedback loops, we can all become entrepreneurs and take care of our own affairs! As Brian Chesky, the chief executive of Airbnb, told the Atlantic last year, “What happens when everybody is a brand? When everybody has a reputation? Every person can become an entrepreneur.”
  • “Reputation does a better job of ensuring a superb customer experience than any amount of government regulation”. Someone, somewhere will eventually rate you as a passenger, a house guest, a student, a patient, a customer. […] to make reputation into a feedback-friendly social net that could protect the truly responsible citizens from the vicissitudes of deregulation.
  • O’Reilly wants governments to be “adopting them where there are no demonstrable ill effects”. But what counts as an “ill effect” and how to demonstrate it is a key question that belongs to the how of politics that algorithmic regulation wants to suppress. It’s easy to demonstrate “ill effects” if the goal of regulation is efficiency but what if it is something else?
  • while the welfare state assumes the existence of specific social evils it tries to fight, the algorithmic state makes no such assumptions. The future threats can remain fully unknowable and fully addressable – on the individual level.
  • this growing cult of resilience masks a tacit acknowledgement that no collective project could even aspire to tame the proliferating threats to human existence – we can only hope to equip ourselves to tackle them individually.
  • Just because Silicon Valley is attacking the welfare state doesn’t mean that progressives should defend it. First, even leftist governments have limited space for fiscal manoeuvres, as the kind of discretionary spending required to modernise the welfare state would never be approved by the global financial markets. Second, the leftist critique of the welfare state has become only more relevant today when the exact borderlines between welfare and security are so blurry. This will expand government’s hold over areas of life previously free from regulation.
  • For MacBride the conclusion was obvious. “Political rights won’t be violated but will resemble those of a small stockholder in a giant enterprise”.
  • In other words, since we are all entrepreneurs first – and citizens second, we might as well make the most of it.
  • Technophobia is no solution. Progressives need technologies that would stick with the spirit, if not the institutional form, of the welfare state, preserving its commitment to creating ideal conditions for human flourishing. Even some ultrastability is welcome.
  • Creating the right conditions for the emergence of political communities around causes and issues they deem relevant would be another good step.
  • What can be specified is the kind of communications infrastructure needed to abet this cause: it should be free to use, hard to track, and open to new, subversive uses. Silicon Valley’s existing infrastructure is great for fulfilling the needs of the state, not of self-organising citizens.
  • The fault is not with that amorphous entity but, first of all, with the absence of robust technology policy on the left – a policy that can counter the pro-innovation, pro-disruption, pro-privatisation agenda of Silicon Valley. In its absence, all these emerging political communities will operate with their wings clipped. Whether the next Occupy Wall Street would be able to occupy anything in a truly smart city remains to be seen: most likely, they would be out-censored and out-droned.
  • MacBride understood all of this in 1967. "Given the resources of modern technology and planning techniques," he warned, "it is really no great trick to transform even a country like ours into a smoothly running corporation where every detail of life is a mechanical function to be taken care of."
  • Stanislaw Lem: “Society cannot give up the burden of having to decide about its own fate by sacrificing this freedom for the sake of the cybernetic regulator.”
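The smart-fridge bullet above draws a line between primary data (what is in the fridge) and metadata (a log of when it is opened or when it talks to other devices). A minimal, entirely hypothetical sketch of that split — the class and field names are illustrative, not any real product's API:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SmartFridge:
    # Primary data: the contents themselves.
    contents: list
    # Metadata: a log of door openings, recorded even when nothing inside changes.
    door_events: list = field(default_factory=list)

    def open_door(self, at: datetime) -> None:
        # Usage is captured as a side effect of ordinary behaviour.
        self.door_events.append(at)

fridge = SmartFridge(contents=["milk", "eggs"])
fridge.open_door(datetime(2014, 7, 20, 8, 30))
```

The point of the sketch is that the metadata accumulates passively: the owner only opens the door, yet a behavioural log grows alongside the primary data.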
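The spam-filter bullet above describes regulation by feedback: no fixed rule set, just a system that users keep teaching. A toy sketch of that loop, assuming nothing about Google's actual filter — a crude word-count classifier updated by user corrections:

```python
from collections import defaultdict

class FeedbackSpamFilter:
    """Toy regulation-by-feedback: instead of rigid if-then rules,
    the filter learns word counts from user corrections."""

    def __init__(self):
        self.spam_counts = defaultdict(int)
        self.ham_counts = defaultdict(int)

    def teach(self, message: str, is_spam: bool) -> None:
        # A user marking a message is the feedback signal.
        counts = self.spam_counts if is_spam else self.ham_counts
        for word in message.lower().split():
            counts[word] += 1

    def is_spam(self, message: str) -> bool:
        # Classify by which vocabulary the message resembles more.
        spam_score = sum(self.spam_counts[w] for w in message.lower().split())
        ham_score = sum(self.ham_counts[w] for w in message.lower().split())
        return spam_score > ham_score

f = FeedbackSpamFilter()
f.teach("win free money now", True)
f.teach("lunch meeting moved to noon", False)
```

The behaviour of `is_spam` is never written down as a rule; it is whatever the accumulated corrections imply — which is exactly the "ultrastability" the later bullets contrast with rigid if-then systems.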

Yes, I know, it’s practically the whole article. But I can’t help it; let’s just say Morozov’s thinking resonates with me in a particular way. It’s crystalline, vigorous, and manages to give a name and substance to many of those lines of reasoning we stumble upon more and more often as we navigate our near-futuristic everyday lives.