Vladimir Jerić Vlidi: Notes for Net Activism – Net Demagogy discussion forum

[IMPORTANT: these are only notes and propositions inspired by the outline of the panel as published on the ZKM site; NOT FOR PUBLISHING as yet. Quite possibly full of errors, “first takes” and stuff I will regret ever having written. This is assembled only for the purpose of the possible discussion! Some excerpts are adapted from Dispossession by Numbers, a text in progress for Issue 4 of Red Thread Journal (Ed. Zeyno Pekunlu, Banu Karaca, Erden Kosova, Asli Cetinkaya and Vladimir Jerić Vlidi, summer 2017), and from Untitled As Yet, a text in progress for THE LONG 1980s. Constellations of Art, Politics and Identity (Ed. Nick Aikens, Teresa Grandas, Nav Haq, Beatriz Herráez and Nataša Petrešin-Bachelez, Valiz and L’Internationale confederation, Amsterdam, summer 2017)]


If the word of the year 2016 was “post-truth”[1], did the candidate for 2017 already arrive in January?[2] After all, in an era that had to, willingly or not, acknowledge the post-truth state of affairs, the establishment of “alternative facts” arrives as a natural element of such world building, as another corresponding “alt” product and building block of what wants to be called “alt-right”[3]. This is the expected – normal – continuation of the process through which the very concept of the alternative seems to be appropriated, mangled and twisted beyond recognition, to outline today precisely that one worldview that does not allow for any alternative to itself.

The debate on and the analysis of how this phenomenon occurred seem to be mostly unfolding in the space created between two questions: did some genuinely profound disturbance suddenly affect the populations on such a massive scale, or is it just the billionaires’ money[4] well spent?[5] But the process of deconstructing the “alternative” could be observed before, as that is the destiny – not the one that inherently had to be – of many other words that were the building blocks of the so-called “grand narratives” of (mainly) the previous century. We may consider what happened with the concept and the word “revolution”[6]; we could look into how “progress” or “democracy” changed their meanings over time, and witness the former becoming almost a derogative[7], while the latter was reduced to a ruthless no-rules numbers manipulation game[8] in which having and stating any fundamental principles only registers as a burden and gives advantage to opponents. We may also observe the trajectory of “hacking”[9], or “resistance”, or “expertise” and many other words in a similar regard.

Back to the fate of the “alternative”, to its new meaning and the consequences of such a turn. There seem to be a lot of questions, and none that seems easy or obvious to address first. What explains the ease with which the populations willingly forgot what the word used to mean? Is it because of a sense of betrayal of the promise of inclusiveness, or growing worn out by invocations of the future-to-be, or becoming disenchanted with the principles of collective advancement through permanent critique, as embedded in the old meaning of the alternative? Or is it really about the strength of belief in the promise of “taking back” something or making it “great again” carried by the new meaning of the word, which is now looking into the yesterday-that-never-was as the only possible future-that-is? Or is it about the irresistible allure of some algorithmically-engineered, fine-tuned, personalized-to-each-user-profile perfect blend of both?

Did the alternative get confused in time? By cheering for tomorrow! all the time just to eventually become the cry for yesterday!, did it at some point forget to address today?

Was the biggest conceptual problem the (original, Californian) premise of the otherness, otherworldliness, “independence” of the net from the rest of the Universe (cue Barlow)? Was that what created the point of break of the “alternative”, as established during the 1980s, with any tendency to offer a face value, a tit-for-tat in regard to the living, breathing, political reality, and to address the issues of the former society? To understand this, I find it helpful to consider the events of 1968 as “a critique from the left”, and to investigate more into the 1970s, which incubated the “alternatives” of the 1980s,[10] the period when the alternative was still called and understood as avant-garde.


The practice of activism could be observed as always being connected to the idea of the alternative. A simple explanation of activism might, among other definitions, also be “advocating (a certain) alternative”. With “alternative” originally meaning “not this” (sometimes, and with a lot of risk, “anything but this”) but now emptied and then appropriated by the contemporary alt-(right, facts, whatever) to mean “nothing but this”, what does that mean for activism? If “alternative” used to outline what was a minority opinion at a certain moment, a fringe or borderline idea with a potential to become accepted into the mainstream (or not), what changes when the forces in power take the “alt-” prerogative themselves?

Can activism offer anything genuinely “alt” anymore, with seemingly no possibility to (re)establish its inherent environment, without the apparatus of the alternative that has switched to “the other side”?[11] [12]

The forms of organization recognized as activism in this particular domain of concern were historically (and “historically” can now be applied even to the 2000s) connected with several “streams”, so to say, most of which no longer seem applicable,[13] mostly because of a problem that could be described – if observed through the material capacity of activists or “former alternatives” to even take part – as “a matter of scale”.

Alt-hardware or Post-hardware?

Departing from the very first and often self-assembled computers of the 1980s, through the modular (“improvable”, “abusable”, “upgradeable”) commercial concepts of the 1990s and early 2000s, we now live in a world of superdense, unibody, robotically assembled hardware with ever shorter planned obsolescence cycles, and the trend develops towards humanly non-producible, non-repairable, non-hackable nanotechnologies; and, equally if not more important, we also live in the world of “startups” now.

The former means that, except when presented as eccentricities and artistic endeavors, no one would be interested in any of the “alternative” hardware propositions or solutions[14] that are an inch bigger or 50 grams heavier than what corporate robots can make; and the latter marks the change of the entire tendency, of the horizon of thinking, from dreaming of establishing an entire game-changing independent industry (as romanticized by the stories of Gates and Jobs and other “I started in the garage and changed the world” heroes of the previous decades) to brainstorming about the best sales pitch for selling one’s concept to venture capital, or if very successful directly to the Microsofts and Apples of today.

All this also marks the unannounced and unacknowledged but profound core change of the ideology of Silicon Valley and of the corresponding Silicon Values, a transformation that will sooner or later have to be addressed by the very internalities of that particular worldview. At present, what still remains “unspeakable” in the ideology – that the myth of the garage tech entrepreneur was not the latest but probably the last incarnation of the principle of the American Dream, now abandoned with no hope of return – is very obvious in practice (e.g. the story about the Gang of Five oligopoly[15]), and brings an additional layer of cynicism into the very Silicon heart of the matter. What does it mean for activism?[16] [17]

(It is interesting that in Chaos Computer Club’s Frank Rieger’s text “We lost the war. Welcome to the world of tomorrow.”, written from the perspective of 2005, when the sudden rise of the Gang of Five was just about to take off, Rieger had seen the State as the greatest of evils, and in his analysis “corporations came second”, so to say. I wonder how he would see it from the perspective of 2017?) [18] [19]

Today, the Internet does not live in gadgets we could hack anyway, nor is it in any way dependent on networked personal storage media anymore; this situation is far from the p2p days of a decade ago. [20] It lives in vast data centers, thousands of square meters big, that take billions of watts of power to cool down their globe-enwrapping computation, and are now beyond anything the people can offer an alternative to.[21] To counter or to parallel that would take resources of a scale available only to several corporations and states. It is not that those centers of power (here, a digression: how long did it take to re-establish the idea of “the center”[22] in the new, horizontal, “decentralized” networked media environment?) are unhackable or non-repossessable; it is that in such an analysis of possibilities we do not talk about activism anymore, but of war – guerilla war, civil war, World War if you like – but war, not activism.

But what would that mean for activism?


Software activism in all its shapes and forms seems to have hit a similar, if not the same, wall; lulled into the Barlowian belief that its debate unfolds in a world entirely of its own making and control, the conversation became splintered into advocating a myriad of approaches, where almost each individual actor “forked” its own “distinct ideology”. It all condensed and materialized in the corresponding software as object and as product.[23]

In the meantime, corporations were working steadily on piling up the data, investing in virtually nothing but their own infrastructure as more people volunteered ever more of their data to the unavoidable free services of today,[24] and on turning the independent scene into an incubator of startups, spending nothing but pocket money in the process.[25]

Big Data requires Big Money. We can all pile our computers together and still not match a remotely significant percent of the corporate processing and storage power. So “inside jobs” are not an option anymore, and no “Luddite activism” is meaningful. But also, no studies of power are useful anymore, as it seems that no stone is left unturned in the long tradition of art, philosophy, criticism and activism examining all the various mechanisms and leverages and motives of power. What is left on the table? What else, what more could be done, if the very basic framework of the world is something the people cannot agree upon anymore (e.g. as McKenzie Wark writes, “This civilization is over, and even its defenders know it”[26])? [27]

Also, in this debate, the conclusion that “the solution needs to be political, not technological” is frequently heard. While it makes sense to put the weight and importance on the lack or insufficiency of current politics to address the matter, and to outline the painful lack of a corresponding ideological orientation, it makes less sense to almost dismiss the role of technology, as if the terms were absolutely disentangled.[28] It is not unworthy to explore the possibility of addressing the problem through technology, however risky or ideologically distant such an approach appears to be.[29]


Who are the new (or Alt-)philosophers of today, called upon to explain the present state of affairs and to outline the horizon of the (former) future? One word comes to mind: “entrepreneurs”. This is not exactly a recent phenomenon, and we have already had a succession of generations – Steve Jobs is not around, Bill Gates has retreated to the status of “The Worrying Elder”[30], and pretty much the entire previous establishment that had a say has gone to yachting and eternal youth now – so it is the likes of the secretive Mark Zuckerberg (as maybe a contemporary replacement for Pythagoras, although he likes to quote Abraham Lincoln[31]) and the talkative Elon Musk (as, perhaps, the contemporary avatar of Plato[32]) who will – or will not – decide if something is media and under what conditions, who will define the statuses of citi-neti-zens and who will define rights and responsibilities.

Such people, almost exclusively through interviews and short TED-like talks rather than in books or papers, define what democracy is, what we do with medical and space research, what to be afraid of, what the role of the State will be, what globalization is and how it should be done; they are the voices that tell the world how the future will be,[33] what the social values are, what the truth is really about… Importantly, these people never actually volunteered for that exact role; it was the choice of media, and of audiences, rather than their dedicated effort that put them there. Hence, most of their wisdom comes served as an afterthought during a rare interview about the latest features of their new products, with the philosophers of entrepreneurship appearing absent-minded, somewhat inconveniently disturbed in between their really important daily tasks of running the entire world.[34]

At best, philosophy became just one among many Alt-jobs.[35]


A flood of analysis emerged trying to understand and explain this massive and rapid post-truth turn. From post-modernism as such,[36] to the alt-fascist shape-shifting of the triangle of power (corporations, the military complex and the con artists who pass for contemporary reality-show politicians, forming an uninterrupted line of disappointment in the mechanisms of government and of the state), all the way to even quantum physics confusingly appearing to challenge the existence of any firm reality[37], the populations seem to be growingly confused about both what is the natural, objective truth and what is the truth of our own making. And some even seemed to enjoy not knowing, once again, who would win WW2. Those would mostly be the pioneers of the “alternative” world that followed.[38]

But most of the attention – and, why not, blame – was turned towards media, examining the power of contemporary digitized and networked media to be used to find and join any existing or non-existing dots to support virtually any random belief or wild idea.[39] [40] Maybe it is, after all, the consequence of the demise of the left-wing establishment’s vision for the future[41] rather than the sign of a sudden rise of right-wing sentiments.[42] It is not hard to understand the resulting general sense of insecurity, of inadequacy and inequality.[43] For most, it seems impossible to resist the urge to “go back” to some (fantastic) point in the past where they believe they could orient themselves better – hence the slogans of “taking BACK control” or making something “great AGAIN”.[44] Maybe it is about the very nature of the transformation of media itself,[45] somewhat lessening the responsibility of most humans in what appears to be a war against any common reality[46] that is waged on a base, technical, environmental and historical level[47] [48] – perhaps the people just, over time, accepted that the world is a wiki/timeline/blog, and that historical facts can (and will) be different each time one refreshes the same page.[49] [50] Perhaps it was a case of too much, too soon.[51] [52]


No wonder that, after the various sports, politics in the form of elections is what common people most frequently bet on (the pseudo-aristocracy does that for a job, in which case it is usually called the “stock exchange” or even the “economy”). But no wonder, too, that as more money entered the world of politics, the “predictive capacities” of what is essentially a financial/game-theory model started to show their inconsistencies, so that the “quotas”, the predictions, in the latest high-profile cases of the UK referendum and the US elections, could not hint with any precision at what would actually happen. [53]

This had a twofold effect: it additionally reassured the already paranoid “alt” populations that their candidates were deliberately treated as underdogs by the polling agencies “controlled by the establishment” (which always remains a realistic possibility), and it added to the sense of suspicion towards the merit of any expertise, scientific or other, producing uninhibited contempt towards anything remotely connected with what is recognized as “intellectualism”.[54]


From the imaginary perspective of alt-fascism, into which the process eventually developed, it made sense to try to reconfigure itself in a slightly different manner than the previous – and almost successful – attempt; if historical fascism was a convenient partnership against the people between the State, the military and capital, in which the State appeared as the center of coordination and attention, alt-fascism changes nothing but puts corporations first; and from there we can speak of the seductive, almost magical powers of data analysis (and data mangling, if necessary), and especially of globalized digitized networked media, to obfuscate what exact process was being triggered.

Curation is the process through which, mainly by the tools of selection and evaluation, a cultural phenomenon or a product of culture is stamped and restamped as one of meaning and certain value, or not.

The facts (which seem to be worth nothing today without the “alt” prerogative) are that the only rise and consolidation of the middle class happened during the several decades of viability of the politics that would be recognized as left or leftist today (but was perhaps quite mainstream for its time), and that throughout the history of capitalism in the West, inequality has indeed been in constant, almost uninterrupted rise. Depressingly, according to the latest “big data” analysis of January 2017, the only periods when inequality was NOT on the rise in what we call the (former) West since 1300 were the Black Death plague of the XIV century and the two World Wars.[55]

Today, AI (Artificial Intelligence) is about to add an additional – many already argue “final” – layer of complexity, one that for some (the owners)[56] renders as insight and control (and profit), while for users it appears as variegated instances of dispossession (being, to use Flusserian terms, a “structurally complex system that is functionally simple”). The funding of such research – and there is no reason to think that most of that data is available to the public[57] – combined with the dismantling of the institution of independent academic research[58] threatens to exclude almost everyone outside corporations and certain branches of government from even being aware, not to speak of taking part or having a say.

Alt-art, or Post-art?

What about the poetics of techno-fascism?

What is the poetic expression of the technology of today?

Perhaps there are no poems and new art movements inspired by the power and precision of Predator drones to be witnessed today, in a manner similar to how the likes of Marinetti celebrated the speed and power of the combustion engine or the introduction of aerial bombardment[59] some hundred years ago. What we have instead are, for example, the poems on the Fourth Industrial Revolution by Brian Bilston, commissioned by the World Economic Forum Annual Meeting 2016,[60] and that doesn’t feel like anything Marinetti tried to do. Probably because it is not the continuation of Marinetti’s vision of the future of poetic expression – the closest to which may be a transcription of the sound of the fans cooling a huge data center somewhere in Indiana. Or watching the NumbersStation Twitter feed.[61]

Perhaps there is no “Manifesto tecnico sulla letteratura futurista” of today, but manifestos are meant for a world of a different speed and division of time; what we have instead are the real-time maps, the spectacular images of Predators in action made by other drones, and all the “alt-right” outlets to marvel at and celebrate such technology. And there is Twitter now.

For some reasons probably best explained by all the conspiracy theories out there, computer algorithms seem to be crafted to be especially adept at the allegorical doubling of the functions of the curator. Sounds almost poetic – an algorithmic allegory. A perfect match on dating sites, where one can choose a perfect robot to be replaced by.[62] [63]


Alt-Literacy or Post-Literacy? (Alt-War or Post-War?)

A lot of attention has been paid to the advancement of non-verbal communication that relies on the networked exchange of digital images (photos and emoticons and other graphs and symbols, moving or still). In some cases, the accumulation of such images seems to be permanently archived on various threads and timelines, seemingly focusing on the preservation and manipulation of such media text, that is, content, in a manner that resembles a “historical” approach (causal, linear, etc). In some other cases – such as abandoned, forgotten, today incompatible/indecipherable or deleted digital objects, and perhaps especially in contemporary Snapchat-like tools that deliberately delete the content after exchange – the focus seems to be on the re-examination and revival of the oral traditions of communication,[64] and such development is focused on the experience of the moment of exchange rather than on storing the information itself as a (possible) element of knowledge for later inspection and analysis. Here we perhaps face the “post-historical” approach which e.g. Flusser spoke about.[65] This is all important, and a lot could be said about what such technologies do and reveal in regard to the notions of memory, and, especially, time.

But, in another view, the growing number of services that promise to delete the content once exchanged, the rising practice of deleting posts – and entire profiles – for various reasons, the growing pressure for enforcing the legal “right to be forgotten”, and the growing “normalization” of such instability of the existence of objects (which, presumably, puts ever more importance on the experience as such) tell of the rising importance of meta-data, and of the diminished usability of data objects for the algorithms of contemporary digital networked systems of power to analyze.[66]

It also implies the possibility of a certain “anti-historical”, rather than pre- or post-historical, approach; perhaps something to be contemplated and discussed at the very panel at ZKM.

So the theory goes; in practice, it proves to be inconsistent. Pairing known patterns of behavior and established profiles with recommendation systems works like magic on the scale of Amazon or Netflix, earning the owners of such algorithms billions. But it is still only a marketplace, deploying all that we already know about this mechanism, albeit enhanced and amplified beyond expectation.[67] Importantly, it cannot address the people who are not there, not logged in, nor can it sell anything to those who cannot pay, and it is especially inefficient in “selling” stuff to those who are aware of how it works. [68]
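The “pairing known patterns of behavior with recommendation” described above can be sketched, in deliberately toy form, as nearest-neighbour matching over user profiles. All names and data here are hypothetical, a minimal illustration of the mechanism rather than anything Amazon or Netflix actually runs:

```python
from math import sqrt

# Toy user-item interaction profiles (hypothetical data):
# 1 = the user interacted with the item, 0 = did not.
profiles = {
    "ana":   {"book": 1, "film": 1, "game": 0},
    "boris": {"book": 1, "film": 1, "game": 1},
    "vera":  {"book": 0, "film": 0, "game": 1},
}

def cosine(u, v):
    """Cosine similarity between two item-interaction profiles."""
    dot = sum(u[k] * v[k] for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user):
    """Suggest items the most similar other user has but `user` lacks."""
    others = [(cosine(profiles[user], p), name)
              for name, p in profiles.items() if name != user]
    _, nearest = max(others)  # pick the closest profile by similarity
    return [item for item, seen in profiles[nearest].items()
            if seen and not profiles[user][item]]

print(recommend("ana"))  # boris is the closest match, so: ['game']
```

Note that a name absent from `profiles` simply raises a lookup error: the sketch literally cannot address those who are not there, which is exactly the limit the paragraph points to.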

But then consider what we imagine happens in the vaults of the criminal and terrorist recruitment that unfolds, growingly, on several huge social networks rather than in the “darknets” we used to speak of until recently, when we apply the same “logic of scale” in analyzing the metadata; the profiles and connections are examined and established, and the profiles of “types” of (potential) various criminals and terrorists are produced. The action taken based on such sorting/decision is dramatically different than in the case of markets; instead of serving inefficient or annoying ads to people identified with the wrong or too loosely defined “target group”, the “potential criminals or terrorists” are incarcerated or annihilated. [69] [70]

Confusing the logic of the dominant cultural and economic model with the democratically controlled power to decide on life and death in extreme situations is practically guaranteed to have a lethal outcome. It seems the unfortunate logical continuation of confusing persons with property; when people were confused with property before, we called the result slavery; for the rapid contemporary development in which property is confused for persons – in the recent case of corporations or brands being acknowledged as having “personhood” – I still struggle to find the proper name, and it already develops into the even more pervasive logic of confusing predictive market-analysis tools with all other historically important decision-making mechanisms.

It can be said: Alt-Literacy is writing code; Post-Literacy is posting images and touching the pictures. But it cannot be that simple.

The algorithm never tries to communicate with you, even if it is your very name at the start of the conversation; it attempts to communicate with humans like you.

These are precisely what they are: searching and sorting algorithms, and the data we decide to give them to compute. Interestingly, and somewhat mysteriously, despite all the efforts so far, we still seem to be no closer to understanding either what human consciousness is or how exactly it may work.[71]

Alt-Human vs Post-Human

“Future is not a natural dimension of the mind, rather it is a modality of perception and imagination, a feature of expectation and attention, and its modalities and features change with the changing of cultures.” [72]

Franco Berardi Bifo

Like with climate change, talking about the dangers of overdeveloping technology is the prerogative of the global occident; talking about the dangers of underdeveloping technology is the prerogative of the global orient. And a speculative debate on how the world works used to be the prerogative of The Students – this is the part of their conversation we overheard[73] some time ago:

Everyone: Oh, criticism! It’s about criticism, right? Tell us more!

Julian: But first we need to get there. And today, the fastest way to get anywhere seems to be to use algorithms. A lot has been discussed about their speed and efficacy, including that the market will be using more and more algorithms in the process of curating. Through algorithms, we can recognize all that is dispensable in a curator, and all that curating is probably not really about.

David: But what if we think that algorithms may be sufficient for curating? After all, the functions algorithms are envisaged to perform seem to mirror the list of the perceived tasks of the curator. There are searching algorithms, slicing algorithms, sorting algorithms and merging algorithms; there are, of course, hybrid algorithms that combine and automate several of those functions, and more, in one neat package.

Hartshorne: Algorithms can, directly or indirectly, perform the functions of socializing, filtering, researching, archiving… Algorithms can even offer interpretation, up to a certain level. All these things are what a curator does. The figure and functions of curator can be observed from within the art context, but more and more also outside it; curating now resides in all instances of contemporary life.

Alexander: We wouldn’t say that algorithms create the possibilities to do unimaginable things; that would be similar to admitting that meaning is already there, only hidden in the pile of data. Imagination, as a projective category, is rather different from prediction, and it is the latter, not the former, that we expect as the result of the process of computing. These are precisely what they are: searching and sorting and computing machines, plus the data we decide to give them to process, nothing more, also, nothing less. It is true that now, with algorithms in operation and with more and more cheap data around (it only works when data is plenty, and cheap!), all sorts of things that were deemed to be impossible became reality.

Peter: And perhaps we are just a tiny bit closer to perfecting the algorithmic curator. In the introduction to their paper published this summer, researchers from Germany wrote: “Moreover, in light of the striking similarities between performance-optimized artificial neural networks and biological vision, our work offers a path forward to an algorithmic understanding of how humans create and perceive artistic imagery.”[74]

Armstrong: Now, we seem to be back in Shaolin territory, as Bruce Lee advised – in the case of self-learning algorithms, the code is trained rather than written. And those researchers from Germany have even greater algorithmic powers than we think – they can predict the next characters to die in your favorite series.[75] Kung fu seems to be strong there.

Barnewall: Like when Bruce Lee entered a hall of mirrors in the famous scene from Enter the Dragon. Those hundreds of reflections of Bruce Lee and of his enemy did not have a chance against Bruce Lee himself.[76] He had to eventually destroy all the images in order to win, though.

Andrew: But what is ‘algorithmic understanding’?

Samuel: We think this is what we cannot know. Worlds are apart. Have you ever heard a joke that starts with ‘an algorithm walks into a bar…’?

G.N.: If we explore the differences between the two worlds in regard to the notion of time, then we learn about the impossibility of ever ‘going there,’ that is, of really understanding what the world of these mathematical entities ‘feels like.’[77]

John: The world of forms as examined by Plato and Aristotle was based on the notion of the ideal form. Algorithms, through all the filtering and profiling and looping all over again, are constantly perfecting the average form. Implications abound.

Augustine: Yes, art and curating are not about average anything. Can something made from averages, something that is meant to be average, even be criticized?

Leonard: Algorithms often co-perform with the social subject of ‘users’, creating the architecture that shapes the majority of contemporary social performances. So what are contemporary cultural objects and practices in which we can locate a gesture of the curatorial and observe it as part of curatorial education? And back to our Conundrum, can we think of ‘what (is) not’ as an integral part of ‘what (is)’?

Peter: With its ‘real-timeness,’ its inability to exist either in the past or in the future, how algorithms function does mimic the essential quality of the process of curating—it tries to replicate the outcome, characterized by that one gesture that has the ability to condense what was and what will be into what is.[78] [79]

Everyone: Yes, it’s about the gesture!

John: But an essential quality of such an act is missed if it is viewed as an aesthetical operation; a curatorial act draws its power precisely from its inherent criticism. Without the critical urge, no such act of transformation would have a sense of meaning. If everything was right with the world ahead of such an intervention, a curator would be no more than a mad god constantly rearranging a perfectly fit world just to make humans aware of their lesser-ness, and of his boredom, and the curatorial act would be nothing but an act of violence.

By looking at algorithms and the results of their actions, all sorts of things would be revealed about humans and how they may work, but nothing will be disclosed about algorithms themselves.


A new vision of the world has to go beyond criticism… But it needs to get to criticism first. In order to get to what we might call a full picture, we need to have a space outside of it, a certain distance. It was observed a long time ago that “criticism is a matter of correct distancing. It was at home in a world where perspectives and prospects counted and where it was still possible to take a standpoint.”[80]

Critique requires distancing, and there is no distance in supernow. The algorithmic perception does allow for the functions of filtering and especially evaluation, but not for criticism – a critic is the name we allocated to a different animal, the one we want on the list of controlled species now.

The future of voting is dating yourself.


[1] “The Oxford Dictionaries Word of the Year 2016 is post-truth – an adjective defined as ‘relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief’.” https://en.oxforddictionaries.com/word-of-the-year/word-of-the-year-2016

[2] Personally, I would place my bet on “post-work” or on the related UBI (“universal basic income”) debate, but perhaps it will be a bit slower in its arrival to popular perception, as so far it is being discussed mainly in the circles observing the growing uneasiness in the relationship between the economy and algorithms.

[3] Probably because it sounds better, or at least shorter, than “paleoconservatives”:

“Paleoconservative is a term that describes conservatives who support strong restrictions on immigration, a rollback of multicultural programs, the decentralization of the federal polity, the restoration of controls upon free trade, a greater emphasis upon economic nationalism and isolationism in the conduct of American foreign policy, and a generally revanchist outlook upon a social order in need of recovering old lines of distinction and in particular the assignment of roles in accordance with traditional categories of gender, ethnicity, and race. As such, paleoconservatives differ from mainstream conservatives.” (http://www.conservapedia.com/Paleoconservative)

“Paleoconservatism (sometimes shortened to paleocon) is a conservative political philosophy found primarily in the United States stressing tradition, limited government and civil society, along with religious, regional, national and Western identity.” (https://en.wikipedia.org/wiki/Paleoconservatism)

[4] Carole Cadwalladr, “Robert Mercer: the big data billionaire waging war on mainstream media”, The Observer, Sunday 26.02.2017 (https://www.theguardian.com/politics/2017/feb/26/robert-mercer-breitbart-war-on-media-steve-bannon-donald-trump-nigel-farage)

[5] What is certain is that things suddenly started to feel strange, in a way that I and a lot of other people find hard to understand or explain. “Mr. Richter, I’ve a feeling we’re not in capitalist realism anymore,” to paraphrase Dorothy from The Wizard of Oz.

[6] If people accepted that there is no other word than “a revolution” to describe, for example, a religious uprising against a secular government, or the removal of the headphone port on the latest iPhone, no wonder that such an almost perfect perversion of the original meaning makes the word (and the concept) empty, and open for any kind of “creative” use. Did people ever before call revolutions by the names of private foreign companies (e.g. “Twitter revolution” has its own Wikipedia page now, and is responsible for “revolutions” in Egypt, Ukraine, Tunisia, Moldova and Iran, https://en.wikipedia.org/wiki/Twitter_Revolution)?

According to previous research I had an opportunity to conduct on a scale available to a certain regional network media franchise, the word “revolution” in the territory of the Western Balkans in the 2011-2013 period was almost exclusively being used in connection with various telecommunication products and services.

But, after all, isn’t any revolution in the last instance a technological one? It is one thing to title revolutions after proprietary technologies and corporate brands, and quite another to recognize the structural change within a certain social, political, economic or cultural system as the cause and effect of what we name “a revolution”:

“I would like to say the following, if I may: every revolution, be it political, economic, social or aesthetic, is in the last analysis a technical revolution. If you look at the big revolutions through which mankind has gone, let’s say the Neolithic revolution or the revolution of the Bronze Age, of the Iron Age, or the industrial revolution, every revolution is, in fact, a technical revolution. So is the present one. But there is one difference. So far techniques have always simulated the body. For the first time our new techniques simulate the nervous system. So this is for the first time really, if you want to say so, a really immaterial, and to use an old term, spiritual revolution.” (Vilém Flusser – “1988 interview about technical revolution”, Osnabrück, 1988, https://youtu.be/lyfOcAAcoH8?t=558, 09’20”)

[7] … because it cannot survive without its substrate, the now cancelled and abandoned framework of the future – as F. B. Bifo notes, all the XX century movements – from liberalism to social democracy, from communism to anarchism – shared the same certainty that “notwithstanding the darkness of the present, the future will be bright”. (Franco Berardi Bifo, After the Future, AK Press, 2011.)

[8] … if it ever was any different or better than that; like with everything else these days we are not sure anymore, as the claims that it was just about its very design and character all along are now louder than yesterday.

[9] “It was the term I thought you could use 10-15 years ago as a way of talking about kinds of social practice that tries to do things like that. But it was always the language that was partially co-opted, even at the late 20th century hacker was a word that was partially criminalized, and more recently it was being kind of commercialized. Every idiot making an app now thinks they’re a hacker, and I am like, no, it’s just a job. Part of the problem is finding languages that describe kinds of practice that aren’t necessarily labor, so how do we describe things that we want to be doing?” (McKenzie Wark, “From diachronic to synchronic”, https://www.youtube.com/watch?v=E76UTcwS2wk)


[11] If we see activism as indeed divorced from alternative now, there are fundamental concerns – simple to express as questions but hard to shape into propositions – in regard to its future. Can the principle of activism be transformed into some other form that would translate its effect to contemporaneity, so as to remain unaffected by the shape-shifting “quantum mechanics” the original term was submitted to? Can activism even act within these new surroundings that are not even hostile anymore, but refuse to recognize the very principle?

[12] From the perspective of the “alt-right”, currently high on the success of being on the winning side of the transformation, all of this debate makes no sense, and there is nothing wrong with activism; actually, it never fared better. Why not frame the alt-right success like this: isn’t everything happening today also a great opening for activism as such, isn’t the very fact that a great proportion of the population expresses dissatisfaction with the state of the world and is ready to try alternatives actually one great opportunity to propose and fight for change? A chance for ideas to be communicated directly to the people, so the people could directly decide upon them, and eventually, “they did”? And isn’t precisely that what is currently reflected in the critique of the post-truthful methods and principles used – a simple case of envy by those who are perceived as losers in the process?

(Regretfully, some former progressive initiatives also seem to think exactly so, trying to adopt a similarly “effective” approach of grand posturing and shouting whatever is convenient at the moment of political campaigning, so as to adjust to the “new media and political environment”. This, we just know, will not end well.)

[13] But this can only hold true if activism is viewed as just “a means to an end” and if the meaning of “means” is equated with “anything that works”, essentially militarizing the concept. Any, even the slightest, variation of such a view renders the whole construction dysfunctional. Hence the cynicism of the post-truth-alt-facts operators who know, and are aware of the ease with which alt-facts boost the alt-right, and another form of cynicism of the post-truth-alt-right users who are not aware, and want to believe the operators that everything is the same as it used to be, only with “real facts” (“our facts”) taking their rightful stand under the spotlight now, not being suppressed by political correctness or grand conspiracies anymore. When these two forms of cynicism accumulate their critical masses and confront each other, the entire post-truth-alt-facts universe will probably collapse into in-fighting; but that moment seems to be somewhere in the future yet, as the architects of the alt-power still manage to hide how much they despise the people (and especially those they are able to manipulate), and the people are yet to understand what it means to try to navigate the world using information about events that did not happen or about things that do not exist. (BTW, truth is not realism; things that do not exist are fine, and perhaps more important than those that do.)

[14] Today people are free to “hack” drones or TV-boxes mostly as “users”, by purchasing and juggling prefabricated add-ons through Amazon or Alibaba, perhaps adding a bit of modified firmware downloaded after a Google search for the best option. It is a long way from the DIY of previous decades, where hardware hackers were mostly “makers” … And there seems to be no reversal to this process, considering the techno-imperial power of the technology we both want and need, as the investments necessary to develop it further at this stage require budgets not affordable to most states and businesses. It bears similarities to the developments in science, where all the “low hanging fruit”, as they call it, seems to be already picked, so in order to break through with new and more complex ideas one needs access to, and confirmation by, the huge particle accelerators or the latest space telescopes. It is not that it is not possible anymore to arrive at a revolutionary invention or an epic discovery by drawing in sand or just thinking hard enough; it is that it is vastly less likely to happen so, compared to only several decades ago. There is nothing wrong or unexpected with such a progression of complexity in the technology of science; the problems will arise in regard to how exactly it will be institutionalized.

[15] “In fact, what we have now in consumer tech, in 2017, is an oligopoly, at least superficially similar to the old industrial-era American corporate groups that once dominated key industries.” (Walt Mossberg, “Tech’s ruling class casts a big shadow”, The Verge, March 2017, http://www.theverge.com/2017/3/8/14848642/walt-mossberg-tech-gang-of-five-apple-google-microsoft-amazon-facebook)

[16] If the only “hardware activism” available today is embedded in the concept of “startups” as “incubators for the Big 5 / the Market”, or in Amazoning and Googling the already available modifications, then the conclusion might be that there will not be much Alt-Hardware around in the form of genuine DIY or altered or hacked devices; the advancement of nanotech and the Cloud seems to go towards Post-Hardware that will be, as previously noted, humanly non-producible, non-reparable, non-hackable. (The same goes for software now; deep learning and machine learning produce “black boxes”.)

[17] Of course, there were, and still are, ideas. For example, OLPC was an interesting project, but OLPC kinds of projects tend to always lose to the Microsofts and IBMs, or to local corruption, and are lost in environments where only the global corporate can thrive. Or, the Raspberry Pi; it is a great little thing, nevermind its firmware being closed. But eventually, whatever interesting use you make of it, you want to control it or receive the result on your iOS or Android thing. Hardware activism seems to be a battle with windmills today. Hardware hacking is the domain of giant corporations and powerful states, and whatever happens in your neighbor’s garage, if remotely interesting, will surely draw some YouTube views and, if very lucky, a prompt call from The Gang of Five, but is not likely to change the world. EXPAND.

[18] For example, since 2011, the Apple Corporation has frequently been financially “heavier” than the US government: BBC, July 2011: “Apple holding more cash than USA” (http://www.bbc.com/news/technology-14340470), or:

Forbes, April 2014: “Fun Number: Apple Has Twice As Much Cash As The US Government” (https://www.forbes.com/sites/timworstall/2014/04/13/fun-number-apple-has-twice-as-much-cash-as-the-us-government)

But, to make things more interesting and to abstract this new power a bit from mere financial interest: from the perspective of investors, a significant portion of big tech companies are not highly profitable, or not profitable at all, at least in a “reasonable” time span.

[19] “The waves of technology development seemed to work in favor of freedom, most of the time. The future looked like a yellow brick road to a nirvana of endless bandwidth, the rule of ideas over matter and dissolving nation states. The big corporations were at our mercy because we knew what the future would look like and we had the technology to build it. Those were the days. Remember them for your grandchildren’s bedtime stories. They will never come back again.” (Frank Rieger, 20.12.2005: “We lost the war. Welcome to the world of tomorrow.”, archived here: http://frank.geekheim.de/?page_id=128)

[20] Do the hackers / net activists of the ’90s and 2000s see themselves now as a part of the world that used to belong to some ancien régime, or rather as a part of the today equally meaningless opposition to it? By almost celebrating the weaker, dumber, slower capitalism of the previous decades, do we all feel at least slightly reactionary and conservative? Is there anything to be learned from that?

[21] Taylor Glascock, “The Internet Lives in a Huge Hotel in Manhattan”, Wired, 11. November 2015, https://www.wired.com/2015/11/peter-garritano-where-the-internet-lives/#slide-1

Or how e.g. Google wants us to (spectacularly) imagine it: https://www.google.com/about/datacenters/gallery/#/

Or how it looks as a part of a big city from the perspective of a common citizen with no access to the facilities: https://wheretheinternetlives.wordpress.com

[22] Ingrid Burrington, “Up to 70 Percent of Global Internet Traffic Goes Through Northern Virginia”, The Atlantic, January 8, 2016 (http://www.nextgov.com/big-data/2016/01/70-percent-global-internet-traffic-goes-through-northern-virginia/124976)

[23] Everybody just had to differ from all the others by at least one significant word in the latest open license, or in the only right line of code in the latest version of something. Additionally fostered by the celebration of individualism and the belief in the omnipowerful but invisible body parts of the market that eventually curates everything, the discussion on the strategy for the future turned into an endless line of disavowals and disunification of the present. All that was probably perceived as a certain liberalization at the time, and for a valid reason; it perfectly mirrored the “superfree market” idea that was built into the logic of the deregulative approach that such a community felt was its substrate, the very condition of its existence. But it did not produce a new and coherent field of ideology, or of politics. All “traditional” left-wing analysis was dismissed as authoritarian, ideological, stubborn in its argumentation and insufficient to cope with the historical novelty of this “unprecedented” development. It can also be said that most of the ultraliberal free and open-source software developers were the pioneers in the denial of arguments other than those ideologically convenient at a certain time, thus heralding the trend that condensed into what we discuss today as the “alt-facts” approach towards reality. And there were conservative software developers, working mostly on proprietary software, with their quite coherent right-wing worldview; apparently there is something lacking in this picture, and the outcome of such a constellation is what we have as the conundrum of today.

[24] This is a simulation and not a real real-time tracker, but informative to observe and contemplate: http://www.internetlivestats.com

[25] All the way to the mid/late 2000s, a big portion of what was called open-source/free software activism decided to take the approach of educating the rest of the population “the hard way” – that is, by declining to offer any application that is user friendly, aesthetically pleasing or ready to be used intuitively by people not proficient in compiling their own software from source or in customizing interfaces, insisting instead that the end user should learn to assemble and maintain their own end products. It is easy to understand the merits of such ideological-educational determination. It is also easy to understand what happens when a tendency becomes authoritarian before a certain horizon is reached. So after that it was simply too late – all the enthusiasm there was in users to embrace free software evaporated, as from their perspective not much was actually being offered, and corporations took them all in a single swipe by offering “free” products with the desired looks and features ready out of the box.

[26] McKenzie Wark, “Metadata Punk”, Public Library, What, How & for Whom, Multimedia Institute, Zagreb 2015, p. 117

[27] This was traditionally observed through examining the “studies of power”, and now is perhaps the moment to establish what I would give the working title of “people studies”. I can say only that much at this point, not because of obtaining some discretionary maneuvering space by keeping something secret, but because it will take some time to outline even the basic shape of such a proposition. By nature and by design, it has to be simple, and quite often the simplicity is astonishing precisely because it reveals in its very form the amount of time and complexity involved in formulating it. All these worlds are yours, except Europa; attempt no landing there. THIS WORK IN PROGRESS WE LEAVE FOR DISCUSSION.

[28] It is not only, if we observe the situation through, for example, Flusserian or Kittlerian and many other (mutually rather different) keys and terms, that it is precisely the technology and nothing else that both creates and expresses the reality, with all its attributes of truth; similar thinking is also to be found in the writings of authors like Benjamin or Marx or Barthes or Lacan and many others (again mutually rather different), who eventually submitted technology to ideology, but not without a recognition that it is an insight made possible by and through technology alone.

[29] “These are complex problems, and the solutions will not be simple. But a few broad paths to progress are already clear. We must work together with web companies to strike a balance that puts a fair level of data control back in the hands of people, including the development of new technology like personal “data pods” if needed and exploring alternative revenue models like subscriptions and micropayments. We must fight against government over-reach in surveillance laws, including through the courts if necessary. We must push back against misinformation by encouraging gatekeepers such as Google and Facebook to continue their efforts to combat the problem, while avoiding the creation of any central bodies to decide what is “true” or not. We need more algorithmic transparency to understand how important decisions that affect our lives are being made, and perhaps a set of common principles to be followed. We urgently need to close the “internet blind spot” in the regulation of political campaigning.” (Tim Berners-Lee, “Three challenges for the web, according to its inventor”, Web Foundation, March 12, 2017, http://webfoundation.org/2017/03/web-turns-28-letter, also see: https://solid.mit.edu)

[30] The Washington Post, “Bill Gates on dangers of artificial intelligence: ‘I don’t understand why some people are not concerned’”, January 28, 2015 (https://www.washingtonpost.com/news/the-switch/wp/2015/01/28/bill-gates-on-dangers-of-artificial-intelligence-dont-understand-why-some-people-are-not-concerned)

[31] We can go on about this one the whole day, and that would probably be a supremely wasted day: “Building Global Community” – a Global Public Address by Mark Zuckerberg, February 16, 2017:

https://www.facebook.com/notes/mark-zuckerberg/building-global-community/10103508221158471/?pnref=story (and yes, the comments!)

I did have certain arguments with the new stream of old-school political activists who eventually submitted themselves to Facebook. I have no problem with anybody’s decision to surrender to the convenience and social opportunities of interconnected profiles and timelines, but I will keep insisting that this platform does not allow for any meaningful criticism, based on the non-negotiable, base and technical level of its very architecture. I hope that Zuckerberg himself finally provided the arguments to end this debate once and for all. Nobody’s timeline will become the new Brave New World; no one’s Messenger will read as the new Finnegans Wake. Criticism will never happen on Facebook, but can be advertised there. It is a technical thing, a matter of the interface and of the algorithm, a material reality of that particular network. It is fine for keeping up with the guys from school who have spread all over the globe by now, though. Local support groups for this and that seem to make some sense, here and there, and kill all sense, here and there.

[32] “Says Musk, “I essentially led them to a conclusion that they created. It was sort of a Socratic dialogue on a technical level. The essence of a Socratic dialogue,” he adds with another of his trademark soft laughs, “is that people wind up convincing themselves. People are much more willing to change their opinion if you’re not forcing it.” (Interview with Elon Musk, http://queensu.ca/alumnireview/rocket-man)

[33] “As a way of thinking about the internet, “the future of” is a particular form of procrastination. Recently, Kanye West said in an interview that he talks about “the futch” with his pal, Elon Musk. I suggest we borrow West’s coinage the “futch” to describe the “futurism” of snake oil internet gurus. The Shingys. The idea-ators. Everyone who instructed us to keep looking toward the horizon and never look down is guilty now. The “futch” is the recognition that we cannot begin to categorize let alone solve any problems in this moment now.” (Joanne McNeil, “Postcards from the Futch: Nothing looks like the past like talking about “the future” of the internet.”, The Message, March 31, 2015, https://medium.com/message/postcards-from-the-futch-595796d8a45d)

[34] In the post-truth world, who is the voice of “reality”? And who seemingly lost the credibility, whose words are now deemed worthless? Can we trace what happened with the “old truth” that was holding the world together – in love and in hate – for such a long time: where has it gone, to whom does it belong now, and what do people make of it today – if it is remembered at all?

When doing their Big Data research or writing small patches in the already sophisticated code, are the engineers and designers and programmers – those armed forces of today, trained in universities and corporations rather than military academies, and armed with proprietary knowledge and supercomputing clusters rather than blades and projectiles – even remotely aware of their role as soldiers, of their function of coding political change? Most of them, in general, probably not. Some of them, in particular, very much are; and this may be the situation we know from before, analogous with the history of social revolutions, liberation struggles and decolonizing battles unfolding throughout the previous century. But, in general, this social group likes to present itself as “outside” of all politics, so we deduce their ideological attitude by trying to observe their role models, the people who seem to be their intellectual inspiration and whom they consider to be their leaders – or, to express it more in tune with contemporary sentiments, the people whose profile they choose to follow.

[35] In this networked philosophical endeavor of today, even the people are honorably invited, as a twisted mirror image of Marx’s vision of it being possible “to drive Uber in the morning, to campaign on Facebook in the afternoon, to check Reddit or Pinterest in the evening, to criticize on Twitter after dinner (but not before the dinner gets Instagrammed).”

[36] “It is the crucible of modernism, which we can very loosely describe as the process of making only those things that fit and speak of our ever more complex times, creating new things for a new world. Postmodernism, which has probably lasted longer than modernism, is the process of interrogating the aesthetic discourse, disrupting the narrative. Modernism says that things can be right. Postmodernism says that nothing can be right. So if you ever wonder why nothing new ever seems to happen anymore, find a postmodernist and beat the shit out of them.” (Warren Ellis: “Some Bleak Circus”, FutureEverything 2015: https://youtu.be/9cfAmvdeZD4?t=245, 04:04)

[37] It is actually not, and this text has NO complaints against quantum physics, but has a lot of complaints about how quantum physics is being treated by the media, by most of what passes as education, and, it needs to be said, by most of the people. We all could and should do better, and I hope we will.

[38] Like with the special theory of relativity, it seems that most of the progressive population now, as the alt-right is doing, is also looking into the “better past”, and, observing it from different positions, they cannot agree on any absolute reference frame and hence see quite different things. But unlike in special relativity, there is no such thing as an agreement on the constant speed of light to help them “agree to disagree”. This makes the problem of knowledge (and thus of communication) unsolvable: not only are the viewpoints different, but the very world observed is not shared, and it is not about what we see but about the very “way of seeing” that the people cannot agree upon. Abandoning the institution of “facts”, however debatable in itself, has in culture the effect that abandoning the constant speed of light would have in the natural sciences: no observation could be proved objectively true anymore. DISSERVICE TO THEMSELVES; NO OWN REFERENTIAL FRAME EITHER; PLUS THE PROBLEM OF “UNLEARNING” VS “NOT KNOWING”.

[39] “Rather than “fake news” in the sense of wholly fabricated falsities, many of the most-shared stories can more accurately be understood as disinformation: the purposeful construction of true or partly true bits of information into a message that is, at its core, misleading. Over the course of the election, this turned the right-wing media system into an internally coherent, relatively insulated knowledge community, reinforcing the shared worldview of readers and shielding them from journalism that challenged it.” (Yochai Benkler, Robert Faris, Hal Roberts, Ethan Zuckerman. “Study: Breitbart-led right-wing media ecosystem altered broader media agenda”, Columbia Journalism Review, March 2017, http://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php)

[40] “This type of algorithmic news is not concerned about what the public needs to know in order to make informed decisions and act as citizens in a democracy, but rather what the public, at a given moment, seem to “want” (i.e. the public as consumers rather than as citizens).” (Christer Clerwall, “Enter The Robot Journalist – Users’ perceptions of automated content, http://www.tandfonline.com/doi/pdf/10.1080/17512786.2014.883116)

[41] Possibly the biggest failure of the (former) left-wing establishment was not only to lose the battle for the future, but, during the process, to dismiss the very idea of the future itself. The poor future remained abandoned on a battlefield that was abandoned itself, and even the right-wing winners did not bother to pick it up as a trophy. They might be mad or evil, but they are aware that the future-as-trophy would become the ultimate burden to carry, and that it serves a vastly more convenient purpose as the corpse on the battlefield, the corpse that everybody knows used to belong to the left. For the right wing, that was the ultimate meaning of the very battle – not to go after the left, but after the future that was its prerogative and precondition; after the victory, not to make a martyr out of the corpse of the future, but a monument to the defeat, incompetence, romanticism and utopianism of the left, and finally, a monument to the very idea of the future, presenting this unnecessary and possibly toxic burden as a matter of history, better left to scientists and artists and other treacherous and unreliable people to explore and file in archives somewhere.

[42] Then again, all this may have been a bit overenthusiastic, as perhaps is the debate on AI, in regard to the actual power of media and of technology. America may have this president now not because of cunning media strategies and the power of Big Data, but because he had a lousy opponent. Or because 10 million of those who voted differently the previous time now decided to stay at home or go barbecue, and the reasons for that can only partially be ascribed to the role of the media. To quote the old trope and paraphrase Adam Curtis, “in the world of ever-growing complexity, we are increasingly being offered ever simpler stories for an explanation.”

[43] Faced with all the complex demands of contemporary capitalism and the process of its ever-shrinking of the fabric of the social (in this latest turn, precisely by appropriating and exploiting the social itself), the vast majority of people already consider themselves defeated by economic, bureaucratic and intellectual challenges that they feel far surpass what they are able to cope with: the bills for maintaining any decent (by the standards of the XX century) life seem to have become unpayable, the ever-growing bureaucratic requirements are becoming unsolvable, and the new scientific and technological skills required to understand and to act in the world seem, without a structural educational effort by society, impenetrable.

[44] In another sense, besides all the media-induced drama reporting on the devastation of societies and natural resources all over the globe for reasons that would not have been acceptable only a decade ago, 2016 also felt like the year things just officially stopped happening; despite the unexpected, and perhaps unwanted, changes mentioned above, it all felt like an amplified, sped-up version of what we already had in the previous several years, and for any explanation or analysis there were absolutely no new and enthusiastic ideas or directions in regard to the future; instead, we looked for understanding and for solutions to this contemporary situation almost exclusively by looking backwards, mainly trying to draw parallels with the big events of the previous century and beyond. Perhaps that is one of the reasons that all the horrors of contemporaneity are mostly met with cynicism and apathy and shrugging rather than with streets full of crying masses. (Not that some masses are not regularly out there on the streets; it is the numerical ratio between those on the streets and in the war zones and those cheering or grudging behind their screens that is important. Also, important to note: in the age of militarized police and mandatory surveillance of any critical voices, the streets and city squares are not welcoming the protesters anymore, and are growingly not considered to belong to the anonymous public. Indeed, it may appear that it is not only safer but more “effective” to protest on digital social networks. More on that phenomenon – or the grandest of illusions – later.)

[45] ”Use of disinformation by partisan media sources is neither new nor limited to the right wing, but the insulation of the partisan right-wing media from traditional journalistic media sources, and the vehemence of its attacks on journalism in common cause with a similarly outspoken president, is new and distinctive.” (Yochai Benkler, Robert Faris, Hal Roberts, Ethan Zuckerman. “Study: Breitbart-led right-wing media ecosystem altered broader media agenda”, Columbia Journalism Review, March 2017, http://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php)

[46] An ongoing experiment worth a look: Jon Keegan, “Blue Feed, Red Feed – See Liberal Facebook and Conservative Facebook, Side by Side”, May 18, 2016, The Wall Street Journal, http://graphics.wsj.com/blue-feed-red-feed

[47] There are many analyses of how both the substantial financing necessary to buy infrastructure and media space and a great deal of cutting-edge data science applied to exclusively vast data sets on entire populations were coordinated to suddenly replace the existing “mainstream media” (MSM) and to re-curate the social media space (regardless of whether the “leak” about the involvement and the powers of companies like SCL Elections/Cambridge Analytica is true, “partially true” or not, it is a good enough estimation of how observers and analysts think this had to be done). It required extensive knowledge of the very architecture of Facebook and Google and similar systems, which (intentionally or not – that is a different story) proved to be perfectly crafted for such manipulation, and allegedly perfectly defenseless against it. Such an operation is also not, in many ways, an activist endeavor; it rather resembles a carefully planned and perfectly executed occupation of the global media space that required vast resources and military-like discipline. Some links:

[48] Then, there is the unbearably (and perhaps deliberately) confused and confusing role of the State. As progressives, we are obliged to attack it; as citizens, in this situation, we are obliged to defend it. The mere act of observing such a dilemma renders everybody involved slightly more conservative, and the only escape seems to be to deny this choice, without it being possible to dismiss its existence. But as I am no expert in Deleuze, it is, according to the new post-alt world rules, quite safe to quote him: “we no longer want to talk about schizoanalysis, because that would amount to protecting a particular type of escape, schizophrenic escape.”

[49] Or that you can always apprehend and correct a timeline or two. Or maybe the problem is indeed deep down in the very design, in the nature of the thing (e.g. Galloway: “Eugene Thacker and I expressed this phenomenon in unambiguous language: ‘communications protocols are technologies of conservative absorption. They are algorithms for translating the liberal into the conservative’” (131), or Kittler AS TRUE BELIEVER; plus Eli Pariser, “The Filter Bubble” – information determinism, etc.)

[50] Be it about a small number of people with ill intentions, burdened with megalomania and greed on a cosmic scale, or about the majority of people being guilty of disinterest, petty greed and sheer stupidity, or about some fateful and irresistible agenda of technology itself that, once triggered, cannot be stopped – let us leave all those options open, and add more; BUT, this approach is also conservative, as it is backwards-oriented (it looks for the solution in recreating past conditions), and perhaps we do not want to trust its conclusions. How to think of resistance strategies that are not a step back to what we already tried and failed at? What will also not work, at least not anymore, may be:

– parliamentary democracy, understood as a “numbers game”

– open source

– both apostasy AND blasphemy (cue Haraway’s manifesto here – I still think that apostasy works, but as personal salvation and investment, not as social revolution, making the salvation part somewhat meaningless in present, and a potentiality for the future)

[51] There is the argument that any restraints, restrictions, any control and censorship – that is, any attempt at the de-liberalization of media – besides being sure to backfire with even more mistrust, cannot be a viable approach; the history of the liberalization of media guarantees that whatever bad starts to happen in media will be disclosed in media first, and wherever media were liberalized there was always the people's defense against lies, manipulation and propaganda. Again, there are some concrete experiences that do point to other possibilities, and to a perhaps non-absolute character of what we recognized as “freedom of speech”, or just “freedom”; that propose a “relativism” of freedom (not unlike the “relativity” of material reality as discovered and established in the natural sciences), or recognize the conditionality of freedom. Such arguments will never go down well with orthodox liberals, who will recognize any attempt to reconsider the absoluteness of “freedom”, or to introduce any relationing/conditioning, as an Orwellian operation. Is there good and bad propaganda, we may wonder once again, while at the same time asking whether there are people who, we seriously consider, have to be guided or “protected”, and under what terms; under what terms might such a view even be acceptable? What happens to the notion of equality in that case? Does this very inquiry not lead down the path of dictatorship and discrimination, of eugenics, of outright fascism?

[52] “Entirely new branches of production, creating new fields of labor, are also formed, as the direct result either of machinery or of the general industrial changes brought about by it.” (Marx, 1961: 445)

“Film: unfolding … of all the forms of perception, the tempo and rhythms, which lie preformed in today’s machines, such that all problems of contemporary art find their definitive formulation in the context of film.” (W.Benjamin, Arcades Project, K3,3)

See also: Luis Camnitzer, “Thinking about Art Thinking”, e-flux journal #65, 2015 (http://supercommunity-pdf.e-flux.com/pdf/supercommunity/article_1148.pdf)

[53] One of the reasons may be this:

“Another problem with prediction markets is that they are not a voting machine, they are a weighing machine — unlike a vote in a democracy, your bet is worth more if you put more money behind it. That can create weaknesses.” (John Authers, “US election guide to prediction markets and bets”, Financial Times, October 24, 2016, https://www.ft.com/content/dd6a895e-951b-11e6-a1dc-bdf38d484582)
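Authers's distinction between a voting machine and a weighing machine can be illustrated with a toy sketch (the outcomes, bettors and amounts below are entirely invented for illustration):

```python
# The same set of opinions aggregated two ways: as a vote (one bettor,
# one voice) and as a prediction market (each opinion weighted by the
# money staked behind it).
bets = [
    ("A", 10),   # three small bettors back outcome A
    ("A", 20),
    ("A", 30),
    ("B", 500),  # one large bettor backs outcome B
]

def vote_winner(bets):
    """Voting machine: count heads, ignore stakes."""
    counts = {}
    for outcome, _stake in bets:
        counts[outcome] = counts.get(outcome, 0) + 1
    return max(counts, key=counts.get)

def market_winner(bets):
    """Weighing machine: sum the money behind each outcome."""
    totals = {}
    for outcome, stake in bets:
        totals[outcome] = totals.get(outcome, 0) + stake
    return max(totals, key=totals.get)

print(vote_winner(bets))    # → A (the majority of bettors)
print(market_winner(bets))  # → B (the majority of money)
```

The same data yields opposite "predictions" depending on whether heads or stakes are counted, which is one way the weaknesses Authers mentions can arise.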

The other possible reason for such a high margin of error in the predictive analysis of these events may be the real-timeness of the new media-manipulation tools provided by Big Money Big Data conservative warriors such as the Mercers, Bannons, Kochs and Thiels of the world. It was the very first time that something like that was applied at such a scale of operation. NOTE: the use of the notion of “real time” has to be clarified:

[54] The reaction of the (former) left-wing establishment (including the leftist/leftish media) speaks as much about the state of left politics today as it does about the new technologies of popular success as engineered by the alt-right phenomenon. Instead of countering the fake theories and bad ideas with vastly better propositions, the former left cries for regulation and censorship; in revolutionary situations, the former left invokes the principles of law and order; instead of capitalism, it blames technology or people themselves for not being able to make sense of the moment; instead of countering power, the former left does all it possibly can to preserve whatever structures of power there are. In short, despite and after everything that happened, the (former) left still tries to save neoliberalism from its right-wing aspect gone “out of control”.

“It’s a shrinking democracy: the right speaks to the right, the left speaks to the right, where is the left’s discourse? What’s even more dramatic is that the whole world is speaking the language of the extreme right; Marine Le Pen is imposing the language, the subjects we talk about.” (Interview with French writer Édouard Louis, https://www.theguardian.com/books/2017/mar/19/interview-edouard-louis-the-end-of-eddy-front-national-marine-le-pen-kim-willsher). He adds: “I’m astonished at the feeble level of discussion that tries to explain the extreme right vote or the FN. Instead of inventing a new debate, people are falling back on historic explanations and errors, saying, ‘people live in misery’, but it’s not just poverty; misery means so much more, it’s anguish about your place in the world.” (…) “[F]or most individuals like my parents, like refugees, politics is still a question of life and death. We must put that idea of life and death back in the centre of politics.”

Indeed, it is both a moral and an ethical sin of the left(ist, ish) establishment, committed on a genocidal scale, to abandon the heritage of socialism and the insights of Marxism in favor of liberal “centrism”, of accepting the “end of history” narrative, the sole purpose of which was to forget about those burdening historical troubles and responsibilities in order to join the right-wing establishment in enjoying the endless privileges of the eternal today.

[55] Guido Alfani, “The top rich in Europe in the long run of history (1300 to present day)”, February 15, 2017, VoxEU.org (http://voxeu.org/article/europe-s-rich-1300). If I were responsible for training an AI to consult on this, I would feel great unease, concern and caution about feeding this data to the appropriate algorithm; I feel it might be devastating to draw conclusions from those facts alone. Is the Truth in the Data, as we have set our new worldview to be?

[56] “Artificial Intelligence (AI) technology is rising in popularity every day. Seemingly all major companies are hopping on the bandwagon, trying to find new and interesting ways to utilize AI. As part of this movement, the Partnership on AI to Benefit People and Society was created in September of 2016. When it was created, Amazon, Facebook, Google, IBM, and Microsoft joined as founding members of the initiative.” (https://futurism.com/apple-joins-amazon-facebook-google-ibm-and-microsoft-in-ai-initiative)

[57] “Funding in AI startups more than quadrupled between 2011 and 2015, to US$681 million from $145 million, and it could reach $1.2 billion in 2016, according to CB Insights.”

[58] “And of course this [deep learning, neural networks, big data] has caused a tectonic shift in the field, so there has been a massive investment by the industry into this field – billions of dollars from the likes of Google and Apple and Baidu. Basically an entire academic field has been privatized and brought in. And sometimes I feel like I am gonna be the only person left studying this, and if you are a representative of one of these companies and you would like to buy up my lab, we can talk later.” (Prof. David Cox, Harvard, “Towards an Artificial Brain”, World Economic Forum 2017, https://youtu.be/f9pOtcadVpc?t=231, 3’50”)

[59] The Italian bombardment of Tripoli from biplanes and dirigibles, which started on November 1, 1911, is considered to be the very first air bombardment in history (if we do not count the previous, and already at the time internationally outlawed, technique of using unmanned air balloons to deliver a single explosive charge). It was celebrated by Marinetti, who, as a reporter from the Balkan Wars, personally witnessed the subsequent bombardment of Adrianople (present-day Edirne), considered to be the very first aerial bombardment of a city, conducted by a single Bulgarian aircraft against the Ottoman troops on October 29, 1912; among other things, it resulted in Marinetti's famous poem/artist book Zang Tumb Tumb.

[60] World Economic Forum: “Introducing a series of poems on the Fourth Industrial Revolution”, 13. 06. 2016 (https://www.weforum.org/agenda/2016/06/introducing-a-series-of-poems-on-the-fourth-industrial-revolution)

[61] He might like some of these feeds too: https://twitter.com/NumbersStation, https://twitter.com/dril, https://twitter.com/UpWorthIt, https://twitter.com/big_ben_clock. For some reason, I like Babby:

[62] Algorithms are here for a reason, and that would perhaps be to answer the “need to amplify human power in order to cope with various environments,” as was (“fascistoidly”, according to Flusser) formulated by Marshall McLuhan. As writer and McLuhan biographer Douglas Coupland reminds us, when McLuhan was working on “The Gutenberg Galaxy” and other writings in the 1960s, envisioning the emergence of our new “electric” world, he was not able to accurately describe the physical interface of what was to become the Internet. For that, he used various metaphors from Joyce's “Finnegans Wake” or from Homeric poetry; but he was able to predict in great detail the overall social changes that were going to be brought about by these technologies, including the Arab Spring, Amazon and Google and eBay, the phenomenon of flash mobs, and even things like “Breaking Bad” suddenly turning television into a true art form.

[63] In addition: Avant-garde, conceptual art, Surkov, Russia (expand)

[64] “The fixed and static nature of the text is very different from the vivid occurrence of the spoken word. As Ong (1992) has suggested, “Recalling sounded words is like recalling a bar of music, a melody, a sequence in time” (pp. 294-295). Spoken words are one-time events: They fade away and disappear. What Snapchat is attempting is to apply technology to visual products to create a fading-away effect—just as spoken words fade away in the air after utterance. Like spoken words, the “digital objects” sent through the application disappear.”(Oren Soffer, “The Oral Paradigm and Snapchat”, Social Media + Society. July-September 2016: 1–4 http://journals.sagepub.com/doi/pdf/10.1177/2056305116666306)

[65] “I try to say in this book [Die Schrift, Göttingen: Immatrix Publications, 1987] the following: when alphabetical writing was invented, let’s say 3500 years ago, a total transformation of our – not only our experience, but even our action was involved. Before the invention of writing, traditional images were used as maps of the world, and the structure of images involves a specific way of looking at the world which is the mythical way. Now when the alphabet was invented, mythical thought gave way to historical critical thought. Because the structure of linear writing is a uni-dimensional, un-directed line. So that, by and by, people started to think historically in a causal way, and in a critical way. Now that this line has been disrupted into points, now that discourse has been substituted by calculus, historical progressive thinking is being abandoned in favor of a new type of thinking which I would like to call, let’s say, a systemic or a structural way of thinking. So that I believe that we are present and witness to a revolution which can be compared to the one which gave origin to history. In my terminology I say that before the invention of writing, people thought in a prehistoric way, after the invention of the alphabet, historical consciousness was elaborated. And now we are in the process of elaborating a post-historical, structural way of thinking.” (Vilém Flusser – “1988 interview about technical revolution”, Osnabrück, 1988, https://youtu.be/lyfOcAAcoH8?t=180, 03’00”)

[66] Using the simple premise that people can be very unpredictable as individuals and comparatively rather easy to compute en masse, it is the sample of your meta-data over time that should reveal more precisely your buying habits, your voting patterns or your terrorist potential; relying on the analysis of pieces of content that might be context- (that is, time-) sensitive is not only less precise, it is vastly more inefficient (that is, it takes time and is expensive). It is also not that important; such vast systems benefit almost nothing from knowing who you exactly are, and gain a lot by knowing a bit better about your averages, so that the insights (and predictions) about your kind of profile can be used to improve the approach to the much larger “target group”.
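The logic this note describes can be reduced to a minimal sketch (the profile names, metadata features and all numbers below are invented for illustration): the system never inspects who you "exactly" are or what your content says, only how far your averaged metadata sits from the averages of precomputed profile groups.

```python
# Invented averaged-metadata profiles for two hypothetical target groups.
profiles = {
    "likely_buyer":   {"night_activity": 2.0, "purchases": 5.0},
    "likely_churner": {"night_activity": 8.0, "purchases": 0.5},
}

def assign_profile(user_averages, profiles):
    """Assign the profile whose averages are nearest (squared distance)."""
    def distance(group):
        return sum((user_averages[k] - group[k]) ** 2 for k in group)
    return min(profiles, key=lambda name: distance(profiles[name]))

# Only this user's metadata averages exist for the system; no content,
# no identity - just a position relative to the group averages.
user = {"night_activity": 7.5, "purchases": 1.0}
print(assign_profile(user, profiles))  # → likely_churner
```

The point of the sketch is the asymmetry the note names: a wrong guess about this one individual costs the system nothing, while slightly better group averages improve the approach to the entire "target group".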

[67] In general it seems to be true that a robotic salesman aware of you as a certain “type” can be more effective in selling you stuff than a random human salesman addressing the actual “you” could be, but the telemarketers of the landline age knew that as well, and developed their analogue tactics to recognize the “type of you”. Like most things with algorithms, this is a matter of scale.

[68] This sense of awareness of the latter group is of most interest to me – those who consciously use the system not to be pushed into a transaction they never intended to undertake but to refine and explore their topic of interest by examining the often incompetent recommendations for potential discoveries, treating the majority of the dumb ones as the consequence of generalization and “averagezation” by algorithms, aware that the “type” or “profile” assigned to them significantly fails to target the actual, particular – or even feasible – purpose.

[69] As stated by Alyssa Sims (New America Foundation) for The Guardian, the exact expressions used by the techno-politicians in charge of this program are “personality strikes” for when they know who exactly is about to be killed, and “signature strikes” when they let algorithms decide who to kill based on the analysis of the “typical behavior” of the “targets”: https://www.theguardian.com/world/2017/mar/13/predator-drone-retire-reaper-us-military-obama

The use of this is, like everything else concerning the technologies we discuss, sharply on the rise; during the Bush administration, 57 drone strikes were conducted under the “personality strikes” policy, a number that the Obama administration boosted to 563 under the “signature strikes” policy; and to continue even faster, during the first several weeks of his presidency, Trump approved drone attacks on average once every 1.25 days, as opposed to once every 5.4 days under Obama: http://blogs.cfr.org/zenko/2017/03/02/the-not-so-peaceful-transition-of-power

The number is expected to be boosted even further starting in July of this year, when the more than two-decade-old technology of the infamous Predator drones will be replaced by the new generation of Reaper drones: http://www.acc.af.mil/News/ArticleDisplay/tabid/5725/Article/1092894/usaf-prepares-for-all-mq-9-force.aspx

[70] It seems that such a system has yet to discover a real terrorist in advance, but that it has already killed hundreds, if not thousands, of absolutely falsely identified “terrorist types”. Those would be the people who did not click on any recommendations, and mainly those not “lucky” enough even to be served one. The number of civilian casualties of drone strikes cannot be precisely established, as the estimations by independent sources often list numbers five to six times bigger than the official US Government reports: https://www.thebureauinvestigates.com/stories/2017-01-17/obamas-covert-drone-war-in-numbers-ten-times-more-strikes-than-bush

[71] What was defined as Leibniz's gap in 1714 – why and how the particular group of biological cells that makes up our brains and bodies becomes conscious – remains unresolved. Despite the resolution of imaging and the processing power of computers available for analysis and all the other research conducted so far, we still just don't know. To quote the woman with one of the greatest vocational titles ever, the eliminative materialist philosopher Patricia Smith Churchland, herself a “believer” that consciousness could be properly understood by analyzing the human brain, commenting on the enthusiasm of some of the attempts in the field: “Pixie dust in the synapses is about as explanatorily powerful as quantum coherence in the microtubules.”

[72] Franco Berardi Bifo, After the Future, AK Press, 2011

[73] Jelena Vesić & Vladimir Jerić Vlidi, “Under The Sycamore Tree – Curating As Currency: Actions That Say Something, Words That Do Something”, The Future Curatorial (Ed: Paul O’Neill, Mick Wilson, CCS Bard, MIT Press, 2015)

[74] Using algorithms, Leon A. Gatys, Alexander S. Ecker and Matthias Bethge from the University of Tübingen developed a way for any image to be rendered, that is, ‘painted’, faithfully in the style of historical artists such as Picasso or Van Gogh. As they commented on their research, titled A Neural Algorithm of Artistic Style (http://arxiv.org/pdf/1508.06576v1.pdf), after showing some rather spectacular initial results: “The key finding… is that the representations of content (the foundation image) and style (of specific artworks) in the convolutional neural network are separable. That is, we can manipulate both representations independently to produce new, perceptually meaningful images.” (http://www.theguardian.com/technology/2015/sep/02/computer-algorithm-recreates-van-gogh-painting-picasso)

[75] “The rich worlds created in the TV series Game of Thrones (GoT) inspired a computer science class at the Technical University of Munich (TUM) in Germany. (…) Has Jon Snow survived season five? Who is going to die next? The students used an array of machine learning algorithms to answer these questions. The algorithm, which accurately predicted 74 percent of character deaths in the show and books, has many surprises in store, placing a number of characters thought to be relatively safe in grave danger.” (Technical University Munich, “Computer algorithms predict next characters to be eliminated in ‘Game of Thrones’”, April 20, 2016, https://phys.org/news/2016-04-algorithms-characters-game-thrones.html)

[76] Bruce Lee, Enter The Dragon – Destroy The Image (https://youtu.be/RBnIbqW6ZhM)

[77] It takes approximately 500 milliseconds to click a computer mouse, or to snap your fingers. But an algorithm lives in nanoseconds (500,000,000 nanoseconds, in this case). Today, an algorithm can, for example, execute a single electronic trade in hundreds of nanoseconds. From a human perspective, an ‘infinite’ number of algorithmic ‘thoughts’ could happen in the time it takes to make that one click, and algorithms are heading towards the horizon of the picosecond, one trillionth (one millionth of one millionth) of a second. This (poor) exercise in math is not here to finally explain why we call the smaller rectangles smartphones, but to support the fact that the world of algorithms and the world of humans are very much apart.
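The arithmetic of the note above can be spelled out in a few lines (the 500-nanosecond trade time is an illustrative round figure standing in for "hundreds of nanoseconds", not a measurement):

```python
# Comparing the human time scale of one mouse click with the
# algorithmic time scale of one electronic trade.
click_s = 0.5      # one click: ~500 milliseconds, in seconds
trade_s = 500e-9   # one trade: ~500 nanoseconds, in seconds

trades_per_click = round(click_s / trade_s)
ns_per_click = round(click_s / 1e-9)

print(trades_per_click)  # → 1000000 trades fit into a single click
print(ns_per_click)      # → 500000000 nanoseconds per click
```

So within one human gesture, on the order of a million algorithmic "thoughts" can pass, which is the gap between the two worlds that the note points at.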

[78] Time, for humans, has some special qualities that are derived from both its inevitable flow and its relative speed. That is, we understand time in historical categories: as past, as future, as something happening now. Most of the things and processes out there, including our own existence, we may observe either as diachronic or as synchronic phenomena, according to the precise aim of our observation; but algorithms are, it seems, able to endlessly slice our ‘now’ into ever smaller units of time which they can then manipulate at an ever-faster rate. Time is also an essential quality if you ask an algorithm. In the processing unit serving the computing of algorithms, everything is happening now, in ‘real time’; this is also where the future may briefly happen for an algorithm, in the shape of predictions and expected values, as something that can be computed now.

[79] With its real-timeness, its inability to exist either in the past or in the future, the way in which algorithms function does mimic, in a way, the essential quality of the process of curating – it tries to replicate its outcome, characterized by that one gesture which has the ability to condense what was and what will be into what is. For it is precisely by this act of gesture that curation is best explained; by observing and analyzing such a gesture, by forensically tracing back its elements, by seeing its grand finales as outcomes of long threads of research, or by understanding that the moments of drowsiness are caused by long sleepless nights spent examining a certain conundrum, what is curated unfolds its world before and behind us. And only in the moment when such a curatorial act happens can a “tremor of consciousness” occur, to paraphrase Lacan, in which a temporary, empathic understanding arises of what curation is and of all that it entails.

[80] Walter Benjamin wrote this in 1928 and published in German as a part of the collection titled Einbahnstraße (Berlin: Rowohlt). Available online at http://archive.org/details/Einbahnstrasse. The translation in English was published in the collection One-Way Street and Other Writings (London, NLB, 1979). (p. 89)