Showing posts with label philosophy. Show all posts

Monday, 20 July 2015

Virtual Travel

I went to a family gathering this weekend, about two hundred miles from the places I call 'home'. This is actually the first time this decade I've been that far from home (I don't get about much). The journeys there and back were seven and five hours respectively in a car, and were the longest car journeys I've been on in the same period.

Travel is starting to become a theme of my writing about video games. Many of my favourite in-game moments have to do with travelling through virtual worlds. A big part of my excitement for upcoming games like Tales of Zestiria, Xenoblade Chronicles X and Final Fantasy XV is that they offer vast new worlds to explore.

By contrast, I really don't like to do anything in the real world that's vaguely analogous to my virtual explorations. I don't like hiking, I don't like driving (or being a passenger, since I can't legally drive myself anywhere), and I generally don't like to travel. There are a variety of reasons, but the biggest is quite simply that real travel is a lot of effort.

I want to see mountains like the ones my dad likes to climb without getting sore feet. I want to experience the vast sweeps of landscapes like the ones we drove through this weekend (some of which were very pretty) without the steadily-hardening crick in the base of my spine. I want to float through the world as effortlessly and intangibly as a videogame camera.

And I thought until this weekend that there was no harm in this, provided I kept to my lane and didn't get too whiny when I actually do have to travel. But Actually Travelling, and looking at the landscapes I travelled through with the same critical eye I've been training to look at videogames with, put me ill at ease.

Games, by the limits of their technology and the demands of their audience, compress distance. Sure, you can walk for some hours in a straight line without touching the sides of some recent games. But you could walk the real world for the same length of time and in many places not even leave the valley you start in. I found myself wondering, as we drove over the crest of a hill and the horizon retreated on Friday afternoon, how dishonest it is to indulge in this, and how harmful.

It would be a pretty shallow critique to say that the shortening of distance in games is straightforwardly misrepresentative and creates harmful expectations of travel - no better than tired old arguments that the mere presentation of violence is sufficient to induce people to be more violent. It's more complex than that.

But games are so often about the mastery of space, the individual eventually rising to an effortless, unchallenged mobility. There's no better example of this than the 'airship moment' in JRPGs, the point at which you've explored most of the world on foot and, to avoid forcing you to retread old ground, you get a tool that allows you to hop or float to wherever you want, bypassing even the abstractions that are supposed to add labour and time back into the earlier compressed journeys.

This is all a bit unfocussed and musing-y (which is why it's here and not on my actual games blog). The problem probably has more to do with the way games construct mastery than the way they handle travel and distance. But this weekend, looking out at the same sweeping vista for twenty miles and realising it would take ten times as long to traverse on foot as it would in-game, I had a moment of very sharp discomfort. I hope I'll be able to hold that in mind as I develop my critical ideas on this theme.

Wednesday, 1 July 2015

Feels and Reals


'Reals over feels!' and variants thereof have become a slogan for the internet right in the ongoing shitstorm over whether 'objective' journalism is a thing. Simplistic as it is, it's an expression of an ideology with roots in the work of some much-celebrated twentieth-century philosophers (at least, those in the English or anglocentric tradition), and indicative of a subtle shift they engineered in how we use language about truth and reality.

The general meaning of 'Reals over feels!' is that one kind of proposition, understood as impersonal and objective, should be considered more true, or more worthy of consideration, than another, understood as personal and subjective. Statements about objects independently of some personal perspective are considered accurate; statements about how something seems to some perspective are inaccurate or invalid.

In the journalistic field, the problem with this is that all journalism, no matter how diligent, is perspectival. This issue generalises, though; there are, quite simply, no non-perspectival facts. This is not the same as saying that there are no facts. How we arrive at the idea that there are objective facts despite the truth of this claim is a topic to which I'll return later.

Let's start with the basics. We have known since Descartes that the only sure foundation for knowledge is conscious experience – that the only thing that absolutely cannot be doubted is the current content of our conscious fields. I may be only hallucinating that I sit in front of a computer right now, or the image of the computer that I see may be the product of a Matrix-like simulation (or, in Descartes' scenario, a trick played by an evil demon), but I cannot be mistaken that I seem to see the computer; I cannot be mistaken that a computer appears before me.

Absolutely every other thing we take ourselves to know is known by inference, and every means of inference we have, we know to be periodically fallible[1]. So however reliable a claim of knowledge beyond immediate awareness may be, we know there must still be some small chance that it is wrong.

Why emphasise this? Well, as the physical sciences developed in the wake of Descartes, the gulf between direct awareness and the theories of the sciences widened dramatically. The seventeenth century gave us cells, the nineteenth atoms and the twentieth quanta, so tiny that any conventional notion of being directly conscious of them goes out the window.

My point is not to dismiss the theories of science, not at all; just to remind us where they come from. What I want to repudiate is the relatively recent philosophical contention that because direct awareness is of appearances that often do not correspond closely to the equivalent deliveries of scientific theory, it is inherently misleading.

The difference is subtle. No-one denies that appearances can be deceiving. The question is whether appearances are inherently deceiving. The shift from the former claim to the latter was accomplished in philosophy largely during the rise of Bertrand Russell, and the fall, or at least pushing-aside, of the British Idealists and Continental Phenomenologists whom his dominance displaced.

Idealism (in this, rather than the political, context) is the metaphysical and/or epistemological theory that the world, or at least our knowledge of it, is fundamentally based on experiential/conscious facts. Phenomenology is a philosophical method that requires one to start from what is observed most directly, explain that, and then build on the explanations. Both these positions have clear roots in the idea that conscious awareness comes first.

Russell, along with friends like G.E. Moore and disciples like the young A.J. Ayer, held that these approaches had given rise to obscure and absurd metaphysical systems, convoluted theories that were of little use in actually explaining things. It's true that the phenomenologies and idealisms of the 19th century were complex, but then so is the world[2]. Russell in particular held logical clarity to be the most important virtue of philosophical systems, and was willing to ignore or bury a great many issues that would not submit to logic-based treatments[3].

What all the theories discussed so far, including those of Russell and Moore, have in common is that they are attempts to explain the relationships between the experiences of different people. We take my experience, now, of sitting in front of a computer, to be accurate precisely because if you came and took my seat, you would have a similar experience, and because if I come back to this room later today and sit in this same seat, I will have another, similar, experience.

The standard early modern philosophical explanation of this consistency would be that there is an object, the computer, in this room that produces computer-like experiences for any who sit in front of it. With modern science, however, we know that the object in this room can't be simply described as 'a computer'; it's an immensely complex structure of polymers and electronics, each themselves complex structures of molecules, which are made up of atoms, which are in turn complex structures of subatomic particles and fields that can be described mathematically and logically but not terribly intelligibly.

This sets up a huge discrepancy between the experience of sitting at the computer and the scientific description of what's going on (it gets even worse if you try to factor in a scientific description of me, and/or the process of perception). The tendency of the (anglocentric) philosophers of the 20th century was to argue that this meant the experience was deceptive and false, while the scientific description was accurate and true.

But as I argued here, the scientific description is actually much less useful than the experiential one in most cases. Our knowledge, the everyday stuff that enables us to find our way to the shops and so on, is overwhelmingly experiential in character.

And this hints at another reason for preferring the experiential to the scientific; where scientific knowledge is useful, it is only because of some effect on experience that it enables us to generate. The microscopic precision that allows Intel to inscribe GHz CPUs on a postage stamp is only worth achieving because computers enable wondrous new experiences, whether that means exploring the Mushroom Kingdom or establishing personal relationships that stretch around the globe or even just being able to do your own accounting without needing pages and pages of maths paper.

In truth, all values – not just the emotional or aesthetic, but every kind of utility as well – are values only from some human[4] perspective. Feels are reals, both in the general sense that perspectival facts are real (because they are the only kind of fact), and in the specific sense that emotional perspectives are important, because it is those emotional perspectives from which the values that make anything at all that we do worthwhile spring.

The only question that remains is why, if all this is correct, some people are so convinced that there is an 'objective' perspective, one that is right above all others. I said above that the issue is about the relationship between experiences; Russell and his colleagues came to see the explanation for the relationship as more fundamental than the experiences it relates, but there is another process at work here too.

To examine the relationship among experiences generally, you must have a set of experiences to generalise from. Ideally, as indeed the theory behind the scientific method suggests, this sample will be representative; if it is not, there is a much bigger chance of missing something important. In practice, you cannot include experiences of which you know nothing.

And the philosophers I've discussed here had relatively narrow ranges of experience to draw on. Descartes lived most of his life in and around the courts of early modern Europe, as did other influential philosophers of the period like Locke, Hume and Kant. Russell and his cronies were ensconced in the ivory towers of British academia (and I can tell you from personal experience just how narrow the windows there are).

Not only did these men have a limited range of experiences to draw on, they either had or have subsequently gained a great deal of influence to pronounce with. Their positions, social class, shared ethnicity and so on have made them Great Men with Important Views; people who have differing opinions seem unimportant by contrast. This actually applies to their historical opponents every bit as much as to marginalised people today; one hardly hears the names of Bradley and Meinong in philosophy classrooms anymore.

The tendency of self-proclaimed logical thinkers to exclude dissenting opinions from both history and contemporary debate should by this point be sadly familiar. It's a self-reinforcing process; when dissent has already been shut out once, it is much easier to dismiss a second time. People clinging to Russell's model now, a hundred years down the line, may not even realise how trapped they are in it[5]. Open-minded reflection on the views of people from different backgrounds and demographics is the only antidote.

In summary, the idea of an 'objective fact' is a mirage. The only indubitable propositions are subjective in character. The philosophical models that allow us to link them together into a coherent world are at best intersubjective, a negotiation shaped by social pressures much more than 'purely intellectual' considerations (if there are any such things).





[1] It's worth stressing at this point that awareness is not purely sensory – memory is a kind of awareness, so your memory of an event (though not the event itself, if it is in the past) may serve as the foundation for some knowledge, or at least reasoning.

[2] There's an interesting comparison here with the current state of quantum physics. Its more phenomenological elements – the mechanics that describe and predict actual measurements – are the most accurate science yet developed by man, but the interpretations that seek to explain why those relationships exist... well, here's the Wikipedia page on interpretations of quantum mechanics. Just count how many different interpretations there are; don't try to wrap your head around them all.

[3] This Spockish attitude persists today in the myriad ways our culture insists on quantifiability and computability. Things (like emotions) which are messy to compute tend to be regarded with suspicion.

[4] In the interests of inclusion, this should be 'appropriately human-like perspective', really. We want to be able to extend values and valuation to sentient aliens, sophisticated animals and so on.

[5] Which doesn't excuse their lack of awareness, since they're also likely to have more spare time and money to support self-reflection with than other social groups.

Tuesday, 31 March 2015

Boiled Potatoes and the Analytic Method, part 7

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only in the time since, as I've started to study critical perspectives from gender and race discourse in depth, that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

In part 4, I surveyed the rise of analytic philosophy and attempted to show how it rejects the spiritual and the emotional.

In part 5, I evaluated analytic philosophy and the limits of its conception of meaning.

In part 6, I identified the limit that the analytic method places on discourses of morality and responsibility. 

Part 7: What Pieces Are You So Scared Of?

I wasn't expecting to write this part in quite the mode I'm in at the moment. I've been feeling generally pretty positive and upbeat so far this spring, and was looking forward to rounding this series out with a similarly cheerful summation on the theme of healing and embracing a life that values emotional sensation.

But I had a bad weekend in a handful of little ways that left me feeling a bit on the low side. As ever when I get on a downer, I started to pull back from things, and especially from people. Anxiety sets in, loading every potential encounter with a hundred disaster scenarios.

There's a numbing process that's part of this, too. It's a defensive reflex, I think, shutting down the mechanisms of self-regard and self-care that identify the problem to avoid having to think about it. We're supposed to solve problems by disinvesting, stepping outside ourselves to look at them 'objectively'. This is supposed to make solutions clearer and less clouded by emotion. But sometimes the problem is the emotion, more than anything else.

In my head, at least, this sits side-by-side with the analytic method. They present themselves to me as the same process. For years I have embraced them as one, and identified all sorts of objective solutions to my problems - limited budget, for example, or shared living environments that aren't well cared for, or (when I was still living at home) the fact that my parents insist on listening to the radio news four times a day, making it completely inescapable.

The real problem, though, is and has always been the denial of inner sensation, the failure to attend to so many important dimensions of well-being, the determination to rise above 'meat'. I am starting to learn, though. Slowly, I'm thawing out.

It starts, perhaps predictably in my case, with music. Music has always offered the most purely emotional experiences of my life - I don't have the theoretical knowledge to analyse it the way I can tackle novels, films and now to a certain extent also video games. It's in music that I'm normally closest to engaging bodily - while I'm a terrible dancer, I'm also basically incapable of standing still when there's music playing.

And I have some incredibly talented musician friends. Look, I know no-one ever takes my music recommendations, but click that last link and listen to Sam's most recent album. Seriously, it's not long, and the last track is the first piece of music in a decade to bring tears to my eyes. It's five minutes that I can get completely lost in. Sometimes it's good to be lost.

Sometimes getting lost is exactly what I need. Some problems don't need the analytic distance of the cartographer - the map is clear, the map is the problem, the map shows you all too clearly what stands between you and the shining horizon. The map tells you what the walk is like, but sometimes you need to stop thinking about that and walk anyway. That's the point at which the map can't tell you anything useful.

Monday, 23 March 2015

Boiled Potatoes and the Analytic Method, part 6

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only in the time since, as I've started to study critical perspectives from gender and race discourse in depth, that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

In part 4, I surveyed the rise of analytic philosophy and attempted to show how it rejects the spiritual and the emotional.

In part 5, I evaluated analytic philosophy and the limits of its conception of meaning.

Part 6: On Taking Responsibility

The concept of moral responsibility has been at the heart of my journey through analytic philosophy. The first philosophical system I encountered which inspired and moved me was existentialism, a position that has moral responsibility as its foundation and centrepiece. This stands in direct opposition to the determinism which characterised much of 20th-century analysis.

Science has long been thought to promise a perfect system for predicting human behaviour (I choose my words carefully here, since few practicing scientists have embraced this belief - it belongs more to the realm of 'popular' or at least establishment commentary). It's a classic modernist tenet, and for a while scientific discoveries did seem to be progressing in that direction. Neuroscience and psychology made great strides through the nineteenth century and into the twentieth.

Still, as early as 1942, Isaac Asimov could acknowledge, with his invention of 'psychohistory' in the Foundation short stories, that a truly determinist understanding of human behaviour was out of reach, prohibited by the fundamentally probabilistic character of quantum physics. This is not to claim that prediction of human behaviour is impossible, only that it can never be done with complete certainty.

Philosophers, who have been arguing with Laplace's demon for two centuries now, were slower to catch on. Even ten years ago, when I was in my first year at university, hard determinism was still discussed as a plausible theory, rather than merely a far-fetched possibility. So great was my determination (hah) to hold onto moral responsibility that I once refused to read an assigned article because of its determinist slant, which is about as defiant as I've ever been towards a teacher.

Determinism and scientism suit the analytic approach. They are theories of absolute knowledge and certainty, of everything in its place, clear and predictable. In denying the possibility of free will, they deny the meaningfulness of the aesthetic, reducing emotions, beliefs and principles to the purely causal.

This outlook has persisted despite the eventual demise of hard determinism. The philosophers who would have been determinists in a previous generation now begrudgingly begin their papers with 'we know that hard determinism is false, but...' and go on to argue that quantum randomness leaves the defender of free will no better off.

The point is not entirely without merit. Fundamental randomness does not guarantee a meaningful freedom of will. Free will theorists have long held that free will is a necessary condition of moral responsibility. The best they can claim from quantum theory is the existence of a narrow sliver of space in which freedom of the will might hide.

More insidiously, the post-determinists have targeted moral responsibility itself, even as free will theorists began to abandon the connection between will and responsibility (the resulting positions are myriad, and better covered in detail elsewhere). The essence of the new determinist argument concerns motivation, understood as whatever mental state in an agent results in their action.

An agent is morally responsible for an action, the argument goes, if their action is a product of a motivation in an appropriate way (that is, not subject to hypnosis or other control). Motivations, though, are products of the agent's character, and said character is a product of the agent's birth and upbringing. If we are to hold agents responsible for their actions, then, it seems that we must hold them responsible for their upbringing and their ancestry. This, the post-determinists argue, is absurd.

And, on the face of it, it does sound absurd. A person cannot literally be responsible for their own birth - this would distort time itself. This argument, the causal argument, seems to present a profound challenge to the existence of moral responsibility.

And yet... Let us come at this from another angle. Critical theories, such as Marxism, feminism and queer theory, recognise differences among birth circumstances as important social phenomena. The concept of privilege is vital to understanding these models, and their well-grounded demands for social justice.

These days, it is common to hear reactionaries crying that it is not their fault they were born male, or white, or middle class, or straight, or cisgendered, or able-bodied, or neurotypical. Strictly, they are not wrong - but then, you will find no serious feminist arguing that they are. What the reactionaries are doing, though, is relying on the same simplistic, causal understanding of moral responsibility as the post-determinist analysts.

Responsibility for the circumstances of our birth, and for the privileges and attitudes that flow from them, is something we take. It is not something we are born to, nor something we are morally entitled to ignore. The essence of maturity, of adulthood, is making this transition; this is the sense in which children are innocent.

Practically speaking, the act of taking responsibility consists in critical self-reflection, the willingness to examine our own behaviour and the attitudes which condition it, and the seeking of ways to change them where appropriate. It is the act of taking seriously our relations, both structural and specific, to others, rather than viewing ourselves as isolated particles predestined to bang into one another with whatever arbitrary results a crude social physics dictates.

Theoretically, taking responsibility requires detaching responsibility from the purely causal, embracing the messy illogicality of a putatively free choice to escape the fist of determinism. The result is not a neat theory; it has little of the clarity that analysis craves. But it is honest and liberating, and above all else it allows a hope for general, meaningful change that the determinist mindset can never offer.

(part 7)

Monday, 16 March 2015

Boiled Potatoes and the Analytic Method, part 5

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only in the time since, as I've started to study critical perspectives from gender and race discourse in depth, that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

In part 4, I surveyed the rise of analytic philosophy and attempted to show how it rejects the spiritual and the emotional.

Part 5: Aesthetics and Anaesthetics

I only recently made the etymological connection between 'aesthetics' and 'anaesthetics', but it's hardly an earthshaking revelation. Aesthetics is (roughly) the study of art, a fundamentally sensory thing; anaesthetics make us numb, insensate. The common Greek root, aisthēsis, originally means perception.

It would not be too far wide of the mark to describe analytic philosophy as anaesthetic. Above all else, what analytic philosophy denies is the subjective. It is the search for objective answers to the grand philosophical questions. The whole analytic construction of 'rationality' opposes the value of personal perspectives, appealing to a transcendent reason which may or may not bear any real connection to the divine intellect of the early modern or classical rationalists.

But analytic philosophy undoubtedly has its advantages. The detachment it advocates can be absolutely crucial for some debates. It's particularly important when responding to criticism; one cannot, after all, take up the point of view of another while clinging to one's own. There are other ways to develop the ability to detach, but practice in the analytic method is a particularly effective and pure one.

(Note: it's far from perfect, as anyone who's ever pricked the ego or threatened the funding of an academic can attest).

And the analytic tradition in philosophy has real triumphs to its name, too; the systems of formal logic developed in the first half of the twentieth century are not just a huge step forward over their arcane predecessors. They are legitimately powerful tools of reasoning, at least within the limits of Gödel's incompleteness theorems, and underpin much of modern computing.

Another important product of the analytic tradition, one that is rather more complicated to endorse, is its discourse on meaning. This is usually what definitions of analytic philosophy centre on, but the analytic discourse on meaning is almost exclusively linguistic - it concerns words and sentences, spoken and written. In aesthetics, on the other hand, languages are only a small subset of things that mean (the first part of this video has a pretty robust introduction to some of these ideas, referencing the omega of analytic philosophy, Wittgenstein).

And in aesthetics, meaning is a very different beast to the meaning of the analysts. It is lived, experienced, bodily, not a clinical study of how words point to things in the world. Analysts have devoted a great deal of work to establishing what it means to say something exists; in aesthetics, the question is simply 'is it felt?'

The modern technophile's - my - obsession with transcending 'meat' (as William Gibson perfectly put it in Neuromancer) is born of this analytic understanding of meaning, thought and reason. We disdain bodily hedonism for the 'higher pleasures' of the mind, and in doing so fail to realise that our 'higher pleasures' are really just contempt for other ways of seeing the world, other tools that are in their own way as valuable and in many ways richer than those we have learned.

Aesthetic comprehension, in a way, is a much more basic part of the human condition than analytic. This, perhaps, explains some of our disdain; a baby can feel, but only a sophisticated adult can 'really think'. That we can believe this while yearning for our lost, or innocent, or joyful childhoods is a testament to the spectacular power of the (archetypally white, male etc.) privileged ego.

(part 6)

Thursday, 5 March 2015

41.62MB

'IT WILL CHANGE YOUR LIFE', thundered a friend of mine on Twitter when I said something about finally getting a smartphone. I took the plunge in January, at last feeling I had enough spare cash - over a long enough time-frame - to make keeping up with a 24-month contract a safe bet. I looked forward to joining the truly modern part of the modern age, the edge where we're beginning to bleed into cyberpunk, the networked species.

And yet, here at the end of my first monthly billing cycle, I've used barely 2% of my 2GB data limit - 41.62MB, to be precise. Obviously, part of that is that I'm new to this device and don't really know what it can do, so I'm not yet using it for many of the things it could be.
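(For the pedants: the 'barely 2%' figure checks out. A minimal sketch of the arithmetic, assuming the contract counts 2GB as 2048MB rather than 2000MB - the conclusion is the same either way:)

```python
# Sanity check of the "barely 2%" claim: 41.62MB used out of a 2GB allowance.
used_mb = 41.62
limit_mb = 2 * 1024  # assuming a binary 2GB, i.e. 2048MB

percent_used = used_mb / limit_mb * 100
print(f"{percent_used:.2f}% of the monthly allowance used")  # ~2.03%
```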

That's comforting, but it's the minor part of this issue. The perspicuous truth is sadder; I'm simply not mobile enough, in my day-to-day life, to get much out of mobile computing. I spend the majority of my time within ten feet of a high-powered PC with a cabled connection to a fibre broadband router that gives me download speeds in the region of 9MB/s. When I'm out of the house, I'm walking to places, most of which are workplaces of one or other kind.

A mobile phone cannot change a stationary life. And while the extent to which I don't get out much is perhaps a bit disheartening, it's equally true that I get most of the things that other people do on their smartphones on my PC. Crucially, it's when I'm on my PC that I'm most connected to the rest of the world.

That's what really matters with mobile communications technology, after all - how much more communication it enables. Being romantic and optimistic, we could say it's how much closer together it brings us, the potential to blur the edges not just of communities but ultimately of individuals as well. Talking about the rise of analytic philosophy last week, I mentioned Leibniz; his philosophical system, the 'monadology', posits human minds/spirits as the building blocks of the universe, with space, time and everything that fills them emerging out of the phenomenal (sensory/felt) tensions between us. I've always liked that image as a way of thinking about humans in networks.

I feel that way even when I feel disconnected from those networks. And maybe that's where the greater sadness resides in my current situation (I realise, writing that, that this is all terribly self-pitying, so sorry, I guess). If I already have all the benefits of mobile technology - something I can no longer deny - then I can't blame any disconnect on technological barriers.

And indeed, I am trying to reach out more, to engage more, to communicate whether from my desk or my pocket. So while the smartphone itself isn't going to change my life, it might yet prompt me to make some changes.

If nothing else, I've been able to spend hours laughing at this game about an enormous, hilariously fragile fish.

Wednesday, 25 February 2015

Boiled Potatoes and the Analytic Method, part 4

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

Part 4: A History of Bertrand Russell's History of Western Philosophy

Bertrand Russell's A History of Western Philosophy is a landmark text. Russell's position as its author - author of one of the most influential histories of philosophy - is a testament to his stature and import in the first half of the twentieth century. If anyone is the father of 'analytic' philosophy, it is Russell; at the very least, he was the first patriarch of its fractious family.

History is written by the victors.

Russell's career was built, founded, on the strength (or at least the success) of his attacks on the philosophies that preceded him; the British idealism of his teachers, and the late phenomenology of Brentano and Meinong that paralleled it. By the end of the 1920s, analytic philosophy was well-established, with Russell at its head.

His opponents were not just defeated, they were dead; Meinong died in 1920, at 67. Bradley, greatest of the British idealists, hung on until 1924. Analytic philosophy delivered triumph after triumph in logic and language, most notably in modernising formal systems for logic which had languished in an Aristotelian mode long into the Enlightenment. Since those formal systems underpin the computation sustaining this blog post, we can hardly reject the analytic approach outright.

But it bears asking what was lost to its triumph. Analytic philosophy is a cold, clinical thing, characterised by abstraction, a devotion to clarity pursued by stripping an object of any context that might introduce ambiguity. This is the mindset that numbed my body to serve my mind. This is the approach that relegates emotion to a backwater, nothing more than a hazard to reason.

The archetypal rationalists of the early modern period - Descartes, Spinoza and Leibniz - would have had no truck with this division. For them, there was no great conflict between mind and spirit (mind and body might be a different matter, but body-as-pertaining-to-felt-emotion would have been spiritual to them, not 'merely animal' if there was such a thing). Their tradition, and the work of those who inherited it, from Kant all the way down to Bradley and Meinong, is one of unified, harmonious worlds in which things can only be understood as they are in relation to one another.

It is very hard, when tackling the metaphysics of the post-Leibnizians, not to chuckle, not to view their spirituality as naive, archaic, a product of a 'less enlightened era' in which people still believed in woolly notions and lacked clarity of thought. It is easy to see these men as clinging to religion in the face of marching progress. To do so is, at the very least, to overlook how many of them flirted with outright heresy in challenging the established religions of their times; Spinoza was expelled from the Jewish community of Amsterdam, and their sanction against him stands to this day.

While it would be presumptuous of me to present this as an account of the origins of modern critical thought, there are definite links; Marx and Freud, for example, both draw on ideas from Hegel which are fundamentally legacies of Leibniz - ideas that are political, economic and psychological cognates of the metaphysics of Bradley and Meinong. Marx in particular went on to influence a broad range of modern critiques not just in matters of economic class but also the discourse around race, gender, sexuality and disability.

Even the fact that, in anglocentric culture, we view 'philosophy' as something esoteric and removed from daily life can be attributed to analytic philosophy, a product of a simplistic and privileging attitude to the academy and 'academics'. What I hope I have shown, or at least plausibly suggested, is that philosophy is lived, is at the foundation of how we live, is stitched through life and culture in a way that is shaped by but also helps shape everyone who participates in it. The shape it has fitted me into has not been kind, and I am in so many ways one of the fortunate ones.

(part 5)

Thursday, 19 February 2015

Boiled Potatoes and the Analytic Method, part 3

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

Part 3: The Problem with British Food

Boiled potatoes are non-food. Without either flavour or texture, they are sustenance without experience, matter without properties, as close to the Lockean idea of the bare particular (no, that's not a euphemism, though I've just realised I missed out on a hell of a joke lecturing about them last week) as occurs in real life.

At least, they are when I cook them. I'm aware that various interesting things can be done with boiled potatoes, but I've never had much success when trying. It all seemed more effort than the marginally-improved results were worth.

I ate a lot of boiled potatoes during my PhD years. Money was tight, and I am a coward in the kitchen. Boiled potatoes are a very safe option for student cooking - it's not like they can get any blander from being overcooked, right? Yes, I could have mixed things up sometimes with rice or noodles, but that would have meant keeping rice and/or noodles in stock - more diversity of food means more money spent.

And I didn't really care that they were bland. I viewed eating - everything related to sustenance, basically - as a chore, something to be minimised. That doesn't just mean the simplest cooking possible, it also means the least attention-demanding food. The blandness itself became a kind of virtue, a way of reacting against my limited means; 'I can't afford good food? Well I DON'T CARE, SO THERE!'.

(Sidebar: I wasn't poor - in all sorts of structural ways, from parental support to a fees grant without which I wouldn't even have been able to start the PhD, I was well-off. But I was strapped for cash on a day-to-day basis for most of the four-and-a-half years).

Lots of other elements of my daily routine were similarly, deliberately anaemic. I didn't care about them. I cared about the things that I thought 'enriched' my life - my work, my studies, my writing, music and gaming. All those things did, of course, greatly enrich my life. They all mattered to me, and still do.

But the quotidian stuff isn't meaningless, and one of the things I learned in counselling was how much I couldn't 'rise above it'. Quite the opposite, in fact - it dragged me down. Initially, I clung to rigid domestic routines to keep my budget under control, a strategy that worked but at a cost. The routine itself became the object of my clinging, though, and therein lay the problem.

When the disruption of decorating began to stress me out last summer, I initially identified my shattered routine as the cause of my mounting anxiety. I felt that if I could just get things back in order, I would stabilise. Only after the discomfort had almost boiled over into meltdown did I start to think that perhaps the routine itself - a rigid sequence of bland, boiled-potato nonexperiences whose only value to me was their place in the order - might be the problem.

I'm not actually eating much more healthily these days (and indeed, I'm still eating some of the same stuff - no more boiled potatoes, though). But I do try to think about what I'd like to eat before making decisions about buying meals. It wasn't hard to start developing actual preferences again.

(part 4)

Tuesday, 3 February 2015

Boiled Potatoes and the Analytic Method, part 2

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

As for what boiled potatoes have to do with anything? Wait and see... 

In part 1, I discussed the specific experience that led me to seek counselling.

Part 2: A Body with No Answers

I'm not going to go through everything I discussed in counselling. Not all of it is relevant, a great deal of it is probably extremely tedious, and the conclusions are likely obvious to all except the protagonist. My counsellor, Jules, was brilliant at drawing me out, getting me to reflect on myself without too much criticality. She didn't try to diagnose or explain, but let me draw my own conclusions and thus internalise each successive realisation.

I learned - perhaps it would be better to say 'reinterpreted' - a lot about myself in those five hours of discussions, but the standout experience is one that happened several times. When I was struggling, either for words or in discomfort, Jules would ask 'How are you feeling right now?' I never had an immediate answer.

In fact, I didn't really have an answer at all. Feelings are embodied things - they happen in the 'gut', the 'heart', sometimes the spine or the back of the neck. Jules would ask me, and (the first few times) specifically direct my attention to bodily sensation. I would frown, expecting an immediate answer (who doesn't know how they're feeling at a given moment?). When that didn't happen, I would interrogate my body, a technique I've learned for fiction writing.

And there would be nothing there. There were physical sensations - the chair, sometimes a headache or a dry throat, ordinary itches or aches - but no emotional ones. What I could identify of my emotions - usually a sense of dread about where a question might lead, how I might be pressured to change my behaviour - were 'head' things, and not sensory. It was the racing-thought, future-chasing anxiety seeded by stereotypes of therapeutic exercises ('Feeling lonely, you say? Okay, GO INTO TOWN AND START ASKING RANDOM STRANGERS FOR A HUG'), something that for all its unpleasantness is almost entirely mind, not body.

Trying to describe the silence in place of expected sensation is difficult at the best of times. I managed to be intellectually disturbed by the solid flatness of my chest - not cold or hard, like stone, just... there, like a well-plastered, plain-painted wall - but couldn't even feel afraid of it.

Occasionally, on the cusp of some realisation, there would be a vertiginous moment, a yawning, teetering on the edge of a bigger, more daunting perspective. That, at least, was a sensation, though mainly around the crown of my skull, sometimes spilling into my eyes as a headrush. It was all I ever managed to report to Jules.

I was self-reflecting the way I'd learned to reflect on everything else - Analysis, with a capital, historical A, a clinical process of standing outside an idea, surgically peeling away its context, tracing each vein and neuron one at a time. There's a time and place for that, perhaps, even when the idea is your own self, but it cannot, must not, be your only paradigm for thinking.

(part 3)

Tuesday, 27 January 2015

Boiled Potatoes and the Analytic Method, part 1

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

As for what boiled potatoes have to do with anything? Wait and see...

Part 1: To Paint a Comfort Zone, First You Must Destroy It

First, the journey itself, or at least the closing chapter of it. This, by the way, is not a dramatic or melodramatic story. Probably it's quite underwhelming. It has no histrionics, no blubbering collapses, and the longest redemptive journey involved walking round the corner from my department building to the university's counselling service.

In proportion to that, it starts with decorating. I'd made a rather optimistic post about how my bout of decorating last summer might go, and things actually went pretty well for most of the process. The schedule was met, and by the Sunday of the week after that posting, I'd finished all the decorating work. All that remained was the carpet, which was to be delivered and fitted, along with a carpet for the adjacent bedroom, on the Monday.

And then, about lunchtime on Sunday, we spotted that the boiler, which is in the other bedroom, had leaked a few spots of water from what looked like a badly-corroded valve.

Obviously, there was no way we were going to put a new carpet into a room where a boiler might need a valve replacing (where, indeed, the whole boiler might turn out to need replacing - it's a pretty old one, though - *touch wood* - still reliable). And it was a Sunday, so reaching the carpet fitter to discuss arrangements with him was going to take a while.

I can't quite put into words how I felt about this (more on this point in a later part). But to resort to tired metaphors, a stone sank into my gut. My chest felt tight, and I found my jaw clenching a lot. Even thinking about the emotional state I was in then is making me feel a bit hollow now. In retrospect, it should have been a warning, but I was a little too self-absorbed to notice (if that even makes sense - too self-absorbed to notice my own emotional state?)

But it gets worse, because I wasn't the person dealing directly with any of the people who needed to be contacted about the carpet and the boiler. All that was handled by one of my housemates, the one whose bedroom had the boiler in it. I tried not to pester her, I promise I tried, but it still got to the point that I almost drove her to tears by passing my stress onto her.

Perhaps oddly, it was the break in tension that brought matters to a head. When she finally managed to get confirmation from the carpet fitters that they would be happy to come and fit just my carpet on the Monday, and do the other one at a later date, it was my expression of relief that finally pushed her to tell me to back off.

I spent the next fifteen minutes shivering in my temporary bedroom, fighting off a panic attack. A mild one, by the standards of some I've had. It was half an hour or more before I even managed to apologise.

AND EVEN THEN, I was only thinking about maybe seeking counselling, not really sure what I should be seeking counselling for. Being a rationalist is no guarantee of always being rational; being a lover of wisdom is no guarantee of always being wise. These revelations have a significant role to play in what's to come, but for now suffice it to say that I was eventually convinced to make good on the counselling idea.

(part 2)

Tuesday, 6 January 2015

Everyday sexism (that I am guilty of) part 2

Actually, this time it's not just sexism - it's every other dimension of privilege as well.

I'm working on a lengthy and complicated thing about white male identity and 'gamers' - my identity, basically. What I'm trying to do with it is address self-identified gamers defensive of our identity on the grounds that it's the only thing we have. I examine why it's possible to feel this, and how to think more broadly about our identity.

But it's really hard to do that without feeling embattled. 'Gamer' is an identity with a lot of really toxic associations. 'White' and 'male' are even worse, both having a long history of oppression and brutality. The urge is always there to get defensive, to rationalise or try to explain away my association with those identities. It's the urge to mansplain, whitesplain etc. (I'm not sure that 'gamesplain' is a thing yet, or just regarded as a combination of 'all the above') - call it xsplaining in general.

The problem with xsplaining is difficult to state succinctly. It's most problematic when a privileged person butts into a conversation about a problematic pattern of privileged behaviour to explain it - even when not done in an explicitly abusive way, this reinforces existing power dynamics by demanding that every conversation be limited by our comfort. It also equates our discomfort with the actual harm suffered by other groups, which is dismissive of their experience as well as flat-out inaccurate.

Another problem is in demanding 'they' solve 'our' problems - the attitude of 'if you don't like it, you tell us what to do'. We're adults. If someone criticises us, we've got to be able to take responsibility for that. Before demanding specific attention from someone - adding to the burden you've already imposed on them - do some googling, or at least some self-reflection, to try to understand the problem.

This goes doubly for issues of identity. The piece I'm writing is an attempt to collect some criticism of 'gamer' and develop from that a better model of the identity. I don't agree 100% with everything I'm quoting, so there is some editorialising, but my primary purpose is not to refute or dispute those criticisms; it's to identify what we can learn from them.

So I have to be very careful of where I'm pointing my arguments. How often do I have to check for xsplaining? Every. Damn. Sentence. That's really what I want to get at here (as with last time out); this isn't something to only worry about occasionally. It's not even limited to times when you're actively engaging with someone from a different background (though that's when it's at its absolute most important).

It's so hard to resist the urge to make excuses, to haggle, to move from addressing the problem to denying it. And this is in an article specifically addressed to our concerns - I'm not trying to join an existing debate (though I am responding to one). It's even harder when engaging with people 'live'. But you can't learn or grow while rationalising; xsplaining serves your ego at the expense of your mind - not to mention at the expense of other people's peace of mind.

Tuesday, 30 December 2014

Cultural Vertigo

A few weeks back, someone challenged me on Twitter to come up with a New Year's Resolution and I came back with 'Open some of the doors I've got my toe in at the moment'. That's a worthy, if slightly trite, answer, but since then I've come up with a better one.

I'll get back to that at the end of this post, by which time I think it will be obvious what I've chosen. I've had a year, particularly this final third of it, of learning a lot. There are personal and professional elements to that, but where I've learnt most, where I've been most challenged, has been from the Twitter timelines of people I've followed because gamergate targeted them.

Gamergate has been and remains terrible, but in listening to those fighting it, I've received a whirlwind tour of critical gender and race theories on a par with the experience I had a couple of years ago as an amanuensis on a university-level Special Educational Needs/Disability Studies course. It's forced me to reexamine a lot of my preconceptions about games, about feminism and civil rights, about myself as a progressive and a liberal, and about my species as a whole.

And I'm starting to realise that there's a characteristic emotional state that accompanies the best of this learning. It's not a pleasant one. It often hits when least expected - this piece challenging the player-centrism of established gaming, for example, challenged me much more than any number of pieces about how reprehensible gamergaters are (because its critique applies to games I love just as much as, say, Hatred). It involves a slight feeling of nausea, and a stronger feeling of panic, of being overwhelmed by how much change might be needed to accept the argument.

I think of it as cultural vertigo. It's one thing to say 'I support diverse perspectives in art!', and another entirely to actually look down from the cultural pedestal (or out from the cultural bubble) of being straight, white and male and catch sight of those perspectives for the first time. It has nothing, of course, on the terror and hurt that straight white men inflict on others worldwide, but those are terrors that I am unlikely ever to experience the like of.

Cultural vertigo isn't comfortable, but it can be inspiring, and it has been a pretty consistent sign of opportunities to make myself a better person. Since there's a lot of work to do on that front, my New Year's Resolution for 2015 is to seek out cultural vertigo as much as I can stand to.

Wednesday, 24 December 2014

Idiot Overload

It's been a truism of human society for centuries that it's easier to sound convincing than to be right - 'a lie has run around the world before the truth has got its boots on' and so on. I want to pick out one particularly egregious manifestation of this, something I've only taken conscious note of recently, though it's probably not new. There may be a 'proper' name for it, but for now I'm just going to call it idiot overload.

Idiot overload happens when there are so many errors, inaccuracies and other logical problems with a statement that it's impossible to refute succinctly. 'Succinctly' here means 'coherently and within the attention span of the relevant platform' - to use a pretty blunt example, most tweets are very difficult to refute in the space of a single reply. (Another example would be how hard it is to write a comprehensive reply to something in a blog comment before other commenters get in and move the debate along).

The sole intelligible claim to emerge from gamergate, 'Gamergate is about ethics in game journalism!' makes a pretty good example. As far as I can see there are at least five major objections to this statement:

1: gamergate more or less ignores actual serious breaches of journalistic ethics, like the review embargo on Assassin's Creed: Unity that meant no reviews were published until 12 hours after release. Sure, maybe some gamergaters shouted about it briefly, but there's been nothing like the sustained campaign of anger directed at gamergate's preferred targets.

2: gamergate has yet to articulate a clear system of ethics of any kind. Ethics are systematic - not just a collection of arbitrary laws, but a coherent framework that allows the extension of those laws into situations unforeseen by their authors (again, unsophisticated example, but the provision of the U.S. constitution for later amendments is a version of this).

3: unethical behaviour in the promotion of an ethical system is hypocrisy, and self-invalidating. If your behaviour is unethical, you are not supporting ethics of any kind, no matter how you shout about it. The ethics of what is and isn't OK in acts of protest are complex, but without thoroughly engaging in a discourse on that topic you have to err on the side of caution - one thing gamergate certainly hasn't done.

4: game journalism is, by and large, critical journalism rather than reporting. It's not purely descriptive. In the early days of gamergate, there were attempts by various gamergaters to codify the journalism they wanted from the games press - a focus on facts that could be presented honestly, without (or in resistance to) financial incentives from the development side of the industry, and the removal of opinion. The problem with this is that that's not what game journalism is or ever was for. Yes, factual journalism has always been a part of it, in reports and previews of what's coming up in the near future, and in discussions of hardware, but the majority of game journalism is reviews, and reviews are always going to be a matter of opinion. Want a broader perspective than any one person's opinion? Read a bunch of different reviews.

5: the feminism that gamergate actually spends most of its time fighting is itself ethics. What Anita Sarkeesian, Brianna Wu, Leigh Alexander et al have been campaigning for is more ethical weight in gaming. I suppose in that sense one could argue that gamergate is about ethics in gaming in that it's about keeping ethics out of gaming, but I don't think that's the claim gamergaters are making.

(sidebar: no, feminism is not 'an ideology', at least not in any sense that implies it isn't ethics. People describing feminism as an ideology are generally trying to paint it as a matter of opinion, when many of the most important concerns of feminism - rape statistics, wage gaps and so on - are matters of clear, repeatedly-proven facts. Just as human rights are an ethical system, not an ideology, so it is with feminism)

So there you go. It took me 473 words to give a (very brief) sketch of the objections to a 7-word statement. I may be missing some objections outright; I'm certainly missing key details from all of those points (to say nothing of evidence and examples, but this is one blog post and I am only human). It's just not possible to give an organised, ordered summary of the objections to 'it's about ethics in game journalism' (at least, one that provides any more detail than 'NO') in a short space of time - human beings don't read fast enough.

I don't have a solution to this one, I'm afraid. Other examples include 'if global warming is real why is winter still cold?' and 'evolution must be wrong, my grandparents weren't monkeys'. When there's just too much to argue against, you have to rely on the general audience understanding enough to enumerate the problems themselves, which has never yet been a safe bet (though of course we can work towards that in the long term).

Thursday, 18 December 2014

Lecturer

No-one who knew me during my first year at university, or to be completely honest at any point in the five-to-eight years before that, would be surprised to hear that I enjoy lecturing. During that time, I lectured indiscriminately and at length to anyone who paused to listen or gave me a reason to open my mouth.

The years (yes, all eight of them) have humbled me somewhat. When my head of department asked me back in August if I'd be willing to do some lecturing this term to fill a gap left by a departing staff member, I was paralysed with something quite a lot like fear. On the one hand, I knew it would mean more money, money I do still need to be quite careful about. On the other, it meant standing up in front of dozens of undergrads - any one of them potentially as uppity as I was at their age - and desperately pretending to be an expert.

I did not feel qualified to be an expert.

I also didn't feel that public speaking could be a strong point of mine. Historically, I've done a much better job expressing myself in writing than verbally (which, long-time followers of this blog will realise, implies some truly horrific moments of verbal misexpression). I'm not good at improvising and I know from long, painful experience that an over-planned lecture, particularly one with a tight, complete script, is a miserable waste of student time.

But in the end I took the job. I arranged my share of the lecturing so I wouldn't have to be the first member of the team to go in front of the students, did my best to prepare, and fretted until it was my turn. I felt certain that I'd panic, or stumble over my tongue and say something completely false, or that I'd do that thing nervous speakers do where they steadily speed up and up and up and turn everything into horrible run-on sentences that go on and on and on forever until you're really desperate for the end of this paragraph right now aren't you?

Suffice it to say, none of that actually happened. My lectures weren't perfect - I flubbed jokes, ran too long in some sessions and too short in others - but I've not seen a catastrophic drop-off in attendance and no-one's made a formal complaint against me, so at the very least there have been no disasters. And it's actually been quite fun. Well, not preparing PowerPoint presentations for each session, that sucks, but the rest of it.

Probably some of this is unhealthy - the gratification from playing the expert for an hour or two a week (sidebar: turns out that compared to people with a decade's less experience of philosophy, I am an expert and don't need to do much pretending) - but unless and until someone complains I'm not going to overanalyse. For the most part, I'm just pleased that I've now been asked to do more lecturing next term - though this does mean I'm going to be just as busy and thus have just as little time for blogging, which is why new content here has been a bit sluggish in recent weeks.

Thursday, 4 December 2014

Everyday sexism (that I am guilty of)

I was walking across campus on Monday and it so happened that the person in front of me on the path was female and attractive. I made a conscious effort not to ogle, and yet, when she was greeted by a group of her classmates waiting outside a lecture hall, I still had this weird moment of cognitive dissonance. Suddenly, she was a human being interacting with human beings, rather than a shape taking up a central chunk of my visual field.

Was this entirely a sexist response? I don't know - it was, after all, Monday morning and I'd just been giving a lecture on semantics for quantificational logic, so I was a bit spaced out, and maybe there's an argument that I was just startled by the intrusion of voices in what had been a quiet environment - but I think I can tell the difference between sensory and cognitive startlement. My point is this: it's that easy (for me as a man) to dehumanise a woman, even despite a conscious effort not to. That's how insidious sexism can be.

Another example, this one perhaps a little less everyday, but more stark. Over the last couple of days, Crash, the dog belonging to game developer and favourite gamergate target Brianna Wu, took severely ill and died. Wu mentioned this on Twitter and was barraged with abuse in the form of mockery of Crash, photos of mutilated dog corpses, and at least one fake account for Crash proclaiming 'lol I'm going to die soon'.

All of which is horrible and reprehensible, but that shouldn't need saying. What does need saying is this: I felt a new level of shock and outrage at this kind of abuse, compared to the 'usual' abuse Wu has been receiving (threats of rape and murder against her, her family and friends, and her business, which among other things drove her from her home).

To put it bluntly: abuse aimed at the dog had more emotional impact on me than abuse aimed at the woman.

Perhaps it's tempting to say something along the lines of 'well, yes, but the dog's innocent, gamergate shouldn't be dragging a pet into this'. That's stupid, though, because it implies that Wu is in some way not innocent. That she deserves some part of what's happened to her, which is bullshit.

My point, guys (and I do mean guys) is this: these subconscious psychological mechanisms don't go away when we decide to try to care about other people. I don't know whether these two responses are things I've learned or are innate in some way, but they're habits of thought so deep that even when trying to be conscious of them I miss them working.

And they are responses I am responsible for. Even if I was born with the tendency to think about women this way, as long as it has the power to affect my behaviour, I am responsible for making sure that it doesn't. I am responsible for making sure that my poisonous habits of thought don't spill out into the real world.

That requires an effort of constant vigilance, regardless of whether it's Monday morning and I've just come from lecturing on difficult logic. And it really matters, because (for example) a huge part of the problem of stopping gamergate, and of taking it seriously as something that must be stopped, is a lack of empathetic understanding of what life is like for gamergate targets - of how damaging harassment can be.

It's exactly the kind of empathy that I've failed at (at least) twice in the last week which we (men) most need in order to recognise, understand and tackle this problem.

Wednesday, 19 November 2014

Getting what you wish for

Back in early July I wrote this post, basically bemoaning the fact that video games aren't 'taken seriously' by our culture. Since then some very serious things have happened in gaming (i.e. gamergate), and so one negative element of gaming has come to be taken a little more seriously, but that's not what I'm going to talk about today. I refrain from talking about gamergate primarily because I've yet to think of anything I have to add to the discussion that hasn't already been said - I do, of course, wholeheartedly condemn gamergate itself.

Today, I have something rather more optimistic to offer. I'm now part of the planning process for a university-level 'gaming and interactive media' course (title not final) within the University of Liverpool's School of the Arts (backstory: UoL is my alma mater and now my primary employer; I lecture and teach in the Department of Philosophy, part of the SotA). We had our first departmental discussion of possible modules/topics this morning, and I got clearance to engage in some informal public consultation.

This is exactly the kind of thing I was yearning for when I wrote that post back in July. It's not a technical course - we're not expecting to cover programming, hardware design etc. though we may build links to courses that will - but a cultural/humanities one. The process will put gaming closer to film, television, theatre, literature and so on in terms of serious cultural consideration.

But this is a very new field, and if we're clear about one thing so far it's that we're not clear about much. I'm looking for suggestions of issues that a course like this - a humanities course, one approaching games as cultural artefacts - could or should address. If you're a gamer (either in the sense of 'someone who plays games' or of 'someone who identifies primarily as a gamer') what would you like to see discussed?

Some issues are obvious; for example, there's no clear definition of 'a game' or 'a video game', and phenomena such as augmented reality gaming and the gamification of education make the definitional question profoundly interesting. There are complex issues relating to authorship within videogames, too; who is the author of a narrative which is directed as much by the player (the audience) of a game as its developers? And, given everything that has come to the surface over the last three months, it would be negligent not to discuss feminist critiques of games (along with other dimensions of privilege - race, sexuality, ability etc.).

Not everything need come under the banner of philosophy. Our School of the Arts includes Music, English, Architecture and Communications/Media Studies, and there's a joint meeting in three weeks' time where we'll all be putting things forward. Any suggestions you can offer for what we should cover will be most welcome.

Wednesday, 12 November 2014

Fully Automated Luxury Communism

Go watch this. It's not necessarily a perfect sociopolitical model (after all, it's only an 8-minute video), but it's an interesting idea. The claim is basically that automation means that very soon - in the next 20 years or so - no-one will need to work longer than 12-15 hours in a week (note: we're talking about a quite complex notion of 'need' and a conservative estimate of the effects of automation in that sentence - but there's nothing in that to render the claim implausible).

And once you've watched the video - probably well before the end, in fact - you'll immediately be able to hear in the back of your mind the voice of your current political leadership (at least in Britain and America) raising the following complaint:

'Without the incentive to work more and work harder, everyone will just sit around all day doing nothing!'

Now, anecdotes are not data, but I believe I can provide at least one counterexample to that objection. For the last two-and-a-half years (longer, depending on how you count it), I have been in the fortunate position of working an average of less than 15 hours a week, and having living costs small enough and a wage rate good enough to make ends meet.

In that time, I've completed a PhD (including all the thesis-writing and most of the specific research), written well over 300,000 words of original fiction (seasons 2 and 3 of The Second Realm, two NaNoWriMo projects and a handful of short stories/novellas), written and recorded an EP of original music (which you shouldn't listen to because it's terrible but no-one can say I didn't put effort into it), decorated half a house, and studied a huge amount of stuff about the world, from feminist discourse to critical history to the publishing industry.

Perhaps none of that sounds very worthwhile (because it didn't make me any money, perhaps? But then what did I need the money for, if my living expenses were met?), but even the most cynical person could not accuse me of inactivity. And, since all these activities are things I value, no-one could accuse me of not trying to better myself (whether or not you think I'm barking up the wrong tree in terms of what I value).

The other objection to my example would be to suggest that I'm in some way exceptional - that most people in my (again, extremely fortunate) position would not behave the same way. There are two possible responses to that. The first is to take the objection as claiming that I possess some rare intrinsic virtue of productivity - and anyone who's seen me on an off day can tell you immediately that this is a particularly stupid idea. I am possessed of no exceptional will or drive at all, only a rare freedom to express a very ordinary human will.

The second response is to take the objection as making a purely statistical claim, that there is a body of data from which I am the exception. The problem with this is that no such body of data exists - my circumstances are simply too rare. There are very few jobs where you can make even as much as I do from as few hours as I work, and I have exceptionally low costs of living.

In fact, the only people who work less than I do for more money are the very heirs and old-money institutions most likely to be found making this argument in the first place. So if I am the exception, it suggests that they are, in fact, a bunch of lazy tossers. News to no-one, perhaps, but nice to have it confirmed in their own arguments...