I went to a family gathering this weekend, about two hundred miles from the places I call 'home'. This is actually the first time this decade I've been that far from home (I don't get about much). The journeys there and back were seven and five hours respectively in a car, and were the longest car journeys I've been on in the same period.
Travel is starting to become a theme of my writing about video games. Many of my favourite in-game moments have to do with travelling through virtual worlds. A big part of my excitement for upcoming games like Tales of Zestiria, Xenoblade Chronicles X and Final Fantasy XV is that they offer vast new worlds to explore.
By contrast, I really don't like to do anything in the real world that's vaguely analogous to my virtual explorations. I don't like hiking, I don't like driving (or being a passenger, since I can't legally drive myself anywhere), and I generally don't like to travel. There are a variety of reasons, but the biggest is quite simply that real travel is a lot of effort.
I want to see mountains like the ones my dad likes to climb without getting sore feet. I want to experience the vast sweeps of landscapes like the ones we drove through this weekend (some of which were very pretty) without the steadily-hardening crick in the base of my spine. I want to float through the world as effortlessly and intangibly as a videogame camera.
And I thought until this weekend that there was no harm in this, provided I kept to my lane and didn't get too whiny when I actually do have to travel. But Actually Travelling, and looking at the landscapes I travelled through with the same critical eye I've been training on videogames, put me ill at ease.
Games, by the limits of their technology and the demands of their audience, compress distance. Sure, you can walk for some hours in a straight line without touching the sides of some recent games. But you could walk the real world for the same length of time and in many places not even leave the valley you start in. I found myself wondering, as we drove over the crest of a hill and the horizon retreated on Friday afternoon, how dishonest it is to indulge in this, and how harmful.
It would be a pretty shallow critique to say that the shortening of distance in games is straightforwardly misrepresentative and creates harmful expectations of travel - no better than tired old arguments that the mere presentation of violence is sufficient to induce people to be more violent. It's more complex than that.
But games are so often about the mastery of space, the individual eventually rising to an effortless, unchallenged mobility. There's no better example of this than the 'airship moment' in JRPGs, the point at which you've explored most of the world on foot and, to avoid forcing you to retread old ground, you get a tool that allows you to hop or float to wherever you want, bypassing even the abstractions that are supposed to add labour and time back into the earlier compressed journeys.
This is all a bit unfocussed and musing-y (which is why it's here and not on my actual games blog). The problem probably has more to do with the way games construct mastery than the way they handle travel and distance. But this weekend, looking out at the same sweeping vista for twenty miles and realising it would take ten times as long to cross on foot as it would in a game, I had a moment of very sharp discomfort. I hope I'll be able to hold that in mind as I develop my critical ideas on this theme.
Monday, 20 July 2015
Wednesday, 1 July 2015
'Reals over feels!' and variants thereof have become a slogan for the internet right in the ongoing shitstorm over whether 'objective' journalism is a thing. Simplistic as it is, it's an expression of an ideology with roots in the work of some much-celebrated twentieth-century philosophers (at least, those in the English or anglocentric tradition), and indicative of a subtle shift they engineered in how we use language about truth and reality.
The general meaning of 'Reals over feels!' is that one kind of proposition, understood as impersonal and objective, should be considered more true, or more worthy of consideration, than another, understood as personal and subjective. Statements about objects independently of some personal perspective are considered accurate; statements about how something seems to some perspective are inaccurate or invalid.
In the journalistic field, the problem with this is that all journalism, no matter how diligent, is perspectival. This issue generalises, though; there are, quite simply, no non-perspectival facts. This is not the same as saying that there are no facts. How we arrive at the idea that there are objective facts despite the truth of this claim is a topic to which I'll return later.
Let's start with the basics. We have known since Descartes that the only sure foundation for knowledge is conscious experience – that the only thing that absolutely cannot be doubted is the current content of our conscious fields. I may be only hallucinating that I sit in front of a computer right now, or the image of the computer that I see may be the product of a Matrix-like simulation (or, in Descartes' scenario, a trick played by an evil demon), but I cannot be mistaken that I seem to see the computer; I cannot be mistaken that a computer appears before me.
Absolutely every other thing we take ourselves to know is known by inference, and every means of inference we have, we know to be periodically fallible. So however reliable a claim of knowledge beyond immediate awareness may be, we know there must still be some small chance that it is wrong.
Why emphasise this? Well, as the physical sciences developed in the wake of Descartes, the gulf between direct awareness and the theories of the sciences widened dramatically. The seventeenth century gave us cells, the nineteenth atoms and the twentieth quanta, so tiny that any conventional notion of being directly conscious of them goes out the window.
My point is not to dismiss the theories of science, not at all; just to remind us where they come from. What I want to repudiate is the relatively recent philosophical contention that because direct awareness is of appearances that often do not correspond closely to the equivalent deliveries of scientific theory, it is inherently misleading.
The difference is subtle. No-one denies that appearances can be deceiving. The question is whether appearances are inherently deceiving. The shift from the former claim to the latter was accomplished in philosophy largely during the rise of Bertrand Russell, and the fall, or at least the pushing aside, of the British Idealists and Continental Phenomenologists whom his dominance displaced.
Idealism (in this, rather than the political, context) is the metaphysical and/or epistemological theory that the world, or at least our knowledge of it, is fundamentally based on experiential/conscious facts. Phenomenology is a philosophical method that requires one to start from what is observed most directly, explain that, and then build on the explanations. Both these positions have clear roots in the idea that conscious awareness comes first.
Russell, along with friends like G.E. Moore and disciples like the young A.J. Ayer, held that these approaches had given rise to obscure and absurd metaphysical systems, convoluted theories that were of little use in actually explaining things. It's true that the phenomenologies and idealisms of the 19th century were complex, but then so is the world. Russell in particular held logical clarity to be the most important virtue of philosophical systems, and was willing to ignore or bury a great many issues that would not submit to logic-based treatments.
What all the theories discussed so far, including those of Russell and Moore, have in common is that they are attempts to explain the relationships between the experiences of different people. We take my experience, now, of sitting in front of a computer, to be accurate precisely because if you came and took my seat, you would have a similar experience, and because if I come back to this room later today and sit in this same seat, I will have another, similar, experience.
The standard early modern philosophical explanation of this consistency would be that there is an object, the computer, in this room that produces computer-like experiences for any who sit in front of it. With modern science, however, we know that the object in this room can't be simply described as 'a computer'; it's an immensely complex structure of polymers and electronics, each themselves complex structures of molecules, which are made up of atoms, which are in turn complex structures of subatomic particles and fields that can be described mathematically and logically but not terribly intelligibly.
This sets up a huge discrepancy between the experience of sitting at the computer and the scientific description of what's going on (it gets even worse if you try to factor in a scientific description of me, and/or the process of perception). The tendency of the (anglocentric) philosophers of the 20th century was to argue that this meant the experience was deceptive and false, while the scientific description was accurate and true.
But as I argued here, the scientific description is actually much less useful than the experiential one in most cases. Our knowledge, the everyday stuff that enables us to find our way to the shops and so on, is overwhelmingly experiential in character.
And this hints at another reason for preferring the experiential to the scientific; where scientific knowledge is useful, it is only because of some effect on experience that it enables us to generate. The microscopic precision that allows Intel to inscribe GHz CPUs on a postage stamp is only worth achieving because computers enable wondrous new experiences, whether that means exploring the Mushroom Kingdom or establishing personal relationships that stretch around the globe or even just being able to do your own accounting without needing pages and pages of maths paper.
In truth, all values – not just the emotional or aesthetic, but every kind of utility as well – are values only from some human perspective. Feels are reals, both in the general sense that perspectival facts are real (because they are the only kind of fact), and in the specific sense that emotional perspectives are important, because it is those emotional perspectives from which the values that make anything at all that we do worthwhile spring.
The only question that remains is why, if all this is correct, some people are so convinced that there is an 'objective' perspective, one that is right above all others. I said above that the issue is about the relationship between experiences; Russell and his colleagues came to see the explanation for the relationship as more fundamental than the experiences it relates, but there is another process at work here too.
To examine the relationship among experiences generally, you must have a set of experiences to generalise from. Ideally, as indeed the theory behind the scientific method suggests, this sample will be representative; if it is not, there is a much bigger chance of missing something important. In practice, you cannot include experiences of which you know nothing.
And the philosophers I've discussed here had relatively narrow ranges of experience to draw on. Descartes, like other influential early modern philosophers such as Locke, Hume and Kant, lived most of his life in and around the courts of Enlightenment Europe. Russell and his cronies were ensconced in the ivory towers of British academia (and I can tell you from personal experience just how narrow the windows there are).
Not only did these men have a limited range of experiences to draw on, they either had or have subsequently gained a great deal of influence to pronounce with. Their positions, social class, shared ethnicity and so on have made them Great Men with Important Views; people who have differing opinions seem unimportant by contrast. This actually applies to their historical opponents every bit as much as to marginalised people today; one hardly hears the names of Bradley and Meinong in philosophy classrooms anymore.
The tendency of self-proclaimed logical thinkers to exclude dissenting opinions from both history and contemporary debate should by this point be sadly familiar. It's a self-reinforcing process; when dissent has already been shut out once, it is much easier to dismiss a second time. People clinging to Russell's model now, a hundred years down the line, may not even realise how trapped they are in it. Open-minded reflection on the views of people from different backgrounds and demographics is the only antidote.
In summary, the idea of an 'objective fact' is a mirage. The only indubitable propositions are subjective in character. The philosophical models that allow us to link them together into a coherent world are at best intersubjective, a negotiation shaped by social pressures much more than 'purely intellectual' considerations (if there are any such things).
 It's worth stressing at this point that awareness is not purely sensory – memory is a kind of awareness, so your memory of an event (though not the event itself, if it is in the past) may serve as the foundation for some knowledge, or at least reasoning.
 There's an interesting comparison here with the current state of quantum physics. Its more phenomenological elements – the mechanics that describe and predict actual measurements – are the most accurate science yet developed by man, but the interpretations that seek to explain why those relationships exist... well, here's the Wikipedia page on interpretations of quantum mechanics. Just count how many different interpretations there are; don't try to wrap your head around them all.
 This Spockish attitude persists today in the myriad ways our culture insists on quantifiability and computability. Things (like emotions) which are messy to compute tend to be regarded with suspicion.
 In the interests of inclusion, this should be 'appropriately human-like perspective', really. We want to be able to extend values and valuation to sentient aliens, sophisticated animals and so on.
 Which doesn't excuse their lack of awareness, since they're also likely to have more spare time and money to support self-reflection with than other social groups.