Meaning, Monkeys and Misinterpretation: Neuropsychology and Humanities
Applying neuropsychological theory to the Humanities’ own study of reading is potentially fruitful ground, but, as usual, it is far more complex than it first seems.
The two neuropsychology papers we’ve looked at this week address separate issues. Maravita and Iriki’s ‘Tools for the body (schema)’ analyses humans’ and monkeys’ neuropsychological responses to tool use. The evidence it presents supports the idea that when we use a tool, our brain treats it as part of our body. In ‘When Meaning Matters, Look But Don’t Touch’, Davoli et al. devote their study to the effects of posture, in particular the positioning of the hands, on reading capabilities.
Davoli et al.’s study is obviously more directly in the Humanities’ realm. Scientific theories about how we read can explain certain trends, like the continuing preference for holding physical texts, whilst throwing up questions about how we read. Davoli et al. suggest that reading with the hands near the text improves spatial reading at the expense of semantic reading. In other words, it is easier, but we get less out of it. Scientific data and interpretation like this can help the Humanities locate itself in the current transitional climate of reading habits. It helps explain why so many have found their attention spans shortened online, but it also provides some support for online readers.
All well and good, you may say: existing trends have just been given some hard scientific evidence to explain them. However, studies like this risk generating competition over which is the ‘best’ way of reading. “We read faster!” yell the book-holders; “We understand more!” bark the digital readers. Some things in the Humanities are difficult to express in scientific, quantifiable terms. Reducing reading to measurable ‘spatial’ and ‘semantic’ comprehension misses a great deal in terms of experience. Other hard-to-measure aspects (e.g. long-term memory, intellectual impact and emotional response) get neglected, and the incredibly subjective, complex practice of reading is simplified to an unrecognisable degree. Further study could presumably build up a more complete scientific picture of reading, but some complexities of human experience resist analysis and reduction to data.
Maravita and Iriki focus more on making links between human and monkey neuropsychology. Their study of how our brains act when using tools shows that we appropriate tools into our body schema – our psychological concept of what constitutes us. This is relevant to posthumanist theories about cyborgs and transcending the ‘meat’ of our bodies. Maravita and Iriki’s analysis supports theories of consciousness freed, or at least loosened, from its material grounding in the body. It also seems to support Heidegger’s philosophical ideas about how we come to understand an ‘equipment’ through use, through its function as a ‘something-in-order-to’ rather than a mere ‘thing’.
The concept of the body schema is perhaps better suited to neuropsychological/Humanities crossover study than reading is. The Humanities’ questioning of body boundaries is a well-established field which might be further explored by looking at brain activity, but it is unlikely to be restricted by it. Art will stubbornly refuse to be defined by science. Digital reading, as a newer field and one already immersed in the world of data, is at more risk of being defined by data trends and quantitative assessments. In addition, Humanities scholars are liable to misquote and misinterpret scientific data to support their own theses rather than basing their work on a solid scientific understanding. Neuropsychology and the Humanities must approach each other with caution.
I found this article particularly interesting; it tapped into the reason why I’m so reluctant to go the whole hog with touchscreens (even though I’ve got a touchscreen phone myself). Our hands are incredible tools in their own right, yet their potential WITH technology seems to be largely missed… http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/
Instead, flashy (and slightly OTT) companies have sprung up: http://www.tactustechnology.com/ – “a new dimension of touch”… to battle against “arm and finger fatigue [which] is a common occurrence due to constant hovering”… [it gets better] http://www.youtube.com/watch?v=t4eh-Cn3Pzk
Although I think it’s actually something I’d spend money on (tactile buttons that don’t compromise the screen view), I can’t help but think they’re missing the point. What that point is, I’m not really sure – much like how the rant offers no future options but highlights our need to be more aware of our hands as something more than just button-pushers… even if we can now feel the buttons (again).
I feel like this is getting closer to where technology could take us, on the sensory side of cyber.
“Senseg’s “feelscreens” … applying a proprietary coating to the screen and varying the electrical charge in it, that screen can be made to attract or repel the flesh of a touching finger. The electrostatic force involved is small (think rubbing a balloon on your sleeve, and then sticking it to a wall), but it is enough for the nerve-endings in a fingertip to detect. Modulate the force appropriately and you can induce a feeling of sandpaper, corrugated iron or fabric. You can create the perception of raised keys on a screen, even though it remains flat, or of an e-book’s pages actually feeling like paper. You can even make the features on a map stand out to the touch, allowing the blind, as well as the sighted, to navigate.” http://www.economist.com/blogs/babbage/2012/06/haptics This will be coming to us in a year or two!
It’s more like a Mary Poppins bag of tricks: a small interface through which, with the intelligence of our hands (working with the technology), we can access ‘real’ things in a convenient way, e.g. being able to feel fabrics whilst online shopping. (Or, as I spotted in the comments of this article, the new possibilities of playing video games, or watching porn… :O )
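Out of curiosity, the modulation idea in that Economist quote – varying electrostatic force across the screen so the fingertip reads it as texture – can be sketched very loosely in code. Everything here (the texture names, waveforms and values) is my own illustrative assumption, not Senseg’s actual technique:

```python
import math

# Hypothetical sketch: map a finger's position on a "feelscreen" to a
# normalised electrostatic attraction force (0..1). Rapid variation in
# the force reads to the fingertip as a rougher texture; a constant
# force reads as smooth. All frequencies here are made up.

def texture_force(x_mm: float, texture: str) -> float:
    """Return a normalised attraction force at finger position x_mm."""
    if texture == "smooth":
        return 0.0  # no modulation, the screen just feels like glass
    if texture == "fabric":
        # gentle low-frequency ripple (one cycle every 5 mm)
        return 0.5 + 0.5 * math.sin(2 * math.pi * x_mm / 5.0)
    if texture == "sandpaper":
        # rapid variation (one cycle every 0.5 mm) feels gritty
        return 0.5 + 0.5 * math.sin(2 * math.pi * x_mm / 0.5)
    raise ValueError(f"unknown texture: {texture}")
```

The point of the sketch is just that ‘sandpaper’ and ‘fabric’ differ only in how fast the force changes under the moving finger – which is (roughly) what “modulate the force appropriately” means.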
It’s an interesting time to be writing in response to Adam Greenfield’s blog post regarding “What Apple needs to do now”, considering that on October 29th of this year Apple had a reshuffle of their executive team, resulting in the loss of Scott Forstall (of Siri and Apple Maps fame) from the helm of iOS design. Forstall was, as John Koetsier puts it, “a fan of skeuomorphic design… and the results can be found in Apple apps” such as iBooks’ IKEA-style bookshelf or the leather-bound address book. With a new brain leading the design team forward, it seems more than likely that we will begin to see changes in these iOS classics, which may well answer Greenfield’s pleas.
Why? Because the current offering of apps provided by Apple is plagued by inconsistency, not only among themselves, but with Apple’s entire design aesthetic. Those which attempt to digitise a pre-existing medium, whether that’s an address book, calendar or compass, all fall foul of the recognisable tropes of our non-digital lives, making us comfortable, relaxed and confident that we already know what we’re doing and how to do it. However, when we look at Mail or Safari – apps purpose-built to complete tasks that simply didn’t exist before digital – the picture is far sleeker, trimming all of the excess coding that goes into producing a leather effect in favour of producing apps that genuinely fulfil their purpose, whilst simultaneously feeling much more at home on a device that looks far more Jetsons than Waltons.
So what does this mean? As we become more and more comfortable with the idea that we’re living in a post-PC world, it seems inevitable that the days of using anything bound in actual leather will soon pass. It won’t be long before schools teach exclusively on tablets and e-readers, calendar shops no longer pop up just before Christmas, and the word ‘Filofax’ is just a distant memory. When that happens, apps will no longer be attempting to simulate these relics, but rather will BE the thing that fulfils its own purpose – removing the need to simulate comfort and confidence in their users. Not only that, but by stripping back all of that excess coding, they’ll perform the tasks set before them to a far greater level, way beyond that capable of an app that acts only to emulate paper and pen.
Only when the whole Apple catalogue starts to abandon the skeuomorphic look will it feel like a complete package suitable for a device as technically innovative as the iPad. As technological developments eventually lead to an abandonment of analogue tools, applications move closer and closer towards what Baudrillard terms second order simulacra. In order for these applications to become truly innovative, however, it is imperative that they take the next step, embrace the digital medium and reach the third order of simulacra, copies without an original, but which fulfil their tasks better than any original ever could.
Nigel Ackland received serious crush injuries to his right forearm when it became trapped in an industrial blender five years ago. After six months of operations and infections, he opted for an elective below-the-elbow amputation.
He originally had a non-functioning hand that just gave the appearance of a hand, then a body-powered hook (not very comfortable or functional), then an electronic arm which did not look like a hand.
He is now one of just seven people in the world undergoing a four-month trial with the carbon fibre bebionic 3 myoelectric hand. It responds to twitches in the muscles of the remaining part of his arm. This hand features:
- 14 selectable grip patterns
- Proportional speed control
- Powerful microprocessors that continuously monitor the position of each finger
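To give a rough sense of how myoelectric control like this might work, here is a minimal sketch. The grip names, thresholds and scaling below are purely my own assumptions for illustration, not the bebionic hand’s actual firmware:

```python
# Hypothetical myoelectric control sketch: EMG amplitude from the
# residual forearm muscles selects a grip pattern and scales how fast
# the fingers close. All names and numbers are illustrative only.

GRIP_PATTERNS = ["power", "pinch", "tripod", "key"]  # 4 of the 14 patterns

def select_grip(pulse_count: int) -> str:
    """Imagine a muscle co-contraction pulse cycling through the grips."""
    return GRIP_PATTERNS[pulse_count % len(GRIP_PATTERNS)]

def finger_speed(emg_amplitude: float, max_speed: float = 1.0) -> float:
    """Proportional speed control: a harder twitch closes the hand faster.
    emg_amplitude is a normalised 0..1 reading from a surface electrode."""
    REST_THRESHOLD = 0.1  # below this the hand simply holds its position
    if emg_amplitude < REST_THRESHOLD:
        return 0.0
    # scale linearly between the rest threshold and full contraction
    return max_speed * (emg_amplitude - REST_THRESHOLD) / (1 - REST_THRESHOLD)
```

So a faint twitch does nothing, a gentle one closes the fingers slowly, and a strong contraction closes them at full speed – which is what ‘proportional speed control’ amounts to.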
This is a very active field of research:
- A company called Otto Bock is working on using multiple electrodes to increase wrist flexibility, and has carried out operations to rewire the nervous system, meaning that the arm could be controlled intuitively by the mind rather than by flexing the surrounding muscles.
- One of the problems faced by bionic limb users is that sweat or movement of the prosthesis can disrupt signals, and can also cause sores on the skin. Researchers in Britain have tried to get round this with the Intraosseous Transcutaneous Amputation Prosthesis: a rod screwed into the bone of an amputee, onto which prosthetics can be fitted directly and securely. This avoids friction and allows heavier loads to be carried.
- The Cambridge Centre for Brain Repair is developing neural interfaces whereby prosthetics will communicate wirelessly with implants fitted directly into the nerve fibres in the stump. Once the device is inserted into the nerve, nerve fibres grow through it, and signals can then be transmitted wirelessly to a receiver in the prosthetic. This has been tested on animals, and the researchers hope that within three years they will have developed a prototype interface that could be taken on for human trials.
- Flexible electronics or “prosthetic skin” – “We’re looking into putting electronics onto surfaces that can be deformed, flexed but also stretched like a rubber band,” says Stephanie Lacour of Switzerland’s Ecole Polytechnique Federale de Lausanne. This could then be used to make some kind of glove covered in electronic sensors that could mimic the sense of touch felt in human skin. This could then hopefully feed back to the brain through neural interface devices.
- Otto Bock is working on a simpler device of this sort, in which information from the prosthetic skin would be sent back to the user via vibrations or pressure on the adjacent skin.