Ethics in scientific research can be very, very frustrating. At MPI, we’re pretty lucky in that we have blanket ethical approval for all studies which use standard methodologies (behavioural, eye-tracking, EEG, and fMRI at the Donders Institute) and non-vulnerable populations (i.e. not children, not the elderly, and not adults with medical or mental disorders). Even then, though, it’s complicated.
For example, I have to include a section in my EEG consent forms which says that if I see any indication of a neurological abnormality in the signal, I will report it to a clinical neurologist. The thing is, EEG doesn’t work like that; you can’t look at the signal, point to it, and say, “yup, this bit’s gone wrong” like you can with an X-ray or a structural MRI scan. Interpreting EEG signals depends on whatever the person is doing at the time, and unless they’re doing a specific task designed for making a specific diagnosis, all you can really tell with EEG is whether somebody is moving, blinking, or currently having an epileptic seizure (or has them often).
As another example, there’s a difficulty in reconciling data protection (which is a good thing) and Open Science (which is also a good thing). The Open Science movement advocates archiving your raw data and participants’ metadata so that other scientists can scrutinise your analysis and replicate – or not – your work. This is easy enough for behavioural data; we just ask participants whether they consent to the anonymised sharing of their raw data. With fMRI data, though, it’s technically possible to reconstruct a participant’s face from the structural scans, which could violate participant anonymity. And with video corpora, this is hugely problematic. The Language and Cognition group at MPI do a lot of work with video corpora for conversation analysis, which involves extra layers of consent from the participants so that the videos can be analysed and shown at conferences. After several hours of recording, they find the one perfect example of a particular gesture or phrase or turn-taking strategy… and then they realise that somebody has just walked past in the background, and the video can’t be used because that passer-by hasn’t given consent.
Dealing with ethics and consent creates a huge pile of admin work where a common sense strategy would be much quicker and easier… but on balance, this is definitely preferable to an experiment that puts people in any kind of danger. The problem is that outside academia (and similarly-controlled corporate and governmental research), all kinds of ethically questionable experiments are happening.
This is a long, roundabout introduction to an anecdote about how I was recently contacted by a high school student who wanted to know how EEG works with paralysis. I assumed they were asking about a brain-machine interface, such as the one in the 2014 World Cup opening ceremony where a paralysed man wearing an EEG cap was able to control an exoskeleton and kick a football…
Nope. They were actually asking about something they’d seen in an anime. After living in Japan for a year, one of my rules to live by is that the sentence “No, it’s okay, I’ve seen it in an anime” never indicates anything good, and this rule was proven again on this occasion. The anime in question is called Sword Art Online, and I’m not really sure what it’s about other than it features a virtual reality helmet which paralyses the characters from the neck down and overrides their sensory systems, thereby making the virtual reality feel real as well as look real. I wrote back to the student and said that people are doing all kinds of interesting VR research and brain-machine interface research, but that EEG is kind of like a set of scales for weighing things; it can tell you what your weight is, but that doesn’t mean it can change your weight.
The student wrote back to me saying that people are doing research on this in America. These teams are apparently attempting to induce paralysis from the neck down, but are running into problems with their “body stopper”, like vertigo, nausea, paralysis lasting long after the machine was turned off, and some body functions not working for a while afterwards. I did a bit of googling and found out that the people working on this are amateurs who have taken apart a taser that they’ve bought from a hardware store, messed about with the power settings, and strapped it to each other’s necks to try to induce temporary paralysis (and the guy in charge of it seems to want to run his own maid café, which pretty much says it all).
It goes without saying that this wouldn’t get ethical approval at MPI or any other university, and that it is, to use the technical term, really fucking dangerous.
It’s a bit more complicated than that, though. It’s easy enough to look at people making their own TMS machines or buying tDCS kits because they think they can zap themselves smart (even though it doesn’t really work like that anyway) and write them off as potential Darwin Award winners… but science is somewhat complicit in this too. The mainstream media coverage of scientific findings is hugely exaggerated; mostly due to the media’s need to sell itself, but also because of the pressure on academics to overhype their own research. If people are presented with stories about how running electricity through the brain can make you smarter or make paralysed people walk, and if scientific research isn’t all that open to non-scientists, it’s not really surprising that people are trying it out for themselves.
It boils down to science communication in the end. It’s one thing to talk about how amazing your own research is or how these great findings could mean brilliant things, but that’s actually kind of irresponsible without also talking about the ethics approval boards, the consent forms, the participant safety measures… in short, all the boring but essential things that make scientific research safe. Hence the long, roundabout introduction to this anecdote. You’ll remember the bit about the homemade paralysis machine made from a taser, but I’d rather you remember the bit about all the ethics forms I have to fill in before I can do any kind of experiment myself.