
Dissertation Log 1: My dissertation research project

I guess I’m not a typical blogger, as I don’t like to generate my own reasons to blog: I blog when there’s a topic which interests me and about which I feel I can say something novel. (Which is another way of saying that a lot of blogs seem somewhat echo-chambery – not that echo chambers are inherently a bad thing, but I don’t feel sufficiently married to any particular social group that I need to reinforce my social bonds with it by echoing its discursive themes.)

However, having completed modules in Science and the Public and Communicating Science in the Information Age, and been awarded my postgraduate diploma in Science and Society, I now have to research and write a 15,000-word dissertation in order to get the full MSc in Science and Society. (That’s mainly sociology of science and technology, with some philosophy of science, social psychology, linguistics, politics and other odds and sods.)

Well, my research proposal has been accepted. I’ll expand on the detail shortly, but part of the marking criteria includes keeping a regular research log. Rather than keep a purely private log, my intention is to develop it as part of my blog.

First, we are living in a networked world, and my research cannot exist independently of the world in which it takes place, so this is a good way of engaging what I’m doing with that world.

Second, one of my modules was on science communication in the digital age: if, as the basic thesis of that module proposes, the world of research is being transformed, I don’t think that can be restricted to just the natural sciences. I can do a bit of action research and actually try out the idea of a public lab book, and see how I feel about it, and how it compares with other projects I’ve worked on. Maybe it won’t work, but negative findings are still findings ;-)

Third, the discipline of writing something each week will actually benefit me, ensuring I’m writing down what I’m thinking and allowing me to keep track of the development of ideas and theories. It will probably allow me to air drafts of sections of the dissertation, and refine the arguments and presentation.

Fourth, I’d be interested in getting feedback on my ideas and actions. The main weakness of Open University study (and it’s not much of a weakness in the grand scheme of things) is that I often feel disconnected from the academic community, and I’d welcome intellectual discussion and constructive criticism of my ideas and actions. This is not a licence to slag me off because you don’t like me, social science, my values, my face (you’re not alone in this one) or whatever, and I will cull comments which aren’t constructive, but it could be fun for me. (Of course, what will probably happen is no-one will ever comment. Meh.)

So over the coming few months (the dissertation is to be submitted in September 2012) I will try to post a blog at least once a fortnight touching on my research; for example, my next post will be on choosing a methodology for undertaking the research. If you have questions, ideas and comments, please post them.

So, now for the big reveal: what is it I’m studying? I’m interested in the way social groups construct their views of scientific topics, and in particular of medical subjects. I’m also interested in a social group who describe themselves as “skeptics”; they’ve become increasingly active and high profile in the UK, and I have a lot of sympathy with much of what they say and do, but there’s very little research into them as a social group, their influence, or their norms and values. So, my proposal aims to explore those key areas.

My accepted proposal reads as follows:

The Burzynski Affair: a case study in perspectives on alternative cancer therapy

Patients with terminal cancers, their friends and families often seek out, as a “last ditch” effort to save their own or loved ones’ lives, therapies which would not normally be considered. An example of this type of therapy, which appears in the press not infrequently, is the “antineoplaston therapy” promoted by the Burzynski Clinic in Texas, USA (http://www.burzynskiclinic.com/). A typical story is that of Chiane Cloete, a five year old girl who has been diagnosed with a supratentorial primitive neuroectodermal tumour in the brain; her parents are seeking to raise £130,000 to get her treatment “not available in the UK” at the Burzynski clinic (Parsons, 2011).

I am interested in exploring how four different groups represent the Burzynski Clinic’s therapies in publicly available discourse. The first group will be that of proponents of the Burzynski Clinic and its work. The principal sources will be the Burzynski website, promotional materials, testimonials (via blogs, tweets and publicly accessible fora) from patients and associated materials.

The second group will be that of skeptics in the UK (the spelling using a ‘k’ comes from the American spelling, and is used to differentiate skepticism – the movement – from scepticism – the philosophical position – although the two are not unrelated). Writing about the emerging skeptical community in the USA in 1993, Hess described skeptics as “antiantiscientists” (Hess, 1993, p 11) who focus on debunking and demystifying “parapsychology … superstition, occultism and ‘pseudoscience’” (ibid. p 11). In the intervening 18 years, a sceptical community has arisen in the UK, complete with its own internet counter-culture and real-world community events (such as skeptics in the pub, and conferences such as Q.E.D.). After skeptics in the UK raised concerns about Burzynski in September 2011, a threat of libel action was made against several of the sceptical voices. The principal sources will be sceptical websites, blogs, tweets and other associated materials of people in the UK who self-identify as skeptics, or who are strongly associated with the skeptic movement in the UK (for example, Dr Ben Goldacre, Robin Ince and Dr Brian Cox).

The third group will be the printed news media in the UK. The principal sources will be Nexis, plus Google searches to find other material hosted on major outlet (i.e. national distribution) websites. This will be limited to information available at no direct cost to the consumer.

The fourth group will involve a literature search exploring the view of peer-reviewed scientific journals and associated research (e.g. Cochrane reviews). This will be as light-touch as possible, to identify the boundaries of knowledge rather than to provide a considered opinion on the use of such treatments.

In order to undertake this research, the following questions will be considered:

  1. What language is used by each group to describe the Burzynski therapy, and what are the implications of the language used?
  2. What are the similarities between the way the Burzynski therapy is represented by each group, and what are the differences?
  3. What values are revealed or highlighted by the language used by the contrasting groups?
  4. How does what each of the groups say compare with what the scientific literature has to say on the issue? What situated knowledge is implicit or explicit in the discourse?
  5. How are groups (particularly proponents and skeptics) represented by the other participants in the discourse?
  6. What implications does this analysis of the discourse have for understanding the ways in which an issue is represented in the public?

Exploring these research questions will demonstrate an understanding of the ways in which science, scientists and their limitations are understood and represented by contrasting public groups. This will touch on issues raised in S802 such as concepts of anti-science, ethics, expertise, regulation, policy and risk.

Certainly the issue raises questions of what constitutes science and non-science (see for example: Grove, 1989), but also raises the issues of the knowledge boundaries of scientific research (Collins & Pinch, 1998). It also touches on how scientific knowledge is used, constructed and reconstructed in everyday life (Irwin & Wynne, 1996).

Objectives

  1. Construct a narrative relating to the Burzynski affair based on publicly available material
  2. Compare and contrast through discourse analysis four differing perspectives on the Burzynski affair: proponents, skeptics in the UK, UK news media and published papers in scientific journals
  3. Understand how the perspectives reflect on the construction of scientific knowledge, uncertainty and how the values of each group influence their representation of the issue

Collins, H., & Pinch, T. (1998). The Golem: What You Should Know about Science (2nd ed.). Cambridge: Cambridge University Press.

Grove, J. (1989). Anti-science. In Defence of Science: Science, Technology and Politics in Modern Society (pp. 151-177). Toronto: University of Toronto Press.

Hess, D. J. (1993). Science in the New Age: the paranormal, its defenders and debunkers, and American culture. Madison: University of Wisconsin Press.

Irwin, A., & Wynne, B. (1996). Misunderstanding Science? The public reconstruction of science and technology. Cambridge: Cambridge University Press.

Kozinets, R. V. (2010). Netnography: doing ethnographic research online. London: Sage.

Parsons, R. (2011, December 14). Cancer girl’s £130,000 plea for life-saving operation in US. The London Evening Standard. London.

Scientific Literacy: people aren’t very good at tests they haven’t revised for

One of the topics which comes up a lot in social studies of science – and which comes up regularly on the skeptical circuit – is the complaint that the public don’t understand science. There are numerous questionnaires which test the public’s knowledge, and they get published about once every three months. One which I noticed earlier was this one, looking at relative levels of scientific knowledge between men and women. Another is this, also by Sheril Kirshenbaum. There are countless other examples of these tests of what is generally known as scientific literacy.

Many people take the view that the small numbers of people who know the answers to these questions – in the examples cited, about the relative sizes of subatomic particles, orbital periods of planets, genetics and origins of the universe – show that members of the public are not adequately educated about science. However, despite extensive efforts by educators, the relative rates of knowledge remain static between surveys.

For me the question really is what we’re measuring, and what the significance of that measurement is. When were people asked, and in what social setting? The only time I’ve ever been invited to participate in social research is on the high street when shopping: how many people could cite scientific facts while worrying about getting a hot sausage roll from Greggs?

In addition, what relevance does the knowledge being assessed have to everyday lives? Whilst I knew the answers being sought to the questions above (and I’d love to discuss the question on genetic gender determinism further), I can’t say they form anything other than an interesting backdrop to the day-to-day decisions of my life. So what if it takes the Earth 365.25ish days to orbit the sun – does it affect the housework I’m procrastinating over by writing this? The relative size of electrons has equally little import, and I’m becoming less and less concerned about how people think the universe was created (it was nearly 14 billion years ago that expansion started, wait), provided they don’t think they have the right to force that onto other people. Granted, there may be a positive correlation between people who believe in special creation and people who want to force that belief on me, but such people remain a small minority.

And why don’t we expect publics to know facts about philosophy and literature and religions? What is it that means people should definitely know that electrons are even smaller than small, but we don’t care if they know how and why Socrates died? Why does the rotation of our planet excite rage when knowledge of the contributions of Keats, Ives and Spinoza barely raises an interest?

Whilst many members of the public may fail these crude general knowledge tests, we also know that people don’t operate in a scientific vacuum. When they need to know something, they have the resources, either in themselves or in the people around them, to gain understanding. Classic examples are diabetes and cancer: non-specialists can gain a sophisticated understanding of the conditions and their treatment options when required, but might otherwise fail these tests. One of the advantages of the Internet is access to a vast library of human knowledge (which comes hand in hand with access to a vast library of human ignorance, of course).

Demanding that people have these pieces of knowledge may also lead us to underestimate the knowledge they do have. As a classic example, during the BSE crisis, failure to take into account the knowledge of butchers about the preparation of meat meant that the first set of rules designed to prevent BSE entering the food chain failed to work, something which was only discovered by the policy makers five years after the regulations came into force. (I am assured by my master butcher brother-in-law that the first regulations were laughable, showing a complete ignorance of what happens in an abattoir.) People build complex knowledge structures of the world they live in. Sometimes these knowledge structures are implicitly scientific, but if you asked the people, they wouldn’t say they had any scientific knowledge.

One of the groups I have an interest in, ambulance crews, have a vast body of knowledge about humans, their bodies and their responses to illness and injury, but not many of them would necessarily say they are applied scientists. If such a person is unable to answer questions on subatomic particles, should we discard the complex, subtle and deep knowledge they have gained over years of practice and declare them unscientific?

Even if we agree that pop science quizzes have some kind of intrinsic value (and that’s as much a question for the philosophers!), what implications do they have for society? The common assertion is that the scientifically illiterate cause harm because of poor quality decisions. For example, if someone doesn’t know how dilution works, they may mistakenly think that homeopathy works and fail to get vaccinated. It’s worth pointing out here that there’s a clear difference between science as a body of facts and science as a process. For example, a physicist and a biologist probably both have a good understanding of how science works, but probably don’t compare in terms of factual knowledge about each other’s fields.

Indeed, the evidence on science-as-knowledge doesn’t really support the position that ‘ignorant’ people make ‘bad’ decisions. In fact, the evidence suggests that the more science (of both types) people know, the more ambivalent they are about science. Thus the people most likely to reject genetically modified foods and vaccination are also the people who would be most likely to pass the knowledge assessment. Indeed, anti-GM campaigners often have depths of knowledge which are comparable to those of practitioners in the field.

Indeed, I’m not even sure how a more scientifically literate world would differ from the world we see today. This is particularly true when we take into account evidence about motivated reasoning: people with pre-existing opinions are less likely to change their minds when confronted with evidence which strongly conflicts with their existing beliefs.

Now, I’m all for ensuring that science is made accessible to different publics, that publics have opportunities to engage with science if they wish, and that our education system is as good as it can possibly be. Wrong information, particularly when it’s associated with exploitation, can and should be countered, if necessary by reference to appropriate enforcement bodies (the ongoing ASA campaigns, for example). However, we can’t use these measurements as evidence of anything other than what they measure. And the only conclusion that I can draw is that people who are given a science test without a chance to revise tend not to do very well.

Any undergraduate could have told you that, though.


[Edit: Sheril pointed out that both of the posts I referenced were actually authored by her, so I corrected the text]

What is a Skeptic, and am I one? Part 1: Defining Skepticism

I have been wondering of late if I am merely sceptical – demanding to see evidence – or whether I should in fact identify as a member of the Skeptical community. This was triggered by a couple of people – with entirely complimentary intent – suggesting that I am a Skeptic, and because it’s an area of interest for me academically I thought it would be appropriate to reflect on the question and thus clarify my own position in relation to my research.

I shall start by saying that I have never attended a Skeptical event, although I have met people who self-describe as Skeptics at other events and regularly speak to people online, and I do regularly listen to a couple of Skeptical podcasts. The limit of my contribution to the Skeptical community so far has been my blog post on my research into the 10:23 campaign, which has been picked up by a couple of Skeptical outlets.

My intention is to explore Skepticism in a variety of different ways over a number of blog posts. I’m going to start with self-definition and self-description, but I’m also going to look at negative definitions (what Skeptics aren’t), the epistemology of the Skeptic (what methods of assessing claims do Skeptics use?), some sociology of Skepticism (is it a movement? a community?), some history (where did Skepticism come from?) and possibly even some social psychology.

Given that I have also been accused of being a constructivist, I find it natural to start by looking at how people define the terms. Those of you who are particularly awake will notice that I have differentiated scepticism and Skepticism; this is because I think it’s possible to be sceptical without being Skeptical, and because the latter is effectively a proper noun as opposed to an adjective or verb. The term has American origins (hence the ‘k’), and I’m capitalising to make it clear I’m talking about a specific group of people. I’ve done a brief assessment of definitions of Skepticism by the simple expedient of typing “what is a skeptic” into Google and surveying the first few matches.

Brian Dunning (host of Skeptoid, a Skeptical podcast and website) defines Skepticism as “the process of applying reason and critical thinking to determine validity. It’s the process of finding a supported conclusion, not the justification of a preconceived conclusion.” He also uses the term “critical thinker” as a synonym for the term.

Skeptic.com describe Skepticism:

“Skepticism is a provisional approach to claims. It is the application of reason to any and all ideas — no sacred cows allowed. In other words, skepticism is a method, not a position. Ideally, skeptics do not go into an investigation closed to the possibility that a phenomenon might be real or that a claim might be true. When we say we are “skeptical,” we mean that we must see compelling evidence before we believe.” (Skeptic.com)

The Young Australian Skeptics describe a Skeptic as “… an individual who approaches every claim with a degree of scepticism proportional to its plausibility”. They then go on to quote Dr Stephen Novella of the New England Skeptical Society:

“A skeptic is one who prefers beliefs and conclusions that are reliable and valid to ones that are comforting or convenient, and therefore rigorously and openly applies the methods of science and reason to all empirical claims, especially their own. A skeptic provisionally proportions acceptance of any claim to valid logic and a fair and thorough assessment of available evidence, and studies the pitfalls of human reason and the mechanisms of deception so as to avoid being deceived by others or themselves. Skepticism values method over any particular conclusion.”

Oxford Skeptics in the Pub summarise Stephen Novella’s definition as “an intellectual specialty that is grounded in science and the humanities and includes any knowledge that deals with the nature of knowledge and belief, critical thinking, the foibles of the human intellect, and deception”.

UK Skeptics explore what they mean by Skepticism very carefully, focussing on “doubt and inquiry” as the core of the method, which they define as akin to the scientific method. Milton Keynes Skeptics in the Pub define it as “the place where science education intersects consumer protection”. There are other definitions which use words like rationalism and critical thinking. It is, essentially, a position in which claims (particularly extraordinary ones) are not accepted unless they can be verified, or falsified, through use of the scientific method.

So although there are lots of differences between the wordings used, I think the definitions have some common features, and I think it can be reduced to the following:

Skepticism is a provisional method, process or approach that values critical thinking, rationalism and the scientific method above comfort, convenience, belief and preconception when assessing claims about the nature of reality, and which demands evidence or support proportional to the magnitude of the claim being made before it is accepted.

A Skeptic, therefore, by this definition, is someone who subscribes to that method and those values, applies them in their life and (optionally) promotes such approaches to those around them.

I think I can subscribe to the definition I’ve outlined above, in which case the answer to “am I a Skeptic?” is yes: by that definition I would self-identify as a Skeptic. However, I think there is more to Skepticism than a mere self-description, not least because, surely, there cannot be many people who would not subscribe to that definition.

So, when is a Skeptic really a Skeptic? Next time, I shall look at negative definitions: defining the Skeptic identity by reference to what a Skeptic isn’t.

I welcome critical feedback on what I write, so please feel free to comment on my definition and help me to further improve it!

Goldman Conjecture: How do I know who to believe?

I heard Massimo Pigliucci on For Good Reason nearly a year ago (episode 1 and episode 2) talking about his book Nonsense on Stilts: How to tell science from bunk.

I’ve just finished reading the book (well, technically, listening to it in my car), and it’s a reasonable, Skeptical discussion of philosophy of science and epistemology. Massimo Pigliucci has three doctorates (genetics, botany and philosophy of science), and writes clearly about the issues, ranging from the demarcation problem (what is science and what isn’t?) to how science works.

However, it was his penultimate chapter on expertise which prompted me to write this post. In it, he advocates five principles which are adapted from Alvin Goldman’s Experts: Which Ones Should You Trust?. Here they are (paraphrased):

  1. What is the quality of the arguments presented by the experts (e.g. are they citing evidence or using fallacious arguments)?
  2. Do the arguments agree with the arguments and evidence presented by other experts in the field (e.g. do they agree with the consensus)?
  3. What recognition of expertise does the person have (e.g. do they have a relevant academic qualification or other recognition)?
  4. What are the biases affecting the experts, and how do they relate to the positions they are espousing (e.g. if they’re researching drugs, are they funded by pharmaceutical companies)?
  5. What is the expert’s success rate like (e.g. do they have regular/recent/relevant peer-reviewed research published)?

In the book, Pigliucci compares an advocate of creationism with an advocate of evolution, and unsurprisingly his conclusion is that the evolutionist is more worthy of trust than the creationist. He also uses the criteria against a single person – Deepak Chopra – and concludes that he falls at the first hurdle: his arguments aren’t really arguments at all.

My problem with the first is the context in which such assessment is supposed to occur. Outside of a few professionals, who is expert at assessing and weighing evidence? The fact is that most debates in the public sphere are decided more on the basis of effective rhetoric than on expert application and assessment of inductive and deductive logical patterns.

The second assumes that people have access to the information necessary to assess the quality of the evidence, and assumes that it is necessarily possible to identify consensus. To Pigliucci’s credit, he does point out that consensus can be wrong, but I’d question whether developing – or even developed – science is quite so amenable to objective analysis, not least because information may be hidden behind paywalls, technical language or mere obscurity. It also assumes a desire to check this information.

[edit: added this paragraph at 2115] It also occurs to me that this requires you to know who is an expert in order to be able to identify who is an expert. Seems like this could result in a recursive loop in which every expert can only be an expert when the comparator expert has also been assessed!

The third assumes that most people know what is, or is not, relevant to the issues at hand. For example, which type of biologist should I ask about evolution? Why is a biologist not the right person to talk about abiogenesis?

The fourth, it seems to me, can be problematic on both sides of the equation and requires a subjective decision as to which biases should be included and which should not (for example, is a person’s religious affiliation relevant to a discussion on the development of life?).

Finally, the fifth has the same weaknesses as the second, and in addition assumes that people have the ability to assess the relative success of different papers and publications in the knowledge ecosystem.

Although Pigliucci’s writing is a good introduction to epistemology and the philosophy of science, it very much preaches to the choir. His clear disdain for post-modern thinking is amusing (but doesn’t seem to have evolved much since Sokal), but I would hope – as Pigliucci himself encourages – that readers treat his conclusions with some, dare I say it, skepticism. After all, the people who most need to think critically are the people least likely to read it…

How Discourse About Homeopathy Was Affected By The 10:23 Campaign: A Case Study In Public Engagement

This is a summary of the report of a research project I carried out for the Science and the Public module of my MSc in Science and Society with the Open University. The OU bars publishing of assignments (for obvious reasons) so I have written this as an alternative. I got a distinction for the module, and comments on the research project included “timely”, “innovative” and “thorough”.

Abstract

A campaign led by skeptical amateurs aimed to change the way the public thinks about homeopathy by participating in a mass “overdose” event. Mainstream press media, blogs and tweets from timeframes around that event were analysed to identify how the campaign, plus other events, changed public discourse on homeopathy.

It is noted that there was a shift from technical discourse to political discourse calling for changes in public policy on homeopathy. I conclude that skeptics have great potential to act as agents for citizen engagement with science, but that professional support is essential for pro-am programs to be effective.

