Madingley Minds

Welcome to our blog, Madingley Minds, where ICE academics write about their research, their teaching and what inspires them.

Join the conversation and post your comments, or sign up for regular updates.

Excavating war

Written by Gilly Carr Wednesday, 26 August 2015 09:31


2015 is a year of anniversaries: the 70th anniversary of the end of WWII in Europe, the 70th anniversary of VJ Day, and we’re still in the midst of the centenary commemorations of WWI. Archaeologists have been marking these anniversaries in the best way they know how: by excavating war sites.

Conflict Archaeology – the archaeology of 20th and 21st century conflict – has been a growth area in the discipline since the year 2000, and more and more archaeologists have been developing an interest in the subject and defecting from their previous specialities in other areas of the discipline. This is also true for me: in 2006 I moved from Iron Age and Roman Archaeology to Conflict Archaeology, and have been spending much time analysing the archaeology, heritage and memory of the German occupation of WWII.

As my students will know, I spend a lot of time in the Channel Islands, where I have been doing fieldwork for nearly a decade now. In fact, I have just returned from a pilot project to explore the WWI POW camp in St Brelade in Jersey. After receiving seed funding from the Société Jersiaise, I invited my colleagues Professors Harold Mytum and Nick Saunders to join me in Jersey where we had a lot of fun wandering around the site in the rain, spotting the foundations and barrack hut stilts of the former camp.

We also excavated a test pit in an area which might have been the rubbish heap of the camp. It always amuses me how ridiculously happy archaeologists can get about rubbish pits! We spent an exciting afternoon getting soaked to the skin excavating a test pit and exclaiming over the old Jersey stoneware cider bottles we found.

Stoneware cider bottles

In the spring of this year I directed the second season of excavation at Lager Wick, a forced labour camp in Jersey, dating from 1942-1944. Although the rubbish pit of this camp was not accessible, my team explored the camp latrine and what I reckon was the mess hut of the guards, which burnt down in 1944. I found (among many other things) the base of a Nazi mug complete with eagle and swastika, the button from a guard’s uniform, a schnapps glass, a cuff-link, and a spoon handle.

Excavating is incredibly addictive. In some ways it’s like panning for gold – once you start scraping away at the surface of the soil, you can’t stop! I hope that students on the ICE Diploma in Archaeology experienced the same thrill of discovery in their summer excavation module.

This coming year, the Diploma in Archaeology will offer a course in Conflict Archaeology, and I look forward to showing the students the results of my excavations in addition to work being done by other archaeologists in the field. We also run an Advanced Diploma in Archaeology at ICE, and I always look forward to welcoming students with a project in Conflict Archaeology. The deadline for signing up for our courses is 7 September 2015 – see you in October!

Gilly Carr excavating

Dr Gilly Carr, ICE University Senior Lecturer and Academic Director in Archaeology


Cruising for a virological bruising

Written by Chris Smith Monday, 13 October 2014 16:07


In writing this blog I had to weigh up two options. On the one hand, it's Nobel week and we're celebrating ground-breaking scientific endeavours that include the discovery, by UCL's John O'Keefe, of the brain's equivalent of a GPS system that stops us getting lost. There's also the Japanese team who have revolutionised the way we light up our homes and workplaces thanks to their pioneering work on gallium nitride-based blue LEDs; and then there are the chemists who found a way to use a light microscope to see details inside cells that are smaller than the light waves themselves.

These are amazing advances, but while all this scientific back-slapping is going on, the dark cloud on the horizon is the emerging ebola epidemic in West Africa and the warning undercurrent that comes with it.

At the time of writing at least 7,000 people have been infected and half of those have died. The CDC in America also estimates that, because the level of reporting is so poor, the true numbers are in all likelihood double or even triple those reported. And because the rates of infection appear to be growing exponentially, tens of thousands, or even millions, might ultimately be affected.
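To see why exponential growth turns thousands into millions so quickly, here is a minimal sketch. The doubling time is a hypothetical assumption for illustration only, not an epidemiological estimate:

```python
# Illustrative projection of exponential case growth.
# Assumed figures: ~7,000 current cases, case counts doubling
# roughly once a month (hypothetical, for illustration).
cases = 7_000
doubling_time_months = 1

for month in (0, 3, 6, 9, 12):
    projected = cases * 2 ** (month // doubling_time_months)
    print(f"month {month:2d}: ~{projected:,} cases")
```

Under those assumptions the projection passes 50,000 cases within three months and tens of millions within a year; even with a far slower doubling time, the trajectory still reaches tens of thousands before long, which is the point of the warning.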

To put the scale of the present situation into perspective, since the first recorded case of ebola in the Democratic Republic of Congo 38 years ago there have been fewer than 2,500 deaths documented in total.

So this single present outbreak is already three times larger than the entire ebola death toll ever recorded. It's also no longer just an African problem. The West has had its own wake-up call this week as the US and Spain, countries previously regarded as immune to the threat thanks to modern medicine, have reported imported cases of the condition and, despite strict infection-control guidelines and practices, onward transmissions of ebola on their home soil.
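One plausible reading of the figures quoted above is that the "three times larger" comparison rests on the CDC's under-reporting correction; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the death-toll comparison,
# using the figures quoted in the text.
reported_cases = 7_000
case_fatality = 0.5        # "half of those have died"
historical_deaths = 2_500  # documented ebola deaths before this outbreak

reported_deaths = reported_cases * case_fatality  # 3,500
for factor in (1, 2, 3):   # CDC: true numbers may be double or triple
    corrected = reported_deaths * factor
    print(f"x{factor}: {corrected:,.0f} deaths, "
          f"{corrected / historical_deaths:.1f}x the historical total")
```

Reported deaths alone (~3,500) are only about 1.4 times the historical total; it is the doubled or tripled estimate (roughly 7,000 to 10,500 deaths) that puts the outbreak at around three times the entire previous toll or more.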

What is remarkable though is that, while ebola is terrifying and dramatic in its impact when it causes an outbreak, it appears to be a relatively easy agent to fight. Experimental vaccines tested so far on animals have been impressively effective, protecting against even injection of the live ebola virus. But because they are still at the test stage, these agents, which will be critical if we're to nip this outbreak in the bud, are nowhere near ready for mass production. Human trials of the vaccines are only now getting underway in Oxford, UK, and in the US. "Way too late," many are saying, to prevent the inevitable.

So why is it that, nearly 40 years after ebola first surfaced, the world finds itself in a state of panic, and up to ten thousand people are dead, owing to a bug that's probably preventable thanks to scientific research done decades ago? The answer is that ebola was regarded as someone else's problem. It was a tropical disease of low importance and (presumed to be) constrained by geography and climate to a part of the world that held little economic interest to the rest of us. But therein lies a salutary lesson: because if even a tiny fraction - less than 1% - of what the present outbreak is now costing the world in terms of lost productivity, humanitarian aid and human lives lost had been spent 20 years ago to develop an ebola vaccine, we probably wouldn't be in this position now. It's easy to dismiss tropical diseases as an issue that won't affect the West, but the present situation is a warning shot across our bows that we ignore at our peril.

Even over the relatively short time that I've been a virologist we've seen several potential pandemic agents emerge: SARS appeared in 2003, bird flu has been an ever-present threat since the late nineties, swine flu struck in 2009, MERS-CoV, the SARS-like agent from the Middle East, appears to be widespread in camels and can kill susceptible humans, and now ebola has taken everyone by surprise. What connects all of these outbreaks is that, by and large, they have all stemmed from poor countries.

Emerging infections amongst humans are overwhelmingly zoonoses - in other words, infections that originate in an animal and jump into humans. The places where this is most likely to happen are under-developed nations with poor healthcare infrastructure, poor sanitation, high population densities and close contact between humans and wild animals. Once it's established in humans, however, an emerging disease is no longer constrained by the habitat of its original host and can go global. HIV, which has infected and killed over 70 million people worldwide, originated in one small part of colonial Africa in the early 1900s, when the chimpanzee disease SIV spread into locals who were butchering the animals for bushmeat.

What drove the HIV explosion was the very same recipe that is putting the modern world at risk from other emerging diseases like SARS and now ebola. These are urbanisation, population pressures and fast global transport networks. Add the predicted effects of climate change to the mixture and the resulting toxic cocktail is sobering to consider.

The bottom line is that, in our quest for immortality, cheaper trainers, a thinner iPhone, batteries that last longer and budget holidays on tap, we're cruising for a virological bruising. And although we've now got white LEDs, microscopes capable of seeing structures smaller than light itself and we understand how the brain helps us to find our way to the fridge and back to retrieve a microwave meal, we could easily fall victim to diseases preventable by much more ancient technology...

Have a look at our Physical Sciences and Biological Sciences courses.

Dr Chris Smith
Public Understanding of Science Fellow

Consultant Virologist, Cambridge University & Addenbrooke's Hospital
Managing Editor, The Naked Scientists


Beyond the 'wow' factor: why we should all study science

Written by Erica Bithell Tuesday, 17 June 2014 16:07


Occasionally I am asked what made me choose to study the physical sciences. I think the questioner is usually hoping that I will cite a significant person or event: an inspirational teacher perhaps, a particularly transformational moment in technology, some key discovery, or perhaps a famous scientist.

It is true that I was fortunate to have teachers who taught with enthusiasm and without gender bias. My earliest memory of a world event is of Neil Armstrong’s first step on the Moon. I remember clearly the chemistry lesson in which I first learnt about the structure of the atom, and (at about the same time) watching the documentary series ‘The Ascent of Man’, in which Jacob Bronowski recounted the unfolding story of science and technology. At school, I read over and over the biography of the Nobel prize-winning crystallographer Dorothy Crowfoot Hodgkin, whose photograph hung on a wall in the school dining room.

I have certainly been influenced by all of these, but I should make a confession. I took science subjects because they were the ones in which I found that I got the quickest and most satisfying return. A more positive spin on this would be to say that I enjoy solving problems. In science the problems are right in front of you, from the very first moment, and you can pick away at understanding them piece by piece.

You are introduced to big questions on day one. Why does ice float? Why do magnets stick together? What does an atom look like? Where is the end of the Universe? Often, there are quite simple descriptive explanations but all repay closer examination. The simple descriptions fail to tell the whole story and the sense of satisfaction increases as one responds to the challenge of understanding each level of a progressively more detailed picture.

In recent weeks I have been repeatedly turning over the question of how I then came to have a professional interest in this area. June has seen something of a flurry of activity reminding us that women and girls in the UK remain significantly under-represented in the physical sciences and technology, but that much can and must be done to challenge and improve this state of affairs.

Campaigns to address gender stereotyping in toys have received a boost with LEGO’s announcement that the user-designed set of female scientist minifigures will be taken forward for production. The Opening Doors project has been announced to ensure that no student is deterred from studying certain subjects because of their gender. This is a response to the Closing Doors research and the IoP’s 2012 report 'It’s Different for Girls' which revealed that 49% of maintained co-educational schools sent no girls on to take A-level physics in 2011. Overall, just 20% of students progressing to A-level physics are girls, a figure which has changed little in a quarter of a century.

Campaigns of this nature most often emerge from recognising the need to educate and equip all young people with the right skills for the future. The focus tends to be on the wide open and largely unknown future of our young students. By contrast, most lifelong learners that I meet on science courses at ICE have a somewhat different agenda. They are people who have accumulated a great deal of experience and their horizons are wide. The need to acquire specific skills is less pressing, but the desire to understand and interpret is, if anything, much stronger. They want to distinguish the adventurous but achievable from the fanciful, to differentiate a genuine assessment of risk from scaremongering, to tell the difference between polemic and reasoned debate.

Cambridge is a fantastic place in which to engage with science and technology, and it is part of the fabric of the city. For a good number of years, I worked in the same rooms as had been used decades previously by James Clerk Maxwell, Ernest Rutherford and W L Bragg (and that was another scientifically inspirational experience). Each year the University’s lecture theatres and departments fill to capacity with visitors to the Cambridge Science Festival and Festival of Ideas. Those who cannot get to the city in person can follow the regular Research News postings. Cambridge’s success comes not from repeating what is known to work already, but from pushing the boundaries into the future.

But the fact remains that to get below the surface of all that is happening in science and technology – to dig a little deeper into those big scientific problems and to understand the next layer – takes a bit more than following the popular scientific press, or keeping up with the latest documentaries. Our courses for lifelong learners in the Physical Sciences at ICE are designed to meet this need. Our aim is always to make science accessible to as wide an audience as possible, and to enable our students to share in our own enthusiasm for our subject.

Dr Erica Bithell, ICE Academic Director in Physical Sciences

Physical science courses at ICE in 2014

Weekends at Madingley Hall

Part-time University qualifications

Science Summer Schools (6 July – 2 August 2014)

Erica on 'Deconstructing structures'

If your scientific appetite has been whetted, you can watch a video of a talk that Dr Erica Bithell gave as part of the 2012 Cambridge Science Festival.



An invitation and re-invitation to sociology

Written by Nigel Kettley Friday, 25 April 2014 13:48


Just before I was born, Peter Berger in his Invitation to Sociology lamented that, ‘There are very few jokes about sociologists. This is annoying… but it may also be instructive. The dearth of jokes about sociologists indicates ... that there is a certain ambiguity in the images that people have of them...’ (1963, p.2).

I am now inclined – having studied, taught and researched in this discipline for well over 30 years – to think that Berger was wrong. There are far too many jokes about sociologists. I’ve heard most of them. Some of my favourites include those pertaining to spoof definitions of the discipline. For example, ‘Sociology is the study of people who do not need to be studied, by people who do’ (cited in Meighan 1981). Or, more recently,

‘Sociology is a cult based around the intellectual pseudoscience of studying society. Originally popular with old bearded men who smoke pipes whilst reclining in arm-chairs, it has now managed to find a younger generation of converts thanks to its introduction into colleges and universities. Synonymous with Scientology, Sociology uses various methods of empirical investigation and critical analysis to develop and refine a body of knowledge and theory about human social activity…’ (Uncyclopedia 2013)

In one respect, at least, Berger (1963) was correct: there remains considerable ambiguity in the public understanding of sociology. What is sociology? How is society studied by sociologists? What, if anything, is the utility of the discipline? Big questions. Indeed, this status ambiguity also pervades the academic community within higher education more generally, especially among those researchers inclined to view themselves as natural scientists (correctly speaking, of course, researchers engaged in a social enterprise labelled ‘natural science’).   

Berger’s (1963, p.25) answer to these questions was to advocate sociology as ‘a form of consciousness’ promoting a critical understanding of the relationship between individual biography, institutions and society. Hence sociology is a humanistic discipline, similar to philosophy or history, capable of: debunking common sense; looking behind the social construction of reality; and discovering human meanings and values to explain society in a non-prejudiced manner. Berger also cautioned sociologists against ‘humorless scientism’ since the dominance of a ‘foolproof methodology’ in the discipline would ‘lose the world of phenomena that it originally set out to explore’ (1963, p. 165).

Now, this is an intentionally humorous, polite and, unfortunately, benign definition and vision of sociology. At one time I would have agreed. However, years of teaching and research in Cambridge have persuaded me otherwise (Kettley 2007, 2012). So what would my preferred definition and re-invitation to sociology look like?

Negative definitions are generally to be avoided in academic work. In this case, however, I will make an exception. Let’s start by saying what sociology is not and then proceed to my preferred definition. Most definitions of sociology, qua Berger (1963), deploy antinomies or dichotomies to characterise sociology, its method and its utility. For example, sociology is either the scientific study of society, using a statistical method, to produce generalisations about behaviour which facilitate prediction and/or social engineering; or, alternatively, it is the study of individual meanings and motivations, analysed through qualitative and narrative techniques, focused on understanding the social construction of reality to ‘give voice’ to disadvantaged people (Kettley 2012, p. 64).

These opposing definitions of sociology, problematically, trot out tired dichotomies in social science such as the supposed distinctions between: society and the individual; natural and social science; the quantitative and qualitative; pattern identification and causation; and social control and individual agency. In these cases, defining sociology, including its methods and utility, becomes a self-indulgent debate about the possibility of doing social science, given that our subject matter is human creativity and freedom, rather than the production of powerful explanations of patterned behaviour – the social in sociology – and their underlying causes (Kettley 2012, p. 38).

This conflation of the possibility of doing sociology with the study of the genuine causes of social behaviour has been labelled the ‘social scientific fallacy’ (Holmwood and Stewart 1991, p. 42). If we accept such epistemological anxiety about the definition of sociology, its method and its utility, furthermore, there is little wonder that: 1) there is continued public and academic confusion and scepticism about the discipline; and 2) there has been a proliferation of jokes about sociologists, contrary to Berger’s (1963) wisdom, because sociologists have confused their scholarly activity with the object of their inquiry (patterns of social behaviour and their underlying causes).

Drawing on researchers such as Stewart, Prandy and Blackburn (1980) and Holmwood and Stewart (1991), the so-called Cambridge school, I prefer a definition of sociology that foregrounds the unity of various strands of the discipline (Kettley 2012). Sociology can only make sense of its object of inquiry – the underlying causes of patterned social behaviour – when we reject antinomies and synthesise competing traditions.

In this approach, sociology is the empirical investigation of patterned social behaviour, including deviations from such patterns, which tries to provide powerful explanations of the underlying causes of these patterns (and variations in them spatially and temporally). It is unproductive logic and labour to think of the object of sociological inquiry as either individual behaviour or society. Rather, society is a relational construction – it is patterns and structures reproduced and transformed through human interaction – and the object of our empirical inquiry is precisely these relationships (not individuals or society).

It follows, logically, that sociology requires both an effective social psychology – to understand the motives behind individual behaviour – and an effective statistical method to explore how patterns of, for example, inequality come into being, are reproduced and occasionally transformed (Mennell 1980). Therefore, the method of sociology is both qualitative and quantitative, science and art, given its empirical study of the underlying causes of patterned behaviour. The discipline requires a mixed methodology. It makes no sense, moreover, to dichotomise the natural and social sciences, despite their varied objects of inquiry, for natural science is a social enterprise as fallible as any social science (Kettley 2012, p. 77). Nor does the natural world exist somehow independent of human relations to it.

Finally, not only is sociology an enterprise based on synthesising various intellectual traditions, but it also has utility. Sociology is a progressive discipline, it possesses a metamorphic capacity, for once the patterns of social relationships have been established and their underlying causes discovered it is possible to advocate social interventions and policies to change them. For example, much of my own research examines gender and social class inequalities in educational attainment and in access to higher education, and seeks to improve the educational experiences and outcomes of disadvantaged groups through curriculum innovation and changed education policy (Kettley 2007). A sociology based on the synthesis of intellectual traditions has utility for changing social life for the better. In an age of austerity and, for example, growing income inequality this is no joke and, unlike Berger (1963), I would invite or, perhaps, re-invite you to sociology not as a form of consciousness raising, but as a progressive intellectual and social force seeking to promote social justice (Dorling 2012).
If, like me, you are interested in competing methodological and theoretical traditions in sociology and the study of social differences and inequalities, the Institute offers a range of courses from non-accredited, open access programmes to full Master of Studies (MSt) degrees. In addition, the Institute is now offering an accredited Undergraduate Certificate in Social Sciences which provides a disciplinary-based introduction to Sociology, Politics and Psychology. The Institute also provides a Master of Studies in Advanced Subject Teaching which allows English and History teachers to update their subject-specialist knowledge, undertake classroom research and complete a dissertation. You are cordially invited.

Dr Nigel Kettley, University Senior Lecturer and ICE Academic Director in Education and Social Science


Berger, P L (1963) Invitation to Sociology: A Humanistic Perspective. New York: Doubleday.

Dorling, D (2012) Fair Play. Bristol: The Policy Press.

Holmwood, J and Stewart, A (1991) Explanation and Social Theory. London: Macmillan.

Kettley, N (2007) Educational Attainment and Society. London: Continuum.

Kettley, N (2012) Theory Building in Educational Research. London: Continuum.

Meighan, R (1981) A Sociology of Educating. London: Holt, Rinehart and Winston.

Mennell, S (1980) Sociological Theory: Uses and Unities (2nd edn). Walton-on-Thames: Nelson.

Stewart, A, Prandy, K and Blackburn, R M (1980) Social Stratification and Occupations. London: Macmillan.

Uncyclopedia (2013) Sociology. (accessed 3 April 2014)


Iron Age tax haven or gift to the gods? The strange world of archaeological interpretation

Written by Gilly Carr Monday, 10 February 2014 14:25


On one of my recent visits to the Channel Island of Jersey, where I conduct some of my fieldwork on the heritage of the German Occupation of the Second World War, I was given a private viewing of one of the island’s newest archaeological discoveries. The Jersey Hoard was found in June 2012 and was judged to be the world’s biggest Celtic coin hoard ever discovered. Jersey Heritage’s conservator, Neil Mahrer, allowed me into his laboratory so I could see how the project was coming along.

Before I moved into the field of Conflict Archaeology around seven years ago, I was an Iron Age archaeologist, and so the hoard was particularly exciting for me to see. Sticking out of the massive clump of corroded green-coloured coins was a golden torque, a form of necklace which is usually associated with Iron Age hoards found in Norfolk. The most famous hoard of torques ever found in the UK comes from Snettisham, and these are on display in the British Museum.

The Jersey hoard is estimated to contain around 70,000 coins, as well as pieces of silver and gold jewellery, and it was found and excavated in one solid mass measuring 140 x 80 x 20 cm and weighing about three quarters of a ton. The coins are made of a silver-looking alloy called billon, a mixture of silver and copper, although during the conservation process (still in its very early stages) a gold coin was found and more, I predict, can be expected. The hoard was excavated and removed from the site as a large clump, complete with a 5cm covering of soil so it wouldn’t dry out. That clump was wrapped in layers of a clear plastic film in order both to support and contain the hoard during excavation and conservation, to prevent it from breaking into pieces.

Since the hoard was brought to his lab, where it has been kept damp to prevent it drying out, Neil Mahrer’s job has been a busy one. He recently made an epoxy resin replica of the hoard, while it was still whole, and is now waiting to hear whether he will be given funding to purchase a laser scanner so that he can make a 3D virtual digital record of the hoard as it is dismantled. I was very pleased to be asked by Neil to write a letter of support for this important purchase.

Recording the hoard digitally in this fashion is very important; research in Iron Age studies over the last 20 years has revealed that the deposition of objects often shows a patterning which was governed by ritual concerns. If this patterning can be identified, it will facilitate a greater understanding of Iron Age cosmology and ritual practices in Jersey. Although thus far Neil has been able to work out that the hoard was deposited in a number of bags or packages of some sort, it will be revealing to be able to pinpoint precisely where in the hoard certain items were placed – items such as torques and jewellery which, currently, are peeping temptingly out of the corroded block.

Meanwhile, Neil’s job is a slow one: the hoard needs to be dismantled, one coin at a time, a job which he has estimated will take six to eight years for one person. His task will then be to record, clean and chemically treat each coin in turn to remove any corrosion. His conservation blog documents the process.

Talking to Neil about the laser scanner made me mull over the changes in the way that Iron Age coins have been treated and interpreted over time. We know that these coins were minted by the Coriosolites, an Armorican people who lived around 50BC in the area of modern-day St Malo and Dinan in France, just across the water from Jersey. In the past, archaeologists might have interpreted them as having been buried by Armoricans fleeing Roman invasion, to prevent Roman soldiers or others finding them in a time of conflict and uncertainty. Alternatively, they could have been payment to Iron Age Jersey people who fought as mercenaries in the Gallic Wars, alongside their kin across the water.

Today archaeologists might be more inclined to suggest that these coins were a diplomatic gift, or perhaps a ritual gift to the gods that no one had any intention of removing. The local papers joked that perhaps Jersey was an off-shore financial centre and tax haven 2,000 years ago!

While we cannot know for sure whether any of these interpretations (or even a combination of them) is correct (although we can make an educated guess at which one is wrong!), it is important for archaeologists to make sure that their interpretations are contextual and site-specific. Further excavation at the site of the coin hoard might help in this respect. Jersey (and the Channel Islands as a whole) is also acquiring a reputation for hoards; a few months after the coins were discovered, a late Bronze Age hoard of axe heads turned up just a few miles away.

Students who have registered for the Undergraduate Certificate or Diploma in Archaeology this coming Lent term will have an opportunity to learn more about these hoards. Those taking the ‘Prehistoric Peoples’ unit (part of the Certificate course) will be gaining an overview of the British Neolithic, Bronze and Iron Ages. Students on the Diploma course, meanwhile, will have an entire term devoted to Iron Age Britain. I hope that an Advanced Diploma student will be tempted to write a dissertation on the Jersey Hoard soon – this exciting discovery is crying out for further interpretation and Neil Mahrer has kindly given permission for such a student project. Applications are now open for those wishing to start an Advanced Diploma in autumn 2014!

Dr Gilly Carr, ICE University Senior Lecturer and Academic Director in Archaeology



Alice in Wordland

Written by Emily Caddick Bourne Saturday, 21 December 2013 15:50


I recently watched Jan Švankmajer’s Alice (1988), an adaptation which mixes stop-motion animation with a live actor’s realisation of Lewis Carroll’s character Alice, in the setting of a bizarre Wonderland which often seems to be crammed entirely within a single house.

Angela Carter mentions this film in a collection called On Strangeness (ed. Margaret Bridges, 1990, Tübingen: Gunter Narr Verlag). In an introduction to her story ‘The Curious Room’, Carter talks of an affinity between the surrealist aesthetic of Švankmajer’s work – which she describes as involving ‘the furious disruption of rationality’ – and the exploration of nonsense in Carroll’s stories.

The strangeness of the goings-on in Lewis Carroll’s work is part of what makes it philosophically interesting. For example, take its skilled pinpointing of what is absurd. For the Cheshire Cat to leave behind a bodiless grin is far more effective than, say, the idea of a bodiless mouth, because a grin is worn in a way a mouth is not – a grin is something which is done, and done by an embodied individual.

Another famous case is Humpty Dumpty’s claim to be able to make words mean whatever he wants, with his statement ‘There’s glory for you’ allegedly having meant ‘There’s a nice knock-down argument for you’. Humpty’s is not a theory many of us would endorse when trying to answer the questions of what meaning is and where it comes from. But a better theory should not only avoid fixing word-meaning by whimsy – it should also tell us why Humpty’s approach is perverse, and what this reveals about the role of the individual speaker in determining what they have said.

Communication is something so central to our interactions with one another that we often take it for granted until an occasion where it doesn’t work as we thought it would. For those who, like me, are interested in how communication underpins interpersonal exchanges and relationships, such occasions are important data. The Alice stories provide a study of possible breakdowns in mutual understanding between people. Some are extreme cases, like the impossibility of communication if the meanings of words are private rather than communal. (After all, if Humpty applies his strategy as a matter of course, there is no reason for Alice even to trust that his explicit definitions mean what she thinks they do!)

Others are versions of communicative problems more familiar from everyday discourse. The philosopher Paul Grice has argued that ‘conversational implicatures’ – where we manage to tacitly communicate something without explicitly saying it – can be understood in terms of mutual expectations concerning how one’s conversation partner will converse. For example, we generally have expectations that the person we are talking to will judge the things she says to be relevant to the topics of the conversation; and we have expectations concerning how much information a person should provide. Such expectations govern what it is to be conversationally cooperative. But Alice’s conversation partners often seem to be extremely uncooperative, at least by Alice’s standards and by ours. Why do they say the things they do to her? Are they trying to be helpful, or trying to be obstructive? Are they suggesting something she has failed to notice? Do they willingly flout or ignore Alice’s familiar communicative conventions? Do they operate with different sets of expectations which we need to try to understand?

It’s not just Alice’s understanding of others’ linguistic behaviour which is vulnerable in Wonderland and through the looking glass, but also her understanding of their behaviour more generally. Understanding somebody is a key to anticipating their behaviour. When we have difficulty coming up with reasons why others behave the way they do, we face a serious block to predicting their actions – and, in turn, a serious block to understanding them well enough to trust them. The behaviour of Alice’s companions is often alien (whether it’s advancing an argument which doesn’t add up, or engaging in insufficiently constrained beheading). Motives and interests are unclear. Characters often have a serious degree of unfathomability, which is why they are potentially dangerous to be around.

Work by philosophers like Donald Davidson has made a strong case for thinking that in order to make any sense of another person, you must work on the assumption that the other person is to some extent like you. We mustn’t suppose too much similarity, of course, else we would leave no space for the idea of difference between someone else’s outlook and our own. But enough similarity must be assumed to guide us in attributing beliefs and attitudes to the other person. When this assumption becomes unstable – as it sometimes threatens to in Wonderland – our chance of comprehending the other being, treating them as a person with thoughts and aims, starts to disappear.

The Alice stories bring our attention to the hazy line between strange goings-on which can nevertheless be interpreted (in principle and with effort) – and, on the other hand, the genuinely incomprehensible. By raising the question of what we can make sense of and how we do it, Carroll’s stories, and adaptations like Švankmajer’s, point us towards something which underpins how humans relate to one another.


If you found this blog interesting, ICE offers several courses which pick up on its themes. If you are interested in surrealism in Alice and other films, you might enjoy our weekly course on Surrealism and film. If you’re interested in how literature raises philosophical questions and proposes answers, you might enjoy our online reading group Philosophy through literature, and if you’d like to reflect philosophically on how literature works, you might like Philosophy of literature: understanding other minds through fiction, in our Literature Summer School. Finally, if you’d like to know more about philosophy of language and communication, you might enjoy our weekend course, The meaning and purpose of words.

Dr Emily Caddick Bourne, ICE Academic Director for Philosophy


A seasonal scientific miscellany

Written by Erica Bithell Friday, 20 December 2013 09:35

erica bithell 180pxtwelve days of christmas

The Twelve Days of Christmas are a bridge between the twelve months of the preceding and of the following years. The count of twelve is widespread in our lives: twice twelve hours in a day, twelve spans of five minutes in an hour, twelve inches in a foot. More of this counting later, but my own ‘count of twelve’ for the Christmas season is twelve science-related highlights from Cambridge University’s Research News feed, one from each month passed in 2013. This is an entirely personal selection, so do not be surprised if you detect a bias towards the physical sciences, engineering and mathematics (I am myself a materials scientist). New materials, new applications and fresh insights into how the physical world works sit quite comfortably alongside a seasonal sense of excitement!

In January, we learnt how synchrotron radiation has been used to image the backbone structure of the earliest four-legged animals.

February brought the news that a team at the University of Surrey, in collaboration with astronomers in Cambridge, have been able to use the behaviour of phosphorus atoms in silicon to model the extreme chemistry on the surface of a white dwarf star.

Back on earth, March brought the opening of a state-of-the-art gallium nitride growth facility, which will allow researchers to improve the techniques for growing high efficiency LEDs on cheap silicon substrates. Experiments planned for the new reactor have the potential to save the UK £1 billion per year in electricity usage.

At the beginning of the financial year, April saw the roll-out across Europe by Cronto, a Cambridge University spin-out company, of a security product designed to protect us from online malware by using visual symbols and dots to verify the authenticity of customer transactions.

May saw another materials development, of a flexible, stretchable sheet material with colours as vibrant and shimmering as an opal, but without the use of potentially toxic dyes or metals. The material has potential applications for security, textiles and sensing.

Early summer is evidently the season for new materials: ‘carbon nanotube candyfloss’ was reported in June, not for consumption by visitors to the Strawberry and Midsummer Fairs, but as a potential route to super-strong electrical wires.

If June is the time for fairs, July is the time to travel, and was also the month in which Cambridge University Library made the archive papers of the 18th and early 19th-century Board of Longitude available to the public via the Cambridge Digital Library project.

August saw a metaphorical journey into outer space, with observations of the Sagittarius A* black hole at the centre of our own galaxy, the Milky Way, rejecting gas clouds when these are too hot to be sucked in and devoured.

Much closer to home, spectacular images were published in September of the first known example of functioning natural mechanical gears, in a plant-hopper insect.

With October and the onset of chillier weather, two Cambridge engineers published their analysis of the whistling sound generated by a traditional kettle. Although the underlying reason for the noise is a straightforward piece of physics, the details of when and how the sound is produced are much more complex.

The nature of policy-making is such that those taking and presenting decisions often require scientific input, and need to apply this information without necessarily sharing the same depth of technical knowledge. November saw the publication of a timely list of ‘twenty top tips’ to help non-scientists appreciate the limitations of scientific enquiry.

My twelve months’ selection ends in December with a biological materials development: the announcement that certain retinal cells could be printed into patterns using piezoelectric inkjet technology, with the potential for retinal repair procedures.

And what of the significance of the count of twelve itself? Is this just a pre-decimal, cultural relic? Not at all – 10 is a useful base for arithmetic calculation, but 12 is exceptionally useful for division into equal parts. Twelve is the smallest positive whole number divisible into two, three, four or six parts which are also whole numbers; 60 is the smallest which is similarly divisible into two, three, four, five or six parts. No surprise then that our measures of time and space put 60 seconds in a minute, 60 minutes in an hour, 360 degrees (6 x 60) into a circle. The Babylonians based their counting system upon multiples of 60, which you can learn more about at any time of year from the NRICH Mathematics resources (search for ‘Babylon’).
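For readers who like to check these things for themselves, the claim is easy to verify with a brute-force search. A minimal Python sketch (an illustration, nothing more):

```python
def smallest_divisible_by(divisors):
    """Brute-force search for the smallest positive integer
    that every number in `divisors` divides exactly."""
    n = 1
    while any(n % d for d in divisors):
        n += 1
    return n

print(smallest_divisible_by([2, 3, 4, 6]))     # 12
print(smallest_divisible_by([2, 3, 4, 5, 6]))  # 60
```

(The same reasoning shows why 360 is such a convenient count for degrees: it is divisible by every whole number from one to six.)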

Our courses at ICE aim to bring the excitement of being part of Cambridge learning and research to as wide a range of students as possible. In the Physical Sciences we have courses coming up in the next few months on Geological Hazards and Geological History, Mathematics (not just as a spectator!), and Nanomaterials. I would encourage you to take a look at what is on offer, come and join us, and share in our scientific journey.

Dr Erica Bithell, ICE Academic Director in Physical Sciences


MOOCs, SOCCs and kisses

Written by Jenny Bavidge Tuesday, 09 July 2013 10:19

jenny bavidge2 180px square

I’ve recently finished teaching a five-week course on the creative and critical afterlife of Wuthering Heights. We looked at various responses to Emily Brontë’s novel, from the commercial (MTV’s film version which recasts Heathcliff as a blond rock star, oh dear) to the brilliantly eccentric (the still-classic Kate Bush song). I’ve taught this subject before, but this was the first time I’ve conducted a course entirely online, never meeting my students face-to-face. My students had the advantage over me: they could see my short video lectures, whereas I had only a small photograph and their postings by which to get to know them.

Academic colleagues sometimes express uncertainty about how teaching online works, and I’ll admit to some anxiety about how it would feel to teach students I’d never meet in person. A lecturer friend of mine says he can only imagine teaching students when he can “see the whites of their eyes”, and it’s certainly true that any teacher of any subject will know how much they respond to their students’ body language: how one picks up the eager lean forward, the little flicker of comprehension or disagreement, a politely concealed yawn or an exasperated eye-roll as you speak too fast or snigger too long at your own joke.

As well as this kind of physical noticing, eye contact feels important in the classroom. You can prompt someone to speak by staring hard at them, or instigate a cheerful argument by glancing at a student whose opinion you suspect differs from that of the person speaking.

My old schoolfriend Hannah Thompson, a Cambridge alumna who now teaches French Literature at Royal Holloway, writes a wonderful blog about her research into cultural and literary representations of blindness which also charts her own experiences as a partially-blind lecturer. In an article about her research and teaching practice, Hannah describes how she has recently changed her approach in the classroom as she has become less able to make eye contact with class members or recognise faces. Rather than relying on the connection of eye contact, Hannah encourages her students to forget raising their hands or waiting for the conductor/teacher to bring them in, and to call out their responses and answers instead. Her students were nervous at first, but she describes how, gradually, some of the usual formalities and restrictions of the seminar room began to fall away. The students’ understanding of their teacher’s disability, and her inspirational mastery and exploration of it, provoked all sorts of interesting responses to their subject of study and to their experience of studying it together.

The situation in an online seminar room is different to Hannah’s classroom, of course. I can’t see my students’ response to my talks or questions, but I can’t hear them either. It is possible to set up online seminars where students communicate with audio rather than typing or ‘live’ lectures where students can type in real-time questions, but many of my students were in different timezones, dropping in from Japan or the US (and, heavens, Northampton) so we normally didn’t have even that vague sense of each other’s physical presence to aid our communication. Instead, we got to know each other through initial introductions in the orientation week, where students worked out how and where they could talk to me and to each other, and then relied on the space of online forums to discuss the week’s reading.

Much of the recent discussion about online courses has concerned the growth of MOOCs (Massive Online Open Courses) where the emphasis is on massiveness and accessibility. At ICE, our model is the more cosy-sounding ‘SOCCs’ (Small Online Closed Courses), which are taught to closed groups with a limited number of students. Our SOCCs are organic, hand-knitted experiences, carefully designed to fit busy feet and based on the artisanal pedagogic approach for which Cambridge is known: small-group teaching, led by a tutor, encouraging wide reading and independent thinking.

Unlike most MOOCs, your SOCC tutor will talk back to you when you post a comment or want to argue a point. And like undergraduate modules that develop from year to year, our courses are also protean in their content because they are research-inspired. My ICE colleague Ed Turner recently taught part of his online course in Conservation from the jungles of Sumatra where he was conducting research; my own course was punctuated by a visit to the no less exotic University of Leeds for a conference on creative responses to the work of the Brontës, so I came back to my students with my head full of Lisa Sheppy’s ‘Empty Dress’ and discussions of the Japanese version of Wuthering Heights.

One recent commentator on the MOOCs/SOCCs issue says that the mobility and flexibility of online courses are best suited to vocational subjects designed to respond to an ever-changing employment landscape, and not for traditional academic topics which move more slowly. Adam Kotsko says: “A course on The Odyssey could remain relatively unchanged for a long time, but that’s not the kind of thing that people are generally looking for with online ed.” Whyever not? That ‘kind of thing’ (the Humanities in general, or just old stuff?) isn’t inert knowledge. Our readings and understanding of The Odyssey, or Wuthering Heights or Ancient Rome change with every year, every new adaptation, or archaeological find, or critical move, or, indeed, with every new group of students who come together to travel with Odysseus, Heathcliff or the Romans.

I also don’t accept that Humanities courses which might rely on traditional techniques of slow and close reading can’t be taught via speedy digital technologies. And, in truth, the online class I was teaching had something rather beautifully old-fashioned about it even in its shiny new medium; as we post and respond to each other, we’re engaging in the communication common to letter-writers over the centuries. Writers, readers, editors, and groups of literary critics have always sent their thoughts over many miles: admiring, caustic, critical, devoted, fannish or furious, and, above all, focused, letters of discussion and comment. Digital letter-writing has its own advantages. There’s a spell-check for a start. Online, in-class discussions are more carefully constructed than emails, longer than tweets, and can use the little windows of hyperlinks which drop interlocutors into related areas of discussion alongside the main topic: I can place a link in a sentence to something that my reader can dive off to read before they come back to finish my sentence.

In a verse letter to his friend Sir Henry Wotton, John Donne wrote in praise of the power of words to overcome distance:

“…more than kisses, letters mingle souls,
For thus, friends absent speak.”

There are many joys in the weekly encounters of our Certificate and Diploma classes at Madingley, or the yearly visits of our Summer School students who arrive in Cambridge with the swifts, but as Donne suggests, there are other ways to ‘mingle souls’, and although we can’t promise kisses, we think our SOCCs will warm you up.

--- Dr Jenny Bavidge, ICE Academic Director for English Literature

Find out more about online courses at the Institute of Continuing Education


What’s the buzz?

Written by Ed Turner Friday, 31 May 2013 10:09

ed turner 180pxbees 180px

Over the last few weeks the British countryside, and particularly the gardens, woods and fields around Madingley, has really come to life. From where I am sitting in my office, I can see the meadow at the back of Madingley Hall sparkling in the sun and speckled with pale yellow: the flowers of hundreds of cowslips.

Here and there delicate pink cuckoo flowers are in bloom, providing a rich resource for springtime bees and butterflies. Walk along one of the mowed paths through the meadow and you can see bees at work, particularly bumblebees, as they drone from flower to flower, collecting pollen and nectar for their developing nests.

Follow one of these furry little foragers back to their home and their lives appear even more remarkable. Many, like the buff-tailed bumblebee, one of our commonest species, make their nest below ground in deserted mouse holes. In the early spring the new queens emerge from hibernation and search for a new home. This is when you can see them hovering like miniature helicopters low over the ground, occasionally dropping down to inspect a likely looking hole or crevice. Soon they set up shop and start to construct cells of wax, which they provision with pollen and nectar and where they lay their eggs.

As the season develops, the eggs hatch into grubs, which are fed by the young queen and grow rapidly. After a few weeks, they pupate and then emerge as new adults: the first generation of workers. Now the queen no longer leaves the nest, but stays at home laying more eggs for the workers to tend. As spring turns to summer and the nest expands, more and more workers are produced until a single colony can number several hundred individuals!

But, despite the prosperity, things are not as peaceful as they appear. As the colony grows it is time for the next generation of queens and males to be produced, and now mutiny occurs. To produce these new reproductive individuals, the queen relaxes production of the special pheromones by which she has been keeping the workers under control. Without these chemical shackles, anarchy breaks out: the workers begin to lay their own eggs and can even attack and kill the queen, their mother! To think that all this life and tragedy can occur just below our feet in a Cambridgeshire garden.

With so many insects flying around the meadow at Madingley, it is easy to forget that bees and other pollinators have declined severely during the last century. What caused the dramatic loss isn’t entirely clear. The destruction of flower-rich meadows with agricultural intensification and increases in herbicide and fertiliser use were probably major factors. But in recent years, researchers have identified another potentially serious threat. Pesticides called neonicotinoids, which are very widely used on flowering crops such as oilseed rape, not only stop colonies from growing as quickly, but also reduce the chance that foraging workers will find their way back to the nest. Partly as a result of this growing evidence, the EU recently passed legislation to ban the use of three of these pesticides on flowering crops for the next two years.

Hopefully this initiative and the use of more biodiversity-friendly farming methods will help to restore wild pollinator communities to their former glory. Not only are these remarkable species part of the natural world, but they carry out invaluable services for us by pollinating many of our crops. If you have a garden or even just a window box, you can also help make sure that these pollinators get enough to eat, by planting bee-friendly flowers and by not being too precious about tidying up your borders. After all, what would summer be without the buzz of bees?

If you would like to learn more about pollinators, their ecology, value for pollination and conservation, why not sign up to our weekend course, Bees, flies and flowers, starting on 14 June, run by pollinator professional, Dr Lynn Dicks.

Dr Ed Turner, ICE Teaching Officer and Academic Director in Biological Sciences


Teaching the unteachable

Written by Sarah Burton Monday, 25 March 2013 10:07

Creative writing 106 200px

‘But can you really teach people how to write?’

It’s a line I’ve heard so many times, yet it’s still surprising. When someone wants to become a painter or a sculptor, they go to Art School. No-one says: ‘But can you really teach people how to paint?’ It’s just universally accepted that if you are artistically gifted you will benefit by studying technique, observing how other artists have achieved their effects, and experimenting, under the guidance of tutors (who are also artists) in order to develop your own unique style. But teaching (or learning) Creative Writing is regarded as a much more spurious affair. Writers are born, not schooled, according to some.

One of the writers I read in my early teens who made me sit up and realise there were vital voices which had not formed part of my Eng. Lit. education at school was Kurt Vonnegut. This was a writer who changed everything I had previously thought about what writing was, or could be. I didn’t know then that he had begun teaching Creative Writing at the University of Iowa in 1965, at the same time as he began writing the novel which brought him to the public’s attention, Slaughterhouse-Five (published in 1969).

Past students on that course included Tennessee Williams and Flannery O'Connor. (‘One wonders what ever became of them,’ Vonnegut reflected, when he, too, was faced with the same question: ‘But can you really teach people how to write?’)

In defence of teaching creative writing, Vonnegut repeated a legend that he felt made a key point.

‘A tough guy, I forget which one, is asked to speak to a creative writing class. He says: "What in hell are you doing here? Go home and glue your butts to a chair, and write and write until your heads fall off!" Or words to that effect.

‘My reply: "Listen, there were creative writing teachers long before there were creative writing courses, and they were called, and continue to be called, editors."’

Vonnegut said that the most recent person to ask him the question about whether writing could be taught was a journalist, and that, in all probability, the asker of the question was taught by an editor. Many novelists were previously journalists, and the on-the-job training, because informal, remains largely unrecognised. Returning to the comparison with artists, Leonardo da Vinci was educated in the studio of Verrocchio; Michelangelo was apprenticed to a painter and subsequently studied under a sculptor. They may well have been born artists, but they grew and learnt and refined under the critical eye and nurturing hand of other artists.

Vonnegut went on: ‘If the tough guy was Thomas Wolfe or Ernest Hemingway, he had the same creative writing teacher, who suggested, on the basis of his long experience, how the writer might clean up the messes on paper that he had made. He was Maxwell Perkins, reputedly one of the greatest editors of fiction who ever lived.’

Discovering this article fairly recently whetted my appetite for finding out more about the literary editor, Maxwell Perkins. I learned that he had indeed made Thomas Wolfe publishable by encouraging him to cut 90,000 words from his first novel (that, in itself, is the length of a full-blown novel); he brought Hemingway’s first book to the press, fighting in-house resistance to Hemingway’s ‘bad language’ by securing the author’s co-operation in deleting some of it and defending the rest of it. Vonnegut was so sure of Perkins’ contribution to literature that he did not even add that Perkins had also mentored F Scott Fitzgerald and published his first novel, not to mention bringing Erskine Caldwell and Alan Paton (Cry, the Beloved Country) to the world’s attention, supplying the plot for The Yearling (Marjorie Kinnan Rawlings), and publishing the first efforts of a number of Pulitzer prize-winning authors.

Perkins never rewrote his authors’ works. He suggested titles and plots. He gave advice about structure and selection. He advised them on what to read. And he defended them. His letters to writers are full of thoughtful, sound and sensitive advice, tailored to the needs of that particular writer, and yet of universal value. He, like Vonnegut, speaks to me about what writing can be.

And there’s another, purely economic, aspect to Creative Writing as a subject. Quite simply, it saves time. As Vonnegut himself observed, he wished he had attended a good creative writing course at the beginning of his writing career. ‘To have done so would have been good for me.’ He quotes another author who regretted not having taken a course at Iowa or Stanford when he was starting out as a novelist. ‘That would have saved him, he said, the several years he wasted trying to find out, all by himself, the best way to tell a story.’

Creative Writing is now offered as an A-level, but there appears to be an expectation that English teachers will just be able to teach it. Some will do it, easily and well. Others will struggle, and so will their students. Somehow it’s sneaked onto the syllabus without any training being offered, as if there is an assumption that English teachers can teach this, because ‘it’s all writing’. If this experiment fails, it won’t be the teachers’ fault. It will be down to a fatal misunderstanding of the difference between criticism and practice.

So, let’s return to the initial question. Can you teach people how to write? Yes, if the student has a burgeoning talent and the tutor understands how to nurture it. You also have to resist the temptation to make everyone write like you do. You need to help them write like themselves. That’s what Maxwell Perkins did. The writer, in his words, had to ‘own the book’.

Dr Sarah Burton

Sarah is Course Director and Tutor on many of ICE's Creative Writing courses, including the new MSt in Creative Writing


What do scientists actually do?

Written by Ed Turner Friday, 04 January 2013 13:50

ed turner 180pxTrekking 180pxIndonesia View 180pxField camp 180px

One of the advantages of being a tropical biologist is that you have a legitimate excuse to escape the soggy shores of England for a few weeks over the winter. There’s something almost magical about stepping onto a plane in the UK, where the trees are leafless, the nights are drawing in and everyone seems to have a head cold, and disembarking in the tropics, where the humid heat hits you like a wall and everything’s in full flower. These research trips (or ‘holidays’ as one of my friends irritatingly describes them) form the backbone of my research on tropical biodiversity and conservation.

Also, they really aren’t holidays. Over the weeks or months I am away, each day is carefully planned to fit into a research schedule that makes the most of my time. First there is the set-up and visa chasing. In my last trip this involved a noisy and smelly week in the centre of Jakarta, running from government office to office delivering passport photos and filling in forms. Then there is travelling to the research area (often quite remote), liaising with local scientists and collaborators, and setting up research plots. Once this is done, there is the careful collection of data. In my case this usually involves surveying for different insect species, collecting specimens using standard techniques and storing and identifying them.

The set-up and distribution of each survey area, the methods used and the types of insects studied are all planned well in advance; determined by the research questions being asked. Once the data is collected, the results are analysed statistically and written up for publication in peer-reviewed scientific journals. For applied research the process doesn’t stop there. Perhaps the most important step is to make sure that findings are communicated to other organisations and individuals who can then make use of the information. For my research, presentations to the agricultural industry and conservation organisations are vital in ensuring results actually inform policy and management on the ground.

This whole process of research, from the inception of a research question to planning, project design, data collection, analysis, write up, review, publication and communication, is central to how science works. Most of it isn’t at all glamorous or exciting, but rather careful, balanced and reflective. Only rarely do findings lead to a sudden shift in concepts or how things operate; rather data slowly accumulates which provides support for or against a particular theory or process. As Isaac Newton put it “If I have seen further it is by standing on the shoulders of giants” or as Hal Abelson has it “If I have not seen as far as others, it is because giants were standing on my shoulders.” Science is all about communication and building on the ideas and concepts of other researchers. I sometimes wonder if this careful and interactive core of science is underplayed or ignored when science is portrayed in the media. All too often scientists appear as egg-headed intellectuals, crouching in their high-tech labs awaiting a eureka moment, or, for field biologists, charging through the tropical rainforests without apparent direction on the lookout for a cure for cancer or the discovery of a new species.

At Madingley we have a wide range of courses coming up over the next few months, which break down these misconceptions and give participants a front-row seat at cutting-edge scientific research. In each course, participants have the opportunity to interact with the Madingley biology tutors, who are often Cambridge scientists, and to find out more about how research takes place and its application in the real world.

For example in January, you can dive into the topic of marine conservation in Marine biology and conservation: exploring planet ocean. In February, you can discover more about the history of research in Cambridge and the value of biological collections in our Cambridge collections course, which provides a rare opportunity for a behind the scenes look in five of the Cambridge museums. Complementing this, you can also find out about contemporary Cambridge research in How science works.

In March and April you can discover more about natural history around Cambridge with Birds in spring and learn how to collect biological data in the field with Wild Madingley. Alternatively, if you’re more of an armchair biologist, you can learn about fieldwork in the most challenging environments from the comfort of Madingley Hall in Polar challenges for people and science.

So why not sign up to some of our courses, get behind the lab coats and big spectacles to meet real scientific researchers, and find out more about what scientists do and why research is important?

Dr Ed Turner, ICE Teaching Officer and Academic Director in Biological Sciences