Professor Nutt and the classification of harm through the misuse of drugs

The sacking of Professor Nutt (now ex-head of the Advisory Council on the Misuse of Drugs) by the Home Secretary Alan Johnson has been in the news today. The immediate cause of his sacking appears to have been this recently published paper, originally presented as the “2009 Eve Saville Memorial Lecture” at the Centre for Crime and Justice Studies at King’s College in July 2009. The lecture appears to have been a policy discussion based in part on his classification of relative drug harm, which was first published in The Lancet in 2007:

Development of a rational scale to assess the harm of drugs of potential misuse, David Nutt, Leslie A King, William Saulsbury and Colin Blakemore, The Lancet, vol. 369 (2007), pp. 1047–1053.

This classification of harm was based on assessment by two sets of experts: the first set of 29 from the Royal College of Psychiatrists’ register as specialists in addiction; the second drawn from a wider community involving members “ranging from chemistry, pharmacology, and forensic science, through psychiatry and other medical specialties, including epidemiology, as well as the legal and police services”. The basic scheme was to ask these experts to assess the harm caused by a set of 20 substances (mainly illegal, but including alcohol and tobacco) on a set of 9 measures covering physical harm, dependence and social harms:

This is done iteratively using what is called a ‘delphic process’: the experts make their initial numerical assessments independently in a first round, but can then modify those assessments once they have seen and discussed the assessments made by others. They are told the substances in advance so they can read up on them. Once some pre-determined stopping criterion is reached, the scores for each measure are averaged and combined to produce an overall measure of harm. The rankings produced by the two separate groups were very much in agreement. The resulting mean harm scores for the twenty substances are shown in the graph below.
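The final averaging step is simple enough to sketch in code. Here is a minimal illustration in Python, assuming each expert rates each drug on the nine measures using a 0–3 scale; the drug names, ratings and measure labels below are invented for the example and are not the paper’s data:

# Illustrative sketch of the final averaging step only; the Delphi
# discussion-and-rescoring rounds are not modelled. All names and
# numbers below are invented, not data from the paper.
import statistics

# Nine harm measures (labels paraphrased from the paper's three
# categories: physical harm, dependence and social harms)
MEASURES = ["acute harm", "chronic harm", "intravenous harm",
            "pleasure", "psychological dependence", "physical dependence",
            "intoxication", "other social harms", "health-care costs"]

# scores[drug] is a list of per-expert ratings, each rating being
# nine numbers on a 0-3 scale (0 = no risk, 3 = extreme risk)
scores = {
    "alcohol":  [[2, 2, 0, 2, 2, 2, 3, 3, 2],
                 [2, 3, 0, 2, 2, 1, 3, 2, 2]],
    "cannabis": [[1, 2, 0, 2, 2, 1, 2, 1, 1],
                 [1, 1, 0, 2, 1, 0, 1, 1, 1]],
}

def mean_harm(ratings):
    """Average each measure over the experts, then average the nine
    per-measure means into a single overall harm score."""
    per_measure = [statistics.mean(r[i] for r in ratings)
                   for i in range(len(MEASURES))]
    return statistics.mean(per_measure)

for drug, ratings in sorted(scores.items()):
    print(f"{drug}: overall mean harm {mean_harm(ratings):.2f}")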

The interesting thing about this ranking is that tobacco and alcohol (which I’m currently enjoying in the form of a fine Chardonnay) are found in the middle of the range, below heroin and cocaine but above cannabis and ecstasy, a statement which in part earned Professor Nutt his dismissal.

Now you could argue that “The Lancet” paper is flawed, and Professor Nutt himself makes suggestions for improvements in methodology, but the thing is: there is no competition. The current classification of drugs into classes A, B and C is not made on an assessment of harm based on any published or transparent criteria. If Alan Johnson wants to argue that Professor Nutt is wrong in his evaluation of the relative harm of drugs, he should do so on the basis of a transparent evaluation process, not because he just doesn’t like the advice he’s been given.

Though I have not focussed on it in this post, the Eve Saville lecture sets this assessment of harm alongside a discussion of other issues, including the media reporting of deaths through drug misuse. It also includes some support for elements of government policy on drugs; in particular, he says:

One thing this government has done extremely well in the last ten years is to cut away much of the moral argument about drug treatments. They have moved in the direction of improving access to harm reduction treatments, an approach that, I think, is wholly endorsed by the scientific community and by the medical profession.

Update
1st November 2010: Professor Nutt has published an improved version of this study in The Lancet (pdf); the process used is a little different, and an attempt has been made to improve the relative weight given to different harms. This revised study finds heroin, crack cocaine and metamfetamine to be most harmful to individual users, and alcohol, heroin and crack cocaine most harmful to others.
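As I understand it, the revision weights the different harm criteria explicitly rather than averaging them equally. A toy Python sketch of weighted scoring (every number below is invented; the paper’s actual criteria, weights and scores are more numerous and elaborate):

# Toy weighted combination; all figures here are invented for
# illustration of the arithmetic only.
weights = {"harm to users": 46, "harm to others": 54}
scores  = {"harm to users": 34, "harm to others": 46}

overall = sum(weights[k] * scores[k] for k in weights) / sum(weights.values())
print(f"overall weighted harm: {overall:.1f}")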

Who Dr.?

After the big and shiny experience as an undergraduate I went off to do a PhD, to make me into a Dr. This was something I’d intended to do since a visit to the Campden and Chorleywood Food Research Centre as a school student; there we were shown around the labs, and I was convinced that a career in science without a PhD was going to be a serious uphill struggle involving the cleaning of much lab glassware.

The exact nature of a PhD varies from country to country and from subject to subject. In the UK a PhD in physical chemistry typically takes three years, and the supervisor usually has a big say in what the student does.

I did my PhD at Durham University in the Interdisciplinary Research Centre for Polymer Science, supervised by Prof. Randal Richards. The prime motivation for this particular PhD was the cash: it was funded by Courtaulds plc and paid a research assistant’s salary. It also got me back to more big and shiny science, in the form of the neutron source at the Rutherford Appleton Laboratory (RAL), with the added benefit that a very skilled technician made my polymers for me. This was good because I’ve never been “at one” with synthetic chemistry; the untidiness of the process didn’t suit my temperament. Apocryphally, the start of polymer science was a bit slow because the early polymer synthesisers couldn’t crystallise their materials; this led to much derision from other synthetic chemists, who made lovely crystals from their materials rather than the black sludge that the polymer scientists made. The molecular nature of polymers wasn’t appreciated until the 1920s, which is really rather recent.

So for three years I slaved away: I prepared samples, spinning thin films onto lumps of shiny flat silicon; I went down to RAL for 48-hour experimental runs; I wrote FORTRAN programs to do data analysis; I read journal articles; I attended conferences, made posters and gave presentations. I observed, from a small distance, the activities of synthetic chemists.

The chap across the desk from me was a historical re-enactor; I watched as he made his own chain mail.

It was whilst I was writing my thesis, entitled “Surface composition profiles in some polymer mixtures”, that I first met the elephant of despair. The elephant of despair lived in the library; he was made of a transparent material, so you could scarcely see him, and he was only about six inches tall. He stood in the gaps between the journals, waiting for when I would arrive to find an article and discover on the way a paper, published ten years earlier, which captured most of what I’d slaved over for the last three years. His plaintive trumpeting has haunted me on and off through the years.

I think the day I decided I wasn’t going to make an effort to get “Dr” onto all my paperwork was the day I was in the bank and the man in front of me was having a lengthy discussion with the cashier because the printed numbers in his savings book did not line up with the ruled lines. After he’d left, the cashier turned to her colleague and said: “He had to complain, he was a doctor”. As it stands, the only people who call me “Dr Hopkinson” are my parents, one of my credit cards and the odd polite student.

For reasons I don’t understand, medical doctors appear to refer to PhDs as “proper doctors”, whilst I’ve always considered myself a bit of a fraud since I was not a “proper doctor” – one who could potentially save your life. Perhaps they’re just being polite.

And now I’m nearly a PhD grandfather: I supervised three PhD students of my own, and one of them has a student who is about to do her viva. I don’t have children, but I feel very ‘parental’ about my students – I’m immensely proud of them and their achievements.

Wordless Wednesday

Talkin’ about my generation

My generation have all been wallowing in nostalgia at the Electronic Revolution strand on BBC4, in particular Electric Dreams (the ’80s) and Micro Men (the story of Sinclair and Acorn computers). We grew up in a golden age for programming: the generation before us had no hardware, and the generation after us had no need to write their own software. We programmed because we had to.

I had a Commodore VIC-20: cheaper than the BBC Micro, classier and more substantial-looking than the Sinclair ZX81, and available slightly before the ZX Spectrum. All of these lovely old machines are available for your viewing pleasure at the Centre for Computing History, along with many others. Look around the internet and you can also find all manner of emulators and manuals for these early machines. We wrote our own programs, or we typed in games from magazines – often a rather lengthy and error-prone process.

I found the “VIC-20 Programmer’s Reference Guide” here, re-typed by Asbjorn Djupdal. Here’s a snippet: a program which allows you to enter the scores in each quarter of an American football game and then prints them out on screen in a table:

100 DIM S(1,5), T$(1)
110 INPUT "TEAM NAMES";T$(0),T$(1)
120 FOR Q = 1 TO 5
130 FOR T = 0 TO 1
140 PRINT T$(T),"SCORE IN QUARTER" Q
150 INPUT S(T,Q)
160 S(T,0) = S(T,0) + S(T,Q)
170 NEXT T,Q
180 PRINT CHR$(147) "SCOREBOARD"
190 PRINT "QUARTER";
200 FOR Q = 1 TO 5
210 PRINT TAB(Q*2 + 9)Q;
220 NEXT
230 PRINT TAB(15)"TOTAL"
240 FOR T = 0 TO 1
250 PRINT T$(T)
260 FOR Q = 1 TO 5
270 PRINT TAB(Q*2 + 9) S(T,Q);
280 NEXT
290 PRINT TAB(15) S(T,0)
300 NEXT

Oh, this brings back memories!

To me programming and science (or at least physics) are intimately linked; almost the first programming I ever did was to visualise beat frequencies. To this day, if I want to really understand a scientific paper I’ll implement the equations in a program; as often as not a few typos in the equations are revealed this way, and I’ll have learnt exactly what the paper was on about. Teaching a student is a fantastic way to learn something; teaching a computer is almost as good.
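As an example of the kind of thing I mean, here is a minimal sketch of that beat-frequency visualisation in present-day Python (the frequencies, duration and plotting choices are mine for illustration, not anything from the original exercise):

# A modern sketch of the beat-frequency exercise; the two frequencies,
# the duration and the plotting choices here are invented.
import numpy as np
import matplotlib.pyplot as plt

f1, f2 = 440.0, 446.0              # two nearby frequencies (Hz)
t = np.linspace(0.0, 1.0, 20000)   # one second, finely sampled

# sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2), so the superposition
# oscillates at the average frequency inside an envelope whose beats
# repeat at the difference frequency |f1 - f2| = 6 Hz
y = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
envelope = 2 * np.cos(np.pi * (f1 - f2) * t)

plt.plot(t, y, linewidth=0.3, label="superposition")
plt.plot(t, envelope, "r", label="beat envelope")
plt.xlabel("time (s)")
plt.ylabel("amplitude")
plt.legend()
plt.show()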

Most of the programming I do is of a workmanlike nature: it drives machines for measurements; it processes data; it analyses results; it computes equations. But there is scope in programming for a deep elegance, a pared-down beauty which is difficult to describe – it’s like finding the answer to a cryptic crossword clue; perhaps for an artist it’s like finding just the right line to give a character personality. It’s an algorithm that does what it has to do with the least effort required. I still program a lot for my work (relatively small stuff that only I will use), and it’s not unknown for me to waste an hour doing something elegantly rather than use the quick, dirty and obvious approach.

Programming is in my genes, in two ways really: my parents were both programmers, from the sixties. We once found a leaflet in our loft advertising the Elliott 503, the computer on which my mum learnt to program: 400 sq ft of ’60s computer with substantially less processing power than the most lowly of today’s devices. Dad started on an early Ferranti of some description in the late ’50s.

Early programming for me pretty much amounted to shouting verbs at things, possibly because I used FORTRAN, which at the time was ALL IN CAPITALS. Programming today feels very different; it’s more like visiting a library to get a book of spells to cast, or the singing of a choir. I still enjoy doing it; in fact I’m writing a Twitter client in C# just to see how to do it.

You might get the impression from all of this that programming is for the mathematically minded, but it isn’t – it’s really for the logically minded. Some mathematical applications require maths, but otherwise it isn’t needed.

I taught the basics of programming to first-year physics students a few years ago, and the thing that really shocked me was that, out of a class of fifty, only one had any real programming experience. There is hope, though: I suspect programming still holds a fascination – my single data point being the father and son sitting down to program the BBC Micro on Electric Dreams.

Wordless Wednesday