Category: Book Reviews

Reviews of books featuring a summary of the book and links to related material

Book review: Mauve by Simon Garfield

Mauve: How one man invented a color that changed the world by Simon Garfield is a biography of William Perkin, who first synthesised the aniline dye mauve in 1856 at the age of 18.

Synthetic dyes were to form the catalyst for the modern chemical industry, an area close to my heart since I worked at Unilever on fluorescent and “shader” dyes for the colouring of laundry and teeth. For my undergraduate degree and PhD I was close to organic synthesis labs but didn’t participate with any enthusiasm (everything gets mixed up and you can poison, burn or explode yourself!).

The book starts with a trip by William Perkin to the United States in 1906, and a series of events to celebrate the fiftieth anniversary of his discovery. It’s very reminiscent of the similar celebrations during a visit by Lord Kelvin at around the same time. By the later years of his life he was lauded in his field, if not so much beyond it.

Chemistry as a subject was relatively unformed in the middle years of the 19th century. Lavoisier, Davy, Dalton and others had laid the foundations of the modern subject in the early years of the century, but it looked nothing like it does today. Chemical formulae were understood, but their structural meaning was still a mystery and certainly not amenable to routine elucidation. There were chemical industries of sorts, such as the manufacture of gunpowder, the preparation of dyes and tanning. Coal gas was made from coal, producing a variety of by-products including coal tar.

Perkin was studying at the Royal College of Chemistry as an assistant to August Hofmann, who was focused on the idea of synthesising quinine from coal tar. Perkin had been encouraged in his scientific studies by Faraday, and Hofmann had personally intervened with his father, who had a career in architecture in mind for him, so that he could study at the Royal College.

There is a superficial similarity in the chemical compositions of aniline, a component of coal tar, and quinine. At the time it seemed plausible to synthesise the one from the other. Quinine was highly valued as an antimalarial drug, but its supply was very limited. In the end quinine was not to be synthesised until 1944, by Robert Woodward. The synthesis of useful analogues of natural compounds continues to be one of the driving forces in synthetic chemistry.

In 1856, whilst trying to make quinine, Perkin synthesised an attractive colour (mauve) that dyed silk. Such a discovery was not entirely novel or unknown: the colouring properties of coal tar derivatives had been observed before. However, Perkin saw commercial potential and approached a Scottish dye manufacturer, Robert Pullar, for advice. At the time dyes such as madder, indigo and cochineal were derived from animal or vegetable matter and were expensive and unpredictable. The natural growth process meant you were never quite sure of the quality of product you were making, or using.

Colouring something is only half the story with dyes; it is also important that the dye sticks to the target and stays there after washing or exposure to light. The techniques and materials for achieving this depend on whether the target is cotton, silk, wool, paper or whatever. With a new class of dyes, new techniques were required. So alongside the colouring material, Perkin also provided technical services to help his customers use the dyes he made.

The business was boosted when mauve became a fashionable colour, worn by Queen Victoria. Perkin grew his factory in Greenford, and ultimately sold it when he was 35 for around £100,000 (roughly £75 million in current value). After this he seems to have focused on further research rather than any other commercial venture. His motivation for selling up seemed to be that German companies had become dominant in the production of dye. It was felt that they had better access to trained technical personnel, and that their companies were more willing to spend money on research (a complaint still heard today). Then, as now, it was argued that the British were good at inventing but not at exploiting.

From dyes the synthetic chemical industries expanded into new areas. In the first instance dyes were useful in themselves for preferentially staining different microscopic structures. It was then discovered that some of them, such as methylene blue, had biological activity. And from the aniline dyes were synthesised the antibiotic sulfa drugs and then other, uncoloured medicines.

The synthetic adventure was to continue with synthetic polymers which, in common with mauve, started as an unpromising black sludge at the bottom of a reaction vessel.

The chemical industry in Britain was resuscitated by World War I. At the onset of war Britain found itself dependent on German companies for dyes for military uniforms and for precursors to explosives. The strategy, repeated across many industries, was for government to take direct control, with the resulting organisations continuing after the war. For the chemical industry this led to the formation of ICI, Imperial Chemical Industries. The manufacture of bulk chemicals has largely moved to China now, and ICI was broken up and sold off between the early nineties and 2010.

Mauve is an enjoyable read but lacks depth.

Book review: Coalition by David Laws

Coalition: The Inside Story of the Conservative-Liberal Democrat Coalition Government by David Laws does exactly what it says on the tin. It is the story of the Coalition of 2010-15 from the point of view of someone at the heart of the action on the Liberal Democrat side. David Laws was a member of the negotiating team which took the party into the Coalition and a regular attendee at meetings of the Quad (where differences between the Coalition parties were thrashed out). Later he was a minister of state in the Department for Education.

Laws finishes the book by answering three questions which I list below and are a useful way of organising this review.

Did the coalition work as a form of government?

The Coalition lasted the full parliamentary term, contrary to what many people expected. Both parties in the Coalition implemented significant chunks of their manifestos, and there didn’t seem to be many great dramas over votes unexpectedly lost. The members of the Coalition seemed to get on OK; there was a dispute resolution system involving the Quad (Nick Clegg, Danny Alexander, David Cameron and George Osborne) and, in extremis, David Cameron and Nick Clegg alone. Laws appears a rather amiable chap and seemed to get along with many of his Coalition opposite numbers, particularly Oliver Letwin, Ken Clarke, George Osborne, even Michael Gove (whom he also found infuriating).

Laws writes quite a lot about his experiences in the Department for Education, and it becomes increasingly clear to me that the stories you see about chaos in government are typically an “inside job”: in this case Gove and his advisor Dominic Cummings briefing against the free school meals Laws championed. You can see Cummings’ hand in the briefings against Cameron now that they are on opposite sides of the EU referendum debate. This follows on from similar internecine struggles during the Blair and Brown years, and you can see it now in Corbyn’s Labour party. Nor is it absent in the Liberal Democrats: Laws highlights that part of the pain of tuition fees for the party lay in the deep division within it. Regardless of what had been achieved, half the party would remain unconvinced, and if the party doesn’t believe then what hope is there of persuading the public? Vince Cable’s frequent, contrary interventions on the economy had a similar effect, as did the polling commissioned by his friend Matthew Oakeshott to undermine Nick Clegg.

The accusation that senior Tories act very directly and explicitly in their own self-interest and that of their major donors is all the more damning coming from someone who clearly has a lot of time for them. Areas like the response to the Leveson Inquiry were muted because of Tory Party enthusiasm for keeping the papers onside. The proposed mansion tax was quashed to keep Tory party donors onside, although its being raised by both Labour and the Liberal Democrats was welcomed. The Tories, particularly George Osborne, repeatedly looked to cut the welfare bill (except for pensioners), largely because they didn’t see claimants as “their people”.

The “English Votes for English Laws” announcement, made on the day of the Scottish referendum victory, very much put a dampener on the result and was made by Cameron for short-term gain.

After the 2015 election we can see that the Liberal Democrats had a substantial restraining influence on the Tories in power: the distributional impact of budget changes is now much more heavily skewed against lower income groups than it was under the Coalition (see here for the 2010-15 figures and here for 2015-19). Legislation like the parliamentary boundary changes and the “Snooper’s Charter”, previously blocked by the Liberal Democrats, is now going ahead.

What were its achievements?

The Liberal Democrat achievements in government have been summarised in Mark Pack’s rather fine infographic and on the aptly named What the Hell Have the LibDems Done? website.

In summary:

  • Increased personal tax allowance to £10,600 from £6,475 in 2010;
  • Pupil premium / free school meals;
  • Pensions triple lock;
  • Overseas aid target;
  • Early years education entitlement;
  • Shared parental leave;
  • Pensions and benefit uprating in line with inflation;
  • Equal marriage;
  • Mental health access standards;

The introduction of equal marriage was a surprise bonus: it was in no one’s manifesto but was pushed through by Liberal Democrat Lynne Featherstone with the support of Theresa May, despite continual opposition from backbench Tories and, surprisingly, initial opposition from Labour and Stonewall.

Constitutional reform was the area where the Liberal Democrats fell down, getting neither electoral reform nor reform of the House of Lords. Neither is an area where the public shows any interest, nor do they have the support of either the Labour or Tory parties, so perhaps failure was inevitable. In contrast to the EU referendum and the Scottish referendum, there appears to be no call for a second referendum on AV.

What could Liberal Democrats have done better?

It was widely understood in the Liberal Democrats that coalition would be electorally damaging, given the experience of other smaller liberal parties in coalition in Europe and elsewhere. I think we gradually took this to heart as we lost councillors, then MEPs and finally all but eight of our MPs, but none of us were really prepared for the final blow. Now, following the first local and Scottish Parliament elections since the end of the Coalition, we are starting to win back seats and grow support.

Much of our loss in votes came almost immediately after we formed a coalition with the Tories, so one thing we could have done differently is not form a coalition at all. I don’t support this idea, David Laws doesn’t support this idea, and he cites a whole load of other Liberal Democrats who don’t support it either. The last five years have been the best time to be a Liberal Democrat at least since I joined the party in about 1990: our policies actually got implemented in government, which is the whole point of being a political party!

Inevitably attention will turn to the tuition fees vote. Laws’ first prescription is not to have made the promise to scrap tuition fees in the 2010 election. His second, to have vetoed the idea, is probably right in retrospect but didn’t happen because we were still trying to work out how to make coalition work and weren’t confident in our actions. As it stands the current tuition fee policy works, in the sense that enrolment in universities, including enrolment from lower income groups, continues to rise. It is a graduate tax in all but name, with the advantage that you don’t avoid it by emigrating and that it can be collected from EU students.

The NHS Bill is another idea the Liberal Democrats should have vetoed, largely in my view because it was unhelpful at a time when the NHS was supposed to be making large efficiency savings. Vetoing it would also have helped the Tories by not damaging their fragile reputation on the NHS. Lansley was sacked as Secretary of State for Health for contaminating the Tory brand over the NHS, to be replaced by Jeremy Hunt(!).

From a more technical point of view, Laws toys with the idea of having gone for more senior Secretary of State positions in the government rather than the more junior ministerial positions that were taken; Danny Alexander and Nick Clegg both held quite senior positions but were someone else’s deputies. Our strength in Cabinet was proportional to our share of seats rather than our share of votes. Other Liberal Democrats, such as Vince Cable, held top positions but in less important departments.

The style of the book is crisp; it rattles through around 50 short chapters. The quoted dialogue sounds incredibly wooden, though: I recommend not buying any fiction Laws might write! If you’re interested in politics then I thoroughly recommend this book; if nothing else it gives a clear insight into how coalition government can work in the UK. For Liberal Democrats it is an essential record of what we achieved in government. Whilst there may be more detached, historical accounts in the future, there is unlikely to be one better from the core of the action.

Book review: An Introduction to Geographical Information Systems by Ian Heywood et al

I’ve been doing quite a lot of work with Geographical Information Systems recently, so I thought I should get some background understanding to avoid repeating the mistakes of others. I turned to An Introduction to Geographical Information Systems by Ian Heywood, Sarah Cornelius and Steve Carver, now in its fourth edition.

This is an undergraduate text; the number of editions suggests it is a good one. The first edition of Introduction was published in 1998 and this shows in the content: much of the material is rooted in that time, with excursions into more recent matters. There is mention of CRT displays and Personal Digital Assistants (PDAs). This edition was published in 2011, and obviously quite a lot of new material has been added since the first edition, but the original clearly forms the core of the book.

I quite deliberately chose a book that didn’t mention the latest shiny technologies I am currently working with (QGIS, OpenLayers 3, spatial extensions in MariaDB), since that sort of stuff ages fast and the best, i.e. most up-to-date, information is on the web.

GIS allows you to store spatially referenced data, build maps from layers of different content, and combine the spatial data with attributes stored in databases.
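As a toy illustration of that idea (my own sketch, not an example from the book), a vector layer can be as simple as a set of polygon boundaries, with attributes held in a table keyed by region name; a point-in-polygon test then ties a location to its attributes:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the point (x, y) inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from (x, y) cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Spatial layer: region boundaries. Attribute table: data keyed by region.
regions = {
    "north": [(0, 0), (10, 0), (10, 10), (0, 10)],
    "south": [(0, -10), (10, -10), (10, 0), (0, 0)],
}
population = {"north": 52000, "south": 31000}

def attribute_at(x, y):
    """Join the spatial layer to the attribute table at a location."""
    for name, boundary in regions.items():
        if point_in_polygon(x, y, boundary):
            return name, population[name]
    return None

print(attribute_at(5, 5))   # ('north', 52000)
```

Real GIS software does essentially this, with spatial indexes to avoid testing every polygon.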

Early users were local and national governments and their agencies, which must manage large amounts of land. These were followed by utility companies, with geographically distributed infrastructure to manage. More recently retail companies have become interested in GIS as a way of optimising store location and marketing. The application of GIS is frequently in the area of “decision support”, along the lines of “where should I site my…?”, although “how should I get around these locations?” is also a frequent question. With GPS for route finding, arguably all of us carry around a GIS, and they are certainly important to logistics companies.

From the later stages of the book we learn how Geographic Information Systems were born in the mid-to-late 1960s, became of increasing academic interest through the 1970s, started to see wider uptake in the eighties and became a commodity in the nineties. With the advent of Google Maps and navigation apps on mobile phones, GIS is now ubiquitous.

I find it striking that the Douglas-Peucker algorithm for line simplification, born in the early seventies, has only recently been implemented in my favoured spatially enabled database (MariaDB/MySQL). These spatial extensions to SQL appear to have grown out of a 1999 standard from the OGC (Open Geospatial Consortium). Looking at who has implemented the standard is a good way of getting an overview of the GIS market.
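The algorithm itself is short enough to sketch. This is my own minimal Python rendering, not the database implementation: keep the endpoints of the line, find the intermediate point furthest from the straight line joining them, recurse on the two halves if that point deviates by more than a tolerance, and otherwise drop all the intermediate points:

```python
import math

def perpendicular_distance(pt, a, b):
    """Distance from pt to the line through points a and b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Simplify a polyline, keeping only points that deviate from the
    simplified line by more than `tolerance`."""
    if len(points) < 3:
        return list(points)
    # Find the intermediate point furthest from the endpoint-to-endpoint line.
    index, max_dist = 0, 0.0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > max_dist:
            index, max_dist = i, d
    if max_dist > tolerance:
        # Significant deviation: split at the furthest point and recurse.
        left = douglas_peucker(points[: index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right  # avoid duplicating the split point
    # All intermediate points are within tolerance: drop them.
    return [points[0], points[-1]]

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(line, 1.0))
```

MariaDB exposes the same idea through its spatial functions, operating on geometry values rather than Python lists.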

The book is UK-centric but not overwhelmingly so, we learn about the Ordnance Survey mapping products and the UK postcode system, and the example of finding a site for a nuclear waste repository in the UK is a recurring theme.

The issues in GIS have not really changed a great deal: projections and coordinate transforms are still important, and a source of angst (I have experienced this angst personally!). We still see digitisation and other data quality issues, although perhaps the source is no longer the process of manual digitisation from paper but inconsistency in labelling and GPS errors.

One of the challenges not discussed in Introduction is the licensing of geographic data. This has recently been in the news with the British government spending £5 million to rebuild an open address database for the UK, having sold off the existing one with the Royal Mail in 2013 (and £5 million is likely just the start). UN-OCHA faces similar issues in coordinating aid in disaster areas: the UK is fairly open in making details of its administrative boundaries available electronically, but this is not the case globally.

I have made some use of conventional GIS software in the form of QGIS, which, although powerful, flexible and capable, I find slow and ugly. It is really handy for a quick look-see at data in common geospatial formats. For more in-depth analysis and visualisation I use a combination of spatial extensions in SQL, Python and browser technology.

I found the case studies the most useful part of this book; they are from a wide range of authors and describe real-life examples of the ideas discussed in the main text. The main text uses the hypothetical ski resort of Happy Valley as a long-running example. As befits a proper undergraduate introduction, there are lots of references for further reading.

Despite its sometimes dated feel, An Introduction to Geographical Information Systems does exactly what it says on the tin.

Book review: A New History of Life by Peter D. Ward and Joe Kirschvink


This next book is a Christmas present, A New History of Life by Peter D. Ward and Joe Kirschvink.

The theme of the book is the evolution of life from its very earliest periods on earth, with a particular emphasis on what has been discovered over the last 20 or so years; they cite Life: A Natural History of the First Four Billion Years of Life on Earth by Richard Fortey as the last comparable work.

A New History follows a long thread of books I’ve read, some I’ve reviewed here such as Neil Shubin’s Your Inner Fish, others in the prehistory of my blogging such as Stephen Jay Gould’s Wonderful Life: the Burgess Shale and the Nature of History. I’ve also written about First Life, a program narrated by David Attenborough on the earliest life.

It turns out quite a lot has happened in the last 20 or so years: radiometric dating has improved in sensitivity, allowing us to probe the early years of life on earth in more detail; new fossil fields have opened up in China; the chemistry of the early earth is better understood; early “Snowball Earth” episodes have been identified; and the discovery of exoplanets has led to more interest in the very earliest stages of life. Indeed Tiktaalik, at the core of Shubin’s Your Inner Fish and one of the first vertebrates to walk on land, was only discovered this century.

I always feel a little cautious approaching popular books such as this, promising great and revolutionary things; the risk is that they present a minority view unsupported by other experts in the field, and as an outsider you would be completely unaware of this. Thankfully this isn’t the case here: Ward and Kirschvink are experts in early life, but they are explicit where their own theories come into play and present alternative viewpoints in a fairly balanced way.

Half of the book covers the earliest life on earth, up to the Cambrian Explosion of 600-500 million years ago when a huge diversity of life suddenly appeared. It is here that the greatest new research activity has taken place. This includes work on ancient atmospheric composition (the relative amounts of carbon dioxide and oxygen); the “Snowball Earth” periods of complete glaciation, where liquid water persisted at the surface only due to volcanic activity; the chemistry of early life; and the precursors to the Cambrian Explosion, such as the Ediacaran fauna and other life such as Grypania and the acritarchs, of which I had not heard. Vernanimalcula, a microscopic fossil found in rocks 600 million years old, is also mentioned as the first bilateral animal, although I see from Wikipedia that this attribution is disputed.

The origins of life have the air of the physicist’s dark matter: there must be something there, but we have little direct evidence for what it is, so it is ripe for a wide range of hypotheses. The big problem is the formation of RNA and DNA. Experiments have long shown that the basic building blocks of life can form under plausible early conditions, stimulated by heat and lightning, but DNA and RNA are large, complex molecules, and not particularly heat stable. One of the authors (Kirschvink) is keen on a Martian genesis for these molecules, with subsequent transport to earth by asteroid. I’ve always found these extra-terrestrial origin proposals unlikely.

A New History highlights the dispute between Stephen Jay Gould and Simon Conway Morris over the Burgess Shale. Gould was a proponent of the idea that the Burgess Shale assemblage represented a massive diversification of forms, many of which are now extinct, whilst Conway Morris sees the forms as precursors to modern ones.

Following on from the earliest times, the rest of the book is a story of successive mass extinctions followed by diversifications. Aside from the K-T extinction 65 million or so years ago, caused largely by a massive asteroid impact, extinction events were caused by changes in atmospheric chemistry, typically high levels of carbon dioxide leading to global warming and lower levels of oxygen. These changes in atmospheric chemistry were driven by large-scale geology and by life itself. Other than microbes, life struggles to survive when oxygen levels are much below the 21 per cent we currently enjoy; conversely, when oxygen levels are high, large animals can evolve. The dinosaurs prevailed because they could survive at relatively low oxygen levels, and then became giants when oxygen levels rose above current levels.

I was amused to discover that reptiles and amphibians can’t run and breathe effectively at the same time, their splayed gait compresses the ribcage inconveniently. Creatures such as the dinosaurs resolved this problem by moving the legs beneath the body, our bipedal stance is even better since breathing and running can work entirely independently.

The book starts a little bombastically, with comments on how boring history is perceived to be and how new and revolutionary this book is, but once it settles into its stride it’s rather readable.

Book review: Artificial Intelligence for Humans: Volume 3 Deep Learning and Neural Networks by Jeff Heaton

Deep learning and neural networks are receiving more attention these days; you may have seen the nightmarish images generated using this technology by Google Research. I picked up Artificial Intelligence for Humans: Volume 3 Deep Learning and Neural Networks by Jeff Heaton to find out more, since the topic fits in with my interests in data science and machine learning. There doesn’t seem to be much in the way of accessible, book-length treatment of this relatively new topic; most other offerings on Amazon have publication dates in the future.

It turns out that Artificial Intelligence for Humans is the result of a Kickstarter campaign; so far the author has funded three volumes on artificial intelligence by this route, two of them for around $18,000 and one for around $10,000. I paid £16 for the physical book, which seems a reasonable price. It is a pretty well polished product: it doesn’t quite reach the editing and production levels of a publisher like O’Reilly, but it is at least as good as other technical publishers. The accompanying code examples and website are really nicely done.

Neural networks have been around for a long time, since the 1940s, and see periodic outbreaks of interest and enthusiasm. They are modelled loosely on the workings of biological brains, with “neurons” connected together by linkages of different weights which can be trained to perform tasks such as image recognition, classification and regression. The “neurons” are grouped into layers: an input layer where data enters, feeding into potentially multiple successive “hidden” layers, and finally an output layer of neurons where results are read off. The output of a neuron is calculated by summing its inputs multiplied by the weights of those inputs and feeding the result through an “activation function”. The training process optimises the weights, and may also evolve the structure of the network.
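That forward-pass calculation is small enough to sketch in a few lines of plain Python. This is a toy of my own with made-up weights, not code from the book:

```python
import math

def sigmoid(z):
    """A common activation function: squashes any input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(inputs, weights, bias):
    """Sum the weighted inputs, add a bias, apply the activation."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

def layer_output(inputs, weight_matrix, biases):
    """A layer is just several neurons sharing the same inputs."""
    return [neuron_output(inputs, w, b)
            for w, b in zip(weight_matrix, biases)]

# Two inputs feeding a hidden layer of two neurons, then one output neuron.
hidden = layer_output([0.5, -1.0],
                      [[0.8, 0.2], [-0.4, 0.9]],
                      [0.1, 0.0])
output = neuron_output(hidden, [1.5, -2.0], 0.3)
print(output)
```

Training consists of adjusting the weights and biases so that the outputs approach known targets, which is where the optimisation algorithms discussed below come in.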

I remember playing with neural networks in the 1980s, typing a programme from a magazine into my Amstrad CPC464 which recognised handwritten digits; funnily enough this is still the go-to demonstration of neural networks! In the past neural networks did not gain traction because of the computational demands of training. This problem appears to have been solved with new algorithms and GPU-based computation. A second innovation is the introduction of techniques to evolve the structure of neural networks, to do “deep learning”.

Much of what is presented is familiar to me from my reading on machine learning (supervised and unsupervised learning, regression and classification), image analysis (convolution filters) and old-fashioned optimisation (stochastic gradient descent, Levenberg-Marquardt, genetic algorithms and simulated annealing). It does lead me to wonder sometimes whether there is nothing new under the sun, and whether many of these techniques are simply different fields of investigation re-casting the same methods in their own language. For example, the LeNet-5 networks used in image analysis contain convolution layers which act exactly like convolution filters in conventional image analysis, while the max pool layers have the effect of downscaling the image. One would anticipate the combination of the two giving the same effect as multi-scale image processing techniques.
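The correspondence is easy to see in plain Python. The sketch below is my own illustration, not the LeNet-5 code; note that convolution layers in networks apply the kernel un-flipped, so strictly they compute cross-correlation:

```python
def convolve2d(image, kernel):
    """'Valid' 2-D convolution over a list-of-lists image, with the
    kernel applied un-flipped as in CNN convolution layers."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def max_pool(image, size=2):
    """Downscale by taking the maximum over each size x size block,
    as a max pool layer does to its feature maps."""
    return [[max(image[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(image[0]), size)]
            for i in range(0, len(image), size)]

feature_map = [[1, 3, 2, 0],
               [4, 2, 1, 1],
               [0, 1, 5, 6],
               [2, 1, 0, 2]]
edge_kernel = [[1, -1]]   # a simple horizontal edge detector
print(convolve2d(feature_map, edge_kernel))
print(max_pool(feature_map))   # [[4, 2], [2, 6]]
```

A network learns its kernel values during training, whereas classical image analysis chooses them by hand; the arithmetic is identical.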

The book provides a good summary of the fundamentals of neural networks, how they are built and trained and what the different variants are called, and then goes on to talk in more detail about the new material in deep learning. It turns out the label “deep” is applied to neural networks with more than two layers, which isn’t a particularly high bar. It isn’t clear whether this means two layers including the input and output layers, or two hidden layers; I suspect the latter. These “deep” networks are typically generated automatically.

As the author highlights, with the proliferation of easy-to-use machine learning and neural network libraries the problem is no longer the core algorithm but the selection of the right model for your particular problem, and the optimisation of the learning and evaluation strategy. As a Pythonista it looks like the way to go is the nolearn and Lasagne libraries. A measure of this book is that when I go to look at the documentation for these projects, the titles at least make sense.

The author finishes off with a description of his experience of doing a Kaggle challenge. I’ve done this; it’s a great way of getting some experience of machine learning techniques on nearly-real problems. I thought the coverage was a bit brief, but it highlighted how neural networks are used in combination with other techniques.

This isn’t an in-depth book, but it introduces all the useful vocabulary and the appropriate libraries to start work in this area. As a result I’m off to try t-SNE on a problem I’m working on, and then maybe some analysis using the Lasagne library.