Category: Book Reviews

Reviews of books featuring a summary of the book and links to related material

Book review: An Introduction to Geographical Information Systems by Ian Heywood et al

I’ve been doing quite a lot of work around Geographical Information Systems recently, so I thought I should get some background understanding to avoid repeating the mistakes of others. I turned to An Introduction to Geographical Information Systems by Ian Heywood, Sarah Cornelius and Steve Carver, now in its fourth edition.

This is an undergraduate text, and the number of editions suggests it is a good one. The first edition was published in 1998 and this shows in the content: much of the material is rooted in that time, with excursions into more recent matters. There is mention of CRT displays and Personal Digital Assistants (PDAs). This edition was published in 2011 and quite a lot of new material has obviously been added since the first edition, but the original clearly forms the core of the book.

I quite deliberately chose a book that didn’t mention the latest shiny technologies I am currently working with (QGIS, OpenLayers 3, spatial extensions in MariaDB), since that sort of material ages fast and the best, i.e. most up-to-date, information is on the web.

GIS allows you to store spatially related data with the ability to build maps using layers of different content and combine this spatial data with attributes stored in databases.
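A query of this kind – which features in one layer fall inside a region in another, with results joined back to their attributes – can be sketched in a few lines of Python. This is my own toy illustration, not something from the book; the district polygon and shop data are made up.

```python
# Toy sketch of a GIS primitive: a point-in-polygon test (ray casting),
# used here to combine a geometry layer with attribute data.
def point_in_polygon(point, polygon):
    """Ray-casting test: does the point fall inside the polygon?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending from the point
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

district = [(0, 0), (4, 0), (4, 4), (0, 4)]                          # geometry layer
shops = [{"name": "A", "xy": (1, 1)}, {"name": "B", "xy": (5, 2)}]   # attribute data
print([s["name"] for s in shops if point_in_polygon(s["xy"], district)])  # → ['A']
```

A real GIS does the same join with spatial indexes over millions of features, but the underlying question is this one.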

Early users were governments, both local and national, and their agencies, who must manage large amounts of land. These were followed by utility companies, who had geographically distributed infrastructure to manage. More recently retail companies have become interested in GIS as a way of optimising store location and marketing. The application of GIS is frequently in the area of “decision support”, along the lines of “where should I site my…?”, although “how should I get around these locations?” is also a frequent question. With GPS for route finding, arguably all of us carry around a GIS, and they are certainly important to logistics companies.

From the later stages of the book we learn how Geographic Information Systems were born in the mid-to-late 1960s, became of increasing academic interest through the 1970s, started to see wider uptake in the eighties and became a commodity in the nineties. With the advent of Google Maps and navigation apps on mobile phones, GIS is now ubiquitous.

I find it striking that the Douglas-Peucker algorithm for line simplification, born in the early seventies, has only recently been implemented in my favoured spatially enabled database (MariaDB/MySQL). These spatial extensions to SQL appear to have grown out of a 1999 standard from the OGC (Open Geospatial Consortium). Looking at who has implemented the standards is a good way of getting an overview of the GIS market.
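For the curious, Douglas-Peucker is short enough to sketch in pure Python. This is my own illustrative implementation, not the database’s; spatially enabled databases typically expose the same operation as an ST_Simplify-style function.

```python
# Douglas-Peucker line simplification: keep only points that deviate
# from the straight line between the current segment's endpoints by
# more than a tolerance.
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0:
        return math.hypot(px - ax, py - ay)
    # Twice the area of triangle (a, b, p) divided by the base length |ab|
    return abs(dx * (ay - py) - dy * (ax - px)) / length

def douglas_peucker(points, tolerance):
    """Simplify a polyline to within `tolerance` of the original."""
    if len(points) < 3:
        return list(points)
    # Find the point furthest from the chord between the endpoints
    index, max_dist = 0, 0.0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > max_dist:
            index, max_dist = i, d
    if max_dist > tolerance:
        # Keep the furthest point and recurse on both halves
        left = douglas_peucker(points[: index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right
    # All intermediate points lie within tolerance: drop them
    return [points[0], points[-1]]

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(line, 1.0))  # → [(0, 0), (2, -0.1), (3, 5), (7, 9)]
```

The wobble around the first three points and the near-straight run to the end are each collapsed, while the sharp corner at (3, 5) survives.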

The book is UK-centric but not overwhelmingly so; we learn about the Ordnance Survey mapping products and the UK postcode system, and the example of finding a site for a nuclear waste repository in the UK is a recurring theme.

Issues in GIS have not really changed a great deal: projection and coordinate transforms are still important, and a source of angst (I have experienced this angst personally!). We still see data quality issues in digitised data, although perhaps the source is no longer manual digitisation from paper but inconsistency in labelling and GPS errors.

One of the challenges not discussed in Introduction is the licensing of geographic data. This has recently been in the news, with the British government spending £5 million to rebuild an open address database for the UK, having sold off the existing one with the Royal Mail in 2013 (£5 million is likely just the start). UN-OCHA faces similar issues in coordinating aid in disaster areas; the UK is fairly open in making details of its administrative boundaries available electronically, but this is not the case globally.

I have made some use of conventional GIS software in the form of QGIS, which, although powerful, flexible and capable, I find slow and ugly. I find it really handy for a quick look-see at data in common geospatial formats. For more in-depth analysis and visualisation I use a combination of spatial extensions in SQL, Python and browser technology.

I found the case studies the most useful part of this book; these are from a wide range of authors and describe real-life examples of the ideas discussed in the main text. The main text uses the hypothetical ski resort of Happy Valley as a long-running example. As befits a proper undergraduate introduction, there are lots of references to further reading.

Despite its sometimes dated feel, An Introduction to Geographical Information Systems does exactly what it says on the tin.

Book review: A New History of Life by Peter D. Ward and Joe Kirschvink

A new history of life

This next book is a Christmas present, A New History of Life by Peter D. Ward and Joe Kirschvink.

The theme of the book is the evolution of life from its very earliest periods on earth, with a particular emphasis on what has been discovered over the last 20 or so years; the authors cite Life: A Natural History of the First Four Billion Years of Life on Earth by Richard Fortey as the last comparable work.

A New History follows a long thread of books I’ve read, some I’ve reviewed here such as Neil Shubin’s Your Inner Fish, others in the prehistory of my blogging such as Stephen Jay Gould’s Wonderful Life: the Burgess Shale and the Nature of History. I’ve also written about First Life, a programme narrated by David Attenborough on the earliest life.

It turns out quite a lot has happened in the last 20 or so years: radiometric dating has improved in sensitivity, allowing us to probe the early years of life on earth in more detail; new fossil fields have opened up in China; the chemistry of the early earth is better understood; early “Snowball Earth” episodes have been identified; and the discovery of exoplanets has led to more interest in the very earliest stages of life. Indeed Tiktaalik, one of the first vertebrates to walk on land and the centrepiece of Shubin’s Your Inner Fish, was only discovered this century.

I always feel a little cautious approaching popular books such as this, promising great and revolutionary things; the risk is that they present a minority view unsupported by other experts in the field, and as an outsider you would be completely unaware of this. Thankfully, this isn’t the case here. Ward and Kirschvink are experts in early life, but they are explicit where their own theories come into play and present alternative viewpoints in a fairly balanced way.

Half of the book covers the earliest life on earth up to the Cambrian Explosion 600-500 million years ago, when a huge diversity of life suddenly appeared, and it is here that the greatest new research activity seems to have taken place. This includes work on ancient atmospheric composition: the relative amounts of carbon dioxide and oxygen; the “Snowball Earth” periods of complete glaciation, where liquid water existed at the surface only due to volcanic activity; the chemistry of early life; and the precursors to the Cambrian Explosion such as the Ediacaran fauna and other life such as Grypania and acritarchs, of which I had not heard. Vernanimalcula, a microscopic fossil found in rocks 600 million years old, is also mentioned as the first bilateral animal, although I see from Wikipedia that this attribution is disputed.

The origins of life have the air of the physicist’s dark matter: there must be something there, but we have little direct evidence for what it is, and so it is ripe for a wide range of hypotheses. The big problem is the formation of RNA and DNA. Experiments have long shown that the basic building blocks of life can form in plausible early conditions, stimulated by heat and lightning, but DNA and RNA are large, complex molecules, and not particularly heat stable. One of the authors (Kirschvink) is keen on a Martian genesis for these molecules, subsequently transported to earth by asteroid. I’ve always found these extraterrestrial origin proposals unlikely.

A New History highlights the dispute between Stephen Jay Gould and Simon Conway Morris over the Burgess Shale. Gould was a proponent of the idea that the Burgess Shale assemblage represented a massive diversification of forms, many of which are now extinct, whilst Conway Morris sees the forms as precursors to modern ones.

Following on from the earliest times, the rest of the book is a story of successive mass extinctions followed by diversifications. Aside from the K-T extinction 65 million or so years ago, caused largely by a massive asteroid impact, extinction events were caused by changes in atmospheric chemistry, typically high levels of carbon dioxide leading to global warming, combined with lower levels of oxygen. These changes in atmospheric chemistry were driven by large-scale geology and by life itself. Other than microbes, life struggles to survive when oxygen levels are much below the 21 percent we currently enjoy; conversely, when oxygen levels are high, large animals can evolve. The dinosaurs prevailed because they could survive at relatively low oxygen levels, and then became giants when oxygen levels rose above current levels.

I was amused to discover that reptiles and amphibians can’t run and breathe effectively at the same time, their splayed gait compresses the ribcage inconveniently. Creatures such as the dinosaurs resolved this problem by moving the legs beneath the body, our bipedal stance is even better since breathing and running can work entirely independently.

The book starts a little bombastically, with comments on how boring history is perceived to be and how new and revolutionary this book is, but once it settles into its stride it’s rather readable.

Book review: Artificial Intelligence for Humans: Volume 3 Deep Learning and Neural Networks by Jeff Heaton

Deep learning and neural networks are receiving more attention these days; you may have seen the nightmarish images generated using this technology by Google Research. I picked up Artificial Intelligence for Humans: Volume 3 Deep Learning and Neural Networks by Jeff Heaton to find out more, since the topic fits in with my interests in data science and machine learning. There doesn’t seem to be much in the way of accessible, book-length treatments of this relatively new topic; most other offerings on Amazon have publication dates in the future.

It turns out that Artificial Intelligence for Humans is the result of a Kickstarter campaign; so far the author has funded three volumes on artificial intelligence by this route, two of them for around $18,000 and one for around $10,000. I paid £16 for the physical book, which seems like a reasonable price. I think it is a pretty well-polished product; it doesn’t quite reach the editing and production levels of a publisher like O’Reilly, but it is at least as good as other technical publishers. The accompanying code examples and website are really nicely done.

Neural networks have been around for a long time, since the 1940s, and see periodic outbreaks of interest and enthusiasm. They are modelled, loosely, on the workings of biological brains, with “neurons” connected together by linkages of different weights which can be trained to perform tasks such as image recognition, classification and regression. The “neurons” are grouped into layers, with an input layer where data enters feeding into potentially multiple successive “hidden” layers, finally leading to an output layer of neurons where results are read off. The output of a neuron is calculated by summing its inputs multiplied by the weights of those inputs and feeding the result through an “activation function”. The training process is used to optimise the weights, and may also evolve the structure of the network.
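That forward pass can be sketched in a few lines of Python. This is my own minimal illustration, assuming a sigmoid activation function; the layer sizes, weights and biases are made up.

```python
# Forward pass of a tiny neural network: each neuron sums its weighted
# inputs plus a bias and feeds the result through an activation function
# (a sigmoid here).
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_output(inputs, weights, biases):
    """One layer's outputs: weighted sum per neuron, then activation."""
    return [
        sigmoid(sum(w * x for w, x in zip(neuron_weights, inputs)) + bias)
        for neuron_weights, bias in zip(weights, biases)
    ]

# A 2-input, 2-hidden-neuron, 1-output network with illustrative weights
hidden = layer_output([0.5, -1.0],
                      weights=[[0.8, 0.2], [-0.4, 0.9]],
                      biases=[0.1, -0.1])
output = layer_output(hidden, weights=[[1.5, -1.0]], biases=[0.0])
print(output)  # → a single value between 0 and 1, about 0.65 here
```

Training then amounts to nudging those weights and biases to reduce the error between the output and the desired answer.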

I remember playing with neural networks in the 1980s, typing a programme from a magazine into my Amstrad CPC464 which recognised handwritten digits; funnily enough this is still the go-to demonstration of neural networks! In the past neural networks did not gain traction because of the computational demands of training. This problem appears to have been solved with new algorithms and GPU-based computation. A second innovation is the introduction of techniques to evolve the structure of neural networks to do “deep learning”.

Much of what is presented is familiar to me from my reading on machine learning (supervised and unsupervised learning, regression and classification), image analysis (convolution filters), and old-fashioned optimisation (stochastic gradient descent, Levenberg-Marquardt, genetic algorithms and simulated annealing). It does lead me to wonder sometimes whether there is nothing new under the sun and many of these techniques are simply different fields of investigation re-casting the same methods in their own language. For example, the LeNet-5 networks used in image analysis contain convolution layers which act exactly like convolution filters in conventional image analysis, while the max-pool layers have the effect of downscaling the image. One would anticipate the combination of these to give the same effect as multi-scale image processing techniques.
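To illustrate the parallel (my own toy example, not the book’s), here is a convolution and a max-pool written directly in Python; the “image” and the kernel are made up:

```python
# A convolution layer applies a small kernel across an image exactly as
# a conventional convolution filter does; a max-pool layer downscales by
# keeping the largest response in each block.
def convolve2d(image, kernel):
    """Valid-mode 2D convolution (strictly cross-correlation, as in CNNs)."""
    kh, kw = len(kernel), len(kernel[0])
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(len(image[0]) - kw + 1)
        ]
        for i in range(len(image) - kh + 1)
    ]

def max_pool(image, size=2):
    """Downscale by taking the maximum over non-overlapping size x size blocks."""
    return [
        [
            max(image[i + di][j + dj] for di in range(size) for dj in range(size))
            for j in range(0, len(image[0]) - size + 1, size)
        ]
        for i in range(0, len(image) - size + 1, size)
    ]

# A 4x4 "image" with a vertical edge down the middle
img = [[0, 0, 1, 1]] * 4
edge_kernel = [[-1, 1], [-1, 1]]   # simple horizontal-difference kernel
edges = convolve2d(img, edge_kernel)
print(edges)            # → [[0, 2, 0], [0, 2, 0], [0, 2, 0]]
print(max_pool(edges))  # → [[2]]
```

The convolution responds exactly where the edge sits, and the pooling stage reduces the response map to a coarser scale – the multi-scale behaviour mentioned above.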

The book provides a good summary of the fundamentals of neural networks, how they are built and trained and what the different variants are called, and then goes on to talk in more detail about the new material in deep learning. It turns out the label “deep” is applied to neural networks with more than two layers, which isn’t a particularly high bar. It isn’t clear whether this means two layers including the input and output layers or two layers of hidden neurons; I suspect it is the latter. These “deep” networks are typically generated automatically.

As the author highlights, with the proliferation of easy-to-use machine learning and neural network libraries the problem is no longer the core algorithm; rather it is the selection of the right model for your particular problem and the optimisation of the learning and evaluation strategy. As a Pythonista it looks like the way to go is to use the nolearn and Lasagne libraries. A measure of this book is that when I go to look at the documentation for these projects, the titles at least make sense.

The author finishes off with a description of his experience of doing a Kaggle challenge. I’ve done this; it’s a great way of getting some experience of machine learning techniques on nearly-real problems. I thought the coverage was a bit brief, but it highlighted how neural networks are used in combination with other techniques.

This isn’t an in-depth book, but it introduces all the useful vocabulary and the appropriate libraries to start work in this area. As a result I’m off to try t-SNE on a problem I’m working on, and then maybe some analysis using the Lasagne library.

Book Review: The Honourable Company by John Keay

I’ve been reading a lot of books about naturalists who have been on great expeditions: Alexander von Humboldt, Charles Darwin, Joseph Banks and the like. This book, The Honourable Company by John Keay, is a bit of a diversion into great expeditions for commercial purposes. Such expeditions form the context, and the “infrastructure”, in which scientific expeditions take place. The book is a history of the English East India Company, founded in the early 17th century with a charter from the English sovereign to conduct trade in the Far East (China, Japan, Java) and India.

It is somewhat chastening to realise the merchants had been exploring the world for one hundred years (and the Spanish and Portuguese for nearer 200 years) before the scientific missions really got going in the 18th century.

The book is divided into four parts, each covering a period of between 40 and 80 years; within each part there is a further subdivision into geographical areas: the East India Company had interests at one time or another from Java, Japan and China in the Far East, to Calcutta, Bombay and Surat in India, to Mocha in the Middle East.

The East India Company was chartered in 1600, following the pattern of the (slightly) earlier Muscovy and Levant Companies, which sought a North East passage to the Far East and trade with Turkey respectively. At the time the Spanish and Portuguese dominated long-distance trade routes. The Dutch East India Company was formed shortly after the English one, and would go on to be rather more successful. The Company offered investors the opportunity to combine together to fund a ship on a commercial journey, and the British Crown gave the Company exclusive rights to arrange such trade expeditions.

Initially the aim was to bring back lucrative spices from the Far East; in practice the trade shifted first to India and, in its later years, to China and the import of tea. The Dutch were more militarily assertive in the Far East, where spices like nutmeg and pepper were sourced.

Once again I’m struck by the amount of death involved in long-distance expeditions. It seems western Europeans had been projecting themselves across the oceans with 50% mortality rates from sometime in the early 16th century to the end of the 18th century. For the East India Company, many of the factors – the local representatives – were to die in their placements of tropical diseases.

In the early years investors bought into individual expeditions, with successive expeditions effectively competing with each other for trade; this was unproductive and subsequently investment was in the Company as a whole. It is worth noting, though, that even in the later years of the Company in India the different outposts in Madras, Bombay and so forth were not averse to acting independently, and even in opposition to each other’s will, if not interests. Alongside the Company’s official trade the employees engaged in a great deal of unofficial activity for their own profit, known as the “country trade”.

The East India Company’s activities in India led to the British colonisation of the country. For a long time the Company made a fairly good effort at not being an invading force, basically seeing invasion as bad for trade. This changed during the first half of the 18th century, when the Company became increasingly drawn into military action and political intrigue, either with local leaders against third parties or in proxy battles with other European powers with which the home country was at war. Ultimately this led to the decline of the Company, since the British Government saw it acting increasingly as a colonial power and regarded this as its own purview. This was enacted in law through the Regulating Act of 1773 and the East India Company Act of 1784, which introduced a Board of Control overseeing the Company’s activities in India.

Keay is very much focussed on the activities of the Company, the records it kept and previous histories, so it is a little difficult to discern what the locals would have made of the Company. He comments that there has been a tendency to draw a continuous thread from the early trading activities of the Company to British India in the mid-19th century and onwards but seems to feel these links are over-emphasised.

India is the main focus of the book, despite the importance of China, tea and the opium trade in the later years, which are covered only briefly in the last few pages. I must admit I found the array of places and characters a bit overwhelming at times, not helped by my slightly vague sense of Indian geography. It’s certainly a fascinating subject, and it was nice to step outside my normal reading.

Book review: Pro Git by Scott Chacon and Ben Straub

Pro Git by Scott Chacon and Ben Straub is available to download for free, or to read online at the website, but you can buy a paper copy if you prefer. I downloaded and read it on my tablet. Pro Git is the bible of all things relating to git, the distributed version control system: an application to record the history of changes to your computer code, or any other plain text file. Such applications are essential if you are a software company producing code commercially, or if you are collaborating on an open source project. They are also useful if, like me, you use code in analysis or modelling.

Git is most famous as the creation of Linus Torvalds in support of the development of the Linux operating system. For developers version control is a fundamental activity which crosses all boundaries of domain and language. Git is one of the more recent examples in a line of version control systems, my former colleague Francis Irving wrote very nicely about this subject.

My adventures with source control extend over 20 years although it is fair to say that I didn’t really use them in anger until I worked at ScraperWiki. There my usage moved from being a safety line for work that only really impacted me, to a collaborative tool. I picked up my usage of git through pairing with other people, and through explicitly stated conventions for using git in a developing team. Essentially one of the other developers told us off if he thought our commit messages were not up to scratch! This is a good thing. This culturally determined use of git is important in collaborative environments.

My interest in git has recently been re-awoken for a couple of reasons: my new job means I’m doing a lot of coding, and I discovered GitKraken which is a blingy new git client. I’ve not used a graphical git client before but GitKraken is very pretty and the GUI invites you to discover more git functionality. Sadly it doesn’t work on my work PC, and once it leaves beta I have no idea what the price might be.

Pro Git starts with an introduction to git and the basics of getting up and running. It then goes on to describe how to use git in collaborative environments, how to use git with GitHub, and then more advanced topics such as how to write hooks and how to use git as a client to Subversion (an earlier source control system). Coverage feels pretty complete to me; it’s true that you might resort to Stack Overflow to answer some questions, but that’s universally true in coding.

The book finishes with a chapter on git internals: what is going on under the hood as you issue commands. Git has a famous division between “porcelain” and “plumbing” commands. Plumbing is what really gets things done – low-level commands with somewhat opaque meaning – whilst porcelain is the stuff you use day to day. The internals chapter starts by showing how the plumbing works by reproducing the effects of some of the porcelain commands. This is surprisingly informative, and built my confidence a bit – I always have some worry that I will lose something irrevocably by issuing the wrong command in git. These dangers exist, but Pro Git is clear about where they lie.

Here are a couple of things I’ve already started using on reading this book:

git log --since=1.week

– filter the log to just show the commits made in the last week; other time options are available. Invaluable for weekly reporting!

git describe

– make a human readable (sort of) build number based on the most recent tag and how far you are along from it.

And there are some things I used to wonder about. First of all, commits should be thought of as a tree structure, with branches as pointers to particular commits. In this context HEAD^ refers to the parent commit of the current HEAD, or latest commit, HEAD~2 refers to the grandparent, and so on. I now have some appreciation of soft, mixed and hard resets. Hard is bad: it can lose your work!

I now know why git filter-branch was spoken of in hushed tones in the ScraperWiki office: basically, it allows you to systematically rewrite the history of a repository, which feels deeply wrong in source control terms.

Pro Git is good in outlining not only what you can do but also what you should do. For example, one has the choice with git to merge different branches or to carry out a rebase. I’d always been a bit vague on the difference between these two things but Pro Git explains clearly, and also tells you when you shouldn’t use rebase (when other people have seen the commits you are rebasing).

My electronic edition on Kindle does suffer from the occasional glitch, with some paragraphs appearing twice, but the writing is clear and natural. Pro Git can’t be beaten for the price, and it is probably worth the £32 Amazon charge for a paper copy.