Category: Book Reviews

Reviews of books featuring a summary of the book and links to related material

Book review: Network Graph Analysis and Visualization with Gephi by Ken Cherven


This review was first published at ScraperWiki.

I generally follow the rule that if I haven’t got anything nice to say about something then I shouldn’t say anything at all. Network Graph Analysis and Visualization with Gephi by Ken Cherven challenges this principle.

Gephi is a system for producing network visualisations; as such, it doesn’t have a great many competitors. Fans of Unix will have used Graphviz for this purpose in the past, but Gephi offers greater flexibility in a more user-friendly package. Graph theory and network analysis have been growing in importance over the past few years, in part because of developments in the analysis of various complex systems using network science. As a physical scientist I’ve been aware of this trend, and it clearly also holds in the social sciences. Furthermore, network data is now much more readily available from social media such as Twitter and Facebook.

I’ve used Gephi a few times in the past, and to be honest there has been an air of desperate button clicking to my activities. That’s to say, I felt Gephi could provide the desired output but I could only achieve it by accident. I have an old-fashioned enthusiasm for books, even for learning about modern technology. Hence Network Graph Analysis and Visualization with Gephi – the only book I could find with Gephi in the title. There is substantial online material to support Gephi, but I hoped that this book would give me a better insight into how Gephi works and some wider understanding of graph theory and network analysis.

On the positive side, I now have a good surface-level understanding of the interface, a feel for how a more expert user thinks about Gephi, and some tricks to try.

I discovered from Network Graph Analysis that the “Overview” view in Gephi is what you might call “Draft”: a place to prepare visualisations, which allows detailed interaction. The “Preview” view is what you might call “Production”: a place where you make a final, beautiful version of your visualisations.

The workflow for Gephi is to import data and then build a visualisation using one of a wide range of layout algorithms. For example, force-based layouts treat the network as a pseudo-physical system, assuming attractive and repulsive forces between nodes, and compute an arrangement by running a simulation (see the sketch below). These algorithms can take a while to converge, and may get trapped in local minima. The effect of these layout algorithms is to reveal features of the network. For example, the force layouts can reveal clusters of nodes which might also be discovered by a more conventional statistical clustering algorithm, while the concentric layout allows a clearer visualisation of hierarchy in a network.
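The same idea is easy to try outside Gephi. Below is a minimal sketch using the Python networkx library, whose spring_layout runs a Fruchterman–Reingold force simulation; the graph and parameters are illustrative choices of mine, not anything from Cherven’s book, and Gephi’s own algorithms (ForceAtlas and friends) are more sophisticated:

```python
# Force-directed layout outside Gephi: treat edges as springs and nodes
# as repelling particles, then iterate a pseudo-physical simulation.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.karate_club_graph()  # a small, well-known social network

# spring_layout runs a Fruchterman-Reingold simulation; more iterations
# give it longer to converge, and a fixed seed makes the (otherwise
# random) starting positions reproducible.
pos = nx.spring_layout(G, iterations=200, seed=42)

nx.draw(G, pos, node_size=50, width=0.5)
plt.show()
```

As in Gephi, the two communities in this particular network separate visibly once the simulation has converged.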

It’s clear that the plugin ecosystem is important to the more experienced user of Gephi. Plugins provide layout algorithms, data helpers, new import and export functionality, analysis and so forth. You can explore them in the Gephi marketplace.

Cherven recommends a fairly small, apparently well-chosen set of references to online resources and books. The Visual Complexity website looks fabulous. You can read the authors’ complete, pre-publication draft of Networks, Crowds and Markets: Reasoning about a Highly Connected World by David Easley and Jon Kleinberg here. It looks good but it’s nearly 800 pages! I’ve opted for the rather shorter Graph Theory and Complex Networks: An Introduction by Maarten van Steen.

On the less positive side, this is an exceedingly short book. I read it in a couple of 40-minute train journeys. It’s padded with detailed descriptions of how to install Gephi and plugins, including lots of screenshots. The coverage is superficial, so whilst features may be introduced, the explanation often tails off into “…and you can explore this feature on your own”.

Network Graph Analysis is disappointing: it does bring a little enlightenment to a new user of Gephi, but not very much. A better book would have provided an introduction to network and graph analysis, with Gephi the tool used for practical experience and examples, in the manner that Data Mining does for Weka and Natural Language Processing with Python does for the NLTK library.

This book may be suitable for someone who is thinking about using Gephi and isn’t very confident about getting started. The best alternative that I’ve found is the online material on GitHub (here).

Book review: Falling Upwards by Richard Holmes

I read Richard Holmes’s book The Age of Wonder some time ago; in it he made a brief mention of balloons in the 18th century. It pricked my curiosity, so when I saw his book Falling Upwards, all about balloons, I picked it up.

The chapters of Falling Upwards cover a series of key points in the development of ballooning, typically hydrogen balloons, from the last couple of decades of the 18th century to the early years of the 20th century. One of the early stories is a flight from my own home city, Chester. Thomas Baldwin recorded his flight in Airopaidia: Containing the Narrative of a Balloon Excursion from Chester, the eighth of September, 1785. The book does not have the air of a rigorous history of ballooning; it introduces technical aspects, but not systematically. It is impressionistic to a degree, and as a result a rather pleasant read. For Holmes the artistic and social impact of balloons is as important as the technical.

In the beginning there was some confusion as to the purposes to which a balloon might be put. Early suggestions included an aid to fast messengers, who would stay on the ground but use a small balloon to give them “10 league boots”; there were similar suggestions for helping heavy goods vehicles.

In practice, for much of the period covered, balloons were used mainly for entertainment – both for pleasure trips and for aerial displays involving acrobatics and fireworks. Balloons were also used for military surveillance. Holmes provides chapters on their use in the American Civil War by the Union side (and very marginally by the Confederates), and in the Franco-Prussian War, when they were used to break the Prussian siege of Paris (or at least bend it). The impression gained, though, is that they were something of a novelty item for surveillance. Even by the time of the American Civil War in the 1860s, balloon surveillance was not a routine, obvious or well-established technique. This was likely a limitation of both the balloons themselves and the infrastructure required to get them in the air.

Balloons offered little real utility in themselves, except in exceptional circumstances, but they made a link to heavier-than-air flight. They took man into the air and showed the possibilities, but for practical purposes generally didn’t deliver – largely because of their unpredictability. To a large extent you have little control of where you will land in a balloon once you have gone up. Note, for example, that balloons were used to break the Prussian siege of Paris in the outbound direction only. A city the size of Paris is too small a target to hit, even for highly motivated fliers.

Nadar (pseudonym of Gaspard-Félix Tournachon), who lived in Paris, was a great promoter of just about everything. He fought a copyright battle with his brother over his adopted signature. Ballooning was one of his passions, and he inspired Jules Verne to start writing science fiction. His balloon Le Géant, launched in 1863, was something of a culmination in ballooning. It was enormous – 60 metres high – but served little purpose other than to highlight the limitations of the form, as was Nadar’s intent.

From a scientific point of view, Falling Upwards covers James Glaisher and Henry Coxwell’s flights in the mid-nineteenth century. I was impressed by Glaisher’s perseverance in taking manual observations at a rate of one every 9 seconds throughout a 90-minute flight – some 600 readings. Glaisher had been appointed by the British Association for the Advancement of Science to do this work; he was Superintendent for Meteorology and Magnetism at the Royal Greenwich Observatory. With his pilot Henry Coxwell he made a record-breaking ascent to approximately 8,800 metres in 1862, a flight they were rather lucky to survive. Later in the 19th century other scientists were to start to identify the layers in the atmosphere, discovering that only a thin shell – 5 miles or so thick – is suitable for life.

The final chapter is on Salomon Andrée’s attempt to reach the North Pole by balloon; as with so many polar stories it ends in cold, lonely, perhaps avoidable death for Andrée and his two colleagues. Their story was discovered when their photos and journals were recovered from White Island, inside the Arctic Circle, some 30 years after they died.

Falling Upwards is a rather conversational history. Once again I’m struck by the long periods technologies can take to reach fruition. It’s true that, from a technology point of view, heavier-than-air flight is very different from ballooning. But it’s difficult to imagine doing the former without the latter.

Book review: Big Data by Viktor Mayer-Schönberger and Kenneth Cukier


This review was first published at ScraperWiki.

We hear a lot about “Big Data” at ScraperWiki. We’ve always been a bit bemused by the tag, since it seems to be used indiscriminately. Just what is big data, and is there something special I should do with it? Is it even a uniform thing?

I’m giving a workshop on data science next week and one of the topics of interest for the attendees is “Big Data”, so I thought I should investigate in a little more depth what people mean by “Big Data”. Hence I have read Big Data by Viktor Mayer-Schönberger and Kenneth Cukier, subtitled “A Revolution That Will Transform How We Live, Work and Think” – chosen for the large number of reviews it has attracted on Amazon. The subtitle is a guide to their style and exuberance.

Their thesis is that we can define big data, in contrast to earlier “little data”, by three things:

  • It’s big but not necessarily that big; their definition of big is that n = all. That is to say, in some domain you take all of the data you can get hold of. They use as one example a study of bout fixing in sumo wrestling, based on data on 64,000 bouts – which would fit comfortably into a spreadsheet. Other data sets discussed are larger, such as credit card transaction data, mobile telephony data, Google’s search query data…;
  • Big data is messy: it is perhaps incomplete or poorly encoded. We may not have all the data we want from every event, and it may be encoded using free text rather than strict categories, and so forth;
  • Working with big data we must discard an enthusiasm for causality and replace it with correlation: we shouldn’t mind too much if our results are just correlations rather than explanations (causation);
  • An implicit fourth element is that the analysis you are going to apply to your big data is some form of machine learning.

I have issues with each of these “novel” features:

Scientists have long collected datasets and done calculations at the limit of (or beyond) their ability to process the data produced. Think protein x-ray crystallography, astronomical data for navigation, the CERN detectors, and so on. You can think of the decadal censuses run by countries such as the US and UK as n = all. Or the data fed to the early LEO computer to calculate the deliveries required for each of the hundreds of J. Lyons teashops. The difference today is that people and companies are able to collect a larger quantity of data than ever before, effortlessly and without thinking about it first. The idea of n = all is not really a help. The straw man against which it is placed is the selection of a subset of data by sampling.

They say that big data is messy, implying that what went before was not. One of the failings of the book is its disregard for the researchers who have gone before. According to the authors, the new big data analysts are comfortable with messiness and uncertainty, unlike those fuddy-duddy statisticians! But small data is messy too; scientists and statisticians have long dealt with messy and incomplete data.

The third of their features: we must be comfortable with correlation rather than demand causation. There are many circumstances where correlation is fine – such as when Amazon uses my previous browsing and purchase history to suggest new purchases – but the field of machine learning / data mining has long grappled with messiness and causality.
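The correlation-is-enough style of reasoning is easy to illustrate. Here is a toy sketch of co-occurrence-based suggestion in Python – entirely hypothetical data, deliberately naive, and nothing to do with Amazon’s actual system:

```python
# A toy illustration of correlation-without-causation in recommendation:
# score items by how often they co-occur with a user's past purchases.
import numpy as np

# rows = users, columns = items; 1 means "user bought item"
purchases = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
])

# item-item co-occurrence counts (how often pairs are bought together)
cooccurrence = purchases.T @ purchases

user = purchases[0]                # user 0 bought items 0 and 1
scores = cooccurrence @ user       # correlation-style scores
scores[user == 1] = 0              # don't re-suggest items already owned
print("suggest item", int(np.argmax(scores)))  # item 2, by co-occurrence alone
```

The suggestion is driven purely by what correlates with past purchases; the code neither knows nor cares why the items go together – which is exactly the trade-off Mayer-Schönberger and Cukier describe.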

This is not to say nothing has happened in the last 20 or so years regarding data. The ubiquity of computing devices, cheap storage and processing power, and the introduction of frameworks like Hadoop are all significant innovations of the last 20 years. But they grew out of things that went before; they are not a paradigm shift. Labelling something as “big data”, a term so ill-defined, provides no helpful insight as to how to deal with it.

The book could be described as the “What Google Did Next…” playbook. It opens with Google’s work on flu trends, and passes through Google’s translation work and the Google Books project. It includes examples from many other players, but one gets the impression that it is Google the authors really like. They are patronising about Amazon for not making full use of the data it gleans from its Kindle ebook ecosystem. They pay somewhat cursory attention to issues of data privacy and consent, and have the unusual idea of creating a cadre of “algorithmists” who would vet the probity of algorithms and applications in the manner of accountants doing audit, or of data protection officers.

So what is this book good for? It provides a nice range of examples of data analysis and some interesting stories regarding the use to which it has been put. It gives a fair overview of the value of data analysis and some of the risks it presents. It highlights that the term “big data” is used so broadly that it conveys little meaning. This confusion over what is meant by “Big Data” is reflected on the datascience@Berkeley blog which lists definitions of big data from 30 people in the field (here). Finally, it provides me with sufficient cover to make a supportable claim that I am a “Big Data scientist”!

To my mind, the best definition of big data that I’ve seen is that it is like teenage sex…

  • Everyone talks about it,
  • nobody really knows how to do it,
  • everyone thinks everyone else is doing it,
  • so everyone claims they are doing it too!

Book review: Greenwich Time and the Longitude by Derek Howse

I am being used as a proxy reader! My colleague drj, impressed by my reviewing activities, asked me to read Greenwich Time and the Longitude by Derek Howse so that he wouldn’t have to.

There was some risk here that Greenwich Time and the Longitude would overlap heavily with Finding Longitude, which I read recently. They clearly revolve around the same subjects and come from the same place: the National Maritime Museum at Greenwich. Happily, the overlap is relatively minor. Following some brief preamble regarding the origins of latitude and longitude for specifying locations, Greenwich Time starts with the founding of the Royal Observatory at Greenwich.

The Observatory was set up under Charles II, who personally ordered its creation in 1675, mindful of the importance of astronomy to navigation. The first Astronomer Royal was John Flamsteed. Accurate measurement of the locations of the moon and stars was a prerequisite for determining longitude at sea, by both the lunar-distance and the clock-based methods. Flamsteed’s first series of measurements was aimed at determining whether the earth rotated at a constant rate – something we take for granted, but which wasn’t necessarily the case.

Flamsteed is notorious for jealously guarding the measurements he made, and fell out with Isaac Newton over their early, unauthorised publication, which Newton arranged. A detail I’d previously missed in this episode is that Flamsteed was not very well remunerated for his work: his £100 per annum salary had to cover the purchase of instruments as well as any skilled assistance he required, which goes some way to explaining his possessiveness over his measurements.

Greenwich Time covers the development of marine chronometers in the 18th century and the period of the Board of Longitude relatively quickly.

The next step is the distribution of time. Towards the middle of the 19th century three industries were feeling the need for precise timekeeping: telegraphy, the railways and the postal service. This was in addition to the requirements of marine navigators. The first time signal, in 1833, was distributed by the fall of a large painted zinc ball on top of the Greenwich observatory. Thereafter, strikingly similar balls appeared on observatories around the world.

From 1852 the time signal was distributed by telegraphic means, and ultimately by radio. It was the radio time signal that ultimately brought an end to the publication of astronomical tables for navigation. Britain’s Nautical Almanac, started in 1767, stopped publishing them in 1907 – less than 10 years after the invention of radio.

With the fast distribution of time signals over large distances came the issue of the variation between local time (as defined by the sun and stars) and standard time. The problem was particularly pressing in the United States, which spanned multiple time zones. The culmination of this problem is the International Date Line, which passes through the Pacific; here the day of the week changes on crossing the line. The problem was discovered by the very first circumnavigators (Magellan’s expedition, returning in 1522), who found that travellers arriving from the opposite direction disagreed with them on the day of the week – having sailed westwards, with the sun, they had experienced one fewer sunrise than those they met. I must admit to being a bit impressed by this; I can imagine it’s easy to lose track of the days on such an expedition.

I found the descriptions of congresses to standardise the meridian and time systems across multiple nations in the 1880s rather dull.

One small thing of interest in these discussions: mariners used to measure the end of the day at noon, hence what we would call “Monday morning” a mariner would call “the end of Sunday” – unless he was in harbour, in which case he would use local time! It is from 18th century mariners that Jean-Luc Picard appears to get his catchphrase “Make it so!”; this was the traditional response of a captain to the officer making the noon latitude measurement. The meridian congresses started the process of standardising the treatment of the day by “civilians”, mariners and astronomers.

The book finishes with a discussion of high-precision timekeeping. This is where we discover that Flamsteed wasn’t entirely right when he measured the earth to rotate at a constant rate: the earth’s rotation shows a long-term slowing, upon which are superimposed irregular and seasonal variations. The length of the year is slowly changing too, and added to that the poles drift by about 8 metres or so over time. It’s testament to our abilities that we can measure these imperfections, but somehow sad that they exist.

The book has an appendix with some detail on various measurements.

Not as sumptuous a book as Finding Longitude, it is an interesting read with a different focus. It also has some overlap with The History of Clocks and Watches by Eric Bruton.

Book review: Degrees Kelvin by David Lindley

How to start? I’ve read another book… Degrees Kelvin: A Tale of Genius, Invention and Tragedy by David Lindley. This is a biography of William Thomson, later Lord Kelvin, who lived 1824-1907.

Thomson lived at a time when the core of classical physics came into being, adding thermodynamics and electromagnetism to Newtonian mechanics. He played a significant role in creating these areas of study. As well as this he acted as a scientific advisor in the creation of the transatlantic telegraph, electric power transmission, marine compasses and a system of units for electromagnetism. He earned a substantial income from patents relating to telegraphy and maritime applications, and bought a blingy yacht (the Lalla Rookh) with the money.

He died a few years after the discovery of radioactivity, x-rays, special relativity and the first inklings of quantum mechanics – topics that were to form “modern physics”.

The book starts with William Thomson heading off to Cambridge to study maths. Prior to going he had already published in a mathematical journal, on Philip Kelland’s misinterpretation of Fourier’s work on heat.

His father, James Thomson, is a constant presence through William’s time in Cambridge in the form of a stream of letters; these days he’d probably be described as a “helicopter parent”. He is constantly concerned with his son falling in with the wrong sort at university, and with the money he is spending. James Thomson was a professor of mathematics at Glasgow University, and William had attended his classes there along with his brother – hence his rapid entry into academic publishing.

Fourier’s work Analytical Theory of Heat is representative of a style of physics active in France at the beginning of the 19th century. He built a mathematical model of the flow of heat in materials, with techniques for calculating the temperature throughout a body – one of which was the Fourier series, still widely used by scientists and engineers today. For this purpose the fundamental question of what heat actually is could be ignored: measurements could be made of heat flow and temperature, and the model explained these outward signs. Fourier’s presentation was somewhat confused, which led Philip Kelland, in his book Theory of Heat, to claim he was wrong. The younger Thomson’s contribution was to clarify Fourier’s presentation and point out, fairly diplomatically, that Kelland was wrong.
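To give a flavour of the mathematics (in modern textbook form, not Fourier’s own notation): for a one-dimensional bar with its ends held at zero temperature, Fourier’s model reduces to the heat equation, and the solution can be written as a series of decaying sine modes:

```latex
% 1-D heat equation for temperature u(x,t) in a bar of length L,
% with thermal diffusivity \alpha:
\frac{\partial u}{\partial t} = \alpha \frac{\partial^2 u}{\partial x^2}

% Fourier-series solution: each sine mode decays exponentially, with
% coefficients b_n fixed by the initial temperature profile u(x,0).
u(x,t) = \sum_{n=1}^{\infty} b_n \sin\!\left(\frac{n \pi x}{L}\right)
         e^{-\alpha (n \pi / L)^2 t},
\qquad
b_n = \frac{2}{L} \int_0^L u(x,0) \sin\!\left(\frac{n \pi x}{L}\right) \mathrm{d}x
```

The higher modes (larger n) die away fastest, which is why temperature distributions rapidly become smooth.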

Slightly later, the flow of letters from Thomson senior switches to manoeuvring his son into the position held by the ailing William Meikleham, Professor of Natural Philosophy at Glasgow University. This project eventually succeeds when Meikleham dies and Thomson takes the post in 1846; he retired from it in 1899.

William Thomson appears to have been innovative in teaching, introducing the laboratory class into the undergraduate degree, and later writing a textbook of classical physics, Treatise on Natural Philosophy, with his friend P.G. Tait.

Following his undergraduate studies at Cambridge, William goes to Paris, meeting many of the scientific community there at the time and working on thermodynamics in the laboratory of Henri Regnault. In both thermodynamics and electromagnetism, Thomson plays a role in the middle age of the topic: not there at the start, and not responsible for the final form of the subject, but central to its “formalisation” – taking the physical models made by others and making them mathematical. So he takes the idea of lines of force from Faraday’s electrical studies and renders it quantitative. The point of this exercise is that the model can then make quantitative predictions in complex situations – for example, of the transmission of signals down submarine telegraph wires.
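Thomson’s cable analysis is a nice example of this formalisation at work. A sketch in modern notation, not Thomson’s own: treating a long cable as distributed resistance r and capacitance c per unit length (neglecting inductance and leakage), the voltage obeys a diffusion equation – the same mathematical form as Fourier’s heat equation above – from which his “law of squares” for signal retardation follows:

```latex
% Voltage v(x,t) along a submarine cable with resistance r and
% capacitance c per unit length (inductance and leakage neglected):
\frac{\partial v}{\partial t} = \frac{1}{rc} \frac{\partial^2 v}{\partial x^2}

% A pulse therefore spreads rather than propagating cleanly, and the
% retardation for a cable of length L grows as the square of the length:
t_{\mathrm{delay}} \propto \, r c L^2
```

Doubling a cable’s length quadruples the smearing of the signal, which is part of why the design of the transatlantic cable was so contentious.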

Commercial telegraphy came into being around 1837. The first attempt at a transatlantic cable was made in 1857; a cable completed in 1858 worked only briefly, and poorly, for a few weeks. The first lastingly successful cable was laid in 1866. It’s interesting to compare this to the similarly rapid expansion of the railways in Britain. Thomson played a part from the earliest of the transatlantic cables, contributing both theoretically and practically – he invented and patented the mirror galvanometer, which makes reading weak signals easier.

It’s a cliché to say “X was no stranger to controversy”, but Thomson had his share – constantly needling geologists over the age of the earth, and getting into spats regarding James Joule’s priority in the work on the inter-convertibility of energy. It sounds like he bears some responsibility for the air of superiority that physicists can sometimes display over the other sciences, although it should be said that he rather played second fiddle to the more pugnacious P.G. Tait.

Later in life Thomson struggled to accept Maxwell’s formulation of electromagnetic theory, finding it too abstract: he was only interested in a theory with a tangible physical model beneath it. Maxwell’s theory had this at the start – an ever more complex system of gear wheels – but ultimately he cut loose from it. As an aside, the Maxwell’s equations we know today are very much an invention of Oliver Heaviside, who introduced the vector calculus notation that greatly simplifies their appearance; he too cut his teeth on telegraphy.
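For reference, here are the four equations in the vector form Heaviside gave them, written in modern SI units (Maxwell’s own presentation ran to many more equations in component form):

```latex
% Maxwell's equations in Heaviside's vector-calculus form (SI units),
% for charge density \rho and current density \mathbf{J}:
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}
\qquad
\nabla \cdot \mathbf{B} = 0

\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}
\qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J}
  + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```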

At one point Lindley laments that Lord Kelvin has not had the reputation he deserves since his death. Reputation is a slippery thing; recognition amongst the general public is fickle and no real guide to anything, and most practising scientists pay little heed to the history of their subject – fragments are used as decoration for otherwise dull lectures.

It’s difficult to think of modern equivalents of William Thomson in science; his theoretical role is similar to that of Freeman Dyson or Richard Feynman. It’s not widely recognised, but Albert Einstein, like Thomson, was active in making patent applications, though he does not seem to have benefited financially from his patents. Thomson also played the role of the Victorian projector, in the manner of Isambard Kingdom Brunel. Projects in the 21st century are no longer so obviously the work of one scientist/engineer/project manager/promoter, these roles having generally been split into specialisms.

I was intrigued to discover that Lindley apparently uses S.P. Thompson’s 1910 biography of Kelvin as his primary source, not mentioning at all the two-volume Energy and Empire by Crosbie Smith and M. Norton Wise, published in 1989.

Degrees Kelvin provides a useful entry into physics and technology in the 19th century, I am now curious about the rise of electricity and marine compasses!