Category: Book Reviews

Reviews of books featuring a summary of the book and links to related material

Book Review: Kingdom of Characters by Jing Tsu

Kingdom of Characters: A Tale of Language, Obsession, and Genius in Modern China by Jing Tsu describes the evolution of the technology used to handle the Chinese language from the start of the 20th century to the present day (2022). As such it blends technology, linguistics and politics.

The "issue" with Chinese as a language is that it is written as characters with each character representing a whole word, in contrast to English and similar languages which build words from a relatively small number of alphabetic characters. The Chinese language uses thousands of characters regularly, and including rarer forms brings the number to tens of thousands. This means that technologies to input, print, transmit, and index Chinese language material must be changed fairly radically from the variants used to handle alphabetic languages.

The written Chinese language has been around in fairly similar form for getting on for 3000 years, and it was in China that printing was invented in around 600AD – several hundred years before it was invented in Western Europe by Gutenberg. "Penmanship" – how someone writes characters – is still seen as an important personal skill, in a way that handwriting in English is not.

Aside from the linguistic and technological aspects of the process, politics plays an important part.

Kingdom of Characters covers the modernisation of the Chinese language and its use in new technology in seven chronologically ordered chapters. Each chapter focuses on one or two individuals, and some attempt is made to fill out their backgrounds. The first chapter covers the standardisation of the written language to Mandarin, which culminated in the 1913 conference of the Commission on the Unification of Pronunciation.

The next step in the modernisation of Chinese was the invention of a Chinese character typewriter, developed by Zhou Houkun and Shu Zhendong and commercialised by the Commercial Press from 1926.

I found the telegraphy chapter quite telling, not for its solution but as a demonstration of what happened when China was not at the table when systems were designed – the Chinese were condemned to use a numerical code system which was more expensive than sending alphabetic letters. Interestingly, the global telegraphy system spent a great deal of time trying to stop people sending encoded messages because it saw them as "fare dodging", and Chinese was caught up in this effort. Numbers were more expensive to send than letters, and representing whole words with numbers was seen as encoding.

Cataloguing gets a chapter of its own, covering the period from the late 1920s until the 1950s, but it feels like a continuation of other discussions on how to break the tens of thousands of characters down into a smaller set of ordered elements in a consistent and memorable fashion. There is precedent for this: Chinese characters are written in a standard order, stroke by stroke, and there has long existed the idea of "radicals", a small set of foundational strokes. The challenge is thus two-fold: technical but also linguistic.

In a reprise of the standardisation discussion, the fifties saw the simplification of Chinese characters, followed by the introduction of Pinyin – a phonetic system for Chinese. This replaced the Wade-Giles romanisation, developed by two Westerners. Growing up in the seventies I first learned that Peking (under Wade-Giles) was the capital of China, only for it to be replaced by Beijing (the Pinyin form) in the eighties. The new system also included Chinese tones, which have no equivalent in English or other Western languages.

The chapter entitled "Entering into the computer (1979)" is largely about using computers to do photo-typesetting to print Chinese. I suspect the Chinese invention of vector-based character representations may have leapfrogged Western technology. This work was born during the Cultural Revolution, which from 1966 to 1976 seriously impeded technological progress. I recall, in the late eighties, a Chinese academic who was visiting the research group where I did my final year undergraduate project; he had worked in the fields during the Cultural Revolution – not voluntarily – and he had a better time of it than many.

The final chapter is on the burgeoning Chinese internet, with its proliferation of input methods and an audience several times larger than that of the US. It starts, though, with the introduction of Unicode in 1988, and the standing group tasked with adding new Chinese characters to the standard from ever more esoteric literary sources.
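The scale of the encoding problem is easy to see from Unicode itself. A minimal Python illustration (the character choices are mine, not examples from the book):

```python
# Each Chinese character is a single Unicode code point, but takes
# several bytes in a common encoding such as UTF-8.
for ch in "中文字":
    print(ch, hex(ord(ch)), ch.encode("utf-8"))

# The basic CJK Unified Ideographs block alone spans U+4E00 to U+9FFF -
# room for over 20,000 characters - and further extension blocks beyond
# it hold the rarer literary forms the standards group keeps adding.
print(0x9FFF - 0x4E00 + 1)
```

Contrast this with ASCII's 128 code points, which comfortably cover an alphabetic language like English.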

The broad political context of the work is the decline of China in the 19th century under the Qing Dynasty – forced to open up to foreign influences by the Opium Wars. Towards the end of this time the Chinese language, tied to the ruling dynasty, was seen as part of the problem – holding China back from becoming a modern nation. In the 20th century, 1912 saw the formation of the republican, Nationalist government, although it was in regular conflict with the communists, and then with the Japanese in the Second Sino-Japanese War, which ended with the defeat of the Japanese in the Second World War. The People’s Republic of China was founded in 1949 with a renewed interest in preserving the Chinese language, but with the interests of the worker at its heart – under the Qing Dynasty literacy, and the use of the written language, had been a preserve of the ruling class.

Kingdom of Characters is pretty readable, and will appeal to those interested in radically different writing systems (when compared to alphabetic languages).

Book review: Data mesh by Zhamak Dehghani

This book, Data mesh: Delivering Data-Driven Value at Scale by Zhamak Dehghani, essentially covers what I have been working on for the last six months or so. It is therefore highly relevant, but I perhaps have to be slightly cautious in what I write because of commercial confidentiality.

The data mesh is a new design for handling data within an organisation. It has been developed over the last three or four years with Dehghani, at the Thoughtworks consultancy, at its core. Given its recency there are no data mesh products on the market, so one is left to build one's own from the components available.

To a large degree the data mesh is a conceptual and organisational shift rather than a technical one: all the technical component parts for a data mesh are available, less the programmatic glue to hold the whole thing together.

Data Mesh the book is divided into five parts: the first describes what a data mesh is in fairly abstract terms, the second explains why one might need one, and the third and fourth cover how to design the architecture of the data mesh itself and of the data products that make it up. The final part, “How to get started”, is about making it happen in your organisation.

Dehghani talks in terms of companies having established systems for operational data (data required to serve customers and keep the business running, such as billing information and the state of bank accounts); the data mesh is directed at analytical data – data which is derived from the operational data. She uses a fictional company, Daff, Inc., which sounds an awful lot like Spotify, to illustrate these points. Analytical data is used to drive machine learning recommender systems, for example, and a better understanding of business, customers and operations.

The legacy data systems Data Mesh describes are data warehouses and data lakes, where data is managed by a central team. The core issue with this arrangement is scalability: as the number of data sets grows the central team grows with it, and the responsiveness of the system drops.

The data mesh is a distributed solution to this centralised system. Dehghani defines the data mesh in terms of four principles, listed in order of importance:

  1. Domain Ownership – this says that analytical data is owned by the domains that generate it rather than a centralised data team;
  2. Data as a product – analytical data is owned as a product, with the associated management, discoverability, quality standards and so forth around it. Data products are self-contained entities in their own right – in theory you can stand up the infrastructure to deliver a single data product all by itself;
  3. Self-serve data platform – a self-serve data platform is introduced which makes the process of domain ownership of data products easier, delivering the self-contained infrastructure and services that the data product defines;
  4. Federated computational governance – this is the idea that policies such as access control, data retention, encryption requirements, and actions such as the “right to be forgotten” are determined centrally by a governance board but are stored, and executed, in machine-readable form by data products.
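The fourth principle is the least familiar, so here is a toy sketch of what "stored and executed in machine-readable form" might mean in practice. The names and structure are my own illustration, not Dehghani's:

```python
from datetime import date, timedelta

# A retention policy defined centrally by the governance board, but
# stored as plain data so each data product can execute it locally.
retention_policy = {"name": "retention", "max_age_days": 365}

def apply_retention(records, policy, today):
    """Keep only records younger than the policy's maximum age."""
    cutoff = today - timedelta(days=policy["max_age_days"])
    return [r for r in records if r["created"] >= cutoff]

# A data product applying the shared policy to its own records.
records = [
    {"id": 1, "created": date(2022, 1, 10)},
    {"id": 2, "created": date(2020, 3, 1)},
]
kept = apply_retention(records, retention_policy, today=date(2022, 6, 1))
print([r["id"] for r in kept])
```

The point is the division of labour: the board decides the policy once; every data product enforces it for itself, without a central team in the loop.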

For me the core idea is that of a swarm of self-contained data products which are all independent but, by virtue of simple behaviours and some mesh-spanning services (such as a data catalogue), provide a sum that is greater than its parts. A parallel is drawn here with domain-driven design and microservices, on which the data mesh is modelled.

I found the parts on designing the data mesh platform and data products most interesting since this is the point I am at in my work. Dehghani breaks the data mesh down into three “planes”: the infrastructure utility plane, the data product experience plane, and the mesh experience plane (this is where the data catalogue lives).

We spent some time worrying over whether it was appropriate to include data processing functionality in our data mesh – Dehghani makes it clear that this functionality is in scope, arguing that the benefit of the data product orientation is that only a small number of data pipelines are managed together, rather than the hundreds or possibly thousands in a centralised scheme.

I have been spending my time writing code which Dehghani describes as the “sidecar” – common code that sits inside the data product to provide standard functionality. In terms of useful new ideas, I have been worrying about versioning of data schemas and attributes – Dehghani proposes that “bitemporality” is what is required here (see Martin Fowler’s blog post here for an explanation). Essentially bitemporality means recording the time at which schemas and attributes were changed, as well as the time at which data was provided, and recording the processing time. This way one can always recreate a processing step simply by checking which set of metadata and data were in play at the time (bar data deleted under a retention policy).
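As a minimal sketch of the bitemporal idea (the structure and names here are my own illustration, not Dehghani's): if every schema version carries the time it became effective, replaying a processing step is just a question of asking which version was in force at the data's processing time.

```python
from bisect import bisect_right

# Each schema version is recorded with the time it became effective.
schema_versions = [
    ("2021-01-01", "v1"),   # (effective_from, schema id)
    ("2021-07-15", "v2"),
    ("2022-02-01", "v3"),
]

def schema_at(timestamp):
    """Return the schema id that was in force at the given time.

    ISO-format date strings sort lexicographically, so bisect works
    directly on them without parsing.
    """
    dates = [d for d, _ in schema_versions]
    i = bisect_right(dates, timestamp) - 1
    if i < 0:
        raise ValueError("no schema in force at " + timestamp)
    return schema_versions[i][1]

print(schema_at("2021-06-30"))
```

The same look-up applies to attributes and to the data itself, which is what lets a processing step be recreated exactly as it originally ran.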

Data Mesh also encouraged me to decouple my data catalogue from my data processing, so that a data product can act in a self-contained way without depending on the data catalogue which serves the whole mesh and allows data to be discovered and understood.

Overall, Data Mesh was a good read for me, in large part because of its relevance to my current work, but it is also well-written and presented. The lack of mention of specific technologies is rather refreshing and means the book will not go out of date within the next year or so. The first companies are still only a short distance into their data mesh journeys, so no doubt a book written in five years’ time will be a different one – but I am trying to solve a problem now!

Book review: The Art of More by Michael Brooks

The Art of More by Michael Brooks is a history of mathematics written by someone whose mathematical ability is quite close to mine – that’s to say we did pretty well with maths at school but when we went to university we reached a level where we stopped understanding what we were doing and started just manipulating symbols according to a recipe.

The book proceeds chronologically starting with origins of counting some 20,000 years ago and finishing with information theory in the mid-20th century with chapters covering arithmetic, geometry, algebra, calculus, logarithms, imaginary numbers, statistics and information theory.

It is probably chastening to modern mathematicians and scientists that much of the early work in maths on developing the number system, including zero and negative numbers, was driven by accounting and banking. Furthermore, much of the early innovation came from China, India and the Middle East with Western Europe only picking up the ideas of zero and negative numbers in around the 13th century.

Alongside the development of the number system, the ancient Greeks and others were developing geometry; the Greeks seemed to go off numbers when they discovered irrational numbers – those which cannot be expressed exactly as a ratio of integers! Geometry is essential for construction, surveying, navigation and mapmaking – sailors have often been competent mathematicians, through necessity. Geometry also plays a part in the introduction of accurate perspective in drawings and paintings.

Complementing geometry is algebra, developed in the Arabic world. Our modern algebraic notation did not come into being until the 16th century, with the introduction of the equals sign and what we would understand as equations. Prior to this, problems were expressed either geometrically or rather verbosely.

Leading on from algebra was calculus – the maths of change. It started sometime around the beginning of the 17th century with Kepler calculating the volumes of wine barrels whilst he was preparing for his wedding. There was further work on infinitesimals through the century before Newton and Leibniz, who are seen as the inventors of calculus. I was struck here by how all the key characters in the development of calculus – Newton, Leibniz, Fermat, Descartes and the Bernoullis – sounded like deeply unpleasant men. Is this the result of the distance of history and the activities of various proponents for and against in the intervening centuries? Or were they really just deeply unpleasant men?

Doing a lot of calculation started to become a regular occurrence for sailors, as well as for people such as Kepler and Newton working on the orbits of various celestial bodies. John Napier’s invention of logarithms, and his tables of logarithms published in 1614, greatly simplified calculations by converting multiplication and division into addition and subtraction of values looked up in the tables. The effort to create the tables was massive: it took Napier 20 years to prepare his first set, containing millions of values. Following Napier’s publication in 1614, logarithms reached their modern form (including natural logarithms) by 1630. In addition, mechanical calculating devices like the slide rule were quickly invented. I grew up in a house with slide rules, although by the time I was old enough to appreciate them electronic calculators had taken over. Napier was also an early promoter of the modern decimal system. Logarithms also link to exponential growth, highly relevant as we still wait for the COVID pandemic to subside.
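Napier's trick rests on the identity log(ab) = log(a) + log(b). A tiny Python demonstration, with `math.log10` standing in for a look-up in his tables (the numbers are my own example, not one from the book):

```python
import math

# Multiply 347 by 29 the way a table user would: two look-ups,
# one addition, and one reverse look-up (the "anti-log").
a, b = 347.0, 29.0
log_sum = math.log10(a) + math.log10(b)
product = 10 ** log_sum    # anti-log of the sum
print(product)             # ~10063, i.e. 347 * 29
```

For a sailor or an astronomer doing hundreds of long multiplications, swapping each one for an addition was a huge saving – which is why the 20 years spent compiling the tables paid off.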

Historically the next area of maths is the invention of imaginary numbers – if you don’t know what these are then I’m not going to be able to explain them in the space of a paragraph! There is a link here with natural logarithms through Euler’s identity, which somewhat ridiculously manages to link e, pi and i in one really short equation. I was not previously familiar with Charles Steinmetz, who introduced complex numbers into the analysis of electrical circuits responding to alternating currents – a very elegant way of handling the problem, and a method I used a lot at university. Largely when we talk about complex numbers we are discussing the addition of i, the square root of -1, to our calculations. But there are additionally quaternions, invented by William Hamilton, which add three imaginary units – i, j and k – to the real numbers, and the limit is octonions, a system of seven imaginary units and the real numbers. I am curious as to why we cannot have more than seven.
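Both of those ideas fit in a few lines of Python. The first checks Euler's identity, e^(iπ) + 1 = 0; the second shows Steinmetz's method, where a circuit element becomes a single complex impedance so that AC analysis reduces to complex arithmetic (the component values here are my own illustration):

```python
import cmath
import math

# Euler's identity: e^(i*pi) + 1 = 0, linking e, pi and i.
z = cmath.exp(1j * math.pi) + 1
print(abs(z))   # zero, up to floating-point error

# Steinmetz: a resistor R in series with an inductor L, driven at
# angular frequency w, has complex impedance Z = R + j*w*L.
R, L, w = 100.0, 0.05, 2 * math.pi * 50
Z = R + 1j * w * L
current = 230.0 / Z         # phasor current for a 230 V source
print(abs(current))         # current magnitude in amps
```

The phase of `current` falls out of the same complex division for free, which is exactly the elegance Steinmetz was after.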

Statistics is my area of mathematics, I’m a member of the Royal Statistical Society. I think the thing I learned from this chapter was that the word "statistics" has its origins in German, meaning "facts about the state". I quite liked Brooks’ description of p-values, which seemed particularly clear to me. Brooks highlights some of the sordid eugenicist history of statistics, as well as the more enlightening work of Florence Nightingale and others.

The book finishes with a chapter on information theory, largely based on the work of Claude Shannon but with roots in the work of Leibniz and George Boole. George Boole invented his Boolean logic in an attempt to understand the mind in the mid-19th century but his work on "binary" logic was neglected for 70 or so years until it was revived by Shannon and other pioneers of early computing.

This is a fairly informal history of mathematics, I found it very readable but it includes a number of equations which might put off the completely non-mathematical.

Book review: Play it Loud by Brad Tolinski and Alan Di Perna

I took up the guitar a few years ago, and play in the manner described by Kurt Vonnegut – that’s to say with little skill but expanded horizons. I read The Birth of Loud by Ian S. Port a while back, and Play it Loud by Brad Tolinski and Alan Di Perna is in a similar vein: a book about the electric guitar and the music that came from it. Whilst The Birth of Loud focused on Leo Fender and Les Paul and a period from the early fifties to the mid-sixties, Play it Loud starts earlier, extends later and is broader in scope.

Play it Loud is divided into chapters which typically cover one or two people and one or two guitars, each illustrating a technical innovation or change in musical style. Broadly each chapter follows on from the previous one in time, taking us from the 1920s and thirties in the first chapter through to around 2015 by the end. It finishes with a timeline, which I liked.

The book starts with George Beauchamp in the 1920s and the first guitar pickups designed to pick up the vibration of the strings rather than the vibration of the guitar body; this followed the invention earlier in the century of the electronic valve amplifier and the paper cone speaker – both prerequisites for useful electric guitars. Guitars had been around for some time, and in the twenties guitar-based Hawaiian music was popular in the US – Hawaiian stringed music had its roots in Portuguese sailors in the 18th century. Beauchamp, with Rickenbacker, produced the first electric guitar based on this technology, the A-32 ‘Frying Pan’, in 1932. This was a cast-aluminium lap-steel style guitar.

The next development was the Gibson ES-150 in 1936, with a bar pickup that sat under the strings rather than over them as in the Beauchamp pickup; ES stands for Electric Spanish – it was the first of its kind. The guitar was made popular by the endorsement of Charlie Christian, a jazz guitarist who at the time was considered better than Django Reinhardt and Les Paul. He was to die at the age of 25 of tuberculosis. This type of endorsement is a recurring theme: celebrated musician endorsements are massively valuable to guitar companies.

By the early fifties a number of people had realised that the guitar body was largely a place to hang strings and pickups and no longer needed to be hollow – in an acoustic guitar the hollow chamber is the amplifier. Thus were born the Fender Telecaster, then the Stratocaster and, at Gibson, the "Les Paul". This is the period covered in The Birth of Loud. It is worth noting that Les Paul was one of a breed of musician/technicians – recurring later with Eddie Van Halen and Steve Vai in the late seventies and early eighties – who pushed forward the development of the guitar. I hadn’t realised that the very futuristic-looking Gibson Flying V and Explorer models were born in this period of the late fifties; they were unpopular then but saw a resurgence in the early eighties.

The new solid-body electric guitar, Fender’s Precision Bass and new amplifiers meant that by the early sixties an electric four-piece band could fill a hall with sound (previously this required a big band or an orchestra), and by the late sixties Jimi Hendrix could make rather more noise than that. At this point Tolinski and Di Perna highlight how the electric guitar fitted in with protest and the counter-culture – also citing Bob Dylan and his infamous switch to the electric guitar. His electric set at the Newport Folk Festival was so short because it had only been brought together a few days earlier.

By the late sixties the quality of Fender and Gibson’s offerings was dropping, and players like Eric Clapton started looking for the discontinued Les Paul models. The drought in good quality guitars was to extend for a while: in the mid-sixties, while Fender and Gibson were dropping in quality, Japan was producing a large number of cheap, low quality guitars. In this environment an after-market parts business grew, with names we recognise today like Seymour Duncan, Jackson/Charvel and Larry DiMarzio. Japan was later to produce high quality guitars – Steve Vai chose Ibanez to make his signature model.

The book finishes with a chapter centred on Jack White of The White Stripes, and his enthusiasm for very retro, and not highly regarded, guitars and amplifiers. This represents a thread running through the book: guitars are more than their technical components – the choice of guitar says something about a player’s intentions. So Eric Clapton took up the discontinued Les Paul to ape the earlier blues players. The punk and garage bands were trying to get away from those blues roots, and cheap, plastic guitars fitted that vibe. They were also trying to get away from the comfortable middle class hobby guitarists (like me) who would happily spend a couple of thousand dollars on a signature or classic guitar because they could.

In common with my reading of The Birth of Loud I found myself googling for the guitars mentioned and thinking I should get one!

Book review: Pale Rider – The Spanish Flu of 1918 by Laura Spinney

Pale Rider: The Spanish Flu and How it Changed the World by Laura Spinney is obviously very topical at the moment. It was published in June 2017, which makes its relevance all the more striking than if it had been written in the last two years.

The book starts with an overall chronology of the 1918 flu pandemic before returning to specific themes, generally through the medium of personal accounts or individual incidents. It is worth highlighting that the "Spanish" label is highly misleading: the 1918 flu pandemic arose either in the American mid-West, in Northern France on the battlefields of the First World War or, a remote possibility, in China. Spinney discusses the link with viruses found in wildlife and livestock.

Initial estimates of the death toll of the 1918 flu pandemic were around 25 million, but these have recently been revised upwards, to as much as 100 million. The pandemic largely took place over September to December 1918, with smaller waves in the spring of 1918 and the following spring, and there were some variations by geography as to exactly when the worst effects were felt. So the 1918 flu pandemic was a shorter, more devastating pandemic than the 2020 covid pandemic (which has killed around 3 million of a much larger population). This was against the backdrop of the First World War, which killed more people in Europe than the pandemic did – although Europe was the exception: on every other continent more were killed by the pandemic.

The context for the 1918 flu pandemic was different too: the 19th century had been one of epidemics, driven by industrialisation and the associated urbanisation. Amongst these were flu pandemics in 1830 and 1890; the 1890 "Russian" flu pandemic was the first to be measured as a pandemic. The 1918 pandemic came at a time when the germ theory of disease was being developed and the value of hygiene was understood. However, viral diseases were not well understood; it was not until the 1930s that the mechanism of transmission for flu was discovered, with the first flu vaccines coming in 1936, and not until the 1950s that flu was confirmed as a viral disease. The symptoms of this flu pandemic were quite different from those of the covid pandemic, with a mahogany colouration forming on the cheekbones and spreading progressively until death, teeth and hair falling out, and delirium (leading to suicide).

The health measures taken to address the 1918 pandemic were not that different from those used recently, with sanitary cordons and quarantine used extensively. Religious ceremonies were exempt from restrictions in Spain, leading to more cases. Closing schools was argued over, with those in favour of keeping schools open seeing them as better for the monitoring of outbreaks and the communication of health information, and as offering better sanitary conditions, and food, to children. Starvation was a problem, with supply chains affected from start to finish.

It is interesting to see the differing responses of Australia and New Zealand between the 1918 pandemic and the covid pandemic: Australia isolated itself in 1918, as it did in the covid pandemic, but New Zealand in 1918 did not. The disproportionate impacts of the 1918 pandemic were also in evidence, with recent Italian immigrants to the US, India, and remote native American communities in Alaska very badly affected, with mortality rates of up to 40%.

The pandemic had arguable impacts on world affairs: Woodrow Wilson had a serious stroke, probably as a result of a bout of flu, and was not present to limit the war reparations against Germany; the independence movement in India grew. The flu impacted people in their twenties and thirties quite heavily, leaving behind a generation of orphans – their treatment was handled with new legislation in France and England. There was a post-pandemic (and post-war) fertility boom.

Despite the enormous death toll, even compared to that of the First World War, the 1918 pandemic appeared to have little impact on art and literature, although scholars will look for signs of post-viral fatigue in paintings. Spinney argues this is because insufficient time has passed, noting that there are approaching 80,000 books on the First World War but only 400 on the 1918 pandemic – though this number is growing rapidly. It has made me wonder about the lost siblings in my grandparents’ generation who were never spoken of – similarly the absence of stories from fighting-age men of the Second World War. Essentially these stories were too painful to handle at a human, personal level, and the culture, in the UK at least, was not to speak about them. So it is left to historians and the passage of time for the stories to come to light.

A second factor, proposed by psychologists, is that pandemics lack a good story line with a clear beginning and end and a selection of heroes – unlike the First World War.

Pale Rider is very readable, though it is difficult to use the word "enjoy" of a book which tells of the deaths of 100 million people. I was struck by how relevant the 1918 flu pandemic is to our current situation: the disparate impacts depending on country and social conditions, the debates over school closures, the dedication of medical staff, the measures to address the pandemic and the debates over compliance with public health measures. The covid pandemic is different – it has played out over a longer period, it has a far lower death toll, our medical knowledge is much improved and our world is much more connected – but nevertheless Pale Rider feels very prescient.