
Pevsner Architectural Guide – Liverpool by Joseph Sharples

I reach Pevsner’s Architectural Guide – Liverpool (additional material by Joseph Sharples) by a somewhat winding route: I take the Merseyrail train to Liverpool; my normal route is changed and I must walk across the city; it turns out the buildings are spectacular; I take photos and then I want to know what I have photographed. This is where the Pevsner Guide enters the picture.

Sir Nikolaus Pevsner was a German-born art historian who moved to the UK in 1933. He felt that the academic study of architecture in Britain was lacking and, furthermore, that there was no convenient source of information on the many and wondrous buildings of the country. In 1945 he proposed a series of books, The Buildings of England, to address this lack. The series ultimately ran to 46 volumes: 32 written by Pevsner, a further 10 which he co-authored and 4 written by others.

This Guide – Liverpool is a city-specific reworking of the original guides. The book is large-pocket sized and well produced, with a fair number of images. It starts with an overview of the history of Liverpool. I have to admit, shame-faced, that I was woefully ignorant of the city I now work in. For nearly 10 years I have lived just down the line in Chester, and yet I had visited Liverpool only a handful of times, in the evening, for works dinners. My perceptions of Liverpool are coloured by the time I grew up, the 1980s, when Liverpool was host to riots in the Toxteth area of the city, mass unemployment and far-left politics. Walking around now, what I see is completely at odds with those perceptions, as you can see in my earlier post. To add some decoration to this post, below is the Royal Liver Building, one of the Three Graces, built on the waterfront at the beginning of the 20th century.

Royal Liver Building, wide angle view with perspective “correction” applied

And St George’s Hall:

St George’s Hall

Liverpool has long been Britain’s second port and probably has a strong claim to second city status (both following London). Initially it grew through exporting Cheshire salt, then as part of the triangle route carrying slaves, then as a point of exit for Britain’s manufactured goods and finally as a passenger terminus. Liverpool is not blessed with the best of conditions for shipping, which meant it was an early pioneer of gated docks. This was significant engineering work, only possible through the collective action of the city Corporation. It’s worth noting that one of the first railways in Britain ran between Liverpool and Manchester, providing a link between the manufacturing centre and the port. Liverpool remained preeminent until the sixties, when British manufacturing declined and shipping became containerised, much reducing the labour required. It had no “second fiddle”, so with the loss of shipping it went into rapid decline.

Nowadays Liverpool is making a resurgence; the fine buildings from its early high-water mark are joined by some excellent new ones.

After the historical overview the Guide covers six major buildings/areas of the city: the Town Hall (dating to the mid 18th century and the oldest major building in the city); St George’s Hall and the Plateau (up by Liverpool Lime Street Station); the William Brown Street Group and St John’s Gardens; Pier Head, where the outstanding Three Graces are to be found; and finally the two cathedrals (Anglican and Catholic), both built in the 20th century. The majority of the buildings date from the latter half of the 19th century and the early 20th century, the burgeoning wealth of the city having little time for preserving the relatively meagre past. The city suffered significant bombing during the Second World War as a result of its importance as a port.

After the major buildings the remainder of the Guide is broken down into a set of 10 walks around the central area of the city, spanning a few miles, with an interlude covering the city centre. I’d spotted the grand building of the Marks & Spencer store in the centre of town on my previous perambulations; it is in fact Compton House (see below), built as one of the earliest department stores in 1867.

 

Compton House, 35 Church Street (now Marks & Spencer)

As a bonus the book finishes with three short pieces from areas outside the city: Speke Hall, Port Sunlight and Hamilton Square.

The Pevsner Guides are not really designed to be read sitting on the train, as I did; they are to be held as you walk around with a map. Despite the relatively large number of photos it doesn’t feel like enough: I was frustrated by reading the words but not being able to see the buildings. I think a few more walks with the camera are in order.

The Guide is a staccato recounting of what you can see, listing locations, architectural features, architects and the occasional blunt opinion. This, for example, is its comment on the re-development of the Prince’s Dock:

“The results so far, though, are inadequate. The architecture is both bland and overly fussy”

It feels like an excellent opportunity for a smartphone app. The current publishers seem a bit bewildered by this newfangled app world and have produced a digital companion in the form of a glossary of architectural terms. Elsewhere someone is selling a database of all of the Pevsner entries; the Guide is, after all, a database rendered in prose form.

It seems the components are there for a Pevsner App, who is with me in making it happen?

Footnote

Here’s a Google Map of the Major Buildings from the Guide.

More rant on higher education

It’s been quite a while since I’ve had a rant on my blog. I’ve been muted by the responsibilities of child care and a new job, but the drought is over!

My ire has been raised by this article in the TES where university academics are bemoaning the fact that they may actually be compelled to hold qualifications in teaching.

The core information in the piece is that the Higher Education Statistics Agency is compelling universities to provide anonymised data on the teaching qualifications of their staff. Academics fear the data will be used for “foolish” ends and may make such qualifications compulsory.

Oh the humanity!

It isn’t like I have no working knowledge in this area – I worked in universities for 10 years: as a PhD student, a postdoc, a weird intermediate state at Cambridge and finally as a lecturer. In all that time I received approximately 4 days training on how to teach. I taught students in small groups, practical classes and lectures.

Teaching well is a skill and it is ironic that, in institutions which award qualifications to people, the idea that qualifications in your profession might be useful is a radical idea and a thing which must be opposed.

An interviewee for the piece lets the cat out of the bag, he says: “If you want to maintain an active research life, you need to devote… 24 hours in the day towards your research. These [teaching] qualifications take time and effort to obtain, and anything that takes away from research time makes it more difficult to stay in the research excellence framework”.

Remember this when you or your offspring are planning on spending £9k per year to attend a university, because the view that any teaching effort is a distraction from research is common.

Outside academia, qualifications are seen as something of a benefit: if I had a teaching qualification it would be on my LinkedIn profile in a flash!

Imagine buying a rather expensive car and being told “None of the designers or engineers who made this car had any qualifications in making cars, but they’ve been doing it for years”. That is the very core of teaching: learning things you wouldn’t pick up by experience alone.

So, there is my rant.

Book review: The Tableau 8.0 Training Manual – From clutter to clarity by Larry Keller

Tableau 8.0 Training Manual

This review was first published at ScraperWiki.

My unstoppable reading continues, this time I’ve polished off The Tableau 8.0 Training Manual: From Clutter to Clarity by Larry Keller. This post is part review of the book, and part review of Tableau.

Tableau is a data visualisation application which grew out of academic research on visualising databases. I’ve used Tableau Public a little bit in the past. Tableau Public is a free version of Tableau which only supports public data i.e. great for playing around with but not so good for commercial work. Tableau is an important tool in the business intelligence area, useful for getting a quick view on data in databases and something our customers use, so we are interested in providing Tableau integration with the ScraperWiki platform.

The user interface for Tableau is moderately complex, hence my desire for a little directed learning. Tableau has a pretty good set of training videos and help pages online but this is no good to me since I do a lot of my reading on my commute where internet connectivity is poor.

Tableau is rather different to the plotting packages I’m used to using for data analysis. This comes back to the types of data I’m familiar with. As someone with a background in physical sciences I’m used to dealing with data which comprises a couple of vectors of continuous variables. So for example, if I’m doing spectroscopy then I’d expect to get a pair of vectors: the wavelength of light and the measured intensity of light at those wavelengths. Things do get more complicated than this, if I were doing a scattering experiment then I’d get an intensity and a direction (or possibly two directions). However, fundamentally the data is relatively straightforward.

Tableau is crafted to look at mixtures of continuous and categorical data, stored in a database table. Tableau comes with some sample datasets, one of which is sales data from superstores across the US which illustrates this well. This dataset has line entries of individual items sold with sale location data, product and customer (categorical) data alongside cost and profit (continuous) data. It is possible to plot continuous data but it isn’t Tableau’s forte.

Tableau expects data to be delivered in “clean” form, where “clean” means that spreadsheets and separated value files must be presented with a single header line above columns which each contain data of a single type. Tableau will also connect directly to a variety of databases. Tableau uses the Microsoft JET database engine to store its data; I know this because some data requires unsightly wrangling to load in the correct format. Once data is loaded, Tableau’s performance is pretty good. I’ve been playing with the MOT data, which runs to 50,000,000 or so lines, and the range of operations I tried turned out to be fairly painless.
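To make “clean” concrete, here is a purely illustrative R sketch (the file layout and column names are invented, not from the book or from Tableau’s samples) of turning a typical messy spreadsheet export – title row, blank line, currency-formatted numbers – into the single-header, consistently-typed table Tableau wants:

```r
# Illustrative only: a messy export with a title row and blank line
# above the real header, as often comes out of a spreadsheet.
raw <- c("Superstore sales report",          # junk preamble row
         "",
         "Region,Category,Profit",           # the real header
         "West,Furniture,\"$1,234.56\"",
         "East,Technology,\"$89.10\"")

# Drop the preamble so the first line is the header
clean <- read.csv(text = paste(raw[-(1:2)], collapse = "\n"),
                  stringsAsFactors = FALSE)

# Coerce a currency column like "$1,234.56" to numeric
clean$Profit <- as.numeric(gsub("[$,]", "", clean$Profit))
```

After this, every column holds a single type and the file has one header line, which is the form Tableau (and most other tools) will ingest without complaint.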

Turning to Larry Keller’s book, The Tableau 8.0 Training Manual: From Clutter to Clarity, this is one of the few books currently available relating to the 8.0 release of Tableau. As described in the title it is a training manual, based on the courses that Larry delivers. The presentation is straightforward and unrelenting; during the course of the book you build 8 Tableau workbooks, in small, explicitly described steps. I worked through these in about 12 hours of screen time, and at the end of it I feel rather more comfortable using Tableau, if not expert. The coverage of Tableau’s functionality seems to be good, if not deep – that’s to say that as I look around the Tableau interface now I can at least say “I remember being here before”.

Some of the Tableau functionality I find a bit odd, for example I’m used to seeing box plots generated using R, or similar statistical package. From Clutter to Clarity shows how to make “box plots” but they look completely different. Similarly, I have a view as to what a heat map looks like and the Tableau implementation is not what I was expecting.

Personally I would have preferred a bit more explanation as to what I was doing. In common with Andy Kirk’s book on data visualisation I can see this book supplementing the presented course nicely, with the trainer providing some of the “why”. The book comes with some sample workbooks, available on request – apparently directly from the author whose email response time is uncannily quick.

Book review: A history of the world in twelve maps by Jerry Brotton

As a fan of maps, I was happy to add A History of the World in Twelve Maps by Jerry Brotton to my shopping basket (I bought it as part of a reduced-price multi-buy deal in an actual physical book shop).

A History traces history through the medium of maps, and various threads are developed through the book: what did people call the things we now call maps? What were they trying to achieve with their maps? What geography was contained in the maps? What technology was used to make the maps?

I feel the need to explicitly list, and comment on, the twelve maps of the title:

1. Ptolemy’s Geography, 150 AD, distinguished by the fact that it probably contained no maps. Ptolemy wrote about the geography of the known world in his time and, amongst this, collated a list of locations which could be plotted on a flat map using one of two projection algorithms. A projection method converts (or projects) the real-life geography of the spherical earth onto the 2D plane of a flat map. Projection methods are all compromises: it is impossible to simultaneously preserve relative directions, areas and lengths when making the 3D to 2D transformation. The limitations of the paper and printing technology to hand meant that Ptolemy was not able to realise his map; also, the relatively small size of the known world meant that projection was not a pressing problem. The Geography exists through copies created long after the original was written.

2. Al-Idrisi’s Entertainment, 1154 AD. The Entertainment is not just a map, it is a description of the world as it was known at the time. This was the early pinnacle in terms of the realisation of the roadmap laid out by Ptolemy. Al-Idrisi, a Muslim nobleman, made the Entertainment for a Christian Sicilian king. It draws on both Christian and Muslim sources to produce a map which will look familiar to modern eyes (except for being upside down). There is some doubt as to exactly which map was included in the Entertainment since no original intact copies exist.

3. Hereford Mappa Mundi, 1300 AD. This is the earliest original map in the book, but in many ways it is a step backwards in terms of the accuracy of its representation of the world. Rather than being a geography for finding places, it is a religious object, placing Jerusalem at the top and showing viewers scenes of pilgrimage and increasing depravity as one moves away from salvation. It follows the T-O format which was common among such mappae mundi.

4. Kangnido world map, 1402 AD. To Western eyes this is a map from another world: Korea. Again it exists only in copies, though not that distant from the original. Here we see strongly the influence of neighbouring China. The map is about administration and bureaucracy (and contains errors thought to have been added to put potential invaders off the scent). An interesting snippet is that the Chinese saw the nine-square grid (a square made of 9 smaller squares) as the perfect form – a parallel with the Greek admiration for the circle. The map also contains elements of geomancy, which was important to the Koreans.

5. Waldseemüller world map, 1507 AD. This is the first printed map. It hadn’t really struck me before, but printing has a bigger impact than simply price and availability when compared to manuscripts. Printed books allow all sorts of useful innovations such as pagination, indexes, editions and so forth which greatly facilitate scholarly learning: with manuscripts, stating that something is on page 101 of your handwritten copy is of little use to someone else with their own handwritten copy of the same original. The significance of the Waldseemüller map is that it is the first European map to name America – it applies the label to South America – yet it is sometimes seen as the “birth certificate” of the USA. Hence the US Library of Congress recently bought it for $10 million.

6. Diogo Ribeiro world map, 1529 AD. A map made to divide the world between the Spanish and Portuguese, who had boldly signed a treaty dividing the world into two hemispheres, one each. The problem arose on the far side of the world, where it wasn’t quite clear where the lucrative spice islands of the Moluccas lay.

7. Gerard Mercator world map, 1569 AD. I wrote about Mercator a while back, in reviewing The World of Gerard Mercator by Andrew Taylor. The Mercator maps are important for several reasons: they introduced new technology in the form of copperplate rather than woodcut printing, which enabled italic script rather than the Gothic script used in woodcuts; they made use (in places) of the newly developed triangulation method of surveying; and the Mercator projection was one of several methods developed at the time for placing a spherical world onto a flat map – it is the one that has endured, despite its limitations. And finally, Mercator brought the Atlas to the world – a book of maps.

8. Joan Blaeu Atlas Maior, 1662. Blaeu was chief cartographer for the Dutch East India Company (VOC), and used the mapping data his position provided to produce the most extravagant atlases imaginable. They combined a wide variety of previously published maps with some new maps and extensive text. These were prestige objects purchased by wealthy merchants and politicians.

9. Cassini family map of France, 1793. The Cassini family held positions in the Paris Observatory for four generations, starting in the late 17th century when the first geodesic studies were conducted; these were made to establish the shape of the earth, rather than map its features. I reviewed The Measure of the Earth by Larry D. Ferreiro, which related some of this story. Following on from this, the French started to carry out systematic triangulation surveys of all of France. This was the first time the technique had been applied at such scale, and it was the forerunner of the British Ordnance Survey, the origins of which are described in Map of a Nation by Rachel Hewitt. The map had the secondary effect of bringing together France as a nation: originally seen by the king as a route to describing his nation (and possibly taxing it), for the first time Parisian French was used to describe all of the country and each part was mapped in an identical manner.

10. The Geographical Pivot of History, Halford Mackinder, 1904. In a way the Cassini map represents the pinnacle of the technical craft of surveying. Mackinder’s intention was different, he used his map to persuade. He had long promoted the idea of geography as a topic for serious academic study and in 1904 he used his map to press his idea of central Asia as being central to the politics and battle for resources in the world. He used a map to present this idea, its aspect and details crafted to reinforce his argument.

11. The Peters projection, 1973. Following the theme of map as almost-propaganda, the Peters projection – an attempted equal-area projection – shows a developing world much larger than we are used to from the Mercator projection. Peters attracted the ire of much of the academic cartographic community, partly because his projection was nothing new but also because he promoted it as the perfect, objective map when, in truth, it was nothing of the kind. This is sort of the point of the Peters projection: it is open to criticism, but it highlights that the decisions made about the technical aspects of a map carry subjective weight. Interestingly, many non-governmental organisations took to using the Peters projection because it served their purpose of emphasising the developing world.

12. Google Earth, 2012. The book finishes with a chapter on Google Earth, initially on the technical innovations required to make such a map but then moving on to the wider commercial implications. Brotton toys with the idea that Google Earth is somehow “other” than previous maps in its commercial intent and the mystery of its methods; this seems wrong to me. A number of the earlier maps he discusses were of limited circulation, and one does not get the impression that methods were shared generously. Brotton makes no mention of the OpenStreetMap initiative, which seems to address these concerns.

In the beginning I found the style of A History a little dry and academic but once I’d got my eye in it was relatively straightforward reading. I liked the broader subject matter, and greater depth than some of my other history of maps reading.

Making a ScraperWiki view with R


This post was first published at ScraperWiki.

In a recent post I showed how to use the ScraperWiki Twitter Search Tool to capture tweets for analysis. I demonstrated this using a search on the #InspiringWomen hashtag, using Tableau to generate a visualisation.

Here I’m going to show a tool made using the R statistical programming language which can be used to view any Twitter Search dataset. R is very widely used in both academia and industry to carry out statistical analysis. It is open source and has a large community of users who are actively developing new libraries with new functionality.

Although this viewer is a trivial example, it can be used as a template for any other R-based viewer. To break the suspense this is what the output of the tool looks like:

R-view

The tool updates when the underlying data is updated; the Twitter Search tool checks for new tweets on an hourly basis. The tool shows the number of tweets found and a histogram of the times at which they were tweeted. To limit the time taken to generate a view, the number of tweets is limited to 40,000. The histogram uses bins of one minute, so the vertical axis shows tweets per minute.

The code can all be found in this BitBucket repository.

The viewer is based on the knitr package for R, which generates reports in specified formats (HTML, PDF etc) from a source template file which contains R commands which are executed to generate content. In this case we use Rhtml, rather than the alternative Markdown, which enables us to specify custom CSS and JavaScript to integrate with the ScraperWiki platform.
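For flavour, an Rhtml template mixes ordinary HTML with R chunks delimited by knitr’s begin.rcode/end.rcode comment markers; knitr replaces each chunk with its output. This is a minimal sketch, not the actual view.Rhtml from the tool (though NumberOfTweets and TweetsHistogram are the functions defined later in view-source.R):

```html
<!-- Sketch of an Rhtml template. knit() executes each chunk and
     splices the result into the generated index.html. -->
<html>
  <body>
    <h1>Twitter Search results</h1>
    <!--begin.rcode echo=FALSE
    source('view-source.R')
    NumberOfTweets()
    end.rcode-->
    <!--begin.rcode fig.width=8, echo=FALSE
    TweetsHistogram()
    end.rcode-->
  </body>
</html>
```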

ScraperWiki tools live in their own UNIX accounts called “boxes”, the code for the tool lives in a subdirectory, ~/tool, and web content in the ~/http directory is displayed. In this project the http directory contains a short JavaScript file, code.js, which by the magic of jQuery and some messy bash shell commands, puts the URL of the SQL endpoint into a file in the box. It also runs a package installation script once after the tool is first installed, the only package not already installed is the ggplot2 package.


function save_api_stub(){
    scraperwiki.exec('echo "' + scraperwiki.readSettings().target.url +
                     '" > ~/tool/dataset_url.txt; ');
}

function run_once_install_packages(){
    scraperwiki.exec('run-one tool/runonce.R &> tool/log.txt &');
}

$(function(){
    save_api_stub();
    run_once_install_packages();
});


The ScraperWiki platform has an update hook, simply an executable file called update in the ~/tool/hooks/ directory which is executed when the underlying dataset changes.
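The hook itself can be very small; the actual file isn’t reproduced in this post, but a sketch along these lines would re-knit the report whenever the dataset changes (the path to knitrview.R is an assumption based on the ~/tool layout described above):

```shell
#!/bin/sh
# ~/tool/hooks/update - sketch only; the real hook is not shown in the post.
# Re-knit the Rhtml template into ~/http/index.html on each dataset change.
Rscript ~/tool/knitrview.R
```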

This brings us to the meat of the viewer: the knitrview.R file calls the knitr package to take the view.Rhtml file and convert it into an index.html file in the http directory. The view.Rhtml file contains calls to some functions in R which are used to create the dynamic content.


#!/usr/bin/Rscript
# Script to knit a file 2013-08-08
# Ian Hopkinson
library(knitr)
.libPaths('/home/tool/R/libraries')
render_html()
knit("/home/tool/view.Rhtml",output="/home/tool/http/index.html")


Code for interacting with the ScraperWiki platform is in the scraperwiki_utils.R file, this contains:

  • a function to read the SQL endpoint URL which is dumped into the box by some JavaScript used in the Rhtml template.
  • a function to read the JSON output from the SQL endpoint – this is a little convoluted since R cannot natively use https, and solutions to read https are different on Windows and Linux platforms.
  • a function to convert imported JSON dataframes to a clean dataframe. The data structure returned by the rjson package is comprised of lists of lists and requires reprocessing to the preferred vector based dataframe format.
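The contents of scraperwiki_utils.R aren’t reproduced in this post, but a sketch of what such helpers might look like is below. The use of RCurl (for https) and rjson, the file path, and the ScraperWikiSQL implementation details are my assumptions; only the function name comes from the code that follows.

```r
# Sketch only: a possible shape for the helpers in scraperwiki_utils.R.
library(RCurl)   # getURL handles https, which base R (at the time) could not
library(rjson)

ScraperWikiSQL <- function(query) {
    # The SQL endpoint URL is dumped into the box by code.js (see above)
    endpoint <- readLines('~/tool/dataset_url.txt', warn = FALSE)
    url <- paste0(endpoint, '?q=', URLencode(query, reserved = TRUE))
    raw <- getURL(url, ssl.verifypeer = FALSE)
    rows <- fromJSON(raw)  # rjson returns a list of lists, one per row
    # Flatten the list-of-lists into the preferred vector-based data frame
    do.call(rbind, lapply(rows, function(r)
        as.data.frame(r, stringsAsFactors = FALSE)))
}
```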

Functions for generating the view elements are in view-source.R; this means that the R code embedded in the Rhtml template consists of simple function calls. The main plot is generated using the ggplot2 library.


#!/usr/bin/Rscript
# Script to create r-view 2013-08-14
# Ian Hopkinson
source('scraperwiki_utils.R')

NumberOfTweets <- function(){
    query = 'select count(*) from tweets'
    number = ScraperWikiSQL(query)
    return(number)
}

TweetsHistogram <- function(){
    library("ggplot2")
    library("scales")
    bin = 60 # Size of the time bins in seconds
    query = 'select created_at from tweets order by created_at limit 40000'
    dates_raw = ScraperWikiSQL(query)
    posix = strptime(dates_raw$created_at, "%Y-%m-%d %H:%M:%S+00:00")
    num = as.POSIXct(posix)
    Dates = data.frame(num)
    p = qplot(num, data = Dates, binwidth = bin)
    # This gets us out the histogram count values
    counts = ggplot_build(p)$data[[1]]$count
    timeticks = ggplot_build(p)$data[[1]]$x
    # Calculate limits, method 1 – simple min and max of range
    start = min(num)
    finish = max(num)
    minor = waiver() # Default breaks
    major = waiver()
    p = p + scale_x_datetime(limits = c(start, finish),
                             breaks = major, minor_breaks = minor)
    p = p + theme_bw() + theme(axis.text.x = element_text(angle = 45,
                                                          hjust = 1,
                                                          vjust = 1))
    p = p + xlab('Date') + ylab('Tweets per minute') +
        ggtitle('Tweets per minute (Limited to 40000 tweets in total)')
    return(p)
}


So there you go – not the world’s most exciting tool but it shows the way to make live reports on the ScraperWiki platform using R. Extensions to this would be to allow some user interaction, for example by allowing them to adjust the axis limits. This could be done either using JavaScript and vanilla R or using Shiny.

What would you do with R in ScraperWiki? Let me know in the comments below or by email: [email protected]