"Double Entry" by Jane Gleeson-White

Anyone interested in numbers and the history of math will enjoy "Double Entry: How the Merchants of Venice Created Modern Finance" by Jane Gleeson-White. It's a new history of the development of mathematics and accounting from the Renaissance forward, with a particular emphasis on the work of the mathematician Luca Pacioli. Gleeson-White is Australian, and brings a refreshing perspective to the convoluted politics of her story. The history of double-entry bookkeeping is necessarily entwined with the conversion from Roman to Arabic numerals and the development of the printing press. Pacioli documented and standardized bookkeeping, and gave the world a versatile tool. His timing was fortunate: the newly invented printing press allowed his description to be published and distributed throughout Europe.

The first six chapters take us through the 19th century, when double-entry bookkeeping was adapted to take account of the developments of the industrial revolution (expenses of man-made components like rail ties are conceptually different from a good that is traded away, like a gallon of wine). This part of the book is lucid and interesting, though it might have been enhanced by illustrations that match the glory of the cover illustration.

Gleeson-White loses her way a bit in the final four chapters, forgetting that she is talking about a tool - instead she reifies double-entry bookkeeping into the cause of many social ills. Her point that we do not account for the environmental costs of many of our actions is a good one, but that is not because of the tool. On accounting for environmental costs, today's New York Times carries an interesting article on Ireland's three-year history of carbon taxes. One short quote:
“We are not saints like those Scandinavians — we were lapping up fossil fuels, buying bigger cars and homes, very American,” said Eamon Ryan, who was Ireland’s energy minister from 2007 to 2011. “We just set up a price signal that raised significant revenue and changed behavior. Now, we’re smashing through the environmental targets we set for ourselves.”
By contrast, carbon taxes are viewed as politically toxic in the United States. Republican leaders in Congress have pledged to block any proposal for such a tax, and President Obama has not advocated one, although the idea has drawn support from economists of varying ideologies.
 Cover image via


Adam Davidson on statistics and narrative

If you haven't yet read Adam Davidson's article "God Save the British Economy" in yesterday's New York Times Magazine, click on the link and read it. It's a very clear explanation of the British government's economic austerity policy and related criticism. From this blog's perspective it's also an explicit and important description of the use of statistics in forming narrative - the narrative that we use to make sense of the world.

Here's one example from the article:
Economics often appears to be an exercise in number-crunching, but it actually resembles storytelling more than mathematics. Before the members of the Monetary Policy Committee gather for their monthly meeting, they sit through a presentation from the Bank of England’s economic staff. The staff members take the most recent economic data — G.D.P. growth, the unemployment rate and more subtle details gathered from interviews with businesspeople throughout the country — and try to fashion it into a narrative. Does a sudden spike in new factory orders represent a fundamental shift, or is it just a preholiday blip? Do anecdotal reports of rising food prices herald a period of inflation, or is it the result of a cold snap? Which story feels truer?
I've often talked about the importance of thinking about the statistics we use, understanding the context and thinking about the cognitive choices we are making when we develop and interpret the statistics. In this story Davidson dissects a set of choices. Read it and think about it - and tell me what you think in the comments.


Year in review, from Google

Google has released this cool video, showing how we searched over the course of the year. And there's more data here: searches, images, TV shows, events and more. Have fun.

I'll be posting intermittently and infrequently between now and January 2. Happy New Year, and thanks for reading.


Identifying a poor work environment, and some things you can do about it

Here's a link to a useful article "Fight the Nine Symptoms of Corporate Decline" by Rosabeth Moss Kanter on the Harvard Business Review HBR Blog network (free when you register). Here are the warning signs Kanter identifies:
  • Communication decreases
  • Criticism and blame increase
  • Respect decreases
  • Isolation increases
  • Focus turns inward
  • Rifts widen and inequities grow
  • Aspirations diminish
  • Initiative decreases
  • Negativity spreads
Though the examples may be from private business, the symptoms can occur in not-for-profits too. In fact, I've worked in places with some or all of these symptoms, and she's right, they are a warning of troubled times. This is a good checklist to use to take the temperature of your office. But seeing a warning sign doesn't necessarily mean you're on an extended downward slide. Kanter continues with some ways to shift a culture to more successful habits:

  • Keep communication open and information flowing. Foster widespread problem-solving dialogue. Face facts openly and honestly.
  • Emphasize personal responsibility. Refuse to listen to attacks on others and ask each person to take responsibility for his or her part of a problem.
  • Model respect for talent and achievements at every level. Offer frequent public thanks. Praise those who meet high standards while helping poor performers improve (or weeding them out if they don't).
  • Convene conversations across groups. Involve diverse cross-cutting teams in problem-solving.
  • Stress common purpose. Communicate inspiring goals larger than any individual or group. Find a grand challenge to unite people.
  • Work on reducing inequities and status differences. Require the privileged to mentor and help others. Spread extra resources to many groups, and encourage joint projects or shared service. Provide opportunities for learning and growth.
  • Raise aspirations. Use small wins to show the potential for bigger successes. Encourage realistic stretch goals and offer people the help to reach them.
  • Reward initiative. Provide time or small grants to work on new ideas. Make brainstorming a habit.
  • Reinforce the positive by saying and demonstrating that change is possible. Ignore the voices of negativity.
Common sense, yes. But hard to do unless you are really thinking about your environment.


Warm November weather

Weird December weather - not to mention October and November - when was the last time you saw a white Christmas? Climate Central is reporting that November 2012 was the 333rd straight month when average temperatures exceeded the 20th century average. That's nearly 28 years. And since 15 of those years were in the 20th century and included in the average, well, that's bad news.

Climate Central also says:
Much of the world saw warmer-than-average temperatures during November. Warmer-than-average weather affected Australia, the Central and Western U.S., northern Africa, far eastern Russia, and central Asia. The small European nation of Croatia was particularly mild during November, with temperatures ranging from 4.3°F to 7.9°F above average during the month.


Neuroscience research

Yesterday Columbia University announced a very large gift establishing the endowment for the Zuckerman Mind Brain Behavior Institute at Columbia. In addition to its timeliness, it's an exciting effort to bring together work in neuroscience, decision-making, and imaging with the humanities. The institute will build on the work of Nobel Laureates Eric Kandel and Richard Axel, among others. If you haven't read Kandel's books "In Search of Memory" and "The Age of Insight" (my review of the latter in the Brooklyn Bugle is here) I highly recommend them. And here's an earlier post of mine with links to some interesting articles about the science of decision-making.

Columbia held a forum on interdisciplinary neuroscience in conjunction with the announcement. From the discussion it's clear that the institute is still taking shape. The panelists mostly described the contributions the various disciplines will bring, and they expressed some hopes for useful research in the first decade.

Richard Axel spoke more philosophically (and I am paraphrasing). Axel said that the brain is the most complex structure in the universe, and we don't understand it. We have learned that an individual alone cannot understand the brain - and plenty of individuals in disciplines ranging from philosophy to biology have tried. Moreover, the mind doesn't lend itself to verbal description. The institute is being formed to find new ways to address the problem.

The mind is particularly elusive, Axel went on, because neurons (nerve cells) are not like liver cells or heart cells, where genes control the behavior. Neurons themselves do not control behavior either; that is the job of neurosystems. Neurosystems are large and complex, with trillions of connections. The brain itself abstracts each system - and translates higher order notions into the firing of neurons. It's like abstract art, he said. And the task is to understand the meaning of this abstraction.

It will be interesting to see, in 30 years, what the institute looks like. Meanwhile, you can read Columbia's press release and watch a short video about the institute here.


Some gun data

The Guardian has posted an interactive US Gun Crime Map on its data blog. The screenshot above shows the percent change in firearms murders per 100,000 population between 2010 and 2011. (Note that no data are available for Alabama and Florida.)

You can also see firearms murders as a percent of all murders, firearms murders per 100,000 population in 2010 and 2011, and firearms assaults and robberies per 100,000 population. It's interesting.


One number

and it is too large: 27.

Please sign this petition. And send a letter to the President: today is the right day to start the conversation about gun control, nationwide.

Update, December 17: For a good look at the discussion and where it can start, see this James Fallows blog post and related links. Further update: And if you need some definitions, Slate provides them here.


Global mortality rates and causes, visually

There have been a lot of news reports about the Global Burden of Disease Study published yesterday in The Lancet. The Guardian's data blog has an interactive graphic showing cause of death visually. Here is a screenshot showing the 2010 causes (by percentage) of death among women by age group.

And a screenshot of the same data by region:

You can compare 1990 to 2010, look at rates for men, women or both, and look at rate, number or percentage. And there are tables comparing 1990 with 2010. Fascinating, and very well done.


Understanding more about Bayesian analysis

Since I finished reading Nate Silver's book "The Signal and the Noise" (you can read my review of it here) I've been trying to find a way to describe the difference between Bayesian and standard statistics.

As I understand it, standard (or frequentist) statistics, the kind we were taught in school, asks the question: given a set of data, what is the frequency with which a particular phenomenon will occur? According to Silver, this way of looking at a question means that we are thinking hard about the accuracy of our measurement (but assuming that we are measuring what we want to measure).

Bayesian statistics, on the other hand, asks the question: given a certain outcome or set of data, what is the most likely cause (or causal chain) for that outcome? Again according to Silver, Bayesian statistics allow us to think about how certain we are we know something.
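To make the contrast concrete, here's a minimal worked example of Bayes' rule in Python. The disease-screening numbers (1% prevalence, 95% sensitivity, 5% false-positive rate) are invented for illustration; they aren't from Silver's book.

```python
# Bayes' rule: P(cause | evidence) = P(evidence | cause) * P(cause) / P(evidence)
# All three inputs below are hypothetical, chosen only to illustrate the idea.

prior = 0.01           # P(disease): 1% of people have the condition
sensitivity = 0.95     # P(positive test | disease)
false_positive = 0.05  # P(positive test | no disease)

# Total probability of seeing a positive test, summed over both causes
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Posterior: given the evidence (a positive test), how likely is the cause?
posterior = sensitivity * prior / p_positive

print(f"P(disease | positive test) = {posterior:.3f}")  # about 0.16
```

Even with a positive test, the chance of actually having the disease is only about 16 percent - the prior matters, which is exactly the Bayesian point about how certain we are that we know something.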

Here's a relatively simple explanation of the math. 

And here is a more complex one:

The power comes from the ability to vary the different scenarios. Using a Monte Carlo simulation the analyst builds a model but substitutes a range of values for any factor that is uncertain. That's what Nate Silver does in the analysis for his blog, as you can see when you read his methodology. (You can read Jim Manzi's book 'Uncontrolled' for a look at the same thing using big data.) Acknowledging and accounting for uncertainty means that you get better results in the long run - as in the submarine search example in the first video above.  
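As a toy illustration of the Monte Carlo idea - this is a generic sketch in Python, not Silver's actual model, and every number in it is invented - instead of plugging one point estimate into a formula, you draw each uncertain input from a range and count how often an outcome occurs across many simulated runs:

```python
import random

random.seed(42)  # make the runs reproducible

def simulate_margin():
    # Hypothetical two-candidate race: a polling average of 51%, with
    # uncertainty in both the true level of support and turnout effects.
    support = random.gauss(0.51, 0.02)        # uncertain polling estimate
    turnout_shift = random.gauss(0.0, 0.01)   # uncertain turnout effect
    return support + turnout_shift - 0.50     # margin over 50%

runs = 10_000
wins = sum(1 for _ in range(runs) if simulate_margin() > 0)
print(f"Estimated win probability: {wins / runs:.1%}")
```

Note that a 51% polling average doesn't translate into a 51% chance of winning - once you account for the uncertainty in the inputs, the simulation gives a probability somewhere around two-thirds, which is the kind of acknowledgment of uncertainty the paragraph above describes.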

Why weren't we taught it? Two reasons. First, running these simulations (Silver talks about running 10,000 a day, and that was in 2008) takes a lot of computing power, power that has only recently become available. Second, because Bayesian analysis starts with what we think we know, with a greater or lesser degree of certainty, some philosophers of science have argued, for various reasons, that Bayesian analysis failed to take account of the problem of induction: i.e., that the only true knowledge comes from deduction. (I admit I am way oversimplifying here.) (Silver has a very interesting chapter on his discussions with Donald Rumsfeld about unknown unknowns.) This view is now being rebutted. If you are interested, there's a good and reasonably accessible paper, "Philosophy and the practice of Bayesian statistics," written by Andrew Gelman and Cosma Shalizi, available here.


Arctic warming is accelerating

Last week NOAA released its Arctic Report Card: Update for 2012. The news is generally bad.
In 2012:
  • 97% of the Greenland ice sheet's surface melted. In four days.
  • the Arctic sea ice pack melted at alarming rates this summer. See also here.
  • the land and ocean surfaces are darker than normal - which means they are absorbing more sunlight and warming faster than normal.
As the journal Nature quotes one of the report's editors on its website:
The darkening of the surface creates a positive feedback that explains why the Arctic is warming twice as quickly as lower latitudes . . . This is what we call the Arctic amplification of global warming, a phenomenon that was predicted 30 years ago, which we’re now seeing happening in a significant way.
And Arctic fox and lemming populations (no symbolism there) are dropping.


Tree decorations by the numbers

Just in time for Christmas, a group of math students at the University of Sheffield, in the UK, has calculated the number of decorations, the height of the star, and the length of tinsel and lights you need for the perfect Christmas tree. And they've posted a plug-in formula so you can avoid the math.

I'm not sure what "perfect" means in this context (the screenshot is a still from the Debenham's TV ad, which you can watch here). The formula is based on the height of the tree and defaults to 140 centimeters, which is about 4.5 feet.

For a 4.5 foot tree, they say, you need 29 baubles, a 5.5 inch star, and about 14.5 feet of lights.

For a bigger tree, say 8 feet (my converter says that's 244 cm) you'd need 50 baubles, a 9.5 inch star, and about 25 feet of lights.
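If you want to check the arithmetic yourself, here's a sketch in Python of the "treegonometry" formulas as they were widely reported. I'm reconstructing the coefficients from coverage of the story, so treat them as an assumption rather than the students' official version:

```python
import math

# Widely reported "treegonometry" formulas (coefficients reconstructed from
# press coverage, so treat them as an assumption). Everything in centimeters.
def tree_decorations(height_cm):
    return {
        "baubles": round(math.sqrt(17) / 20 * height_cm),
        "tinsel_cm": 13 * math.pi / 8 * height_cm,
        "lights_cm": math.pi * height_cm,
        "star_cm": height_cm / 10,  # star height is one tenth of the tree
    }

for h_cm in (140, 244):  # the 4.5-foot default and an 8-foot tree
    d = tree_decorations(h_cm)
    print(f"{h_cm} cm tree: {d['baubles']} baubles, "
          f"{d['lights_cm'] / 30.48:.1f} ft of lights, "
          f"{d['star_cm'] / 2.54:.1f} in star")
```

Reassuringly, this reconstruction reproduces the numbers in the post: 29 baubles and about 14.4 feet of lights for the 140 cm tree, and 50 baubles and about 25 feet of lights for the 244 cm one.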

Trying this out makes me see some ratios - the star has been set at about 10% of the height of the tree. What happens when you try out the formula? Any thoughts on why height, not surface area or volume, is the basis? Have you ever not put all your ornaments on the tree? Have you even counted them? Send pictures!


NASA's Grail project maps the moon's gravity field

Update, December 14: Here's a link to the NY Times story about the crash of the two satellites into the dark side of the moon, projected for Monday.

Ebb and Flow, two satellites that have been orbiting the moon collecting data about the moon's gravity field, have sent back enough data for NASA to release this video of the gravity field map.

As NASA describes it, the map shows:
an abundance of features never before seen in detail, such as tectonic structures, volcanic landforms, basin rings, crater central peaks and numerous simple, bowl-shaped craters. Data also show the moon's gravity field is unlike that of any terrestrial planet in our solar system.
Why is this important? The moon's surface preserves the record of impacts from other bodies (a record that on Earth is overgrown, underwater, or fractured). There will be more data and more reports, until the two satellites crash later this month. Read more here.

You can see a map of the earth's gravitational field (it's not round) here.


Useful tip for the weekend

This one is from Jesus Diaz of Gizmodo:
There's a Google Mail feature you have to use. Seriously. You must. Because copying an entire chain of messages after your reply doesn't make any sense when people can scroll down to see all the messages, chained one after the other. What makes sense is to only provide the snippet that you are actually replying to. And that's why you need to do this:
1. Select the text you are replying to in Gmail.
2. Hit the reply button.
3. Boom! Only the selected text will be quoted. Reply at will.
I know. This is common in all mail programs, but most people don't know it exists in Gmail too. I discovered it by chance a long time ago, assuming it would work, but today I discovered that most people don't know about it.
So start using it, please. Pass it around and enjoy the love of your correspondents, who will be grateful forever for your neat replies.
And, in case you think we're all about numbers here, a link to the 10 most often looked-up words in 2012 (according to Merriam-Webster).


Considering the Obvious

Duncan Watts is a principal researcher at Microsoft and former professor of sociology at Columbia who is interested in what we can learn about humans from our networking behavior. I'm looking forward to reading his book "Everything is Obvious* *Once You Know the Answer" about common sense and its, well, weaknesses. His work has implications for marketing, social science research, and social services.

One example is government - we think we can use common sense, Watts says, to solve large social problems.
The problem with common sense is not that it isn’t sensible, but that what is sensible turns out to depend on lots of other features of the situation. And in general, it’s impossible to know which of these many potential features are relevant until after the fact (a fundamental problem that philosophers and cognitive scientists call the “frame problem”).
Nevertheless, once we do know the answer, it is almost always possible to pick and choose from our wide selection of common-sense statements about the world to produce something that sounds likely to be true. And because we only ever have to account for one outcome at a time (because we can ignore the “counterfactuals,” things that could’ve happened, but didn’t), it is always possible to construct an account of what did happen that not only makes sense, but also sounds like a causal story.
Common sense, in other words, is extremely good at making the world seem sensible, quickly classifying believable information as old news, rejecting explanations that don’t coincide with experience, and ignoring counterfactuals. Viewed this way, common sense starts to seem less like a way to understand the world, than a way to survive without having to understand it.
 Here's another interesting Watts column, about making predictions.
At the end of the day, making the right prediction is just as important as getting the prediction right, but it is only at the end of the day that we know which prediction was the right prediction.
If this sounds hopeless, it is -- but only if we aspire to a level of certainty about the future that is at odds with the fundamental randomness of the world.  If we acknowledge that randomness, there are still useful predictions we can make, just as poker players who count cards can make useful predictions without ever knowing with 100% confidence which particular card is going to show up next.


An interactive guide to energy use

The journal Nature has published an interactive guide to the world's energy use, available here.
You can find out which countries are using which resources (in 2011). There are some surprises. For example, I knew that mainland China and the US are large consumers of coal and oil energy. But they are also the largest consumers of hydro and renewable energy. I expect that's because the largest consumers of energy are going to be the largest consumers regardless of the source. If you think I'm wrong, please let me know via the comments.

You get a sense of the issue in these screenshots:



It's an interactive guide, and I selected a few countries to illustrate what I found. I tried to be reasonably representative of established and emerging economies while keeping the charts small enough to see. (Note - you will have to click through to the Nature page to interact with the data yourself.)


“The Cost Disease: Why Computers Get Cheaper and Health Care Doesn’t” by William J. Baumol and others

We’ve all read many articles recently about the increasing costs of important services, health care chief among them. The rising costs of education, social services, and the arts have rated hand-wringing as well. “The Cost Disease” provides a fascinating counter to this view. In the book, William J. Baumol, an economist at NYU, and his co-authors argue that the rising productivity of manufactured goods (because automation reduces the amount of labor required to produce them) offsets the rising costs of services (whose costs rise because the amount of labor required cannot be reduced). What’s more, he argues, we can afford them and should continue to pay for them.

The cost disease, Baumol argues, is the perception that because costs of services like health care are rising faster than the rate of inflation, they are being priced out of our reach. But this perception, he says, is wrong. The problem is two-fold. It’s not so much the costs themselves that bother us as it is the rate at which costs are increasing. When we look at an average increase in real costs, we forget that it’s an average – and that while some costs increase faster than the average, others increase more slowly (or even decrease). The costs of providing education, health care, or arts like opera, dance and music require a lot of labor. And the people providing that labor need to be paid enough to live, and to keep them from moving to other jobs. Baumol notes that he first published the theory in the 1960s – and that the data of subsequent years have confirmed it. For example, he reports, the salaries of health care workers have barely kept up with inflation, and those of employees at colleges and universities have not.

So, Baumol argues, if we think about the economy overall and understand that the unevenness of productivity growth is the source of the perception, we will be able to afford increasing costs, even as the services take over a larger share of the economy. (He views that as an effect, not a cause.) The cost of manufacturing will continue to decrease and we will continue to innovate, so the economy will continue to grow. He cautions that because the poor will continue to get poorer, we must make a choice to cut back on some manufacturing and invest in social goods. Baumol attaches some caveats to his prediction, among them the need for wise government policy-making, careful education of the public, and tackling some of the foremost problems we have already created: climate change, the easy availability of dangerous weapons, and, well, our own cupidity.

Baumol discusses cost disease in these contexts as well as in the context of global health care, and those chapters are very interesting. One point is important to note: in health care at least, quality-adjusted productivity has increased; that is, we're getting more benefits from our care. But when, he says, we look at productivity not adjusted for quality, the result is more mixed. If we look at productivity alone, we fail to think about how much money must be raised to purchase a product. If this sounds a lot like a cost-benefit analysis, it is, but it's an analysis that includes the context of the services. And that exposes a paradox: as Baumol puts it, we want the improvements in health care but don't like the associated costs.

It's when Baumol discusses the hybrid sectors of the economy, such as research and development or social services, that things get really interesting. In these sectors, the cost of equipment, such as the computers that support the work, quickly becomes negligible compared to the labor costs. But the work is heavily labor-dependent: you can't, for example, trust a computer algorithm to come up with the right combination of services, in the right order, to help a family enough to prevent a steep decline into violence or child neglect. This imbalance often leads to poor government decision-making in the name of cost savings.

But there is cause for hope. In addition to recognizing the cost disease, there are some hybrid sectors of the economy that repay investment. Software and business process services are Baumol's prime examples, as each repays investment twice: once when the developing company puts them to work and again when their customers do. Another way of thinking about them is as inputs to other services. This reframing can - and should - be applied to social services as well. A prime example is Steve Rothschild, whose book "The Non Non-Profit" I reviewed here. Rothschild sets out "create economic value from social benefit" as a key principle. Doing so is critical, because showing that services create taxpayers from people who otherwise might continue to receive government benefits indefinitely is a compelling argument about efficacy - and for future funding.

"The Cost Disease" is a well-written book, very clear even for non-economists. (If I have one quibble, it's that the small pages mean that the small charts can be very hard to read.) The book should be required reading for anyone interested in public policy.

Image via


The increasing heat of summer

Do summers appear to be getting warmer? You're right, they are. That video animates data showing that temperature extremes are becoming more frequent in the northern hemisphere. You can see another, equally frightening, video here. You can read the original article here.

Update: There's an amusing take on the unseasonably warm December from Philip Bump here.
