# Don’t Kill Math: Comments on Bret Victor’s Scientific Agenda

By Evan Miller

*October 27, 2012*

One of the most refreshing voices to emerge on the Web belongs to Bret Victor, who, since he left Apple in 2010, has been busy weaving together words, images, and code into compelling visions for computer-assisted creativity and scientific understanding. Among his works over the last two years you will find new “interfaces” for (really, conceptualizations of) algebra, trigonometry, dynamic systems, algorithm design, computer programming, circuit design, game design, animation, scientific publishing, mathematical exposition, and political propaganda. Each represents an impressive level of creativity and an admirable expenditure of effort. Each of these interfaces bears the mark of genius inasmuch as the average practitioner in the relevant field will think of his work differently than he thought of it prior to seeing Victor’s presentation. In particular, the practitioner will think: “I need that.”

The unifying goal of Victor’s work, as he puts it in his Inventing on Principle talk (for all practical purposes, Victor’s manifesto), is to bring people into closer communion with their creative ideas. I personally applaud this goal, and would be hard-pressed to find anyone who is against it (who can oppose human creativity?).

And yet I feel compelled to express serious concern and reservation about a particular aspect of Victor’s agenda, namely, his dismissive attitudes toward analytic methods in the sciences. His attitudes are summarized in the web page called, without subtlety, Kill Math. It collects some of Victor’s thoughts on symbolic manipulation (i.e. algebra) as well as some demonstrations for what he believes to be superior alternatives (i.e. computer simulations).

My present purpose is to come to the defense of analytic methods, and to explain why I think they should generally be preferred to other methods. Before mounting this defense, let me review Victor’s case against old-fashioned analysis.

The thrust of Victor’s argument is as follows: the fruits of the quantitative sciences are codified in symbolic equations, which, like the rhythms of Latin poetry, are accessible only to an elite few. It is wrong that this knowledge is restricted to a small subset of humanity (first argument), but this wrong can be corrected by creating computer simulations which will allow a person without mathematical ability to gain an intuition for the system being studied by manipulating an interactive interface (second argument). Furthermore, these interactive interfaces can and should replace traditional analytic methods for practicing scientists and engineers because interactive interfaces convey deeper understanding than their analytic counterparts (third argument).

I do not take issue with Victor’s first argument — that it is wrong that only a small subset of humanity can understand equations well enough to make concrete predictions. But I think Victor is misguided in his second and third arguments, when he advocates simulation as being the best predictive tool for layman and scientist alike.

In particular, Victor gives analytic methods rather short shrift when it comes to comprehending the observed world. For reasons I will explicate below, analytic methods are — and should be — considered a first choice among scientists and engineers when attempting to understand anything observable and quantitative. And because analytic methods offer a fundamentally deeper understanding of phenomena than simulations do, I believe that Victor’s time would be better spent *making analytic methods accessible to the average person* rather than attempting to replace analytic methods wholesale with computer simulations, no matter how mesmerizing and seductive the kaleidoscopic gyrations of the latter may be.

For the practicing scientist, the chief virtue of analytic methods can be summed up in a single word: *clarity*. An equation describing a quantity of interest conveys *what is important in determining that quantity* and *what is not at all important*.

To take an example, consider the universal law of gravitation, which describes the gravitational force between two bodies:

\[ F = \frac{GMm}{r^2} \]

A person with a minimal education in physics can quickly surmise:

*What’s important in determining F:* The product of the two masses; the square of the distance between them.

*What’s not important:* Whether the first body has more mass than the second body, or vice versa; whether the objects are moving, spinning, or stationary; the possible presence of a brick wall in between the two bodies; an electric charge on either body, or an electric current running through them; and everything else we might think of.

Or consider the analytic solution to Victor’s skateboard problem:

\[ \sin(\theta) = \frac{r}{h} \]

*What’s important in determining \(\theta\):* The ratio of \(r\) to \(h\).

*What’s not important:* The speed of the skateboard, the size of \(r\) in absolute terms, the size of \(h\) in absolute terms, the color of the grass, and everything else we might think of.

To the trained eye, equations yield *confident understanding* and *memorable insights*. If something is not in the equation, it simply doesn’t matter. (The importance of an equation has as much to do with what is absent from the equation as with what is in it.) Similarly, if two variables appear in the equation *only as a product* or *only as a ratio*, then *only the product (or the ratio) matters*.

Of course, an equation’s *dramatis personae* is only the beginning of the play. We can then ask questions such as: can the quantity of interest ever be zero? Can it ever be negative? If a particular variable doubles, how is the quantity of interest affected, and *what does that effect depend on?* Analytic methods (in particular, partial derivatives) tell us exactly how one variable can amplify or diminish the effect of a second variable on a third variable. They describe with great precision the rich set of interactions that lie beneath the surface.
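To make this concrete with the gravitation law above: a partial derivative states exactly how sensitive the force is to the separation,

\[ \frac{\partial F}{\partial r} = -\frac{2GMm}{r^3}, \]

and a direct substitution answers the doubling question outright: \( F(2r) = \frac{GMm}{(2r)^2} = \frac{F(r)}{4} \). Doubling the distance quarters the force, and this quartering holds regardless of the masses involved — the kind of unconditional statement no amount of knob-twiddling can deliver.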

Contrast this situation to the “intuitive” understanding one is supposed to gain by playing with a Victor-style computer simulation. One might make “discoveries”, but one is never certain:

- Does this “discovery” apply to all parameter choices?
- What is the actual *quantitative content* of this discovery?
- If some relationship appears to hold — is this relationship *approximate* or *exact*? Numbers might go “up together” or “down together”, but is the underlying relationship *linear*, *exponential*, *periodic*, or what?
- If *no* relationship appears to hold — is that because I can’t see it, or because it’s not there?
- Am I missing a more fundamental relationship in this simulation? Is there a product, ratio, or other function of parameters that is more important than the parameters considered individually?

Victor addresses the first concern in his essay, Up and Down the Ladder of Abstraction. The problem described in this essay — an algorithm for turning a car that has driven off the track — is not immediately amenable to analytic methods (at least to my eyes), so it is an excellent showcase for computer simulation. And yet it suffers from the same shortcomings that all computer simulations suffer from, namely:

- *The curse of dimensionality.* “Sweeping” a parameter space is a neat trick for finding an optimum, used to great effect in The Ladder of Abstraction as well as Victor’s video, Interactive Exploration of a Dynamical System. However, the trick becomes impractical as soon as there are more than a handful of parameters. (The amount of computation required to sweep a parameter space grows exponentially with the number of parameters.) The Ladder of Abstraction cannot have more than a few rungs.
- *No quantitative content.* I have no idea what I am supposed to take away from the picture at the top of the ladder. I might describe a “strip of stability” and an “inlet of uncertainty” in the final picture, but this is an exercise in classification, akin to geography, zoology, or philately.
- *No insight into possible relationships between parameters.* Can I reduce the provided parameters to a more fundamental quantity of some sort? There is no way of telling from the simulation.
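The curse of dimensionality is easy to state numerically. A minimal sketch (mine, not Victor’s — the function name and sampling density are illustrative) of how a grid sweep’s cost grows:

```python
def sweep_size(points_per_param, n_params):
    """Number of evaluations for a full grid sweep:
    k sample points per parameter, over n parameters, is k**n combinations."""
    return points_per_param ** n_params

# Ten points per axis is a coarse sweep, yet the cost explodes with dimension:
costs = [sweep_size(10, n) for n in (2, 5, 10)]
print(costs)  # [100, 100000, 10000000000]
```

At ten parameters, even a crude ten-points-per-axis sweep requires ten billion evaluations — hence a ladder with only a few rungs.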

As with interactive displays at science museums, the colorful process of “discovery” feels engaging and rewarding, but the actual insights tend to be tenuous and imprecise. For this particular problem domain, that may be the best one can do: build up pictures and hope for a semblance of understanding. But the absence of analytic insight should be mourned, for the same reason a sailor should curse if he is forced to navigate without a chart.

Suppose for a moment that the list of insights derived from simulation were identical in precision and quantity to those derived from analysis. I would argue that analytic methods should still be preferred wherever they are available. Analysis lets you ask the basic question of all scientific inquiry: “Why?” On what assumptions does this insight depend? When will it hold up, and when will it break down? Is it a logical consequence of something I knew previously? Simulation can demonstrate the *what* and *how*, but never elucidate the *why*. Analysis connects disparate insights together through the structure of mathematics.

Analytic methods are by no means the one true path to scientific enlightenment. They can be used speciously, that is to say, without empirical justification, and in many situations, they fail to produce any insight. *However, more than Victor would have his readers believe, analytic methods are often an appropriate and illuminating tool for messy, “real-world” problems.*

What Victor neglects to mention in his essays is that there are highly developed analytic methods that can solve problems *very close to, if not identical to* the problems that he describes as requiring simulation. The analytic solutions, when present, carry the usual benefits: clarity, confident understanding, and memorable insights.

In particular, I would like to draw attention to three analytic tools that would go a long way toward solving the problems posed in Victor’s essays:

- *Lagrange polynomials.* In Simulation as a Practical Tool, Victor drew the curved wall so it “looked good”, but it seems one could construct something awfully close with just a few polynomial or sinusoidal terms.
- *Polar coordinates.* Modeling the wall as a parametric curve or as a function in polar coordinates, one could start to solve for properties in the most general case, rather than being chained to the shapes of particular walls.
- *The calculus of variations and optimal control theory.* These are tools for finding entire functions that optimize a value of interest; they are commonly used to solve problems quite close to the car-driving problem.

The reader will forgive me for not playing the role of teacher and working out examples of each of these. That is not the purpose of the present essay. I merely wish to point out that Victor fails to exhaust all the possible analytic approaches before proclaiming they should be abandoned in favor of his preferred method of simulation.

And here, I think, is where Victor made a mistake. He speaks grandly of how with simulation, “The conditions of a problem do not need to be contrived or compromised for a convenient symbolic representation.” *But compromises and contrivances have been at the heart of all scientific discovery since Bacon.* Approximating the planets as spheres, electrons as point-charges, nuclei as point-masses and space as a vacuum are all “contrivances” with “a convenient symbolic representation” that have yielded concise descriptions — more importantly, precise predictions — about the universe we inhabit.

In the physical sciences, approximations are part and parcel with analytic methods. In my mind, the role of simulation — for both the scientist and the layman — should be to show us where our analytic approximations are useful, and where they are not useful. Minds like Victor’s should be focused not on eliminating analysis from the process of inquiry, but on making analysis easier, more intuitive, and more concrete — ideally, allowing mathematical novices to ply even the dark arts of Lagrange polynomials and the calculus of variations. Simulation facilities should serve only as a sort of back-up method for those lamentable situations where the fog is too thick to see anything clearly, that is to say, where analytic methods fail to enlighten.

* * *

A tangent to Bret Victor’s “Kill Math” project is something that he calls reactive documents. In his own words, a reactive document is one in which “The reader can play with the premise and assumptions of various claims, and see the consequences update immediately.” As an example, there is a paragraph describing the number of state parks that must be shut down in California assuming that a certain tax is levied; as the reader changes the size and structure of the tax according to his whim, the stated number of parks to be shuttered changes in concert.

It is an interesting concept. And yet the provided example represents, I think, the worst of Victor’s anti-equation ideology. There are five independent controls which one may click and drag around. But doing so is merely groping around in the dark — one tries this, tries that, and tries the other, and at the end of it, tries to remember the effect that each control actually exerted on the outcome, and (God help you) whether this effect was distorted by the position of any of the other controls.

A complete and rigorous understanding of all five of those numbers could be gained by displaying a simple equation that any high-school student can read and understand:

```
(Parks shut down) = 218 parks
  - ( (Vehicle tax) * (Fraction charged) * 28 million vehicles
    + (Income tax) * (Fraction charged) * 14 million taxpayers
    + (104 million potential visitors) * (Fraction who attend given Admission price) * (Admission price)
    ) / ($2.2 million per park)
```
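The same equation, transcribed as a function — a sketch of mine, using the constants stated in the equation, with the attendance fraction left as an explicit input rather than a hidden demand model:

```python
def parks_shut_down(vehicle_tax, vehicle_fraction, income_tax, income_fraction,
                    admission_price, attend_fraction):
    """Parks that must close, per the equation above (taxes in dollars).

    Constants from the essay: 218 parks, 28 million vehicles,
    14 million taxpayers, 104 million potential visitors,
    $2.2 million to operate each park.
    """
    revenue = (vehicle_tax * vehicle_fraction * 28e6
               + income_tax * income_fraction * 14e6
               + 104e6 * attend_fraction * admission_price)
    return 218 - revenue / 2.2e6

# With no new revenue at all, all 218 parks close:
print(parks_shut_down(0, 0, 0, 0, 0, 0))  # 218.0
```

Five lines of arithmetic, and every interaction between the controls is laid bare.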

Here we have a perfect case for analytic methods — where all of the interconnections between variables become perfectly visible to any high-schooler’s eye — but, fantastically, Victor prefers to embed a five-parameter simulation in the document rather than affright his reader with the sight of addition and multiplication. I am sure that a person of Victor’s ingenuity could make the mechanics of the above equation engaging and comprehensible to the lay reader. But rather than illuminate and explain the equation’s inner workings, he has chosen to hide them away.

Something that quickly becomes clear from examining the equation — but not from examining particular inputs and outputs — is that this miniature simulation includes an important contrivance:

Fraction who attend given Admission price

The curious reader who digs into Victor’s source code will find this snippet:

```javascript
// fake demand curve
model.newVisitorCount = model.oldVisitorCount * Math.max(0.2,
    1 + 0.5*Math.atan(1 - averageAdmission/model.oldAdmission));
```
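A direct transcription (a Python sketch of mine, not Victor’s code; I read `averageAdmission` as the newly chosen admission price) makes the contrivance easy to poke at:

```python
import math

def fake_demand(old_visitors, old_admission, new_admission):
    # Transcription of the snippet above: attendance is scaled by an
    # arc-tangent of the relative price change, with a floor of 0.2.
    factor = max(0.2, 1 + 0.5 * math.atan(1 - new_admission / old_admission))
    return old_visitors * factor

# Leaving the price unchanged leaves attendance unchanged (atan(0) = 0):
print(fake_demand(104e6, 5.0, 5.0))  # 104000000.0
```

Raising the price always shrinks attendance in this model, but along an arc-tangent whose shape has no stated empirical basis — which is precisely the point.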

To paraphrase the pamphleteer, “Behold the convenient appearance of the arc-tangent!” A crucial part of the analysis has been fabricated out of whole cloth. We are led to this discovery only because the equation encourages us to ask the question, “Why?”. In contrast, had we focused only on the knobs, switches, and output, we would never even think to question the underlying assumptions of the model. The simulation pats the reader on the head and tells him not to worry about the details.

I will conclude with a passage from Nikola Tesla, the prolific inventor who briefly worked for Thomas Edison. He described Edison’s work habits thus:

> [Edison’s] method was inefficient in the extreme, for an immense ground had to be covered to get anything at all unless blind chance intervened and, at first, I was almost a sorry witness of his doings, knowing that just a little theory and calculation would have saved him 90% of the labor. But he had a veritable contempt for book learning and mathematical knowledge, trusting himself entirely to his inventor’s instinct and practical American sense.

It is too early to tell whether Bret Victor will one day earn a proper comparison to Edison in scope of invention and influence, but “Inventing on Principle” leaves no doubt that his personal agenda is to make Edisons of us all. That is a laudable goal, but it must be qualified. If Victor were to embrace “just a little theory and calculation” in his scientific agenda, he would help us not merely to survey the universe of possible worlds via simulation, but to see the underlying structure of those possibilities with a level of clarity afforded only by analysis.
