From al-Khwārizmī to Steve Jobs

The sixth and last in a series. See the November 20 entry, “What is algebra?”, for the first; the December 13 entry, “When did algebra begin?”, for the second; the December 19 entry, “The golden age of Arabic mathematics”, for the third; the December 25 entry, “al-Khwārizmī”, for the fourth; and the December 30 entry, “What is algebra good for?”, for the fifth.

JANUARY 5, 2012. History tends to focus on key individuals, when in fact most advances are the cumulative effect of the contributions of many. In the case of algebra, claims that al-Khwārizmī invented algebra are not sustainable. As I have explained in previous articles in this short series, the chain leading to algebra goes back at least to the ancient Babylonians, and to modern eyes Diophantus’s book Arithmetica was clearly a book on algebra. Nevertheless, al-Khwārizmī does deserve the credit for establishing algebra as a major collection of intellectual tools.

He can’t be credited with establishing it as a branch of mathematics, however, since the mathematicians of the Arabic period did not view the methods they developed as anything other than a set of very valuable practical tools. (Likewise, Diophantus viewed his work as a sophisticated form of arithmetic, as the title of his famous work suggests.) Viewing algebra as a discipline in its own right came later.

Al-Khwārizmī’s greatness is in the same category as that of Euclid nine centuries earlier, of Leonardo of Pisa four hundred years later, or of Steve Jobs in our own time: their impact on society and thence the course of history. None of these four was the original inventor or discoverer of the seminal developments we associate with their names. Their greatness was not one of original discovery – though both Euclid and al-Khwārizmī may well have contributed some of the methods they described in their seminal books, and we know from his other works besides Liber abbaci that Leonardo was a first-rate, original mathematician. Rather, all four had the highly unusual ability to take a collection of powerful new ideas and package and present them to society in a manner that made them acceptable to – indeed eagerly sought-after by – a wide range of people. In our present-day society we tend to focus on the priority of discovery and invention, as epitomised by the status we award Nobel Laureates, but initial discovery would be of little value were others not able to take the new knowledge and use it to change society.

Each of Euclid, al-Khwārizmī, and Leonardo (and Steve Jobs) was followed by many others who carried the torch forwards, and they too deserve credit.

Among the hundreds of Arabic mathematicians who helped to develop and spread algebraic knowledge after al-Khwārizmī, several stand out as worthy of special mention. I’ll list a few.

Abū Kāmil The Egyptian-born Abū Kāmil Shujāʿ ibn Aslam ibn Muḥammad ibn Shujā (c. 850 – c. 930) was the first major Arabic algebraist after al-Khwārizmī. By all accounts he was a prolific author. There are references to works with the titles Book of fortune, Book of the key to fortune, Book of the adequate, Book on omens, Book of the kernel, Book of the two errors, and Book on augmentation and diminution. None of these have survived. Works that did survive include the Book on algebra, the Book of rare things in the art of calculation, Inheritance by means of algebra, and the Book on surveying and geometry.

The Book on algebra (Kitāb fi al-jabr wa al-muqābala) is arguably Abū Kāmil’s most influential work. It expanded on al-Khwārizmī’s Algebra. Whereas the latter was aimed at the general public, Abū Kāmil wrote more for other mathematicians, assuming familiarity with Euclid’s Elements. He also extended the range of equations studied beyond al-Khwārizmī’s, working with powers of the unknown as high as the eighth.

Al-Karajī A century after Abū Kāmil did his work, around 1000 C.E., another major advance in algebra was made by the Persian mathematician and engineer Abū Bakr ibn Muḥammad ibn al-Ḥusayn al-Karajī, who lived from c. 953 to c. 1029. His three major works were Al-Badi’ fi’l-hisab (Wonderful on calculation), Al-Fakhri fi’l-jabr wa’l-muqabala (Book of al-Fakhri on the Art of Algebra), and Al-Kafi fi’l-hisab (Sufficient on calculation).

Al-Fakhri is regarded as one of the key works on the path that led to the final separation of algebra from geometry as a discipline in its own right. Al-Karajī gave a systematic treatment of reducible higher-degree equations. He studied the algebra of exponents, and was the first to state explicitly that the sequence x, x², x³, … could be extended indefinitely, and likewise the reciprocals 1/x, 1/x², 1/x³, …
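In modern notation (which is ours, not al-Karajī’s – he worked with named powers, not symbols), his insight amounts to viewing the powers of the unknown and their reciprocals as a single unending sequence, governed by the familiar rule for multiplying powers:

   … , 1/x³, 1/x², 1/x, 1, x, x², x³, …
   xᵐ · xⁿ = xᵐ⁺ⁿ   (for example, x² · x³ = x⁵)

Stating that rule explicitly across the whole sequence, reciprocals included, was a step completed only later, most notably by al-Samawʾal, whom we shall meet shortly.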

Omar Khayyám Shortly after al-Karajī died, another famous scholar of the Islamic world came onto the scene: the Persian Omar Khayyám. Although in the West he is better known today as a poet, he was a first-rate mathematician.

Al-Khayyám, more fully Ghiyath al-Din Abu’l-Fath Umar ibn Ibrahim al-Nisaburi al-Khayyámi, was born on May 18, 1048, in Nishapur, Persia (now Iran), and died there on December 4, 1131. As a young man he studied philosophy, and went on to be an outstanding mathematician and philosopher. By the time he was 25, he had written several books, covering arithmetic, geometry, algebra, and music. His major work in algebra was an analysis of polynomial equations titled Treatise on the Proofs of Algebra Problems.

Al-Khayyám approached mathematics primarily as a geometer, firmly rooted in the Greek tradition. Whereas Abū Kāmil and al-Karajī presented algebra as a method for numerical problem-solving, al-Khayyám viewed it as a tool for theoretical geometers.
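To see what that meant in practice, here is a modern reconstruction (in symbols al-Khayyám never used) of his best-known kind of result: solving a cubic equation by intersecting two conic sections. To solve

   x³ + a²x = a²b,

intersect the parabola x² = ay with the circle x² + y² = bx (the circle with diameter b passing through the origin). Substituting y = x²/a into the equation of the circle gives x² + x⁴/a² = bx; multiplying through by a² and dividing by x recovers x³ + a²x = a²b. The x-coordinate of the point where the two curves cross is thus a root of the cubic – though for al-Khayyám the solution was the line segment itself, not a number.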

al-Samawʾal Several further advances in algebra were made around the mid-twelfth century by a teenager (yes, that’s right, a teenager) called Ibn Yaḥyā al-Maghribī al-Samawʾal, who was born around 1130 in Baghdad. His parents were Jewish, his father a literature scholar and rabbi from Morocco, his mother from Basra, in Iraq.

Although his initial interest as a child was to become a doctor, al-Samawʾal proved to be a child prodigy in mathematics, and the study of medicine was soon relegated to second place (but not abandoned). He began to study the Hindu methods of calculation when he was thirteen or so. Rapidly finding himself ahead of his teachers, he continued on his own, reading the works of Abū Kāmil, al-Karajī, and others. By the time he was eighteen years old he had read almost all the available mathematical literature. He wrote his most famous treatise, al-Bahir fi’l-jabr (The brilliant in algebra), when he was just nineteen years old.

Mathematicians before al-Samawʾal had begun to develop what contemporary historians have called the “arithmetization of algebra”. Al-Samawʾal was perhaps the first to give this development a precise description, writing that it involved “operating on unknowns using all the arithmetical tools, in the same way as the arithmetician operates on the known.” This can be regarded as a significant step toward the development of modern algebra.
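A modern illustration (not al-Samawʾal’s own example) of what it means to operate on unknowns with all the tools of arithmetic: polynomials can be divided by long division exactly as numbers are, with coefficients playing the role of digits. For instance,

   (x³ + 2x² − x − 2) ÷ (x − 1) = x² + 3x + 2,

as multiplying back out confirms. Al-Samawʾal carried out just such divisions, and even allowed a quotient to run on indefinitely through the reciprocal powers 1/x, 1/x², …, anticipating what we would today call a power series expansion.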

In all, al-Samawʾal is reported to have written 85 books or articles, though most have not survived. He died in Maragha, Iran, around 1180.

Further advances in algebra were made in the Maghreb in the twelfth to fifteenth centuries, by a highly organized teacher-student network linked to mosque and madrasah teaching. The Maghreb mathematicians used abbreviations both for unknowns and their powers and for operations – another innovation in the chain that culminated in the development of modern symbolic algebra in Europe in the 16th century.

* * *

For the next episode in the development and growth of algebra, when the ideas found their way to Europe, see my recent book The Man of Numbers: Fibonacci’s Arithmetic Revolution. (And for a comparison between Fibonacci’s role and that of Steve Jobs, see the companion e-book Leonardo and Steve.)

Acknowledgement
I am grateful to Professor Jeffrey Oaks of the University of Indianapolis for his assistance in the preparation of the essays in this series. In particular, he supplied me with preprints of his forthcoming articles for Springer Verlag’s Encyclopedia of Sciences and Religions (2012): “Mathematics and Islam”, “Arithmetic and Islam”, “Algebra and Islam”, and “Geometry and Islam”, which I drew on heavily. He also commented in detail on a more substantial work from which these essays were abridged.

Steve Jobs remembered

OCTOBER 10, 2011. The fact that I, along with millions of other people, learned of Steve Jobs’ death by way of a text message or email sent to my iPhone indicates just how huge an impact the Apple co-founder and CEO had on the way many of us go about our business and live our lives. I can’t imagine any other CEO of a large corporation whose death would be so widely mourned.

By chance, the Apple Store in Jobs’ home town of Palo Alto, where I live, is right across the street (University Avenue) from the Peet’s Coffee Shop where I go for my morning latte. As I left the coffee house the morning after his death was announced, I noticed some flowers and a couple of lighted candles had been left on the sidewalk in front of the Apple Store window. I walked over and took a couple of photos. A day later, that small shrine had grown considerably.

Since Jobs had lived less than a mile from my home, on the Saturday morning I detoured past his (surprisingly modest) suburban-style house on my way to the supermarket. There I found precisely what I had anticipated: a huge outpouring of grief and remembrances.

Again it struck me: this reaction to the passing of a billionaire CEO? I cannot imagine a similar response to the death of Bill Gates, Larry Page, Sergey Brin, Mark Zuckerberg, or any of the other technology elite. Somehow, Jobs did more than provide people with useful consumer products; he touched their emotions.

To be sure, not everyone is an Apple fan. But those who are have a deep attachment to the brand. People tend not merely to like Apple stuff; rather, they LOVE it. And that is surely a tribute to superb design – design both for appearance and for usability.

I can’t say I’ve ever been a fan of anyone or anything, but I am a sucker for great design. When I first saw, and used, a Macintosh (one of the first generation), it quite literally changed my career, and in due course my life. As a mathematician with research interests in computation and a side interest (more of a hobby) in computational number theory, I was using computers long before the personal computer came on the scene. Back then (and this was the 1970s), a computer was a slave that you commanded to do things for you. Even worse, it was a dumb slave, so you had to formulate those commands with extreme care. A single missing or misplaced comma would cause the machine to grind to a halt.

All that changed with the Mac. As a user, you were no longer issuing commands to be acted upon; you were HANDLING INFORMATION. It was your stuff, and you were the one performing the action. Or so it seemed. In reality, you were being fooled by a cleverly designed interface. But that was the genius of the Mac: it took something intrinsically alien to human beings, computation, and presented it in a way that we find natural and instinctive.

Well, actually, that genius was not Apple’s; rather, it was that of a remarkable group of researchers at the Xerox Palo Alto Research Center (PARC), who had developed this new approach to computing over several years. To Xerox (and many of the researchers at PARC), the goal was to build computers to support office workers. Jobs’ genius was recognizing that, properly packaged and marketed, this approach to computing could turn computers into mass-market consumer products.

When I first started to use a Mac, I realized that there was huge value to be had in viewing reasoning not as a PROCESS of logical deduction to arrive at a conclusion (the classical view of mathematical reasoning), but as GATHERING INFORMATION in order to reach a decision. That shift in viewpoint first resulted in my book Logic and Information, published in 1990, and my entire research career since then has followed on the heels of that book.

Were it not for the Mac, I doubt I would have shifted my research the way I did. It would not have been enough to read about the WIMP interface (windows, icons, mouse, pointer), or even to watch someone else using it. It was the powerful sensation of DOING it yourself, of physically manipulating items of information, that made all the difference. The Mac was not a device you used. It was something you experienced. And that is a profound difference.

As became clear through his entire career, Jobs had a deep appreciation for the importance of making technology something people experience – not merely use.

By pure chance, just two months before Jobs died, I published a short e-book in which I compared him with one of the greatest innovators of all time, the thirteenth-century mathematician Leonardo of Pisa, known more commonly today by the modern nickname Fibonacci. Leonardo changed the world by writing a book that introduced modern arithmetic to the western world.

The stories of what these two men did exhibit remarkable similarities. Leonardo packaged arithmetic and, through the medium of parchment, gave personal computing to the masses. Jobs made personal computing accessible to everyone through the medium of silicon. Neither individual was an inventor. Their genius was taking something alien and complex and making it accessible – and friendly – to all.

For more details on the Jobs – Leonardo connection, see my MAA column “Devlin’s Angle” for August of this year.

[Image: Apple homepage, October 6, 2011]