October 22, 2011
Most principals can’t identify or explain what constitutes good teaching, much less help teachers improve, according to a new book.
It’s happened hundreds of times. An audience of principals, superintendents and instructional coaches is shown a short videotape of a classroom lesson and asked to score it from 1 to 5. It would seem straightforward: The teacher is good, bad or somewhere in between. But invariably, the scores come in all over the map, with high and low in fairly equal numbers.
Having toured the United States with those videotapes, two leaders of the University of Washington’s Center for Educational Leadership conclude that most school leaders can’t identify or explain what constitutes good teaching, much less come up with helpful suggestions for improvement.
“Frankly, this is shocking to consider,” say Stephen Fink and Anneke Markholt, co-authors of Leading for Instructional Improvement: How Successful Leaders Develop Teaching and Learning Expertise and executive and associate directors, respectively, of the center.
“Whether under the guise of academic freedom, local control, or perhaps just simply doing what we have always done, millions of students are taught every day by hundreds of thousands of teachers, supported by thousands of school and district leaders without a clear understanding and agreement on quality practice,” the authors say.
In their view, the major challenge facing public education in America is a widespread lack of expertise. No wonder, they say: While doctors spend between six and 11 years in medical school, principals charged with improving the “highly complex and sophisticated endeavor” of teaching typically study administration for only two years and will spend no more than a semester in an internship. As Miller-McCune reported last year, the National Council for Accreditation of Teacher Education itself has suggested that teachers be trained through “clinical practice,” as doctors are.
Separately from the audience survey, Fink’s center has administered individual tests to 2,000 principals, superintendents and instructional coaches in several hundred school districts since 2001, asking each participant to observe a 20-minute videotape of a math or reading class, assess it in writing and make recommendations for how to improve it.
The principals and their colleagues answered three questions:
• What do you notice about teaching and learning in this classroom?
• What conversation would you want to have with this teacher?
• How, if at all, does this inform your thinking about and planning for professional development?
Based on how well a principal can assess a teacher’s goals and strategies, take stock of how engaged the students are and offer helpful feedback, the center assigns him or her a grade. Out of four possible points, with 4.0 being an “expert instructional leader,” the average score of the school leaders was 1.8.
“It should be deeply insightful for us as policymakers to say that if we’re not focusing on developing the expertise of our teachers and leaders, then we’re off-track,” Fink said.
It’s not a question of simply getting rid of bad teachers, Fink said. “If we’re going to improve the quality of learning for all kids, we have to develop the expertise of those teachers we have in our ranks.”
He admits the center sets a high bar for “expert” scores: “You really have to know your stuff.”
But most don’t. For example, while visiting a class on the novel To Kill a Mockingbird, a principal sees that the teacher is simply grilling students on what’s in the text. In a written evaluation after the class is over, the principal suggests that the teacher ask higher-level questions that would prompt students to analyze and think about what they’ve read instead of merely parroting it back. But as Fink and Markholt note, the teacher likely doesn’t know how to ask those kinds of questions; otherwise, he or she would already be doing so.
“We have observed this typical exchange between principals and teachers dozens of times,” they write. “If principals want teachers to teach differently — in any way, shape, or form — then they must guide, support and nurture teacher learning just like we expect teachers to do for students.”
To help smarten up schools, the center shows principals how to become like anthropologists taking field notes, “noticing and wondering” and scripting what goes on in the classroom. The center also sends in coaches to lead principals on “learning walkthroughs,” or “instructional rounds.” These are modeled after the medical rounds in which doctors, nurses and medical students discuss hospital patients, case by case…
“Historians have long puzzled over why the Scientific Revolution happened, and why it happened in Europe”
October 22, 2011
Between 1500 and 1700, the European understanding of nature changed dramatically. In astronomy, the ancient geocentric theory of the heavens was succeeded first by Nicolaus Copernicus’s heliocentric reform, then by Johannes Kepler’s elliptical orbits and Isaac Newton’s celestial mechanics. In physics, Aristotle’s qualitative explanation of motion and change was consigned to the dustbin of intellectual history by Galileo Galilei, René Descartes and Newton. In anatomy and physiology, the Roman physician Galen’s mistaken understanding of the body gave way to Andreas Vesalius’s anatomy and William Harvey’s discovery of the circulation of the blood. New instruments—the telescope, the microscope, and the barometer and the air pump (which together showed that air has weight)—opened new worlds. And new forms of organization and communication—the scientific society, the learned journal—developed to encourage and sustain this bustling activity. The changes were so dramatic that, for nearly a century, these developments have been referred to collectively as the Scientific Revolution.
Historians have long puzzled over why the Scientific Revolution happened, and why it happened in Europe. In Intellectual Curiosity and the Scientific Revolution, sociologist Toby Huff addresses both questions. Huff provides an overview of the Scientific Revolution, drawn from reliable secondary literature, along with an idiosyncratic and occasionally puzzling attempt to show why the revolution did not happen in China or the Islamic empires, both of which had rich scientific traditions in the central Middle Ages.
Huff devotes the first part of the book to an account of the invention of the telescope at the beginning of the 17th century. The “discovery machine,” as he terms the instrument, was pointed at the sky in England by Thomas Harriot, who did not publish his observations, and in Italy by Galileo, who announced his early discoveries in spectacular fashion in his Starry Messenger (1610). Galileo and others continued to explore the skies, making new discoveries and quickly reaching consensus about their reality. For Huff, the telescope epitomizes the “infectious curiosity” of the Scientific Revolution: As more and more telescopes became available, discoveries begat further discoveries.
But in China and the Islamic world, the discovery machine failed to catch on. Jesuit missionaries brought the telescope to China and trained Chinese astronomers in its use, but the Chinese made no discoveries with it and did not incorporate the instrument into astronomical practice. The telescope was known in the Mughal and Ottoman Empires, but there too it failed to make a mark in astronomy. Compared with Europe, the world’s other advanced civilizations, confronted with the telescope, evinced what Huff calls a “curiosity deficit.”
In the short second part of the book, Huff examines the institutional context for a European “ethos of scientific curiosity.” In a chapter condensed from his 1993 book, The Rise of Early Modern Science, Huff argues that the legally autonomous corporation, a social institution that had no equivalent in the Chinese or Islamic worlds, played a key role in fostering science. Finally, the third part of the book is devoted to other aspects of the Scientific Revolution: the “infectious curiosity” that produced new discoveries in anatomy, microscopy and pneumatics; and the “grand synthesis” of celestial and terrestrial physics in Newton’s Mathematical Principles of Natural Philosophy (1687). These developments represented a huge accumulation of “intellectual capital,” which Europeans savvily invested, reaping dividends in the form of industrial development and world domination from the 18th century through the 20th. Meanwhile, says Huff, the rest of the world stagnated due to “a deficit in scientific curiosity that seems to have prevailed outside Europe from before the seventeenth century all the way to the end of the twentieth century.”
Curiosity plays a key explanatory role in this book, but, curiously, Huff makes no attempt to explore what early modern Europeans thought about the subject. Historians Hans Blumenberg and Lorraine Daston have traced how, in the late Middle Ages, Europeans took a new view of curiosity: By transforming it from the vice of inquisitiveness into a cognitive virtue, they legitimated scientific inquiry. Unfortunately, Huff does not draw on the work of Blumenberg and Daston. Instead of tracing changes in what curiosity has meant, he assumes it has always been the same thing, and that Europeans just happened to have a surfeit of it, whereas others had a deficit. His attempt to establish this point, though, is flawed: Huff identifies things about which Europeans were curious, and then shows that Chinese and Muslim scholars were not equally curious about the same things. Because India had astronomers, Huff writes, “we can assume” that they would find the telescope “of intrinsic interest”—but he does not explain why that would be the case. Because of this methodological asymmetry, he misses areas in which non-Europeans demonstrated that they were quite capable of curious investigation—natural history, for example.
But Huff is not interested in what non-Europeans were curious about, because it was not modern science. In his account, the “breakthrough” or “march to the modern scientific revolution” appears inevitable. Despite occasional wrong turns onto “garden paths,” European scientists by and large made “progress” toward goals that they could not “resist.” Because Huff sees modern science as the inevitable result of curiosity, he assumes that other sophisticated cultures must have lacked it. The “discovery machine” was like a lighted match tossed into a powder keg; if it fizzled out for Chinese and Islamic scholars, that must have been because their intellectual powder was damp…
October 22, 2011
The Bab el-Mandeb, the strait that separates the Red Sea from the Indian Ocean, has conjoined Africa and Asia for centuries. It likely was the first route taken by Homo sapiens on their journey out of Africa, and the traffic between the Horn and Arabia has continued apace to this day. Men, goods, and ideas have gone back and forth, giving the Arabian Sea a degree of integration and similarity that is obscured by the arbitrary taxonomy of modern geography: Africa vs. Asia; the Horn of Africa vs. Arabia vs. South Asia. Only some 3,300 kilometers (about 1,800 nautical miles) separate Mumbai from Djibouti, the extreme range of a series of seaports and islands that dot the Arabian Sea: Massawa, Djibouti, Aden, Berbera, Mogadishu, Socotra, Muscat, Hormuz, Gwadar, Karachi, Mumbai.
The harsh, unforgiving environment is another factor of uniformity, as is the seasonal rhythm of the monsoon winds, which influences cattle migrations, harvests and, back in the era of sailboats, the coming and going of merchants. And there is khat, a shrub grown on the plateaus of Ethiopia and Yemen whose leaves are chewed ubiquitously by the locals for its euphoric properties and for suppressing appetite in times of famine. Given its central location in the Eurasian trade route for at least the past 2,500 years, the region has been continuously exposed to the flux of historical change, to new ideas and technologies passing through. Yet, the ancient states that once existed—the Sabean kingdom, Aksum—passed and were not replaced. Major civilizations are born on the beds of large rivers, but the lands that skirt the Gulf of Aden and the Arabian Sea have run dry, an arid belt stretching from the Nile to the Indus.
The environment has continuously degraded over the millennia, partly as a result of local climate change, partly due to human activity. The wooded hills have given way to rocky crags, the meadows turned into dust bowls. Both shores of the Bab el-Mandeb are among the hottest places on earth. Water is mostly found in the aquifer that is slowly being depleted; the rivers of the rainy season do not reach the sea. For most of the past 1,000 years, the lack of natural resources has allowed only light population density, minimal capitalization, and sporadic political centralization. Nomadic tribalism and sparse settlements have been a dominant form of social organization, with the occasional rise of a monarchic dynasty that never quite managed to evolve into more permanent forms of statehood.
These constraints were never lifted in the postcolonial era. Britain, France, and Italy ruled over the region lethargically and departed between the 1940s and the 1970s, leaving nominally independent states that proved barely sustainable. Djibouti came to live off strategic rent paid by the United States and France. Somalia failed as a state in 1991, and has since eluded attempts to form a centralized polity. Yemen has been in a marginally better situation, using limited oil reserves to maintain a degree of political cohesion, which after thirty years of one-man rule has been worn down by both political and economic forces.
At the dawn of the twenty-first century, the challenges for the countries on the littoral of the Arabian Sea are civil war(s), piracy, radical Islamism, transnational terrorism, and a real risk of environmental and economic failure on both sides of the strait. Yet, its strategic importance as a conduit for maritime trade between Asia and the Mediterranean world is as great as it was when Egyptian pharaohs built a canal between the Nile and the Red Sea. Then, just as today, the lands around the Bab el-Mandeb were as difficult to pacify as the Red Sea was treacherous to navigate. The historical documents found in the Cairo Geniza show that in the tenth century A.D., vessels leaving Egypt sailed in a convoy along the route to India in order to deter a profusion of local pirates.
Islam came early to the region, during the life of the Prophet Muhammad, carried by small groups of refugees when the young Muslim community was still persecuted in Mecca. In the following centuries, Islam settled throughout the Indian Ocean amid the numerous merchant communities that thrived under the aegis of Pax Islamica.
Arab Muslim traders would dominate that commerce for a thousand years, their dhows sailing from the mouth of the Red Sea down the African coast to Malindi, in Kenya; across the Arabian Sea to Mumbai and Goa, in India; through the Strait of Malacca all the way to Canton, in South China. Their activities were regulated by the commercial law of the Sunni Shafi’i school of jurisprudence, which as a result became dominant in the Indian Ocean. Sufi orders expanded to form long-distance networks, providing travelers with trusted local agents and housing facilities. Zheng He, the legendary Chinese navigator of the fifteenth century, was a Muslim.
Muslim dominance of maritime commerce in the Indian Ocean continued until the seventeenth century. Following Vasco da Gama’s first trip to India, in 1498, the Portuguese had forced themselves onto that ecosystem, but without changing its fundamental characteristics much. The Portuguese were pirates and petty traders in a world of pirates and petty traders. Western supremacy came later, with the Dutch and the British trading companies. Commercial dominance led to political dominance, and the 1757 Battle of Plassey delivered Bengal to Robert Clive of the East India Company.
Muslim power was waning throughout the Indian Ocean. The Mughal Empire, the world’s richest Muslim polity of the early modern period, a torchbearer for Islam’s secular power, was losing ground in the Indian peninsula to British interests. Soon it was the turn of the Muslim sultanates of the Southeast Asian archipelago (in today’s Indonesia and Malaysia) to pass under the protection of the Dutch and the British companies. The littoral sultanates of the Arabian Peninsula (modern-day Yemen, Qatar, Bahrain, Kuwait, and the United Arab Emirates) would make similar arrangements with Britain in the early nineteenth century. The coast of Muslim Africa was next.
The African continent produced ardent defenders of Muslim sovereignty against European imperial expansion. In 1830, a French expeditionary corps was sent on a whim to invade Algiers. And there stood Abd al-Kadir, the Algerian Sufi shaykh who held the line for seventeen years against the French Army—which not long before had been Napoleon’s Great Army. The French would have to commit a vast contingent and ravage the country to get him to surrender. They would go on in their African ventures to build a canal between Suez on the Red Sea and Port Said on the Mediterranean, which in a roundabout way delivered Egypt and Sudan to the British.
And there stood Muhammad Ahmad, the Sudanese shaykh of the Samaniyah Sufi order and self-proclaimed Mahdi (“messiah”) who, in 1881, rose in rebellion against Anglo-Egyptian rule. Muhammad Ahmad’s followers—the Ansars—famously massacred a British contingent led by Maj. Gen. Charles Gordon at the 1885 Battle of Khartoum, setting back the British claim over Sudan for thirteen more years. While Abd al-Kadir was religiously moderate—and would spend the rest of his days authoring poetry and religious exegesis—Muhammad Ahmad’s call was already fundamentalist. The ferocity and determination of the Mahdi’s Ansars, attributed to their religious fanaticism, would become the stuff of British colonial lore…