Friday Links

I’m worried about losing a feature of virtual learning: our ability to turn off our Zoom cameras, our power to shut down the gaze. In 2020, I was anxious about teaching a special topics course on makerspaces virtually -- a class that is centered on shared tools, hands-on building and in-person collaborations. Fast-forward to 2021, and I am trying to imagine what it would look like to turn video off in a face-to-face classroom.

Having a virtual classroom with the ability to turn off our cameras offered a generative, unusual sweet spot for learning. It was an environment where students were not only together but also alone. It was an environment where students were supported but also weren’t being observed by their instructor or peers -- one where we could take a collective exhale from the performative demands of the classroom with a simple click of the “stop video” button.

Since the pandemic began, about 1 in 434 rural Americans have died of covid, compared with roughly 1 in 513 urban Americans, the institute’s data shows. And though vaccines have reduced overall covid death rates since the winter peak, rural mortality rates are now more than double urban rates — and accelerating quickly.

But the truth is that Venus Cloacina was probably not the patroness of privies that Swift and his contemporaries imagined. In fact, the famous Roman sewers she patronized were probably not really sewers, at least not in the sense in which we use the term today. The Cloaca Maxima was more of a massive storm drain, used to direct rainwater out of the streets and into the Tiber, and most Roman toilets were probably cesspits, unconnected to the sewer.

By withdrawing U.S. troops from Afghanistan, the administration of President Joe Biden sought to create a sense that the United States’ string of exhausting and counterproductive interventions in the Middle East and South Asia was coming to an end. But the truth is more sobering. For all its commitments to end “forever wars,” the administration has given no sign that it is preparing to pivot away from the use of military force to manage perceived terrorist threats. Its ongoing counterterrorism policy review appears to be focused more on refining the bureaucratic architecture around drone strikes and other forms of what the military refers to as “direct action” than on a hard look at the costs and benefits of continuing to place military force at the center of U.S. counterterrorism policy.

The problem, according to his provocative argument, is not the war’s brutality but its relative humanity. Moyn does not at all advocate a return to brutal methods or so-called total war, but he does suggest that in vilifying torture, reducing casualty counts, and otherwise focusing on how the United States conducts hostilities, lawyers and advocates have stunted public criticism and diverted energy from the peace movements that might otherwise bring it to an end.

Moyn sees precisely this dynamic at work in the war on terror, especially the years that immediately followed the 9/11 attacks. Humane’s account of this period is in many ways the emotional core of the book. There is some irony in this line of argument, in that Bush’s response to the attacks is remembered more for its brutality than for respecting humanitarian protections: the era’s totemic images remain those of shackled detainees in orange jumpsuits at the makeshift U.S. detention facility in Guantánamo Bay, Cuba, and of prisoners suffering vicious torture at the hands of U.S. service members at the Abu Ghraib prison in Iraq. Nevertheless, Moyn argues, the administration’s abuses need to be viewed alongside the reaction they provoked. Scholars, lawyers, and advocates rallied in protest. They flooded the courts with filings, took their cases to international bodies, and worked passionately to close legal loopholes to make sure such things never happened again.

In so doing, Moyn intimates, they may have missed the forest for the trees. Yes, they secured a combination of U.S. Supreme Court decisions, executive orders, and new statutes that reined in torture. But they did little or nothing to address the underlying conflicts in which the torture took place. Why didn’t the same lawyers who shook with fury in the face of custodial abuse harness the same energy to oppose the wars that created a pretext for it?

Herein lies my problem. If we take only the economic perspective, we are guilty of capitalist realism, of failing to imagine an alternative to inequalities. But if we take only the latter perspective, we are guilty of at best wishful thinking and at worst recklessly endangering the livelihoods of the worst off.

In the same way that electricity went from a luxury enjoyed by the American élite to something just about everyone had, so, too, has fame, or at least being known by strangers, gone from a novelty to a core human experience. The Western intellectual tradition spent millennia maintaining a conceptual boundary between public and private — embedding it in law and politics, norms and etiquette, theorizing and reinscribing it.

Even with identical credentials, first-generation graduates have more trouble getting jobs than their better-coached and -connected classmates, according to new research by scholars at Michigan State University and the universities of Iowa and Minnesota.

Throughout his adventures, William Dampier jotted down meticulous observations of the natural world while his shipmates pillaged, plundered, and raided just a few miles away. Caribbean scholar John Ramsaran quotes another scholar, who imagines Dampier “writing up his journal, describing a bunch of flowers, or a rare fish, in the intervals between looting a wine-shop or sacking a village.”

In the pages of his notebook, Dampier expressed a great curiosity about the world—and a great keenness for eating basically any animal he came across. This included shark (which his men ate “very savorily”), wallaby (a “very good Meat,” similar to raccoon), flamingo, and many, many sea turtles.

Overall, about 63 percent of virtual for-profit schools were rated unacceptable by their states in the latest year for which data was available, according to a May report by the University of Colorado’s National Education Policy Center (NEPC).

Friday Links

“The key is the data that I have in my possession. Data is not clean of the sins of the past.”

Earlier in the pandemic, fully virtual students were paired with teachers at their home schools who guided them through a full day of classes — from math to gym — often alongside their pre-pandemic classmates.

But this year, after the state prohibited that kind of remote learning, San Diego launched a standalone virtual academy with its own virtual teachers. Students get some live instruction and teacher check-ins, then spend the rest of the time doing work on their own.

Another change? The level of interest. Last year, 44% of students ended the year online. But so far, less than 1% have chosen the virtual academy, though the district is still working through applications.

“Our projections consistently foresee a positive future for wild cacao in Peru as the current suitable area is expected to largely remain suitable and to further expand in the future,” explain Ceccarelli and her colleagues.

The research team cautions that despite the positive findings, chocolate lovers shouldn’t celebrate too soon. Even if wild cacao does fare well in the forests of a warmer world, that does not mean that it would grow well enough under cultivation to replace domestic cocoa as a cash crop. And their model didn’t try to incorporate other possible knock-on effects of a changing climate, such as increasing pests and disease.

These concerns about credibility are overblown. Credibility is whether others think you mean what you say in a given situation. It is context-specific; because circumstances can vary widely, credibility is judged on a case-by-case basis. How a state has behaved in the past is an important component of its credibility, but it is not the only one. The Biden administration’s withdrawal from Afghanistan will affect these calculations the next time the United States commits to an extraordinarily costly venture in a place not vital to the country’s core security interests, but it is unlikely to sabotage U.S. credibility writ large.

Credibility is different from reputation, however. If credibility is whether others think your deeds will match your words, reputation is what others think of you in the first place. On this count, the consequences of the U.S. withdrawal will likely be considerably greater. The pullout has been messy and chaotic: the Taliban took control of Afghanistan more quickly than the Biden administration had publicly predicted, and members of a regional branch of the Islamic State (or ISIS) launched a deadly bomb attack at the Kabul airport as Afghan and foreign citizens attempted to evacuate the country.

Reputations are, in essence, beliefs—they exist only in the minds of others. The formation and maintenance of reputations therefore has an important psychological component, and the psychological evidence is relatively clear that observers pay attention to past actions when predicting future behavior. Experimental studies I conducted with Jonathan Renshon and Keren Yarhi-Milo on both members of the public and elite decision-makers found that when asked to assess a country’s resolve in a foreign policy crisis, observers consistently focus on behavior in previous disputes, even when presented with countervailing information about capabilities and interests. The question is not simply whether allies and adversaries will doubt U.S. resolve because Washington backed down from a 20-year stabilization effort in Afghanistan. It is whether their existing doubts will grow stronger than they would have had the United States continued fighting.

Americans are exhausted by educational disruption. That's the message of a new survey by the journal Education Next. According to its poll, support for virtually every proposed innovation has dropped since 2019 (a few items were flat). That includes both highly popular measures, such as annual testing, and more controversial policies, including charter schools.

Prior to the 1960s and 1970s, writes Ensmenger, computer programming was thought of as a “routine and mechanical” activity, which resulted in the field becoming largely feminized. The work wasn’t particularly glamorous; “coders” were “low-status, largely invisible.” They were only supposed to implement the plans sketched out by male “planners.” Ensmenger quotes one female programmer, who recalled, “It never occurred to any of us that computer programming would eventually become something that was thought of as a men’s field.”

Google readily (and ironically) admits that such ubiquitous web tracking is out of hand and has resulted in “an erosion of trust... [where] 72% of people feel that almost all of what they do online is being tracked by advertisers, technology firms or others, and 81% say the potential risks from data collection outweigh the benefits.”

“Research has shown that up to 52 companies can theoretically observe up to 91% of the average user’s web browsing history,” a senior Chrome engineer told a recent Internet Engineering Task Force call, “and 600 companies can observe at least 50%.”
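
Those “N companies can observe M% of browsing history” figures are easier to picture with a small example. Below is a minimal Python sketch, using entirely made-up site and tracker names, of the underlying accounting: if you know which third-party trackers are embedded on each site a user visits, a tracker’s “coverage” is simply the fraction of those visits it could observe. The real studies apply this kind of counting to actual browsing traces at much larger scale.

```python
# Minimal sketch (hypothetical data): estimating what share of a user's
# browsing history each third-party tracker could observe, given which
# trackers are embedded on which visited sites. All names are illustrative.

from collections import defaultdict

# A user's browsing history: the sites they visited (hypothetical).
history = ["news.example", "shop.example", "blog.example", "mail.example"]

# Third-party trackers embedded on each site (hypothetical).
trackers_on_site = {
    "news.example": {"ads-a.example", "metrics-b.example"},
    "shop.example": {"ads-a.example"},
    "blog.example": {"ads-a.example", "metrics-b.example"},
    "mail.example": {"metrics-b.example"},
}

# Count, for each tracker, how many visited sites embed it.
visible = defaultdict(int)
for site in history:
    for tracker in trackers_on_site.get(site, set()):
        visible[tracker] += 1

# Coverage = fraction of the browsing history each tracker can observe.
for tracker, count in sorted(visible.items(), key=lambda kv: -kv[1]):
    print(f"{tracker}: observes {count / len(history):.0%} of visits")
```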
