Friday Links

A surcharge is added on top of a ride fare if a passenger takes longer than two minutes to enter a vehicle after it arrives. Uber added wait time fees in some US cities in 2016 before expanding the policy across the country.

These fees, however, discriminate against people with mobility or vision issues, federal prosecutors argue.

To David Graeber, it was a matter of plain fact that things did not have to be the way they were. Graeber was an anthropologist, which meant it was his job to study other ways of living. “I’m interested in anthropology because I’m interested in human possibilities,” he once explained. Graeber was also an anarchist, “and in a way,” he went on, “there’s always been an affinity between anthropology and anarchism, simply because anthropologists know that a society without a state is possible. There’s been plenty of them.” A better world was not assured, but it was possible — and anyway, as Graeber put it in Fragments of an Anarchist Anthropology, “since one cannot know a radically better world is not possible, are we not betraying everyone by insisting on continuing to justify and reproduce the mess we have today?”

Eastern Gateway Community College spent the past few years riding a tidal wave of enrollment growth, but its rapid rise has now put the college’s accreditation at risk.

Richard Kronick, a former federal health policy researcher and a professor at the University of California-San Diego, said his analysis of newly released Medicare Advantage billing data estimates that Medicare overpaid the private health plans by more than $106 billion from 2010 through 2019 because of the way the private plans charge for sicker patients.

Kronick called the growth in Medicare Advantage costs a “systemic problem across the industry,” which CMS has failed to rein in. He said some plans saw “eye-popping” revenue gains, while others had more modest increases. Giant insurer UnitedHealthcare, which in 2019 had about 6 million Medicare Advantage members, received excess payments of some $6 billion, according to Kronick. The company had no comment.

In other words, does long-COVID occur because people survived a really bad illness? After all, studies from the before times show that two-thirds of people who survive an ICU stay have persistent symptoms. But is COVID uniquely bad – uniquely harmful to a variety of organ systems, out of proportion to the severity of illness?

In the early 20th century, millions of chickens wore rose-coloured eyeglasses so they wouldn’t turn into cannibals.

Several strategies WHO endorsed — educating people about ageism, fostering intergenerational contacts, and changing policies and laws to promote age equity — are being tried in the United States. But a greater sense of urgency is needed in light of the coronavirus pandemic’s shocking death toll, including more than 500,000 older Americans, experts suggest.

In October, a group of experts from the U.S., Canada, India, Portugal, Switzerland and the United Kingdom called for old age to be removed as one of the causes and symptoms of disease in the 11th revision of the International Classification of Diseases, a global resource used to standardize health data worldwide.

Aging is a normal process, and equating old age with disease “is potentially detrimental,” the experts wrote in The Lancet. Doing so could result in inadequate clinical evaluation and care and an increase in “societal marginalisation and discrimination” against older adults, they warn.

Rules such as 'cause no harm to humans' can't be set if we don't understand the kinds of scenarios that an AI is going to come up with, suggest the authors of the 2021 paper. Once a computer system is working at a level beyond the scope of its programmers, we can no longer set limits.

One of the most interesting things about computers is the way they hold a mirror up to us, a mirror that reflects not nature but our conception of nature. Attempts to create artificial intelligence force us to grapple with questions about our own natural intelligence — what it is, where it comes from, what its limits are. Programs for natural language processing raise hard questions about the origins and character of natural language. And in our attempts to create virtual worlds with virtual inhabitants — the metaverse, for instance — we confront profound questions about our being: What is a world? What does it mean to be in a world? What’s the relationship of mind and body? As Michael Heim wrote in his 1991 essay “The Erotic Ontology of Cyberspace,” collected in the book Cyberspace: First Steps, “cyberspace is a metaphysical laboratory, a tool for examining our very sense of reality.”

Ken Adam, the set designer for many of the early films, paralleled Bond’s prewar traditionalism with an affinity for classical architecture. The British intelligence agency where Bond works is usually housed in a classical building. Likewise, Bond’s apartment, of which we catch a glimpse in Dr. No, is located in a Georgian-style building. The architecture that Bond inhabits projects a kind of self-assured, institutional power, comparable to the status of the British before WWII.

Over 60% of all jobs in the U.S. typically require a high school education or less and pay accordingly. And for at least several decades, data from the New York Federal Reserve has shown what all college faculty and many parents and workers have long known: A large minority of bachelor’s degree holders are consistently underemployed, working in jobs requiring less formal education than they received. The educational attainment levels of the population, which are at historic highs, far exceed what the labor market requires. This is the real economy.

The format of this announcement is interesting. It tries desperately to strike a positive tone, with several paragraphs citing specific examples of the benefits of facial recognition and only gesturing to the potential for harm and abuse. I am glad Facebook sees so many great uses for it; I see them, too. But I wish the company were anywhere near as specific about the acknowledgements of harm. As it is presented, it looks defensive.

These presentations had the familiar vibe of an overly ambitious video game reveal.

Facebook may be entirely invested in these efforts, but we have not yet come to terms with what the company represents today. It can parade its research projects and renderings all it wants, but the fact of the matter is that it is a company that currently makes the de facto communications infrastructure for much of the world, and frequently does a pretty poor job of it. Is this really the figurehead we want for the future of personal computing? Is this a company that we trust?

If you are like me, you may be thinking that it is sort of weird that this advertising company thinks it can create the mixed-reality future in software, hardware, and developer tools.

Let me start off by saying that MMORPGs are not typically good games. ... While many other kinds of games could be said to be based around "go here and kill things", the MMO needs to do it in a flatly consistent way due to the online shared world – hundreds of players running around doing the same sort of things – so they cannot be presented in a particularly interesting way outside of some flavour text.

Stiles argues that “the now-familiar trope of the mad scientist…traces its roots to the clinical association between genius and insanity that developed in the mid-nineteenth century.” In the early 1800s, the Romantics saw the condition as a “mystical phenomenon beyond the reach of scientific investigation.” The Victorians took a more detached and critical approach. “Rather than glorifying creative powers, Victorians pathologized genius and upheld the mediocre man as an evolutionary ideal,” Stiles writes. “All aberrations from the norm could be seen as pathological, including extreme intelligence.”

More experience and equipment are required to create a cup of Cometeer coffee than any other halfway plausible cup of coffee, literally ever. (You can tell the MIT, Apple, and Tesla scientists and Princeton-educated coffee-masters did a good job of brewing your coffee with proprietary machinery in Gloucester, Mass., flash-freezing it in liquid nitrogen, packing it in dry ice, and shipping it to your home for you to store in your freezer, because it tastes like you spent five minutes making it yourself using techniques that predate the advent of antibiotics.)

The famous legal phrase caveat emptor (“let the buyer beware”) entered common law because of a 17th century dispute over a magic bezoar stone.

There’s a lot of creepy consumerism here. Beyond the details that come out in McMillan Cottom’s description, the online tour of the house reveals stylized décor that romanticizes a simpler time when one could easily travel to distant vacation destinations with children. The kids’ room includes a climbing rope so homeowners will be well-prepared to exercise their children the next time public parks close and kids without backyards or indoor climbing equipment are again relegated to streets and sidewalks and parking lots for their outdoor recreation. These houses are clearly marketed to people who have the means (or hope for the means) to get as much of their skin out of the future pandemic game as possible.

It is hard to know who to sympathize less with: the person who claimed to be a certified psychic who could remove an ex-girlfriend curse (among the most powerful curses known to man) for just $5,100, or the person who claims he believed that person and gave her money to uncurse him. Ideally, no one would win this case. Unfortunately (unless it settles), someone probably will.

If such a high rate of resignations were occurring at a time when jobs were plentiful, it might be seen as a sign of a booming economy where workers have their pick of offers. But the same labor report showed that job openings have also declined, suggesting that something else is going on. A new Harris Poll of people with employment found that more than half of workers want to leave their jobs. Many cite uncaring employers and a lack of scheduling flexibility as reasons for wanting to quit. In other words, millions of American workers have simply had enough.

I’m worried about losing a feature of virtual learning: our ability to turn off our Zoom cameras, our power to shut down the gaze. In 2020, I was anxious about teaching a special topics course on makerspaces virtually -- a class that is centered on shared tools, hands-on building and in-person collaborations. Fast-forward to 2021, and I am trying to imagine what it would look like to turn video off in a face-to-face classroom.

Having a virtual classroom with the ability to turn off our cameras offered a generative, unusual sweet spot for learning. It was an environment where students were not only together but also alone. It was an environment where students were supported but also weren't being observed by their instructor or peers -- one where we could take a collective exhale from the performative demands of the classroom with a simple click of the “stop video” button.

Since the pandemic began, about 1 in 434 rural Americans have died of covid, compared with roughly 1 in 513 urban Americans, the institute’s data shows. And though vaccines have reduced overall covid death rates since the winter peak, rural mortality rates are now more than double urban rates — and accelerating quickly.

But the truth is that Venus Cloacina was probably not the patroness of privies that Swift and his contemporaries imagined. In fact, the famous Roman sewers she patronized were probably not really sewers, at least not in the sense in which we use the term today. The Cloaca Maxima was more of a massive storm drain, used to direct rainwater out of the streets and into the Tiber, and most Roman toilets were probably cesspits, unconnected from the sewer.

By withdrawing U.S. troops from Afghanistan, the administration of President Joe Biden sought to create a sense that the United States’ string of exhausting and counterproductive interventions in the Middle East and South Asia was coming to an end. But the truth is more sobering. For all its commitments to end “forever wars,” the administration has given no sign that it is preparing to pivot away from the use of military force to manage perceived terrorist threats. Its ongoing counterterrorism policy review appears to be focused more on refining the bureaucratic architecture around drone strikes and other forms of what the military refers to as “direct action” than on a hard look at the costs and benefits of continuing to place military force at the center of U.S. counterterrorism policy.

The problem, according to his provocative argument, is not the war’s brutality but its relative humanity. Moyn does not at all advocate a return to brutal methods or so-called total war, but he does suggest that in vilifying torture, reducing casualty counts, and otherwise focusing on how the United States conducts hostilities, lawyers and advocates have stunted public criticism and diverted energy from the peace movements that might otherwise bring it to an end.

Moyn sees precisely this dynamic at work in the war on terror, especially the years that immediately followed the 9/11 attacks. Humane’s account of this period is in many ways the emotional core of the book. There is some irony in this line of argument, in that Bush’s response to the attacks is remembered more for its brutality than for respecting humanitarian protections: the era’s totemic images remain those of shackled detainees in orange jumpsuits at the makeshift U.S. detention facility in Guantánamo Bay, Cuba, and of prisoners suffering vicious torture at the hands of U.S. service members at the Abu Ghraib prison in Iraq. Nevertheless, Moyn argues, the administration’s abuses need to be viewed alongside the reaction they provoked. Scholars, lawyers, and advocates rallied in protest. They flooded the courts with filings, took their cases to international bodies, and worked passionately to close legal loopholes to make sure such things never happened again.

In so doing, Moyn intimates, they may have missed the forest for the trees. Yes, they secured a combination of U.S. Supreme Court decisions, executive orders, and new statutes that reined in torture. But they did little or nothing to address the underlying conflicts in which the torture took place. Why didn’t the same lawyers who shook with fury in the face of custodial abuse harness the same energy to oppose the wars that created a pretext for it?

Herein lies my problem. If we take only the economic perspective we are guilty of capitalist realism, of failing to imagine an alternative to inequalities. But if we take only the latter perspective, we are guilty of at best wishful thinking and at worst recklessly endangering the livelihoods of the worst off.

In the same way that electricity went from a luxury enjoyed by the American élite to something just about everyone had, so, too, has fame, or at least being known by strangers, gone from a novelty to a core human experience. The Western intellectual tradition spent millennia maintaining a conceptual boundary between public and private — embedding it in law and politics, norms and etiquette, theorizing and reinscribing it.

Even with identical credentials, first-generation graduates have more trouble getting jobs than their better-coached and -connected classmates, according to new research by scholars at Michigan State University and the universities of Iowa and Minnesota.

Throughout his adventures, William Dampier jotted down meticulous observations of the natural world while his shipmates pillaged, plundered, and raided just a few miles away. Caribbean scholar John Ramsaran quotes one scholar, who imagines Dampier “writing up his journal, describing a bunch of flowers, or a rare fish, in the intervals between looting a wine-shop or sacking a village.”

In the pages of his notebook, Dampier expressed a great curiosity about the world—and a great keenness for eating basically any animal he came across. This included shark (which his men ate “very savorily”), wallaby (a “very good Meat,” similar to raccoon), flamingo, and many, many sea turtles.

Overall, about 63 percent of virtual for-profit schools were rated unacceptable by their states in the latest year for which data was available, according to a May report by the University of Colorado’s National Education Policy Center (NEPC).

The exact format of these sessions is not revealed, but it does sound like 368 people had to sit down at a desk on their own and have a list of rude words read out to them over the course of five days by someone with a clipboard. While a case could certainly be made for the importance of conducting such research, pity the poor statistician questioning their life choices as they ask their fourth person of the day "What about 'knob'? What do you think about 'knob'?"*

The "Non-English words" section contains a handy translation table for those who do not speak the relevant languages. Some of these are very offensive indeed and should probably not be wielded by anyone who lacks the suitable linguistic and cultural context, or possibly anyone at all. But the "mild" South Asian pan-linguistic term Uloo Ka Patha – which literally translates as "son of an owl" and reportedly means "idiot", "imbecile" or "moron" – definitely deserves wider use.

Cricket was at the forefront of the sporting boycott of apartheid, albeit accidentally so. The MCC initially did not select the South African-born “Cape Coloured” player Basil D’Oliveira for the 1968-69 tour of South Africa, probably for political reasons. The consequent pressure on them, and the ‘injury’ to selected player Tom Cartwright (who, it is rumoured, withdrew in order to increase the pressure to select D’Oliveira), resulted, eventually, in the cancellation of the tour, which, in turn, resulted in the widespread boycott. I supported the boycott almost without reservations and certainly without regret.

As things stand it is hard to avoid the conclusion that Afghanistan will face a cricketing boycott which, I suspect, is the only sporting boycott that matters in this case. In the past decade or so cricket has conquered Afghanistan, and Afghanistan, frankly, has conquered cricket, rising from an unknown participant in the third or fourth rank of cricketing nations to become one of only 11 Test-playing countries, with some of the best and most sought-after short-form players in the world; no other sport comes close to it.

I am far more regretful about this one than about the South African boycott. Here are some thoughts.

About 5,600 people gave birth outside a hospital in California in 2020, up from about 4,600 in 2019 and 3,500 in 2010. The shift took place during a widespread “baby bust,” so the proportion of births outside hospitals rose from 0.68% in 2010 to 1.34% in 2020, according to a KHN analysis of provisional data from the California Department of Public Health. The proportion of births outside hospitals stayed relatively high — 1.28% — from January through July 2021.
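The counts and percentages quoted above also let us recover the size of the “baby bust” itself. As a back-of-envelope check (my own arithmetic, not the KHN analysis), dividing each out-of-hospital count by its share gives the implied total number of births:

```python
# Back-of-envelope check: derive California's implied total births from the
# out-of-hospital counts and percentage shares quoted above.
births_outside = {2010: 3_500, 2020: 5_600}   # out-of-hospital births
share_outside = {2010: 0.0068, 2020: 0.0134}  # 0.68% and 1.34%

for year in (2010, 2020):
    total = births_outside[year] / share_outside[year]
    print(f"{year}: ~{total:,.0f} total births implied")
```

The implied totals fall from roughly 515,000 to roughly 418,000, which is why the out-of-hospital share could nearly double even though the raw count rose only about 60%.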

Tuskegee was no secret; many researchers just didn’t see the problem with it. Holding up universities as “the guardians of voluntary participation” since 1947 is an absurd fantasy. Rules have been developed around medical research exactly because of egregious harms, particularly against vulnerable minority populations. We have to keep talking about informed consent because of the looming specter of research misconduct in this arena.

What’s more, the fact is that mandatory vaccination policies are not medical experiments at all, so seeing this through the lens of IRB and Nuremberg is wrongheaded; such policies are not the sort of human-subjects research governed by the infrastructure Walsh discusses. Our home university, Virginia Tech, requires students, faculty and staff to submit their vaccine documentation or file for a medical or religious exemption (for which weekly testing is then required). The FDA has now approved the Pfizer vaccine in full, so students should have no reason to refuse vaccination based on how "new" or "untested" it is.

“For many less-educated Americans, the economy and society are no longer providing the basis for a good life. Concurrently, all-cause mortality in the U.S. is diverging by education — falling for the college-educated and rising for those without a degree — something not seen in other rich countries.”

Community and technical colleges educate more people per year than apprenticeship programs, coding boot camps and federal job training programs combined, noted Tamar Jacoby, president of Opportunity America and author of the report. Nonetheless, many people underestimate the value of these institutions.

As four key insights in this analysis will later show, teacher, parent, and student conversations on social media have been largely siloed within their individual groups and focused on different aspects of the education system. However, the pandemic—and the far-reaching issues generated by it, such as an exam-grading controversy and students’ mental health—represent important moments when the three groups united in joint conversations around education, ripening the possibility for change.

Social listening is an innovative and relatively new research method for gathering and making sense of large amounts of social media data. We were particularly intrigued by the promise of social media data scraping for targeting the thoughts and sentiments of the millions of citizens whose voices simply cannot make it into even the most ambitious of surveys.

In 1927-1928, a large research study was undertaken by the Payne Fund, an educational foundation. Superintendents and principals in hundreds of American schools were surveyed; they were asked their views on the value and importance of educational broadcasting. The responses were so positive that it inspired the Ohio Department of Education, along with Ohio State University, to make educational broadcasts more widely available. The Ohio legislature allocated $20,000 for the project, and the Ohio School of the Air was born; it debuted on January 7, 1929, operating four days a week, from 1:30 to 2:30 p.m. By the end of 1929, more than 230 Ohio communities had installed radios in their schools so that students could hear the daily programs, broadcast from Ohio State’s WEAO in Columbus and powerful Cincinnati commercial station WLW.

Having a budget enabled the Ohio School of the Air to produce top-quality programs that were interesting for students, and easy for teachers to build discussions around—there were even sample lesson plans and suggested readings provided. It also meant being able to hire talented professionals to read poetry or perform Shakespearean plays, native speakers to give French lessons, and experienced musicians for music appreciation (a major plus for schools too small to have their own orchestras or glee clubs). And students were not just passive listeners: the lessons often included talks by well-known experts, like aviators, or doctors, or scientists, or journalists, or even the governor, who answered questions the students had asked.

But what Facebook has built, according to Horwitz, is not a system to protect the integrity and security of Facebook users with a large audience. It is an over-broad attempt to ward off what employees call “PR fires” — a side effect of which is that the users with the biggest megaphones are given another channel by which to spread whatever information they choose with little consequence.

When we do not believe we will need information later, we do not recall it at the same rate as when we do believe we’ll need it. Because the internet is readily available, we may not feel we need to encode the information. When we need the answer, we will look it up.

The Centers for Disease Control and Prevention recommends that everyone at colleges and universities wear masks indoors, even if they are fully vaccinated, in locales with substantial or high transmission of the coronavirus. Most of the country meets that standard at this point. The CDC also recommends that colleges offer and promote covid vaccines.

A series of influential articles published in the 1970s would argue that for a few decades after the Scopes trial, biology textbooks would become “less scientific.” According to these historians, textbook writers and publishers deleted sections about evolution, removed pictures of Darwin, softened “controversial” discussions by weaving in religious quotations, and moved the topic of evolution to the last chapter of the textbooks, where it could easily be cut out (sometimes literally, with scissors).

Scholar Ron Ladouceur refers to this popular narrative as the “myth of Scopes,” which he defines as “the conventional belief that the theory of evolution…was fairly presented in high school textbooks prior to the Scopes trial, but that references to the topic were systematically removed by authors and publishers under pressure from Christian fundamentalists over the next 35 years.”

Orca will likely soon be dwarfed by competing projects in the US and Scotland that are expected to come online in the next two years. But even then, without much more public and private investment, the industry will be far from the 10 million tons per year that the International Energy Agency says are needed by 2030.

Like many pandemic-induced changes to American society, what remains to be seen is whether homeschooling is having a moment, or whether it is establishing itself as a permanent feature among educational options in the US. There are reasons to suspect it could be the latter.

Specifically, the researchers found that for every additional $50,000 of net worth accumulated at midlife, the risk of death later in life dropped by 5%.
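As a rough illustration of that statistic (my gloss, not the study's actual model), if the 5% reduction per $50,000 compounds multiplicatively, relative risk scales like 0.95 raised to the number of $50,000 increments:

```python
def relative_mortality_risk(extra_net_worth: float) -> float:
    """Toy model: 5% lower risk of death later in life per additional $50,000
    of midlife net worth, applied multiplicatively. The study itself may
    model the relationship differently (e.g., as a per-increment hazard ratio)."""
    increments = extra_net_worth / 50_000
    return 0.95 ** increments

# Under this toy model, an extra $500,000 at midlife corresponds to
# 0.95 ** 10, i.e. roughly a 40% lower relative risk.
print(relative_mortality_risk(500_000))
```

The point of the sketch is only that small per-increment effects compound: ten increments of a 5% reduction yield a much larger cumulative difference than a single one.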

Judging from the media coverage of the work from home (WFH) phenomenon, you’d think it’s become near universal. It’s not. In July, only about one in eight workers were teleworking—the Bureau of Labor Statistics’ (BLS) preferred term—and those are heavily concentrated in a few sectors and occupations, and among the highly credentialed.

According to BLS stats, in July 2021, just 13% of workers were working remotely because of the pandemic, down from 35% in May 2020, the first month the numbers were collected. (See graph below.) And that initial 35% number was inflated by the fact that so many workers who couldn’t work from home, like those in retail and hospitality, had been laid off. Since that peak, the share has declined in twelve of the subsequent fourteen months.

But most jobs just can’t be done remotely. Those who can work remotely have to be served by the majority of lower-status, lower-paid workers who make things and move them around. If they were paid not to work, money would quickly become worthless because there’d be nothing to buy with it—no food, no electricity, no appliances, no medical equipment, nothing.

We present a set of computing tools and techniques that every researcher can and should consider adopting. These recommendations synthesize inspiration from our own work, from the experiences of the thousands of people who have taken part in Software Carpentry and Data Carpentry workshops over the past 6 years, and from a variety of other guides. Our recommendations are aimed specifically at people who are new to research computing.
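In the spirit of those recommendations, here is a minimal sketch of one habit commonly urged on newcomers to research computing: make an analysis script deterministic and self-describing, so a collaborator (or your future self) can rerun it and get identical numbers. This is my own illustrative example, not code from the guide quoted above.

```python
# Minimal sketch of a reproducible analysis script: parameters live in one
# place, randomness is explicitly seeded, and the run records its own inputs.
import json
import random

PARAMS = {"seed": 42, "n_samples": 1_000}

def run_analysis(params: dict) -> float:
    # An explicit seed makes reruns bit-for-bit identical.
    rng = random.Random(params["seed"])
    samples = [rng.random() for _ in range(params["n_samples"])]
    return sum(samples) / len(samples)

if __name__ == "__main__":
    result = run_analysis(PARAMS)
    # Emit the parameters alongside the result so the output documents itself.
    print(json.dumps({"params": PARAMS, "mean": result}))
```

Because the seed and sample count are recorded with the output, anyone can reproduce the exact run rather than guessing at hidden state.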

We all know of college students who are pre-med one minute and pursuing an architectural degree the next. A lot of those changes will be occurring early on, during the fall semester. We need to ensure that students, especially diverse students, don’t see this as a failure or setback but rather as an opportunity to help them find their way. Hopefully, some will take a path toward professions where their perspective and presence are critically needed.

“The key is the data that I have in my possession. Data is not clean of the sins of the past.”

Earlier in the pandemic, fully virtual students were paired with teachers at their home schools who guided them through a full day of classes — from math to gym — often alongside their pre-pandemic classmates.

But this year, after the state prohibited that kind of remote learning, San Diego launched a standalone virtual academy with its own virtual teachers. Students get some live instruction and teacher check-ins, then spend the rest of the time doing work on their own.

Another change? The level of interest. Last year, 44% of students ended the year online. But so far, less than 1% have chosen the virtual academy, though the district is still working through applications.

“Our projections consistently foresee a positive future for wild cacao in Peru as the current suitable area is expected to largely remain suitable and to further expand in the future,” explain Ceccarelli and her colleagues.

The research team cautions that despite the positive findings, chocolate lovers shouldn’t celebrate too soon. Even if wild cacao does fare well in the forests of a warmer world, that does not mean that it would grow well enough under cultivation to replace domestic cocoa as a cash crop. And their model didn’t try to incorporate other possible knock-on effects of a changing climate, such as increasing pests and disease.

These concerns about credibility are overblown. Credibility is whether others think you mean what you say in a given situation. It is context-specific; because circumstances can vary widely, credibility is judged on a case-by-case basis. How a state has behaved in the past is an important component of its credibility, but it is not the only one. The Biden administration’s withdrawal from Afghanistan will affect these calculations the next time the United States commits to an extraordinarily costly venture in a place not vital to the country’s core security interests, but it is unlikely to sabotage U.S. credibility writ large.

Credibility is different from reputation, however. If credibility is whether others think your deeds will match your words, reputation is what others think of you in the first place. On this count, the consequences of the U.S. withdrawal will likely be considerably greater. The pullout has been messy and chaotic: the Taliban took control of Afghanistan more quickly than the Biden administration had publicly predicted, and members of a regional branch of the Islamic State (or ISIS) launched a deadly bomb attack at the Kabul airport as Afghan and foreign citizens attempted to evacuate the country.

Reputations are, in essence, beliefs—they exist only in the minds of others. The formation and maintenance of reputations therefore has an important psychological component, and the psychological evidence is relatively clear that observers pay attention to past actions when predicting future behavior. Experimental studies I conducted with Jonathan Renshon and Keren Yarhi-Milo on both members of the public and elite decision-makers found that when asked to assess a country’s resolve in a foreign policy crisis, observers consistently focus on behavior in previous disputes, even when presented with countervailing information about capabilities and interests. The question is not simply whether allies and adversaries will doubt U.S. resolve because Washington backed down from a 20-year stabilization effort in Afghanistan. It is whether their existing doubts will grow stronger than they would have had the United States continued fighting.

Americans are exhausted by educational disruption. That's the message of a new survey by the journal Education Next. According to its poll, support for virtually every proposed innovation has dropped since 2019 (a few items were flat). That includes both highly popular measures, such as annual testing, and more controversial policies, including charter schools.

Prior to the 1960s and 1970s, writes Ensmenger, computer programming was thought of as a “routine and mechanical” activity, which resulted in the field becoming largely feminized. The work wasn’t particularly glamorous; “coders” were “low-status, largely invisible.” They were only supposed to implement the plans sketched out by male “planners.” Ensmenger quotes one female programmer, who recalled, “It never occurred to any of us that computer programming would eventually become something that was thought of as a men’s field.”

Google readily (and ironically) admits that such ubiquitous web tracking is out of hand and has resulted in “an erosion of trust... [where] 72% of people feel that almost all of what they do online is being tracked by advertisers, technology firms or others, and 81% say the potential risks from data collection outweigh the benefits.”

“Research has shown that up to 52 companies can theoretically observe up to 91% of the average user’s web browsing history,” a senior Chrome engineer told a recent Internet Engineering Task Force call, “and 600 companies can observe at least 50%.”

The results are generally consistent with past research: Online coursework generally yields worse student performance than in-person coursework. The negative effects of online course-taking are particularly pronounced for less-academically prepared students and for students pursuing bachelor’s degrees. New evidence from 2020 also suggests that the switch to online course-taking in the pandemic led to declines in course completion. However, a few new studies point to some positive effects of online learning, too.

Reversing course, the Centers for Disease Control and Prevention said Tuesday that all students and staff should wear masks inside schools, regardless of whether they’re vaccinated — an acknowledgment that slowing vaccination rates and the highly contagious delta variant are complicating plans for a more normal start to the school year.

And those not vaccinated are 29 times more likely to be hospitalized for COVID.

A new report from Ithaka S+R, titled “Stranded Credits: A Matter of Equity,” explores the lived experiences of students and staff familiar with institutional debt, also known as stranded credits. The phenomenon particularly impacts students of color, first-generation students, and low-income students. The report defines stranded credits as academic credits students have earned but cannot access because of an unpaid balance. Stranded credits not only impede students’ academic progress; because students cannot obtain their transcripts until the debt is paid, they can also thwart career trajectories. Researchers found the phenomenon has a detrimental impact on mental health and wellbeing as well.

In a longitudinal study of almost 400,000 employees from nearly 400 Japanese firms over 12 years, the gender gap in bonus pay was found to be greater in workplaces with a merit-based system than in workplaces without it, said Eunmi Mun, a professor of labor and employment relations at Illinois.

As U.S. President Joe Biden seeks to resurrect American leadership on the world stage, the perennial question of how the United States should respond to international crises looms large. In his latest book, the political scientist John Mueller offers a refreshingly straightforward answer: Washington should aim not for transformation but for “complacency,” which Mueller characterizes as “minimally effortful national strategy in the security realm.”

Education researchers have a particular kind of tutoring in mind, what they call “high-dosage” tutoring. Studies show it has produced big achievement gains for students when the tutoring occurs every day or almost every day. Less frequent tutoring, by contrast, was not as helpful as many other types of educational interventions. In the research literature, the tutors are specially trained and coached and adhere to a detailed curriculum with clear steps on how to work with one or two students at a time. The best results occur when tutoring takes place at school during the regular day.

Especially suspect, in Pliny’s opinion, were professional magicians, or “magi,” a term that originally referred to Persian fire priests but came to mean any practitioner of magic for hire. “The most blatant example of the shamelessness of the magi,” he writes, is a ritual to produce an amulet that makes its wearer invisible.

Shields might prove helpful in specific instances — like halting the big droplets emitted during coughs and sneezes — but not particularly in trapping the "unseen aerosol particles" by which COVID-19 spreads. "The smaller aerosols travel over the screen and become mixed in the room air within about five minutes," said Catherine Noakes, a professor at the University of Leeds in England. "This means if people are interacting for more than a few minutes, they would likely be exposed to the virus regardless of the screen." 

To complicate matters further, about half of the roughly forty-five known examples were discovered before the emergence of rigorous archaeological standards for documenting objects’ findspots and contexts. This means that we simply have no idea where—or even when—some of the most famous slave collars were originally found, such as the so-called Zoninus collar now in the collection of the Museo Nazionale Romano.

On the other hand, the specimens that were excavated more recently show a range of archaeological contexts so wide that it is almost impossible to make generalizations: some have been found still attached around a skeleton’s neck, indicating that “for some slaves at least, a metal neck collar was permanent,” while others have been found in trash heaps and gutters, perhaps discarded by successful fugitives.

For all that, it is hard not to think that he would have been a more appealing character, at least aesthetically speaking, if he had lived 200 years earlier. In the 19th century McAfee might have composed "The Revolt of Islam" or a biography of William Tell in between keeping a pet bear in his Oxford rooms or fighting for Greek independence. Instead he both consumed and sold an enormous amount of drugs, wrote computer software, and ran unsuccessfully for the presidential nomination of a minor political party....

That evening's debate was memorable not least of all for McAfee's candor. In response to a question about what works of political philosophy had inspired the candidates, he flatly declared: "I come to you untutored in the great thinkers of libertarianism. The first book I ever read cover to cover was Darwin's Origin of Species at the age of 30. I read that book because I was dealing drugs in Mexico at the time and it was the only English-language book I could find."

In a case of l’esprit de l’escalier, I just worked out the perfect parenthetical addition to this piece that was published in Inside Story, responding to a string of pro-natalist pieces in the New York Times and elsewhere. The central point is that the economic model in which strong young workers support elderly retirees is outdated and will only become more so.

The model underlying the desire for a population pyramid is one in which physical work predominates. Young and strong, needing only on-the-job training, workers leave school at 14 and immediately start contributing to the economy. By 65, they are worn out and ready for retirement. In this model, the more young people, the better.

One of the most entertaining of these rumors was what historian John McMillan has called “the Great Banana Hoax of 1967.” In the spring of that year, publishers of underground papers printed a recipe for smoking banana peels. It involved freezing the peels, blending them into a pulp, baking the residue at 200 degrees, and then smoking it in a cigarette or pipe (The Berkeley Barb, March 17, 1967). This supposedly produced an experience similar to that of smoking marijuana.

As fun, if not necessarily effective, as banana smoking might have been, it was not without risks. According to The Rag, two people were taken into custody for possession of what turned out to be a banana peel; according to the Los Angeles Free Press, Donald Arthur Snell of Santa Fe Springs, California, was charged with driving while under the influence of drugs—the drug being banana peel (Berkeley Barb, May 26, 1967).

But small gatherings like Doug’s party are a potentially important source of transmission, though this has been really hard to measure. At least, unless you get clever, which is what a team led by Anupam Jena did in this article, appearing in JAMA Internal Medicine.

With coronavirus infections falling in the U.S., many people are eager to put the pandemic behind them. But it has inflicted wounds that won’t easily heal. In addition to killing 600,000 in the United States and afflicting an estimated 3.4 million or more with persistent symptoms, the pandemic threatens the health of vulnerable people devastated by the loss of jobs, homes and opportunities for the future. It will, almost certainly, cast a long shadow on American health, leading millions of people to live sicker and die younger due to increasing rates of poverty, hunger and housing insecurity.

In particular, it will exacerbate the discrepancies already seen in the country between the wealth and health of Black and Hispanic Americans and those of white Americans. Indeed, new research published Wednesday in the BMJ shows just how wide that gap has grown. Life expectancy across the country plummeted by nearly two years from 2018 to 2020, the largest decline since 1943, when American troops were dying in World War II, according to the study. But while white Americans lost 1.36 years, Black Americans lost 3.25 years and Hispanic Americans lost 3.88 years. Given that life expectancy typically varies only by a month or two from year to year, losses of this magnitude are “pretty catastrophic,” said Dr. Steven Woolf, a professor at Virginia Commonwealth University and lead author of the study.

Western powers have reacted with alarm to the ICC’s recent attempts to open investigations outside Africa, heaping pressure on the court and leveling sanctions against it. In so doing, they have revealed the tacit understanding that international criminal law applies to some more than others.

(Laugh if you will, but archaeology has proven that the ancients who once inhabited the Iberian Peninsula greatly valued the lime-flavored nacho chip for its nutritive value, and they long afforded it primacy among chips.)

Since the U.S. doesn't have a statutory minimum of paid public holidays like most of the rest of the world, it will fall on employers to decide whether or not to actually honor America's Second Independence Day.

Shane Frederick has devised a cognitive reflection test – three questions, each with an intuitive but wrong answer and a non-intuitive but right answer. He has found that even at the US’s top universities, fewer than half of students get all three questions right.
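The best-known item on Frederick’s test (not quoted in the excerpt above) is the bat-and-ball problem: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. The intuitive answer for the ball’s price is 10 cents; a minimal sketch showing why the algebra says otherwise:

```python
# Bat-and-ball problem, the best-known cognitive reflection test item.
# Constraints:
#   bat + ball = 1.10
#   bat = ball + 1.00
# Substituting: 2 * ball + 1.00 = 1.10, so ball = 0.05.
total = 1.10
difference = 1.00

ball = (total - difference) / 2   # correct answer: $0.05
bat = ball + difference           # $1.05

intuitive_ball = total - difference  # the tempting answer: $0.10

# The correct answer satisfies both constraints...
assert abs((bat + ball) - total) < 1e-9
# ...while the intuitive answer overshoots the total by 10 cents.
assert abs((intuitive_ball + (intuitive_ball + difference)) - total) > 0.05

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")
```

The wrong answer is “intuitive” precisely because it satisfies the salient fact (the $1.00 difference) while silently violating the total.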

Lawmakers say covid’s disproportionate impact on California’s Black and Latino residents, who experienced higher rates of sickness and death, makes their request even more pressing.

Yet even in the age of the cruise missile, the bayonet remains. However obsolete on the battlefield, it’s an “attitudinal and behavioral” tool. The point is all about morale-building. Stone notes: “What has proved important is its role in motivating scared, and frequently isolated, soldiers to continue fighting when their instincts demand otherwise.”

Perhaps the most striking thing about this article is that there is only a passing mention of Bitcoin — and nothing about cryptocurrency more generally — even though these attacks are only possible because of cryptocurrency.

“Together, these findings illustrate that the most common approach to diversity in higher education ironically reflects the preferences, and privileges the outcomes, of White Americans,” the study notes.

What must one believe in to be willing to borrow tens of thousands of dollars in order to pursue a certification of completion — a B.A.? What would a college have to promise in order to compel someone to do that? What would a bank have to believe to extend this person credit? Or the U.S. government, to guarantee such loans en masse — now roughly $2 trillion? And what would a society have to believe to sustain the system that keeps it all going?

"On 'snow days' or days when school buildings are closed due to an emergency, all students and families should plan on participating in remote learning," the NYC Department of Education said.

Two 19th century Belgian bibliographers heard about the Dewey Decimal System and asked to translate it into French. But rather than slavishly follow Dewey, they added some significant twists. Their system, first published in 1905 and still used today, is known as the Universal Decimal Classification.

The truth is the question of whether student debt should be canceled is largely irrelevant. Most student debt will be canceled sooner or later, because an ever-growing share of borrowers cannot possibly repay their loans. Ever. The only question that matters is whether President Biden and Democrats in Congress can grapple with reality and fix America's colossally stupid system of funding higher education.

Effectively, the IDR program (whose enrollment has grown steadily to about a fifth of borrowers) is a tacit admission that most student loans are never going to be paid off in full. Those who have not enrolled have seen far higher rates of default; on current trends most borrowers will be in IDR eventually, which is rapidly becoming a kind of ad hoc bankruptcy program for student borrowers. In a sense, the U.S. is starting to fund its higher education system with a payroll tax on people who go to college but are too poor to pay for it out of pocket — except we then force them to sit under an enormous load of basically imaginary debt for decades while doing it. This damages their credit, making it harder to get a job, a house, a car, and so on.

But there is a problem with the new authenticity standard. The people who made applying to college an elaborate performance, a nervous and yearslong exercise in self-construction, have now decided that the end result of this elaborate performance must be “the real you.”

As we face a new struggle to get covid-19 vaccination rates up in this country, we need to remember that there is a group of people with virtually zero vaccine uptake. This group often congregates together in indoor gatherings, coming into close physical contact for extended periods. Fully 24% of Americans are part of this group. We call them children.

Even before the 1950s began, Meredith writes, the phrase “the Nifty Fifties” began circulating. On a much more ominous note, one Chicago Tribune writer warned that “with an eye to Russia, this next decade will be tagged either ‘The Friendly Fifties’—or ‘The final Fifty.’” And a report from Hays, Kansas, explained that dust storms in that area had led residents to declare the start of the “Filthy ’50s,” a callback to the “Dirty ’30s.”

Instead of lamenting a demise of expertise, wouldn’t it be more productive to ask whether the language of expertise features characteristics that invite its audience to overlook or misread it?

But new research suggests schools that tighten security and surveillance in response to shootings or other acts of violence may worsen long-term discipline disparities and hinder academic progress, particularly for Black students.

Schools that maintain the closest watch on students have significantly higher suspension rates and lower math performance than schools that use a lighter touch, according to new research released at the annual American Educational Research Association conference earlier this month. The findings also suggest a high-surveillance culture at school can make it harder for educators to implement strategies, like restorative justice, intended to reduce discipline disparities for students of color.

And these consequences fell disproportionately on Black students, who were more than four times more likely than white students to attend a school with the highest level of surveillance.

Indeed, it was lost—an estimated 1000 pages of poetry and pictures—until partly rediscovered by a small boy named Dave, in an IKEA in suburban Stoughton, Massachusetts.

Substantial portions were in use as a prop book on a VITTSJÖ shelf unit (appropriately, as this unit promises ‘an open, airy feel’).

Yet, in district after district, Black families are largely choosing to continue learning from home, despite efforts to reopen their schools. Rather than using equity as a buzzword to gain moral high ground in the reopening debate, we believe that advocates and school officials should listen to and engage with Black families and trust their decision-making.

So how can I make such a brash counter-claim that tests, quizzes, and exams are not essential? McFarland argues that proctoring is not optional if “the goal of an exam or test or assignment is to measure learning or skill mastery.”

This is where the big reveal comes: quizzes, tests, exams, assignments – none of those can measure learning or skill mastery. Not directly at least.

All forms of tests and assignments are designed to serve as this evidence in the form of a proxy for direct observation. The idea is that if you really learned something, you can take that knowledge and answer questions about it, or describe it, or do a project with it, or something along those lines. There are a wide range of assignment options that work well as a proxy, but exams and tests are usually questionable at best. This is especially true when they rely on one of the most popular forms: the multiple choice question.

The research is usually aimed at seeing how many students cheated, not finding out the likelihood that they will cheat on your specific test this Friday.

The reason is that those numbers probably wouldn’t be as scary as “5000% of students cheat!” This is important because the real concerns most critics have with proctoring technology are about the problems with racism, ableism, and privacy violations that students have reported. If you think that most of the students in your course would probably cheat, you kind of shrug at the possible problems and say “well, I have to do something.” But if someone were to say that there was a less-than-5% chance any given student would cheat on your specific exam, then suddenly, the problems you subject students to do not seem worth it.

But it is also not surprising that a younger generation of left intellectuals has turned against higher education, given that it has turned against them. Following years of austerity budgets and the systematic deprofessionalization of academic labor, millennials and their generational successors have found it harder and harder to get faculty positions. As for students, a college degree of some sort has become a near-universal standard for younger cohorts entering an increasingly credentialized labor market. For them, the university has meant neither an enriching intellectual experience that sets them on a path of humanistic, lifelong inquiry nor a path to middle-class economic stability, but rather escalating tuition for degrees of questionable value that sets them on a path of crushing, lifelong debt. Once popular on the right, the Bennett hypothesis is likely to find more and more of its adherents on the left.

But brains need not and should not be confined to our bodies. They can, and should, and sometimes do, reside elsewhere.

One place is in our habits. I invest in tracker funds by direct debit each month. Most of my investment is done without thinking.

Strictly speaking, this isn’t optimal: stock markets aren’t fully efficient and tracker funds can be beaten by momentum and defensive stocks (pdf) (but, I suspect, not by any other strategy). Implementing such strategies, however, would require me to think. And if I did that I’d fall prey to the gazillion cognitive biases that I warn IC readers against.

But there is another more cynical case for universal voting. Democracy, which has come to be based on an ever-greater franchise, provides legitimacy to government and an orderly mechanism for resolving political conflict. Undermine those things, and violence and instability could spill out of control.

Sesame Street introduced Roosevelt Franklin in February 1970. He was created and originally voiced by Sesame Street actor Matt Robinson, who felt the show lacked content that would draw in Black kids. He told Ebony magazine in a 1970 interview that kids needed “more realism in black-oriented problems,” a concern echoed by others....

The Muppet was a hot topic behind the scenes. Other Black Sesame Street staffers felt he was too stereotypical, with one of the show’s advisers noting, “I like the idea of black muppets, [but not] this one-dimensional use of black muppets.” Robinson pushed back, advocating for the use of Black English as a way to meet kids where they were. Still, a 1973 article in Black World Magazine called the attempt “a gross misrepresentation of Black Language.” Later, a 1975 Freedomways article called out Roosevelt and his segments as “a chaos of ‘darky’ accents and racist stereotypes.”

About Me

Developer at Brown University Library specializing in instructional design and technology, Python-based data science, and XML-driven web development.
