Friday Links

One of the most interesting things about computers is the way they hold a mirror up to us, a mirror that reflects not nature but our conception of nature. Attempts to create artificial intelligence force us to grapple with questions about our own natural intelligence — what it is, where it comes from, what its limits are. Programs for natural language processing raise hard questions about the origins and character of natural language. And in our attempts to create virtual worlds with virtual inhabitants — the metaverse, for instance — we confront profound questions about our being: What is a world? What does it mean to be in a world? What’s the relationship of mind and body? As Michael Heim wrote in his 1991 essay “The Erotic Ontology of Cyberspace,” collected in the book Cyberspace: First Steps, “cyberspace is a metaphysical laboratory, a tool for examining our very sense of reality.”

Ken Adam, the set designer for many of the early Bond films, paralleled Bond’s prewar traditionalism with an affinity for classical architecture. The British intelligence agency where Bond works is usually housed in a classical building. Likewise, Bond’s apartment, of which we catch a glimpse in Dr. No, is located in a Georgian-style building. The architecture that Bond inhabits evokes a kind of self-assured, institutional power, comparable to the status of the British before WWII.

Over 60% of all jobs in the U.S. typically require a high school education or less and pay accordingly. And for at least several decades, data from the New York Federal Reserve has shown what all college faculty and many parents and workers have long known: A large minority of bachelor’s degree holders are consistently underemployed, working in jobs requiring less formal education than they received. The educational attainment levels of the population, which are at historic highs, far exceed what the labor market requires. This is the real economy.

The format of this announcement is interesting. It tries desperately to strike a positive tone, with several paragraphs citing specific examples of the benefits of facial recognition and only gesturing to the potential for harm and abuse. I am glad Facebook sees so many great uses for it; I see them, too. But I wish the company were anywhere near as specific in its acknowledgement of harm. As it is presented, it looks defensive.

These presentations had the familiar vibe of an overly ambitious video game reveal.

Facebook may be entirely invested in these efforts, but we have not yet come to terms with what the company represents today. It can parade its research projects and renderings all it wants, but the fact of the matter is that it is a company that currently makes the de facto communications infrastructure for much of the world, and frequently does a pretty poor job of it. Is this really the figurehead we want for the future of personal computing? Is this a company that we trust?

If you are like me, you may be thinking that it is sort of weird that this advertising company thinks it can create the mixed-reality future in software, hardware, and developer tools.

Let me start off by saying that MMORPGs are not typically good games....While many other kinds of games could be said to be based around "go here and kill things", the MMO needs to do it in a flatly consistent way due to the online shared world – hundreds of players running around doing the same sort of things – so they cannot be presented in a particularly interesting way outside of some flavour text.

Stiles argues that “the now-familiar trope of the mad scientist…traces its roots to the clinical association between genius and insanity that developed in the mid-nineteenth century.” In the early 1800s, the Romantics saw the condition as a “mystical phenomenon beyond the reach of scientific investigation.” The Victorians took a more detached and critical approach. “Rather than glorifying creative powers, Victorians pathologized genius and upheld the mediocre man as an evolutionary ideal,” Stiles writes. “All aberrations from the norm could be seen as pathological, including extreme intelligence.”

Friday Links

“The key is the data that I have in my possession. Data is not clean of the sins of the past.”

Earlier in the pandemic, fully virtual students were paired with teachers at their home schools who guided them through a full day of classes — from math to gym — often alongside their pre-pandemic classmates.

But this year, after the state prohibited that kind of remote learning, San Diego launched a standalone virtual academy with its own virtual teachers. Students get some live instruction and teacher check-ins, then spend the rest of the time doing work on their own.

Another change? The level of interest. Last year, 44% of students ended the year online. But so far, less than 1% have chosen the virtual academy, though the district is still working through applications.

“Our projections consistently foresee a positive future for wild cacao in Peru as the current suitable area is expected to largely remain suitable and to further expand in the future,” explain Ceccarelli and her colleagues.

The research team cautions that despite the positive findings, chocolate lovers shouldn’t celebrate too soon. Even if wild cacao does fare well in the forests of a warmer world, that does not mean that it would grow well enough under cultivation to replace domestic cocoa as a cash crop. And their model didn’t try to incorporate other possible knock-on effects of a changing climate, such as increasing pests and disease.

These concerns about credibility are overblown. Credibility is whether others think you mean what you say in a given situation. It is context-specific; because circumstances can vary widely, credibility is judged on a case-by-case basis. How a state has behaved in the past is an important component of its credibility, but it is not the only one. The Biden administration’s withdrawal from Afghanistan will affect these calculations the next time the United States commits to an extraordinarily costly venture in a place not vital to the country’s core security interests, but it is unlikely to sabotage U.S. credibility writ large.

Credibility is different from reputation, however. If credibility is whether others think your deeds will match your words, reputation is what others think of you in the first place. On this count, the consequences of the U.S. withdrawal will likely be considerably greater. The pullout has been messy and chaotic: the Taliban took control of Afghanistan more quickly than the Biden administration had publicly predicted, and members of a regional branch of the Islamic State (or ISIS) launched a deadly bomb attack at the Kabul airport as Afghan and foreign citizens attempted to evacuate the country.

Reputations are, in essence, beliefs—they exist only in the minds of others. The formation and maintenance of reputations therefore has an important psychological component, and the psychological evidence is relatively clear that observers pay attention to past actions when predicting future behavior. Experimental studies I conducted with Jonathan Renshon and Keren Yarhi-Milo on both members of the public and elite decision-makers found that when asked to assess a country’s resolve in a foreign policy crisis, observers consistently focus on behavior in previous disputes, even when presented with countervailing information about capabilities and interests. The question is not simply whether allies and adversaries will doubt U.S. resolve because Washington backed down from a 20-year stabilization effort in Afghanistan. It is whether their existing doubts will grow stronger than they would have, had the United States continued fighting.

Americans are exhausted by educational disruption. That's the message of a new survey by the journal Education Next. According to its poll, support for virtually every proposed innovation has dropped since 2019 (a few items were flat). That includes both highly popular measures, such as annual testing, and more controversial policies, including charter schools.

Prior to the 1960s and 1970s, writes Ensmenger, computer programming was thought of as a “routine and mechanical” activity, which resulted in the field becoming largely feminized. The work wasn’t particularly glamorous; “coders” were “low-status, largely invisible.” They were only supposed to implement the plans sketched out by male “planners.” Ensmenger quotes one female programmer, who recalled, “It never occurred to any of us that computer programming would eventually become something that was thought of as a men’s field.”

Google readily (and ironically) admits that such ubiquitous web tracking is out of hand and has resulted in “an erosion of trust... [where] 72% of people feel that almost all of what they do online is being tracked by advertisers, technology firms or others, and 81% say the potential risks from data collection outweigh the benefits.”

“Research has shown that up to 52 companies can theoretically observe up to 91% of the average user’s web browsing history,” a senior Chrome engineer told a recent Internet Engineering Task Force call, “and 600 companies can observe at least 50%.”
