Saturday, April 29, 2023

Suggested reading: Will press lever for food

Cain by Henri Vidal

Cain having just murdered his brother Abel by Henri Vidal, 1896, Tuileries Garden, Paris. Photo credit: Alex E. Proimos; image source: Wikimedia Commons, CC BY 2.0 (Attribution 2.0 Generic)

1. Gen Z

Baby Boomers, Generation X, Millennials, Generation Z, and whatever comes after: generational labeling has never been so prevalent. But do generational labels and their accompanying generalizations really tell us anything that's not obvious?

In "Gen Z and Me" [1], Joe Moran, professor of English and cultural history at Liverpool John Moores University, points out that some generational differences are real (though their significance is another matter). In communication styles and the use of technology, for example, (most) members of Gen Z differ from (most) members of previous generations. In general, Gen Z derides email (requires too much time to read or write) and has developed elaborate unwritten rules defining online etiquette. The political attitudes of (many) members of Gen Z on issues such as gender and sexual identity, privacy, racial justice and climate change can also differ from those of (at least some) members of previous generations. But Moran points to two recent books that draw contrasting conclusions about what these technological and attitudinal differences really mean.

In Gen Z, Explained: The Art of Living in a Digital Age (University of Chicago Press, 2021), a Stanford anthropologist (Roberta Katz), an Oxford linguist (Sarah Ogilvie), an Oxford historian (Jane Shaw) and a Lancaster University—now King's College London—sociologist (Linda Woodhead) use interviews with college students to try to explain Gen Z to everyone else. Moran writes that, according to Gen Z, Explained, the members of Gen Z have

created rich and hard-to-penetrate subcultures. What they mostly like to do, the book argues, is to collaborate in leaderless groups. They use digital tools to create shared documents, sync their calendars, write and read fan fiction, play games together. They use apps to organise lift-sharing, couch-surfing and political activism. The authors further 'explain' Generation Z by pointing to the intricate language and etiquette of their online lives. Post-millennials can quickly convey their pleasure or displeasure through memes. They use emojis as a 'social lubricant' and bracket words with asterisks and tildes for emphasis and irony. Whether they write 'k' or 'kk' to mean 'OK' is charged with meaning. The first is curt; the second is cheerful and casual, a way to temper the brusqueness of the single letter. These tonal shadings matter because post-millennials like to state their intentions clearly. Self-labelling, especially of fine-grained sexual and gendered identities, has become an 'imperative'. They think it important to be themselves, to admit their struggles and vulnerabilities, to say what they mean. In the iGen Corpus, a digital data bank compiled by Ogilvie of seventy million words used by post-millennials, terms such as real, true, honest and fake occur far more often than in general language use.

But being earnest, creating subcultures and language that are impenetrable to people your parents' age and using technology that befuddles them are all just part of being in your teens and early twenties. Earlier generations did the same things, while members of their parents' generation rolled their eyes, complained, or thought that it heralded the end of civilization:

  • Gen Zers spend too much of their time bent over their phones texting their friends? "Youth is the age when people are most devoted to their friends or relations or companions, as they are then extremely fond of social intercourse and have not yet learnt to judge their friends or indeed anything else by the rule of expediency."
  • The mocking memes and references Gen Zers exchange seem bizarre? "If the young commit a fault, it is always on the side of excess and exaggeration. . .they carry everything too far, whether it be their love or hatred or anything else. . .Finally, they are fond of laughter and consequently facetious, facetiousness being a more cultivated form of insolence."
  • They are far too ready to take offense? "They are passionate, irascible and apt to be carried away by their impulses."
  • Their self-righteousness is irritating? "They think they know everything and are absolutely sure in their assertions; this is in fact the reason of their carrying everything too far."
  • They seize on and abandon trends with bewildering rapidity? "They are changeable too and fickle in their desires, which are as transitory as they are vehement; for their wishes are keen without being permanent, like a sick man's fits of hunger and thirst."

As you may have suspected, the quotes above come from a source written a few generations back: Aristotle's Rhetoric, composed in the 4th century BCE, nearly two and a half millennia ago. [2] For a compendium of similarly parallel comments about Gen Y/Millennials and the youth of previous generations, see "People have always whinged about young adults. Here's proof." [3]

In The Generation Myth: Why When You're Born Matters Less Than You Think (Basic, 2021, published in the UK as Generations: Does When You’re Born Shape Who You Are?), King's College London professor of public policy Bobby Duffy "argues that the discourse around generational difference is 'a mixture of fabricated battles and tiresome clichés'." In Duffy's view, most "generational difference" is a "lifecycle effect":

Sociologists give three explanations for the change in people’s attitudes and behaviours over time: period effects, lifecycle effects and cohort effects. Period effects describe change across all age groups: the result of sweeping societal shifts. Lifecycle effects describe change resulting from the ageing process or in response to key events such as leaving home, becoming a parent or retiring. Cohort effects describe change that results from shared generational experiences. Duffy, a professor of public policy, argues that the current discussion attributes too much to cohort effects and not enough to period and lifecycle effects.

In other words, many of what seem unbridgeable differences between older generations and Gen Z have to do with the phase of life, late adolescence and young adulthood, that Gen Z is passing through. Older people judge younger people not in comparison to the way they actually behaved when they were the same age, but in comparison to an idealized memory distorted by the way they are today: older, sadder, perhaps wiser, and probably further behind the technological curve.
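Duffy's point can be made concrete with a toy simulation (my illustration, not from his book): if some attitude depends only on age, a single-year snapshot will still show a dramatic "generational" gap, yet the gap disappears once you compare cohorts at the same age. The `trait` function and its numbers below are entirely hypothetical.

```python
def trait(age):
    """Hypothetical attitude score driven purely by a lifecycle effect:
    it depends only on age, not on birth cohort."""
    return max(0, 60 - age)  # younger people score higher

# Cross-sectional snapshot today: Gen Z (~20) and Boomers (~65) look worlds apart.
gen_z_now = trait(20)    # 40
boomer_now = trait(65)   # 0

# But Boomers at age 20, decades ago, scored exactly what Gen Z scores now:
boomer_at_20 = trait(20)

# The apparent "cohort difference" vanishes once age is held constant.
assert gen_z_now == boomer_at_20
```

The snapshot comparison mistakes a lifecycle effect for a cohort effect; only longitudinal data (the same cohort observed at different ages) can tell them apart.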

One common complaint about (many) members of Gen Z is their sense of entitlement. Teachers complain that students seem to want infinitely flexible deadlines and to receive good grades no matter what their actual effort or achievement. This isn't because Gen Z members are uniquely lazy or incompetent, but because students have vastly more leverage now than in the past.

What grades really mean cartoon by Jorge Cham

"What grades really mean," by Jorge Cham, 2014. Image source: Piled Higher and Deeper (www.phdcomics.com)

2. "Liking" good grades

Grades have little to do with learning. They do not measure it, and if anything the emphasis placed on grades (a short-term goal based on punishment for failure) is actually counterproductive to expanding students' knowledge and capacities (a long-term and cumulative project based on engaging and rewarding curiosity).

There's another way in which grades are a meaningless measure: they shift over time so that similar levels of proficiency receive ever-higher levels of assessment. Grade inflation is real, but it is not the fault of either students or instructors. It is the inescapable result of our dysfunctional system of funding higher education. As Lorna Finlayson writes about the UK higher education system in "Everyone Hates Marking" [4]:

The proportion of firsts [first-class honours degrees] has more than doubled in the last ten years: it's simply not plausible that the overall standard of education has risen dramatically in this period. What has changed is the marketisation of universities, which introduced pressures that make so-called inflation inevitable, linking institutions' material fortunes to league table positions while financially incentivising the 'recruitment' of as many students as possible.

In both the US and Britain, state support of public higher education has plummeted. This has been a long-term trend over the past several decades, and its negative effects are now unavoidable. Colleges and universities are expected to make up the difference between lower levels of state support and ever-higher operating expenses by enrolling more students and charging higher tuition. In order to attract more students and charge them more, schools have to compete with each other in catering to their wants.

Of course, it's a Catch-22: more students require more dorms, more classrooms, more instructors, more support staff, and more library books, increasing costs further. And raising tuition is not only directly counter to the mission of public universities to provide an affordable education to every state resident who qualifies, it burdens students with ever-higher levels of debt.

Most of the additional instructors who are being hired to teach these larger undergraduate cohorts are part-time, non-tenure track adjunct lecturers. Since they generally have no research responsibilities, their continued employment from semester to semester is dependent almost entirely on good student evaluations of their teaching. However, student evaluations do not correlate with learning and have been shown to exhibit systematic gender, age, and racial bias. In addition, instructors who are perceived to be "tough graders" receive lower evaluations, particularly if they are women. [5]

Since it's so strongly in the economic interest of colleges and universities to enlarge enrollments, it's no mystery why grades for similar levels of proficiency keep moving up the scale: we have created a system in which, to remain employed, instructors must chase "likes" from their students. And students like good grades.

Will press lever for food cartoon by Craig Swanson

"Will press lever for food" by Craig Swanson. Image source: Perspicuity (www.perspicuity.com)

3. The Reaction Economy

This is an example of what William Davies terms "The Reaction Economy" [6]. He writes,

The behaviourist tradition that came to dominate American psychology in the 20th century, pioneered by John B. Watson after the First World War and later identified with B.F. Skinner, was established with the explicit aim of rendering human responses predictable and thereby controllable. Psychology would abandon any theory of mind in favour of data on observable behaviour. Behaviourists imagined a wholly conditioned, programmable person. . .

The behaviourist tradition has revealed a lot about how humans (and other animals) respond to different stimuli. . .But it tells us nothing about the vast amounts of time and labour that societies such as ours now invest in actively trying to generate and capture reactions of various kinds, not just in science laboratories or hospitals, but across the economy, public sphere and civil society.

On the contrary: far from telling us nothing about the reaction economy, behaviorism's explicit goal is "predictable and thereby controllable" responses; think of laboratory rats pressing levers for a reward. And the biggest behaviorist experiment in human history is currently underway. On social media, search engines and other commercial sites, companies track your "engagement" (clicks and "likes") and sell your attention to advertisers based on predictions of your behavior. Your responses to the ads you see become more data points in an endless feedback loop. This system has been memorably described by social psychologist Shoshana Zuboff as "surveillance capitalism." And the engine driving it is the technology sector centered in Northern California that since at least 1970 has been called Silicon Valley.

Apple new campus building in Cupertino, California

Cupertino, California: Aerial photo of Apple new campus building, April 13, 2017. Photo credit: Uladzik Kryhin. Image source: Shutterstock

4. Silicon Valley

The man who founded what became Silicon Valley was William Shockley. As a researcher at Bell Labs in New Jersey after World War II, Shockley co-invented the transistor, for which he and his colleagues jointly won the Nobel Prize in Physics in 1956. That same year he founded Shockley Semiconductor Laboratory in Mountain View, California, in what was then known as the Santa Clara Valley. A key reason for Shockley's choice of location: his mother lived in Palo Alto, home of Stanford University and the next town over as you headed north from Mountain View towards San Francisco.

Shockley had a good eye for talent, but as John Lanchester writes in "Putting the Silicon in Silicon Valley" [7], he was "an outstandingly horrible human being." He divorced his wife when she was diagnosed with uterine cancer, he actively promoted racist and eugenicist views, and few could stand to work with or for him. Just a year after the founding of Shockley Semiconductor, eight of its leading researchers left to found a rival company, Fairchild Semiconductor. A decade later, two of the eight, Robert Noyce (co-inventor of the integrated circuit) and Gordon Moore (formulator of Moore's Law), split off from Fairchild to form Intel. Along with Texas Instruments, Fairchild and Intel dominated microchip design for decades.

But manufacturing chips was expensive, even in the low-wage nonunion states to which these companies moved their production, and in 1968 to increase profits Texas Instruments built a chip fabrication plant in Taiwan. Lanchester writes, "The investment would also give the US a stake in defending Taiwan, at a time when America’s enthusiasm for Asian military adventures was at a low ebb. TI committed to build their Taiwan fab [fabrication plant] in 1968. In 1980 they shipped their billionth chip. A new strategy was in place."

A key figure in the strategic integration of Taiwan into the microchip supply chain and thus the US military perimeter was a man named Morris Chang. Born in mainland China in 1931, he emigrated to the US in 1949, and after getting bachelor's and master's degrees at MIT and a TI-sponsored PhD at Stanford, eventually became vice-president in charge of TI's microchip division.

After leaving TI, in 1987 Chang founded the Taiwan Semiconductor Manufacturing Company (TSMC), which for decades has produced many of the most advanced microchips. Advanced microchips are crucial to the development of new military technologies, including AI.

Risk

Risk 60th Anniversary Edition Board Game created by Albert Lamorisse and produced by Hasbro. Image source: Best Buy

5. Meet the new strategy, same as the old strategy

A major driver of China's interest in absorbing Taiwan is that by seizing the world's most advanced microchip fabrication plants it would control the chip supply to the rest of the world—most especially the US—and gain a large technological and military advantage.

As Lanchester writes,

China has to import powerful microchips. The numbers involved are substantial. For most of this century, China has spent more money on importing microchips than it has on importing oil. . .China is acutely aware of its dependence on the West in this area.

In October 2022, the Biden administration instituted a ban on advanced microchip exports to China by US companies and any foreign companies that use US microchips—in practice, a global ban (although enforcing it is another question).

This has disturbing parallels with the events leading up to World War II in the Pacific. In July 1940 the US placed an aviation fuel and scrap iron embargo on an expansionist Japan that had already brutally invaded and occupied Manchuria and was in renewed conflict with China. In September 1940 Japan allied itself with Nazi Germany and Fascist Italy. The US imposed a total oil embargo in August 1941; four months later, on 7 December, Japan attacked Pearl Harbor, and a month after that it invaded the Dutch East Indies, a major producer of rubber and oil. [8]

Lanchester concludes,

The chip ban has been described as a 'declaration of economic war'. And perhaps not only economic war. The assumption in military circles is that AI is going to be crucial to the next wave of innovation in warfare. The AI revolution will depend on new chip technology. The second Cold War is going to be a military-technological contest just like the first one, and once again semiconductors are going to be central. We are starting to get glimpses of what that might look like, with the first arrivals of drone swarms on battlefields. Coming soon: unmanned vehicles, fire-and-forget missiles, 'loitering munition systems' and facial recognition assassination drones. Advanced chips are as crucial to the process of designing new weapons systems as to the weapons themselves, because the majority of testing for these systems is done on computers. Fingers crossed that all this helps with avoiding World War Three.

Yeah—embargoes worked so well at heading off World War II.

Update 5 July 2023: An article in The Guardian today, "Chip Wars" by Amy Hawkins, reports that the Biden administration is considering expanding existing restrictions to include some less advanced chips, as well as some cloud services. The Netherlands, home of advanced-chip maker ASML, is also expected to increase its restrictions. On 30 June the Dutch government announced that export restrictions on photolithography equipment manufactured by ASML that is used to etch circuits onto microchips would go into effect on 1 September.


  1. London Review of Books, Vol. 45, No. 4, 16 February 2023: https://www.lrb.co.uk/the-paper/v45/n04/joe-moran/gen-z-and-me
  2. The Rhetoric of Aristotle, translated, with an analysis and critical notes, by J.E.C. Welldon. Macmillan, 1886, Book II, Section xii, pp. 164-166 (translation slightly altered). https://babel.hathitrust.org/cgi/pt?id=hvd.hxjtry&view=1up&seq=216
  3. Amanda Ruggeri, "People have always whinged about young adults. Here's proof." BBC Worklife, 2 October 2017. https://www.bbc.com/worklife/article/20171003-proof-that-people-have-always-complained-about-young-adults.
    See also: John Protzko and Jonathan W. Schooler, 2019, "Kids these days: Why the youth of today seem lacking," Science Advances, Vol. 5, No. 10:eaav5916. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6795513/pdf/aav5916.pdf;
    Jessica Kitt, 2013, "Kids These Days: An Analysis of the Rhetoric against Youth across Five Generations," https://cah.ucf.edu/writingrhetoric/wp-content/uploads/sites/27/2019/10/KWS2_Kitt.pdf
  4. London Review of Books, Vol. 45, No. 6, 16 March 2023. https://www.lrb.co.uk/the-paper/v45/n06/lorna-finlayson/diary
  5. Heather E. Campbell, 2019, "JPAE at 25: Looking back and moving forward on teaching evaluations," Journal of Public Affairs Education, Vol. 25, Issue 1, pp. 23-29, DOI: https://doi.org/10.1080/15236803.2018.1558823
  6. London Review of Books, Vol. 45, No. 5, 2 March 2023. https://www.lrb.co.uk/the-paper/v45/n05/william-davies/the-reaction-economy
  7. London Review of Books, Vol. 45, No. 6, 16 March 2023. https://www.lrb.co.uk/the-paper/v45/n06/john-lanchester/putting-the-silicon-in-silicon-valley
  8. See "The Revolutionary Pacifism of A.J. Muste: On the Backgrounds of the Pacific War," in Noam Chomsky, American Power and the New Mandarins: Historical and Political Essays, Pantheon, 1969, pp. 159-220.
