([personal profile] rivkat Jun. 17th, 2024 12:57 pm)
Sarah Igo, The Known Citizen: A History of Privacy in Modern America
Most interesting in its earlier chapters; a cultural history of how Americans talked about privacy and the varied things they feared were destroying it. “Women and sexual minorities were presumed to have a lesser claim on privacy than heterosexual men, and as a result they came first to the recognition that altering privacy’s terms through disclosure and confession might be the path to a more inclusive public sphere.” At the same time, white middle-class people were the ones who felt the most entitlement to complain about privacy violations, which could make privacy seem like a “bourgeois” preoccupation. And affluent citizens often supported proposals for mandatory fingerprinting in the 1910s/20s; presumably they thought that the visibility imposed by a fingerprint registry wouldn’t be used against them.

An entitlement to privacy can be part of citizenship, but so can public recognition; people often like to display as well as to hide. One fascinating example: the rise of the postcard, first authorized for sale in 1873. They were super popular but also not very private.

The most interesting parts for me concerned the period around the introduction of Social Security. “During World War II, the citizenship requirement for working in defense industries uncovered the fact that a full one-third of Americans of working age had no proof of birth, with rural African Americans and southwestern Spanish speakers the most poorly documented.” Social Security had the same problem, since old age benefits were tied to the day a worker turned 65. But when was that?

Republicans agitated against Social Security, arguing that it would regiment Americans and make everyone wear Social Security dog tags. But this didn’t work, in part because the cards were more similar to department store credit cards, which were already in use. Indeed, many Americans cared more about keeping information from their employers than from the government; “women and Jews were especially reluctant” to disclose their Social Security forms to employers, as employers quickly demanded, “because they have falsified their age to their employers or because they are married women representing to be single in order to retain their positions”; Jews had changed the names under which they worked to avoid antisemitism. And information about previous work, including union affiliation, could be found from a work history on file. The nascent SSA did what it could to solve these problems: it allowed workers to submit separate forms and promised that information on a worker-submitted form would override that on an employer-held form. Still, “opportunistic companies were circulating their own official-looking forms demanding data from their employees—including the worker’s nationality, years of residence in the United States, religious background, educational level, home ownership, number of dependents, relatives employed in the same plant, and political and trade union affiliations.” Social Security offices were also flooded with requests for information from non-employers: “[w]ives seeking their husbands, mothers looking for lost children, sons who strayed away, war veterans in search of former buddies,” and more. To protect the system, the Board refused all requests, though for a short time it forwarded some inquiries to the individual concerned. The reason for interest, including from draft boards, was obvious: a Social Security number “was often the best, and sometimes the only, way to know where an individual lived, if he was drawing a paycheck, or whether he was alive at all.”

It was fascinating to learn that workers took steps to prevent employers from finding out the information on the form, but that people were fairly casual about the SSN itself. “Throughout the 1940s, for example, specific individuals’ SSNs were routinely printed in the newspaper without raising any hackles.” Radio stations in the 1950s did SSN-based contests. Some people bought special “protection” packages from private sources—not identity theft protection, but engraved metal versions of the fragile paper cards. Others (before the revelations of the concentration camps) got their numbers tattooed on themselves.

The Black press promoted the citizenship-affirming benefits of the SSN precisely because most Black workers weren’t covered: being known to the government was therefore a kind of privilege.

In the 1950s, public debate centered on invasion of private minds—by marketers and employers particularly, but also by general society. “The puzzle of postwar privacy, as well as Cold War-era individualism, was that the person herself seemed porous, her perimeter unfixed.” Betty Friedan criticized the “open plan” design of the suburban home as destroying a woman’s privacy and enabling her surveillance; she “could forget her own identity in these noisy open-plan houses.” No longer was the home a place of privacy—there was even debate about the popular “picture window,” which exposed people’s lives to their neighbors, allowing them to check whether you were keeping up or ahead in conspicuous consumption (and what was wrong with you if you didn’t want to do that?).

Public intellectuals were also concerned about media penetration: “[I]n some cases, the very privacy allotted to postwar middle-class youth, now often housed in their own separate bedrooms, facilitated the incursion.” And they were very worried about what advertisers knew about how customers thought. Conservatives were also concerned with schools’ incursions into the psychological lives of young people, fearing loss of parental control, which sounds a lot like the present day. “Privacy” was a valuable rhetorical resource in the fights against psychological testing—and desegregation.

Personality tests deployed by schools and employers offered the promise—false or not—of giving institutions knowledge about people that the people themselves didn’t have, and that was more valuable than their actual activities. Secret homosexual? That could be ferreted out. William Whyte’s The Organization Man (1956) advised faking normality: “When in doubt,” he counseled, “repeat to yourself: I loved my father and my mother, but my father a little bit more. I like things pretty much the way they are. I never worry much about anything. I don’t care for books or music much. I love my wife and children. I don’t let them get in the way of company work.”

Igo also argued that Griswold v. Connecticut, the now-imperiled case establishing a right to contraception, fit into “a Cold War and anti-totalitarian frame, making it a key symbolic statement of American commitment to a private sphere free from state interference.” Privacy morphed into a concept related to sex and abortion, while “[t]he intrusions of community life, advertising and marketing, school and employment testing” didn’t cross into legal discourse. (Which was good for marketers, who worried in the Journal of Marketing that privacy rights might threaten their ability to do research.) On the other hand, some concern for “the privacy and self-respect of welfare recipients” did do so.

Later, in the 60s and 70s, regulators acted on worries about credit and educational records, but not in the sense of limiting their existence—instead, they accepted that citizens would leave massive data trails but provided rules about who could and couldn’t access them, and some error-correction mechanisms. In FERPA, for example, regulating educational records, “[a] provision that would have required parental consent before any psychological test or behavioral inquiry could be conducted did not survive in the final legislation. But the long list of materials that parents would now have a right to review told its own story about the trove of data schools housed.”

Further along, citizens created their own monitoring mechanisms—gated communities, caller ID, video surveillance, etc.—which Igo argues was driven at least in part by privacy concerns and “from the sense that citizens were on their own in facing them.” But upper- and middle-class homeowners also wanted lots of insight into others’ affairs. Igo suggests that the cultural cachet of surveillance, voyeurism, and extracting confessions was such that people committed to producing them for themselves, without waiting for the government, in everything from investigative reportage to the predecessors of reality TV. “Politicians as well as homosexual and reclusive celebrities were among those affected by a new standard of personal disclosure linked to the citizenry’s right to know.” One response was to frame “coming out” as a rejection of stigma, not just a matter of privacy. ACT UP’s slogan, “Silence = Death,” suggested that “language, discourse, public manifestations, and the production of identity are necessary weapons of defense in a contemporary strategy of gay survival.”

Kelly Weinersmith & Zach Weinersmith, A City on Mars: Can we settle space, should we settle space, and have we really thought this through?
Entertaining enough, though it gets a bit repetitive, since the answers are “not at all presently,” “probably not,” and “definitely not.” It’s a deliberate answer to boosters: “reading about space settlement today is kind of like reading about what quantity of beer is safe to drink in a world where all the relevant books are written by breweries.” Not only are they worried about military conflicts over space resources, but space is a really dangerous place; we can barely make the necessary high-tech supplies on Earth, and being self-sustaining would require incredible amounts of energy and technical innovation. And that’s before you get into the human factors: a company town in a “poisonous hellscape” six months away from any alternatives is not a recipe for successful human interactions. For five generations of a self-sustaining population, you’d want about thirty thousand people, and even that is low. “[T]he most autarkic countries on Earth have much more than 1 million people, are not the most economically desirable places on Earth, and incidentally would both like to be less autarkic.” But even that understates the challenge, given the resource constraints. “Even if you have 99 percent reuse of something like water, that loss of 1 percent mass adds up to 40 percent of your mass over fifty years.”
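(A quick check of that last figure, assuming the 1 percent loss compounds annually: $0.99^{50} \approx 0.605$, so after fifty years only about 60 percent of the original mass remains, i.e. a loss of roughly 40 percent.)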

Ultimately, though, they’re really worried about governance. Going off Earth might just give us more ways to destroy ourselves, up to and including throwing rocks down the gravity well.


Xaq Frohlich, From Label to Table: Regulating Food in America in the Information Age
This is really more how we got from table to label—that is, how we started relying on ingredients instead of food, and disclosure instead of controlling what people put in their bodies, allowing us to blame each other for our choices while still having the state involved in them. Also, understandably, the FDA focuses on big players who have big impacts on national markets and many consumers; but the corollary is that its rules can discriminate against small operators. This played out, for example, in the history of “organic” regulations.

Frohlich identifies the 1970s as the real start of the slide into labeling/caveat emptor. Instead of undernourished poor consumers, underinformed (and poor) consumers became the regulatory targets. Thus, regulators “focused on the new challenges of the affluent society and need for novel solutions like food labeling.” At the same time, suspicion of government made substantive regulations difficult; it was popular to consider the FDA as needlessly paternalistic. Instead of defining “standards of identity” for things like ice cream, the nutrition information label would put all the ingredients on the label, diminishing the need for “punitive” labels like “imitation cream cheese.” This gave a boost to “engineered” foods relative to whole foods, especially in response to nutritional fads.

“If the government named a food as healthy or unhealthy, industry would argue it was an unfair attack; if, however, it named a nutrient, it was just information. This helps explain the lack of political pushback against the FDA’s new labeling rules: Everyone could support a public campaign for more nutrition information. It was only when the FDA sought to tell consumers what they ought to do with that information that the politics crept back in.” Thus, the FDA inclined to “nutritionism,” framing food as merely the sum of its (identified) nutrients.

As a result, many Americans consulted nutrition labels at least occasionally, but with mixed results. One risk: consumers generalize health claims, so that “low cholesterol” becomes “healthy.” FDA’s congressionally-mandated lesser regulation of supplements compared to food also encouraged producers to push the boundaries: supplement sellers made drug-like claims, but carefully framed them to avoid drug regulation because of the strict US definition of “drug.” Supplement makers can make “structure/function” claims, linking specific nutrients with explicit or implied claims for physiological effect (e.g., “calcium builds strong bones” or “fiber maintains bowel regularity”), but not disease treatment claims—but it’s often pretty easy to reframe a disease claim as a structure/function claim, as you can imagine.

I’m not sure I agree with Frohlich’s characterization of the Organic label as “the antithesis of Nutrition Facts” because the former is holistic and considers where the nutrition in food comes from. “Nutrition Facts leads to tinkering with functional foods, while organic evokes the older purity-versus-contamination ideology of the early twentieth-century pure food movements.” Evokes, yes, but given the regulatory compromises made to allow big ag to get in on “organic,” I think it looks that way more than it is that way. As Frohlich points out, a key goal of the federal standards for “organic” was to encourage a market for packaged/prepared/processed organic food. The ultimate standards had nothing to do with labor justice; the certification process was hard for small farms to comply with; and they allowed “organic Twinkies” if the ingredients could be sourced. Instead of a social movement, organic became “a price premium upgrade option at the supermarket,” “a political opt-out for the conscientious consumer.”

Frohlich’s bottom line is that “[l]abeling movements leave some citizens behind…. When labels become opt-out tools from mainstream products, they also create a moral platform for judging less conscientious consumers.” Also, “[h]istory suggests that food companies will game any food labeling system …. Devices like the food label increase, not decrease, consumers’ dependency on expert mediators who determine what goes on or stays off labels.”

Matthew D. Morrison, Blacksound: Making Race and Popular Music in the United States
Morrison sees blacksound as a concept like blackface: a means of appropriating Black culture and also reinforcing racism, here through intellectual property. He quotes bell hooks’ concept of “eating the other”—Blackness is both consumed and commodified, and blacksound extends that even when there is no blackface. “It was through the ritual of blackface—which invoked being possessed by a black ‘other’—that non-black people felt free to express topics from the most taboo to the most common.” Morrison also argues that blackface, and blacksound, were exported from the US to the rest of the world, shaping the globalization of popular culture. And he links the distribution of lynching phonographs in the 1890s, “thought (by a white listener) to be authentic recordings of the racial terror experienced by black people during these lynch mobs,” to the popularization of blues and jazz.
 
Matthew Sears, Sparta and the Commemoration of War
Sparta exists in modern historical memory as the source for “300” and a successful warrior culture. But the records are poor and often deliberately distorted, sometimes by the Spartans themselves. “Sparta fought more wars when it changed from emphasizing the glory of war for the individual or the Spartan state, often to the detriment of sound strategy and tactics, to claiming to fight for freedom and in the service of the Greeks.” The latter rationale “had the effect of making war more, rather than less, likely–and might still do so today.” Moreover, “Sparta was able to be militaristic and isolationist at the same time only because it was quite literally isolated in the southern Peloponnese from the wider world.” Of course, the glorifiers of Sparta are also glorifying slavery and forced labor—and they were found among enslavement-supporting Southern whites, in Nazi education programs, and at the January 2021 insurrection. Ultimately the book has a presentist mission: “Those of us who study ancient history need to be more diligent in illuminating the less savory elements of ancient Sparta, such as extreme violence, brutal exploitation of others, and xenophobia that were not only bad in and of themselves but also damaging to Sparta’s long-term stability, especially when paired with open-ended campaigns of ‘liberation’ abroad.”

From: [personal profile] mecurtin


Have you read any of Bret Devereaux's writing on Sparta? He argues, forcefully, that Sparta militarily ranged from not-so-hot to failure, and that "the glorifiers of Sparta are also glorifying slavery and forced labor" -- and focusing on a minute aristocracy of enslavers, and that's the *point*, that's what Sparta-love is all about.

From: [personal profile] via_ostiense


I'm impressed you finished the Mars book; the writing is engaging & the practical considerations are interesting, but it was so repetitive that I gave up about halfway through.
