Tuesday, July 18, 2017

two centuries ago today, Jane Austen died, a few thoughts on one of Wenatchee The Hatchet's literary heroes

I've written this thought a few times before: the two literary voices I most leaned on as influences for how this blog works have been Jane Austen and Joan Didion.  If I had to boil my literary heroes down to a mere two figures I guess they'd be Dostoevsky and Austen. Fortunately nobody really has to distill their literary inspirations down to a mere two authors, but if I did, those would be the two.

I consider Austen one of the great comedic geniuses of English language literature.  That Austen's characters spend so much time navigating, with success or abject failure, the differences between the formalities of expected public discourse and the private realities of what people really do has been a natural fit for a lot of my literary and regional historical interests. 

Ten years on it actually feels inevitable that Wenatchee The Hatchet would be a Jane Austen fan who has written a lot about the gap between the branding and the interior reality of what used to be called Mars Hill.  I'm also a Christopher Nolan fan and his penchant for telling stories about corrupt and corruptible men who fool themselves into thinking the terrible things they want to do are the right thing to do is another inspiration for the kinds of stuff I love writing about. 

But as a contrast to the dudely dude chest-thumpery of Mark Driscoll and the bros who admire him, it would be difficult to find a literary figure who might be more antithetical in style and guiding ethic than, say, Jane Austen. 

I'd planned to write more for this particular day but there's other writing I've been tackling.  So drawing some inspiration from some famous Christian author who's had no problem shamelessly recycling old content ... we can do something like that here.

Here's an old piece from a few years ago that ... somebody ... wrote about Jane Austen's most famous literary work back in 2013. 

With a few years between publication and the present, it's evident to anyone who has read the book that Austen regarded the mercenary pragmatism of the Collins marriage as less than ideal.  Yet Charlotte's correction to her friend Lizzy was to say that not everyone had the beauty and brains to have the luxury of shooting down a suitor on the assumption that another offer was just around the corner.  Lizzy could afford that, Charlotte could not.  Yet by novel's end it was abundantly clear that we weren't supposed to regard the stupid horndogs Wickham and Lydia as having married for particularly good reasons.  It would be a bit tricky to assert that Jane Austen was a feminist or a romantic/progressive of any of the stripes we see in 21st century American culture; but it was relatively clear she believed that a marriage needed to be more than "just" a business concern and also more than just the fire of the loins.  Her ideals regarding romance and companionate marriage may have become so notorious that her satires of the self-aggrandizing and entitled nature of the aristocracy can be all but forgotten.

Anyway ...

https://mereorthodoxy.com/romance-in-pride-and-prejudice-sometimes-we-settle/
Romance in Pride and Prejudice: Sometimes, We Settle
February 18, 2013

It is axiomatic that an artist’s work will be admired and disdained for a single set of qualities. Some admire the breadth and passion of Beethoven while others find his stamina and pathos tedious. Some admire the precision and pacing of Kubrick’s films while others find them pretentious.  Jane Austen is no exception; her longevity is like that of any other significant artist. The defenders and detractors never stop having their arguments about the worth of her work.

It may be worth revisiting Pride & Prejudice, which is two hundred years old this year, to consider what distinguishes her romances from contemporary romances. After all, Elizabeth Bennet is not the kind of character we can imagine will be convincingly portrayed by a Meg Ryan or a Kate Hudson, or even a Julia Roberts. Lizzy and Jane are not heroines who lend themselves to being championed by America’s sweethearts in just about any generation of film.

Arguably, Noah Berlatsky, writing for the Atlantic, has summed up the paradoxical appeal of Austen’s work: “She has to be one of the least romantic writers ever to write romance.”
Austen’s tales of romance may endure because she put so little stock in romance as we tend to define it. In an Austen novel, career advancement, real estate values, the size of an entailment, and the social and fiscal connections that come with marriage all matter. If that seems unappealing it is because we can’t conceive of a culture in which a marriage could be arranged to benefit clans rather than as the culmination of a quest for a “soulmate.” We also live in a culture which, in some sense, denies the inevitability of death.  And so Austen’s tales of matrimony and negotiation don’t make sense to us because they are often, as Berlatsky put it, as “small as life.”  Americans want life to be bigger and grander in every respect than a life could be in Jane Austen’s time.

But a title like Pride & Prejudice suggests that however domestic the tale, Austen’s themes are hardly small. Just as stories about war are rarely “just” about war, Austen’s tales of romance are not “just” stories of people who marry.  The title tips us off to character flaws before we’ve even opened the book. Though Elizabeth and Darcy are not imbued with a social or symbolic significance as apocalyptic as Dostoevsky’s characters, they do represent ways of living life. That Austen is quotidian where Dostoevsky is apocalyptic, that Austen is mundane where Dostoevsky is grotesque hardly means she was not writing about ideas. Austen had an eye for the mundane details with which philosophies of life must contend on a daily basis. Dostoevsky wrote about the personal and social cataclysms that philosophies create when untempered by other ideals.  But it is the dry domesticity of Austen’s narrative world and the long term decisions made within it that give her characters’ decisions weight.  Irreversible life-altering decisions hinge on a person’s ability or inability to make the right decision after observing mundane details.

The marriages that take place in the novel are made by people following ideals (Elizabeth and Darcy), altruistic affection (Jane and Bingley), pragmatism (Charlotte and Collins), and visceral chemistry (Lydia and Wickham).  While it is obvious that Austen did not endorse the latter pairings, it is equally clear she shows us the latter two couples are not at all disappointed with their respective catches. Charlotte Lucas isn’t ruined by settling for Mr. Collins any more than Lydia is unhappy to be married to Wickham.  Charlotte realistically assessed herself, knew marriage to be a sure defense against poverty and loneliness, and pragmatically accepted the best offer she had. Charlotte could tell Lizzy that Lizzy had the luxury of being beautiful enough and clever enough to actually turn down proposals. But Charlotte had neither and so went with her best option. Austen’s stories are stories in which social, economic, and sexual capital are all part of a calculation for a plausible pairing as a business decision, not merely a quest for true love. But even Lydia, silly as she is, never seems unhappy with Wickham or the support they get from the Darcys by book’s end.  They just continue as they do.

Today we may recognize that the ideal is Lizzy and Darcy, but when our culture advises us to settle we prefer to settle for Lydia and Wickham rather than Charlotte and Collins.  The love that bursts forth like a fire, demolishing property and removing clothing, was the sort that Austen made fun of.  Yet in the 21st century that sort of attraction is so taken for granted that director Joe Wright determined that Austen was too discreet to tell us the real reason Lizzy and Darcy fall for each other. So it turns out for a contemporary film-maker to sell himself on Lizzy and Darcy he has to believe they were drawn to each other like Lydia and Wickham. They are now heroes for sublimating their desire more decorously than others.  But that sort of erotic obsession goes past the point of even Lydia and Wickham to become the obsession of Dmitry Karamazov with Grushenka, only benefiting from the refinement of English manners.

In Austen’s actual novel, Lizzy and Darcy must overcome their own character flaws to discover they love each other.  In Wright’s film they simply need to contain themselves long enough to get social permission to do what they soon realized they wanted to do.  Cinema is full of tales where transgressive love is prized as a philosophical statement. “Theirs was a forbidden love” has been the clarion call to more than one or two made-for-TV dramas. If the sparks create a fire hot enough then the heat was worth it.  This ethos is so prized it was shoe-horned into a Jane Austen adaptation. Apparently without that spark we won’t believe her story in our time.

Even evangelicals who claim that we should not be like the world still seem to want that devouring spark. When evangelical speakers and writers say that a marriage founded on anything but mutual love and attraction is going to fail, this indicates ignorance not merely of literature but of history. Negotiated lives together have happened in all sorts of ways. Because the spark of mutual sexual attraction inevitably wanes, friendship is important.

So far, so obvious. But great art, music, and literature help us avoid underestimating the obvious. Charlotte and Mr. Collins may not know the rapturous heights of mutual affection that Lizzy and Darcy know, but neither do they feel the imagined betrayals or wounded egos with bitterness and shame.  For Mr. and Mrs. Collins the whole thing was to make sure the clergyman had his wife and the wife had her home. Having asked for no more than that and having found it, that was that. Though they may seem insultingly insular and provincial to us, they found their happy ending.

That Austen showed us happy endings even for those who settled in marriage is a reason for her greatness. It is because we know the marriages in that time and place could not be simply annulled that Charlotte’s decision bears a hint of tragedy. She settled, and she settled in a way we hope never to do. And by novel’s end Lydia and Wickham also settled, in their own way. In their world there can be no seven-year itch in which they reconsider their choices.  But both couples seem happy, if not wise, and by contemporary American romantic tropes they may be sitting in a church pew near you or me.

Sunday, July 16, 2017

over at Aeon, a piece surveying how usury stopped being thought of as sinful in the Judeo-Christian milieu and became respectable finance

Much of the time, pieces published at Aeon can be unconvincing and even insanely stupid.  But the price of promulgating think-pieces is that sometimes the think-pieces have dumb ideas, like the idea that children should be redistributed by the state across all racial lines so as to ensure racism never happens again, as though the totalitarian regime that would be necessary to enforce such a policy over against "genetic narcissism" would only be a benefit to the human race. 

But sometimes there are useful or at least interesting surveys and to such a little survey we turn:

https://aeon.co/essays/how-did-usury-stop-being-a-sin-and-become-respectable-finance



...
 
In Debt: The First 5,000 Years (2011), the anthropologist David Graeber argues that before the advent of money, economic life within a community was a web of mutual debts. People did not behave as self-interested individuals – at least not from the perspective of a single transaction; rather, they would share food, clothes and luxuries, and trust that their peers would repay the favour in return. When we consider these origins of debt and credit – as a system of mutual aid between people who trust each other – it’s no surprise that so many cultures viewed charging interest as morally wrong.
 
...
 
Meanwhile, the Catholic Church played its own part in sowing the seeds of a change of attitude. In the 13th century, it introduced the concept of Purgatory – a place that had no basis in scripture but did offer some reassurance to anyone committing the sin of usury each day. ‘Purgatory was just one of the complicitous winks that Christianity sent the usurer’s way,’ wrote the historian Jacques Le Goff in Your Money or Your Life: Economy and Religion in the Middle Ages (1990). ‘The hope of escaping Hell, thanks to Purgatory, permitted the usurer to propel the economy and society of the 13th century ahead towards capitalism.’

and here's more along those lines.  Indulgences do get a mention. 

over at The Imaginative Conservative, an author rues the day Star Wars ruined arts culture by celebrating distraction, skips over the monomyth and the possibility that Star Wars franchises may be the distillation of the total work of art sought by German and French avant garde ideals

I admit I tend to identify as moderately conservative about religion and politics.  By moderate I mean to say I'm a dour Presbyterian Calvinist who thinks the human condition is fraught with human frailties so stark that I find myself thinking the Frankfurt school authors were too optimistic about the human condition in modern technocratic societies.  And I've been reading arts history/art criticism books by authors who write for Thesis Eleven ... .  My commitment is more to Christian doctrine and teaching than to the left or right on the political spectrum.   My views may be an uneasy grab bag of Edmund Burke, Jacques Ellul and Roger Williams ... . Throw in Dostoevsky, Kafka, Conrad, some Bonhoeffer and Brunner and I guess that's where I'm at. 

All of which is a set-up for the observation that when I see a title like "The Imaginative Conservative" I can't help but wonder whether "The Imaginative Reactionary" might not be a synonymous title.  Take this recent Sean Fitzpatrick piece that rues the day George Lucas' franchise exploded into the Cineplex. 

http://www.theimaginativeconservative.org/2017/07/star-wars-sean-fitzpatrick.html

Forty years ago this summer—what seems to many a long time ago in a galaxy far, far away—Star Wars was released, and America was sold into the slavery of pop-culture merchandising. With this era-changing movie, the American cinematic focus shifted away from sophisticated dramas—such as The Godfather, One Flew Over the Cuckoo’s Nest, and Taxi Driver—back to a pre-60s golden-age trope where exhibitionism and carnival capers in motion pictures made money. Some say that George Lucas effected a return to what the movies were meant to be, while others argue that his swashbuckling “space opera” was a backslide from which cinema has never recovered. In either case, Star Wars was the flagship film to sell itself as a franchise, driven and dominated by mass marketing, special effects, action sequences, and cornball dialogue. Gaining the status of highest-grossing film of all time, Star Wars became the epitome of the summer blockbuster, recasting movies as commercial events that cater to the lowest common denominator of the movie-going public. The effects of Star Wars run deep in the entertainment industry and have made explosive, eye-candy spectacle an idol of distraction for many whose lives are so meaningless that distraction is a crucial drug.

Popcorn flicks like Star Wars are central, even integral, to American leisure—which is arresting if Josef Pieper’s notion about the basis of culture is correct. Where would society be without its screens, its celebrities, and its space sagas? It is rare to walk into a home that does not have a television dominating, or even enshrining, its living room. It is almost a matter of principle akin to a religious obligation in the civilian temples of Americanism. The parallels between the television and the tabernacle show how deft the forces of darkness are at leading man from the truth by imitating it. Leaving aside the comparisons that exist between the local church and the local theater, entertainment has become something like a new religion, a ritual for people to fill the voids in their lives—only entertainment is fast becoming nothing more than an addiction to nothingness, a placebo against the emptiness of the times. In these ways, modern entertainment is not simply distorting the elements of religion, but actually commandeering the role of religion in human society. A new idol has risen for the idle neo-pagans, and it is the idolatry of distraction.

***
For an author to take this stance about Star Wars while singing the praises of Dickens or Poe or Arthur Conan Doyle invites a question as to what it is about the pulp fiction of earlier centuries that let it become part of a literary canon in our more recent era.

The Godfather, no less than Jaws, was founded on pulp idioms and popular fiction. 

The process by which comprehensive multi-media branded merchandising and marketing took over was not all accomplished in 1977.  The process started then, but the deregulated industry practices that allowed children to be exposed to films with tie-in toys and comics and novelizations and cartoons more properly erupted in the Reagan years.  It's not a surprise if a contributor to The Imaginative Conservative would like to think that the beginning of the doom of pop culture enslavement happened during the Carter administration, but that seems daft. 

Any accounting of Star Wars that ignores Campbell's monomyth is an accounting that isn't really worth taking seriously. 

It's like a whole bunch of people don't get what European avant garde theorists were proposing centuries ago about the role the arts could play in formulating a new mythological substitute for Christian religion.  It's not that pop culture somehow was "allowed" to commandeer the cultic elements of religions.  Entertainment figures explicitly set out to create cults around their franchises.  Even an atheist like Joss Whedon can talk about how great it was, twenty years later, that Buffy the Vampire Slayer became a show with the cult following it has. Perhaps he hopes a comparable cult following can let him keep playing with Firefly stuff for a while.

Now David Roberts has written three books that can be pretty opaque, but he proposed, at length, that the ideal of the total work of art as precursor of and catalyst for the ideal society moved from Germany and France to the United States.  Others have mentioned this, too, but the idea I'm mulling over is that if the European avant garde tended to fixate on either the utopian past or the utopian future, the American innovation in the later 20th century is more inclusive.  The futurist tech of the Star Wars franchise famously took place a long time ago in a galaxy far, far away.  Ancient future.  Something Roberts discussed at length in his books is how the Germans venerated Athens and the French venerated Sparta, and how theorists and philosophers imagined that the Athenian art religion was a unified celebration in art of a unified society.

Well, okay, let's suppose that the American approach to unified art or the total work of art or ... the brand ... is also a celebration of an idealized status quo.  That could mean the dreams of German philosophers and avant garde artists would have been most realized in American franchises like Star Trek, Star Wars, My Little Pony, Transformers, G. I. Joe and so on.

But that can't be right.  It's supposed to be Wagner's operas and the literature of Mallarme and Schiller and Goethe and ... it's not supposed to be Optimus Prime and Twilight Sparkle or Captain Kirk and Luke Skywalker. 

Just because the religious or cultic elements of pop culture don't adhere to an old conservative nationalistic or ethnic demographic does not necessarily make them any less functionally religious.  In an era where conservatives write about moralistic therapeutic deism why wouldn't they spot that this is central to a Star Wars spirituality?  A religion of universal humanity, human reason and art doesn't need a deity to be functional.  Consider the cult of Star Trek these last fifty years. What middlebrow arts critics find so loathsome about mass and pop cultural franchises is that they not only make no bones about being explicitly and directly philosophical, their moralizing is front and center.  Superheroes explicitly insist upon telling us who is and isn't a hero and why.  It's not like Woody Allen films where the protagonist is an author stand-in or other kinds of films that are open to the interpretations of suitable cognoscenti--no, the Star Wars cinematic universe doesn't give you the luxury of supposing Palpatine is the hero of the story.  You're not supposed to imagine that perhaps the Empire has some worthy goals.  Maybe someone will write a funny piece at The Federalist making such a case, replete with the line "The Empire is back, baby, and they're gonna show these hippies who's boss!" 

Fans of the highbrow from the left and the right will likely never stop wringing their hands that too many people derive too much pleasure from too much pulp fiction. 

Since I'm a moderately conservative Presbyterian rather than a really conservative Catholic I suppose I may never land in the same spot as the sort of person who writes what's quoted above at The Imaginative Conservative.

The idea that a franchise like Transformers could reflect a not-even-latent desire to see the world paradoxically re-enchanted through technology is probably not going to be on the table.  That's probably because the kinds of folks who write for The Imaginative Conservative are going to be soooo busy attacking any idea that could possibly be associated with Marxists as incapable of corresponding to ideas that people with traditional or conservative Christian beliefs could agree with.   Actually ... there's a piece at Aeon I want to link to about the long history of how Christendom in the West went from saying usury was straight-up evil to defending it ...

over at Vulture a case that Tony Stark is the "real" villain of Spiderman: Homecoming with the bromide that the Vulture is a Trump voter ... but ...

http://www.vulture.com/2017/07/spider-man-homecoming-is-iron-man-the-real-villain.html?wpsrc=nyma


...
 
Tony Stark’s always been something of a lovable rogue, and he’s accomplished many heroic things in other films. Here, however, his actions seem more sinister when he’s dealing with children — and as it turns out, when he’s running Stark Industries, which, in Homecoming, seems to operate on the shady end of the spectrum. In the beginning of the film, we learn that the business of cleaning up the wreckage from the Avengers’ New York battles has been given over to the Department of Damage Control, which, as Darren Franich pointed out in EW, is co-financed by Tony Stark and seems like a fairly malevolent force, despite the fact that national treasure Tyne Daly is its main spokesperson. DDC forces out local contractors like Michael Keaton’s Adrian Toomes, giving it the monopoly on superhero clean-ups. This might be designed to prevent dangerous alien tech from slipping into the hands of the unready (even though Toomes and his pals manage to steal it anyway), but it also ensures that Tony Stark has a vertical monopoly on superhuman activity: The battles use Stark technology; the clean-up crews are Stark branded; the PR is managed through Pepper Potts. Stark’s superpower, after all, is that he’s smart and rich. He lives in a world with few consequences. Money solves most of his problems; his monopolies prevent him from directly answering to the public. Who is he to teach a 15-year-old personal responsibility?

It’s unclear whether or how Stark Industries turns a profit, but its actions, as Homecoming reveals, have forced Americans out of their jobs. Case in point: Adrian Toomes, who offers the most compelling critique of Stark before he decides to become the evil Vulture. Toomes starts out in salvaging, gets forced out of his job by the Department of Damage Control, and then turns to a life of crime. As he faces off against Spider-Man, Keaton also gives the film a rare jolt of class consciousness as he tells Peter, “The rich and the powerful, like Stark, they don’t care about us.” The movie’s quick to supply examples of Toomes’s hypocrisy; as Vulture’s own Abe Riesman pointed out, he’s something akin to a monstrous vision of a Trump voter, furious at the elites of the world but unable to acknowledge his own relative privilege, as exemplified by a modernist home with way too many windows. [emphasis added]

 
The Vulture wears a bird suit, and goes from murder-curious to murderous after accidentally killing Logan Marshall-Green, but that doesn’t mean we should ignore his ideas. In the long term, Tony Stark’s actions do hurt the little guy. He’s like a Silicon Valley CEO who, after disrupting the economy with one good product, doesn’t acknowledge the evil he’s produced as a consequence. Tony Stark and his compatriots have seized control of a significant portion of the world’s power apparatus, and they are forcing out the ordinary man. Does this make Iron Man the villain? Marvel movies tend to have villains who intend to do harm, while people who cause damage unintentionally are more redeemable. (See Bucky Barnes in Winter Soldier or Civil War.) Surely, there’s enough evidence in Homecoming to see Toomes as at least a complicated figure, operating in something of a moral gray area.

The thing about the stereotypical Trump voter/alt-right voter is the white nationalist ideas.  Yet ... for anyone who has actually seen Spiderman: Homecoming, the interracial marriage that led to the existence of Peter's crush Liz is really obvious by act 3.  Perhaps journalists wanting to describe the latest Spiderman villain in political terms might want to find some other reference point, since a white guy married to a black woman, committing all his crimes to provide for his family, doesn't exactly fit the mainstream press script about the Trump voter. 

The problem with Toomes isn't that he's a hypocrite.  No Marvel antagonist so far seems more committed to doing everything under the radar and as quietly as possible.  Toomes ends up killing Shocker 1 after an incident where Shocker 1 insists on showing off high-powered weaponry in a suburb without regard for collateral damage.  Underground arms dealer though he is, this is still an Adrian Toomes who can regard Mac Gargan with contempt, as someone he wouldn't even have dealt with if Spiderman hadn't messed up his other business deals.  If people want to cast Toomes in some kind of political sense, the idea that this Adrian Toomes is a Trump voter seems a bit much.  Maybe he could be likened to a Reagan Democrat ...

But his criticism of Stark and the Department of Damage Control (subtle name, as always) is that what Stark and company benefit from is the kind of crony monopolistic capitalism in which the haves get to have more and those who don't get completely sidelined.  How do we know that Adrian Toomes, if he were magically a real person, wouldn't have voted for Sanders?  He might even have voted for Clinton, whose record as a hawk doesn't seem in any contradiction to the Vulture's family-driven pragmatism.  Had Trump not won would journalists even think to interpret the Vulture's activities and motives in Trump-voter terms?  Not ... very ... likely.  Last year some tried to describe the antagonist of the Magnificent Seven remake of a remake in Trumpian terms even though the production was under way (i.e. already scripted) before Trump's candidacy was solidified.  But there seems to be this penchant in the entertainment industry for a kind of political punditry recency bias; X or Y is imputed to a pop culture event that may have taken years to come together as though it were somehow consciously anticipating or responding to current events.  That makes sense if we're talking about a show like South Park where Parker and Stone are obviously reacting within a few weeks to current events.

When Parker confronts Toomes at the end, Toomes' objection is that he is, in fact, pretty much doing the same thing that got the Starks their empire of wealth: selling weapons to killers.   Toomes' problem isn't hypocrisy so much as a refusal to concede that the difference between what the Starks, Tony or Howard, did and what he's been doing is the formal sanction granted by the state.  The state, in the form of the Department of Damage Control, deprived him of his job and his contract to clean up the post-Avengers 1 damage. He, in turn, steals from Damage Control to refurbish alien tech into weapons and tools that he sells on the black market.   He's still a criminal, but a criminal with an understandable motive.

If the studios want to even bother with a Sinister Six film they can bring back Keaton as the Vulture and maybe bring back Molina as Doc Ock.  One of the fun things about the classic Spiderman villains is that since they're older guys, older actors could step into the roles.  Odds are pretty decent that the Osborn stuff has been too badly played out to be worth continuing.

Saturday, July 15, 2017

links for the weekend--"is classical music journalism in decline?" is still the rhetorical question du jour in journalism, which might invite a different rhetorical question: does an arts scene that can't be monetized even exist for arts journalism?

Here's an article that asks whether classical music journalism is fading into silence, one of those rhetorical-question-as-title articles we inevitably see on this subject.

https://www.sfcv.org/article/diminuendo-is-classical-music-journalism-fading-to-silence
...
As arts coverage has shifted from major to minor, the diminution of print coverage of classical music events takes its toll. As in Haydn’s “Farewell” Symphony, the players, snuffing out their candles, slowly exit the stage one by one until two violins play the final pianissimo adagio.

In 2016, reports of newspapers eliminating arts journalists through layoffs and buyouts seem more mind-numbing than shocking. Since the beginning of the millennium, legacy media has shed jobs across the board. In 2007-08, a quarter of all U.S. jobs in arts journalism were eliminated. By 2011, the John S. and James L. Knight Foundation’s then vice president for arts Dennis Scholl estimated that as many as 50 percent of the local arts journalism jobs in America had vanished. In July 2015, The American Society of News Editors reported its first double-digit decline—10.4 percent— in all newsroom jobs since the Great Recession. And anecdotally, the loss of arts journalists, especially critics, outpaces those in other departments. “I can count the number of full-time classical music critics on both hands,” says Douglas McLennan, the editor of ArtsJournal.
...
Online journalism has entered the wild west when it comes to monetization. Traditional broadsheets are forced to compete for clicks with The Daily Beast, Huffington Post, Salon, Slate, and BuzzFeed and with the distillation of newsbites on social media such as Facebook and Twitter. “There used to be a media that was top-down, but that has changed rapidly for the news industry,” says Michael Zwiebach, the senior editor of San Francisco Classical Voice since 2009. “People are not addicted to, nor do they trust, one source of information. For a lot of sources like the San Francisco Chronicle and Boston Globe, there has been a free-for-all competition for eyeballs. They take stories they think will bring in the most hits,” which in the culture category is most likely television, movies, pop music, or an occasional blockbuster like Hamilton.

And culture that isn’t easily monetized gets ignored. “At one time when there were classical recordings, there was a revenue and economic stream for classical music and the opera world that perpetuated media to cover them because it was also a business and industry,” says Peter B. Carzasty, the founder of the arts consulting firm Geah, Ltd. The survival struggle of media institutions was only exacerbated by the Great Recession of 2008–09. [emphasis added]

Increasingly shortened attention spans have driven hunger for quick internet news. The 1982 launch of USA Today, specifically designed for a generation raised on television and whose motto was, “an economy of words, a wealth of information,” predated internet trollers who can’t wade through anything over 140 characters, much less a 1000-word review.

It might be impossible to overstate the significance of the phrase in bold--one might dare to say that one of the problems for the classical music scene is that, in journalistic terms, we could ask whether culture that isn't easily monetized even exists to begin with.  Take one of my pet topics for discussion here at this blog over the last twelve years, polyphonic music composed for the guitar.  Has there been a single headline for that topic?  Nikita Koshkin's 24 preludes and fugues got published this year, at last (!), and yet I'm not sure there has been any coverage of the cycle to date. 

On the one hand, if there were a more robust or healthier arts journalism scene perhaps this cycle could get the coverage I think it should get.  On the other hand, the knowledge of the guitar and its literature, let alone the knowledge necessary to establish how one could assess polyphonic music, seems to be more of a rarity in the classical guitar scene than maybe it should be.  When guitarists can so often say the instrument is not really suited to idioms like the sonata and fugue, does the fact that Igor Rekhin, Nikita Koshkin and German Dzhaparidze have all composed cycles of preludes and fugues for solo guitar even register?   You could count the cycle yours truly composed ... if you wanted to ... but this gets back to the aforementioned question as to whether a work of art even exists for arts journalists if it isn't presented to the world in a monetizable format.  The Koshkin and Dzhaparidze cycles are both really good, by the way.  The Rekhin cycle is a bit more mixed for reasons I hope to get to later this year. 

So the laments and concerns about the state of classical music journalism are anything but an abstract or theoretical concern to me.  For that matter, coverage of local religious scenes was often wanting.  Had local coverage been more thorough over the last ten years this blog wouldn't have gained notoriety for discussing what was once Mars Hill.  That's another case where a small sea of information, presented in a way that has been stubbornly free of monetization, built up over time while journalism establishments, for the most part, regarded the stuff as not necessarily existing, excepting maybe a stretch between 2013 and 2014. 

When Sousa warned about the rise of what we now know as the music industry his worry was that amateur musicianship would wither away, and it was the cultures of amateur musicianship he regarded as the lifeblood of musical culture.  We live in an era in which the amateur musical scene is back and perhaps it could even be as robust as it might have been in the pre-music industry era.  But ... I wonder if arts critics and arts journalists have stopped to consider that an explosion of genuinely amateur composition and performance might mean the cultures of monetizable properties their bread and butter has depended on are no longer going to be as much of a thing to be covered and that ... as this article puts it:

There are winners and losers in pop’s attention economy, but most acts fall into the latter category

In an attention market, the haves get more and those who have not might lose even what they have. 

The question of how robust the amateur arts scenes are could be a particularly scary and unknowable/unanswerable question for more than just arts journalists and critics; it is a question that may need to be considered at the level of education.  Are we sure that the arts will die unless academic establishments keep them around?  Overseas there's some controversy about proposed changes in education and a claim that the loss of the arts will be offset by the rise in IT.

‘Arts GCSE decline compensated by rise in IT,’ claims Tory education minister

I don't happen to agree that the arts are the "easy option" for people who couldn't cut it in math or engineering or something like that.  It's not that people who are not disposed toward those things couldn't or wouldn't say that, obviously.  No, I think a concern that has been brewing or erupting here on this side of the Atlantic might provide another reason we can't be entirely sure that an arts education lost will be regarded as tragic by non-artists--it's the whole canon-wars question as to what should be in the academic canon of the arts that people have to learn and why.  In European contexts where there's potentially more canonical certainty about German or French or English or Irish art that whole array of topics might be moot, but in the United States the question as to why arts instruction (if we're going to have that and keep having that) might favor a Euro-centric canon is a lively one.  Whether from the left or the right, perhaps the blind spot journalists and academics have is that even if we no longer have the possibility of a "folk culture" in the age of online videos and digital reproduction, we may still be witnessing a resurgent culture of amateur-driven arts activity.  Or not, that's the thing, it might be hard to gauge. 

Because what journalists and academics often like to rule out is financial success.  Twenty years ago I was eagerly collecting and reading the manga of Rumiko Takahashi, probably best known for Ranma 1/2 and Inu Yasha.  She also made the manga Maison Ikkoku--when a friend suggested her work to me he described her as a kind of Jane Austen of manga.  I was curious about this claim, though at the time I'd read no Jane Austen.  That would come later.  Takahashi's work may be popular enough that some 200 million copies of her manga or anime adapted from her manga are in circulation but ... have you heard of her work?  Recent headlines about someone involved in the translation of the work were ... disappointing to read, but if you don't already know, the less you know the better for now.

The theme at this point is the linkage between monetization of art and who gets recognition.  This summer's biggest superhero blockbuster may remain an instructive case in point but we don't even have to stick to that ... .

Terry Teachout wrote recently about how it only took him somewhere between five and six minutes to map out a season of theater programs scripted entirely by women.  Teachout writes for a publication that ... at least for a majority of people who live here in Seattle, would not be identifiable as "left".  Commentary magazine is also not particularly "left".  But if Teachout could map out a season of plays entirely by women in under ten minutes, that's some context for a recurring set of thinkpieces as to how and why plays by women or other artistic projects helmed by women don't get more exposure.

In the summer of Wonder Woman 2017 this would, at the highest-profile level, seem like a summer to at least keep this topic in public attention. Of course for some authors over at The New Republic, Wonder Woman is just Americanist propaganda. Josephine Livingstone could sympathetically regard the big dumb spectacle of Valerian because even if it's inspired by a comic book at least it's a European comic book rather than an American one.  It seems that for folks at The New Republic or The Imaginative Conservative pulp fiction can be forgiven being what it is if it's mid-20th century European comics or inspires Coppola films ... . 

It's like there's some tacit goldilocks deal where the art can't be TOO mass-produced and TOO popular or it must either not be art or must be propaganda ... but at the same time if it exists in a form that can't be monetized or doesn't make its presence known in a market-force-level way then it doesn't even exist.  From the standpoint of arts journalism as the first draft of arts history, a whole lot of the arts never existed.  This will remain the most likely outcome for a lot of arts out there in spite of the fact that, in theory and potentially also in practice, there's more and more stuff you can get access to now in the arts than ever before. You didn't need to get stuff on Prime day, but with a market event like Prime day it would be relatively easy to go buy music and film and books.  Which is the transition for this weekend to ...

Elsewhere at TNR ... a piece with a sidelong commentary about the big river company:

 ... 
 
Amazon did not come to dominate the way we shop because of its technology. It did so because we let it. Over the past three decades, the U.S. government has permitted corporate giants to take over an ever-increasing share of the economy.
 
Back in the perma-temping, no-medical-benefits-jobs era of the 1990s I heard an IT person connected to a slightly larger-than-average company in the Puget Sound area complain that what Amazon did was get a stranglehold on one-click purchasing.  So I have my doubts that "we let it" is an adequate explanation of the rise of Amazon.  It seems like it's just vague leftist boilerplate.  Even for people who tilt conservative or libertarian, the history of crony capital manipulation of legislation relating to intellectual property is, if not easy to look up, feasible to research. Wouldn't an author who contributes to TNR have more time and interest in proposing how massive corporate interests took time to revise and guide legislation regarding intellectual property, trademark, and licensing?  Well, maybe not? 

The older I get the more I get the impression that one of the problems in contemporary popular art is how much it is hamstrung by the reality that the majority of what we have as popular culture is licensed or trademarked and under copyright.  I don't think the "solution" to this problem is to come up with facile and largely unpersuasive arguments against the legitimacy of copyright.  If arts educators wanted to build a case for why arts education is essential, here's an angle that, by and large, I have never seen anyone put forth in a journalistic context--in light of how restrictive copyright, trademark, and licensing practices are in this new and international arts market, the most compelling reason to preserve an artistic canon of some kind is that by teaching an arts canon that is gloriously public domain, and by exploring the ways in which that canon has influenced and inspired more recent under-copyright art, we can give students ways to cultivate an interest in those arts that are genuinely public domain.  This is something conservatives already want to do on other grounds, as it is, the Western canon and all that. 

It's not like at this point we can even accept at face value the bromide that the Western literary canon is all dead white males.  We're hitting the bicentennial of the death of one of the greatest comedic geniuses in English language literature.  Yes, of course, I'm referring to Jane Austen. I began reading her work shortly after the start of the millennium and I have written here on a number of occasions that as I formulated the tone and literary voice for this blog tackling the history of Mars Hill I made a point of emulating the literary approaches of Jane Austen and Joan Didion.  It happens that I love the former's novels and the latter's non-fiction; it's also the case that I concluded that if there was going to be a kind of anti-Mark Driscoll aesthetic then the counterpoint to Mark Driscoll's camera-loving, stand-up-comic-emulating stage persona, drawing from Chris Rock, John Piper and Douglas Wilson, would be a literary style inspired by Jane Austen and Joan Didion, both of whom have a penchant for a kind of bemused, chilly detachment.  If you read the first line of Pride & Prejudice and do not instantly grasp the nature of its joke then it's just not likely to be your thing.

There's a little piece at the New York Times recently about Jane Austen's literary style and the ideas running through her work:

https://www.nytimes.com/2017/07/06/upshot/the-word-choices-that-explain-why-jane-austen-endures.html?_r=0

It is at the heart of Austen’s work: What is going on behind the veneer that politeness demands? [emphasis added] These distinctive words, word clusters and grammatical constructions highlight her writerly preoccupations: states of mind and feeling, her characters’ unceasing efforts to understand themselves and other people
 
Human nature (together with the operation of time) is the true subject of all novels, even those full of ghosts, pirates, plucky orphans or rides to the guillotine. By omitting the fantastical and dramatic elements that fuel the plots of more conventional novels both of her own time and ours, Austen keeps a laser focus.

With Joan Didion's work one of the threads running through her non-fiction is a musing upon how the stories we tell ourselves to identify ourselves can often be deceptive, how the stories we tell ourselves to share with others often have a self-exonerating motive so that we don't have to consider what our real motives are.  Both women have male (and female) detractors who resent their icy, elitist style.  Granted, and yet what's interesting about the comedic woman is that it seems men who despise women trafficking in humor that pours contempt on people can revel in that sort of humor when practiced by males.  That might be a topic for another post some time later. 

I've written in the past that there are ultimately only two types of humor: you're either laughing with or laughing at.  I had a blog post about this topic way back on August 17, 2013:
A layman makes a case for less humor from the pulpit

Now a writer can do whatever he or she wants, and laughing with and laughing at are options we can all avail ourselves of.  But I have this proposal that the humorists whose work survives manage to find some kind of balance so that the laughing with and laughing at have an equilibrium.  For as often as Austen revels in laughing at her characters she arrives, in time, at resolutions to her stories in which we can laugh with them that things worked out acceptably enough for most people in the end. 

I'd write more for this blog post but I'm incubating some stuff about the newest Spiderman movie. A teaser of where I'm headed with the new MCU Spiderman film goes roughly like this ...  

There's this joke in military cultures that if you break the rules and fail you get a court-martial, and if you break the rules but succeed you get a medal.  That's pretty much the entire MCU in a nutshell, breaking the rules but succeeding.  It's even a punchline in a subplot in the new Spiderman film, where Captain America has done a public service announcement about how breaking the rules never pays off after a gym instructor has joked that by now Cap is probably a war criminal, but, whatever, the state paid for all these educational videos so we gotta use 'em.

FilmCritHulk just went on a tear about the MCU, saying there's this problem with these films: the gap between what the films SAY they are about and what they REVEAL themselves to be about, by how they reward their protagonists in their third acts, is now downright disturbing, and the new Spiderman film illustrates this in ways FilmCritHulk now finds frustrating. For the record, I think FilmCritHulk wrote the best, bar none, English language overview of Hayao Miyazaki's film The Wind Rises I've read.  Sure, I happen to like mine, too, but FCH is conversant enough in film and theory that even when I disagree with the "what" or "why" FCH writes in a way that spurs further thought and conversation.  Which is to say that when FilmCritHulk articulates what Hulk regards as a fatal flaw in every single Marvel Cinematic Universe film, that's something to mull over.  And that problem can be summed up in the aforementioned joke about how breaking the rules only gets you court-martialed if you fail.

While FCH "may" not be conversant enough in things military to formulate an objection in terms of military jokes, that's the beef: the MCU films feature heroes who all break the rules of reasonable/ethical conduct in ways that should get them court-martialed under normal terms, but since they always succeed in the third act they keep getting the medals.  What makes FilmCritHulk's recent complaint about the new Spiderman movie (which FCH does, in fact, like) intriguing is that FCH takes time to demonstrate how and why the Nolan Batman films and Raimi Spiderman films DON'T make the same mistake; Nolan and Raimi gave us heroes who made what they thought was the right and best decision at the time they had to make it, a decision that turned out to be not just a terrible strategic blunder but also, bluntly, morally wrong.  Whether as the result of fear or cowardice or resentful entitlement, Bruce Wayne and Peter Parker are both motivated by the reality that they made decisions that led to the deaths of people they loved.

So some of that has to be saved for the actual piece I'm meaning to write ... .

Enjoy your weekend. 

 

Saturday, July 08, 2017

links for the weekend, on bromides about the changing arts scene and who got what independence on what day

Chris Jones over at the Chicago Tribune proposes that
http://www.chicagotribune.com/entertainment/theater/ct-culture-critics-jones-ae-0702-20170701-column.html

In 2003, the National Endowment for the Arts put out a genuinely surprising report: Audiences — those attending jazz, classical music, opera, dance and theater performances — were in serious decline as a percentage of the adult population of America.

Yet worse, the data implied that all of the effort to diversify those audiences had not worked. The audiences of 2002 looked very much like the audiences of 1982: disproportionately white, affluent, educated, older and female. The NEA tried to put an optimistic spin on its findings (it noted that Sept. 11, 2001, had disrupted lives), but it was still the rare NEA report that actually interests a newsroom like this one.
...

Like the newspapers that employ them, critics grew up with industrialization and urbanization and we were at our peak in the first half of the 20th century, when workers needed savvy guides to their newly structured, and painfully limited, leisure time. Prior to that, in the first half of the 19th century, the arts were something you more likely did yourself. You sang around the piano at home. You likely knew some Shakespeare or Biblical prose, whatever your walk of life. You tried to learn how to draw some. You were not up for review. You were not charging money. You were expressing yourself.
 
Welcome back.
 
Alas, this new radical democratization threatens critics, just as it does well-paid artistic directors, executive directors, curators, and all kinds of other gatekeeper types in the cultural universe, which explains why some say we/they react defensively (see above!) to any grass-roots rebellion.
 
People are doing art themselves again: Tepper pointed out that half of 18- to 22-year-olds have made their own music. Half of them say they have taught themselves something. Many have spent more hours playing video games than it takes to master the violin.  And consider this Tepper statistic: If you had asked random Americans in 1950 if they thought themselves important, about 12 percent of them would have said yes.  By the 1990s that number had risen to 85 percent.  No wonder everyone has an opinion, and social media and cheap technology now has provided the last piece: an egalitarian megaphone.


There are also some ruminations from curators as to whether museums are inherently colonial.

http://www.startribune.com/lessons-from-the-scaffold-controversy-museums-are-inherently-colonial-institutions/431623363/

I think a better way to put it is that all museums are inherently imperial, which is to say that museums are always curations of those things within empires that are regarded as touchstones of cultures, whether of the empire that hosts the museum or the cultures which the hosting museum regards as significant enough to present and discuss.  As Miyazaki had a character pose the question in The Wind Rises, which would you prefer to live in, a world with ... or without the pyramids?  The curator, by definition, must always answer "with" to the question of whether or not to live in a world with or without the pyramids.  There is no other answer a curator of arts anything can honestly provide. 

But there's a point at which remembering the colonial past includes remembering the extensiveness of Native American slavery and slave trade.  True, it was not necessarily explicitly white supremacist or racialist/essentialist in the way observed in the Confederate South ... but it's been somewhat amazing to read people who would otherwise condemn slavery categorically take pains to say that Native American slavery was ... not ... quite ... as bad as slavery in the Confederate South.  Do non-white customs of slavery get graded on a curve just because white supremacist slavery customs in the antebellum American South are considered the bottom of the barrel? 

There's a stretch of people who want to take pains to remind people the United States got its freedom or independence by dispossessing native populations.  Colonial/imperial expansion into the American West is not necessarily the same as fighting to gain independence of English colonial rule.  To collapse these two categories into a single category seems historically dishonest and dangerous but, if people insist on doing it, then they might want to bear in mind that this conflates all modes of liberty with some form of imperialism and oppression and repression.  It is an ideological commitment to a practical belief that "you" cannot be free unless "they" are enslaved in some way. In lamenting the destruction of Native American culture at the hands of white colonialists does anybody really want to restore the slave trade and the caste systems that existed in the Native American tribes of the Pacific Northwest?  That seems completely improbable. 

My distrust of the easy combination of a fight for independence with colonial/imperial expansion is that only people who completely identify these two things as necessarily related can combine them.  Now people have said that the Israelites found their Promised Land by massacring the existing native populations.  Okay then, so nobody in the United States has managed to do better.  What if freedom can only come for group A through imperialism that massacres group B?  If that's the case then is seeking maximum freedom for a maximum number of people even a salutary goal?  If our forebears in the United States only managed to make the United States as big as it is through a combination of racial-supremacist slavery and a manifest destiny policy that advocated the extermination of the American Indians then this does not suggest that the ancient Israelites were worse than us in the present-day United States; it suggests that the Israelites had the bluntness to not pretend they were doing something other than what they were doing.  If anything one of the pervasive critiques of Israelite settlement was that rather than exterminate the tribes they were instructed to exterminate they adopted a live and let live approach and even syncretized a variety of aspects of Yahweh veneration with Canaanite customs.  Had the United States settlers and colonists behaved MORE like the ancient Israelites as recounted in the biblical texts there might be more American Indians around today.

It's possible to establish a kind of independence that doesn't require an imperial expanse, isn't it?  Or if it's not then why should people tell themselves in tacit or explicit ways that where every other group that built freedom for themselves slaughtered whoever they displaced, WE won't be guilty of that mistake?  I don't really want the local PNW tribes to get their old slave trade back.  The slave trade and slavery systems of the PNW collapsed in a way that didn't involve a war ... how and why that happened might be an instructive case study were people not so eager on the internet to speak in the most literally and figuratively black and white terms.  It's not that those matters aren't important, it's that their dominance can produce a tunnel vision that may need some gentle correctives. 

Curation is not just for museum culture these days.  There's this idea floated at The Atlantic that if in the past you were born into a community and discovered your individual identity later on as you grew up, it can seem that these days in contemporary American culture you are born as an individual who seeks out a defining community later as you go.


https://www.theatlantic.com/entertainment/archive/2017/07/what-does-community-mean/532518/

For much of the 20th century, if you asked someone to define “community,” they’d very likely give you an answer that involved a physical location. One’s community derived from one’s place—one’s literal place—in the world: one’s school, one’s neighborhood, one’s town. In the 21st century, though, that primary notion of “community” has changed. The word as used today tends to involve something at once farther from and more intimate than one’s home: one’s identity. “A body of people or things viewed collectively,” the Oxford English Dictionary sums it up. Community, in this sense, is not merely something that one fits into; it is also something one chooses for oneself, through a process of self-discovery. It is based on shared circumstances, certainly, but offers a transcendent kind of togetherness. It is active rather than passive. The LGBTQ community. The Latino community. The intelligence community. The journalism community.

...

Maybe, and if so ... then that kind of community is an affinity association driven by, well, let's just call it consumer choice.  If Americans reject altogether the legitimacy of identity deriving from either geographic location or, worse yet, socio-economic strata, then what remains is consumer choice and elective association based on an aspiration to what might be called sexual market value or producer activity (per the ribbonfarm proposal in "You Are Not an Artisan" about how people define themselves by conspicuous production regardless of whether or not the market has any use for it). 

In a way the revolution Chris Jones was writing about a few weeks ago can be thought of as a resurgence in ideologically committed artistic activity in a plane of non-monetized arts-making and distribution.  It might make sense that established theater and arts critics feel like the era of the critic is possibly over, but in the era of often risible and embarrassing Youtube comments is the age of the critic really over?  Or could we float the idea that the era of the monetizable activity of the institutionally-backed arts critic may have a shelf life? 

Another Jones, Robert P., assures us that, the conspicuous election of Trump notwithstanding, the long-term influence of white Christian America is still headed toward its death point.

https://www.theatlantic.com/politics/archive/2017/07/robert-jones-white-christian-america/532587/

The article itself focuses more on the religious right/evangelical scene.  The larger discussion has included how the white/liberal/mainline version of American Christianity has also been in steady decline.  To the extent that a lot of what passes for Christianity or "genuine spirituality" in white American Christian scenes is really likely to be some red state or blue state civic religion that has only an instrumental interest in Jesus, the decline of white American Christianity is not necessarily a huge loss.

Whether or not the decline of white American Christianity ensures a continuation of a range of at least nominally liberal policies and norms remains to be seen.  Something that blue state voters and activists can sometimes forget is that, well, as Sherman Alexie has complained, the average American Indian is more socially conservative than even the most socially conservative white guy.  It's not a slam dunk that the decline of a white Christian mainstream means that a genuinely secular/progressive culture will emerge.  Whether or not Pentecostalism and charismatic Christianity are orthodox in a big or little 'o' sense of the term, they are still around.  It's not a foregone conclusion that immigrants coming into the United States from all over are necessarily going to bring with them the mores that those who might otherwise celebrate the decline of the Religious Right white Christian scene would want them to.

There is another reason we should not be too optimistic that a decline of a white Christian mainstream of a red or blue variety is automatically good, but it's not strictly about the ideologies that are often deployed for conflict.  The problem is resource scarcity, which remains an issue regardless of associated ideologies.  A more secularist and materialist society is going to amplify and multiply these tensions.  The more materialistic our conception of the world is, the more unavoidable and unacceptable the zero-sum game of the economic life of humanity as a global species is going to become.  This gets back to the aforementioned question about who gets to be enslaved so the people who make the money can live freely.  In an era in which actresses lament the lack of equal pay in Hollywood for women, it's possible to lament that inequality on the one hand while noting, on the other, that it seems a little morally dubious for movie stars who are part of a contemporary priesthood of art-as-religion to lament their pay when they already make a hundred times what a person at a Wal-Mart store might make in a year. 

Which gets us back to the theme of the arts and money, of course.

Over at Mere Orthodoxy Jake Meador wrote a review of a book called Real Artists Don't Starve.

https://mereorthodoxy.com/book-review-real-artists-dont-starve-jeff-goins/

Now given that the book is published by Thomas Nelson ... the same Thomas Nelson that published Mark Driscoll and Grace Driscoll's Real Marriage and Rachel Held Evans' Year of Biblical Womanhood, I admit the odds that I would read Real Artists Don't Starve are pretty close to zero.  And while Meador may actually respect Doug Wilson enough to invoke the remark about pearls without a thread ... Wilson's plagiarism controversy makes it just about impossible for me to take him seriously, either. 

Even so, the axioms presented as threads in the Goins book as presented by Meador seem ... like axioms. 

For instance, the advice that you don't work for free might go against the axiom that you own your own work. Not everybody at Mere Orthodoxy seems to take the idea of intellectual property all that seriously, or as a particularly legitimate Christian concept, so Meador's view may just be Meador's view--but the question is more practical: what if you own your work, whatever this is taken to mean in IP terms, and nobody wants to buy or pay for what you're selling?  Years ago I got some advice from an established musician that went exactly like this, "Don't look down on work-for-hire.  At least it's work.  At least you get paid."  You could own all the rights to a large body of work but if nobody wants to buy it then all you've done is make art at your own time and expense. 

But ... isn't that exactly what folk art is, in the end?

Real artists don't have to starve, but who says real artists have to pay all their bills with the art they make?  The other thing that comes to mind is that there can be some pretty potent freedoms available to someone willing to write for absolutely no money at all.  This blog has never been monetized and there's no plan for it to ever be monetized.  The prodigious amounts of primary and secondary literature connected to the rise and fall of Mars Hill featured at this blog have been possible because of, well, basically Fair Use.  Had I waited to write about what was going on at Mars Hill until some profit was possible, no writing would have happened.  It's not a foregone conclusion to me that writing matters only when money is exchanged for it. 

And since Mere Orthodoxy has been happy to name-drop Roger Scruton: Scruton has made it explicit that the fine arts and literary arts are the domain of the leisure classes.  If Roger Scruton, famous conservative that he is, can state without equivocation that the arts have been the domain of the leisure classes, then is it a foregone conclusion that saying real artists don't starve carries any serious implication that real artists make their money in the arts markets? 

Now the book, Meador notes, mentions stuff about cultivating patrons.  You'd have to be careful which patrons you cultivate.  Very, very few patrons would be as generous as the Esterhazy clan was to Haydn, for instance. 

So, I suppose that's me expressing some doubts about the very idea that writers and artists and so on "should" make their respective livings as writers, artists and so on.  The freedom to make millions pretending to be someone else in a film seems like an accident of genetics and socio-economic conditions and privilege.  If Chris Jones' ideas have a trajectory, it could be that what's emerging is a newly active and activist group of amateurs in the arts, and a kind of internet commentary brigade.  This may call for a new form of interactive criticism but it could also herald a new level of identitarian politicking with a return of the vituperation of 19th-century criticism.  Ever read Slonimsky's Lexicon of Musical Invective?  Might as well be commentary from people on Youtube or discussion forums. 

I'll probably end the weekend ruminations (for now) with a remembrance that Real Marriage got out in the world thanks to Thomas Nelson. Whether or not Real Marriage merited a $400,000 advance might just be an aesthetic or philosophical point for debate.

https://wenatcheethehatchet.blogspot.com/2016/01/an-agreement-from-february-28-2011.html

Between the Driscoll marriage book and the Rachel Held Evans stunt book about "biblical womanhood", it's really, really hard to take Thomas Nelson seriously as a publisher these days.  Even if it can be granted that real artists don't starve, the issues that could be raised about both the Driscoll book and the Evans book make it seem as though at least some artists might think better of publishing through Thomas Nelson.  Terry Teachout warned that the Maxwell Perkins era of editing is over and that you basically have to write the kind of book you'd already want to see published, because these days little editing in the more old-school sense gets done. 

Maybe there's something to be said for writing just because you love to write, regardless of whether or not you ever see a dime for it.  I have taken paid gigs as a writer, but I have on the whole written because I love writing.  If I had determined to never work for free in the way I blogged about Mars Hill, for instance, there'd be nothing at this blog about the rise and fall of Mars Hill.  But in a way that's almost a journalistic/historical question.  There may be times when a writer feels sufficiently morally obliged to document events that the question "am I getting paid for this?" doesn't even rise to the level of secondary importance.

Friday, July 07, 2017

biblioblogger Jim West says you should go see the new Spiderman movie

https://zwingliusredivivus.wordpress.com/2017/07/07/go-see-spiderman-homecoming/

I mean, as a lifelong Spiderman fan who dug the cameo the character had in Civil War I was gonna go see the new Spiderman movie anyway ... but ... hey, Jim West says to go see it. ;)

Even though I've written a lot more about Batman over the years my two favorite superheroes have always been a tie between Batman and Spiderman.  Now, to be clear, I love Spiderman right up to the point where Marvel did the stunt-killing with Gwen and then my love began to grow progressively cold.  But that Stan Lee/Steve Ditko run?  Classic comics. 

So, yeah, I'll be seeing the new Spiderman movie at some point.  That's the plan.

We can't only ever talk about the syntactics of ragtime and sonata forms or about polyphonic cycles for classical guitar, after all. 

Haven't even touched Samurai Jack or why Legend of Korra was such a horrendous follow-up to the glorious series The Last Airbender. 



Saturday, July 01, 2017

Washington Post article on the decline of the electric guitar considering that guitar era that Kyle Gann thought might exist circa 2003 that might still just be the musical subculture it's possibly always been

I've enjoyed reading Kyle Gann's blog when he's blogged but I can't help remembering that waaaay back in 2003 he wrote:
http://www.artsjournal.com/postclassic/2003/11/make_way_for_the_guitar_era.html

Something else I meant to add about my students and the piano: Perhaps it’s just Bard culture, but I see many students today, perhaps a majority, coming to musical creativity from the guitar rather than the piano, as they used to, or any other instrument. This could have profound consequences. In the Renaissance, composers usually got their start as child singers. Baroque and Classical composers were often string players (Corelli and Haydn, the violin; Bach, Mozart, and Beethoven, the viola). Romantic and modern composers were more often than not pianists. Such choices have profound consequences, and if there really is a sea-change of composers now coming from the guitar world rather than the piano, that alone could bring about a rift in musical eras. Berlioz, who played the clarinet and guitar, was almost the only non-pianist composer of his era, and as a result became its most innovative orchestrator. Guitarists visualize music theory in more contextual, less fixed and abstract, ways than pianists do. Interval size is less of a constant for them, melodies more conveniently leap throughout the register than proceed by steps, and their instruments are easily retunable and portable, tremendously louder (if electric), and carrying no upper-class connotations. By their 20s, these composers have been conditioned by a completely different relationship to pitch and volume than the pianist-composers of my generation and earlier. I’m curious as to whether professors in other music departments notice the same demographic change.

...

I don't know if I'd call it profound consequences as such, but let's take note that here in the year 2017 Nikita Koshkin's 24 preludes and fugues for solo guitar have finally been published.  German Dzhaparidze's cycle of 24 preludes and fugues has been recorded in the last three years.  Igor Rekhin's cycle of preludes and fugues has not been recorded in its entirety, but about two thirds of it, as performed by Vladimir Tervo, can be ordered as digital music by those who know where to find it.  My set of preludes and fugues for solo guitar now exists as a duet cycle, too, and ... that's been recorded and is available somewhere.  Which is to say that here in 2017 there are AT LEAST four cycles of 24 preludes and fugues for solo guitar out there.  At least two of those four cycles were composed by guitarists, and if Dzhaparidze himself is a guitarist that would make for three.  There's a cycle by Puget Sound area composer/guitarist Philip Quackenbush, too, but I haven't been able to see the scores for that cycle. 

Not that fugues have to be composed for the guitar in sets of dozens.  Friedrich Zehm has a set of six preludes and fugues I haven't gotten to yet but hope to get to in the future.  Gilbert Biberian has written fugues into a couple of his guitar sonatas.  There's also the Brouwer fugue.  Alexandre Eisenberg has a prelude, chorale and fugue I've got, too.  Ideally you get the idea that fugal composition for solo guitar is by now, if not exactly the stuff of musical legend, at least so firmly established as to be beyond any serious scholarly dispute.

If there are profound consequences for those whose commitment is to the preservation of the art music traditions that are informally understood to be of Western European lineage, one of those consequences "could" be this--if it's possible for fugues to be composed for solo guitar, and at least four guitarist composers have composed large-scale contrapuntal cycles, then would this mean the guitar has finally reached a level of respectability that was perhaps not attained in Segovia's time, by dint of guitarists then being busy transcribing Bach rather than more directly contributing to the polyphonic literature?  Well ...

What Gann had to say in 2003 may well have a kernel of truth if we're talking about what could be called classical music.  If arts funding keeps getting sliced, and if the symphonic and traditional ensemble formats suffer in the wake of such cuts, then one way the art music idioms could adapt and survive could partly lie in the hands of guitarists.  I think there's at least some basis for such a move.  I admit to being highly biased in favor of this approach as a guitarist. 

But there are two ironic twists.  First ...

http://www.artsjournal.com/postclassic/2003/12/guitar_mystery_solved_gama_did.html

Long-time electronic composer and general Downtown raconteur Tom Hamilton sends me an interesting fact in response to my perceptions of the guitar’s takeover of the composing world:

In 1995, an industry group called the Guitar and Accessories Marketing Association (GAMA), along with the NAMM and MENC, launched a program to train teachers to start guitar programs in middle and high schools. That group estimated that by 2001, over 200,000 students have learned guitar in school, and over 38,000 students bought their own guitar. They project a trend that by 2010, will have over 1.5 million students learning guitar in school programs, and over 300,000 students purchasing guitars. And that’s just through one school-based program! My observation is that most guitarists learn through woodshedding and private lessons without any institutional structure at all.
So no wonder young guitarists seem to be coming out of the woodwork: it was a calculated industry initiative!  ...

What might have seemed an organic grassroots shift had a great big corporate explanation.

But the ironic thing is that just a few weeks ago the Washington Post had this:

https://www.washingtonpost.com/graphics/2017/lifestyle/the-slow-secret-death-of-the-electric-guitar/?utm_term=.5ad1fb9d8025

The electric guitar, that emblem of rock music and rock culture, has been on a decline.  There aren't guitar heroes these days like there were in days of old.  The guitar heroes there are, are getting old.  If some believe that what is needed these days is for there to be heroes of the guitar, do those heroes have to come from a rock or pop or jazz setting?  If Taylor Swift inspires kids to take up the guitar, as one source quoted in the WaPost article described, are there guitarists who will turn up their noses at the prospect that girls, wanting to imitate Taylor Swift, decide to take up the guitar?  Considering how many times I've seen people say that guys take up the guitar just to get the girls, I'll just overlook that dubious kind of condescension.  Some of us take up musical instruments because we love music and not because we're trying to improve our odds on the local dating scene. 

Now perhaps there was, as some say, a bubble on the manufacturing side.  But we've blogged about this before here at Wenatchee The Hatchet.  There was also, apparently, a bubble on the observation side.  First we'll revisit a comment by an author at The Guardian describing rock music as having entered its "jazz phase".

https://www.theguardian.com/music/musicblog/2017/mar/31/five-things-i-learned-as-guardian-music-editor-rock-music-writing-michael-hann

...
2. Rock music is in its jazz phase

And I don’t mean it’s having a Kamasi Washington/Thundercat moment of extreme hipness. I mean it’s like Ryan Gosling’s version of jazz in La La Land: something fetishised by an older audience, but which has ceded its place at the centre of the pop-cultural conversation to other forms of music, ones less tied to a sense of history. Ones, dare I say it, more forward looking. For several years, it seemed, I was asked by one desk or another at the Guardian to write a start-of-year story about how this was the year rock would bounce back. But it never did. The experts who predicted big things for guitar each year were routinely wrong. No one asks for that story any longer.

Indeed!  We've just seen an article on the slow death of the electric guitar!

Then there's this thing called the Shazam Effect mentioned a few years back:

http://www.theatlantic.com/magazine/archive/2014/12/the-shazam-effect/382237/

Billboard replaced its honor system with hard numbers in 1991, basing its charts on point-of-sale data from cash registers. “This was revolutionary,” says Silvio Pietroluongo, Billboard’s current director of charts. “We were finally able to see which records were actually selling.” Around the same time, Billboard switched to monitoring radio airplay through Nielsen. When that happened, hip-hop and country surged in the rankings and old-fashioned rock slowly began to fade—suggesting that perhaps an industry dominated by white guys on the coasts hadn’t paid enough attention to the music interests of urban minorities and southern whites.

What if, beyond the bubble in manufacturing in the 1990s, there was also a bubble on the measurement side?  Rock seemed to be the big thing because rockists ran the industry.  When the systemic biases toward rock and against rap and country were eliminated by more direct metrics tracking, hip-hop and country came to dominate the charts. 

Not that music relying on the sampling of existing works is without its own potential risks.  Let's not forget that it was just a couple of years ago that the "Blurred Lines" verdict came down.

http://www.theatlantic.com/entertainment/archive/2015/03/why-the-blurred-lines-verdict-could-be-bad-for-music/387433/

Hip-hop in particular has proudly thrived on borrowed sounds and vibes, and has clashed with the courts over the years because of sampling. In the wake of the ruling, Questlove of The Roots sent (then deleted) a tweet with the hashtag #NiceKnowingYouHipHop. In 2013 he told New York that  “If it were a case of melodic plagiarism, I would definitely side with the estate,” but then explained why he thought Thicke and Pharrell were in the clear:
Look, technically it’s not plagiarized. It’s not the same chord progression. It’s a feeling. Because there’s a cowbell in it and a fender Rhodes as the main instrumentation — that still doesn’t make it plagiarized. We all know it’s derivative. That’s how Pharrell works. Everything that Pharrell produces is derivative of another song — but it’s an homage.

There are those who regard the verdict as a disaster for music.  I find that impossible to believe, but then I'm a guitarist, and also a guitarist who has opted to specialize in what's generally known as classical repertoire.  I guess I also fit into what would be called the new music scene, since when I play I play new compositions rather than the usual warhorse literature of Sor, Giuliani, Diabelli or Tarrega (who all wrote some fine music for the instrument, to be sure).  So for me, the "Blurred Lines" verdict has no bearing inasmuch as when I crib music from other composers I make a point of going for works that have been in the public domain for centuries.  I also deliberately recompose them to the point where only a specialist in music history might even recognize the source materials.  What does a musician do if they want to work in a musical idiom in which sampling is involved?  Here's one answer:

http://www.npr.org/2015/02/28/389285375/you-have-to-be-bored-dan-deacon-on-creativity
What some of the early rap samplers went through when, all of a sudden, their music became illegal in that way.
Exactly. And copyright law's getting more and more strict, but you can exist in two ways: You can either be remarkably wealthy and license whatever you want, or you can be really obscure and no one's gonna care. But if you're anywhere in the middle, collage becomes difficult. So I really like working with microsamples and sounds that are devoid of their original context, but exist just as a timbral element. [emphasis added]

Like a pixel, in a way, of music. And to get around copyright issues, you can just use it if it's that small?
Well, I would consider it fair use — because it's completely recontextualized. A new derivative work is made, and there's no way to tell what it was. [Laughing] We should really not talk about this. I'm expecting the emails that are like, "We've identified the microsamples you were discussing ... "

Now I would venture, as a matter of preference, that you go for stuff that is public domain.  If you need to record that public domain stuff yourself before you then manipulate the audio for sampling, that might be even better.  Call it a possible third way between the two extremes Dan Deacon described. 
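Purely as an illustration of what that third way might look like at the nuts-and-bolts level, here's a minimal sketch of microsampling in Python.  Everything in it is an assumption for the sake of the example--the file names, the grain length, the fade are all hypothetical--and it's not a claim about how Deacon or anybody else actually works.

    # A rough sketch, not anyone's actual workflow: chop a self-recorded,
    # public-domain source into a "microsample" and recontextualize it.
    # Assumes a mono, 16-bit WAV file; file names are hypothetical.
    import wave
    import numpy as np

    with wave.open("my_public_domain_recording.wav", "rb") as src:
        rate = src.getframerate()
        audio = np.frombuffer(src.readframes(src.getnframes()), dtype=np.int16)

    # Take a 50-millisecond grain -- short enough to function as a timbral
    # element rather than a recognizable quotation of the source.
    start = rate  # begin one second into the recording
    grain = audio[start : start + rate // 20]

    # Recontextualize: reverse the grain and tile it into a droning texture,
    # with a long fade so the result decays instead of cutting off.
    texture = np.tile(grain[::-1], 40).astype(np.float32)
    texture *= np.linspace(1.0, 0.0, texture.size)

    with wave.open("texture.wav", "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(2)
        dst.setframerate(rate)
        dst.writeframes(texture.astype(np.int16).tobytes())

The point isn't the particular manipulation; it's that when the source recording is both yours and drawn from public domain material, the grain can be as long or as recognizable as you like without the legal anxieties Deacon was joking about.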

But for the average electric guitarist there is practically no such thing as a public domain body of work.  One of the reactions I've seen, coming more from the wing that regards classical music as some kind of prestige or class problem (generally on the rock/pop/jazz side of sympathy if people situate themselves in class conflict), is to consider the "Blurred Lines" verdict a disaster for music.  It's not a disaster for music so long as you're plugged into some musical idiom that has a robust enough body of public domain works for this not to affect you.  If the verdict "is" a disaster for you or your preferred styles of music because nothing in the style you like is public domain, then that might be your real problem, both in terms of aesthetic interests and in terms of legally constrained options. 

Generational insularity was often enough a thing in music of earlier eras, but back in those days intellectual property was not quite the same thing, either.  Thanks to companies big enough, and with enough trademarks to protect, to have a vested interest in fundamentally altering the range and scope of copyright laws and licensing practices, things are different now.  This would be an opportunity for those with a traditionalist bent to argue for the value of the traditions.  What has entered the public domain has become a cultural good that can be recombined, recomposed and reinvented in any number of ways for a continued existence.  A culture that is entirely under copyright and trademark is one that may paradoxically not long survive. 

Something I wish had been explained more clearly back when I was in college was how the composers of old were not necessarily compensated for their music itself, but for their labor.  Haydn was paid to provide goods and services, for instance.  He was under what effectively amounted to a military contract. Scott Timberg, writing over at Salon, found it useful to gloss over that when he wrote that if Haydn didn't show up for work he could be jailed for being absent without leave.  Duh, anyone who knows anyone who has been in the military, let alone anyone who's been in a military service, can get why an AWOL incident could get you in a brig.  But Timberg skipped over the parts where Haydn was allowed to write whatever he wanted and got free housing and a food stipend and free medical care.  When a patron doles out that much largesse for a court composer who is expected to compose all major and incidental music for the parties, under a contract that has him effectually listed as a military caste participant, then, yeah, the day you fail to show up for your job is the day you're in trouble. 

So ... if the guitar manufacturing industry is in a slump ... it may or may not be the guitar era Kyle Gann was guessing we'd have after all.  Matanya Ophee said decades ago there was never a proverbial golden age of the guitar, regardless of the marketing schemes of a select range of guitarists. 

I'm looking forward to blogging about the fugal cycles of Koshkin, Rekhin and Dzhaparidze later this year, but you can't just go and start blogging about stuff like that.  You have to immerse yourself in the scores and stuff.  I think that for those of us who love the six-stringed instrument, banding together regardless of formal style and whether or not there are pick-ups installed in proximity to the bridge would be a good idea. 

Friday, June 30, 2017

links for the weekend--is it the top one percent or maybe the top twenty percent in the United States that is hoarding the wealth at the expense of everyone else?

Over at Commentary Terry Teachout has something about a book discussing the symphonic scene in Germany under National Socialism.

https://www.commentarymagazine.com/articles/orchestras-and-nazis/

After the 20th century, so full of atrocities and horrors that saying so is itself nearly an axiom, the idea that the humanities humanize shouldn't be taken very seriously.  This is not simply a matter of whether poetry can exist after Auschwitz.  Plenty of human activities continued after that evil and will continue.  But it has been easily observed that German culture, for a long stretch regarded as the pinnacle of Western sophistication and genius in the arts, was clearly no barrier against systematic evil.   Cue the quote from Walter Benjamin if you already know it ... .

What American artists who would prefer a more European-style state-subsidized arts scene might want to ask here in 2017 is whether they really would want the United States government to subsidize the arts, given how vehemently anti-Trump many artists are.  Would it not, at this point, seem to many an artist that the demolition of the NEA and NEH might be an inadvertent favor to artists, if only in the sense that, if Trump proves to be as despotic as many on the liberal/left side of the spectrum fear he will ultimately prove to be, why would any artist with self-respect want federal funding from that administration, exactly?

Furtwangler assuring Toscanini that people are free wherever the music of Wagner and Beethoven is played and that, if they are not free at first, they eventually become free listening to the works, sounds ghastly.  It's ghastly for the most literally obvious reason, but it's also ghastly to anyone who actually can't stand Wagner's music and prefers Haydn to Beethoven overall.  It's not that interesting music can't be made by those who are imprisoned.  Messiaen's Quartet for the End of Time is amazing.  I've been listening to Zaderatsky's preludes and fugues, which, as advocates for the work have reported in the last couple of years, was one of the first cycles of preludes and fugues for piano composed in the 20th century and which was also composed while the composer was stuck in the Gulag.  

But there's a lot of art that is made that is beautiful in spite of evils.  If people on the secular left hope that truly great journalism will happen in the era of Trump, that's not conceptually so different from reactionary right-wing dispensationalist fundamentalist Christians believing that some powerful End Times anointing would be at hand if Hillary Clinton were president.  After twenty-some years of hearing the red and the blue regard the other team as spawning the Antichrist if they win the Oval Office, it seems easier to just assume that whoever actually gets the job ... .

There's more than one angle to approaching the conceit that bad times make for great art.  Take this article over at The New Republic about the emergence of the "thought leader".
https://newrepublic.com/article/143004/rise-thought-leader-how-superrich-funded-new-class-intellectual

...
However deeply the superrich have degraded American intellectual and political discourse, the Ideas Industry has also created an opening—albeit a very slim one—for a different kind of organic intellectual. The one percent’s attempts to disrupt the media and universities have had the unintended consequence of radicalizing a generation of young writers and academics on the left—those recently dubbed “the new public intellectuals” in The Chronicle for Higher Education. Facing dim job prospects in the academy, leftists who might once have become professors increasingly define themselves as writers or political organizers. Bad times, historically speaking, are good for ideas, and our moment is no exception. We’re arguably living in a new golden age of little magazines: Not only have publications like n+1, Jacobin, the Los Angeles Review of Books, and Current Affairs appeared in recent years, but older ones like The Baffler and Dissent have been resurrected or revitalized.

Gramsci’s conception of the organic intellectual was not merely meant to describe the prophets of the European bourgeoisie and its industrial capitalism. The organic intellectual was above all a concept for the left: a name for those who, emerging from working-class conditions, had the inclination and ability to express their vision of society and organize it into action. He envisioned not a savior swooping down from the elite, but thinkers sharing an experience of economic privation, translated into both an intellectual and social struggle.
...

For those who read and remember the Alan Jacobs lament about the loss of Christian intellectuals, this idea that real intellectualism has been lost, whether it's imagined to be left or right in foundation or origin, is one of those lapsarian bromides that will probably not die until there are no humans left to express the sentiment. 

But which super-rich do we really want to hold accountable for wrecking the prospects (whatever those may be hoped to be) for the working class?  The proverbial one percent?  What if it's turned out to be the whole range of the "top twenty percent"? Such is the argument, at least, advanced lately by Richard V Reeves.

http://www.npr.org/2017/05/31/530843665/top-20-percent-of-americans-hoard-the-american-dream

http://bostonreview.net/class-inequality-education-opportunity/richard-v-reeves-dream-hoarders-how-americas-top-20-percent

...

Trump’s success among middle-class whites might seem surprising, given his own wealth. But his supporters have no problem with the rich. In fact, they admire them.  His movement was about class, not money, and he exuded the blue-collar culture. For his supporters, the enemy is upper middle-class professionals: journalists, scholars, technocrats, managers, bureaucrats, the people with letters after their names. You and me.

And here is the difficult part. The popular obsession with the top 1 percent allows the upper middle class to convince ourselves we are in the same boat as the rest of America; but it is not true.  However messily it is expressed, much of the criticism of our class is true. We proclaim the “net” benefits of free trade, technological advances, and immigration, safe in the knowledge that we will be among the beneficiaries. Equipped with high levels of human capital, we can flourish in a global economy. The cities we live in are zoned to protect our wealth, but deter the unskilled from sharing in it. Professional licensing and an immigration policy tilted toward the low-skilled shield us from the intense market competition faced by those in nonprofessional occupations. We proclaim the benefits of free markets but are largely insulated from the risks they can pose. Small wonder other folks can get angry.

...

I am British by birth, but I have lived in the United States since 2012 and became a citizen in late 2016. There are lots of reasons I have made America my home, but one of them is the American ideal of opportunity. I always hated the walls created by social class distinctions in the United Kingdom. The American ideal of a classless society is, to me, a deeply attractive one. It has been disheartening to learn that the class structure of my new homeland is, if anything, more rigid than the one I left behind and especially so at the top.

Indeed, the American upper middle class is leaving everyone else in the dust. The top fifth of U.S. households saw a $4 trillion increase in pretax income in the years between 1979 and 2013.  The combined rise for the bottom 80 percent, by comparison, was just over $3 trillion. The gap between the bottom fifth and the middle fifth has not widened at all. In fact, there has been no increase in inequality below the eightieth percentile. All the inequality action is above that line.
...

The broader case, in case this is a TL;DR weekend for you, dear reader, is that there's what some call income inequality of revenue and then there's the other thing Reeves calls "opportunity hoarding". A lot of income inequality is not necessarily found in bankers and deals for plutocrats, though it's obviously found there; it can also be manifest in the ways upper middle-class parents take steps to ensure their kids will have AT LEAST the comfort and access to resources they themselves have enjoyed.  The prospect that your kid may have to live with economic downward mobility is never going to be acceptable, is it?  Reeves suggests that the United States university system abolish legacy admissions practices altogether.  Just because one or both of your parents went to school X doesn't mean you get a tuition discount or any advantageous consideration compared to someone who comes from a family that never previously attended school X.

I've made this somewhat joking observation before, but it's not a surprise to me that in the last decade the signature take on the all-American superhero Batman managed to come from a British director, Christopher Nolan.  Perhaps it's that Americans have been so eager not to think about class; yet between Nolan's version of Batman and the by now iconic take of Batman: the animated series, Batman is the sort of character that most directly interrogates questions folks can have in the United States about how, if there's going to be a one percent, if there's just going to be a plutocratic caste, what kind of conduct do we want from that caste?  That is, somewhat predictable riffs from some branches of the left notwithstanding, not necessarily an advocacy of "fascism".  Inequality is ineradicable from the human species, and the people who are least able to avoid this reality about our species are generally those who were quite literally born with a disadvantage of some kind, what in bygone eras might have been inelegantly called a handicap. 

I'll admit to some frustration with the idea that writers at The Baffler or Jacobin or n+1 signify the emergence of a new intellectual group.  I do read stuff from those publications on a roughly monthly basis.  I also toggle through Commentary (obviously), The New Criterion, The American Conservative, and a few other venues that are not quite left of center.  It's been a little surreal to get the sense that the far left and the far right agree more with each other these days than the proverbial "center" left and right do.  Part of what makes this a little odd is remembering a macabre observation by one Richard Taruskin about how the history of Europe shows that if you move far enough to the left or the right, the thing they all agree on is that the bad stuff is the fault of Jews, who should get ostracized or punished. 

It doesn't seem like there's much reason to be optimistic about the far edges of the left and right, or the center.  It sometimes seems as though agitators and partisans across the board are angling for some kind of race war or class war, or all of the above if that can be managed.  Those that do angle for those kinds of things very likely cannot be disinterested parties.  Revolutions tend to get embraced by those sorts of agitators who ultimately intend to be the next ruling class.  The history of the Soviet bloc suggests, no, more than suggests, that regardless of formally espoused ideologies ruling classes tend to end up behaving more or less similarly across the board. 

I'm not so sure that the "thought leader" that a David Sessions at The New Republic decries is ultimately much more than a "meet the new boss, same as the old boss" successor to the public intellectual of an earlier era. 

Let's put it this way, there's no reason the thought leader isn't really the organic intellectual of the top twenty percent and not just the top one percent.  But those sorts of people who might be in the top twenty percent and have, say, a penchant for founding left-leaning or socialist 'zines might have very powerful incentives to exempt the nineteen percent they are part of, over against that one percent in the top twenty percent they aren't themselves part of.

https://newrepublic.com/article/143004/rise-thought-leader-how-superrich-funded-new-class-intellectual
...
The intellectual institutions of postwar America were far from perfect; universities and think tanks accepted military-oriented funding from the U.S. government and often provided the intellectual foundations for American imperialism. Nevertheless, the three decades after World War II—when corporate power was checked by a strong labor movement, higher education became broadly accessible, and social services were expanded—were the most democratic in American history. Universities and think tanks were able to establish a baseline of public trust, in part because their production of knowledge was not directly beholden to the whims of idiosyncratic billionaires demanding that their “metrics” be met and their pet political ideas be substantiated.

...

The golden age of little magazines ... I admit to cynicism.  Plutocrats may be more brazenly direct in their interests in influencing policy, but I wonder whether it's necessarily worse than earlier eras in which the American university system was in full post-war bloom.  Didn't someone over at Jacobin write a long-form piece about how Jews in American academia became neo-conservatives because, with the emergence of affirmative action, Jewish scholars who might otherwise have been on board with civil rights for African Americans balked at the prospect of losing their disproportionately large influence in American letters?  Yes ... I do recall reading such a piece.  What is alternately thought of as neoliberalism or neoconservatism, depending on which left/right polemics you're reading, seems to have been birthed in part from that university/think tank culture.   Not the whole, obviously, and not by a long shot, but the idea that the old system was better than the new thought leader regime seems tough to buy.  There are reasons the contemporary academic scene seems like something people would want to not be part of. 

At length the question that comes up is how much you're willing to reconcile yourself to serving an empire.  That's an unavoidable question.  If you're an academic at a state school you're working for whatever the empire is that you live in.  It can be done, and I'd hardly say you should never be a teacher.  Teaching was one of the career paths I was interested in.  But I was, at the risk of putting it in terms borrowed from the various polemics cited above, not born into the class or caste for whom those kinds of doors really opened. 

It was not that long after graduating from college that it began to dawn on me that I had had an advantage a lot of people wouldn't have in educational terms, but also that I had graduated into a job market for which my education was not necessarily a preparation.  I had also reached a slowly but steadily firming conviction that whatever I had been told about the power bestowed upon the job-seeker by higher education was at best wishful thinking and at worst a sham.  Class mobility is probably one of the most pervasive myths in the United States.  If injustice is the discovery that you and yours are on a downward trend, well, that's not necessarily injustice, is it?  What if it's the market at work? 

Here in Seattle about two months ago posters and fliers were up saying "no more shit jobs".  There will always be those kinds of jobs, and just because a lot of us get those kinds of jobs and don't exactly adore them doesn't mean the jobs don't, in some sense, have to get done.  It seems that people who weren't born into the world with disabilities can't quite get that the world will always have haves and have-nots.  You could be born into the world able to digest gluten ... or not.  You don't get to choose that.  If there's something about American society that seems toxic, it's that people on both the left and the right in the artsy entrepreneurial scene seem to feel like they are somehow exempt from being in that previously mentioned top twenty percent.  You don't have to be a Trump or a Soros or a Swift to be born into a ruling class; you might just need to be born into a family where, simply because your parents and/or grandparents went to school X, you get or got a hefty discount on tuition at school X, or a legacy admission in recognition of their time and money and ... maybe effort.