Taking descriptivism to a higher level - Input Junkie
June 19th, 2009
09:45 am

zanda_myrande is a proud proscriptivist.

I prefer to get outside the system--here's my comment:
Alternate theory: Language keeps changing, but needs to be kept stable enough to be useful.

It's both good and natural to have people pulling in both directions, according to what usages feel plausible to them.

I'm not sure how much language change comes from great writers, how much from slang, and how much from mainstream drift.

Second thought: I'm not sure how much the important resistance to change comes from people who invoke rules and stability and how much is from people who just don't use the changes they don't like.


Comments
 
From: nellorat
Date: June 19th, 2009 02:23 pm (UTC)
Interesting original post--I'm spending too much of my energy on other areas (themselves somewhat about language) to join in there, but here's some relevant info, as I do both (prescriptive) grammar and (descriptive) linguistics, very different stuff!

I find your first statement basically reasonable. Change and challenging rules, though the OP conflates them, are not necessarily the same thing.

It's funny that the OP defends not splitting an infinitive, because that rule is generally now considered so bogus that even the SAT writing section (which, for instance, uses as a common error the anyone/they usage) allows it. As I learned it, that and not ending a sentence with a preposition are rules based on how Latin works--and the idea that if you CAN'T do it in Latin you SHOULDN'T do it in English. Since English is not even a Romance language, those rules are totally artificial, and even most prescriptive grammarians no longer care about them. The OP makes it into a style rule, but that's not a matter of CORRECTNESS.

Many contended rules come from attempting to make English make logical sense, going against what people do naturally. The anti-double negative rule is not natural to English, as shown both by some uses in Shakespeare and by the persistence in Black English. But in MATH, two negatives equal a positive, so must it be in English.

The main importance of your first statement, I think, is that once there is a rule, even if it makes no sense, people may depend on it to decode language. However, in practice, I think, this makes less difference than one would assume. Say you have "I ain't never got no help from nobody" (a great triple negative really said by Eminem in a YouTube interview) and "It was not unhelpful": there are plenty of signifiers showing which set of rules each follows.

I like to teach (as NLP would call it) requisite variety: know how to follow the most formal rules for when you need them; you probably already know at least one set of less formal rules for when you need them. (Actually, that was freshman comp: at the academy, I sometimes DO teach informal rules to ESL students.)

Like vocabulary, I'd say, language change in English comes from EVERYWHERE. English is a magpie language, which I love about it; it also is very flexible (within certain limits, largely imposed by public education).

Do you know the years-ago PBS special The Story of English or the book written about it? I recommend both!

From: nancylebov
Date: June 19th, 2009 03:17 pm (UTC)
I'm going to throw in a pet rant: "You have to know the rules before you break them" strikes me as a failure to understand both knowledge and rules. Or possibly a failure with one of them that's enough to drag the other one down.

The kind of knowledge needed to make something new skillfully isn't in rules, though some of it may be expressed as rules. It comes from substantial engagement with the material.

Ok, here's one of those things I believe without having checked on it. If all the best abstract artists can also do good representational art (true of many of them, I don't know if it's true of all of them), it's not because they know "rules" about how to do representational art, it's because they've spent enough time training and coordinating their eyes, hands, and imaginations. Representational art is what supplies enough challenge and feedback so that training is possible.

*****

It would be cool to have an understanding of why some changes get generally accepted (whether for a while or for a long time) and why some disappear, but I don't know whether such understanding is possible.

*****

No, I didn't know about the PBS book. I now have a minor ambition of reading it along with Albion's Seed, which has been waiting for me for quite a while.
From: richardthinks
Date: June 19th, 2009 04:26 pm (UTC)
"You have to know the rules before you break them" strikes me as a failure to understand both knowledge and rules.

Thank you. I am not a linguist, but it strikes me as an idea based on secondary or formal language acquisition, which I understand has very little to do either with first language acquisition or general cognition.

The comparison with visual arts seems apt, not least because the "validity" of an artwork is only decided by its reception. I could say that there are "rules" to making abstract art, that quite a lot of abstract art consists of playing with or trying to deploy those rules, but that would really only cover formalism. FWIW I think the appeal to skill in making representational art is either vestigial or wholly irrelevant to artistic production these days: a work should be adjudged good or bad only on its own merits (however we define those).

I'm currently reading Woodruff Smith's book, Consumption and the Making of Respectability, 1600-1800, which is an attempt to explain cultural changes, such as the shift away from spices and toward sugar, tea and coffee in northern Europe in the 18th century. It points up nicely the problems with trying to find such an explanation: Smith has a delightful argument, but it's impossible really to attest or falsify it.
From: nancylebov
Date: June 19th, 2009 04:46 pm (UTC)
FWIW I think the appeal to skill in making representational art is either vestigial or wholly irrelevant to artistic production these days: a work should be adjudged good or bad only on its own merits (however we define those).

I'm not sure I understand that. I wasn't arguing for representational art as anything but a valuable training ground, though I wish it were more respectable. We'd probably get more high-quality representational art if the category were taken more seriously.

*****

I've heard a theory that all aesthetic values reduce to intensity and/or complexity. This sounds plausible.
From: agrumer
Date: June 19th, 2009 10:03 pm (UTC)
Is representational art not taken seriously? I'm having one of those maybe-I'm-living-in-a-parallel-world moments.
From: nancylebov
Date: June 19th, 2009 11:30 pm (UTC)
Maybe I'm wildly overgeneralizing, but so far as contemporary art is concerned, my impression is that abstract art is more respectable-- more likely to show up in galleries. And the more editions a book has, and the more it's presented as a classic, the less likely it is to have new representational art on the cover.

What's your take (probably better informed than mine) on the status of representational art?
From: agrumer
Date: June 21st, 2009 02:13 am (UTC)
My take is that if we're talking about what our culture takes seriously, we shouldn't be looking at art galleries, which only a small fraction of our population actually cares about.

The visual art form that American culture finds the most important and engaging is the movies, and movies are pretty much entirely representational.
From: richardthinks
Date: June 20th, 2009 01:24 pm (UTC)
all aesthetic values reduce to intensity and/or complexity

That makes perfect sense to me until I try to build an argument around it, and have to figure out what intensity and complexity mean. I still think it's basically true for my own aesthetics, but I don't think it covers all agendas: a lot of institutional/national art seems to value a sort of presence that neither elicits an intense response (in me, at least) nor appears particularly complex. OTOH, it used to be a common insult among art critics to call a work "polite," which I suspect meant that, like this institutional art, its first concern was with avoiding ruffling feathers.

All I meant by my "vestigial" comment above was that, 40 or 50 years ago, it was common to "justify" an abstract artist by saying that they had a license to be abstract because they were also capable of making representational works - their skills in representation demonstrated that they were 'real artists' and adjusted the reading of the abstracts. Something of the same attitude still prevailed when I went to art school in the early 90s. I honestly don't know why drawing would necessarily help a conceptual artist much these days, except as a general sort of planning skill.
From: nancylebov
Date: June 20th, 2009 01:47 pm (UTC)
It's not just institutional art-- the general public has a fondness for comfort art, and I think the crucial quality (at least for narrative forms) is immersion, which includes intensity but isn't limited to it. On the other hand, "aesthetic values" are designed, I think, to exclude art whose major result is comfort and/or distraction.

"Polite" art could be a field of study of its own. Some of it just offers a little human-created variety, but some of it gets affection, I think.

Semi-sidetrack: Have you read C.S. Lewis' An Experiment in Criticism?

It would make sense to sort art according to which kinds benefit from hand skills and which don't.

Drawing might do a little more for conceptual artists than "general planning"-- at least some conceptual art (if I understand what you mean-- does it include installations?) might be well served by being really good at visualization. On the other hand, a good bit of the work of visualization can be off-loaded to a computer these days.

From: richardthinks
Date: June 20th, 2009 03:59 pm (UTC)
I haven't read that piece by Lewis - but you've also reminded me that I must finally read Gadamer, who, as I understand it, pays attention to the dialogue between author and reader, especially to what's within the reader's "horizon" of meaning.

I think drawing's useful in many ways - it's certainly basic to how I think, but I'm not prescriptive about it. Although I don't know anything about their particular methods, it seems credible to me that, for instance, Rachel Whiteread, Cindy Sherman, Dan Flavin, Richard Wilson or Damien Hirst might be able to produce many of their famous works without drawing.

Paying attention to hand skills and their importance for various artworks I think would be really useful - it would at least help to sharpen up discussion of what constitutes "skilled hand work" and what does not - whether the hand might really be considered the "cutting edge of the mind" and so on, but there are pretty severe battle lines drawn at the moment between Art and Crafts/craftsmanship, which I think impedes such a discussion.

...but I've generally shied away from any consideration of art as art over the past decade; I'm really only concerned with art, or cultural production in general, as social communication these days. It gets me out of having to worry about whether a particular work is "any good."
From: nellorat
Date: June 21st, 2009 04:14 am (UTC)
It would be cool to have an understanding of why some changes get generally accepted (whether for a while or for a long time) and why some disappear, but I don't know whether such understanding is possible.

I think it's definitely possible, but it would be a lot of work! On a micro-level, William Safire does part of that work, but only with contemporary phrases, in his column in the New York Times Magazine. Yes, a lot of speech is lost, but its fossils are in newspapers, advertising, novels, etc. This is mostly in terms of phrases; grammar rules are usually in books, and linguists trace them in any written language. I don't know of one book that generalizes about sources of change over time.
From: whswhs
Date: June 19th, 2009 03:54 pm (UTC)
Many contended rules come from attempting to make English make logical sense, going against what people do naturally. The anti-double negative rule is not natural to English, as shown both by some uses in Shakespeare and by the persistence in Black English. But in MATH, two negatives equal a positive, so must it be in English.

Actually, I don't think that has much to do with it; I think that whole "the negative of a negative is a positive" is just an after-the-fact rationalization that ill-informed prescriptivists came up with to justify a preference they held on other grounds. I don't think they even take it seriously themselves. Because if they did, then the sentence "I never did nothing to nobody" would be acceptable to them—it's a triple negative, and if the negative of a negative is a positive, then the negative of a negative of a negative is the negative of a positive, which is a negative again! But I'm pretty confident that anyone who objected to a double negative would also object to a triple negative.

My take on this is that it's a matter of linguistic styles. Some languages favor maximal consistency throughout a sentence, and so if they negate one thing they negate everything; in the French I was taught, for example, je n'ai jamais rien fait à personne is perfectly grammatical, with its four distinct negations. Other languages favor making the point once and getting it over with, and detest redundancy; present-day formal English is one of them. Negating over and over is bad style in English not because of some specious mathematical argument (not many people could keep careful count of whether a spoken sentence had an odd or an even number of negatives!) but because it's vulgar excess. At least for negation, English is litotic rather than hyperbolic.
From: inquisitiveravn
Date: June 19th, 2009 04:57 pm (UTC)
I'd say the French version is still a triple negation. The "ne" is not a negation by itself, but is rather the first part of a two part negation. French requires both the "ne" in front of the verb, and another negative term behind it. The default post verb term is "pas" which does not appear in that sentence. What does appear is "jamais" ("never"), "rien" ("nothing"), and "personne"("nobody" or "no one"). Yes, non-Francophones, "personne" means "nobody," not "person" in this context.

Spanish, OTOH, is quite capable of using "no" as the only negation in a sentence, but frequently adds other negative terms with no concern for double negatives. "I never did nothing to nobody" would translate to "(Yo) nunca hice nada a nadie." Note, the pronoun "yo" can be dropped from the sentence and it will still be grammatically correct. Spanish does that a lot.

I vaguely recall that Russian also doesn't have problems with double negatives, but don't feel confident enough with it to attempt a translation.

From: whswhs
Date: June 19th, 2009 05:46 pm (UTC)
I'd say the French version is still a triple negation. The "ne" is not a negation by itself, but is rather the first part of a two part negation. French requires both the "ne" in front of the verb, and another negative term behind it. The default post verb term is "pas" which does not appear in that sentence. What does appear is "jamais" ("never"), "rien" ("nothing"), and "personne"("nobody" or "no one"). Yes, non-Francophones, "personne" means "nobody," not "person" in this context.

That was true in the French I learned in the 1960s, but it seems not to be true any more; the ne has been dropped as redundant in colloquial French. I remember being perplexed the first time I encountered it, in the film title L'une chante, l'autre pas. In fact, I would say that the standard French negation in the formal French I learned was a double negative, with ne . . . pas being two different negative words, one of which seemingly has now become optional.

Of course, historically, pas was a positive word meaning "a step," inserted for emphasis, just as personne meant "a person" and rien meant "a thing." But that's a purely etymological fact with no relevance to current French usage.
From: nellorat
Date: June 19th, 2009 10:07 pm (UTC)
I understand what you're saying, but if that were totally true, then "The experience was not unhelpful" would be considered redundant, but it's not, because taking out the "not" changes (reverses) the meaning. The wordiness is one reason why, for instance, George Orwell really dislikes such utterances, but it's not why "I ain't got no" (a double negative) is considered incorrect.

I probably confused things with Eminem's example, but darn, it's just SO GREAT in its multiplicity of negatives.
From: nancylebov
Date: June 19th, 2009 11:34 pm (UTC)
"The experience was not unhelpful" doesn't mean exactly the same thing as "The experience was helpful". The former leaves open the possibility that the experience could have been helpful, but actually didn't make any difference. It might even imply that pretty strongly.

I think I want Lojban.
From: whswhs
Date: June 20th, 2009 03:14 am (UTC)
People who think you can get a superior language by systematically planning the grammar according to a rigorously defined conceptual scheme strike me as the spiritual kin of people who think you can get a superior economy by having a board of experts make all the important economic decisions. It's just fine if you think you can arrive at a static solution that will be completely and permanently right, but it's seriously suboptimal if you want to leave things open to the dynamic emergence of superior solutions, including solutions to problems we haven't even thought of yet.

I tend to look at natural languages as a vast garden of wild grammars. A little judicious pruning and weeding may be worthwhile, but I don't want anything as formalist as 18th century French landscaping.
From: nancylebov
Date: June 20th, 2009 01:49 pm (UTC)
I can see your point. The Lojban approach to word roots (randomly picked from existing languages, weighted by number of speakers) especially gets on my nerves.

Still, English isn't especially good at some logical distinctions.
From: nellorat
Date: June 21st, 2009 04:18 am (UTC)
Contrary to both George Orwell and womzilla, I have argued that "I am not unhappy" is indeed different from "I am happy" in useful ways. OTOH, reading passages on standardized tests often are difficult in ways that include a tangled nest of "not"s, so I can see Orwell's point, too. It may just be better to say, "I am neither happy nor unhappy," "I am feeling better but not yet happy," and so on.