linguist smackdown!
May. 9th, 2005 02:27 pm
Whoa, the gauntlet is thrown. Richard Sproat to P&P/Minimalism: "put up or shut up": create a working P&P parser by 2008 or concede defeat.
And much heated foofaraw then ensued.
I don't know how I missed all the fun before. Oh wait, yes I do. I've been kinda busy.
Quoth Sproat:
Even more puzzling is the lack of any serious attempt to build a P&P-style parser that is able to learn from unannotated input (as Klein and Manning's systems do). What is odd about this is that it is practically de rigueur when one writes a paper in the P&P framework, to invoke the old arguments about "poverty of stimulus" and how the only feasible explanation for the way in which children acquire language is by having a substantial portion of the grammatical knowledge already hard-wired in the form of grammatical principles and parameters. Why, if this is the case, has nobody tested this claim by building a computational model that is able to do what every child is able to do: namely learn a language well enough that reasonable structures can be assigned to any grammatical (and, yes, even ungrammatical) sentence given to it? It seems to us that if the claims on behalf of P&P approaches are to be taken seriously, it is an obvious requirement that someone provide a computational learner that incorporates P&P mechanisms, and uses it to demonstrate learning of the grammar of a natural language.
With this in mind, we offer the following challenge to the community.
THE CHALLENGE
We challenge someone to produce, by May of 2008, a working P&P parser that can be trained in a supervised fashion on a standard treebank, such as the Penn Treebank, and perform in a range comparable to state-of-the-art statistical parsers.
Our selection of May 2008 is motivated by the observation that this is about three years in the future, and a committed graduate student who starts thinking about this problem now, could, by May of 2008 reasonably be expected to produce a Ph.D. dissertation that solves this problem.
...
It is worth pointing out that our challenge allows the P&P model a considerable, if illicit advantage. We are only asking for supervised grammar induction, when, in fact, unsupervised learning on the basis of parameterized principles would be the more reasonable test of the model's viability.
...[the final zinger, emphasis mine:]
In fact, we would be delighted if someone succeeds in meeting our challenge. Such success would convince us that the P&P enterprise is, after all, a testable theory with genuine scientific content.
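For context, the "supervised grammar induction" Sproat treats as the easy baseline amounts to reading rewrite rules off a treebank's annotated trees and estimating each rule's probability by relative frequency. A minimal stdlib-only sketch of that idea, using a hypothetical two-sentence toy "treebank" rather than the actual Penn Treebank (whose annotation conventions have many more wrinkles):

```python
from collections import Counter

def parse_tree(s):
    """Parse a bracketed string like '(S (NP ...) (VP ...))' into (label, children) tuples."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()
    def helper(i):
        assert tokens[i] == "("
        label = tokens[i + 1]
        i += 2
        children = []
        while tokens[i] != ")":
            if tokens[i] == "(":
                child, i = helper(i)
                children.append(child)
            else:
                children.append(tokens[i])  # terminal word
                i += 1
        return (label, children), i + 1
    tree, _ = helper(0)
    return tree

def collect_productions(tree, out):
    """Read rewrite rules LHS -> RHS off a tree (supervised: the structure is given)."""
    label, children = tree
    rhs = tuple(c[0] if isinstance(c, tuple) else c for c in children)
    out.append((label, rhs))
    for c in children:
        if isinstance(c, tuple):
            collect_productions(c, out)

def induce_pcfg(treebank):
    """Relative-frequency estimation: P(LHS -> RHS) = count(rule) / count(LHS)."""
    rule_counts, lhs_counts = Counter(), Counter()
    for s in treebank:
        rules = []
        collect_productions(parse_tree(s), rules)
        for lhs, rhs in rules:
            rule_counts[(lhs, rhs)] += 1
            lhs_counts[lhs] += 1
    return {rule: n / lhs_counts[rule[0]] for rule, n in rule_counts.items()}

treebank = [
    "(S (NP (DT the) (NN dog)) (VP (VBD barked)))",
    "(S (NP (DT the) (NN cat)) (VP (VBD slept)))",
]
grammar = induce_pcfg(treebank)
print(grammar[("S", ("NP", "VP"))])  # 1.0
print(grammar[("NN", ("dog",))])     # 0.5
```

Of course, nothing here addresses the hard part of the challenge: doing this with P&P machinery instead of counting, or (per the "illicit advantage" remark) doing it without the annotated trees at all.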
no subject
Date: 2005-05-10 01:26 am (UTC)

no subject
Date: 2005-05-10 01:28 am (UTC)
but i can relate an exciting note: i was recently reading a dan klein paper and noticed he cited something by carroll and charniak. i realized that when i visited brown i went with a coworker named carroll who had worked with charniak, and checking the paper's citations found it was the same guy. pretty neat.)
no subject
Date: 2005-05-10 01:34 am (UTC)
Charniak really is The Man when it comes to statistical parsing.
I think he's one of those CS people who started doing it all on blackboards.
no subject
Date: 2005-05-10 02:29 am (UTC)
and yeah, i get the impression from the school that's what they do there.