Science topic

Chomskyan Grammar - Science topic

Explore the latest questions and answers in Chomskyan Grammar, and find Chomskyan Grammar experts.
Questions related to Chomskyan Grammar
  • asked a question related to Chomskyan Grammar
Question
3 answers
The most venerable professors and research scholars
Your critical comments, valuable opinions, scientific facts and thoughts, and supportive discussion are invited on how structural grammar and IC analysis can be justified within recent pedagogical and enhancement trends in EIP for EFL adult learners.
I shall be sincerely thankful for your kind participation.
Best,
Dr. Meenakshi
Relevant answer
Answer
In my opinion, it depends on the extent to which the language learner wants to understand how the language works. If we view language as a medium or tool for conveying messages, then incorporating structural grammar and IC analysis is beneficial: it helps learners use grammar correctly within the intended communicative function, so that the encoder delivers the correct message to the decoder.
  • asked a question related to Chomskyan Grammar
Question
230 answers
What if, in addition to advancing “Artificial Intelligence”, we further investigated our “Natural Intelligence”?
For example: Natural Intelligence and research in neurodegenerative diseases.
While we are still at an early stage in answering some key questions about Natural Intelligence [NI], such as what algorithms the mind uses, the rapidly advancing field of Artificial Intelligence [AI] has already begun to change our daily lives. Machine learning has shown remarkable potential in healthcare, facilitating speech recognition, clinical image analysis, and medical diagnosis. For example, there is a growing need for automation of medical imaging, as it takes a lot of time and resources to train an expert human radiologist. Deep learning AI architectures have been developed to analyze medical images of the brain, lungs, heart, breast, liver, and skeletal muscle, some of which have already been used in clinics to aid in disease diagnosis. Juana Maria Arcelus-Ulibarrena
Cfr.
Note: this question refers to “Natural Intelligence” [NI], not to “Naturalistic Intelligence”.
Relevant answer
  • asked a question related to Chomskyan Grammar
Question
5 answers
Is it “true” that when anyone rewrites a “sentence” in Logical Form (LF) by deploying metalinguistic constants and variables, the ultimate output reveals the ‘true’ meaning of the given sentence? In LF, major attention is devoted to describing and understanding the ‘real world’. This supposedly logical-positivist “real” is incorporated into the logical analysis of sentences in the algorithmic chain from S-Structure to LF by deploying sentential calculus. LF mainly follows Fregean compositionality or its derivatives, such as the Katz-Fodorian model. The following questions may be asked:
1. What is “real” in this real world? (To answer such a question, one may take a clue from Russell’s An Inquiry into Meaning and Truth: “We all start from ‘naïve realism,’ i.e., the doctrine that things are what they seem. We think that grass is green, that stones are hard and snow is cold. But physics assures us that the greenness of grass, the hardness of stones, and the coldness of snow are not the greenness, hardness and coldness that we know in our own experience, but something very different. The observer, when he (sic) seems to himself (sic) to be observing a stone, is really, if physics is to be believed, observing the effects of the stone upon himself (sic). Thus science seems to be at war with itself: when it most means to be objective, it finds itself plunged into subjectivity against its will. Naïve realism leads to physics, and physics, if true, shows that naïve realism is false. Therefore naïve realism, if true, is false; therefore it is false.” (1940:15)
2. What happens in LF if anyone puts Russell’s paradox (1913) into it? How do we incorporate Gödel’s theorem to tackle a formal system like LF? According to Gödel’s theorem (1931), no formal system is complete enough to handle all the problems within its own formal paradigm. If anyone puts a Gödel proposition or a Russell-style paradox (“One Calcuttan says that all Calcuttans are liars”) into the LF of an S-Structure, the whole formal, mechanical, algorithmic system for gauging meaning may collapse.
3. The Katz-Fodorian (1963) system of binary componential analysis ignores human beings’ prototypical cognition of meaning. Some cognitive scientists have observed that meaning, as grasped by human beings, cannot be analyzed into the stipulated components, since humans understand meaning through prototypical cognition. What should we follow in semantic analysis: the technical intelligentsia’s critical discursive habit of paraphrasing, or the commonsense deployment of prototypes?
4. Let us switch over to another school and try to understand the semantic problems raised by continental philosophers (under the umbrella of so-called Post-Formalism/Post-Structuralism). These Post-Formalists talk about the plural meanings of non-disposable texts, as well as something called ‘surplus meanings’, which is not at all analyzable or quantifiable. According to them, the meaning-site is too slippery an area, and any endeavor to formalize such a site will end in vain. Do you think that they are neglecting ‘science’ and its formalism by promoting “un-scientific” non-formalism?
Relevant answer
Answer
Your very first sentence already betrays a confusion: Rewriting a “sentence” in Logical Form (LF) is not metalinguistic. When you simply translate a sentence about the world, i.e. a sentence in the object language, the new sentence is still in the object language. A sentence in the metalanguage is one which talks about the object language. For first-order logic, the wffs (well-formed formulas, which might be translations, regimentations, or abbreviations of ordinary English sentences) are in the object language, whereas the sentences describing the formation rules (the "grammar" that specifies what it is to be a wff) and the transformation rules (rules of deduction) are in the metalanguage. Here are some examples:
  • Object language sentence: "The cat is on the mat."
  • Metalanguage sentence: ' "The cat is on the mat" contains 6 words.'
  • Metalanguage (or meta-metalanguage) sentence: " ' "The cat is on the mat" contains 6 words.' is a sentence in the metalanguage. "
  • Metalanguage sentence: ' "Ocm" is a translation into the notation of first-order logic of the English sentence "The cat is on the mat". '
  • Object language sentence: "Ocm"
  • Object language sentence: "Socrates was an ancient Greek."
  • Metalanguage sentence: ' "Socrates" has 8 letters. '
  • Metalanguage sentence: ' "Socrates" refers to an ancient Greek '
  • Metalanguage sentence: ' "Socrates" refers to Socrates. '
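The object/metalanguage distinction above can be mirrored in a short Python sketch (my own illustrative analogy, not part of the original answer): object-language sentences are plain strings, while metalanguage claims are statements *about* those strings.

```python
# Object-language sentence: just a string that talks about the world.
object_sentence = "The cat is on the mat."

# Metalanguage claim: a statement about the object-language sentence,
# here checked mechanically -- it contains 6 words.
word_count = len(object_sentence.rstrip(".").split())
assert word_count == 6

# Meta-metalanguage: a claim about the metalanguage claim itself.
meta_claim = '"The cat is on the mat." contains 6 words.'
assert meta_claim.startswith('"The cat is on the mat."')
```

The key point the sketch encodes: the assertions never mention cats or mats directly; they mention the *sentence*, which is exactly what makes them metalinguistic.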
  • asked a question related to Chomskyan Grammar
Question
1 answer
There are two or more approaches to explaining coordination in compounding: the lexicalist and the generativist. Does morphology-syntax theory work for coordinate compounds in English? If so, can anyone explain with examples?
Relevant answer
Answer
Dear Abdul (is this the correct form of address?),
I am intrigued by your question. It seems to suggest that lexicalist approaches are not generativist. I am not sure why you'd say that. My impression was that within a generativist approach there are lexicalist camps (taking the position that all morpho-/phono-syntactic properties are projected from terminal nodes) and there are constructivist camps that might take the construction as primary.
In any case, "father-daughter dance" might be derived via Incorporation. If we start with an NP [dance [PP of father and daughter] ] , the PP can incorporate into the head of the NP to give the requisite compound.
I haven't touched syntax for some time, so my account is likely to be problematic, but I hope it's helpful.
Best,
Lian-Hee 
  • asked a question related to Chomskyan Grammar
Question
1 answer
I am working on the Article Choice Parameter Hypothesis proposed by Ionin, in which he theorizes that there are two article settings that L2 learners can have access to. The Samoan language exemplifies setting I, which distinguishes articles based on specificity, while English exemplifies setting II, distinguishing articles based on definiteness. If the hypothesis is true, Samoan must have se & le, denoting non-specific DPs & specific DPs respectively, differentiated by specificity. In other words, the article se must introduce both non-specific definite and non-specific indefinite DPs. I've searched through the literature yet can't find any evidence.
Relevant answer
Answer
Dear Quyen,
     The hypothesis is a very interesting one, but I haven't heard of it before. Could you give a reference? I'm not familiar with Samoan, but I work with a Native American language that has determiners which distinguish human vs. non-human. Would that fit his hypothesis?
    --Rudy
  • asked a question related to Chomskyan Grammar
Question
3 answers
In Chomsky (1995) and (2000), the Inclusiveness Condition was introduced, which Chomsky argues is a principle for the efficient computation of 'perfect' languages. My question concerns the developments of this condition (for instance, Chomsky replaced 'object' with 'feature' in his description of the condition when he stated that no new features may be added in the course of the syntactic computation). Further, what is the standing of bare phrase structure theory? Have any serious attempts been made in its development? Is it still of interest to syntacticians? Thanks.
Relevant answer
Answer
With regard to the Inclusiveness Condition, by definition, it prohibits the introduction of new features in the course of linguistic derivation (cf. Narita (2011: 16) Phasing in Full Interpretation). I think most of the minimalist analyses implicitly (and sometimes explicitly) assume the condition, but there is an exception to it. In order to explain EPP effects, it has occasionally been assumed that the edge feature (or the EPP feature) can be introduced in the course of syntactic derivation (cf. Müller (2011) Constraints on Displacement: A phase-based approach). However, Chomsky’s (2013) Labeling Algorithm approach suggests an alternative analysis of EPP effects, and thus it can dispense with such an exceptional assumption. Consequently, the Inclusiveness Condition can be regarded as more firmly established than before. Nonetheless, I argued in my paper (The Inclusiveness Condition in Survive-minimalism) that the condition is no longer required as an independent principle under the strictly derivational theory of syntax referred to as Survive-minimalism (cf. Stroik (2009), Stroik and Putnam (2013)). I hope this can be of some help.
  • asked a question related to Chomskyan Grammar
Question
3 answers
Hello to all. I can't find any contributions to machine translation using pregroup grammar or Lambek calculus, on the net. I am working on this and wanted to know if there is any literature.
Relevant answer
Answer
Dear Muhammad,
Please find the file in the attachment.
Hope it helps.
  • asked a question related to Chomskyan Grammar
Question
14 answers
I read the following example in one of my professor's notes.
1) We have an SLR(1) grammar G as follows. We use an SLR(1) parser generator to generate a parse table S for G, and an LALR(1) parser generator to generate a parse table L for G.
S->AB
A->dAa
A->lambda (lambda is the empty string, of length 0)
B->aAb
Solution: the number of reduce (R) entries in S is greater than in L.
but in one site I read:
2) Suppose T1 and T2 are created with SLR(1) and LALR(1) generators for grammar G. If G is an SLR(1) grammar, which of the following is TRUE?
a) T1 and T2 do not differ at all.
b) The total number of non-error entries in T1 is lower than in T2.
c) The total number of error entries in T1 is lower than in T2.
Solution:
The LALR(1) algorithm generates exactly the same states as the SLR(1) algorithm, but it can generate different actions; it is capable of resolving more conflicts than the SLR(1) algorithm. However, if the grammar is SLR(1), both algorithms will produce exactly the same machine (a is right).
Could anyone explain to me which of them is true?
EDIT: In fact, my question is why, for a given SLR(1) grammar, the parse tables of LALR(1) and SLR(1) should be exactly the same (error and non-error entries equal, and the number of reduce entries equal), while for the above grammar the number of reduce entries in S is greater than in L.
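One way to see where the reduce entries come from is to compute FOLLOW sets for the grammar in the question. SLR(1) places "reduce A -> ." under *every* terminal in FOLLOW(A), while LALR(1) uses per-state lookahead sets, which are subsets of FOLLOW(A) and so can yield fewer reduce entries. The following is a minimal FIRST/FOLLOW computation I wrote for illustration (it is not from the professor's notes; `eps` stands for lambda, `$` for end of input).

```python
EPS = "eps"
grammar = {
    "S": [["A", "B"]],
    "A": [["d", "A", "a"], [EPS]],
    "B": [["a", "A", "b"]],
}
nonterminals = set(grammar)

def first_of(sym, first):
    # FIRST of a terminal is the terminal itself.
    return first[sym] if sym in nonterminals else {sym}

def compute_first():
    first = {nt: set() for nt in nonterminals}
    changed = True
    while changed:  # iterate to a fixed point
        changed = False
        for nt, prods in grammar.items():
            for prod in prods:
                add = set()
                for sym in prod:
                    if sym == EPS:
                        add.add(EPS)
                        break
                    f = first_of(sym, first)
                    add |= f - {EPS}
                    if EPS not in f:
                        break
                else:  # every symbol was nullable
                    add.add(EPS)
                if not add <= first[nt]:
                    first[nt] |= add
                    changed = True
    return first

def compute_follow(first):
    follow = {nt: set() for nt in nonterminals}
    follow["S"].add("$")  # S is the start symbol
    changed = True
    while changed:  # iterate to a fixed point
        changed = False
        for nt, prods in grammar.items():
            for prod in prods:
                for i, sym in enumerate(prod):
                    if sym not in nonterminals:
                        continue
                    trailer, nullable = set(), True
                    for nxt in prod[i + 1:]:
                        f = first_of(nxt, first)
                        trailer |= f - {EPS}
                        if EPS not in f:
                            nullable = False
                            break
                    if nullable:
                        trailer |= follow[nt]
                    if not trailer <= follow[sym]:
                        follow[sym] |= trailer
                        changed = True
    return follow

first = compute_first()
follow = compute_follow(first)
print(sorted(follow["A"]))  # ['a', 'b']
```

Since FOLLOW(A) = {a, b}, every SLR state containing the item A -> . gets a reduce entry under *both* a and b, even though in some of those states only one of the two can actually follow A. That surplus is exactly why table S can hold more reduce entries than the LALR table L.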
Relevant answer
Answer
Perhaps my tutorial on compiling theory might help you:
In particular, take a look at the syntax section.
If you want to do experiments yourself, our jaccie tool plus additional documentation can be found at:
Happy experimenting!
  • asked a question related to Chomskyan Grammar
Question
10 answers
My question is, for a given AGREE relationship between X and Y, how could one determine which category hosts the interpretable feature and which one hosts the uninterpretable one?
AGREE is driven by interpretable/uninterpretable feature pairs e.g. uPhi/iPhi or uWH/iWH etc. For examples like phi checking between T and a subject, it is taken for granted that the subject has interpretable phi features and that T has uPhi features; but the subject has uT (or uCase) features while T has iT (or iCase).
One argument for this seems to be, in part, semantic: there is a semantic reality to, for example, number, which makes it plausible that number is interpretable on nouns but uninterpretable on verbs. (Although one must also concede that number on pluractional or reciprocal verbs is not all that far-fetched, which undermines this type of argument.) However, when looking at other constructions, or other types of syntactic interaction, it's not always clear that this type of argument works.
For example, in topicalization constructions such as (a,b,c), let's assume that the topicalized constituent moves to SpecTopicP to check a Topic feature. But is iTopic a feature on the moved constituent and uTopic on the head of TopicP? Or is it the other way around? Is this something that could be parameterized? More specifically, I'd like to know what *arguments* could be marshalled either way.
(a) Peter Florrick I could vote for.
Relevant answer
Answer
Ibrahim, sorry to put my nose in, but, in my understanding, that would explain nothing. Saying that islands are constraints and the EPP is a constraint is as explanatory as postulating that every language needs a Spec-T subject. It just states the facts rather than explaining them. Phonology is a completely different matter, since in phonology there seem to be real constraints, some universal (like the one against syllables like "LBA" while allowing "BLA") and some language-particular (like not starting a syllable with "S" in Portuguese). There is nothing like that in syntax.
  • asked a question related to Chomskyan Grammar
Question
5 answers
With a grammar (in BNF form) and source code as input, ANTLR is generating an AST/parse tree with tokens as terminals and "nil" as the parent/root in this case. This is not an appropriate tree. I came to know that the grammar has to be rewritten in order to generate a proper AST/parse tree, but I couldn't find any appropriate rules for rewriting the grammar.
Relevant answer
Answer
I would like to know the steps that are to be followed in rewriting grammar, so that ANTLR generates appropriate parse trees.
  • asked a question related to Chomskyan Grammar
Question
2 answers
The Chomsky hierarchy is a guideline on a language's expressive power. The linear feedback shift register (LFSR) is a very interesting "element" in the structure of a language, and there is a large base of theoretical literature on the subject.
Relevant answer
Answer
Thank you for your response.
Is a machine at level 2 (context-free) capable of generating an LFSR sequence (for example, one that implements the Berlekamp-Massey algorithm)?
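One observation that bears on this question: an LFSR has only finitely many register states, so its output stream is produced by a finite automaton, placing it at level 3 (regular) of the Chomsky hierarchy; a fortiori, any level-2 (context-free) machine can generate it. A small Python sketch of a Fibonacci LFSR illustrates the finite-state behavior (my own example; the 4-bit width and tap positions are an arbitrary maximal-length choice, not taken from the question).

```python
def lfsr_stream(state, taps, nbits, steps):
    """Return `steps` output bits of a Fibonacci LFSR.

    state: initial register contents (must be nonzero)
    taps:  bit positions XORed together to form the feedback bit
    nbits: register width in bits
    """
    out = []
    for _ in range(steps):
        out.append(state & 1)             # output the least significant bit
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1        # XOR the tap bits
        state = (state >> 1) | (fb << (nbits - 1))  # shift in the feedback
    return out

# A maximal-length 4-bit LFSR cycles through all 2**4 - 1 = 15 nonzero
# states, so its output repeats with period 15.
bits = lfsr_stream(state=0b1001, taps=(0, 1), nbits=4, steps=30)
print(bits[:15] == bits[15:])  # True
```

Because the register revisits a previous state after at most 2**nbits - 1 steps, the output is eventually periodic, which is exactly the finite-state (regular) signature.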
  • asked a question related to Chomskyan Grammar
Question
2 answers
Transformation analysis on speech level
Relevant answer
Answer
Thank you so much, Mr. Caka. I hope I will be enlightened.
  • asked a question related to Chomskyan Grammar
Question
2 answers
Are the "operators" of descriptive grammar a) word allocation and b) suffixes, while the operators of explanatory grammar are a) semantics and b) pragmatics?
Relevant answer
Answer
To my mind, description is often as far as we get, but without any explanatory insights there is no point in linguistic enquiry. Besides, if "description, period" were enough (or at all possible), would there exist different (and often highly conflicting) schools and paradigms? I tend to believe "descriptive grammar" does not quite exist. Best regards.
  • asked a question related to Chomskyan Grammar
Question
5 answers
I'm currently working on a comparison between Cognitive Construction Grammar and Minimalist syntax as theories for modeling language variation and change. I received some training in Minimalism, but that was a long time ago, so my knowledge of the topic is a bit rusty. My supervisors are cognitive linguists too, and just about everybody I work with is a cognitive linguist, so no help there. Could someone revise sections 6 and 7 of this paper (about 5 pages, no more) to check whether I got everything about right?
Relevant answer
Answer
I can revise it if you want, I'll write you back in a couple of days,
Groeten
  • asked a question related to Chomskyan Grammar
Question
4 answers
The Minimalist Program (Chomsky 1995, 1999, 2000) includes the idea that there is no distinction between morphology and syntax, and that all word formation takes place in the narrow syntax. However, what is the evidence for that? Also, what is the syntax of word formation? Is there any asymmetry in the structure of words?
Relevant answer
Answer
Thank you very much for your detailed answer. Yes, I am aware of all the references you cited. As for the asymmetry, yes, I think headedness is the answer. But do we need to merge a root plus a set of features before another root is merged? See Mukai 2008.