Minimal Computation in the Minimalist Program
Charles D. Yang
MIT AI Lab, Cambridge MA 02139
charles@

Abstract

This paper presents a parser that embodies principles of the Minimalist Program [Chomsky 1993, 1995] and examines some related computational complexity issues. First, the parser adopts a derivational conception of syntactic operations [Epstein 1995]. Specifically, syntactic relations are completely determined by a derivational structure-building process. Second, it assumes that syntactic operations satisfy featural requirements that are inherent morphological properties of lexical items. Further, the parser uses economy principles that prefer operations with the "least effort". Given these, a notion of "local computation" is realized in the parser that tries to eliminate syntactically illegitimate parses as soon as possible, leading to computational efficiency. Some psycholinguistic expectations naturally fall out.

1 The Minimalist Program

The essence of Chomsky's Minimalist Program [1993, 1995] is to eliminate syntactic entities and principles that are not absolutely necessary for linguistic description and explanation. The syntactic module of the language faculty serves its role in mediating form and meaning, and is thus constrained only by the so-called "Bare Output Conditions" (BOC), the empirical conditions at the LF (conceptual-intentional) and PF (articulatory-perceptual) interfaces. A number of conceptually natural, computationally simple Generalized Transformations (GT) are defined, for example Merge and Move. GTs operate on lexical items and compose syntactic objects from them. The relationship among syntactic objects constitutes syntactic structure. At the interfaces, the only available entities are syntactic objects and the syntactic structure they form. These entities contain inherent features of lexical items that impose constraints on PF production and LF interpretation. Syntactic operations are to deal with only these entities and to reflect these constraints, and nothing else.

More specifically, a lexical item consists of a feature triple <P, S, F>: the phonological, semantic, and formal features, respectively. For instance, the S features of the verb runs specify its semantic properties, i.e. present tense, singular, etc., while its F features indicate that it has Case slots to assign. Putting phonology aside (and thus PF and P features), the only conditions that LF imposes are of an interpretability nature: S features are interpretable and F features are not. In this sense, syntactic operations must eliminate F features and preserve S features to satisfy this LF legibility condition. What interpretations there are and how they are constructed then become properties of the structures that syntactic operations have produced. It is the nature of these syntactic operations that the Minimalist Program attempts to characterize. The aim is to reduce language-universal constraints (e.g. Subjacency, the ECP in the GB framework) to these general, simple operations along with some economy constraints, and to cast language variation in the form of morphological featural parameterizations, irreducible by hypothesis.

Note that the interface conditions (BOC) deal only with morphological features inherent to lexical items, be they formal or semantic. Since the BOC are the only output constraints imposed on syntax, it is assumed that syntactic operations are feature-driven, applying in order to check features. The principle Last Resort states that a syntactic operation T takes place iff T checks feature(s) or T is a necessary step for feature(s) to be checked later. The Economy principle prefers operations with the "least effort", usually defined in terms of the number of steps in an operation, the "distance" of a movement, etc.
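As a concrete (if simplified) illustration of the <P, S, F> triple, lexical entries can be written down as Prolog facts like the ones below. The paper reports a Prolog implementation but prints none of its code, so the predicate name lex/4 and the particular feature terms here are assumptions of this sketch, not the paper's own encoding.

    % Sketch of <P, S, F> lexical entries (hypothetical encoding).
    % P: phonological features (set aside here, as in the paper);
    % S: interpretable semantic features, which must not mismatch;
    % F: uninterpretable formal features, which must be checked and erased
    %    before LF (e.g. Case slots to assign, Case requirements to satisfy).
    lex(runs,      p([runs]),      s([tense(pres), num(sg), pers(3)]), f([assign(case)])).
    lex(michelle,  p([michelle]),  s([num(sg), pers(3)]),              f([need(case)])).
    lex(chocolate, p([chocolate]), s([num(sg), pers(3)]),              f([need(case)])).

On this view the S list carries what the interfaces may interpret, while the F list carries what the checking operations of Section 3 must eliminate before Spell-Out.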
In the rest of this paper, we adopt a derivational conception of the Minimalist Program, due to Sam Epstein [1995], and explore its computational consequences. We show that syntactic relations can be established among objects, derivationally, via a simple syntactic operation, Merge. A detailed feature checking algorithm is proposed, which is embedded in a canonical parser. Beyond these language-invariant, theoretically motivated assumptions, we stipulate nothing more. It is shown that this simple parsing theory captures a wide range of syntactic phenomena. Crucially, the parser obtains a type of "local computation" which aims to eliminate impossible or illegitimate parses as early as possible, thus greatly reducing the size of the parsing search space and the computational complexity. The incremental parsing algorithm, and the "memory" it uses to store unresolved syntactic objects, mimic certain aspects of human sentence processing.

2 Derivational Syntactic Relations

Syntactic relations only obtain among certain syntactic objects, under certain circumstances. For example, the relation of c-command holds between Michelle and chocolate, but not vice versa:

            V'' = likes
           /           \
      Michelle      V' = likes
                   /          \
                likes      chocolate

In the so-called representational theories of syntax, for instance Government and Binding Theory, the notion of c-command is usually defined from observation (e.g. [Reinhart 1979]). But consider the following conceptual puzzles: Why must c-command be defined this way, and not some other way? Why does this relation obtain between Michelle and chocolate, but not vice versa, nor between other arbitrary pairs of syntactic objects?

Epstein [1995] offers a solution to these puzzles. He observes that, if one adopts a derivational approach to syntactic analysis and views syntactic relations as established "in motion" as the derivation proceeds, we obtain only the set of syntactically possible relations (including c-command) and nothing else.

Consider the operation Merge, which takes a pair of syntactic objects α and β, sometimes directly lexical items, concatenates them and projects α, forming a compound object γ = {α, {α, β}}. The formation of syntactic structure is thus a temporal sequence of Merges, or a derivational history. Epstein claims (the First Law, [Epstein 1995]:25) that X can have a syntactic relation with Y if X is connected to Y from the very first moment X enters into the derivation. Let us consider the example above and its derivational history:

1. Merge likes and chocolate and project likes to form V' = {likes, {likes, chocolate}}.
2. Merge Michelle and V' and project V' to form V'' = {likes, {Michelle, V'}}.

Note that Michelle is connected to chocolate via V'' and V' from the very first moment (step 2) at which it enters into the derivation; therefore, it can have a syntactic (c-command) relation with chocolate. The reverse is not true, however, because chocolate is not connected with Michelle when it enters into the derivation at step 1, where Michelle was not even present in the derivation. See [Epstein 1995] for detailed conceptual and empirical arguments. Under this view, given a derivational history and the syntactic structure it forms, we obtain the set of syntactically possible relations among syntactic objects.
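This derivational construction of relations can be rendered directly in Prolog. The sketch below is an illustration only, not the paper's parser code: it assumes a term representation obj(Label, Left, Right) for the set {α, {α, β}}.

    % label(+Object, -Label): a lexical item labels itself; a merged object
    % carries the label of whichever daughter projected.
    label(Word, Word) :- atom(Word).
    label(obj(L, _, _), L).

    % merge(+Projector, +Other, -Gamma): Gamma stands for {alpha, {alpha, beta}},
    % with the projecting daughter supplying the label.
    merge(Alpha, Beta, obj(L, Alpha, Beta)) :- label(Alpha, L).

    % terms(+Object, -Terms): the object itself plus everything it contains.
    terms(Word, [Word]) :- atom(Word).
    terms(obj(L, A, B), [obj(L, A, B)|Rest]) :-
        terms(A, TA), terms(B, TB), append(TA, TB, Rest).

    % Derivational relations (Epstein's First Law, sketched): when X is merged
    % with Y, X enters into a relation (e.g. c-command) with Y and with every
    % term of Y; Y acquires no relation with the terms already inside X.
    c_command(X, Y, X-T) :- terms(Y, Ts), member(T, Ts).

At step 2 above, with VBar = obj(likes, likes, chocolate), the query findall(R, c_command(michelle, VBar, R), Rs) returns relations from michelle to V', to likes, and to chocolate; no corresponding relation from chocolate to michelle is ever established, since michelle had not yet entered the derivation at step 1.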
3 Parsing and Feature Checking

Our parser builds syntactic structure canonically (from left to right) [Knuth 1965]. We note that all lexical items in an input sentence must be connected to each other in one single structure when the derivation finishes (that is, if a valid derivation exists). Parsing is, roughly speaking, derivation in reverse. Consider the following sequence of syntactic objects, which are lexical items when the derivation starts:

    x_1, x_2, ..., x_n

Without loss of generality, consider x_1 and x_2. Only the following two cases are possible, if the input sentence has a grammatical parse:

1. x_1 and x_2 can be Merged, forming the syntactic object A_{1,2}; OR
2. x_1 cannot Merge with x_2, but Merges with some syntactic object A_{2,i}, i ≤ n.

The parser captures this intuition. It first attempts to Merge two adjacent syntactic objects x_1 and x_2 (derivational choice 1), and examines whether the compound object A_{1,2} is legitimate (later we show how to determine this with a feature checking algorithm), where A_{1,2} = {H, {x_1, x_2}} and H = x_1 or x_2 (derivational choices 1.1 and 1.2). If either 1.1 or 1.2 is successful, the corresponding A_{1,2} is formed and the derivation proceeds; otherwise the parser puts x_1 in a memory stack, proceeds to find A_{2,i} (derivational choice 2), and tries to Merge it with x_1 later. The parser is greedy in that, as long as A_{i,j} is admitted, it proceeds to find the maximum value of j, building the largest possible A_{i,j}. If a failure occurs, the parser backtracks to pursue the next available choice.
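This loop can be sketched in Prolog as follows. It is an illustration, not the paper's own parser: merge/3 is the toy definition given above, admissible/1 stands in for the feature checking algorithm described next and is left as an always-true stub, and the stack handling is deliberately simplified.

    % parse(+Input, +Stack, -Object): consume the input left to right, greedily
    % merging the current object with the next one; if no projection of the pair
    % is admissible, push the current object onto the memory stack for later.
    parse([X], [], X).                              % one object left, empty stack: done
    parse([X, Y|Rest], Stack, Result) :-            % choice 1: Merge X with Y ...
        ( merge(X, Y, G) ; merge(Y, X, G) ),        % ... with either daughter projecting
        admissible(G),                              % feature checking admits the result
        discharge(Stack, G, Stack1, G1),            % try to Merge in stacked objects
        parse([G1|Rest], Stack1, Result).
    parse([X, Y|Rest], Stack, Result) :-            % choice 2: X cannot Merge now;
        parse([Y|Rest], [X|Stack], Result).         % store X and Merge it later

    % discharge(+Stack, +New, -Stack1, -New1): if the most recently stored object
    % can now Merge with the newly built one, do so, and keep going.
    discharge([S|Stack], G, StackOut, GOut) :-
        ( merge(S, G, G1) ; merge(G, S, G1) ),
        admissible(G1), !,
        discharge(Stack, G1, StackOut, GOut).
    discharge(Stack, G, Stack, G).

    % Placeholder: the real test implements the checking conditions given below.
    admissible(_).

Prolog's backtracking supplies the "next available choice" for free: if a greedy Merge later leads to failure, the engine returns to the most recent choice point, here the decision between merging and stacking, or between the two projection options.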
Turning to the feature checking algorithm that determines whether a compound syntactic object A_{i,j} is admissible: recall that syntactic operations are driven to check features. Formal features must be eliminated at Spell-Out (SO), the point prior to LF interpretation; failure to do so cancels the derivation. Semantic features cannot be eliminated, but their mismatch (e.g. a tense or number mismatch between subject and verb) cancels the derivation as well. In the Minimalist Program, a feature checking relation is defined between a pair of features F ∈ α and F′ ∈ β, where α and β are the syntactic objects to which they belong. Now consider the compound object γ = {α, {α, β}} (whichever projects is irrelevant), and ask: under what conditions can a checking relation between F and F′ take place? We assume the following three conditions (⋆⋆⋆) for a checking relation to take place:

1. Feature Availability and Compatibility. F can only be checked by a certain set of corresponding features F′, and F′ must be available. An example is the Case-assigning features, which require available Case-receiving features.

2. Structural Relations (SR). α and β must be in certain constituent positions to obtain the checking relation. Here we continue to use the conventional notions of Specifier, Head, and Complement from X-bar theory, although their original meanings, and X-bar theory itself, have been eliminated altogether in the parsing system. The first object Merged with a projecting head is defined as a Complement; otherwise it is a Specifier. An example is the antecedent-anaphor relation, which holds, roughly, between Specifier and Complement positions.

3. Categorial Relations (CR). α and β must be a compatible categorial pair for the checking to take place. For instance, with Case-assigning and Case-receiving features, F and F′ can have a checking relation only if α is of a Case-assigning category (e.g. verbs, prepositions) and β is a nominal.

Assume that the syntactic objects α and β are Merged and β projects to β′. Let T_x denote the syntactic objects that have been Merged with x before (as its Complement and Specifier(s)), as in Figure 2.

    [Figure 2: β′ dominating α and β, with T_β inside β.]

We derive from Epstein's First Law that only the following objects may have syntactic relations:

1. α with x, ∀x such that x ∈ β ∪ T_β;
2. β with α, but NOT with x such that x ∈ T_α.

The feature checking algorithm determines whether β′ is admissible. Typically, a checking operation involves two types of actions: Assignment and Unification. Assignment refers to the assignment and reception of certain formal features, such as Case. Unification tries to match certain semantic features, such as number and tense. The principle Minimal Link Condition (MLC) [Chomsky 1995:311] states that features are checked by the closest available checkers. Therefore, at some point of the derivation, if a formal feature is under the conditions (as specified in ⋆⋆⋆) to be assigned, the parser tries to assign NOW; if a semantic feature is under the conditions to be unified, the parser tries to unify NOW. Note, however, that if a formal feature cannot be checked (by assignment) at the current point of the derivation, the derivation is allowed to proceed, as long as that formal feature can possibly be checked later. A misunification of semantic features, on the contrary, cancels the derivation and triggers the parser to backtrack to the next derivational choice. When a parse has been produced, the derived structure feeds into Spell-Out, where formal features are not expected.
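The two actions can be sketched as follows, continuing the hypothetical feature terms from the lexicon sketch in Section 1. This is again an illustration rather than the paper's code; in the actual parser, as noted in the next section, the checking relations are stated externally to the core engine.

    % Assignment: a formal assigner feature on one daughter checks and erases a
    % matching "need" feature on the other; formal features left unchecked may
    % survive for now, provided they can still be checked later in the derivation.
    assign_formal(F1, F2, Rest1, Rest2) :-
        select(assign(F), F1, Rest1),
        select(need(F), F2, Rest2).

    % Unification: shared semantic attributes (tense, number, ...) must agree;
    % any mismatch cancels the current derivational point immediately.
    unify_semantic([], _).
    unify_semantic([Attr|More], Others) :-
        Attr =.. [Name, Val],
        (  member(Other, Others), Other =.. [Name, Val2]
        -> Val == Val2
        ;  true
        ),
        unify_semantic(More, Others).

A realistic admissible/1 would apply assign_formal/4 and unify_semantic/2 to the F and S lists of the two daughters, subject to the availability, structural, and categorial conditions in (⋆⋆⋆), succeeding only when no semantic mismatch arises and every remaining formal feature still has a chance of being checked later.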
4 Local Computation and Complexity

The core parser is written in Prolog, in less than 400 lines. Feature checking relations are defined, in the form of (⋆⋆⋆), externally to the core parser. Features are encoded in lexical items as their idiosyncratic properties. We have been able to cover a wide range of data, including embedded clauses, WH-movement, the Case Filter, and φ-feature and anaphora agreement.

The parser builds syntactic objects and detects illegitimate derivations through feature checking. In particular, at a certain derivational point D, if a semantic feature mismatch occurs or a formal feature is placed in a position where it can never be checked, the parser immediately cancels D and ALL its descendants.

    [Figure 1: Local computation in the parsing search space.]

Therefore, the parser attempts to detect illegitimate derivations as early as possible, in a strictly local or locally predictable manner. This is our notion of Local Computation in syntactic derivations.
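In the Prolog sketches above this cancellation requires no extra machinery: the moment admissible/1 fails, the engine abandons that derivational point together with every continuation of it. A hypothetical driver query (toy input; a real lexicon and admissible/1 have to be supplied) makes the effect visible by counting the surviving derivations:

    % Each solution of parse/3 is one surviving derivation; derivational points
    % rejected by admissible/1 are discarded before any further Merge is tried,
    % keeping the number of derivations actually explored far below the n! worst case.
    ?- findall(Tree, parse([michelle, likes, chocolate], [], Tree), Trees),
       length(Trees, N).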
Further, we note that the parser has a character of "exhaustiveness", in that it potentially examines ALL possible derivations. It is easy to see that the parser has a worst-case complexity of O(n!). However, this worst case will probably never occur, because, unlike in some arbitrary formal systems, syntactic relations in natural languages are highly (and often locally) constrained. The realization of Local Computation captures these constraints so that the parser does not have to examine all the derivations, reducing computational complexity. As a result, the parser is very efficient. For instance, the sentence Mary has fondly enjoyed the picture of herself has 8! = 40320 derivations; the parser, however, examines only 65 of them. Formal analysis of parsing complexity may prove difficult and indirect, because locality constraints can vary quite dramatically from sentence to sentence.

It is interesting to reflect on why CFG parsing is efficient (cubic time) and how it relates to our parser. Simply put, a CFG is easy to parse because it is context-free! Syntactic relations in a CFG are highly local and thus easily predictable, making parsing efficient. Since the memory resources for feature-driven parsing are bounded (by the number of features on the words in an incoming sentence), and each operation cancels (at least) one feature, parsing human language sentences is therefore space-bounded, as in context-sensitive grammars (Aho, Hopcroft and Ullman 1977). We thus conjecture that human language is context-sensitive.

5 Psycholinguistic Implications

Note again that the parser is directly constrained by various MP principles and prefers "cheaper" merger operations. This is reflected in the parser's greedy nature: it tries to merge the next available word with the category currently being considered. Some properties of human sentence processing follow, as sketched briefly below.

Take the classical example of center-embedding, The cheese the rat the cat chased ate rot. The partial category the cheese cannot be merged with the rat, which itself cannot be merged with the cat, since no formal feature checking is involved. Therefore, all three categories are stacked into short-term memory, causing information overflow and leading to processing difficulties. Left- or right-embedding sentences such as John thinks Mary said Joan kissed Bill clearly do not pose any problem: the selectional properties of thinks and said require a CP, which is provided by the immediately following words, completed, and sent off to semantic interpretation, leaving no pressure on memory size.

Take the familiar cases of Late Closure, as in John said Mary died yesterday. A strong preference for attaching yesterday to died can be observed even in semantically anomalous cases, such as John said Mary will die yesterday. Our parser directly predicts this: yesterday attaches directly to the "current" category, the VP "Mary died", hence the preferred reading. Furthermore, since Merging with the upper-level VP "John said" is also a possible parsing choice, the parser is able to backtrack and obtain the other reading, which is more expensive and thus less preferred.

Consider the case of the Garden Path: The horse raced past the barn fell. When the parser encounters raced, it has two options to consider: (1) to take raced as the verbal predicate and the horse as the subject; or (2) to insert a pro in front of raced, making a clausal adjunct that modifies the horse. Prior to the end of the sentence, (1) is the preferred reading whereas (2) is the correct reading, hence the garden path. This phenomenon has a straightforward explanation in our parsing theory. Note that choice (1) immediately satisfies the formal feature requirement of the horse, namely Case assignment, leading to its semantic interpretation; (2), on the other hand, will not discharge the (need Case) feature until the arrival of fell. Our parsing algorithm directly predicts this preference.

Notice that the parser captures these aspects of human sentence processing by transparently implementing the underlying theory of grammar: a derivational construction of syntactic relations, and a simple operation, Merge. The sentence processing facts follow. This is in contrast with other approaches to understanding the human parser, dating back to Yngve (1960), where sentence processing facts are used to motivate the human sentence processing algorithm (e.g. as an LC-parser). The possibility and consequences of a Type-Transparent parser (Berwick and Weinberg 1984) will be taken up in future work.

Acknowledgment

I wish to thank Bob Berwick, Noam Chomsky and Sam Epstein for useful comments and discussions on a previous draft [Yang 1995].

References

1. Robert C. Berwick and Amy S. Weinberg (1984). The Grammatical Basis of Linguistic Performance. Cambridge, MA: MIT Press.
2. Noam A. Chomsky (1993). The Minimalist Program. In The View from Building 20, Hale and Keyser (eds). Cambridge, MA: MIT Press.
3. Noam A. Chomsky (1995). The Minimalist Program. Cambridge, MA: MIT Press.
4. Sam D. Epstein (1995). Un-principled syntax and the derivation of syntactic relations. Harvard University manuscript.
5. Donald E. Knuth (1965). On the translation of languages from left to right. Information and Control 8:6, 607-639.
6. Tanya Reinhart (1979). The syntactic domain for semantic rules. In Formal Semantics and Pragmatics, Guenther and Schmidt (eds). Dordrecht: D. Reidel.
7. Charles D. Yang (1995). Derivational syntactic relations, feature checking and local computation. Term paper for Linguistic Structures, MIT.
8. Victor Yngve (1960). A model and a hypothesis for language structure. Proceedings of the American Philosophical Society 104.