
A Machine Learning Approach to Automatic Production of Compiler Heuristics

Antoine Monsifrot, François Bodin, and René Quiniou

IRISA-University of Rennes France

{amonsifr,bodin,quiniou}@irisa.fr

Abstract. Achieving high performance on modern processors heavily relies on the compiler optimizations to exploit the microprocessor architecture. The efficiency of optimization directly depends on the compiler heuristics. These heuristics must be target-specific and each new processor generation requires heuristics reengineering.

In this paper, we address the automatic generation of optimization heuristics for a target processor by machine learning. We evaluate the potential of this method on an always legal and simple transformation: loop unrolling. Though simple to implement, this transformation may have strong effects on program execution (good or bad). However, deciding whether to perform the transformation is difficult since many interacting parameters must be taken into account. So we propose a machine learning approach.

We try to answer the following questions: is it possible to devise a learning process that captures the relevant parameters involved in loop unrolling performance? Do the Machine Learning Based Heuristics achieve better performance than existing ones?

Keywords: decision tree, boosting, compiler heuristics, loop unrolling.

1 Introduction

Achieving high performance on modern processors heavily relies on the ability of the compiler to exploit the underlying architecture. Numerous program transformations have been implemented in order to produce efficient programs that exploit the potential of the processor architecture. These transformations interact in a complex way. As a consequence, an optimizing compiler relies on internal heuristics to choose an optimization and whether or not to apply it. Designing these heuristics is generally difficult. The heuristics must be specific to each implementation of the instruction set architecture. They are also dependent on changes made to the compiler.

In this paper, we address the problem of automatically generating such heuristics by a machine learning approach. To our knowledge this is the first study of machine learning to build these heuristics. The usual approach consists in running a set of benchmarks to set up heuristics parameters. Very few papers have specifically addressed the issue of building such heuristics. Nevertheless approximate heuristics have been proposed [8,11] for unroll and jam, a transformation that is like unrolling (our example) but that behaves differently.

D. Scott (Ed.): AIMSA 2002, LNAI 2443, pp. 41–50, 2002.
© Springer-Verlag Berlin Heidelberg 2002

Our study aims to simplify compiler construction while better exploiting optimizations. To evaluate the potential of this approach we have chosen a simple transformation: loop unrolling [6]. Loop unrolling is always legal and is easy to implement, but because it has many side effects, it is difficult to devise a decision rule that will be correct in most situations.

In this novel study we try to answer the following questions: is it possible to learn a decision rule that selects the parameters involved in loop unrolling efficiency? Do the Machine Learning Based Heuristics (denoted MLBH) achieve better performance than existing ones? Does the learning process really take into account the target architecture?

To answer the first question we build on previous studies [9] that defined an abstract representation of loops in order to capture the parameters influencing performance. To answer the second question we compare the performance of our Machine Learning Based Heuristics and the GNU Fortran compiler [3] on a set of applications. To answer the last question we have used two target machines, an UltraSPARC machine [12] and an IA-64 machine [7], and used on each the MLBH computed on the other.

The paper is organized as follows. Section 2 gives an overview of the loop unrolling transformation. Section 3 shows how machine learning techniques can be used to automatically build loop unrolling heuristics. Section 4 illustrates an implementation of the technique based on the OC1 decision tree software [10].

2 Loop Unrolling as a Case Study

The performance of superscalar processors relies on a very high frequency¹ and on the parallel execution of multiple instructions (this is also called Instruction Level Parallelism – ILP). To achieve this, the internal architecture of superscalar microprocessors is based on the following features:

Memory hierarchy: the main memory access time is typically hundreds of times greater than the CPU cycle time. To limit the slowdown due to memory accesses, a set of intermediate levels are added between the CPU and the main memory; the level closest to the CPU is the fastest, but also the smallest. The data or instructions are loaded by blocks (sets of contiguous bytes in memory) from one memory level of the hierarchy to the next level to exploit the following fact: when a program accesses some memory element, the next contiguous one is usually also accessed in the very near future. In a classical configuration there are 2 levels, L1 and L2, of cache memories, as shown in Figure 1. The penalty to load data from main memory tends to be equivalent to executing 1000 processor cycles. If the data is already in L2, it is one order of magnitude less. If the data is already in L1, the access can be done in only a few CPU cycles.

¹ Typically 2 gigahertz, corresponding to a processor cycle time of 0.5 nanosecond.

[Figure: memory hierarchy CPU – L1 cache – L2 cache – main memory, with relative access costs of roughly x1, x10, x100 and x1000 processor cycles.]

Fig. 1. Memory hierarchy

Multiple Pipelined Functional Units: the processor has multiple functional units that can run in parallel to execute several instructions per cycle (typically an integer operation can be executed in parallel with a memory access and a floating point computation). Furthermore these functional units are pipelined. This divides an operation into a sequence of steps that can be performed in parallel. Scheduling instructions in the functional units is performed in an out-of-order or in-order mode. Contrary to the in-order mode, in the out-of-order mode instructions are not always executed in the order specified by the program. When a processor runs at maximum speed, each pipelined functional unit executes one instruction per cycle. This requires that all operands and branch addresses are available at the beginning of the cycle. Otherwise, functional units must wait for some delays. The processor performance depends on these waiting delays.

The efficiency of the memory hierarchy and of ILP are directly related to the structure and behavior of the code. Many program transformations reduce the number of waiting delays in program execution.

Loop unrolling [6] is a simple program transformation where the loop body is replicated many times. It may be applied at source code level to benefit from all compiler optimizations. It improves the exploitation of ILP: increasing the size of the body augments the number of instructions eligible for out-of-order scheduling. Loop unrolling also reduces loop management overhead, and it has some beneficial side effects on later compiler steps such as common sub-expression elimination. However it also has many negative side effects that can cancel the benefits of the transformation:

– the instruction cache behavior may be degraded (if the loop body becomes too big to fit in the cache),

– the register allocation phase may generate spill code (additional load and store instructions),

– it may prevent other optimization techniques.
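As a sketch of the transformation itself, the following hypothetical source-level example unrolls a summation loop by a factor of 4, with a remainder loop for trip counts that are not a multiple of the factor. The function names are ours, and a compiler would perform this on its intermediate representation rather than by hand:

```python
def sum_rolled(a):
    # Original loop: one element per iteration, one bound test each time.
    s = 0.0
    for x in a:
        s += x
    return s

def sum_unrolled(a):
    # Loop body replicated 4 times: the loop-management overhead
    # (index update, bound test) is paid once per 4 elements, and the
    # additions become eligible for parallel scheduling.
    s = 0.0
    n = len(a)
    i = 0
    while i + 4 <= n:
        s += a[i]
        s += a[i + 1]
        s += a[i + 2]
        s += a[i + 3]
        i += 4
    # Remainder loop for the leftover iterations when the trip count
    # is not a multiple of the unrolling factor.
    while i < n:
        s += a[i]
        i += 1
    return s
```

Both versions compute the same result; only the control overhead per element changes.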

As a consequence, it is difficult to fully exploit loop unrolling. Compilers are usually very conservative. Their heuristics are generally based on the loop body size: under a specific threshold, if there is no control flow statement, the loop is unrolled. This traditional approach under-exploits loop unrolling [5] and must be adapted when changes are made to the compiler or to the target architecture. The usual approach to build loop unrolling heuristics for a given target computer consists in running a set of benchmarks to set up the heuristics parameters. This approach is intrinsically limited because in most optimizations, such as loop unrolling, too many parameters are involved. Microarchitecture characteristics (for instance the size of the instruction cache) as well as the other optimizations (for instance instruction scheduling) that follow loop unrolling during the compilation process should be considered in the decision procedure. The main parameters, but not all (for instance the number of instruction cache misses), which influence loop unrolling efficiency directly depend on the loop body statements. This is because loop unrolling mainly impacts code generation and instruction scheduling. As a consequence, it is realistic to base the unrolling decision on the properties of the loop code while ignoring its execution context.

3 Machine Learning for Building Heuristics

Machine learning techniques offer an automatic, flexible and adaptive framework for dealing with the many parameters involved in deciding the effectiveness of program optimizations. Classically, a decision rule is learnt from feature vectors describing positive and negative applications of the transformation. However, it is possible to use this framework only if the parameters can be abstracted statically from the loop code and if their number remains limited. Reducing the number of parameters involved in the process is important as the performance of machine learning techniques is poor when the number of dimensions of the learning space is high [10]. Furthermore, learning from complex spaces requires more data.

To summarize the approach, the steps involved in using a machine learning technique for building heuristics for program transformation are:

1. finding a loop abstraction that captures the "performance" features involved in an optimization, in order to build the learning set,
2. choosing an automatic learning process to compute a rule in order to decide whether loop unrolling should be applied,
3. setting up the result of the learning process as heuristics for the compiler.

In the remainder of this section, we present the representation used for abstracting loop properties. The next section shows how to sort the loops into winning and losing classes according to unrolling. Finally, the learning process based on decision trees is overviewed.

3.1 Loop Abstraction

The loop abstraction must capture the main loop characteristics that influence the execution efficiency on a modern processor. They are represented by integer features which are relevant static loop properties according to unrolling. We have selected 5 classes of integer features:

Memory access: number of memory accesses, number of array element reuses from one iteration to another.

Arithmetic operations count: number of additions, multiplications or divisions, excepting those in array index computations.

Size of the loop body: number of statements in the loop.

Control statements in the loop: number of if statements, gotos, etc. in the loop body.

Number of iterations: if it can be computed at compile time.

In order to reduce the learning complexity, only a subset of these features is used for a given compiler and target machine. The chosen subset was determined experimentally by cross-validation (see Section 4). The quality of the predictions achieved by an artificial neural network based on 20 indices was equivalent to the predictive quality of the 6 chosen features.

Figure 2 gives the features that were selected and an example of loop abstraction.

do i = 2, 100
  a(i) = a(i) + a(i-1) * a(i+1)
enddo

Number of statements             1
Number of arithmetic operations  2
Minimum number of iterations    99
Number of array accesses         4
Number of array element reuses   3
Number of if statements          0

Fig. 2. Example of features for a loop.
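A minimal sketch of such an abstraction, assuming a toy textual representation of a counted Fortran-style DO loop (the paper extracts these features from the compiler's internal representation; the regexes and the crude reuse approximation below are our own simplifications):

```python
import re

def loop_features(body_lines, lower, upper):
    # Abstract a counted Fortran-style loop over index i into the six
    # integer features of Fig. 2.
    src = " ".join(body_lines)
    # Array references indexed by i, e.g. a(i), a(i-1), a(i+1).
    accesses = re.findall(r"\b\w+\([^()]*\bi\b[^()]*\)", src)
    names = [a.split("(", 1)[0] for a in accesses]
    # Crude reuse count: repeated references to the same array are
    # taken as element reuses across iterations.
    reuses = len(names) - len(set(names))
    # Count arithmetic operators outside array index expressions by
    # erasing the parenthesised subscripts first.
    no_idx = re.sub(r"\([^()]*\)", "", src)
    arith = len(re.findall(r"[+\-*/]", no_idx))
    return {
        "statements": len(body_lines),
        "arith_ops": arith,
        "min_iterations": upper - lower + 1,
        "array_accesses": len(accesses),
        "element_reuses": reuses,
        "if_statements": len(re.findall(r"\bif\b", src)),
    }
```

On the loop of Fig. 2 this sketch reproduces the feature vector shown in the figure.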

3.2 Unrolling Beneficial Loops

A learning example refers to a loop in a particular context (represented by the loop features). To determine if unrolling is beneficial, each loop is executed twice. Then, the execution times of the original loop and of the unrolled loop are compared. Four cases can be considered:

not significant: the loop execution time is too small and therefore the timing is not significant. The loop is discarded from the learning set.

equal: the execution times of the original and of the unrolled loop are close. A threshold is used to take into account the timer inaccuracy. Thus, the loop performance is considered as invariant by unrolling if the benefit is less than 10%.

improved: the speedup is above 10%. The loop is considered as benefiting from unrolling.

degraded: there is a slowdown. The loop is considered as degraded by unrolling.
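The four cases above can be sketched as a small classification function; the minimal-time cutoff and the treatment of slowdowns are our assumptions, since the paper only states the 10% benefit threshold:

```python
def classify_loop(t_original, t_unrolled, min_time=1e-3, threshold=0.10):
    # Sort one timed loop into the four cases of Section 3.2.
    # min_time is an assumed cutoff below which the timer is not
    # trusted; the paper does not give its exact value.
    if t_original < min_time or t_unrolled < min_time:
        return "not significant"   # discarded from the learning set
    speedup = t_original / t_unrolled
    if speedup > 1.0 + threshold:
        return "improved"          # benefit above the 10% threshold
    if speedup < 1.0:
        return "degraded"          # unrolling slowed the loop down
    return "equal"                 # invariant within timer accuracy
```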

The loop set is then partitioned into equivalence classes (denoted loop classes in the remainder). Two loops are in the same loop class if their respective abstractions are equal.

The next step is to decide if a loop class is to be considered as a positive or a negative example. Note that there can be beneficial and degraded loops in the same class as non-exhaustive descriptions are used to represent the loops. This is a natural situation as the loop execution or compilation context may greatly influence its execution time, for instance due to instruction cache memory effects. The following criterion has been used to decide whether a class will represent a positive or a negative example:


1. In a particular class, a loop whose performance degrades by less than 5% is counted once, a loop that degrades performance by 10% is counted twice, and a loop that degrades performance by more than 20% is counted three times.

2. If the number of unrolling beneficial loops is greater than the number of degraded loops (using the weights above), then the class represents a positive example, else the class represents a negative example.
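This weighted vote can be sketched as follows; the band boundaries between the three quoted degradation points (5%, 10%, 20%) are our interpolation, since the paper only gives the three sample points:

```python
def degradation_weight(slowdown_pct):
    # Weight of one degraded loop in the vote of Section 3.2.
    # Assumed bands interpolated from the quoted points:
    # <5% counts once, up to 20% counts twice, beyond that three times.
    if slowdown_pct > 20.0:
        return 3
    if slowdown_pct >= 5.0:
        return 2
    return 1

def label_class(beneficial_count, degradations_pct):
    # A loop class becomes a positive example when its beneficial
    # loops outvote its degraded ones under the weights above.
    penalty = sum(degradation_weight(d) for d in degradations_pct)
    return "positive" if beneficial_count > penalty else "negative"
```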

3.3 A Learning Method Based on Decision Trees and Boosting

We have chosen to represent unrolling decision rules as decision trees. Decision trees can be learnt efficiently from feature-based vectors. Each node of the decision tree represents a test checking the value(s) of one (or several) feature(s), which is easy to read by an expert. This is not the case for statistical methods like Nearest Neighbor or Artificial Neural Networks, for instance, which have comparable or slightly better performance.

We used the OC1 [10] software. OC1 is a classification tool that induces oblique decision trees. Oblique decision trees produce polygonal partitionings of the feature space. OC1 recursively splits a set of objects in a hyperspace by finding optimal hyperplanes until every object in a subspace belongs to the same class.

[Figure: an oblique decision tree whose internal nodes test the hyperplanes "6x + y > 60 ?", "3x - 2y > 6 ?" and "-x + 2y > 8 ?" and whose leaves are labelled A and B, shown next to the polygonal partitioning it induces on the (x, y) plane.]

Fig. 3. The left side of the figure shows an oblique decision tree that uses two attributes. The right side shows the partitioning that this tree creates in the attribute space.

A decision tree example is shown in Figure 3, together with its 2-D related space. Each node of the tree tests a linear combination of some indices (equivalent to a hyperplane) and each leaf of the tree corresponds to a class. The main advantage of OC1 is that it finds smaller decision trees than classical tree learning methods. The major drawback is that they are less readable than classical ones.

Once induced, a decision tree can be used as a classification process: the classification of a new loop is equivalent to finding a leaf loop class. An object represented by its feature vector is classified by following the branches of the tree indicated by the node tests until a leaf is reached.
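Classification by an oblique tree can be sketched as follows, using the node tests quoted in Fig. 3; the node layout and the yes/no branch assignment are illustrative, not OC1's actual data structures:

```python
def classify(tree, x):
    # Walk an oblique decision tree: each internal node holds a
    # hyperplane (weights w, threshold t) and the test is w.x > t;
    # leaves hold the class label.
    node = tree
    while isinstance(node, dict):
        value = sum(w * xi for w, xi in zip(node["w"], x))
        node = node["yes"] if value > node["t"] else node["no"]
    return node

# A small tree built from the node tests of Fig. 3 (branch/leaf
# placement is our own illustration).
fig3_tree = {
    "w": [6.0, 1.0], "t": 60.0,              # 6x + y > 60 ?
    "yes": {"w": [3.0, -2.0], "t": 6.0,      # 3x - 2y > 6 ?
            "yes": "B", "no": "A"},
    "no":  {"w": [-1.0, 2.0], "t": 8.0,      # -x + 2y > 8 ?
            "yes": "A", "no": "B"},
}
```

Each hyperplane test cuts the remaining region of the (x, y) plane, which is what produces the polygonal partitioning shown in the figure.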

To improve the accuracy obtained with OC1 we have used boosting [13]. Boosting is a general method for improving the accuracy of any given learning algorithm.


Boosting consists in learning a set of classifiers for more and more difficult problems: the weights of the examples that are misclassified by the classifier learnt at step n are augmented (by a factor proportional to the global error) and at step n+1 a new classifier is learnt on these weighted examples. Finally, the global classification is obtained by a weighted vote of the individual classifiers according to their proper accuracy. In our case 9 trees were computed.
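The boosting loop described above can be sketched in an AdaBoost style (labels in {0, 1}; `learn` stands for any weak learner, such as an OC1 tree inducer, mapping weighted examples to a classifier). The exact reweighting and voting formulas used by the authors are not given, so the standard AdaBoost ones are shown:

```python
import math

def boost(examples, labels, learn, rounds=9):
    # Boosting as described in Section 3.3: misclassified examples
    # get their weight increased between rounds, and the final
    # decision is a weighted vote of the learnt classifiers.
    n = len(examples)
    w = [1.0 / n] * n
    committee = []
    for _ in range(rounds):
        h = learn(examples, labels, w)
        err = sum(wi for wi, x, y in zip(w, examples, labels) if h(x) != y)
        if err == 0 or err >= 0.5:
            committee.append((1.0, h))
            break
        alpha = 0.5 * math.log((1 - err) / err)  # classifier's vote weight
        committee.append((alpha, h))
        # Reweight: misclassified examples up, correct ones down.
        w = [wi * math.exp(alpha if h(x) != y else -alpha)
             for wi, x, y in zip(w, examples, labels)]
        s = sum(w)
        w = [wi / s for wi in w]
    def vote(x):
        score = sum(a * (1 if h(x) == 1 else -1) for a, h in committee)
        return 1 if score >= 0 else 0
    return vote
```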

4 Experiments

The learning set used in the experiments is made of loops extracted from programs in Fortran 77. Most of them were chosen in available benchmarks [4,1]. We have studied two types of programs: real applications (the majority comes from SPEC [4]) and computational kernels. Table 1 presents some characteristics of a loop set (cf. Section 3.2).

Table 1. IA-64 learning set.

Number of loops              1036
Discarded loops               177
Unrolling beneficial loops    233
Unrolling invariant loops     505
Unrolling degraded loops      121
Loop classes                  572
Positive examples             139
Negative examples             433

The accuracy of the learning method was assessed by a 10-fold cross-validation. We have experimented with pruning: we obtained smaller trees but the resulting quality was degraded. The results without pruning are presented in Table 2. Two factors can explain the fact that the overall accuracy cannot be better than 85%:

1. since unrolling beneficial and degraded loops can appear in the same class (cf. Section 3.2), a significant proportion of examples may be noisy,

2. the classification of positive examples is far worse than the classification of negative ones. Maybe the learning set does not contain enough beneficial loops.

Table 2. Cross validation accuracy

                                              UltraSPARC           IA-64
                                            normal  boosting   normal  boosting
Accuracy of overall example classification   79.4%    85.2%     82.6%    85.2%
Accuracy of positive example classification  62.4%    61.7%     73.9%    69.6%
Accuracy of negative example classification  85.1%    92.0%     86.3%    92.3%

To go beyond cross-validation, another set of experiments has been performed on two target machines, an UltraSPARC and an IA-64. They aim at showing that the technique does catch the most significant loops of the programs. The g77 [3] compiler was used. With this compiler, loop unrolling can be globally turned on and off. To assess our method we have implemented loop unrolling at the source code level using TSF [2]. This is not the most efficient scheme because in some cases this inhibits some of the compiler optimizations (contrary to unrolling performed by the compiler itself). We have performed experiments to check whether the MLB heuristics are at least as good as compiler heuristics and whether the specificities of a target architecture can be taken into account. A set of benchmark programs were selected in the learning set and for each one we have:

1. run the code compiled by g77 with the -O3 option,

2. run the code compiled with the -O3 -funroll options: the compiler uses its own unrolling strategy,

3. unrolled the loops according to the result of the MLB heuristics and run the code compiled with the -O3 option. The heuristics was learned for the target machine from a learning set where the test program was removed,

4. unrolled the loops according to the result of the MLB heuristics learnt for the other target machine and run the code compiled with the -O3 option.

Fig. 4. IA-64: -O3 is the reference execution time.

The performance results are given in Figure 4 and Figure 5, respectively for the IA-64 and UltraSPARC targets.

The average execution time of the optimized programs for the IA-64 is 93.8% of the reference execution time (no unrolling) using the MLB heuristics and 96.8% using the g77 unrolling strategy. On the UltraSPARC we obtain respectively 96% and 98.7%, showing that our unrolling strategy performs better. Indeed, gaining a few percent on average execution time with one transformation is significant because each transformation is not often beneficial. For example, only 22% of the loops benefit from unrolling on the IA-64 and 17% on the UltraSPARC.

Fig. 5. UltraSPARC: -O3 is the reference execution time.

In the last experiment we exchanged the decision trees learnt for the two target machines. On the UltraSPARC, the relative execution time is degraded from 96% to 97.9% and on the IA-64 it is degraded from 93.8% to 96.8%. This shows that the heuristics are effectively tuned to a target architecture.

5 Conclusion

Compilers implement a lot of optimization algorithms for improving performance. The choice of using a particular sequence of optimizations and their parameters is done through a set of heuristics hard-coded in the compiler.

At each major compiler revision, but also at each new implementation of the target Instruction Set Architecture, a new set of heuristics must be reengineered.

In this paper, we have presented a new method for addressing such reengineering in the case of loop unrolling. Our method is based on a learning process which adapts to new target architectures or new compiler versions. Using an abstract loop representation we showed that decision trees that provide target-specific heuristics for loop unrolling can be learnt.

While our study is limited to the simple case of loop unrolling, it opens a new direction for the design of compiler heuristics. Even for loop unrolling, there are still many issues to consider to go beyond this first result. Are there better abstractions that can capture loop characteristics? Can hardware counters (for instance cache miss counters) provide better insight on loop unrolling? How large should the learning set be? Can other machine learning techniques be more efficient than decision trees?

More fundamentally, our study raises the question whether it could be possible or not to quasi-automatically reengineer the implementation of a set of optimization heuristics for new processor target implementations.

Acknowledgments. We would like to gratefully thank I. C. Lerman and L. Miclet for their insightful advice on machine learning techniques.

References

1. David Bailey. NAS kernel benchmark program, June 1988.

2. F. Bodin, Y. Mével, and R. Quiniou. A User Level Program Transformation Tool. In Proceedings of the International Conference on Supercomputing, pages 180–187, July 1998, Melbourne, Australia.

3. GNU Fortran Compiler.

4. Standard Performance Evaluation Corporation. http://www.spec.org/.

5. Jack W. Davidson and Sanjay Jinturkar. Aggressive Loop Unrolling in a Retargetable, Optimizing Compiler. In Compiler Construction, volume 1060 of Lecture Notes in Computer Science, pages 59–73. Springer, April 1996.

6. J. J. Dongarra and A. R. Hinds. Unrolling loops in FORTRAN. Software Practice and Experience, 9(3):219–226, March 1979.

7. IA-64. Intel Itanium architecture documentation.

8. A. Koseki, H. Komastu, and Y. Fukazawa. A Method for Estimating Optimal Unrolling Times for Nested Loops. In Proceedings of the International Symposium on Parallel Architectures, Algorithms and Networks, 1997.

9. A. Monsifrot and F. Bodin. Computer Aided Hand Tuning (CAHT): "Applying Case-Based Reasoning to Performance Tuning". In Proceedings of the 15th ACM International Conference on Supercomputing (ICS-01), pages 196–203. ACM Press, June 17–21, 2001, Sorrento, Italy.

10. Sreerama K. Murthy, Simon Kasif, and Steven Salzberg. A System for Induction of Oblique Decision Trees. Journal of Artificial Intelligence Research, 2:1–32, 1994.

11. Vivek Sarkar. Optimized Unrolling of Nested Loops. In Proceedings of the 14th ACM International Conference on Supercomputing (ICS-00), pages 153–166. ACM Press, May 2000.

12. UltraSPARC. Sun UltraSPARC-II processor documentation.

13. C. Yu and D. B. Skillicorn. Parallelizing Boosting and Bagging. Technical report, Queen's University, Kingston, Ontario, Canada K7L 3N6, February 2001.


在康河的柔波里,我甘心做一条水草! 那榆荫下的一潭,不是清泉, 是天上虹揉碎在浮藻间,沉淀着彩虹似的梦。 寻梦?撑一支长篙,向青草更青处漫溯, 满载一船星辉,在星辉斑斓里放歌。 但我不能放歌,悄悄是别离的笙箫; 夏虫也为我沉默,沉默是今晚的康桥。 悄悄的我走了,正如我悄悄的来; 我挥一挥衣袖,不带走一片云彩。 记念刘和珍君(二、四节)鲁迅 二 真的猛士,敢于直面惨淡的人生,敢于正视淋漓的鲜血。这是怎样的哀痛者和幸福者?然而造化又常常为庸人设计,以时间的流驶,来洗涤旧迹,仅使留下淡红的血色和微漠的悲哀。在这淡红的血色和微漠的悲哀中,又给人暂得偷生,维持着这似人非人的世界。我不知道这样的世界何时是一个尽头! 我们还在这样的世上活着;我也早觉得有写一点东西的必要了。离三月十八日也已有两星期,忘却的救主快要降临了罢,我正有写一点东西的必要了。 四 我在十八日早晨,才知道上午有群众向执政府请愿的事;下午便得到噩耗,说卫队居然开枪,死伤至数百人,而刘和珍君即在遇害者之列。但我对于这些传说,竟至于颇为怀疑。我向来是不惮以最坏的恶意,来推测中国人的,然而我还不料,也不信竟会下劣凶残到这地步。况且始终微笑着的和蔼的刘和珍君,更何至于无端在府门前喋血呢? 然而即日证明是事实了,作证的便是她自己的尸骸。还有一具,是杨德群君的。而且又证明着这不但是杀害,简直是虐杀,因为身体上还有棍棒的伤痕。 但段政府就有令,说她们是“暴徒”! 但接着就有流言,说她们是受人利用的。 惨象,已使我目不忍视了;流言,尤使我耳不忍闻。我还有什么话可说呢?我懂得衰亡民族之所以默无声息的缘由了。沉默啊,沉默啊!不在沉默中爆发,就在沉默中灭亡。

Unit5THEMONSTER课文翻译大学英语六

Unit 5 THE MONSTER He was an undersized little man, with a head too big for his body -- a sickly little man. His nerves were had. He had skin trouble. It was agony for him to wear anything next to his skin coarser than silk. And he had seclusions of grandeur. He was a monster of conceit.Never for one minute did he look at the world or at people, except in relation to himself. He was not only the most important person in the world,to himself;in his own eyes he was the only person who existed. He believed himself to be one of the greatest dramatists in the world, one of the greatest thinkers, and one of the greatest composers. To hear him talk, he was Shakespeare, and Beethoven, and Plato, rolled into one. And you would have had no difficulty in hearing him talk. He was one of the most exhausting conversationalists that ever lived. An evening with him was an evening spent in listening to a monologue. Sometimes he was brilliant; sometimes he was maddeningly tiresome. But whether he was being brilliant or dull, he had one sole topic of conversation: himself. What he thought and what he did. He had a mania for being in the right.The slightest hint of disagreement,from anyone, on the most trivial point, was enough to set him off on a harangue that might last for house, in which he proved himself right in so many ways, and with such exhausting volubility, that in the end his hearer, stunned and deafened, would agree with him, for the sake of peace. It never occurred to him that he and his doing were not of the most intense and fascinating interest to anyone with whom he came in contact.He had theories about almost any subject under the sun, including vegetarianism, the drama, politics, and music; and in support of these theories he wrote pamphlets,le tters, books? thousands upon thousands of words, hundreds and hundreds of pages. 
He not only wrote these things, and published them -- usually at somebody else's expense-- but he would sit and read them aloud, for hours, to his friends and his family. He wrote operas,and no sooner did he have the synopsis of a story, but he would invite -- or rather summon -- a crowed of his friends to his house, and read it aloud to them. Not for criticism. For applause. When the complete poem was written, the friends had to come again,and hear that read aloud.Then he would publish the poem, sometimes years before the music that went with it was written. He played the piano like a composer, in the worst sense of what that implies, and he would sit down at the piano before parties that included some of the finest pianists of his time, and play for them, by the hour, his own music, needless to say. He had a composer's voice. And he would invite eminent vocalists to his house and sing them his operas, taking all the parts.

Unit7TheMonster课文翻译综合优质教程四.docx

最新资料欢迎阅读 Unit 7 The Monster课文翻译综合教程四 Unit 7 The Monster Deems Taylor 1 He was an undersized little man, with a head too big for his body ― a sickly little man. His nerves were bad. He had skin trouble. It was agony for him to wear anything next to his skin coarser than silk. And he had delusions of grandeur. 2 He was a monster of conceit. Never for one minute did he look at the world or at people, except in relation to himself. He believed himself to be one of the greatest dramatists in the world, one of the greatest thinkers, and one of the greatest composers. To hear him talk, he was Shakespeare, and Beethoven, and Plato, rolled into one. He was one of the most exhausting conversationalists that ever lived. Sometimes he was brilliant;sometimes he was maddeningly tiresome. But whether he was being brilliant or dull, he had one sole topic of conversation: himself. What he thought and what he did. 3 He had a mania for being in the right. The slightest hint of disagreement, from anyone, on the most trivial point, was enough to set him off on a harangue that might last for hours, in which he proved himself right in so many ways, and with such exhausting volubility, that in the end his hearer, stunned and deafened, would agree with him,for the sake of peace. 4 It never occurred to him that he and his doing were not of the most intense and fascinating interest to anyone with whomhe came in contact. He had theories about almost any subject under the sun,including vegetarianism, the drama, politics, and music; and in support of these theories he wrote pamphlets, letters,books ... thousands upon thousands

人教版高中语文必修(1-5)必背课文

人教版高中语文必修(1-5)必背课文.txt6宽容润滑了彼此的关系,消除了彼此的隔阂,扫清了彼此的顾忌,增进了彼此的了解。人教版高中语文必修(1-5)必背课文 诗★诗★诗 ★卫风?氓(背诵全文)《诗经》 氓之蚩蚩,抱布贸丝。匪来贸丝,来即我谋。送子涉淇,至于顿丘。匪我愆期,子无良媒。将子无怒,秋以为期。 乘彼垝垣,以望复关。不见复关,泣涕涟涟。既见复关,载笑载言。尔卜尔筮,体无咎言。以尔车来,以我贿迁。 桑之未落,其叶沃若。于嗟鸠兮,无食桑葚!于嗟女兮,无与士耽!士之耽兮,犹可说也。女之耽兮,不可说也。 桑之落矣,其黄而陨。自我徂尔,三岁食贫。淇水汤汤,渐车帷裳。女也不爽,士贰其行。士也罔极,二三其德。 三岁为妇,靡室劳矣;夙兴夜寐,靡有朝矣。言既遂矣,至于暴矣。兄弟不知,咥其笑矣。静言思之,躬自悼矣。 及尔偕老,老使我怨。淇则有岸,隰则有泮。总角之宴,言笑晏晏。信誓旦旦,不思其反。反是不思,亦已焉哉! ★涉江采芙蓉(背诵全文)古诗十九首 涉江采芙蓉,兰泽多芳草。采之欲遗谁?所思在远道。 还顾望旧乡,长路漫浩浩。同心而离居,忧伤以终老。 ★秋兴八首(其一)(背诵全文)杜甫 玉露凋伤枫树林,巫山巫峡气萧森。江间波浪兼天涌,塞上风云接地阴。 丛菊两开他日泪,孤舟一系故园心。寒衣处处催刀尺,白帝城高急暮砧。 ★咏怀古迹(其三)(背诵全文)杜甫 群山万壑赴荆棘,生长明妃尚有村。一去紫台连朔漠,独留青冢向黄昏。 画图省识春风面,环佩空归月夜魂。千载琵琶作胡语,分明怨恨曲中论。 ★登高(背诵全文)杜甫 风急天高猿啸哀,渚清沙白鸟飞回。无边落木萧萧下,不尽长江滚滚来。 万里悲秋常作客,百年多病独登台。艰难苦恨繁霜鬓,潦倒新停浊酒杯。 ★锦瑟(背诵全文)李商隐 锦瑟无端五十弦,一弦一柱思华年。庄生晓梦迷蝴蝶,望帝春心托杜鹃。 沧海月明珠有泪,蓝田日暖玉生烟。此情可待成追忆,只是当时已惘然。 ★马嵬李商隐 海外徒闻更九州,他生未卜此生休。空闻虎旅传宵柝,无复鸡人报晓筹。 此日六军同驻马,当时七夕笑牵牛。如何四纪为天子,不及卢家有莫愁! ★蜀道难(背诵全文)李白

全新版大学英语综合教程3 课文翻译unit 1 Mr. Doherty Builds His Dream Life

unit 1 Mr. Doherty Builds His Dream Life In America many people have a romantic idea of life in the countryside. Many living in towns dream of starting up their own farm, of living off the land. Few get round to putting their dreams into practice. This is perhaps just as well, as the life of a farmer is far from easy, as Jim Doherty discovered when he set out to combine being a writer with running a farm. Nevertheless, as he explains, he has no regrets and remains enthusiastic about his decision to change his way of life. 在美国,不少人对乡村生活怀有浪漫的情感。许多居住在城镇的人梦想着自己办个农场,梦想着靠土地为生。很少有人真去把梦想变为现实。或许这也没有什么不好,因为,正如吉姆·多尔蒂当初开始其写作和农场经营双重生涯时所体验到的那样,农耕生活远非轻松自在。但他写道,自己并不后悔,对自己作出的改变生活方式的决定仍热情不减。 Mr. Doherty Builds His Dream Life Jim Doherty There are two things I have always wanted to do -- write and live on a farm. Today I'm doing both. I am not in E. B. White's class as a writer or in my neighbors' league as a farmer, but I'm getting by. And after years of frustration with city and suburban living, my wife Sandy and I have finally found contentment here in the country. 多尔蒂先生创建自己的理想生活吉姆·多尔蒂有两件事是我一直想做的――写作与务农。如今我同时做着这两件事。作为作家,我和E·B·怀特不属同一等级,作为农场主,我和乡邻也不是同一类人,不过我应付得还行。在城市以及郊区历经多年的怅惘失望之后,我和妻子桑迪终于在这里的乡村寻觅到心灵的满足。 It's a self-reliant sort of life. We grow nearly all of our fruits and vegetables. Our hens keep us in eggs, with several dozen left over to sell each week. Our bees provide us with honey, and we cut enough wood to just about make it through the heating season. 这是一种自力更生的生活。我们食用的果蔬几乎都是自己种的。自家饲养的鸡提供鸡蛋,每星期还能剩余几十个出售。自家养殖的蜜蜂提供蜂蜜,我们还自己动手砍柴,足可供过冬取暖之用。 It's a satisfying life too. In the summer we canoe on the river, go picnicking in the woods and take long bicycle rides. In the winter we ski and skate. We get excited about sunsets. We love the smell of the earth warming and the sound of cattle lowing. We watch for hawks in the sky and deer in the cornfields. 这也是一种令人满足的生活。夏日里我们在河上荡舟,

相关主题
文本预览
相关文档 最新文档