ToF - Measuring Principles (zh) v1.1
- Format: PDF
- Size: 2.41 MB
- Pages: 10
S. M. Sze, Physics of Semiconductor Devices (English edition), Chapter 1 Problems

1. (a) Find the maximum fraction of the conventional unit-cell volume of the diamond lattice that can be filled by identical hard spheres.
(b) Find the number of atoms per square centimeter in the (111) plane of silicon at 300 K.
2. Calculate the tetrahedral bond angle, i.e., the angle between any pair of the four bonds. (Hint: represent the four bonds by four vectors of equal length. What must the sum of the four vectors equal? Take components of the vectors along the direction of one of them.)
3. For a face-centered cubic lattice, the conventional unit cell has volume a³. Find the volume of the fcc primitive cell whose three basis vectors are (0,0,0 → a/2, 0, a/2), (0,0,0 → a/2, a/2, 0), and (0,0,0 → 0, a/2, a/2).
4. (a) Derive an expression for the bond length d of the diamond lattice in terms of the lattice constant a.
(b) In a silicon crystal, a plane has intercepts of 10.86 Å, 16.29 Å, and 21.72 Å along the three Cartesian axes. Find the Miller indices of the plane.
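Problem 4(b) reduces to mechanical arithmetic, so it can be checked numerically. This is a sketch assuming the silicon lattice constant a = 5.43 Å (a standard value, not stated in the problem): express the intercepts in units of a, take reciprocals, and clear fractions to the smallest integer triple.

```python
from fractions import Fraction
from math import gcd
from functools import reduce

a = 5.43                                    # Si lattice constant in angstroms (assumed)
intercepts = [10.86, 16.29, 21.72]          # i.e. 2a, 3a, 4a

# Miller indices: reciprocals of the intercepts (in units of a),
# then scale by the LCM of the denominators to get integers.
recip = [Fraction(1, round(x / a)) for x in intercepts]   # 1/2, 1/3, 1/4
lcm = reduce(lambda p, q: p * q // gcd(p, q), [f.denominator for f in recip])
hkl = tuple(int(f * lcm) for f in recip)
print(hkl)  # (6, 4, 3)
```

The plane is therefore the (643) plane.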
5. Show that (a) each vector of the reciprocal lattice is normal to a set of planes in the direct lattice, and (b) the volume of a unit cell of the reciprocal lattice is inversely proportional to the volume of a unit cell of the direct lattice.
6. Show that the reciprocal lattice of a body-centered cubic (bcc) lattice with lattice constant a is a face-centered cubic (fcc) lattice with a cubic cell of side 4π/a. [Hint: use the symmetric set of bcc primitive vectors
\(\mathbf{a} = \tfrac{a}{2}(\hat{\mathbf{y}} + \hat{\mathbf{z}} - \hat{\mathbf{x}}),\quad \mathbf{b} = \tfrac{a}{2}(\hat{\mathbf{z}} + \hat{\mathbf{x}} - \hat{\mathbf{y}}),\quad \mathbf{c} = \tfrac{a}{2}(\hat{\mathbf{x}} + \hat{\mathbf{y}} - \hat{\mathbf{z}}),\)
where a is the lattice constant of the conventional cell and \(\hat{\mathbf{x}}, \hat{\mathbf{y}}, \hat{\mathbf{z}}\) are the Cartesian unit vectors, and the fcc primitive vectors
\(\mathbf{a} = \tfrac{a}{2}(\hat{\mathbf{y}} + \hat{\mathbf{z}}),\quad \mathbf{b} = \tfrac{a}{2}(\hat{\mathbf{z}} + \hat{\mathbf{x}}),\quad \mathbf{c} = \tfrac{a}{2}(\hat{\mathbf{x}} + \hat{\mathbf{y}}).\)]
7. Near the conduction-band minimum the energy can be expressed as
\[E = \frac{\hbar^2}{2}\left(\frac{k_x^2}{m_x^*} + \frac{k_y^2}{m_y^*} + \frac{k_z^2}{m_z^*}\right).\]
In Si there are six cigar-shaped minima along [100]. If the ratio of the axes of the constant-energy ellipsoids is 5:1, find the ratio of the longitudinal effective mass m_l* to the transverse effective mass m_t*.
8. In the conduction band of a semiconductor there is a lower valley at the center of the Brillouin zone and six upper valleys at the zone boundary along [100]. If the effective mass for the lower valley is 0.1 m₀ and that for the upper valleys is 1.0 m₀, find the ratio of the density of states of the upper valleys to that of the lower valley.
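Problems 7 and 8 both reduce to one-line arithmetic, sketched below as a numerical check. For problem 7, at fixed energy each semi-axis of the ellipsoid satisfies E = ħ²k_i²/(2m_i*), so k_i ∝ √(m_i*) and a 5:1 axis ratio implies a 25:1 mass ratio. For problem 8, the density of states of a set of valleys is proportional to (number of valleys) × (m*)^(3/2).

```python
# Problem 7: semi-axis k_i ∝ sqrt(m_i*), so the mass ratio is the
# square of the 5:1 axis ratio.
axis_ratio = 5.0
mass_ratio = axis_ratio ** 2           # m_l*/m_t* = 25

# Problem 8: N(E) ∝ (number of valleys) * m*^(3/2);
# 6 upper valleys at 1.0 m0 vs 1 lower valley at 0.1 m0.
dos_ratio = 6 * (1.0 / 0.1) ** 1.5     # ≈ 189.7
print(mass_ratio, round(dos_ratio, 1))  # 25.0 189.7
```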
Methodology Research

Test-negative Design: Basic Principles, Methods of Implementation, Types of Derivatives, Advantages, Limitations, and Applications

WU Ting, LIU Jue*
Peking University School of Public Health, Beijing 100191, China
*Corresponding author: LIU Jue, Professor, Doctoral supervisor

[Abstract] As a simple, rapid and effective research method, the test-negative design (TND) has been widely used in the post-marketing evaluation of vaccine effectiveness. It can help primary care and other health institutions, in China and abroad, assess the post-marketing effectiveness of vaccines and the effectiveness of other interventions, and it has broad prospects for application. With the emergence of new derivative types such as the real-time TND and the cluster-randomized TND, the design has gradually been applied to the exploration of disease risk factors and the evaluation of intervention effectiveness. However, related research reports from China remain scarce. We introduce the definition, basic principles, methods and essentials of implementation of the TND; its newest derivative types, such as the real-time TND and the cluster-randomized TND; its advantages and limitations; and its applications in evaluating the post-marketing effectiveness of vaccines and of other interventions, providing a theoretical and practical basis for researchers in China to carry out related studies.

[Key words] Test-negative design; Vaccine; Effectiveness evaluation; Research method

CLC number: R 373.1   Document code: A   DOI: 10.12114/j.issn.1007-9572.2022.0277
Citation: WU T, LIU J. Test-negative design: basic principles, methods of implementation, types of derivatives, advantages, limitations, and applications. Chinese General Practice, 2022, 25(36): 4601-4608.

In recent years the TND has been widely applied to the post-marketing evaluation of vaccine protective effectiveness. The pathogens targeted by these vaccines include respiratory pathogens (influenza virus [4], SARS-CoV-2 [5], Streptococcus pneumoniae [6], etc.), gastrointestinal pathogens (rotavirus [7], Vibrio cholerae [8], etc.), and others (poliovirus [9], human papillomavirus [10], enterovirus 71 [11], etc.). The TND evaluates vaccine effectiveness by comparing vaccination status between the case (test-positive) and control (test-negative) groups.
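The comparison of vaccination status between test-positive cases and test-negative controls described above is conventionally summarized as VE = (1 − OR) × 100%, where OR is the odds ratio of vaccination among cases versus controls. A minimal sketch of that calculation, using hypothetical counts (not data from the article):

```python
def vaccine_effectiveness(vp, up, vn, un):
    """TND estimate: VE = (1 - OR) * 100%, where OR is the odds ratio
    of vaccination in test-positives vs test-negatives.
    vp/up: vaccinated/unvaccinated test-positive cases
    vn/un: vaccinated/unvaccinated test-negative controls"""
    odds_ratio = (vp * un) / (up * vn)
    return (1 - odds_ratio) * 100

# Illustrative 2x2 table (hypothetical counts):
ve = vaccine_effectiveness(vp=40, up=160, vn=150, un=150)
print(round(ve, 1))  # 75.0
```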
Written Exam for Research Positions (a Fortune Global 500 group) (answers follow)

I. Single-choice questions (10 questions, 2 points each, 20 points)
1. Which of the following algorithms is a typical application of unsupervised learning? A. Decision tree  B. Linear regression  C. K-means clustering  D. Logistic regression
2. Which of the following is not a key element of research project management? A. Time management  B. Budget preparation and control  C. Teamwork and personnel management  D. Marketing strategy
3. During model training, overfitting typically occurs: A. Early in training  B. In the middle of training  C. Late in training  D. After training ends
4. Regarding the backpropagation algorithm in deep learning, which statement is correct? A. Backpropagation applies only to shallow networks  B. Backpropagation is the basic algorithm for optimizing model parameters  C. Backpropagation is the basic algorithm for propagating signals forward  D. Backpropagation cannot be combined with gradient descent
5. What is the core of research project management? A. Efficiency of technical development  B. Teamwork ability  C. Achievement of project goals  D. Innovative thinking
6. In experimental design, what is the key to ensuring that research results are reproducible? A. Random sampling  B. Complex experimental equipment  C. Strict experimental operating procedures  D. Comprehensive data collection
7. In team projects, which communication method best ensures that information is accurately conveyed and understood? A. E-mail  B. Oral report  C. Face-to-face meeting  D. Instant messaging
8. In scientific research, which statistical method can be used to test whether two groups of data differ significantly? A. Chi-square test  B. t-test  C. Analysis of variance  D. Regression analysis
9. In materials science, which material is widely used as an insulating layer and for corrosion protection in electronic components? A. Aluminum  B. Glass  C. Polytetrafluoroethylene (PTFE)  D. Steel
10. Semiconductor materials play a decisive role in electronics. Which of the following semiconductor materials has the largest energy gap between its valence band and conduction band? A. Gallium arsenide  B. Silicon  C. Germanium  D. Carbon

II. Multiple-choice questions (10 questions, 4 points each, 40 points)
1. Which principles should research staff follow when designing a project? A. Innovativeness  B. Scientific rigor  C. Feasibility  D. Economy  E. Standardization
2. When writing academic papers, researchers should pay attention to: A. Clarifying the research purpose and significance  B. Thoroughly researching the background and current state of the field  C. Presenting the experimental design and methods  D. Discussing and analyzing the results  E. Citing references clearly and correctly
3. Which statistical methods are commonly used in experimental data processing? A. Analysis of variance  B. Deviation calculation  C. Regression analysis  D. Correlation analysis  E. Variance calculation
4. Which technologies are widely applied in modern scientific research? A. Gene editing  B. 3D printing  C. Cloud computing  D. Internet of Things  E. Deep learning
5. In machine learning, which of the following algorithms are unsupervised learning? A. k-means clustering  B. Decision tree  C. Support vector machine  D. Random forest  E. Linear regression  F. Principal component analysis
6. Which of the following are common convolutional neural network (CNN) architectures in deep learning? A. LeNet  B. AlexNet  C. VGG  D. Inception  E. LSTM  F. Transformer
7. Which of the following statements about research project management are correct? (2 points) A. Research project management mainly emphasizes control of the project schedule.
Contents
1. Definition and principle of TOFD testing
2. Basic knowledge of TOFD testing
3. Dead zones in TOFD testing
4. Characteristics of TOFD testing
5. TOFD images of several typical defects

1 Definition and basic principle of TOFD testing
1.1 Definition of TOFD testing
Time-of-Flight Diffraction (TOFD) ultrasonic testing is a nondestructive testing technique that detects, locates and sizes defects by means of the diffracted signals emitted when an ultrasonic wave interacts with the tips or ends of defects in the inspected object. In short, TOFD is a testing technique based on diffracted signals.

1.2 Principle of TOFD testing
1.2.1 Diffraction
Diffraction is the phenomenon in which a propagating wave, on meeting an obstacle, bends around it and deviates from straight-line propagation. Diffraction at defect tips can be explained by the Huygens-Fresnel principle. Huygens proposed that every point on a wavefront in the medium can be regarded as the source of a secondary wavelet, and that the envelope of these wavelets at any later instant forms the new wavefront. Fresnel extended this principle: every surface element of the wavefront acts as a secondary source, and the vibration at any point in space is the coherent superposition of all the wavelets arriving at that point.

Figure 1.1: Interpretation of the diffracted signal at a defect tip. As the figure shows, when an ultrasonic beam strikes a crack: (1) the central part of the crack produces a directional reflected wave obeying the law of reflection; this reflected wave is nearly a plane wave, its wavefront formed by the superposition of the waves from many secondary sources; (2) at the crack tip no such superposition occurs, and the ultrasound radiated by the tip acting as an independent secondary source is the diffracted wave.

The diffracted wave has two important characteristics: (1) it has no pronounced directivity; (2) its amplitude is very weak. Both follow from the fact that the crack tip radiates independently, without superposition of waves.

Figure 1.2: Characteristics of crack-tip diffracted waves. Both the upper and the lower tip of a crack produce diffracted waves. The diffracted signal is much weaker than the reflected signal and propagates in all spatial directions, i.e., it has no pronounced directivity.

Figure 1.3: Amplitude comparison of a corner reflection and a crack-tip diffraction signal. By the Huygens-Fresnel principle, the shape of the defect tip affects the diffracted signal: (1) the sharper the tip, the more pronounced the diffraction; (2) the more rounded the tip, the less pronounced the diffraction; (3) when the radius of curvature of the tip exceeds the wavelength (d > λ), reflection dominates.
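The section states that TOFD locates and sizes defects from the arrival times of tip-diffracted signals. The depth relation itself is not given in this excerpt; the sketch below assumes the standard TOFD geometry (transmitter and receiver on the surface, separated by 2S, tip midway between them), where a tip at depth d returns a signal at t = 2√(S² + d²)/c, and assumes a longitudinal wave velocity typical of steel.

```python
import math

C_STEEL = 5920.0   # longitudinal wave velocity in steel [m/s] (assumed)

def tofd_depth(t, s, c=C_STEEL):
    """Depth of a diffracting tip below the probe midpoint.
    t: arrival time of the tip-diffracted signal [s]
    s: half the probe-centre separation S [m]
    Assumes the standard geometry: t = 2*sqrt(S^2 + d^2)/c."""
    half_path = c * t / 2.0            # one-way path sqrt(S^2 + d^2)
    return math.sqrt(half_path**2 - s**2)

S = 0.040                              # 40 mm half-separation (illustrative)
t_lateral = 2 * S / C_STEEL            # lateral wave arrives first, at 2S/c
d = tofd_depth(20e-6, S)               # tip signal arriving at 20 µs -> ~43.6 mm
```

The lateral wave, which travels straight between the probes, arrives at 2S/c; any tip signal arrives later, which is why arrival time alone suffices for depth sizing.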
Time-of-flight camera
From Wikipedia, the free encyclopedia

A time-of-flight camera (TOF camera) is a camera system that creates distance data with the help of the time-of-flight (TOF) principle, which is distinct from time-of-flight mass spectrometry. The principle is similar to that of LIDAR scanners, with the difference that the entire scene is captured with each laser or light pulse rather than being scanned with a moving laser. Time-of-flight cameras are relatively recent devices in regular products (2004)[1], as semiconductor processes have only recently become fast enough for them. The systems cover ranges from a few meters up to several kilometers, depending on the detector material used. The detector material is (1) a CMOS detector that captures light pulses in the visible range, (2) PIN diodes, or (3) avalanche photodiodes (APDs). Each material has inherent strengths and is chosen based on application requirements. The distance resolution ranges from sub-centimeter to several centimeters, depending on the range.
The lateral resolution of time-of-flight cameras is generally low compared with standard 2D video cameras, currently at 320 × 240 pixels or less.[2][3][4][5][6] Only one 3D camera reports a resolution of 484 × 648 pixels, using a standard CCD sensor.[7] The biggest advantage of the cameras may be that they provide up to 100 images per second.

Types of devices
Several different technologies for time-of-flight cameras have been developed.

Pulsed light source with digital time counters
There are devices with a pulsed laser and a custom imaging integrated circuit with a fast counter behind every pixel. These devices produce depth values for each pixel on every frame. Typical image sizes are 128 × 128 pixels. Ranges up to 22,000 feet with an eye-safe narrow beam have been achieved. Detectors are typically InGaAs (indium gallium arsenide) devices.[8]
RF-modulated light sources with phase detectors
Photonic Mixer Devices (PMD)[9] and the SwissRanger work by modulating the outgoing beam with an RF carrier and then measuring the phase shift of that carrier on the receive side. These are compact, short-range devices. The approach has an inherent modulo ambiguity: measured ranges wrap around at the maximum range, which corresponds to the RF carrier wavelength. With phase-unwrapping algorithms, the maximum uniqueness range can be increased. The SwissRanger has ranges of 5 or 10 meters, with 176 × 144 pixels. The PMD can provide ranges up to 60 m. Illumination is by pulsed LEDs rather than a laser. The demodulation is usually achieved by gating the sensor in synchrony with the light-source modulation, so in essence these are range-gated imagers.[10]

Range gated imagers
These devices have a phase detector built into the gate or shutter in the camera. The gate allows collection of portions S2 and S1 of the received light pulse S. The portions depend on the time of arrival, and range is derived from them according to Medina's equation, z = R(S2 − S1)/(2S) + R/2, for an ideal camera. R is the camera range, determined by the round trip of the light pulse.[7][11] The 3DV Inc. cameras[12] and the Canesta 3D cameras are range-gated systems using Medina's design. Microsoft purchased the two companies in 2009 and 2010 respectively. Similar principles are used in the ToF camera line developed by the Fraunhofer Institute of Microelectronic Circuits and Systems and TriDiCam. These cameras employ photodetectors with a fast electronic shutter. Range gated imagers can also be used in 2D imaging to suppress anything outside a specified distance range, for example to see through fog. A pulsed laser provides illumination, and an optical gate allows light to reach the imager only during the desired time period.[13][14]
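Medina's equation as quoted above can be checked with a short sketch, assuming S denotes the total received pulse so that S = S1 + S2 when the two gates tile the pulse:

```python
def medina_range(s1, s2, r):
    """Ideal range-gated camera range per Medina's equation:
    z = R*(S2 - S1)/(2*S) + R/2, with S = S1 + S2 the total
    collected pulse and R the camera's full range (set by the
    light-pulse round trip)."""
    total = s1 + s2
    return r * (s2 - s1) / (2 * total) + r / 2

# Sanity checks: all light in the early gate -> object at z = 0;
# all light in the late gate -> object at full range z = R;
# an even split -> object at mid-range z = R/2.
print(medina_range(1, 0, 10.0), medina_range(0, 1, 10.0), medina_range(1, 1, 10.0))
```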
Several different approaches are used for timing; see the types of devices above.

Driver electronics: Both the illumination unit and the image sensor have to be controlled by high-speed signals. These signals have to be very accurate to obtain a high resolution. For example, if the signals between the illumination unit and the sensor shift by only 10 picoseconds, the distance changes by 1.5 mm. For comparison: current CPUs reach frequencies of up to 3 GHz, corresponding to clock cycles of about 300 ps; the corresponding 'resolution' is only 45 mm.

Computation/Interface: The distance is calculated directly in the camera. To obtain good performance, some calibration data is also used. The camera then provides a distance image over a USB or Ethernet interface.

Principle
See also: time-of-flight

The simplest version of a time-of-flight camera uses light pulses. The illumination is switched on for a very short time; the resulting light pulse illuminates the scene and is reflected by the objects. The camera lens gathers the reflected light and images it onto the sensor plane. Depending on the distance, the incoming light experiences a delay. As light travels at approximately c = 300,000,000 meters per second, this delay is very short: an object 2.5 m away delays the light by t_D = 2D/c = 2 × 2.5 m / (3 × 10⁸ m/s) ≈ 16.7 ns.

[Figure: diagrams illustrating the principle of a time-of-flight camera with analog timing[15]]

The pulse width of the illumination determines the maximum range the camera can handle. With a pulse width of e.g. t₀ = 50 ns, the range is limited to D_max = c·t₀/2 = 7.5 m. These short times show that the illumination unit is a critical part of the system. Only with some special LEDs or lasers is it possible to generate such short pulses.

The single pixel consists of a photosensitive element (e.g. a photodiode). It converts the incoming light into a current. In analog timing imagers, fast switches connected to the photodiode direct the current to one of two (or several) memory elements (e.g. capacitors) that act as summation elements.
In digital timing imagers, a time counter running at several gigahertz is connected to each photodetector pixel and stops counting when light is sensed.

In the diagram of an analog timer, the pixel uses two switches (G1 and G2) and two memory elements (S1 and S2). The switches are controlled by a pulse of the same length as the light pulse, with the control signal of switch G2 delayed by exactly the pulse width. Depending on the delay, only part of the light pulse is sampled through G1 into S1; the other part is stored in S2. Depending on the distance, the ratio between S1 and S2 changes as depicted in the drawing.[16] Because only small amounts of light hit the sensor within 50 ns, not one but several thousand pulses are sent out (repetition rate t_R) and gathered, thus increasing the signal-to-noise ratio. After the exposure, the pixel is read out and the following stages measure the signals S1 and S2. As the length of the light pulse is defined, the distance can be calculated with Medina's formula: D = (c·t₀/2) · S2/(S1 + S2). In the example, the signals have the values S1 = 0.66 and S2 = 0.33. The distance is therefore D = 7.5 m × 0.33/0.99 = 2.5 m.

In the presence of background light, the memory elements receive an additional part of the signal that would disturb the distance measurement. To eliminate the background part of the signal, the whole measurement can be performed a second time with the illumination switched off. If the objects are farther away than the distance range, the result is also wrong. Here, a second measurement with the control signals delayed by an additional pulse width helps to suppress such objects. Other systems work with a sinusoidally modulated light source instead of the pulse source.

Advantages
Simplicity
In contrast to stereo vision or triangulation systems, the whole system is very compact: the illumination is placed just next to the lens, whereas the other systems need a certain minimum baseline.
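The two-gate distance calculation in the worked example (S1 = 0.66, S2 = 0.33 with a 50 ns pulse) can be sketched as:

```python
C = 3.0e8  # speed of light as used in the text [m/s]

def pulse_distance(s1, s2, t0):
    """Distance from the two gated samples of a light pulse of width t0:
    the fraction of the pulse falling into the delayed gate S2 gives
    D = (c * t0 / 2) * S2 / (S1 + S2)."""
    return (C * t0 / 2.0) * s2 / (s1 + s2)

# The example from the text: S1 = 0.66, S2 = 0.33, 50 ns pulse
d = pulse_distance(0.66, 0.33, 50e-9)
print(round(d, 3))  # 2.5 (meters)
```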
In contrast to laser scanning systems, no mechanical moving parts are needed.

Efficient distance algorithm
It is very easy to extract the distance information from the output signals of the TOF sensor, so this task uses only a small amount of processing power, again in contrast to stereo vision, where complex correlation algorithms have to be implemented. Once the distance data has been extracted, object detection, for example, is also easy to carry out, because the algorithms are not disturbed by patterns on the object.

Speed
Time-of-flight cameras are able to measure the distances within a complete scene in a single shot. As the cameras reach up to 100 frames per second, they are ideally suited to real-time applications.

Disadvantages
Background light
Although most of the background light coming from artificial lighting or the sun is suppressed, the pixel still has to provide a high dynamic range. The background light also generates electrons, which have to be stored. For example, the illumination units in today's TOF cameras can provide an illumination level of about 1 watt. The sun delivers an illumination power of about 50 watts per square meter after the optical bandpass filter. Therefore, if the illuminated scene has a size of 1 square meter, the light from the sun is 50 times stronger than the modulated signal.

Interference
If several time-of-flight cameras are running at the same time, the cameras may disturb each other's measurements.
There are several ways of dealing with this problem:

Time multiplexing: A control system starts the measurements of the individual cameras consecutively, so that only one illumination unit is active at a time.

Different modulation frequencies: If the cameras modulate their light with different modulation frequencies, each camera's light is collected by the other systems only as background illumination and does not disturb their distance measurement.

Multiple reflections
In contrast to laser scanning systems, where only a single point is illuminated at once, time-of-flight cameras illuminate a whole scene. Due to multiple reflections, the light may reach the objects along several paths, and the measured distance may therefore be greater than the true distance.

Applications
Automotive applications
Time-of-flight cameras are used in assistance and safety functions for advanced automotive applications such as active pedestrian safety and precrash detection, and in indoor applications such as out-of-position (OOP) detection.[17][18][19]

Human-machine interfaces / gaming
[Figures: range image of a human face captured with a time-of-flight camera; range image with height measurements]
As time-of-flight cameras provide distance images in real time, it is easy to track movements of humans. This allows new interactions with consumer devices such as televisions. Another topic is the use of such cameras to interact with games on video game consoles.[20]

Measurement / machine vision
Other applications are measurement tasks, e.g. for the fill height in silos. In industrial machine vision, the time-of-flight camera helps to classify objects and helps robots find the items, for instance on a conveyor.
Door controls can distinguish easily between animals and humans reaching the door.

Robotics
Another use of these cameras is in the field of robotics: mobile robots can build up a map of their surroundings very quickly, enabling them to avoid obstacles or follow a leading person. As the distance calculation is simple, only little computational power is used.

Brands
D-IMager - TOF camera by Panasonic [21]
TOFCam Stanley - TOF camera by Stanley Electric [22]
FOTONIC-C70 - TOF cameras and software by Fotonic, powered by a Canesta CMOS chip [23]
Optrima - TOF cameras and modules [24]
PMD[vision] - TOF imager, modules, cameras and software by PMDTechnologies [25]
SwissRanger - an industrial TOF-only camera line originally by the Centre Suisse d'Electronique et Microtechnique (CSEM SA), now developed by the spin-out company Mesa Imaging [26]
3D MLI Sensor - TOF imager, modules, cameras and software by IEE (International Electronics & Engineering), based on modulated light intensity (MLI) [27]
TriDiCam - TOF modules and software; the TOF imager originally developed by the Fraunhofer Institute of Microelectronic Circuits and Systems, now developed by the spin-out company TriDiCam [28]

[Figures: D-IMager by Panasonic; PMD[vision] CamCube by PMDTechnologies; FOTONIC-B70 by Fotonic; SwissRanger 4000 by MESA Imaging; USB-powered TOF camera from the European ARTTS project; PMD[vision] CamBoard by PMDTechnologies (USB powered); 3D MLI Sensor by IEE S.A.]

See also
Laser Dynamic Range Imager
Structured-light 3D scanner

References
1. "Advanced Scientific Concepts, Inc." ASC 3D portable camera, shipping since 2004. Retrieved 21 March 2011.
2. Schuon, S.; Theobalt, C.; Davis, J.; Thrun, S. (2008). "High-quality scanning using time-of-flight depth superresolution". IEEE CVPR Workshops 2008 (Anchorage, Alaska), pp. 1-7. doi:10.1109/CVPRW.2008.4563171. ISBN 978-1-4244-2339-2. Retrieved 2009-07-31. "The Z-cam can measure full frame depth at video rate and at a resolution of 320×240 pixels."
3. PMD[vision] CamCube 2.0 Datasheet (No. 20090601). Siegen, Germany: PMDTechnologies, 2009-06-01, p. 5. Retrieved 2009-07-31. "Type of Sensor: PhotonICs PMD 41k-S (204 x 204)."
4. SR4000 Data Sheet (Rev. 2.6). Zürich, Switzerland: Mesa Imaging, August 2009, p. 1. Retrieved 2009-08-18. "176 x 144 pixel array (QCIF)."
5. "Canesta 101: Introduction to 3D Vision in CMOS". Sunnyvale, California: Canesta, March 2008, p. 16. Retrieved 31 July 2009. "Our current sensor features an array of 160 x 120 pixels. [...] We are considering a 320x240 sensor."
6. PMD[vision] S3 Datasheet (No. 20090601). Siegen, Germany: PMDTechnologies, 2009-06-01, p. 4. Retrieved 2009-07-31. "Type of sensor: PhotonICs 3k-S2, resolution: 64[V] x 48[H] pixels."
7. Medina, A.; Gayá, F.; Pozo, F. (2006). "Compact laser radar and three-dimensional camera". J. Opt. Soc. Am. A 23, pp. 800-805.
8. Stettner, R.; Bailey, H. "Eye-safe laser radar 3D imaging". Advanced Scientific Concepts.
9. Heckenkamp, C. "Das magische Auge - Grundlagen der Bildverarbeitung: Das PMD Prinzip". Inspect, Nr. 1, 2008, S. 25-28.
10. "Mesa Imaging - Products" (http://www.mesa-imaging.ch). August 17, 2009.
11. Medina, A. "Three Dimensional Camera and Rangefinder". United States Patent 5081530, January 1992.
12. Iddan, G. J.; Yahav, G. "3D Imaging in the studio (and elsewhere)". Proc. SPIE 4298, p. 48.
13. seroptronix.se/gated/sealynx.pdf
14. "Sea-Lynx Gated Camera - active laser camera system" (seroptronix.se/gated/sealynx.pdf).
15. "CCD/CMOS Lock-In Pixel for Range Imaging: Challenges, Limitations and State-of-the-Art". CSEM.
16. Gokturk, S. B.; Yalcin, H.; Bamji, C. (2005). "A Time-Of-Flight Depth Sensor - System Description, Issues and Solutions". IEEE CVPR Workshops 2004, pp. 35-45. doi:10.1109/CVPR.2004.17. Retrieved 2009-07-31. "The differential structure accumulates photo-generated charges in two collection nodes using two modulated gates. The gate modulation signals are synchronized with the light source, and hence depending on the phase of incoming light, one node collects more charges than the other. At the end of integration, the voltage difference between the two nodes is read out as a measure of the phase of the reflected light."
17. Hsu, S.; Acharya, S.; Rafii, A.; New, R. (2006). "Performance of a Time-of-Flight Range Camera for Intelligent Vehicle Safety Applications". Advanced Microsystems for Automotive Applications 2006. Springer, pp. 205-219. doi:10.1007/3-540-33410-6_16. ISBN 978-3-540-33410-1.
18. Scheunert, U.; Fardi, B.; Mattern, N.; Wanielik, G.; Keppeler, N. "Free space determination for parking slots using a 3D PMD sensor". IEEE Intelligent Vehicles Symposium 2007, Istanbul, 13-15 June 2007.
19. Elkhalili, O.; Schrey, O.; Ulfig, W.; Brockherde, W.; Hosticka, B. J. "A 64x8 pixel 3-D CMOS TOF image sensor for car safety applications". Proc. European Solid-State Circuits Conference, 2006, pp. 568-571.
20. Captain, Sean (2008-05-01). "Out of Control Gaming". Popular Science. Retrieved 2009-06-15.
21. /components/built-in-sensors/3d-image-sensors/d-imager/
22. http://www.brainvision.co.jp/products/tof/
23. /content/Products/Default.aspx
24. /
25. /
26. http://www.mesa-imaging.ch/prodview4k.php
27. http://www.iee.lu/technologies
28. /en/products/array-sensor

External links
ARTTS (http://www.artts.eu) - research project on time-of-flight cameras funded by the European Commission
Gesturespace (http://iad.projects.zhdk.ch/gesturespace/) - a project at the Zurich University of the Arts (ZHdK)
Workshop on Time of Flight based Computer Vision (TOF-CV) at the 2008 IEEE Conference on Computer Vision and Pattern Recognition
"Calibration and Registration for Precise Surface Reconstruction with TOF Cameras" (http://www.robotic.de/fileadmin/robotic/fuchs/TOFCamerasFuchsMay2007.pd) - Institute of Robotics and Mechatronics, German Aerospace Center
"First steps in enhancing 3D vision technique using 2D/3D sensors" (http://cmp.felk.cvut.cz/cvww2006/papers/14/14.pdf) - Center for Sensor Systems, University of Siegen

Videos
HACTOR - Hand Based Robot Control using 2D/3D Images (/watch?v=tZAKRL3KJJo) at YouTube - MultiCam (2D/3D camera based on PMD) in action
Demo of Orange Vallee's gesture-controlled TV with Softkinetic technology (/watch?v=K0-4-FObaRU) at YouTube - gesture control by a Canesta time-of-flight camera
[CEATEC JAPAN 2008] Hitachi gesture-operated TV (/watch?v=O21SYHDEPOs) at YouTube - gesture control using a Canesta time-of-flight camera
Human-Robot Interface: Contactless with 3D Camera (/watch?v=DK0kvTxb5j4) at YouTube - robot control using a SwissRanger SR3000
Video of a test car drive with PMD sensor (/watch?v=TruK_-All00) at YouTube - front-view car application by a PMDTechnologies time-of-flight camera
Bin-picking system using PMD sensor (/watch?v=lt-WO4RlX6k) at YouTube

Retrieved from "/wiki/Time-of-flight_camera"
Categories: Digital cameras | Image sensor technology in computer vision
This page was last modified on 30 March 2011 at 07:35. Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. See Terms of Use for details.
Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.