Miller's Anesthesia, 7th Edition
4 – Medical Informatics
C. William Hanson

Key Points
1. A computer's hardware serves many of the same functions as those of the human nervous system, with a processor acting as the brain and buses acting as conducting pathways, as well as memory and communication devices.
2. The computer's operating system serves as the interface or translator between its hardware and the software programs that run on it, such as the browser, word processor, and e-mail programs.
3. The hospital information system is the network of interfaced subsystems, both hardware and software, that coexist to serve the multiple computing requirements of a hospital or health system, including services such as admissions, discharge, transfer, billing, laboratory, radiology, and others.
4. An electronic health record is a computerized record of patient care.
5. Computerized provider order entry systems are designed to minimize errors, increase patient care efficiency, and provide decision support at the point of entry.
6. Decision support systems can provide providers with best-practice protocols and up-to-date information on diseases or act to automatically intervene in patient care when appropriate.
7. The Health Insurance Portability and Accountability Act is a comprehensive piece of legislation designed in part to enhance the privacy and security of computerized patient information.
8. Providers are increasingly able to care for patients at a distance via the Internet, and telemedicine will continue to grow as the technology improves, reimbursement becomes available, and legislation evolves.

Computer Hardware

Central Processing Unit

The central processing unit (CPU) is the “brain” of a modern computer. It sits on the motherboard, which is the computer's skeleton and nervous system, and communicates with the rest of the computer and the world through a variety of “peripherals.” Information travels through the computer on “buses,” which are the computer's information highways or “nerves,” in the form of “bits.” Bits are aggregated into meaningful information in exactly the same way that dots and dashes are used in Morse code. Bits are the building blocks for both the instructions, or programs, and the data, or files, with which the computer works.

Today's CPU is a remarkable piece of engineering, comparable in scope and scale to our great bridges and buildings, but so ubiquitous and hidden that most of us are unaware of its miniature magnificence. Chip designers essentially create what can be thought of as a city, complete with transportation, utilities, housing, and a government, every time they create a new CPU. With each new generation of chips, the “cities” grow substantially and yet remain miniaturized to the size of a fingernail. For the purposes of this text, the CPU can be treated as a black box into which flow two highways: one for data, the other for instructions. Inside that black box, the CPU ( Fig. 4-1 ) uses the instructions to determine what to do with the data—for example, how to create this sentence from my interaction with the computer's keyboard. The CPU's internal clock is like a metronome pacing the speed with which the instructions are executed.

Figure 4-1 Programs and data are stored side by side in memory in the form of single data bits—the program tells the central processing unit (CPU) what to do with the data. RAM, random-access memory.

Most people think that the clock speed of the CPU, which is measured in megahertz, or millions of clock cycles per second, determines the performance speed of the unit.
In reality, the performance of a CPU is a function of several factors that should be intuitive to anesthesiologists when an operating room (OR) analogy is used. Let us compare the clock speed to surgical speed, where a fast clock is comparable to a fast surgeon and vice versa. CPUs also have what are called caches, which are holding areas for data and instructions, quite comparable to preoperative holding areas. Information is moved around in the CPU on buses, which can be likened to the number of ORs. In other words, it is possible to have a CPU that is limited because it has a slow or small cache, in the same way that OR turnover is limited by the lack of preoperative preparation beds or too few ORs for the desired caseload.The speed of a processor is a function of the width of its internal buses, clock speed, the size and speed of internal caches, and the effectiveness with which it anticipates the future. Although this last concept may seem obscure, an OR analogy would be an algorithm that predicts a procedure's length based on previous operations of the same type by the same surgeon. Without going into detail, modern processors use techniques called speculation, prediction, and explicit parallelism to maximize efficiency of the CPU.True general-purpose computers are distinct from their predecessor calculating machines in that regardless of whether they are relatively slow and small, as in dedicated devices such as smart phones, or highly streamlined and fast, as in supercomputers, they can perform the same tasks given enough time. This definition was actually formalized by Alan Turing, who is one of the fathers of computing.Each type of CPU has its own instruction set, which is essentially its language. Families of CPUs, such as Intel's processors, tend to use one common language, albeit with several dialects, depending on the specific chip. Other CPU families use a very different language. A complex–instruction set computer (CISC) has a much more lush vocabulary than a reduced–instruction set computer (RISC), but the latter may have certain efficiencies relative to the former. It is the fact that both types of computer architecture can run exactly the same program (i.e., any windowed operating system) that makes them general-purpose computers.MemoryComputers have a variety of different kinds of memory ranging from very small, very fast memory in theCPU to much slower, typically much larger memory storage sites that may be fixed (hard disk) or removable (compact disk, flash drive).Ideally, we would like to have an infinite amount of extremely fast memory immediately available to the CPU, just as we would like to have all of the OR patients for a given day waiting in the holding area ready to roll into the OR as soon as the previous case is completed. Unfortunately, this would be infinitely expensive. The issue of ready data availability is particularly important now, as opposed to a decade ago, because improvements in central processing speed have outpaced gains in memory speed such that the CPU can sit idle for extended periods while it waits for a desired chunk of data from memory.Computer designers have come up with an approach that ensures a high likelihood that the desired data will be close by. This necessitates the storage of redundant copies of the same data in multiple locations at the same time. 
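The payoff of keeping copies of data close to the processor can be made concrete with a minimal Python sketch. The 200-MHz clock and 10-millisecond disk access time are the figures the chapter uses shortly; the cache and main-memory latencies and the hit rates below are assumed round numbers chosen only for illustration. The text-editing example that follows shows the same idea in practice.

```python
# Illustrative sketch of why memory speed limits CPU performance.
# The 200-MHz clock and 10-ms disk access come from the chapter's example;
# the cache/DRAM latencies and hit rates are assumed values for illustration.

CLOCK_HZ = 200e6          # 200-MHz CPU -> 200 million clock cycles per second
DISK_LATENCY_S = 10e-3    # ~10 ms to fetch a random chunk from a hard disk

# Cycles the CPU sits idle while waiting for one random disk access
idle_cycles = CLOCK_HZ * DISK_LATENCY_S
print(f"Idle cycles per disk access: {idle_cycles:,.0f}")   # 2,000,000

# A simple "holding area" model: average access time when a small, fast cache
# sits in front of slower main memory (assumed latencies, in nanoseconds).
CACHE_NS, DRAM_NS = 5, 60

def average_access_ns(hit_rate: float) -> float:
    """Average memory access time for a given cache hit rate."""
    return hit_rate * CACHE_NS + (1 - hit_rate) * DRAM_NS

for hit_rate in (0.50, 0.90, 0.99):
    print(f"hit rate {hit_rate:.0%}: {average_access_ns(hit_rate):.1f} ns")
```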
For example, the sentence I am currently editing in a document might be stored in very fast memory next to the CPU, whereas a version of the complete document, including an older copy of the same sentence, could be stored in slower, larger-capacity memory ( Fig. 4-2 ). At the conclusion of an editing session, the two versions are reconciled and the newer sentence is inserted into the document.

Figure 4-2 Processing of text editing using several “memory” caches in which duplicate copies of the same text may be kept nearby for ready access.

The very fast memory adjacent to the CPU is referred to as cache memory, and it comes in different sizes and speeds. Cache memory is analogous to the preoperative and postoperative holding areas in an OR in that both represent rapidly accessible buffer space. Modern computer architectures have primary and secondary caches that can either be built into the CPU chip or be situated adjacent to it on the motherboard. Cache memory is typically implemented in static random-access memory (SRAM), whereas the larger and slower “main memory” consists of dynamic random-access memory (DRAM) modules. RAM has several characteristics, including the facts that it can be read or written (in contrast with read-only memory), it disappears when the electricity is turned off, and it is much faster than the memory on a disk drive.

To understand the impact of the mismatch between memory access times and CPU speed, consider the following. Today's fastest hard disks have access times of about 10 milliseconds (to get a random chunk of information). If a 200-MHz CPU had to wait 10 milliseconds between each action requiring new data from a hard disk, it would sit idle for 2 million clock cycles between each clock cycle used for actual work. Furthermore, it takes 10 times longer for the computer to get data from a compact disk or digital video disk than it does for data from a hard disk.

Communications

There are many functionally independent parts of a computer that need to communicate seamlessly and on a timely basis. The keyboard and mouse have to be able to signal their actions, the monitor must be refreshed continuously, and the memory stores have to be read and written correctly. The CPU orchestrates all of this by using various system buses as communication and data pathways. Whereas some of the buses are dedicated to specific tasks on newer computers, such as communication with the video processor over a dedicated video bus, others are general-purpose buses.

Buses are analogous to highways traveling between locations in the computer ( Fig. 4-3 ). In most computers, the buses vary in width, with the main bus typically being the widest and other buses narrower and therefore of lower capacity. Data (bits) travel along a bus in parallel, like a rank of soldiers, and at regular intervals determined by the clock speed of the computer. Older computers had main buses 8 bits wide, whereas newer Pentium-class computers use buses as wide as 64 bits.

Figure 4-3 Buses are like highways, where the number of available “lanes” relates to bus capacity.

Input-output buses link “peripherals” such as the mouse, keyboard, removable disk drives, and game controllers to the rest of the computer. These buses have become faster and increasingly standardized. The universal serial bus (USB) is a current widely accepted standard, as is Apple's proprietary FireWire bus.
These buses allow “on-the-fly” attachment and removal of peripherals via a standardized plug, and a user can expect that a device plugged into one of these ports will identify itself to the operating system and function without the need for specific configuration. This is a distinct improvement over the previous paradigm, in which a user typically needed to open the housing of the computer to attach a new peripheral and then configure a specific software driver to permit communication between the device and the computer.In addition to their local computing function, modern personal computers have become our conduits to networks and must therefore act as terminal points on the Internet. As with houses or phones, eachcomputer must have an individual identifier (address, phone number) to receive communications uniquely intended for it. Examples of these kinds of specific addresses are the IP (Internet protocol) address and the MAC (media access control) address. The IP address is temporarily or permanently assigned to a device on the Internet (typically a computer) to uniquely identify it among all of the other devices on the Internet. The MAC address is used to specifically identify the network interface card for the computers that assign IP addresses.The computer must also have the right kind of hardware to receive and interpret Internet-based communications. Wired and wireless network interface cards are built into all new computers and have largely replaced modems as the hardware typically used for network communications. Whereas a modem communicates over existing phone lines also used for voice communication, network cards communicate over channels specifically intended for computer-to-computer communications and are almost invariably faster than modems.Although we commonly think of the Internet as being one big network, it is instructive to understand a little bit about the history of computer networking. In the beginning there were office networks and the progenitor Internet. The first office network was designed at the Palo Alto Research Center, which is the Xerox research laboratory where a number of major computer innovations were developed. That office network was called Ethernet and was designed as part of the “office of the future,” where word-processing devices and printers were cabled together. Separately, the ARPAnet was the Defense Advanced Research Project Agency's creation and linked mainframe computers at major universities. Over time, the two networks grew toward one another almost organically, and today we have what seems to be a seamless network that links computers all over—and above—the world.Networking technology has evolved almost as rapidly as computer technology. As with buses in a computer, the networks that serve the world can be likened to highways. Backbone networks ( Fig. 4-4 ) are strung across the world and have tremendous capacity, like interstate highways. Lower-capacity systems feed into the backbones, and traffic is directed by router computers. To facilitate traffic management, before transmission messages are cut up into discrete packets, each of which travels autonomously to its destination, where they are reassembled. Internet packets may travel over hard-wired, optical, or wireless networks en route to their destination.Figure 4-4 Lower-speed networks are attached to high-speed “core” networks that span the globe. LAN, local area network. Computer SoftwareOperating System and ProgrammingThe operating system (OS) is the government of the computer. 
As with a municipal government, the OS is responsible for coordinating the actions of disparate components of a computer, including its hardware and various software programs, to ensure that the computer runs smoothly. Specifically, it controls the CPU, memory, interface devices, and all the programs running on the machine at any given time. The OS also needs to provide a consistent set of rules and regulations to which new programs must adhere to participate.Although most of us think of Apple and Windows synonymously with OSs, there are other OSs that deservemention. Linux is an open-source, meaning nonproprietary, OS for personal computers that is distributed by several different vendors but continuously maintained by a huge community of passionate programmer devotees who contribute updates and new programs. In addition, every cell phone and smart device has its own OS that performs exactly the same role as for a personal computer OS.OSs can be categorized into four broad categories ( Fig. 4-5 ). A real-time OS is typically used to run a specific piece of machinery, such as a scientific instrument, and is dedicated solely to that task. A single-user, single-task OS is like that found on a cell phone, where a single user does one job at a time, such as dialing, browsing, or e-mail. Most of today's laptop and desktop computers are equipped with single-user, multitasking OSs, whereby a single user can run several “jobs” simultaneously, such as word processing, e-mail, and a browser. Finally, multi-user, multitasking OSs are usually found on mainframe computers and run many jobs for many users concurrently.Figure 4-5 Several operating system configurations.All OSs have a similar core set of jobs: CPU management, memory management, storage management, device management, application interfacing, and user interfacing. Without getting into detail beyond the scope of this chapter, the OS breaks a given software job down into manageable chunks and orders them for sequential assignment to the CPU. The OS also coordinates the flow of data among the various internal memory stores, as well as determines where that data will be stored for the long term and keeps track of it from session to session. The OS provides a consistent interface for applications so that the third-party program that you buy at a store will work properly on a given OS. Finally, and of most importance for manyof us, the OS manages its interface to you, the user. Typically, today that takes the form of a graphic user interface (GUI).E-mailE-mail communication over the Internet antedated the browser-based World Wide Web by decades. In fact, the earliest e-mail was designed for communication among multiple users in a “time-sharing,” multi-user environment on a mainframe computer. E-mail was used for informal and academic communications among the largely university-based user community. Without going into great detail, an e-mail communication protocol was designed so that each message included information about the sender, the addressee, and the body of the message. The protocol is called the Simple Mail Transfer Protocol (SMTP), and the process of message transmission proceeds as follows. The sender composes a message via a software-based messaging program (such as Outlook, Gmail). The sender then applies the recipient's address and dispatches the message. 
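As a rough sketch of the composition and dispatch steps just described, the following Python example builds a message and hands it to an SMTP relay using only the standard library. The addresses, server name, and credentials are placeholders, and a real patient-provider exchange would also need the privacy safeguards discussed later in this chapter. Once dispatched, the message is relayed between mail servers as described next.

```python
# Minimal sketch of composing and dispatching an e-mail via SMTP.
# All addresses, the server name, and the credentials are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "dr.smith@example-clinic.org"
msg["To"] = "patient@example.com"
msg["Subject"] = "Follow-up after your preoperative visit"
msg.set_content("Your blood pressure log looks fine; continue your current doses.")

# Hand the message to an SMTP relay, which forwards it toward the recipient.
with smtplib.SMTP("smtp.example-clinic.org", 587) as server:
    server.starttls()                      # encrypt the connection in transit
    server.login("dr.smith", "app-password")
    server.send_message(msg)
```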
The message travels through a series of mailboxes, much as a regular letter does, and eventually arrives at the addressee's mailbox, where it sits awaiting “pickup.” Although e-mail has had dramatic and largely positive implications for the connectedness of organizations and people, it has also created hitherto unimagined problems, including spam, privacy issues, and the need for new forms of etiquette.The term spam is said to have come from a Monty Python skit. Spam is such a ubiquitous problem that most e-mail crossing the Internet is spam at this point. Spam is essentially bulk e-mail and was never envisioned by the creators of SMTP. Spam is a generic problem with e-mail communications, but the issues of privacy and etiquette are of much greater relevance for medically oriented e-mail.The American Medical Informatics Association has taken a lead role in defining the issues associated with e-mail in the medical setting. The organization defined patient-provider e-mail as “computer based communication between clinicians and patients within a contractual relationship in which the health care provider has taken on an explicit measure of responsibility for the client's care.”[1] A parallel set of issues relates to medically oriented communications between providers. [2] [3] [4] [5] Another category of medically oriented communications is that in which a provider offers medical advice in the absence of a “contractual relationship.” An egregious example of the latter is the prescription of erectile dysfunction remedies by physicians who review a Web-based form submitted by the “patient” and then prescribe a treatment for a fee.In theory, e-mail is a perfect way to communicate with patients. [6] [7] [8] [9] Because of its asynchronous nature, it allows two parties who may not be available at the same time to communicate efficiently ( Fig. 4-6 ), and it represents the middle road between two other types of asynchronous communication: voice mail and traditional mail. E-mail can also be tailored to brief exchanges, more structured communications, and information broadcasts (such as announcements). As such, a patient could send interval updates (blood pressure, blood sugar) to the physician. Alternatively, the physician could follow up on an office visit by providing educational material about a newly diagnosed condition or planned procedure.Figure 4-6 E-mail is an effective form of communication between a patient and a physician because it does not require bothparties to be present at the same time.Even though e-mail has many advantages in medicine, there are a variety of risks associated with its use, which has slowed adoption.[10] Some of the problems are generic to any e-mail exchange. Specifically, it is a more informal and often unfiltered form of communication than a letter and often has the immediacy of a conversation but lacks its visual and verbal cues. Emoticons (such as the use of “:)” to indicate that a comment was sent with a “smile”) evolved as a remedy for this problem.E-mail is also permanent in the sense that copies of it remain in mailbox backups even after deletion from local files ( Fig. 4-7 ). Every e-mail should therefore be thought of as discoverable from both a liability and recoverability standpoint. 
Before dispatch, e-mail should be scrutinized for information or content that might be regretted at a later date.

Figure 4-7 E-mail leaves copies of itself as it travels across the Internet.

E-mail is also vulnerable to inadvertent or malicious breaches in privacy or disclosure through improper handling of data at any point along the “chain of custody” between the sender and the recipient. Alternatively, a hacker could potentially acquire sensitive medical information from unsecured e-mail or possibly even alter medical advice and test results in an e-mail from physician to patient.

The Health Insurance Portability and Accountability Act (HIPAA) legislation mandates secure electronic communication in correspondence regarding patient care. Three prerequisites for secure communication are authentication (that the message sender and recipient are who they say they are), encryption (that the message arrives unread and untampered with), and time/date stamping (that the message was sent at a verifiable time and date), although these techniques are not yet widely deployed in the medical community.

It is beyond the scope of this chapter to go into great detail about the methods used to authenticate, encrypt, and time-stamp e-mail. However, it is possible to ensure each of these elements by using mathematically linked pairs of numbers (keys), in which an individual's public key is published and freely available through a central registry (like a phone book), whereas a linked private key is kept secret ( Fig. 4-8 ). Public key encryption combined with traditional encryption is used to transmit messages securely across public networks, ensure that messages can be read only by a specific individual, and digitally sign the message.

Figure 4-8 Public/private key encryption in which Joe sends a message intended only for Bob by using Bob's public key—the message remains encrypted until Bob decrypts it with his private key.
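The key-pair mechanism can be sketched in a few lines of Python with the third-party cryptography package. This is a minimal illustration of hybrid public-key encryption and digital signing; the names (Joe, Bob) follow the figure, the clinical message text is invented, and real clinical messaging would rely on vetted, certificate-based infrastructure rather than ad hoc code like this.

```python
# Minimal sketch of hybrid public-key encryption plus a digital signature,
# using the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Bob generates a linked key pair; the public half could be published in a
# directory (the "phone book"), and the private half stays secret.
bob_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_public = bob_private.public_key()

# Joe encrypts the bulk of the message with a fast symmetric ("traditional")
# key, then wraps that key with Bob's public key (hybrid encryption).
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"Your potassium was 4.1 mEq/L today.")
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = bob_public.encrypt(session_key, oaep)

# Only Bob's private key can unwrap the session key and read the message.
recovered_key = bob_private.decrypt(wrapped_key, oaep)
print(Fernet(recovered_key).decrypt(ciphertext).decode())

# A digital signature provides the authentication element: Joe signs with his
# own private key, and anyone can verify the signature with Joe's public key.
joe_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = joe_private.sign(ciphertext, pss, hashes.SHA256())
# verify() raises InvalidSignature if the message was altered in transit.
joe_private.public_key().verify(signature, ciphertext, pss, hashes.SHA256())
```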
Although the use of e-mail for medical patient-provider and provider-provider communications is growing, it is not yet universally adopted for several reasons, including physician distrust of the medium, unfamiliarity with software, lack of standards, and lack of clear methods for reimbursement for time spent in e-mail communications. Nevertheless, several professional societies have published consensus recommendations about the management of e-mail in medical practice. Common consensus-based elements are enumerated in Box 4-1.

Box 4-1 Suggested Rules for E-mail Correspondence in a Medical Setting
All patient-provider e-mail should be encrypted.
Correspondents should be authenticated (guarantee you are who you say you are).
Patient confidentiality should be protected.
Unauthorized access to e-mail (electronic or paper) should be prevented.
The patient should provide informed consent regarding the scope and nature of electronic communications.
Electronic communications should (ideally) occur in the context of a preexisting physician-patient relationship.
Online communications are to be considered a part of the patient's medical record and should be included with it.

Browser

Many people think of the Internet and the World Wide Web as one and the same. The Internet is the worldwide network, whereas the Web is one of its applications, characterized by the browsers with which its users interact. The browser was invented by Tim Berners-Lee at the European Organization for Nuclear Research, commonly known as CERN, in 1990. Marc Andreessen wrote the Mosaic browser and subsequently the Netscape browser, which, like all subsequent browsers, has a GUI and uses a specific “language” called hypertext markup language (HTML). Microsoft eventually developed its own version of the browser, Internet Explorer, after recognizing the inevitability of the Web.

The browser is a computer program, just like a word-processing or e-mail program, with a GUI. It can be thought of as a radio or television insofar as it serves as an interface to media that do not originate within it. The address of a webpage is analogous to the channel or frequency of a television or radio, and the browser “tunes in” to that address. In actuality, the local browser on your machine communicates with a server somewhere on the Internet (at the address specified in the address line) and retrieves pages written in the HTML language. The webpage displayed on your local browser was first constructed on the server and then sent to you.

The original HTML was extremely spare and permitted the construction of very simple webpages. A variety of new “languages” and protocols have subsequently come into existence, such as Java, JavaScript, ActiveX, Flash, and others, that allow enhancements to HTML. New browsers support interactivity, security, display of audio and video content, and other functions. Even though the scope of topics that could be covered in discussing browser communications far exceeds that of this chapter, certain issues deserve mention.

“Cookies” is the term used for short lines of text that act like laundry tickets and are used by an Internet server (such as a Google search engine server) to “remember” things about the client computers with which it interacts. Cookies allow the server to keep track, for example, of the items that you have put in your virtual shopping cart as you shop ( Fig. 4-9 ). Although cookies are not inherently risky, there are other risks to the use of a browser.

Figure 4-9 Cookies are used by a website, for example, to keep track of the items that a user has put in the “shopping cart.”

Like a television, the browser acts like a window on the Internet, and for a long time it was safe to think of that window as being made of one-way glass. Unfortunately, many of the new innovations that allow us to function interactively with websites also have built-in flaws that permit malicious programmers to gain access to your computer.
The best way to protect a computer involves timely application of all updates and patches issued by software manufacturers and the use of antivirus software with up-to-date definitions.

Computers and Computing in Medicine

Hospital Information Systems

Modern hospital information systems invariably fall somewhere on the spectrum between a monolithic single comprehensive system design, wherein a single vendor provides all of the components of the system, and a “best-in-breed” model consisting of multiple vendor-specific systems interacting through interfaces or, more typically, an interface “engine.” [11] [12] [13] [14] The monolithic system has the advantage of smooth interoperability, but some of the component elements may be substantially inferior to those offered by best-in-breed vendors.

Component elements of a hospital information system include administrative, clinical, documentation, billing, and business systems. [15] [16] [17] Medical information technology is increasingly subject to governmental regulation, security concerns, and standards. Standards are essential for interoperability among systems and to ensure that systems use uniform terminology.[18]

Health Level 7 (HL7) is an accepted set of rules and protocols for communication among medical devices. Clinical Context Management Specification (also known as CCOW) is a method to enable end users to seamlessly view results from disparate “back-end” clinical systems as though they were integrated. Some of the common medical terminologies or vocabularies include the Systematized Nomenclature of Medicine (SNOMED) and the International Classification of Diseases (the ICD family of classifications).[19]

Modern complicated medical information systems often weave a host of disparate systems at geographically dispersed locations into an extended “intranet.” A core hospital, for example, may share an intranet with a geographically remote outpatient practice, or several hospitals in the same health system may coexist within the same intranet. Some of the elements may be physically connected ( Fig. 4-10 ) along a network “backbone,” whereas others may use virtual private network (VPN) connections that allow the user to appear to be part of the network while at a remote location.[16]

Figure 4-10 Modern health care information systems consist of elements attached to a backbone. ADT, admission, discharge, and transfer.
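To give a flavor of the Health Level 7 (HL7) messaging standard mentioned above, here is a short Python sketch that splits a pipe-delimited HL7 version 2 admission (ADT) message into segments and fields. The sample message content, the field positions used, and the parse_segments helper are illustrative assumptions; this is a toy example, not a complete or validated HL7 parser.

```python
# Toy sketch of the pipe-delimited HL7 v2 segment structure (invented data).
SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|ADT_SYSTEM|WARD|LAB_SYSTEM|LAB|200901011230||ADT^A01|0001|P|2.3",
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19500101|F",
    "PV1|1|I|OR^3^1",
])

def parse_segments(message: str) -> dict[str, list[str]]:
    """Split an HL7 v2 message into segments and their '|'-separated fields.

    Segments are keyed by name (MSH, PID, ...); repeated segments would
    overwrite each other here, which a real parser must handle.
    """
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

msg = parse_segments(SAMPLE_HL7)
# In HL7 v2, PID-5 carries the patient name as family^given components.
family, given = msg["PID"][5].split("^")[:2]
print(f"Message type: {msg['MSH'][8]}")   # ADT^A01, an admit notification
print(f"Patient:      {given.title()} {family.title()}")
```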
Surgical Nursing, 7th Edition: Nursing Care of the Patient Undergoing Anesthesia, Pre-class Introduction Answers

I. Type A1 questions (number of questions: 20; points: 40.00)

1. The most dangerous incident or complication in a patient under general anesthesia before consciousness is regained is (2.00 points)
A. Asphyxiation from aspirated vomitus √
B. Hypothermia
C. Falling from the bed
D. Dislodgement of a drainage tube
E. Accidental injury

2. To prevent vomiting during general anesthesia and postoperative abdominal distention, the preoperative fasting times for food and for water are (2.00 points)
A. 4 h for food, 2 h for water
B. 6 h for food, 4 h for water
C. 8 h for food, 6 h for water
D. 10 h for food, 4 h for water
E. 12 h for food, 4-6 h for water √

3. The purpose of surgical anesthesia is (2.00 points)
A. To keep the airway patent
B. To maintain circulatory stability
C. To prevent postoperative infection
D. To relax muscles, eliminate pain, and ensure safety during the operation
E. To prevent postoperative complications

4. Regarding the basic requirements of general anesthesia, the incorrect statement is (2.00 points)
B. Complete suppression of the stress response √
C. Complete analgesia
D. Muscle relaxation
E. Relatively stable respiration, circulation, and other physiologic parameters

5. The premedication that can prevent local anesthetic toxicity is (2.00 points)
A. Chlorpromazine
B. Promethazine
C. Atropine
D. Pethidine
E. Phenobarbital sodium √

6. Compared with intravenous anesthesia, an advantage of inhalation anesthesia is (2.00 points)
A. Rapid induction
B. Convenient administration
C. Safe, nonexplosive drugs
D. No airway irritation
E. Anesthetic depth that is easy to adjust √
Explanation: In inhalation anesthesia, a volatile or gaseous anesthetic is inhaled into the lungs through the respiratory tract, absorbed across the alveolar capillaries into the circulation, and carried to the central nervous system, where it produces a general anesthetic effect. Its advantages are a safe and effective state of unconsciousness together with muscle relaxation and loss of sensation. Because the anesthetic enters and leaves the body through pulmonary ventilation, the depth of anesthesia is easier to adjust than with other anesthetic methods.

7. The surgical site for which epidural anesthesia cannot be used is (2.00 points)
A. The head √
B. The upper limb
C. The abdomen
Explanation: In epidural anesthesia, local anesthetic is injected into the epidural space to block the spinal nerve roots and produce temporary anesthesia in the regions they supply. It is suitable for any operation except those on the head.

8. The most serious complication of epidural anesthesia is (2.00 points)
A. Hypotension
B. Vasodilation
C. Urinary retention
D. Slowed respiration
E. Total spinal anesthesia √
Explanation: Blockade of all the spinal nerves is called total spinal anesthesia and is the most dangerous complication of epidural anesthesia.
Clinical Anesthesia Procedures of the Massachusetts General Hospital, 7th Edition (Chinese translation) [original edition edited by Peter F. Dunn (United States); chief translator Yu Yonghao; reviewed by Li Wenshuo and Deng Naifeng]. The handbook systematically presents basic clinical anesthesia skills and supplements textbook and journal material with current developments in anesthesiology and critical care medicine. Each chapter includes suggested readings, and the end of the book adds information on the actions and use of commonly prescribed drugs, with the aim of enriching clinical teaching and encouraging further in-depth study.

The handbook is a practical reference for residents in clinical anesthesia and critical care, nurse anesthetists, medical students, and residents in related specialties. As an effective supplement to textbooks, it has been updated every few years since its first publication and has become a standard "pocket book" for clinical anesthesiologists in the United States and other countries. Its practicality and portability have made it popular among young anesthesiologists, and the detailed suggested readings at the end of each chapter make it easy for readers to pursue the references.

Yu Yonghao, Department of Anesthesiology, Tianjin Medical University General Hospital

Preface: The seventh edition of Clinical Anesthesia Procedures of the Massachusetts General Hospital was written by the physicians of the Department of Anesthesia and Critical Care of the Massachusetts General Hospital together with their colleagues and associated medical staff. The handbook has always emphasized basic clinical skills, concentrating on the safe conduct of anesthesia, perioperative care, and pain management. It reflects current clinical practice at the Massachusetts General Hospital and represents the basic training model for anesthesia residents and for critical care, pain, and cardiovascular anesthesia physicians. The handbook supplements textbook and journal material with current developments in anesthesiology and critical care medicine. It is written in an accessible yet rigorous style and is intended for experienced anesthesiologists, anesthesia residents, nurse anesthetists, medical students, medical and surgical residents, respiratory therapists, and other staff involved in the perioperative care of patients. Its purpose is to enrich clinical teaching and to encourage further in-depth study; accordingly, each chapter includes recommended reading material. As in previous editions, every chapter has been carefully reviewed and updated, while appropriate content from earlier editions has been retained. The chapters are closely linked and give the reader an accessible framework for understanding the material.
2 – Scope of Modern Anesthetic Practice
William L. Young, Jeanine P. Wiener-Kronish, Lee A. Fleisher

Key Points
1. With the increase in the elderly population, more of the surgeries performed will be procedures required by elderly patients.
2. Minimally invasive procedures are increasing; anesthesiologists will be performing more anesthetic procedures outside operating rooms. Anesthesia may be the major risk to patients as the surgical procedures become more minimal.
3. The mandates for quality, competency, and uniform process will change the way anesthesia is delivered. More standardization and protocols will be used; this will allow more evaluation and research as to what optimal anesthesia is and what competent anesthesiologists are required to do.
4. The increase in nurses with advanced degrees will change the number of anesthetics delivered by physicians. Team management and relationships between physicians and nurses will become more crucial, and the demand for skills in personnel management will increase.
5. Not enough research is being done by anesthesiologists. Anesthesiologists will need to engage in research to maintain an academic foothold. Opportunities for multidisciplinary research are increasing, and they need to be embraced to increase the number of research-trained anesthesiologists.

Forces That Will Change Anesthetic Practice

Multiple changes are occurring that will affect the role of the anesthesiologist in the United States and perhaps globally. The population of industrialized nations is aging, so that many more operations are performed on elderly patients; this dictates which operations are being performed. The cost of medical care is increasing globally, and perhaps most rapidly in the United States. The increased cost will lead to more scrutiny regarding the need for operations, increased focus on quality and on care supported by research evidence, changes in Medicare reimbursement, changes in the ability of patients to pay, and a demand for the use of less costly health care providers whenever possible. Technologic advances are leading to less invasive procedures that can be done on patients who heretofore would be denied routine surgical procedures. The mandates for quality metrics and evaluation of processes will change the way we practice medicine. Changes in personnel also will affect the workforce in practices. We can only speculate as to what the practice of anesthesia will be like in the next 10 years, but these forces will likely have a major effect ( Fig. 2-1 ).

Figure 2-1 Changing scope and settings of anesthesia and perioperative medicine. A, The Cure of Folly, by Hieronymus Bosch (c. 1450-1516), depicting the removal of stones in the head, thought to be a cure for madness. B, Friedrich Esmarch amputating with the use of anesthesia and with antisepsis. Woodcut from Esmarch's Handbuch der Kriegschirurgischen Technik (1877). C, Harvey Cushing performing an operation, with the Harvey Cushing Society observing, 1932. D and E, Placement of a deep brain stimulator for the treatment of Parkinson's disease using real-time magnetic resonance imaging technology (MR fluoroscopy). The procedure takes place in the MR suite of the radiology department. The patient is anesthetized (D) and moved into the bore of the magnet (E). F and G, A sterile field is created for intracranial instrumentation (F), and placement of electrodes is done using real-time guidance (G).
(A, Courtesy of Museo del Prado, Madrid; B, courtesyof Jeremy Norman & Co, Inc; C, photograph by Richard Upjohn Light [Boston Medical Library]; D-G, courtesy of Paul Larson, MD, UCSF/SF Veterans Administration Medical Center.)Aging of SocietyWith the aging of the population and improvements in anesthetic and surgical methods, the fraction of the elderly population undergoing surgical procedures is increasing in the United States (see Chapter 71 ). Use of surgical services in older patients is not unexpectedly higher than in younger patients. In the Centers for Disease Control and Prevention report of inpatient hospitalizations for 2005, there were 45 million procedures performed on inpatients with a similar number of outpatient procedures. From 1995 through 2004, the rate of hip replacements for patients 65 years old and older increased 38%, and the rate of knee replacements increased 70%. There seems to be a direct correlation between age and use of surgical services.[1]Changes in Location of Care DeliveryBecause of the high costs to the insurance industry, the pressure to move less invasive surgical procedures to locations remote from the hospital setting will continue, frequently motivated by changes in reimbursement.[2] Providing anesthesia in ambulatory surgical settings and offices has increased dramatically over the last several decades (see Chapter 78 ). As procedures become less invasive than those currently performed in the operating room, they will be performed in procedure units. It will be important to determine the need for anesthesiologists to provide traditional anesthesia and moderate to deep levels of conscious sedation in these settings. Postoperative care has been shifted from the medical setting to the families, which may present difficulties in elderly patients. Anesthesiologists must be involved in determining which patients are appropriate candidates to have different procedures in these different locations of care, and the level of monitoring needed to perform these procedures safely.[3]Cost of Medical CareAs the cost of health care in the United States approaches 15% of the gross national product, there hasbeen increased interest in determining the factors that are increasing the costs, attempting to find methods to decrease the cost, and obtaining more valuable health care for the money spent. The primary driver of cost in the United States apparently is technical progress, as health care costs are increasing throughout the world, regardless of the insurance system. [4] [5] [6] The increase in elderly patients and patients with chronic disease in the population also is increasing health care costs.[7]The increase in cost has led to a movement to get more value for the money spent. Pay-for-performance programs have been created (i.e., rewarding medical care that is consistent with published evidence and not paying for care that is inconsistent with evidence). [8] [9] [10] The concept and reality of pay-for-performance programs has now moved to other countries, including the United Kingdom.[11]In the nonsurgical arena, the concept of pay-for-performance has been studied for several years. [12] [13] In addition to paying for performance, in the United States, there is increasing emphasis on not paying for “never” events, such as decubitus ulcers or urinary tract infections, unless they are present on admission to the hospital. 
Because of the role of anesthesiologists in the entire continuum of perioperative care, including postoperative intensive care and pain management, we have an opportunity to influence many of the practices that can be associated with poor outcomes and increased cost, but that traditionally have not been considered under our domain of care.Appropriate and timely administration of antibiotics has a significant impact on surgical site infection, but before the initiation of the Surgical Care Improvement Project, many anesthesiologists were arguing that control of antibiotics was not within their domain (see Chapter 5 ).[14] Anesthesiologists tend to focus on intraoperative cardiac arrests and other short-term outcomes, such as postoperative nausea and vomiting (see Chapters 33 and 86 [Chapter 33] [Chapter 86] ). Anesthesiologists will need to rethink this approach for other “never” conditions, such as urinary tract infections because they help determine the need for Foley catheters. Anesthesiologists and intensivists also can have a significant impact on the rate of ventilator-associated pneumonia. Some of these proposed measures, particularly the use of ventilator-associated pneumonia as a quality measure, have become quite controversial, however.[15] Pain is considered the fifth vital sign, and the management of postoperative pain is another area in which anesthesiologists can have a significant impact with respect to cost and potential interaction with other members of the hospital team. Process Assessment and Quality MetricsAnesthesiology was among the first professions to focus on reducing the risk of complications, and anesthesiologists were among the first groups to develop evidence-based guidelines and standards, as shown by the American Society of Anesthesiologists standards and practice parameters (see Chapter 5 ).[16] It will be important to continue to be involved in multidisciplinary approaches to surgical care. Examples where anesthesiologists were initially less involved include the Society of Thoracic Surgeons database and the National Surgical Quality Improvement Project. [17] [18] More recently, anesthesiologists have become involved in both groups, and the Society of Cardiovascular Anesthesiologists have begun discussions with the Society of Thoracic Surgeons. Anesthesiologists have been involved from an early stage in quality initiatives with the Institute of Healthcare Improvement and the Surgical Care Improvement Project.[19] Other quality measures that will have an impact on anesthesiologists include the new demand for metrics of competency, to be measured on a focused and an ongoing basis. Defining competency will demand that anesthesiologists adhere to more protocols, and that the concept of safe anesthesia be standardized. Rather than stifling medical innovation, standardization should be viewed as a mechanism for evaluating process and outcomes because comparisons cannot be made without standardization. Anesthesiologists will need to engage in creating quality and competency metrics, or these instruments will be created by others. This is an opportunity to formulate meaningful metrics that can be used in training physicians. Metrics also will be sought for nurse anesthetists and other health care professionals.Changing processes has become a cottage industry in medical care, with courses being offered on how to change behaviors and processes in medical care. 
These mandates offer the opportunity for more research as to whether changing processes leads to improved patient outcomes. These mandates also allow anesthesiologists to assume a leadership role in team management. For anesthesiologists to accomplish this role, new skills need to be taught, including leadership training, improved communication skills, and improved relationship training.One advantage that anesthesiologists have is a long tradition and training in system approaches to care. These date back to the original checklists for the anesthesia machine. It is crucial that this skill set be disseminated beyond the intraoperative setting. Many ambulatory surgery centers are directed by anesthesiologists.Changes in PersonnelThere are approximately 250,000 active physicians, one third of whom are older than 55 years old and likely to retire by 2020.[20] Although in the 1960s enrollment in U.S. medical schools doubled, during the years 1980 through 2005, the enrollment has been flat. There has been zero growth in U.S. medical school graduates. In this same interval, the U.S. population grew by more than 70 million, creating a discrepancy between the supply of medical school graduates and the demand for physician-associated care.There also has been a significant increase in the number of women in medical schools, so that about 50% of medical students are now women.[21] Women tend not to work as many hours as their male counterparts, even when part-time status is taken into consideration.[22] Also, today's younger physicians choose to work fewer hours than their older counterparts, regardless of gender. [20] [23]There has been a steady use of international medical graduates; 60,000 international medical graduates are used as residents and constitute 25% of all residents in training.[24] There also has been an increase in the number of osteopath schools and schools offering advanced degrees in nursing, including training of nurses to become nurse anesthetists.[20] Given the increase in demand for medical care resulting from the increase in the geriatric population, this need will most likely be met by a combination of physicians and nonphysician personnel.ResearchIn terms of creative new investigations, most benchmarks suggest that the specialty of anesthesiology fares poorly compared with other disciplines, especially clinical disciplines. Using data gleaned from publicly available National Institutes of Health (NIH) sources, Reves[25] produced a troubling figure showing that anesthesiology ranked second to the last for many medical disciplines. The exact position on any such figure would undoubtedly change from year to year. The fact that U.S. anesthesiology inhabits the lowest quartile is of concern, however, because the external forces on the practice components are generally applicable to all specialties.NIH is not the only source of funding that may influence the specialty of anesthesiology; it is not even the largest portion of total research funding in the United States ( Fig. 2-2 ).[26] For all sources, there has been a doubling over the last decade in research expenditures for health and biomedical science research, although compared with biologically based disciplines, health services research is considerably less well funded. In anesthesiology journals, the fraction of non-U.S., original peer-reviewed articles has increased dramatically. This increase does not seem to be attributable directly to research support. 
Adjusted per capita, research support in Europe is only 10% of that in the United States, even though the proportion of scientists in the population is similar.[27]Figure 2-2 Research expenditures in the United States, 1994 to 2003, by funding source (see text).There is no a priori reason why anesthesiology should differ from any other medical specialty in terms of research output. In our opinion, it should excel. One might argue that practitioners at academic institutions might be better compared in terms of surgical versus medical specialties, but even in that comparison we fare poorly. One line of remedy is to develop clinician-scientist training programs further, such as the NIH T-32 model. This approach is limited, however, by the fact that selection of individuals entering the specialty has already occurred.One of the most exciting and promising developments in this regard has been the initiative by several programs in the United States to construct a special residency track that allows training for scientific independence to occur roughly in parallel with clinical residency training over the course of a 4-year track after internship. Such an approach recognizes that, to change the culture of the specialty, we must build from the ground up. Not all physicians who enter anesthesiology training go on to participate in the research/academic sector. Similar to other highly visible specialties, however, a sufficiently large cadre will be available to replenish those who leave active practice and further build up intellectual capital for future generations.One of the challenges in training is the inherent and increasingly diverse nature of the disciplines involved in any one-subject area. The term “interdisciplinary research” has become a tautology. Practically all new frontiers lie at the boundaries of established departmental or specialty divisions, which are largely a historical relic of 19th century or early 20th century conceptualizations. A look at any large institution's roster of academic divisions yields a growing number of “centers,” “programs,” and “institutes,” reflecting the ever-increasing interdependency of branches in biomedical knowledge. [28] [29] In basic science departments, with conjugate names such as “Physiology and Cellular Biophysics,” “Anatomy and Cell Biology,” “Biochemistry and Biophysics,” and “Cellular and Molecular Pharmacology,” it is becoming increasingly difficult to differentiate one faculty research program from another, solely on the basis of thetopics and methods of study. Although this differentiation is clearly less complicated for domains that do not involve patient care, the trend is evident. One might cite the example of endovascular surgery as anexample in the collision of science, technology, and historical boundaries of medical specialties.[30]Medical research is at one level original creative work that involves systemic investigation of medical phenomena with the direct or indirect consequence of improving health care. Activity that might be subsumed under “medical research” is scholarly synthesis of available data to generate new insights into phenomena. Scholarship that involves this synthesis is perhaps underappreciated and underemphasized. The most common type of publication in this category is the review article. Although often review articlesare written by the individuals performing the original research, this is neither a necessary nor a sufficient precondition for a meaningful review. 
Especially for clinically oriented reviews, a broad perspective on a particular clinical practice may not only be sufficient, but also desirable.Ramachandran[31] quotes Medwar: “An imaginative conception of what might be true is the starting point of all great discoveries in science.” Ramachandran says that if he would show the reader a talking pig, it is unlikely that he or she would respond: “Ah, but that's just one pig. Show me a few more, and then I might believe you!” In this regard, an unintended consequence of moving toward quantitative indices for impact of biomedical journals has led to the demise of the case report. The case report is just such a Medwarian “imaginative conception of what might be true.” It is also an ideal setting in which to perform a synthesis of available knowledge on a particular topic. Well-written and timely case reports can add significantly to the medical literature and provide an “early warning” function before laborious efforts are undertaken to conduct larger scale efforts to characterize a particular complication or phenomenon. Finally, case reports are“entry-level” scholarly activities that students, residents, and junior faculty can use as a stairstep to more complex modes of writing.An area of potentially increasing contribution of anesthesia and perioperative care is in the realization of clinical trials for efficacy of surgical therapy. It is not unreasonable to assume that reimbursement fordelivery of clinical care in the future may be increasingly tied to participation in some form of organized assessment of efficacy, such as a randomized controlled trial (e.g., the case of lung reduction surgery).[32] Such considerations are especially pertinent for procedures that are highly reimbursed, but controversial in terms of efficacy, as in the case of minimally symptomatic cerebrovascular diseases. [30] [33] Perverse incentives can drive clinical practice and influence expert opinion. In addition to insights into the delivery of perioperative medical care, anesthesiologists can help provide an “honest broker” function in such settings.In the clinical and policy research domain, anesthesiologists have an opportunity to study all aspects of surgical care. By joining or leading multidisciplinary teams, they can ask questions relevant to the patient undergoing surgery and not just undergoing anesthesia. Anesthesiologists must take leading roles inhelping to define best surgical practice.Why is it important that anesthesiology maintain the strongest possible profile in research and scholarly contributions to medical knowledge? Besides the intuitive sense that medical specialties must strive for advancing their domains, there is a strategic necessity of significant proactive involvement in scholarly and investigative pursuits. In the modern medical marketplace, physicians are not the only stakeholders in the delivery of health care. Physicians, particularly those in professorial tracks, traditionally conduct the bulk of investigative research, however. The advent of higher-level, doctoral degrees in nursing could significantly change this dynamic. The public that we serve, and the various governmental and institutional bodies that regulate health care delivery, should have a clear vision of our mission and approach.References1.. National Center for Health Statistics: Health, United States, 2007, with Chartbook on Trends in the Health of Americans. Available at /books/bv.fcgi?indexed=google&rid=healthus07.chapter.trend-tables20072.. 
Ruther MM, Black C: Medicare use and cost of short-stay hospital services by enrollees with cataract, 1984. Health Care Financ Rev 1987; 9:91-99.3.. Fleisher LA, Pasternak LR, Herbert R, et al: Inpatient hospital admission and death after outpatient surgery in elderly patients: Importance of patient and system characteristics and location of care. ArchSurg 2004; 139:67-72.4.. Cutler DM: Your Money or Your Life: Strong Medicine for America's Health Care System. New York, Oxford University Press, 2004.5.. Bodenheimer T: High and rising health care costs, part 2: Technologic innovation. Ann InternMed 2005; 142:932-937.6.. Mongan JJ, Ferris TG, Lee TH: Options for slowing the growth of health care costs. N Engl JMed 2008; 358:1509-1514.7.. Thorpe KE: The rise in health care spending and what to do about it. Health Aff (Millwood) 2005; 24:1436-1445.8.. Rosenthal MB: Nonpayment for performance? Medicare's new reimbursement rule. N Engl JMed 2007; 357:1573-1575.9.. Shortell SM, Rundall TG, Hsu J: Improving patient care by linking evidence-based medicine and evidence-based management. JAMA 2007; 298:673-676.10.. Lee TH: Pay for performance, version 2.0?. N Engl J Med 2007; 357:531-533.11.. Campbell S, Reeves D, Kontopantelis E, et al: Quality of primary care in England with the introduction of pay for performance. N Engl J Med 2007; 357:181-190.12.. Lindenauer PK, Remus D, Roman S, et al: Public reporting and pay for performance in hospital quality improvement. N Engl J Med 2007; 356:486-496.13.. Centers for Medicare and Medicaid Services: Medicare Program; Hospital Outpatient Prospective Payment System and CY 2007 Payment Rates; CY 2007 Update to the Ambulatory Surgical Center Covered Procedures List; Medicare Administrative Contractors; and Reporting Hospital Quality Data for FY 2008 Inpatient Prospective Payment System Annual Payment Update Program—HCAHPS Survey, SCIP, and Mortality, Vol 71. Dept of Health and Human Services. Federal Register, 2006.14.. Griffin FA: Reducing surgical complications. Jt Comm J Qual Patient Saf 2007; 33:660-665.15.. Klompas M, Kulldorff M, Platt R: Risk of misleading ventilator-associated pneumonia rates with use of standard clinical and microbiological criteria. Clin Infect Dis 2008; 46:1443-1446.16.. Arens JF: A practice parameters overview. Anesthesiology 1993; 78:229-230.17.. Khuri SF: The NSQIP: A new frontier in surgery. Surgery 2005; 138:837-843.18.. Tong BC, Harpole Jr DH: Audit, quality control, and performance in thoracic surgery: A North American perspective. Thorac Surg Clin 2007; 17:379-386.19.. QualityNet: Available at /dcs/ContentServer?pagename=QnetPublic/Page/QnetHomepage20.. Salsberg E, Grover A: Physician workforce shortages: Implications and issues for academic health centers and policymakers. Acad Med 2006; 81:782-787.21.. AAMC Data Book: U.S. Medical School Women Applicants, Accepted Applicants, and Matriculants.Washington, DC, Association of American Medical Colleges, 2005.22.. Heiligers PJ, Hingstman L: Career preferences and the work-family balance in medicine: Gender differences among medical specialists. Soc Sci Med 2000; 50:1235-1246.23.. Jovic E, Wallace JE, Lemaire J: The generation and gender shifts in medicine: An exploratory survey of internal medicine physicians. BMC Health Serv Res 2006; 6:55.24.. Graduate medical education. JAMA 2005; 294:1129-1143.25.. Reves JG: We are what we make: Transforming research in anesthesiology.Anesthesiology 2007; 106:826-835.26.. 
26. Moses 3rd H, Dorsey ER, Matheson DH, et al: Financial anatomy of biomedical research. JAMA 2005; 294:1333-1342.
27. Philipson L: Medical research activities, funding, and creativity in Europe: Comparison with research in the United States. JAMA 2005; 294:1394-1398.
28. Columbia University Medical Center: Academic and Clinical Departments, Centers and Institutions. Available at /depts/ 2008.
29. University of California San Francisco: Department Chairs, ORU Directors, and Assistants. Available at /listbuilder/chairs_dirs_assts.htm 2008.
30. Fiehler J, Stapf C: ARUBA—beating natural history in unruptured brain AVMs by intervention. Neuroradiology 2008; 50:465-467.
31. Ramachandran VS, Blakeslee S: Phantoms in the Brain: Probing the Mysteries of the Human Mind. New York, William Morrow, 1998.
32. Centers for Medicare and Medicaid Services: Lung Volume Reduction Surgery (LVRS). Available at /MedicareApprovedFacilitie/LVRS/List.asp 2008.
33. Mathiesen T: Arguments against the proposed randomised trial (ARUBA). Neuroradiology 2008; 50:469-471.
Chapter 1 - History of Anesthetic Practice
Merlin D. Larson

The first anesthetics were given to ameliorate the pain associated with dental extractions and minor surgery. As the complementary fields of surgery and anesthesiology matured together, new skills were required of the anesthesiologist, including expertise in resuscitation, fluid replacement, airway management, oxygen transport, operative stress reduction, and postoperative pain control. Today, personnel from the anesthesiology department are scattered in several locations throughout the hospital, from the ambulatory care center to the intensive care unit. Organizing the background of these various activities into a coherent historical document is therefore complicated by the diverse roles that anesthesiologists play in the modern hospital.

One approach to the history of anesthesiology is to relate in detail the events surrounding the 1846 public demonstration of ether anesthesia by William T. G. Morton (1819–1868). [1] [2] [3] [4] This event represents the starting point from which anesthesiology emerged as a specialty. Although the ether demonstration was dramatic and enacted by interesting personalities, it was just the opening act of the pain control story. Since 1846, there has been enormous progress and change in the specialty of medicine that has become known as anesthesiology, and these changes often have occurred in small, incremental steps that are hardly noteworthy on their own. Most operations in the modern operating room could not have been performed before the great progress in anesthetic practice that took place in the years between 1925 and 1960, but historians often overlook these advances, because they were introduced without the drama and spectacle of previous developments.

In addition to the advancements of the 20th century, it is necessary to look prior to the mid-19th century to appreciate the groundwork laid by those curious individuals who sought a scientific understanding of cardiopulmonary physiology and pain. These fundamental discoveries provide the physiologic foundation for safe anesthetic practice. A brief survey of these developments is provided in the opening sections of the following narrative. Dentists, priests, musicians, pediatricians, engineers, ophthalmologists, neurophysiologists, pharmacologists, urologists, otolaryngologists, surgeons, ministers, dilettantes, philosophers, physiologists, missionaries, chemists, South American Indians, and anesthesiologists all had a role in shaping the practice of contemporary anesthesiology, one of the most fascinating stories in the history of medicine.

I have attempted to describe and reference the origins of ideas relating to modern anesthetic practice, but the issue of priority is vague on some topics and may be open to question by other historians of specific subjects. I have attempted to verify information from several sources, including original manuscripts whenever possible. Notably lacking are early references to anesthetic methods in Asia, because these texts and manuscripts are difficult to obtain. The historical developments of the subspecialties are not documented, but that information can be obtained from specialized textbooks.
Perhaps Sir William Osler (1849–1919) expressed the difficulties of the medical historian best when he related the History of British Medicine in an address to the British Medical Association in 1897.[5]

To trace successfully the evolution of any one of the learned professions would require the hand of a master—of one who, like Darwin, combined a capacity for patient observation with philosophic vision. In the case of medicine, the difficulties are enormously increased by the extraordinary development, which has taken place during the 19th century. The rate of progress has been too rapid for us to appreciate, and we stand bewildered and as it were, in a state of intellectual giddiness, when we attempt to obtain a broad, comprehensive view of the subject.

CARDIOPULMONARY PHYSIOLOGY

Respiration

Although volatile and gaseous anesthetics have changed over the past 150 years, there are two gases that will always be a part of anesthetic practice. How oxygen and carbon dioxide are consumed and produced and how they interact with the body has been the subject of intense research over the past 400 years. The ability to manipulate the pressure of these gases in the tissues has contributed much to the success of intensive care medicine. Because of the importance that carbon dioxide and oxygen have in the practice of anesthesiology, it is worthwhile to consider how our understanding of respiration came about.

Galen (120–200 AD) and Aristotle (384–322 BC) both thought that the air moving in and out of the lungs served merely to cool the heart, which otherwise became overheated in working to sustain life.[6] In 1678, Robert Hooke (1635–1703) attached a bellows to the trachea of a dog with an open chest and demonstrated that the animal could be kept alive by rhythmic and sustained contraction of the bellows. Hooke proved that movement of the chest wall was not the essential feature of respiration, but rather it was exposure of fresh air to the blood circulating through the lungs.[7] Richard Lower (1631–1691), who also was the first to transfuse blood from one animal to another, demonstrated in 1669 that the blood absorbed a definite chemical substance necessary for life, that it changed the venous blood from dark blue to red, and that the process was the chief function of the pulmonary circulation.[8]

The nature of the process that takes place in the lungs was misunderstood until the 1780s because of the generally accepted, but erroneous, phlogiston theory promoted by Georg Ernst Stahl (1660–1734).[9] Stahl theorized that combustible substances were composed of phlogiston (Greek for "burnt") and calx ("ash") and that the phlogiston was released during burning and during respiration. Joseph Priestley (1733–1804) ( Fig. 1-1A ), a complex individual who was a dissenting minister in Leeds, England and later the "resident intellectual" to the Earl of Shelburne, observed that respiration and combustion had many similarities,[10] because a candle flame would go out and an animal would die if left within a closed space. He thought this was because the air was putrefied with phlogiston. Priestley discovered photosynthesis by showing that placing plants that imbibed the "phlogistic matter" within the contained space could restore this "bad air." By heating mercuric oxide, he generated a gas that would make flames brighter and keep mice alive longer in a closed space.
Priestley called it dephlogisticated air, and Carl Scheele[11] (1742–1786) in Sweden, who found it earlier but failed to publish, called it feuer luft ("fire air"). Priestley thought this process absorbed phlogiston and informed the French chemist Antoine-Laurent Lavoisier (1743–1794) (see Fig. 1-1B ) of his discovery.

Lavoisier realized that heating mercuric oxides released a new element and called it oxygen. Lavoisier[12] also showed that sulfur and phosphorus gained weight when burned and thereby produced acid-forming substances—hence the name oxygen, which is Greek for "acid producer." When Lavoisier showed that burning some substances caused them to gain weight, the proponents of the phlogiston theory attempted to persist by asserting that phlogiston had a negative weight. Lavoisier was a very wealthy member of the Ferme Generale, the primary tax-collecting agency of the royalist government, but he supported the revolution and was its principal economist. Despite this, he was unceremoniously beheaded at the age of 51 years during the French Revolution. However, his legacy as the father of modern chemistry is without question, and his greatest contribution was to outline the great facts of respiration: absorption of oxygen through the lungs with liberation of carbon dioxide.

Lavoisier thought respiration was accomplished in the lungs, but Humphry Davy (1778–1829) (see Fig. 1-1C ), a young English teenager who dropped out of school at the age of 16 years, read the works of Lavoisier and designed his own experiments to study the site of metabolism. Davy heated blood and collected the gases that were produced. By showing these gases were oxygen and carbon dioxide, he surmised that metabolism takes place in the tissues,[13] a conclusion confirmed by Eduard Friedrich Wilhelm Pflüger[14] (1829–1910) 60 years later. Davy also estimated the rates of oxygen consumption and carbon dioxide production, and he measured the total lung and residual volumes.

Figure 1-1 A, Joseph Priestley was born in Fieldhead, England, and educated as a minister. His early career was spent as a schoolmaster in Leeds, England. In 1780, he accepted an appointment in Birmingham as minister, where he joined Erasmus Darwin and James Watt in forming the Lunar Society, which met to discuss the new ideas in chemistry and physics emerging at that time. Because of his political views, his chapel and home were vandalized in 1789. Five years later, he joined his sons in Pennsylvania in the United States, where he died in 1804 at age 70. B, The portrait of Lavoisier and his wife, Marie Anne Pierrette Paulze, was painted in 1788 by the famous French artist Jacques Louis David. Marie Paulze, who married Lavoisier when she was only 14 years old, was taught to draw by David, and she drew many of the illustrations in Lavoisier's magnum opus, Traite Elementaire de Chimie.[483] Several experimental devices and gasometers are shown on and below the table. C, Humphry Davy was born in Cornwall, England, and became apprenticed to a surgeon, J. B. Borlase of Penzance, at age 17. At 20 years of age, he was appointed Superintendent of the Beddoe's Pneumatic Institute, where he studied the effects of nitrous oxide inhalation. His later career in chemistry gained him fame and honors. He directed the Royal Institute and was made a baronet in 1818 at the age of 40. (Courtesy of the Wood Library-Museum of Anesthesiology, Park Ridge, IL.)

John S. Haldane (1860–1936) was a pioneer investigator in the study of respiration a century ago.
His apparatus for measurement of blood gases was described in 1892.[15] He was the first to promote oxygen therapy for respiratory disease,[16] and in 1905, he discovered that the carbon dioxide tension of the blood was the normal stimulus for respiratory drive.[17] Haldane experimented with self-administration of hypoxic mixtures and coined the ominous phrase that lurks within the inner recesses of the anesthesiologist's mind: "Anoxemia not only stops the machine but wrecks the machinery." He believed until his death, despite experimental evidence, that the lung actively secreted oxygen into the blood from the air. His landmark monograph Respiration, published in 1922,[18] summarized his studies on the respiratory system.

At the end of the 19th century, it was known that hemoglobin had a vital role in the transport of oxygen to the tissues. In 1896, Carl Gustav von Hufner (1840–1908) showed that the presence of hemoglobin in the blood greatly enhanced its oxygen-carrying capacity, quantifying that 1 g of hemoglobin carried 1.34 mL of oxygen.[19] It was soon observed that delivery of high concentrations of oxygen to patients with advanced pulmonary disease was often inadequate to fully maintain tissue respiration and saturate hemoglobin completely. Gradients for inspired, alveolar, and arterial oxygen partial pressures became apparent after the development of the oxygen electrode by Leland C. Clark[20] (1918-) ( Fig. 1-2 ) in 1956. The Clark electrode consisted of a platinum cathode and a silver anode separated from blood by a polyethylene membrane. The platinum was negatively charged to react with any oxygen reaching it through the membrane and was sensitive to the oxygen pressure outside the membrane (i.e., the oxygen tension in the blood).

Further understanding of respiratory physiology arose because of the worldwide polio epidemic that occurred roughly between the years of 1930 and 1960. Thousands of afflicted patients were kept alive with mechanical respirators, but the adequacy of ventilation could not be assessed without some measure of carbon dioxide tension (PCO2) in the blood. Several methods to measure PCO2 indirectly were devised by Donald D. Van Slyke[21] (1883–1971) and Paul B. Astrup (1915–2000), but the modern solution rested on the development of the carbon dioxide electrode. This problem was solved in 1958, when John W. Severinghaus[22] (1922-) (see Fig. 1-2 ) improved the accuracy of a prototype carbon dioxide electrode produced by Richard Stow (1916-), which measured pH of a thin film of electrolyte separated from the blood by a Teflon membrane through which carbon dioxide could diffuse and equilibrate. Severinghaus and A. F. Bradley (1932-) constructed the first blood gas apparatus by mounting the carbon dioxide electrode and Clark's oxygen electrode in cuvettes in a 37°C bath. To measure blood PO2 accurately, Severinghaus found it necessary to rapidly stir the blood in contact with Clark's electrode because of its high oxygen consumption rate. A pH electrode was added in 1959. Blood gas analysis made possible the rapid assessment of respiratory exchange and acid-base balance. The use of blood gas analysis was rapidly taken up by the anesthesia community and has become one of the most common laboratory tests performed in the modern hospital.

Figure 1-2 This photograph of Leland Clark and John Severinghaus, whose contributions led to modern techniques of blood gas analysis, was taken in Clark's laboratory at the Cincinnati Children's Hospital in 1982. (From Severinghaus JW, Astrup PB: History of Blood Gas Analysis. International Anesthesiology Clinics, Vol. 25. Boston, Little Brown, 1987.)
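Hufner's figure of 1.34 mL of oxygen per gram of hemoglobin cited above still anchors the familiar calculation of arterial oxygen content. A minimal worked sketch follows; the dissolved-oxygen coefficient (0.003 mL/dL per mm Hg) and the example values for hemoglobin, saturation, and arterial oxygen tension are illustrative assumptions rather than figures from this chapter.

\[
C_{a}O_{2} = (1.34 \times \mathrm{Hb} \times S_{a}O_{2}) + (0.003 \times P_{a}O_{2})
\]
\[
C_{a}O_{2} \approx (1.34 \times 15\ \mathrm{g/dL} \times 0.98) + (0.003 \times 100\ \mathrm{mm\ Hg}) \approx 19.7 + 0.3 \approx 20\ \mathrm{mL\ O_{2}/dL}
\]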
The impact of a more in-depth understanding of gas exchange on the practice of anesthesia is summarized by John Nunn in his book Applied Respiratory Physiology.[23]

Until the mid-20th century, the saturation of hemoglobin could be determined only by directly measuring a sample of arterial blood, a technique that required an arterial puncture. Oximetry achieves the same measure noninvasively through a finger or ear probe by using optical measures of transmitted light. Glenn Millikan, working in the Johnson Foundation for Medical Physics at the University of Pennsylvania, devised the first ear oximeter in 1942, and it was used to detect hypoxia in pilots, who flew in open cockpits during World War II. Its introduction into anesthesia practice was delayed until the discovery of pulse oximetry by a Japanese engineer, Takuo Aoyagi.[24] Pulse oximetry added the additional measure of heart rate, and it provided assurance that the signal was actually measuring a biologic parameter. A highly successful commercial product, the Nellcor pulse oximeter, was introduced in 1983 and had the unique feature of lowering the pitch of the pulse tone as the saturation dropped.

Intravascular Pressures

The first measurement of the blood pressure was made by Stephen Hales (1677–1761) ( Fig. 1-3A ), the curate of Middlesex, England, who between sermons occupied himself with experiments on the mechanics of the circulation. He[25] described one of his experiments performed (1733) on an "old mare who was to be killed, as being unfit for service":

I fixed a brass pipe to the carotid artery of a mare ... the blood rose in the tube till it reached to nine feet six inches in height. I then took away the tube from the artery and let out sixty cubic inches of blood, and then replaced the tube to see how high the blood would rise after each evacuation; this was repeated several times until the mare expired. In the three horses, death occurred when the height of the blood in the tube was about two feet.

Hales also discovered that the resistance of a vascular bed could be changed by mixing alcohol in the blood, which he observed could account for changes in blood pressure brought about by diverse ingested agents. In 1828, Jean L. Poiseuille[26] (1799–1869) repeated these experiments and devised a hemomanometer that used mercury instead of the long blood-filled tubes used by Hales. Poiseuille also showed that the blood pressure varied with respiration.
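To relate Hales's column of blood to the mercury units used today, a rough conversion can be made from the relative densities of the two fluids. This is a back-of-the-envelope sketch; the density values (roughly 1.05 g/mL for blood and 13.6 g/mL for mercury) are assumptions for the illustration, not figures from this chapter.

\[
9\ \mathrm{ft}\ 6\ \mathrm{in} \approx 290\ \mathrm{cm\ of\ blood};\qquad
h_{\mathrm{Hg}} \approx 290\ \mathrm{cm} \times \frac{1.05}{13.6} \approx 22\ \mathrm{cm\ Hg} \approx 220\ \mathrm{mm\ Hg}
\]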
In 1854, Karl Vierordt[27] (1818–1884) invented a sphygmograph, acting on the principle that indirect estimation of blood pressure could be accomplished by measuring the counterpressure necessary to obliterate the arterial pulsation. Scipione Riva-Rocci's (1863–1937) sphygmomanometer, described in 1896,[28] used the same principle but used a rubber cuff that occluded a major arterial vessel and then slowly deflated. In 1905, Nikolai Korotkov[29] (1874–1920) described the sounds produced during auscultation over a distal portion of the artery as the cuff was deflated. The Korotkov sounds resulted in more accurate determinations of systolic and diastolic blood pressures. Oscillometric blood pressure measurements relied on a cuff that sensed the changes in arterial pulsations and were described by H. von Recklinghausen in 1931.[30] Automatic blood pressure devices based on the oscillometric method were developed in the 1970s and have become the standard noninvasive measures of arterial pressure in most hospitals.

The past 50 years have seen a gradual return to direct measurements of arterial blood pressure, which are in principle much the same method used by Stephen Hales nearly 250 years ago. However, the Poiseuille method of using mercury-filled glass tubes was found to be totally inadequate for recording accurate pressures in a dynamic system. In 1876, Herbert Tomlinson[31] introduced the principle of the strain gauge: resistance in a wire increases when it is stretched. It took 70 years after this principle was described before a strain gauge was used to measure blood pressure. In 1947, at the Mayo Clinic in Rochester, Minnesota, E. H. Lambert and E. H. Wood[32] first reported the use of a strain gauge to measure blood pressure continuously in human subjects exposed to acceleration forces. Cannulas placed directly into vessels were first described in 1949,[33] and since then, direct measurements of blood pressure expanded gradually into the operating room and intensive care units.

Figure 1-3 Two scientists of the Enlightenment era. A, Stephen Hales (1677–1761): detail of an oil painting by T. Hudson, 1759, in the National Portrait Gallery, London. Hales was educated at Cambridge University, England and was ordained as a minister in 1703. He spent his career as minister to the parish of Teddington, England. His rudimentary studies on the gas produced by mixing Walton pyrites (i.e., ferric disulfide) and spirit of nitre (i.e., nitric acid) were the spark that prompted Priestley to pursue his studies on nitric oxide, which led to the discovery of nitrous oxide in 1773. Hales was the first to measure blood pressure and cardiac output. He also developed ventilators that brought fresh air into prisons and granaries. B, Albrecht von Haller (1708–1777): detail of an engraving by Ambroise Tardieu. Haller was born in Bern, Switzerland. He served as professor of medicine and surgery at the University of Göttingen, Germany, where he began his encyclopedic work, Physiological Elements of the Human Body, published in eight volumes between 1757 and 1766. His demonstration that "irritability" was a property of muscle and "sensitivity" was a property of nerves was derived from nearly 600 experiments on live animals. He returned to Bern in 1753, and while there he published a catalog of the scientific literature containing 52,000 references. (Portraits courtesy of the National Library of Medicine, Bethesda, MD.)

Venous pressures were of less interest to anesthesiologists until convenient methods for placing cannulas into central vascular structures were described 50 years ago by Sven Seldinger.[34] Werner Forssmann (1904–1979), a urologist, described the methods of central venous access and right heart catheterization in humans in 1929,[35] originally experimenting on himself, and was awarded the Nobel Prize in Physiology or Medicine for his work on venous pressures in 1956. The introduction of plastic catheters[36] gradually made it possible to measure central pressures in the clinical setting. Although arm and femoral veins were used initially, subclavian and internal jugular vein cannulation eventually replaced the peripheral sites.

Pulmonary artery catheterization with a balloon-tipped, flow-directed catheter was described in 1970[37] and has been used extensively since then by anesthesiologists to measure cardiac outputs using the Fick[38] principle and pulmonary wedge pressures. The pulmonary artery catheter also allowed the clinician to use the well-known pressure-volume relationships of the heart described by Ernest H. Starling (1866–1927) in 1918 to maximize cardiac outputs and oxygen delivery to the tissues.
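The Fick principle referred to above computes cardiac output from oxygen uptake and the arteriovenous oxygen content difference. A minimal worked sketch follows; the example values (oxygen consumption of 250 mL/min and arterial and mixed venous oxygen contents of 20 and 15 mL/dL) are illustrative assumptions rather than figures from this chapter.

\[
\dot{Q} = \frac{\dot{V}O_{2}}{C_{a}O_{2} - C_{\bar{v}}O_{2}}
\]
\[
\dot{Q} \approx \frac{250\ \mathrm{mL\ O_{2}/min}}{(20 - 15)\ \mathrm{mL\ O_{2}/dL}} = \frac{250\ \mathrm{mL\ O_{2}/min}}{50\ \mathrm{mL\ O_{2}/L}} = 5\ \mathrm{L/min}
\]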
Transesophageal echocardiography (TEE) was described in 1976[39] and used in anesthesia practice a few years later. One of the original probes used during anesthesia was fashioned from an esophageal stethoscope combined with an M-mode echocardiographic probe and was used to calculate cardiac output and ejection fraction in a 65-year-old woman undergoing mitral valve repair.[40] An improved electronic phased-array transducer was initially applied at the University of California, San Francisco for monitoring regional myocardial function in high-risk surgical patients.[41] Biplane TEE and color flow mapping were introduced in the 1980s and resulted in an explosive growth in TEE applications.[42] With experience and training in TEE, the anesthesiologist can quickly evaluate filling pressures of the heart as well as obtain measures of myocardial contractility and valvular function. TEE has become a routine monitor for certain surgical procedures.

AUTONOMIC NERVOUS SYSTEM AND NEUROHUMORAL TRANSMISSION

Even though the first practitioners of anesthesia had essentially no knowledge of adrenergic or cholinergic transmission, it would be difficult to administer anesthetics today without a thorough understanding of the autonomic nervous system and its neurotransmitters. These concepts are required in part because neuraxial blocks and general anesthetics are known to profoundly alter the ability of the body to respond to fluid losses and stress. Many modern surgical and neurovascular techniques require strict control of arterial and venous pressures, and this control is performed through interventions that alter autonomic tone.

The first hint of involuntary control of glandular and vascular function occurred in the 17th and 18th centuries. Robert Whytt (1714–1766) was the first to describe the reflex nature of many involuntary activities,[43] and Thomas Willis[44] (1621–1675) had described the sympathetic chain as early as 1657. He called it the intercostal nerve because it received segmental branches from the spinal cord at each level.

Pourfour du Petit (1664–1741) observed that there was a corresponding miosis and retraction of the nictitating membrane when this nerve was unilaterally cut in the neck of a cat.[45] Winslow gave the intercostal nerve the name of grand sympathique, stressing that this nerve brought the various organs of the body into sympathy,[46] a term that was originally coined by the Greek physician Soranus (98–138) in the first century AD (sym, "together," and pathos, "feeling"). Claude Bernard (1813–1878) observed vasoconstriction and pupillary dilation that followed stimulation of the same intercostal (now called the sympathetic) nerve and then described the vasomotor nerves arising between the cervical and lumbar enlargements of the spinal cord.[47]

In 1889, John N. Langley (1852–1925) began his classic work on sympathetic transmission in autonomic ganglia.
He blocked synaptic transmission in the ganglia by painting them with nicotine and then mapped the distribution of the presynaptic and postsynaptic autonomic nerves.[48] He observed the similarity between the effects of injection of adrenal gland extracts and stimulation of the sympathetic nerves.[49] The active principle of adrenal medullary extracts was called epinephrine by John J. Abel[50] (1857–1938) in 1897. Abel was one of the first pharmacologists in the United States, and with his discovery of the hormone epinephrine, he uncovered one of the most commonly used lifesaving agents in the anesthesiologist's pharmacopoeia. Although epinephrine is a highly practical agent used by paramedics, emergency room physicians, intensivists, and anesthesiologists, Abel considered himself a "pure scientist." In one of his writings,[51] he made the following comments about the life of an investigator:

The investigator should freely dare to attack problems not because they give promise of immediate value to the human race, but because they make an irresistible appeal of an inner beauty. ... From this point of view the investigator is a man whose inner life is free in the best sense of the word. In short, there should be in research work a cultural character, an artistic quality, elements that give to painting, music, and poetry their high place in the life of man.

Thomas R. Elliott (1877–1961) postulated that sympathetic nerve impulses release a substance similar to epinephrine and considered this substance to be a chemical step in the process of neurotransmission.[52] George Barger (1878–1939) and Henry H. Dale (1875–1968) then studied the pharmacologic activity of a large series of synthetic amines related to epinephrine and called these drugs sympathomimetic.[53] The different effects on end organs produced by adrenal extracts and sympathetic stimulation were analyzed by Walter B. Cannon[54] (1871–1945) and by Ulf Svante von Euler[55] (1905–1983). In a series of papers, these authors demonstrated that the sympathetic nerves released norepinephrine, whereas the adrenal gland released both epinephrine and norepinephrine.

In 1907, Walter E. Dixon[56] (1871–1931) observed that the alkaloid muscarine had the same effect as stimulation of the vagus nerves on various end organs. He proposed that the nerve liberated a muscarine-like chemical that acted as a chemical mediator. In 1914, Henry H. Dale[57] investigated the pharmacologic properties of acetylcholine and was impressed that its effects reproduced the same effects as stimulation of the craniosacral fine myelinated fibers that Walter H. Gaskell[58] (1847–1914) had called the bulbosacral involuntary nerves and had by then been called parasympathetic by Langley.

The final proof of neurotransmission by acetylcholine through chemical mediation was provided through the elegant experiments of Otto Loewi (1873–1961). He stimulated the neural innervation of the frog heart and then allowed the perfusion fluid to come in contact with a second isolated heart preparation.[59] The resulting bradycardia provided evidence that some substance was released from the donor nerves that slowed the heart rate of the second organ.[59] Loewi and Navratil presented evidence that this substance was acetylcholine, as Dale had suggested.
Loewi and Dale were jointly awarded the Nobel Prize in 1936 for their work on chemical neurotransmission.

Theodore Tuffier (1857–1929) first demonstrated the relevance of the sympathetic nervous system to the practice of anesthesia in 1900. In a series of experiments on dogs, he demonstrated the sympatholysis that occurs after spinal anesthesia.[60] Further studies on the sympathectomy resulting from neuraxial block were performed by G. Smith and W. Porter in 1915.[61] Working with cats, they concluded that the fall in blood pressure was secondary to sympathetic paralysis of the vasomotor fibers in the splanchnic vessels. Gaston L. Labat[62] (1877–1934) encouraged the use of sympathetic stimulants such as ephedrine to counter the hypotensive effects of spinal anesthesia.

Reversal of neuromuscular blockade rests on a fundamental understanding of two types of cholinergic receptors, muscarinic and nicotinic, originally described in 1914 by Dale.[57] Physostigmine was isolated from the West African calabar bean by T. R. Fraser (1841–1920) in 1863,[63] and he also demonstrated that atropine blocked its muscarinic side effects. Jacob Pal[64] (1863–1936) discovered the effect of physostigmine to reverse the paralytic effect of curare in 1900. Neostigmine, synthesized in 1931,[65] given alone during reversal of neuromuscular
7 – Patient Simulation
Marcus Rall, David M. Gaba, Peter Dieckmann, Christoph Eich

Key Points

1. Simulators and the use of simulation have become an integral part of medical education, training, and research. The pace of developments and applications is very fast, and the results are promising.
2. Different types of simulators can be distinguished: computer-based or screen-based microsimulators versus mannequin-based simulators. The latter can be divided into script-based and model-based simulators.
3. The development of mobile and less expensive simulator models allows for substantial expansion of simulator training to areas where this training could not be applied or afforded previously. The biggest obstacles to providing simulation training are not the simulator hardware but are (1) obtaining access to the learner population for the requisite time and (2) providing appropriately trained and skilled instructors to prepare, conduct, and evaluate the simulation sessions.
4. Realistic simulations are a useful method to show mechanisms of error development (human factors) and to provide their countermeasures. The anesthesia crisis resource management (ACRM) course model with its ACRM key points (see Chapter 6 on Crisis Resource Management) is the de facto world standard for human factor–based simulator training. Curricula should use scenarios that are tailored to the stated teaching goals, rather than focusing solely on achieving maximum “realism.”
5. Simulator training is being adapted by many other fields outside anesthesia (e.g., emergency medicine, neonatal care, intensive care, medical and nursing school).
6. Simulators have proved to be very valuable in research to study human behavior and failure modes under conditions of critical incidents and in the development of new treatment concepts (telemedicine) and in support of the biomedical industry (e.g., device beta-testing).
7. Simulators can be used as effective research tools for studying methods of performance assessment.
8. Assessment of nontechnical skills (or behavioral markers) has evolved considerably and can be accomplished with a reliability that likely matches that of many other subjective judgments in patient care. Systems for rating nontechnical skills have been introduced and tested in anesthesia; one in particular (Anaesthetists' Non-Technical Skills [ANTS]) has been studied extensively and has been modified for other fields.
9. The most important part of simulator training that goes beyond specific technical skills is the self-reflective (often video-assisted) debriefing session after the scenario. The debriefing is influenced most strongly by the quality of the instructor, not the fidelity of the simulator.
10. Simulators are just the tools for an effective learning experience. The education and training, commitment, and overall ability of the instructors are of utmost importance.

How can clinicians experience the difficulties of patient care without putting patients at undue risk? How can we assess the abilities of clinicians as individuals and teams when each patient is unique? These are questions that have challenged medicine for years. In recent years, these and related questions have begun to be answered in health care by the application of approaches new to medicine, but borrowed from years of successful service in other industries facing similar problems. These approaches focus on simulation, a technique well known in the military, aviation, space flight, and nuclear power industries.
Simulation refers to the artificial replication of sufficient elements of a real-world domain to achieve a stated goal. The goals can include understanding the domain better, training personnel to deal with the domain, or testing the capacity of personnel to work in the domain. The fidelity of a simulation refers to how closely it replicates the domain and is determined by the number of elements that are replicated and the discrepancy between each element and the real world. The fidelity required depends on the stated goals. Some goals can be achieved with minimal fidelity, whereas others require very high fidelity.

Simulation has probably been a part of human activity since prehistoric times. Rehearsal for hunting activities and warfare was most likely an occasion for simulating the behavior of prey or enemy warriors. Technologic simulation probably dates back to the dawn of technology itself. Good and Gravenstein[1] pointed to the medieval quintain as a technologic device that crudely simulated the behavior of an opponent during sword fighting. If the swordsman did not duck at the appropriate time after striking a blow, he would be hit by a component of the quintain. In modern times, preparation for warfare has been an equally powerful spur to the development of simulation technologies, especially for aviation, shipping, and the operation of armored vehicles. These technologies have been adopted by their civilian counterparts, but they have attained their most extensive use in commercial aviation.

Simulation in Aviation

Although some aircraft simulators were built between 1910 and 1927, none of them could provide the proper feel of the aircraft because they could not dynamically reproduce its behavior. In 1930, Link filed a patent for a pneumatically driven aircraft simulator. The Link Trainer was a standard for flight training before World War II, but the war accelerated its use and the further development of flight simulators. In the 1950s, electronic controls replaced pneumatic ones through analog, digital, and hybrid computers. The aircraft simulator achieved its modern form in the late 1960s, but it has been continuously refined. Aviation simulators are so realistic now that pilots with experience flying one aircraft are routinely certified to fly totally new or different aircraft, even if they have never flown the actual aircraft without passengers on board. Similar stories of the development of simulators can be told for numerous other industries.

Uses of Simulators

Although simulators originally were used to provide basic instruction on the operation of aircraft controls, the variety of uses of simulators in general has expanded greatly. Table 7-1 lists possible uses of simulators in all types of complex work situations. Simulation is a powerful generic tool for dealing with human performance issues (e.g., training, testing, and research) (see Chapter 6 ), for investigating human-machine interactions, and for the design and validation of equipment. As described later in this chapter, each of these uses is potentially relevant to anesthesiology. A few books are devoted solely to the topic of simulation and its use in and outside of anesthesia. [2] [3] [4]
Table 7-1 -- Use of Simulators in Complex Work Environments
Team training, as human factor or CRM training
Training in dynamic plant control
Training in diagnostic skills
Dynamic mockup for design evaluation
Test bed for checking operating instructions
Environment in which task analysis can be conducted (e.g., on diagnostic strategies)
Test bed for new applications (e.g., telemedicine tools such as the “Guardian-Angel-System”)
Source of data on human errors relevant to risk and reliability assessment
Vehicle for (compulsory) testing/assessment and recertification of operators
Adapted from Singleton WT: The Mind at Work. Cambridge, Cambridge University Press, 1989.
CRM, Crisis resource management.

Twelve Dimensions of Simulation

Current and future applications of simulation can be categorized by 12 dimensions, each of which represents a different attribute of simulation ( Fig. 7-1 ).[5] Some dimensions have a clear gradient and direction, whereas others have only categorical differences. The total number of unique combinations across all the dimensions is very large (on the order of 4^12 to 5^12, roughly 17 million to 244 million). Some combinations overlap strongly with others, and some are inappropriate or irrelevant, so the actual number of meaningful combinations is much lower. Nonetheless, although the demonstrated applications of simulation in health care have been quite diverse, the space of possible applications (a large number, although quite a bit smaller than millions) has by no means been fully examined.

Figure 7-1 The 12 dimensions of simulation applications (10 to 12 shown on next page). Any particular application can be represented as a point or range on each spectrum (shown by diamonds). This figure illustrates a specific application—multidisciplinary CRM-oriented decision making and teamwork training for adult intensive care unit personnel. CRM, crisis resource management; ED, emergency department; ICU, intensive care unit; OR, operating room. *These terms are used according to Miller's pyramid of learning.

Dimension 1: Purpose and Aims of the Simulation Activity

The most obvious application of simulation is to improve the education and training of clinicians, but other purposes also are important. As used in this chapter, education emphasizes conceptual knowledge, basic skills, and an introduction to work practices. Training emphasizes the actual tasks and work to be performed. Simulation can be used to assess performance and competency of individual clinicians and teams, for low-stakes or formative testing and (to a lesser degree as yet) for high-stakes certification testing. [6] [7] Simulation rehearsals are now being explored as adjuncts to actual clinical practice; for example, surgeons or an entire operative team can rehearse an unusually complex operation in advance using a simulation of the specific patient. [8] [9] [10] Simulators can be powerful tools for research and evaluation, concerning organizational practices (patient care protocols) and for the investigation of human factors (e.g., of performance-shaping factors, such as fatigue,[11] or of the user interface and operation of medical equipment in high hazard clinical settings[12]). Simulation-based empirical tests of the usability of clinical equipment already have been used in designing equipment that is currently for sale; ultimately, such practices may be required by regulatory agencies before approval of new devices.

Simulation can be a “bottom up” tool for changing the culture of health care concerning patient safety.
First, it allows hands-on training of junior and senior clinicians about practices that enact the desired “culture of safety.”[13] Simulation also can be a rallying point about culture change and patient safety that can bring together experienced clinicians from various disciplines and domains (who may be “captured” because the simulations are clinically challenging) along with health care administrators, risk managers, and experts on human factors, organizational behavior, or institutional change.

Dimension 2: Unit of Participation in the Simulation

Many simulation applications are targeted at individuals. These may be especially useful for teaching knowledge and basic skills or for practice on specific psychomotor tasks. As in other high hazard industries, individual skill is a fundamental building block, but a considerable emphasis is applied at higher organizational levels, in various forms of teamwork and interpersonal relations, often summarized under the rubric of crisis resource management (CRM), adapted from aviation cockpit resource management (for more on human factors and CRM concepts, see Chapter 6 ). [14] [18] CRM is based on empirical findings that individual performance is not sufficient to achieve optimal safety.[15] Team training may be addressed first to crews (also known as single discipline teams), consisting of multiple individuals from a single discipline, and then to teams (or multidisciplinary teams).[16] There are advantages and disadvantages to addressing teamwork in the single discipline approach that “trains crews to work in teams” versus the “combined team training” of multiple disciplines together.[21] For maximal benefit, these approaches are used in a complementary fashion.

Teams exist in actual “work units” in an organization (e.g., a specific intensive care unit [ICU]), each of which is its own target for training. There also is growing interest and experience in applying simulation to nonclinical personnel and work units in health care organizations (e.g., to managers or executives)[22] and to organizations as a whole (e.g., entire hospitals or networks).

Dimension 3: Experience Level of Simulation Participants

Simulation can be applied along the entire continuum of education of clinical personnel and the public at large. It can be used with early learners, such as schoolchildren, or lay adults to facilitate bioscience instruction, to interest individuals in biomedical careers, or to explain health care issues and practices. The major role of simulation has been, and will continue to be, to educate, train, and provide rehearsal for individuals actually involved in the delivery of health care. Simulation is relevant from the earliest level of vocational or professional education (students) and during apprenticeship training (interns and residents), and increasingly for experienced personnel undergoing periodic refresher training. Simulation can be applied regularly to practicing clinicians (as individuals, teams, or organizations) regardless of their seniority, [17] [18] providing an integrated accumulation of experiences that should have a long-term synergism.

Dimension 4: Health Care Domain in Which the Simulation Is Applied

Simulation techniques can be applied across nearly all health care domains.
Although much of the attention on simulation has focused on technical and procedural skills applicable in surgery, [11] [20] [21] [22] obstetrics, [23] [24] invasive cardiology, [25] [26] and other related fields, another bastion of simulation has been recreating whole patients for dynamic domains involving high hazard and invasive intervention, such as anesthesia, [27] [28] critical care, [29] [30] and emergency medicine. [31] [32] [33] [34] Immersive techniques can be used in imaging-intensive domains, such as radiology and pathology, and interactive simulations are relevant in the interventional sides of such arenas.[35] In many domains, simulation techniques have been useful for addressing nontechnical skills and professionalism issues, such as communicating with patients and coworkers, or addressing issues such as ethics or end-of-life care.

Dimension 5: Health Care Disciplines of Personnel Participating in the Simulation

Simulation is applicable to all disciplines of health care, not only to physicians. In anesthesiology, simulation has been applied to anesthesiologists, Certified Registered Nurse Anesthetists, and anesthesia technicians. Simulation is not limited to clinical personnel. It also may be directed at managers, executives, hospital trustees, regulators, and legislators. For these groups, simulation can convey the complexities of clinical work, and it can be used to exercise and probe the organizational practices of clinical institutions at multiple levels.

Dimension 6: Type of Knowledge, Skill, Attitudes, or Behavior Addressed in Simulation

Simulations can be used to help learners acquire new knowledge and to understand conceptual relations and dynamics better. Today physiologic simulations allow students to watch cardiovascular and respiratory functions unfold over time and respond to interventions—in essence making textbooks, diagrams, and graphs “come alive.” The next step on the spectrum is acquisition of skills to accompany knowledge. Some skills follow immediately from conceptual knowledge (e.g., cardiac auscultation), whereas others involve intricate and complex psychomotor activities (e.g., catheter placement or basic surgical skills). Isolated skills must be assembled into a new layer of clinical practices. An understanding of the concepts of general surgery combined only with basic techniques of dissecting, suturing, and manipulating instruments is not enough to create a capable laparoscopic surgeon. Basic skills must be integrated into actual clinical techniques, a process for which simulation may have considerable power, especially because it can readily provide experience with even uncommon anatomic or clinical presentations. In the current health care system, for most invasive procedures, novices at a task typically first perform the task on a real patient, albeit under some degree of supervision. They climb the learning curve, working on patients with varying levels of guidance. Simulation offers the possibility of having novices practice extensively before they begin to work on real patients as supervised “apprentices.”

In this way and others, simulation is applicable to clinicians throughout their careers to support lifelong learning. It can be used to refresh skills for procedures that are not performed often. Knowledge, skills, and practices honed as individuals must be linked into effective teamwork in diverse clinical teams, which must operate safely in work units and larger organizations. [36] [37] [38]
Perpetual rehearsal of responses to challenging events is needed because the team or organization must be practiced in handling them as a coherent unit.

Dimension 7: Age of the Patient Being Simulated

Simulation is applicable to nearly every type and age of patient literally from “cradle to grave.” Simulation may be particularly useful for pediatric patients and clinical activities because neonates and infants have smaller physiologic reserves than most adults. [40] [41] Fully interactive neonatal and pediatric patient simulators are now available. Simulation also addresses issues of the elderly and end-of-life issues for every age.

Dimension 8: Technology Applicable or Required for Simulations

To accomplish these goals, various technologies (including no technology) are relevant for simulation. Verbal simulations (“what if” discussions), paper and pencil exercises, and experiences with standardized patient actors [41] [42] [43] require no technology, but can effectively evoke or recreate challenging clinical situations. Similarly, very low technology—even pieces of fruit or simple dolls—can be used for training in some manual tasks. Certain aspects of complex tasks and experiences can be recreated successfully with little technology. Some education and training on teamwork can be accomplished with role playing, analysis of videos, or drills with simple mannequins.[44]

Ultimately, learning and practicing complex manual skills (e.g., surgery, cardiac catheterization) or practicing the dynamic management of life-threatening clinical situations that include risky or noxious interventions (e.g., intubation or defibrillation) can only be fully accomplished using either animals—which for reasons of cost and issues of animal rights is becoming very difficult—or a technologic means to recreate the patient and the clinical environment. The different types of simulation technologies relevant to anesthesiology are discussed later in this chapter.

Dimension 9: Site of Simulation Participation

Some types of simulation—those that use videos, computer programs, or the Web—can be conducted in the privacy of the learner's home or office using his or her own equipment. More advanced screen-based simulators might need more powerful computer facilities available in a medical library or learning center. Part-task trainers and virtual reality simulators are usually fielded in a dedicated skills laboratory. Mannequin-based simulation also can be used in a skills laboratory, although the more complex recreations of actual clinical tasks require either a dedicated patient simulation center with fully equipped replicas of clinical spaces or the ability to bring the simulator into an actual work setting (in-situ simulation). There are advantages and disadvantages to doing clinical simulations in situ versus in a dedicated center. Using the actual site allows training of the entire unit with all its personnel, procedures, and equipment. At best, however, there would be limited availability of actual clinical sites, and the simulation activity may distract from real patient care work. The dedicated simulation center is a more controlled and available environment, allowing more comprehensive recording of sessions, and imposing no distraction on real activities. For large-scale simulations (e.g., disaster drills), the entire organization becomes the site of training.

Video conferencing and advanced networking may allow even advanced types of simulation to be conducted remotely (see dimension 10 ).
The collaborative use of virtual reality surgical simulators in real time already has been shown, even with locations that are separated by thousands of miles (see later for more on Site of Simulator).

Dimension 10: Extent of Direct Participation in Simulation

Most simulations—even screen-based simulators or part-task trainers—were initially envisioned as highly interactive activities with significant direct “on-site” hands-on participation. Not all learning requires direct participation, however. Some learning can occur merely by viewing a simulation involving others, as one can readily imagine being in the shoes of the participants. A further step is to involve the remote viewers either in the simulation itself or in debriefings about what transpired. Several centers have been using videoconferencing to conduct simulation-based exercises, including morbidity and mortality conferences.[45] Because the simulator can be paused, restarted, or otherwise controlled, the remote audience can readily obtain more information from the on-site participants, debate the proper course of action, and discuss with those in the simulator how best to proceed.

Dimension 11: Feedback Method Accompanying Simulation

Similar to real life, one can learn a great deal just from simulation experiences themselves, without any additional feedback. For many simulations, specific feedback is provided to maximize learning. On screen-based simulators or virtual reality systems, the simulator itself can provide feedback about the participant's actions or decisions,[46] particularly for manual tasks where clear metrics of performance are readily delineated. [47] [48] More commonly, human instructors provide feedback. This can be as simple as having the instructor review records of previous sessions that the learner has completed alone. For many target populations and applications, an instructor provides real-time guidance and feedback to participants while the simulation is going on. The ability to start, pause, and restart the simulation can be valuable. For the most complex uses of simulation, especially when training experienced personnel, the typical form of feedback is a detailed postsimulation debriefing session, often using audio-video recordings of the scenario. Waiting until after the scenario is finished allows experienced personnel to apply their collective skills without interruption, and then allows them to see and discuss the advantages and disadvantages of their behaviors, decisions, and actions.

Dimension 12: Organizational, Professional, and Societal Embedding of Simulation

Another important dimension is the degree to which the simulation application is embedded into an organization or industry.[19] Being highly embedded may mean that the simulation is a formal requirement of the institution or is mandated by the governmental regulator. Another aspect of embedding would be that—for early learners—the initial (steep) part of the learning curve would be required to occur in a simulation setting before the learners are allowed to work on real patients under supervision. Also, complete embedding of simulation into the workplace would mean that simulation training is a normal part of the work schedule, rather than being an “add-on” activity attended in the “spare time” of clinicians.

Conceptual Issues about Patient Simulation

“The key is the programme, not the hardware,” was a truth about simulation learned early in aviation.
Using simulators in a goal-oriented way is equally or more about the conceptual aspects of the technique than it is about the technology of the simulation devices. An understanding of the conceptual and theoretical aspects of the use of simulation techniques can be helpful for determining the right applications of the technique and the important matchups that must be made in the design and conduct of simulation exercises to get the best results. When used most effectively, simulation can be—to borrow a line from the band U2—“even better than the real thing.”[20] The concepts discussed in this section concern the nature of “realism” and “reality” as they apply to simulation, and the way that these issues relate to the goals of simulation endeavors to generate a complex social undertaking. The ideas and concepts presented here are largely contributed by Peter Dieckmann and his adaptation of broader psychological concepts to simulation in medicine.

Reality and Realism of Simulation

Unless we are dreaming about it or we are unrepentant solipsists, a simulation exercise is always “real” (it is actually happening), but it may or may not be a “realistic” replication of the reality that is the target of the exercise. Realism addresses the question of how closely a replication of a situation resembles the target. A key distinction is between a simulator (a device) and a simulation (the exercise in which the device is used). A simulator could be indistinguishable from a real human being (e.g., a standardized patient actor) and yet be used in an implausible and useless fashion. Conversely, certain kinds of realism (see later) can be evoked by simulation exercises that use very simple simulators or even no simulator at all (as in role-playing when the participants in a sense “become the simulator”). Merely creating a realistic simulation does not guarantee that it would have any meaning or utility (e.g., learning). [21] [22] A closely related aspect concerns the question of relevance and the social character of simulation.

Three Distinct Dimensions for Simulation Realism

A lot has been written about simulator and simulation realism using a variety of terms and concepts that are subtly different and often overlapping. These include physical fidelity (the device replicates physical aspects of the human body), environmental fidelity (the simulation room looks like an operating room), equipment fidelity (the clinical equipment works like or is the real thing), and psychological fidelity[23] (the simulation evokes behaviors similar to the real situation), and a variety of forms of validity, such as face validity (“looks and feels” real to participants), content validity (the exercise covers content relevant to the target situation), construct validity (the simulation can replicate performance or behavior according to predefined constructs about work in real situations), and predictive validity (performance during a simulation exercise predicts performance in an analogous real situation). [24] [25]

The results of studies trying to investigate roots and effects of simulation “realism” are not conclusive partly because they may each concentrate on a different aspect of this complex whole. It is simply not true that maximum “realism” is either needed or desired for every type of simulation endeavor.
For some applications with some target populations, it can be highly advantageous to reduce realism in order to heighten the learning experience.[26] In 2007, we published an article attempting to clarify some of the issues of realism, reality, relevance, and the purposes of conducting simulations.[21] We applied the German psychologist Laucken's model of modes of thinking about reality to the realism of simulation.[21] Laucken described three modes of thinking: physical, semantical, and phenomenal; these have been renamed the physical, conceptual, and emotional and experiential modes by colleagues in Boston.[27]
Physical Mode
The physical mode concerns aspects of the simulation that can be measured in fundamental physical and chemical terms and dimensions (e.g., centimeters, grams, and seconds). The weight of the mannequin, the force generated during chest compressions, and the duration of a scenario are all physical aspects of the simulation reality. Existing simulator mannequins have many “unrealistic” physical elements despite their roughly human shape: they are made of plastic, not flesh and bone; they may make unusual mechanical noises detectable during auscultation of the chest; and the “skin” does not change color. Some physical properties are not readily detectable and can be manipulated. Some clinical equipment used in mannequin-based simulation is fully functional and physically identical to the “real thing,” although in some cases functional physical limitations have been introduced for convenience or for safety.[24] Labeled syringes may contain only water instead of opioids, or a real defibrillator may have been modified so that it does not actually deliver a shock (one manufacturer sells a “Hollywood defibrillator”). That certain physical properties and functions have been altered is usually not apparent to participants, at least without special briefings or labels.
Semantical Mode
The semantical mode of thinking concerns concepts and their relationships. Within the semantical mode, a simulation of hemorrhage might be described in conceptual terms as “bleeding” at flow rate X, beginning at time Y, occurring at site Z, and associated with a blood pressure of B that represents a decrease from the prior value of A. In this mode of thinking, it is irrelevant how the information is transmitted or represented. The same pieces of information could be conveyed by a vital signs monitor, by a verbal description, or by the tactile perception of decreasingly palpable pulses. The semantical recoding of physical objects is the cornerstone of simulation: it allows the simulation exercise to represent a real situation, and it allows water-filled syringes to be treated as if they contained a drug.
Phenomenal Mode
The phenomenal mode deals with the “experience,” including the emotions and beliefs triggered by the situation. For many purposes, providing high phenomenal realism is the key goal, and physical and semantical realism are merely means to this end.
Relevance versus Reality
A naive view of simulation would suggest that greater realism in all modes leads to better achievement of the goals of simulation; this view has been criticized repeatedly. [23] [28] [29] Simulation is a complex social endeavor, conducted with different target populations for different purposes. The relevance of a simulation exercise concerns the match between the characteristics of the exercise and the reasons for which the exercise is conducted.
Different elements of realism are emphasized or sacrificed to maximize the relevance of a simulation exercise. When training on invasive procedures, it is typical to forgo phenomenal realism and to emphasize physical and semantical realism so that psychomotor skills can be the focus. Semantical realism may also be sacrificed, especially to help early learners. Situations that might very quickly become lethal (e.g., trigger a cardiac arrest) may be slowed down so that inexperienced clinicians can try to think their way out of the problem. If such a situation were allowed to evolve at its normal speed, it would