Fluoroscopy is an imaging technique commonly used by physicians to obtain real-time images of the internal structures of a patient through the use of a fluoroscope. In its simplest form, a fluoroscope consists of an x-ray source and a fluorescent screen, between which the patient is placed. However, modern fluoroscopes couple the screen to an x-ray image intensifier and CCD video camera, allowing the images to be displayed and recorded on a monitor. The use of x-rays, a form of ionizing radiation, requires that the potential risks of a procedure be carefully balanced against its benefits to the patient. While physicians always try to use low dose rates during fluoroscopy procedures, the length of a typical procedure often results in a relatively high absorbed dose to the patient. Recent advances include the digitization of the captured images and flat-panel detector systems, which reduce the radiation dose to the patient still further.
The beginning of fluoroscopy can be traced back to 8 November 1895, when Wilhelm Röntgen noticed a barium platinocyanide screen fluorescing as a result of being exposed to what he would later call x-rays. Within months of this discovery, the first fluoroscopes were created. Early fluoroscopes were simply cardboard funnels, open at the narrow end for the eyes of the observer, while the wide end was closed with a thin piece of cardboard coated on the inside with a layer of fluorescent metal salt. The fluoroscopic image obtained in this way was rather faint. Thomas Edison quickly discovered that calcium tungstate screens produced brighter images and is credited with designing and producing the first commercially available fluoroscope. In its infancy, many incorrectly predicted that the moving images of fluoroscopy would completely replace still x-ray radiographs, but the superior diagnostic quality of radiographs prevented this from occurring.
Ignorance of the harmful effects of x-rays resulted in the absence of standard radiation safety procedures which are employed today. Scientists and physicians would often place their hands directly in the x-ray beam resulting in radiation burns. Trivial uses for the technology also resulted, including the shoe-fitting fluoroscope used by shoe stores in the 1930s-1950s.
Because of the limited light produced by the fluorescent screens, early radiologists were required to sit in the darkened room in which the procedure was to be performed, accustoming their eyes to the dark and thereby increasing their sensitivity to the light. The placement of the radiologist behind the screen also resulted in significant radiation doses to the radiologist. Red adaptation goggles were developed by Wilhelm Trendelenburg in 1916 to address the problem of dark adaptation of the eyes, previously studied by Antoine Béclère. The red light passed by the goggles' filters sensitized the physician's eyes prior to the procedure while still allowing him to receive enough light to function normally.
The development of the X-ray image intensifier and the television camera in the 1950s revolutionized fluoroscopy. The red adaptation goggles became obsolete as image intensifiers allowed the light produced by the fluorescent screen to be amplified, allowing it to be seen even in a lighted room. The addition of the camera enabled viewing of the image on a monitor, allowing a radiologist to view the images in a separate room away from the risk of radiation exposure.
More modern improvements in screen phosphors, image intensifiers and even flat-panel detectors have allowed for increased image quality while minimizing the radiation dose to the patient. Modern fluoroscopes use CsI screens and produce noise-limited images, ensuring that the minimum radiation dose is delivered while images of acceptable quality are still obtained.
Because fluoroscopy involves the use of x-rays, a form of ionizing radiation, all fluoroscopic procedures pose a potential health risk to the patient. Radiation doses to the patient depend greatly on the size of the patient as well as the length of the procedure, with typical skin dose rates quoted as 20-50 mGy/min. Exposure times vary depending on the procedure being performed, but procedure times of up to 75 minutes have been documented. Because of the long length of some procedures, in addition to standard cancer-inducing stochastic radiation effects, deterministic radiation effects have also been observed, ranging from mild erythema, the equivalent of a sunburn, to more serious burns.
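As a rough illustration of how these figures combine, the cumulative skin dose is simply dose rate multiplied by exposure time. The sketch below uses a mid-range value from the quoted 20-50 mGy/min band and the documented 75-minute upper bound; the ~2 Gy erythema threshold in the comment is a commonly cited figure, and actual doses vary widely with patient size, beam geometry and equipment settings.

```python
# Rough estimate of cumulative skin dose for a long fluoroscopic procedure.
# Illustrative values only; real dose rates depend on patient size,
# beam angle, and equipment settings.

dose_rate_mgy_per_min = 30   # mid-range of the quoted 20-50 mGy/min
procedure_minutes = 75       # upper end of documented procedure times

skin_dose_mgy = dose_rate_mgy_per_min * procedure_minutes
skin_dose_gy = skin_dose_mgy / 1000

print(f"Estimated skin dose: {skin_dose_mgy} mGy ({skin_dose_gy:.2f} Gy)")
# A skin dose of roughly 2 Gy or more is commonly cited as the threshold
# for transient erythema, consistent with the deterministic effects
# described above.
```

At 30 mGy/min for 75 minutes, the estimate already exceeds 2 Gy, which is why burns are associated specifically with very long procedures.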
The Food and Drug Administration (FDA) has published a study entitled Radiation-induced Skin Injuries from Fluoroscopy, along with an additional publication aimed at minimizing further fluoroscopy-induced injuries, Public Health Advisory on Avoidance of Serious X-Ray-Induced Skin Injuries to Patients During Fluoroscopically-Guided Procedures.
While deterministic radiation effects are a possibility, radiation burns are not typical of standard fluoroscopic procedures. Most procedures sufficiently long in length to produce radiation burns are part of necessary life-saving operations.
The first fluoroscopes consisted of an x-ray source and fluorescent screen between which the patient would be placed. As the x-rays pass through the patient, they are attenuated by varying amounts as they interact with the different internal structures of the body, casting a shadow of the structures on the fluorescent screen. Images on the screen are produced as the unattenuated x-rays interact with atoms in the screen through the photoelectric effect, giving their energy to the electrons. While much of the energy given to the electrons is dissipated as heat, a fraction of it is given off as visible light, producing the images. Early radiologists would adapt their eyes to view the dim fluoroscopic images by sitting in darkened rooms, or by wearing red adaptation goggles.
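The differential attenuation that forms the shadow image follows the exponential Beer-Lambert law, I = I0 * exp(-mu * x), where mu is the linear attenuation coefficient of the material. A minimal sketch of the idea (the coefficients below are assumed round numbers for illustration, not reference data):

```python
import math

def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
    """Fraction of incident x-ray photons transmitted through a material,
    per the Beer-Lambert law: I / I0 = exp(-mu * x)."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative (assumed) attenuation coefficients showing why bone casts
# a stronger shadow than soft tissue of the same thickness.
soft_tissue = transmitted_fraction(mu_per_cm=0.2, thickness_cm=10)
bone = transmitted_fraction(mu_per_cm=0.5, thickness_cm=10)

print(f"soft tissue transmits {soft_tissue:.3f}, bone transmits {bone:.3f}")
# Bone transmits far fewer photons, so it appears as a darker shadow
# on the fluorescent screen.
```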
X-ray Image Intensifiers
The invention of X-ray image intensifiers in the 1950s allowed the image on the screen to be visible under normal lighting conditions, as well as providing the option of recording the images with a conventional camera. Subsequent improvements included the coupling of, at first, video cameras and, later, CCD cameras to permit recording of moving images and electronic storage of still images.
Modern image intensifiers no longer use a separate fluorescent screen. Instead, a cesium iodide phosphor is deposited directly on the photocathode of the intensifier tube. On a typical general purpose system, the output image is approximately 10⁵ times brighter than the input image. This brightness gain comprises a flux gain (amplification of photon number) and minification gain (concentration of photons from a large input screen onto a small output screen) each of approximately 100. This level of gain is sufficient that quantum noise, due to the limited number of x-ray photons, is a significant factor limiting image quality.
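The two gain factors multiply, with the minification gain determined by the ratio of input to output screen areas. A sketch of the calculation (the diameters and flux gain below are assumed illustrative values; note that two factors of exactly 100 multiply to 10⁴, so the 10⁵ figure quoted above implies gains somewhat larger than 100 each):

```python
# Brightness gain of an image intensifier = flux gain * minification gain.
# Minification gain is the ratio of input to output screen AREAS,
# i.e. (d_in / d_out) ** 2. All numbers here are illustrative assumptions.

flux_gain = 100.0            # amplification of photon number
input_diameter_cm = 25.0     # large input screen
output_diameter_cm = 2.5     # small output screen

minification_gain = (input_diameter_cm / output_diameter_cm) ** 2
brightness_gain = flux_gain * minification_gain

print(f"minification gain: {minification_gain:.0f}")
print(f"total brightness gain: {brightness_gain:.0f}")
```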
Image intensifiers are available with input diameters of up to 45 cm, and a resolution of approximately 2-3 line pairs mm⁻¹.
The introduction of flat-panel detectors allows for the replacement of the image intensifier in fluoroscope design. Flat panel detectors offer increased sensitivity to X-rays, and therefore have the potential to reduce patient radiation dose. Temporal resolution is also improved over image intensifiers, reducing motion blurring. Contrast ratio is also improved over image intensifiers: flat-panel detectors are linear over a very wide latitude, whereas image intensifiers have a maximum contrast ratio of about 35:1. Spatial resolution is approximately equal, although an image intensifier operating in 'magnification' mode may be slightly better than a flat panel.
Flat panel detectors are considerably more expensive to purchase and repair than image intensifiers, so their uptake is primarily in specialties that require high-speed imaging, e.g., vascular imaging and cardiac catheterization.
In addition to spatial blurring factors that plague all x-ray imaging devices, caused by such things as Lubberts effect, K-fluorescence reabsorption and electron range, fluoroscopic systems also experience temporal blurring due to system lag. This temporal blurring has the effect of averaging frames together. While this helps reduce noise in images with stationary objects, it creates motion blurring for moving objects. Temporal blurring also complicates measurements of system performance for fluoroscopic systems.
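The frame-averaging behaviour of system lag can be modelled as a first-order recursive filter, y[n] = a*x[n] + (1-a)*y[n-1], where a controls how strongly each new frame is weighted. This is a simple model of lag, not a description of any particular fluoroscope; the value of a below is assumed for illustration.

```python
def lag_filter(frames, alpha=0.3):
    """Apply a first-order recursive temporal filter, a simple model of
    fluoroscopic system lag: each displayed frame is a weighted average
    of the new frame and the previous displayed frame (alpha=1 -> no lag)."""
    filtered = []
    prev = frames[0]
    for x in frames:
        prev = alpha * x + (1 - alpha) * prev
        filtered.append(prev)
    return filtered

# A sudden step in pixel intensity (e.g. a moving edge) is smeared across
# several frames, illustrating the motion blur that frame averaging causes.
step = [0.0] * 3 + [1.0] * 5
print([round(v, 3) for v in lag_filter(step)])
```

The same averaging that smears the step is what suppresses quantum noise in stationary regions, which is the trade-off described above.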
Common procedures using fluoroscopy
A common procedure is the modified barium swallow study, during which barium-impregnated liquids and solids are ingested by the patient. A radiologist records and, with a speech pathologist, interprets the resulting images to diagnose oral and pharyngeal swallowing dysfunction. Modified barium swallow studies are also used in studying normal swallow function.
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Fluoroscopy". A list of authors is available in Wikipedia.