
Path to DICOM: Driven by quality control and information exchange

by Gerald Q. Maguire Jr., KTH Royal Institute of Technology, School of Electrical Engineering and Computer Science, Department of Computer Science, and
Marilyn E. Noz, New York University, School of Medicine, Department of Radiology

In 1974, Marilyn E. Noz left the Physics faculty at Indiana University of Pennsylvania (IUP) to join the faculty of the Department of Radiology at New York University, where she applied her knowledge of nuclear physics in the Nuclear Medicine Division. At that time computers were just beginning to be introduced, with each vendor creating its own hardware and software system. Moreover, most of these systems were closed, proprietary systems that could only exchange information with another machine of the same make and model from the same manufacturer. Even the 8-inch floppy disks were incompatible. If these systems had an external interface at all (and this was rare), it was generally limited to an RS-232 connector. Marilyn had programming experience from NYU’s Courant Institute in an NSF program using their CDC 6000 and at IUP with the university’s Xerox Sigma 6 computer and the physics department’s digital and analog computers.

In the summers of 1974 and 1975, Gerald Q. “Chip” Maguire Jr. worked in the Lindenbaum-Ozaki High Energy Experimental Physics Group at Brookhaven National Laboratory (BNL). This was the first high-energy physics research group to do on-line analysis of the data coming from the detectors. One of the techniques that the group pioneered was monitoring the detectors by doing kinematic reconstructions of some of the data every other second and analyzing the results for signs of systematic problems with a detector (for example, a broken wire in a magnetostrictive spark chamber or a proportional wire chamber). During his first summer he was tasked with converting seventy thousand lines of Digital Equipment Corporation (DEC) PDP-10 assembly code to FORTRAN. The motivation was to utilize FORTRAN IV’s overlay mechanism, as the existing assembly-coded program had reached the limits of the 18-bit address space (of 36-bit words) and it was not possible to add any additional code.

The BNL On-Line Data Facility (OLDF) had two PDP-10 computers, one for on-line analysis and the other for development. Each computer was located in a 40 x 10 foot (approximately 12 m x 3 m) refrigerated trailer (the kind that would normally be used to ship meat and frozen vegetables) [1]. The two trailers were interconnected by a third trailer, and a fourth trailer contained teletypes and a line printer. All of the trailers were located in the parking lot beside the building that housed the detectors and the 1 T dipole magnet (part of the Multi-particle Spectrometer (MPS), with 1 million feet (~304.8 km) of wiring, each run measured to the nanosecond). The computers were connected to a system that allowed the data links from the experiments to switch between the computers, so that the development computer served as a hot spare for the one doing on-line data analysis. There were a number of teletypes and CRT-equipped terminals in the trailers and in the user facilities in the adjacent experimental halls. Experiments ran 24 hours a day in two shifts, 8 AM to 8 PM and 8 PM to 8 AM. Chip worked the night shift and another summer student worked the day shift; the two overlapped for an extra hour at each shift changeover to brief each other on what had happened. Chip's second summer was spent developing visualization of the experimental data (both reconstructions and detector-by-detector views of the detailed status of each detector). This visualization was done using the recently introduced DEC GT40 vector graphics display, which featured a light pen and keyboard for user interaction.

The result is that both authors had experience with computing, and Chip was involved in on-line data analysis, so this led to a series of papers about using computing in support of nuclear medicine imaging, starting with [2, 3]. Additionally, James H. "Jim" Schimpf, who had done his Master’s degree in Physics at IUP, joined the Department of Radiology. The head of the department, R. Thomas Bergeron, MD, wanted to improve quality control of the X-ray imaging in the department. The first step was that a step wedge (a phantom with a series of steps of ever-increasing metal thickness) was imaged on each X-ray system each day, and the films of these exposures were developed and then digitized. Initially, this was done using a Joyce-Loebl microdensitometer that determined the optical density at a given point on the film by comparing the light passing through the film and a 10 μm slit versus the light that did not go through the film. This was plotted on an analog plotter as the film was scanned along a line through the image of the step wedge. This was augmented with digitization of the plotter’s output to create a computer interface to the microdensitometer [4]. As a result, the digital data could be analyzed to track the behavior of each X-ray source and each film developing unit. One could see how the film was aging (i.e., the base fog level of the film increased simply as the film was stored), problems with the X-ray source (as the X-ray tube degrades with use), problems with the film developers, and other details of the entire process from X-ray source to final film. If there were problems along this process, they could be addressed before there were human-observable flaws in the resulting X-ray images (which would require redoing the X-ray).
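The densitometry described above rests on a simple relation: optical density is the base-10 logarithm of incident light over transmitted light. A minimal sketch (with made-up numbers, not the department's actual readings) of how tracking the step-wedge densities over days could reveal drift:

```python
import math

def optical_density(incident: float, transmitted: float) -> float:
    """Optical density: log10 of incident light over transmitted light."""
    return math.log10(incident / transmitted)

# A film step transmitting 1% of the incident light has OD 2.0.
od = optical_density(100.0, 1.0)

# Hypothetical daily readings of the unexposed film base: a steady rise in
# base fog suggests film aging or developer-chemistry problems.
base_fog_by_day = [0.18, 0.18, 0.19, 0.21, 0.24]
drift = base_fog_by_day[-1] - base_fog_by_day[0]
```
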

In this climate of nuclear medicine increasingly using computers to collect and process data, there was a major advance in radiology – the introduction of computed tomography (CT). So the next thing that Dr. Bergeron asked Jim to do was to figure out which CT scanner the hospital should buy. To compare the different CT scanners, Jim took a CT thorax phantom (i.e., a device designed to mimic the properties of a human thorax in terms of electron density – which is what a CT scanner measures). At that time the way to get the digital data out of a CT scanner was to print the numeric values from scan slices on a line printer. While many of these scanners were equipped with some form of magnetic media (typically a ½ inch 9-track tape drive), each manufacturer used a proprietary format for its data. As Jim wanted to understand which scanner was “best”, he scanned the same phantom with each of the different scanners around New York City over a period of weeks (resulting in multiple scans over time with each machine). By analyzing this data he was able to characterize both the resolution of each scanner and its reproducibility – as both were important.

A side effect of the above was that Jim had a stack of tapes, but no way to use the data on them. So an effort began to decode the data on these tapes. Fortunately, in 1976 the Nuclear Medicine division had acquired a Varian Data Machines (VDM) V76 minicomputer and nuclear medicine image processing software that had been developed in England in BCPL [5].

BCPL was a predecessor language to C. This computer system featured 32 Kbytes of 16-bit memory (later expanded to 64 KB) and a hard disk drive with a built-in 5 MB disk and a removable 5 MB platter. To boot the computer, you used a paper tape (in actuality a Mylar tape, as paper tapes soon wore out) to install a loader, which then loaded the first program from the hard disk. Programs were developed as a script that chained a series of executables into memory in order to carry out the overall processing of the data (BCPL supported the idea of a global vector that could remain in memory as each of the executables was loaded, similar to using FORTRAN IV’s COMMON with overlays). On 11 Jan 1978, Nodecrest Ltd. was formed by employees who had left Varian. They continued the development of both a nuclear medicine system and a radiation therapy treatment planning system on the VDM computer and their own hardware for data acquisition and display. The first display system output a 256x256 image in PAL format – hence we had to use a PAL TV as the display. Later this was replaced by a system that output 512x512 images on a color CRT monitor. On 13 May 1980, Nodecrest Medical Systems Limited was incorporated. By September 1986, Nodecrest had introduced their Micas 5 system [6] running on a Motorola MC68020 – the first medical image processing system to use a Sun Microsystems workstation as a platform. The system ran UNIX 4.2 BSD as its operating system. In the first week of March 1986, Chip wrote the device drivers for the display hardware that Nodecrest designed and built. Later (in 1987), he wrote the device driver for their bitmapped laser printer (including software to emulate a phototypesetter – so one could use nroff and troff to print documents in addition to images). In 1983, this work led to the first all-digital nuclear medicine department, at Robert Wood Johnson University Hospital, Rutgers Medical School, New Brunswick, NJ, USA [7].

Inspired by the Name-Type-Value approach used to store auxiliary information in the Rochester Image Format developed by Peter G. Selfridge [8], Chip, together with two colleagues at the University of Utah, wrote a report that proposed a Key-Value format using ASCII for the information about an image. Rather than focus on binary encodings as the Name-Type-Value approach did, this new approach simply used a string for the key, followed by “:=”, followed by the value printed as an ASCII string. This removed the dependence on the type of computer, how floating point numbers were represented, etc., that Selfridge’s format had. The report was subsequently published by the American Association of Physicists in Medicine (AAPM) as Report 10 [9]. This standard was adopted by the NIH National Cancer Institute Radiation Oncology section for image and radiation treatment planning exchange (thus it is used to exchange images for their quality control/evaluation of radiation treatment plans). A large number of proprietary formats were decoded, and it was possible to convert them to this AAPM format [10, 11].
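The appeal of the “key := value” scheme is that a parser needs no knowledge of the producing machine's word size or number formats. A minimal sketch of such a parser (the key names below are illustrative, not taken from the actual AAPM Report 10 key list):

```python
def parse_key_value_header(text: str) -> dict:
    """Parse ASCII 'key := value' lines into a dict of strings.

    Both key and value are plain ASCII strings, so no binary decoding
    (byte order, float representation, word size) is needed.
    """
    header = {}
    for line in text.splitlines():
        if ":=" in line:
            key, _, value = line.partition(":=")
            header[key.strip()] = value.strip()
    return header

# Illustrative header lines, not actual AAPM Report 10 keys.
sample = """\
image size := 256
pixel format := short integer
modality := nuclear medicine
"""
hdr = parse_key_value_header(sample)
```

Interpreting a value (e.g., converting "256" to an integer) is then left to the reader of the specific key, which is exactly what makes the format machine-independent.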

In January 1989, the EEC COST-B2 committee adopted a standard based on Chip’s additions to the original AAPM format as the European standard for the exchange of nuclear medicine images. This was also adopted as the standard in Canada, New Zealand, and Australia. This all helped lead to the American College of Radiology (ACR) and National Electrical Manufacturers Association (NEMA) Standard 300-1985 – which became popularly known as Digital Imaging and Communications in Medicine (DICOM). Unfortunately, DICOM uses a binary encoding where the identifiers (composed of a Group Number followed by an Element Number) have to be looked up in the standard. Additionally, the value type can be explicit or implicit depending on the identifier, and likewise the length of the value can be implicit or explicit. Odd-numbered groups are private and vendors can define their own identifiers – hence to interpret these you need information from the specific vendor. In 1992, the first filmless radiology department was opened at the Social Medical Center East – Danube Hospital (SMZ-Ost) [12] – and the devices could exchange information using DICOM.
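To make the contrast with the ASCII key-value format concrete, here is a minimal sketch of decoding one DICOM data element in the implicit-VR little-endian form (group, element, 4-byte length, then the value). This is only one of DICOM's transfer syntaxes, and a real parser would also look each (group, element) tag up in the standard's data dictionary to learn its meaning and value type:

```python
import struct

def read_implicit_vr_element(buf: bytes, offset: int = 0):
    """Decode one implicit-VR little-endian data element.

    Layout: 2-byte group, 2-byte element, 4-byte length, then the value.
    The meaning and type of the (group, element) tag must be looked up
    in the standard; odd-numbered groups are vendor-private.
    """
    group, element, length = struct.unpack_from("<HHI", buf, offset)
    value = buf[offset + 8 : offset + 8 + length]
    return (group, element), value, offset + 8 + length

# Hand-built element: tag (0010,0010) Patient's Name, value "DOE^JOHN".
raw = struct.pack("<HHI", 0x0010, 0x0010, 8) + b"DOE^JOHN"
tag, value, _ = read_implicit_vr_element(raw)
```

The need for an out-of-band dictionary (and vendor documentation for private groups) is exactly the interpretation burden the text describes.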


  • [1] Seymour J. Lindenbaum and Satoshi Ozaki, ‘Operational experience with the Brookhaven on-line data facility’, CERN, pp. 921–930, 1970 [Online]. DOI: 10.5170/CERN-1970-021-V-2.921
  • [3] M E Noz, J H Schimpf, and G Q Maguire Jr., ‘A modular computer system for the nuclear medicine/ultrasound laboratory: a multidisciplinary proposal.’, Journal of Medical Systems, vol. 1, no. 3, pp. 251–261, 1977. DOI: 10.1007/BF02222587
  • [4] James H. Schimpf, Steven Horii, Gerald Q. Maguire Jr., and Marilyn E. Noz, ‘Design and construction of a microdensitometer computer interface’, J Med Syst, vol. 2, no. 4, pp. 315–326, Dec. 1978 [Online]. DOI: 10.1007/BF02221897
  • [5] Martin Richards and Colin Whitby-Strevens, BCPL, the language and its compiler. Cambridge; New York: Cambridge University Press, 1979, ISBN: 978-0-521-21965-5.
  • [6] ‘Advertisement: Micas System 5 Computer’, JNM: The Journal of Nuclear Medicine, vol. 27, no. 9, p. 33A, Sep. 1986 [Online]. Available: http://jnm.snmjournals.org/content/27/9/local/advertising.pdf
  • [7] M E Noz, W A Erdman, G Q Maguire, T J Stahl, R J Tokarz, K L Menken, and J A Salviani, ‘Modus operandi for a picture archiving and communication system.’, Radiology, vol. 152, no. 1, pp. 221–223, 1984.
  • [8] Peter G. Selfridge, ‘A Flexible Data Structure for Accessory Image Information’, University of Rochester. Computer Science Department, Rochester, NY, USA, Technical Report TR-45, May 1979 [Online]. Available: https://apps.dtic.mil/dtic/tr/fulltext/u2/a069823.pdf, https://urresearch.rochester.edu/fileDownloadForInstitutionalItem.action?itemId=12351&itemFileId=28247
  • [9] G. Q. Maguire, B. S. Baxter, and L. E. Hitchner, ‘An American-Association-of-Physicists-in-Medicine (AAPM) Standard Magnetic-tape Format for Digital Image Exchange’, Proceedings of the Society of Photo-Optical Instrumentation Engineers, vol. 318, pp. 284–293, 1982 [Online]. Available: https://www.aapm.org/pubs/reports/rpt_10.pdf
  • [10] G Q Maguire Jr. and M E Noz, ‘Image formats: five years after the AAPM standard for digital image interchange’, Med Phys, vol. 16, no. 5, pp. 818–823, Oct. 1989 [Online]. DOI: 10.1118/1.596433
  • [11] David P. Reddy, Gerald Q. Maguire Jr., Marilyn E. Noz, and Robert Kenny, ‘Automating Image Format Conversion - Twelve Years and Twenty-five Formats Later’, in Proceedings of the International Symposium CAR’93 Computer Assisted Radiology / Computergestützte Radiologie, 1993, pp. 253–258.
  • [12] W. Hruby, H. Mosser, and W. Krampla, ‘The Danube Hospital digital radiology as a prerequisite for a health care network’, International Congress Series, vol. 1281, pp. 992–996, May 2005 [Online]. DOI: 10.1016/j.ics.2005.03.320