The principle of ultrasound



Image production is a complex process.  Echo instrumentation must generate and transmit the ultrasound and then receive the returning data.  The data must then be amplified, filtered and processed, and the final result displayed for the clinician to interpret.  As the first step in data processing, the returning ultrasound echoes are converted to voltage by the transducer.  Since their amplitude is usually low, they must be amplified.  The returning signals are also out of phase, so they need to be realigned in time.  At this point one has the radiofrequency (RF) data, which is high frequency with large variability in amplitude and contains background noise.  The next step is filtering and mathematical manipulation (logarithmic compression, etc.) to prepare the data for further processing.  At this stage one has sinusoidal data in polar coordinates, with a distance and an angle attached to each data point.  This information is then converted to Cartesian coordinates (a step known as scan conversion) so that, finally, the ultrasound data can be converted to an analog signal for video display and interpretation.
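Two of the processing steps above can be sketched numerically.  This is a minimal, illustrative sketch (all numbers are made up, not from any real scanner): first, logarithmic compression squeezes the very wide dynamic range of echo amplitudes into decibels relative to the strongest echo; second, scan conversion maps a sample acquired along a beam at a given depth and angle onto Cartesian display coordinates.

```python
import numpy as np

# 1) Logarithmic compression: echo amplitudes span a huge dynamic
#    range, so they are compressed to dB relative to the strongest
#    echo before display.  (Illustrative amplitudes only.)
amplitudes = np.array([1.0, 10.0, 100.0, 1000.0])
db = 20 * np.log10(amplitudes / amplitudes.max())  # 0 dB = strongest echo

# 2) Scan conversion: each sample is acquired in polar form
#    (depth r along a beam steered to angle theta) and must be
#    mapped to Cartesian pixel coordinates for display.
r = 5.0                       # depth along the beam (cm, assumed)
theta = np.deg2rad(30.0)      # beam steering angle (assumed)
x = r * np.sin(theta)         # lateral position
z = r * np.cos(theta)         # axial depth
```

A real scan converter applies this mapping to every sample on every beam and interpolates between beams to fill the display grid.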
Image display has evolved substantially in clinical ultrasound.  Currently, 2D and real-time 3D displays of ultrasound data are utilized.  Without going into the complexities of the physics involved in translating RF data into what one sees every day when reading an echo, the following section provides the basic knowledge of image display.  If one imagines a rod that is imaged and displayed on an oscilloscope, it would appear as a bright spot.  Displaying it as a function of amplitude (how high the return signal is) is called A-mode.  If one converts the amplitude signal into brightness (the higher the amplitude, the brighter the dot), then this display is called B-mode.
 
[[File:PhysicsUltrasound_Fig28.svg|thumb|left|400px| Fig. 28]]
{{clr}}
 
 
Using B-mode data, one can scan the rod multiple times and then display the intensity and location of the rod with respect to time.  This is called M-mode display.  Sweeping the B-mode scan lines through a sector creates a 2D representation of anatomical structures in motion.
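The three display modes above can be illustrated with a toy example (purely didactic; the array sizes and values are invented): an A-mode line is amplitude versus depth, B-mode maps that amplitude to pixel brightness, and M-mode stacks repeated B-mode lines side by side over time, so a moving reflector traces a curve while a stationary one draws a straight line.

```python
import numpy as np

# A-mode: echo amplitude as a function of depth, for a single
# reflector at depth sample 3 (toy values).
depth_samples = 8
scan = np.zeros(depth_samples)
scan[3] = 0.8

# B-mode: amplitude is converted to pixel brightness
# (higher amplitude -> brighter dot).
brightness = np.uint8(255 * scan)

# M-mode: the same line is re-scanned over time; here the reflector
# wobbles by one sample between frames to mimic motion.
frames = [np.roll(brightness, t % 2) for t in range(4)]
m_mode = np.stack(frames, axis=1)  # rows = depth, columns = time
```

Reading `m_mode` as an image, depth runs down the rows and time runs across the columns, which is exactly how a clinical M-mode trace is oriented.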


'''Second Harmonic''' imaging is an important concept used today for image production.  Its basis is the fact that as ultrasound travels through tissue, it behaves non-linearly and some of its energy is converted to a frequency that is double the transmitted (fundamental) frequency – the second harmonic.  Several properties make second harmonic imaging preferential.  Since it is generated by the tissue itself, the deeper the target, the more second harmonic signal is returned: as the ultrasound beam travels through tissue, new frequencies appear that can be interrogated.  Second harmonic data undergoes less distortion and thus produces a better picture.  Also, the second harmonic is strongest in the center of the beam, so it has fewer side lobe artifacts.  At the chest wall the fundamental frequency suffers most from the issues we have discussed (reflection, attenuation) – if one can eliminate the fundamental frequency data, these artifacts will not be processed.  One approach to eliminating fundamental frequency data is called pulse inversion technology.  The transducer sends out two fundamental frequency pulses of the same amplitude but of opposite phase.  When the returning echoes of the two pulses are summed, the fundamental (linear) components cancel each other out (destructive interference), and what remains is the second harmonic data, which is selectively amplified and used to generate the image.
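The pulse inversion cancellation can be demonstrated with a toy signal model (an assumption for illustration, not a real tissue model): each echo is modeled as a fundamental component plus a small second harmonic whose phase doubles with the transmit phase, as happens for a quadratic non-linearity.  Inverting the transmit phase flips the sign of the fundamental but leaves the second harmonic unchanged, so summing the two echoes cancels the fundamental and doubles the harmonic.

```python
import numpy as np

fs = 50e6   # sampling rate, Hz (assumed)
f0 = 2e6    # fundamental (transmit) frequency, Hz (assumed)
t = np.arange(0, 4e-6, 1 / fs)

def echo(phase):
    # Toy echo: linear term at f0 plus a small non-linear term at
    # 2*f0 whose phase is doubled (quadratic non-linearity), so it
    # keeps its sign when the transmit pulse is inverted.
    fundamental = np.cos(2 * np.pi * f0 * t + phase)
    harmonic = 0.2 * np.cos(2 * 2 * np.pi * f0 * t + 2 * phase)
    return fundamental + harmonic

# Pulse inversion: sum the echo of a normal pulse and the echo of a
# phase-inverted pulse.  The fundamentals cancel; only the second
# harmonic (now doubled in amplitude) survives.
summed = echo(0.0) + echo(np.pi)
```

In this model `summed` contains only the 4 MHz second harmonic, which is what the scanner would then amplify and use to form the image.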