Since many things can go wrong during a deployment, we need to use all the information we have to determine whether the data are good or bad; bad data should be discarded or flagged. The following are some important considerations regarding data quality and data assurance. Common problems are also presented, with suggestions on how to avoid them or reduce their risk of occurrence. You will also find links to other FAQs relevant to the topics discussed here. A more comprehensive article about quality control and quality assurance of data can be found here: QA/QC current measurement (comprehensive).
Amplitude and correlation
Amplitude and correlation tell us about the quality of the returned signals and determine the range of each deployment.
Amplitude, also called signal strength, tells how strong the received signals are. The unit is dB, and we recommend using a 3 dB signal-to-noise ratio (SNR) threshold; that is, the received signal should be at least 3 dB above the instrument's noise floor. An amplitude check should be carried out on every beam and every cell. Anything that deviates from the normal amplitude behavior may indicate that something unfortunate has happened: an amplitude increase with distance in one or more beams can be due to a solid boundary, and a single high value may indicate a passing object.
Correlation describes how similar the received signals are to the transmitted signals. It is given in percent, where 100% means perfect correlation and 0% means no similarity. A commonly accepted correlation threshold when determining range is 50%.
- What are amplitude and correlation, and what do they tell us about data quality?
- What factors affect maximum profiling range?
- How can I increase my profiling range?
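As a minimal sketch of these two checks, the screening below applies the 3 dB SNR and 50% correlation thresholds per cell; the noise floor and sample values are illustrative assumptions, not instrument specifications.

```python
import numpy as np

# Hypothetical per-cell arrays for one beam: amplitude in dB and correlation in %.
NOISE_FLOOR_DB = 25.0          # assumed noise floor; can be measured by pinging in air
SNR_THRESHOLD_DB = 3.0         # recommended minimum signal-to-noise ratio
CORRELATION_THRESHOLD = 50.0   # commonly accepted correlation cutoff in percent

amplitude = np.array([48.0, 41.0, 33.0, 27.5, 26.0])
correlation = np.array([95.0, 88.0, 71.0, 52.0, 34.0])

# A cell passes only if it clears both thresholds; failing cells are flagged.
good = (amplitude - NOISE_FLOOR_DB >= SNR_THRESHOLD_DB) & (correlation >= CORRELATION_THRESHOLD)
print(good)  # [ True  True  True False False]
```

Cells failing either check should be flagged or discarded before further processing.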
Velocity
Measuring current velocities is the main task of ADCP instruments, and the data can be output in three coordinate systems: BEAM, XYZ, and ENU. With good knowledge of the measurement area and of the values to expect, inspecting the velocity output is a good means of data quality control: you can evaluate whether speed and direction seem realistic. Also keep an eye on the vertical velocity, which under normal circumstances is close to zero in the ocean.
- What are the different coordinate systems and how are they defined?
- How is a coordinate transformation done?
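A simple sanity check on ENU velocities might look like the sketch below; the velocity values and the 0.1 m/s vertical-velocity tolerance are illustrative assumptions, not recommended settings.

```python
import numpy as np

# Hypothetical ENU velocity profile (m/s): east, north, up per cell.
east  = np.array([0.42, 0.45, 0.40])
north = np.array([0.10, 0.12, 0.09])
up    = np.array([0.01, -0.02, 0.35])   # last cell looks suspicious

speed = np.hypot(east, north)
# Compass convention: degrees clockwise from north.
direction = np.degrees(np.arctan2(east, north)) % 360

VERTICAL_LIMIT = 0.1  # m/s; assumed tolerance, since vertical velocity should be near zero
suspect = np.abs(up) > VERTICAL_LIMIT
print(speed.round(2), direction.round(1), suspect)
```

Flagged cells are candidates for closer inspection, not automatic rejection.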
Pitch, roll, and heading
Pitch, roll, and heading give the orientation of the instrument in three planes. Large variations in these variables indicate considerable movement.
Pitch and roll tell whether the instrument was level during deployment; the output is in degrees. Velocity calculations assume that the instrument is level, so tilt affects the data. Typical consequences of tilt are increased sidelobe interference (which can reduce the effective range), changed directions of the velocity vectors in BEAM and XYZ coordinates, and shifted depths of the measurement cells. For the latter, bin mapping can be performed to shift the cells back to their intended locations. Pitch and roll data are used for bin mapping and for converting to ENU coordinates.
The heading gives an instrument's orientation relative to the magnetic north pole. It works like a traditional mechanical compass with the X-axis as the needle, and the output is in degrees. A compass calibration should be performed just before every deployment to correct for magnetic disturbances from other elements on the rig. Heading data are used to convert to ENU coordinates, and unreliable readings will give wrong current directions.
- What do pitch, roll, and heading tell us and what are they used for?
- What happens when my instrument is tilted and what actions should I take?
- What is bin mapping?
- What are the different coordinate systems and how are they defined?
- How is a coordinate transformation done?
- How do I carry out a compass calibration?
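The effect of tilt on cell depth can be sketched with simple geometry; the pitch/roll values below are hypothetical, and the combined tilt uses the common approximation cos(tilt) = cos(pitch) · cos(roll).

```python
import numpy as np

# Hypothetical pitch/roll readings in degrees.
pitch, roll = 5.0, 8.0

# Combined tilt via the standard small-angle approximation.
tilt = np.degrees(np.arccos(np.cos(np.radians(pitch)) * np.cos(np.radians(roll))))
print(round(tilt, 1))  # ≈ 9.4 degrees combined tilt

# A cell nominally 10 m above the instrument sits lower when tilted:
cell_height = 10.0  # m above the instrument when perfectly level
print(round(cell_height * np.cos(np.radians(tilt)), 2))
```

This vertical shift is what bin mapping corrects for when converting to ENU coordinates.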
Temperature
All instruments measure temperature, output in degrees Celsius. The data are used to calculate the speed of sound at the instrument, which in turn is used to convert the travel time between transmitted and received signals into distances. Make sure the values are realistic for the measurement site and conditions: anything that skews the readings can place the cells at different distances from the instrument than intended.
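As an illustration of how temperature feeds into the distance calculation, the sketch below uses Medwin's (1975) simplified sound-speed formula, one of several published approximations; salinity and depth default to nominal values here.

```python
def sound_speed(T, S=35.0, z=0.0):
    """Approximate speed of sound in seawater (m/s) using Medwin's (1975)
    simplified formula; T in degrees C, S in PSU, z depth in metres."""
    return (1449.2 + 4.6*T - 0.055*T**2 + 0.00029*T**3
            + (1.34 - 0.010*T)*(S - 35.0) + 0.016*z)

print(round(sound_speed(10.0), 1))  # ≈ 1490.0 m/s at 10 °C
```

A temperature error of a few degrees shifts the computed sound speed, and with it the range assigned to every cell.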
Pressure
Pressure data are output in dBar. Since 1 dBar corresponds to approximately 1 meter of seawater, pressure can be used to find the depth of the instrument and hence to assign velocities to specific depths. Make sure the readings are realistic. A pressure offset should be applied just before each deployment to correct for the local atmospheric pressure at the surface. For submerged instruments, pressure will naturally vary with the tide. Another common change that produces elevated pressure values is drag down.
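A minimal sketch of the dBar-to-depth conversion follows; the pressure record is hypothetical, and the 1 dBar ≈ 1 m rule ignores small density and latitude effects.

```python
import numpy as np

# Hypothetical raw pressure record in dBar; the value logged in air just
# before deployment serves as the local atmospheric offset.
raw = np.array([0.2, 20.4, 20.9, 21.5, 20.7])
surface_offset = raw[0]                 # reading in air before submergence
depth_m = (raw - surface_offset) * 1.0  # 1 dBar ~ 1 m of seawater
print(depth_m)
```

The tidal signal should then appear as a slow oscillation in the depth record; abrupt spikes point to drag down instead.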
Power
An instrument needs sufficient power to make any measurements. Confirm that the supply voltage during deployment stayed above the threshold stated in the technical specifications. The power supply does not affect data quality as such, but it determines whether a planned measurement is made at all. If the power alternates between on/off or above/below the threshold, check that the connections between the instrument and the power source are good, and perhaps consider using a Chinese finger strap.
- How can I estimate the instrument power consumption?
- How can I get my battery to last longer?
- How can I calculate voltage drop over cables?
- How do I change batteries in my Signature 1000?
- For how long can batteries be stored before use?
- How to store batteries
- How to dispose of batteries
- How to transport batteries
- How much current does my instrument draw?
Typical things that can happen during deployment
Blockage
Cause: Something is located along the beams. This can be a physical obstruction such as rocks, ropes, buoys, or underwater structures. It can also be sediment from the seabed moved by tidal currents, causing tidal burial of the transducers. Fouling on the transducers creates a gradually increasing blockage right above the instrument.
Consequence: The velocity of the blockage is measured instead of that of the water. Even though sound waves can propagate beyond the blockage, the obstruction may reflect or absorb so much energy that little is left in the signal, which can further reduce the profiling range.
How to discover it: A blockage typically shows up as an amplitude spike, and the amplitude can be severely reduced immediately beyond the blockage. Tidal burial causes periodic changes in amplitude that follow the roughly two high and two low tides per day. Fouling results in a gradual decrease of amplitude in the profile over time.
Post-processing options: There is no way to filter out the effect of a blockage; all cells affected should be discarded. Data in front of and behind a blockage can be used if the amplitude and correlation are good.
Measures to avoid or reduce the risk of it: Keep the measurement area free of any known obstacles. Tidal burial can be prevented by changes to the setup, such as lifting the instrument higher above the seabed. Anti-fouling patches are beneficial where fouling is an issue.
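A crude way to flag candidate blockages is to look for sharp upward jumps between adjacent cells in the amplitude profile; the profile and the 10 dB threshold below are illustrative assumptions, not recommended settings.

```python
import numpy as np

# Hypothetical along-beam amplitude profile in dB. A blockage typically shows
# up as a spike followed by a sharp drop relative to the surrounding cells.
amplitude = np.array([46.0, 44.5, 43.0, 58.0, 30.0, 29.0, 28.5])

diff = np.diff(amplitude)     # change between adjacent cells
SPIKE_DB = 10.0               # assumed threshold for a suspicious upward jump
spike_cells = np.where(diff > SPIKE_DB)[0] + 1
print(spike_cells)            # indices of cells with a sharp amplitude jump
```

Flagged cells, and the cells behind them if the amplitude stays low, are candidates for discarding.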
Drag down
Cause: Drag forces, such as strong currents or waves, act on a mooring with a force greater than the rig can withstand.
Consequence: The instrument is being dragged down in the water column. The instrument itself is thus in motion, meaning that the velocity measurements in the whole profile are based on both instrument movement and water movement.
How to discover it: Pressure spikes are typical, since the instrument is dragged down, as are spikes in pitch and roll. Drag down is often first noticed as high speed values at the relevant times.
Post-processing options: There is no way to distinguish between the motion of the instrument and the motion of the water. All data at times of drag down should be discarded.
Measures to avoid or reduce the risk of it: An effective way to reduce the risk of drag down is to minimize the amount of material the current can take hold of, for example by using a thinner or shorter rope.
Acoustic interference
Cause: Sound waves from other sources are detected. These can come from external sources, from other instruments nearby, or from the instrument itself.
Consequence: The interacting sound waves combine into a new wave with different properties, such as a changed frequency or amplitude.
How to discover it: Amplitude and correlation will deviate from their typical behavior, but exactly how depends on the interaction. One common symptom is diagonal lines in the overall amplitude and correlation.
Post-processing options: All data affected by acoustic interference should be discarded.
Measures to avoid or reduce the risk of it: The safest option will always be to stagger the instruments, so that only one instrument measures at a time. Other measures are to keep the sound sources far apart or to use large differences in their frequencies. Having one instrument measure downwards and another upwards will not necessarily help, as there may be reflections at the sea surface and seabed.
Sidelobe interference
Cause: Leaked energy (sidelobes) from the main lobe encounters a boundary before the main lobe does and contaminates the signal because of the strong reflection.
Consequence: Bias towards the velocity of the interfering boundary. Ultimately it prevents us from measuring close to any remote boundary.
How to discover it: The signal strength increases with range in areas with sidelobe interference: it starts rising at the beginning of the sidelobe layer and peaks at the interfering boundary. Sidelobe interference is often noticed as high speed values close to the boundary; the vertical velocities can also be elevated.
Post-processing options: There is no way in post-processing to separate the sidelobe bias from the signal, so all affected data should be discarded. This often corresponds to around 10% of the profile, which can easily be discarded when processing in Nortek software.
Measures to avoid or reduce the risk of it: Placing the instrument closer to the boundary makes the sidelobe interference layer smaller. Reducing the cell size increases the spatial resolution, which can be advantageous because even when a cell is only partially contaminated by sidelobes, the entire cell must be discarded. Also, keep the instrument level, as tilt further reduces the range.
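The roughly 10% figure follows from simple beam geometry: with a beam slanted at angle θ from vertical, data are only valid out to about D·cos(θ) of the distance D to the boundary. A sketch, where the 25° slant angle is typical but instrument dependent:

```python
import math

# Hypothetical geometry: distance from instrument to the boundary, and the
# slant angle of the beams from vertical (check your instrument's specification).
distance_to_boundary = 40.0  # m
beam_angle_deg = 25.0

valid_range = distance_to_boundary * math.cos(math.radians(beam_angle_deg))
contaminated = distance_to_boundary - valid_range
print(round(valid_range, 1), round(contaminated, 1))  # ≈ 36.3 and 3.7 m
```

With a 25° beam angle, 1 − cos(25°) ≈ 0.094, consistent with discarding around 10% of the profile nearest the boundary.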
Low scattering conditions
Cause: Not enough scattering material in the water to reflect the transmitted signal. This is particularly a problem in polar regions during winter, and in areas where zooplankton make up a large proportion of the scatterers (because of diel vertical migration).
Consequence: The signal strength (and hence the Signal-to-noise ratio) will quickly become very low, and as a result, the range will be short.
How to discover it: Amplitude profiles flatten out when the signal strength reaches the noise floor. The signal strength should be at least 3 dB above the noise for good data quality. The noise floor can be determined by pinging in air. Velocity output will typically be random, with no structure.
Post-processing options: Nortek's post-processing software offers simple methods to discard all data with low SNR.
Measures to avoid or reduce the risk of it: We cannot increase the amount of scattering material in the water, but emitting more energy by increasing the power level can help, as can increasing the cell size, which allows more data points in each cell.