Article

Tools for analyzing intersecting tracks: The x2sys package

Author: Paul Wessel

Abstract and Figures

I present a new set of tools for detection of intersections among tracks in 2-D Cartesian or geographic coordinates. These tools allow for evaluation of crossover errors at intersections, analysis of such crossover errors to determine appropriate linear models of systematic corrections for each track, and application of these corrections and further adjustments to data that completely eliminates crossover discrepancies from final 2-D data compilations. Unlike my older x_system tools, the new x2sys tools implement modern algorithms for detecting track intersections and are capable of reading a wide range of data file formats, including data files following the netCDF COARDS convention. The x2sys package contains several programs that address the various tasks needed to undertake a comprehensive crossover analysis and is distributed as a supplement to the Generic Mapping Tools, making them available for all computer platforms and architectures.
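The core operation the abstract describes — evaluating a crossover error (COE) at a track intersection — can be sketched in a few lines of Python. This is an illustrative reimplementation, not the x2sys code: the segment intersection is solved parametrically, and each track's data value is linearly interpolated to the crossing point.

```python
def seg_cross(p1, p2, q1, q2):
    """Parametric intersection of segments p1->p2 and q1->q2.
    Returns (t, u) with both in [0, 1], or None if they do not cross."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    sx, sy = q2[0] - q1[0], q2[1] - q1[1]
    denom = rx * sy - ry * sx
    if denom == 0:                      # parallel or collinear: no unique crossover
        return None
    qpx, qpy = q1[0] - p1[0], q1[1] - p1[1]
    t = (qpx * sy - qpy * sx) / denom
    u = (qpx * ry - qpy * rx) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return t, u
    return None

def crossover_error(seg_a, seg_b):
    """COE at the intersection of two track segments.
    Each segment is a pair of (x, y, value) samples."""
    (a1, a2), (b1, b2) = seg_a, seg_b
    hit = seg_cross(a1[:2], a2[:2], b1[:2], b2[:2])
    if hit is None:
        return None
    t, u = hit
    va = a1[2] + t * (a2[2] - a1[2])    # value interpolated along track A
    vb = b1[2] + u * (b2[2] - b1[2])    # value interpolated along track B
    x = a1[0] + t * (a2[0] - a1[0])
    y = a1[1] + t * (a2[1] - a1[1])
    return x, y, va - vb                # crossover location and COE
```

For two straight tracks crossing at (1, 1), the helper returns the crossover location and the difference between the two interpolated values — the quantity the leveling and correction steps then operate on.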


... Large errors at crossovers often resemble artificial features and call into question the geophysical interpretations made from the data sets. Such crossover errors (COE) can arise for various reasons (Wessel, 2010), such as (a) uncertainties in the coordinates leading to erroneous locations at points of intersection, (b) a non-homogeneous phenomenon resulting in a non-zero COE, (c) a dynamic process measured at different epochs, (d) improper calibration at the instrument level, etc. Rapid detection of COE at intersecting tracks is extremely important for global and regional data analysis involving repeated measurements in space or time (Bansal et al., 2005; Sandwell et al., 2014; Sandwell & Smith, 1997). Crossover analysis has evolved into a standard technique for evaluating the performance of satellite altimeters (Dorandeu et al., 2004; Krishna et al., 2023; Prandi et al., 2015; Tai & Fu, 1986) and for generating global products such as mean sea surface, marine gravity and sea level maps (Ballarotta et al., 2023; Pujol et al., 2016; Schaeffer et al., 2012; Yuan et al., 2020). ...
... Existing algorithms extract crossovers by treating consecutive data points from a pair of tracks as segments and checking for possible intersections over all pair-wise combinations of these segments across all tracks (Greene et al., 2017; Hsu, 1995; Li et al., 2022; Wessel, 1989, 2010). For a pair of tracks with N segments each, the number of computations and the time required to check for an intersection grow as N² (Hsu, 1995). ...
... PyXover (Bertone et al., 2020) subsamples the tracks at a fixed ratio (Bertone et al., 2021; Desprats et al., 2024) to get a rough/potential crossover using the Schwarz method (Schwarz, 2024) and then iteratively refines the search locally to get the exact crossover. Therefore, we use these crossover detection methods, viz. the Schwarz method (Schwarz, 2024) and IRST (Li et al., 2022), along with the x2sys package (Wessel, 2010) from GMT (Wessel et al., 2019, 2024), to investigate the performance of PyReX (without numba and without multiprocessing). This is demonstrated by a preliminary comparison of the time taken (50 runs using the bash utility /usr/bin/time) to detect crossovers for satellite altimetric tracks subsampled at different along-track sampling rates over a local study region (40-20°S and 60-80°E). ...
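The N² cost these excerpts discuss comes from enumerating every segment pair between two tracks. A minimal sketch of that baseline (hypothetical helper names, not any of the cited codes), with a cheap bounding-box rejection standing in for the "rough/potential crossover" pre-screening step:

```python
from itertools import product

def bbox_overlap(s1, s2):
    """Axis-aligned bounding boxes of two segments overlap (or touch)."""
    (x1, y1), (x2, y2) = s1
    (x3, y3), (x4, y4) = s2
    return (min(x1, x2) <= max(x3, x4) and min(x3, x4) <= max(x1, x2)
            and min(y1, y2) <= max(y3, y4) and min(y3, y4) <= max(y1, y2))

def count_pairwise_checks(track_a, track_b):
    """Baseline enumeration: every segment of track A against every
    segment of track B, i.e. (N-1)*(M-1) checks for N and M points."""
    segs_a = list(zip(track_a, track_a[1:]))
    segs_b = list(zip(track_b, track_b[1:]))
    checks = candidates = 0
    for sa, sb in product(segs_a, segs_b):
        checks += 1
        if bbox_overlap(sa, sb):   # cheap rejection before the exact test
            candidates += 1
    return checks, candidates
```

On two four-point tracks the enumeration performs 9 checks, of which the box test lets only 5 candidates through to the exact intersection test; the quadratic growth in `checks` is exactly what the recursive and sweep-based methods above are designed to avoid.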
Article
Full-text available
A crossover point is the location of intersection of any two ground tracks charted by multiple platforms (ships, satellite radar and laser altimeters, etc.). Detection of crossovers is of prime importance for estimating discrepancies in the geophysical measurements at the crossover points. The usual approach to crossover detection considers consecutive data points in tracks as segments and checks for intersections between all combinations of these segments. We present a recursion-based crossover detection algorithm in Python (PyReX) for rapid detection of crossovers by avoiding redundant intersection checks. We test the performance of this algorithm using along-track sea surface height measurements from satellite altimeters. We observe that the time taken to flag a crossover between a pair of tracks with N segments each varies as log N, vis-à-vis the N² dependency associated with traditional methods. We further demonstrate that PyReX significantly improves the computation speed for high-frequency along-track measurements from satellite altimeters and ship-borne gravity data compared to existing algorithms. PyReX is a flexible, open-source code that can be easily customized for a variety of applications involving large-scale track-line data sets.
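The recursive idea behind PyReX can be illustrated with a from-scratch sketch (this is not the published PyReX code, and the function names are hypothetical): split the longer track in half and descend only into half-pairs whose bounding boxes overlap, so disjoint stretches of track are rejected wholesale instead of segment by segment.

```python
def seg_intersect(p1, p2, q1, q2):
    """Exact parametric intersection of segments p1->p2 and q1->q2."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    sx, sy = q2[0] - q1[0], q2[1] - q1[1]
    den = rx * sy - ry * sx
    if den == 0:                      # parallel: no unique crossover
        return None
    t = ((q1[0] - p1[0]) * sy - (q1[1] - p1[1]) * sx) / den
    u = ((q1[0] - p1[0]) * ry - (q1[1] - p1[1]) * rx) / den
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return p1[0] + t * rx, p1[1] + t * ry
    return None

def bbox(pts):
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return min(xs), max(xs), min(ys), max(ys)

def boxes_overlap(a, b):
    return a[0] <= b[1] and b[0] <= a[1] and a[2] <= b[3] and b[2] <= a[3]

def find_crossovers(ta, tb, out=None):
    """Divide and conquer: recurse only into half-track pairs whose
    bounding boxes overlap; test single segment pairs exactly."""
    if out is None:
        out = []
    if not boxes_overlap(bbox(ta), bbox(tb)):
        return out                    # disjoint stretches pruned wholesale
    if len(ta) == 2 and len(tb) == 2:
        hit = seg_intersect(ta[0], ta[1], tb[0], tb[1])
        if hit is not None:
            out.append(hit)
        return out
    if len(ta) >= len(tb):
        mid = len(ta) // 2
        find_crossovers(ta[:mid + 1], tb, out)   # halves share the midpoint
        find_crossovers(ta[mid:], tb, out)
    else:
        mid = len(tb) // 2
        find_crossovers(ta, tb[:mid + 1], out)
        find_crossovers(ta, tb[mid:], out)
    return out
```

For two tracks whose segments mostly diverge, most half-pairs fail the box test on the first comparison, which is the source of the sub-quadratic behavior the abstract reports.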
... Moreover, the data set contains other sources of noise that must be removed to reduce inconsistency between different surveys and improve anomaly definition. Crossover errors (COE), which are data (here magnetic anomaly) offsets at ship-track intersections (Wessel, 2010), were used as indicators of the error budget to evaluate the coherency of the data set. Furthermore, the mean and root-mean-square crossover errors (mean COE and RMS COE) are used to estimate the improvement from each correction step in the analysis. ...
... Subsequently, older surveys with large COE were divided into segments, and those segments were merged to the backbone by shifting the magnetic values of a given line segment by a constant value, namely the mean COE of that segment relative to the backbone. The determination of this offset was achieved by use of the "x2sys" package in the GMT suite (Wessel, 2010). Because pre-GPS navigation systems improved with time, later pre-GPS data were analyzed first, expanding the backbone backwards in time as more corrected data were added. ...
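The constant-shift leveling step described here is simple enough to sketch directly (a hypothetical helper, not the x2sys implementation): compute the segment's mean COE against the backbone at its intersections and subtract that constant from the whole segment.

```python
def level_segment(seg_values, crossings):
    """Level a survey segment to a backbone by a constant shift.

    seg_values: the segment's data values (e.g. magnetic anomaly, nT).
    crossings:  (segment_value, backbone_value) pairs at each
                intersection of this segment with the backbone.
    Returns the leveled values and the shift applied (the mean COE)."""
    coes = [s - b for s, b in crossings]
    shift = sum(coes) / len(coes)        # mean COE vs. the backbone
    return [v - shift for v in seg_values], shift
```

After the shift, the segment's mean COE against the backbone is zero by construction; any remaining scatter at the crossovers reflects non-constant error sources (drift, navigation) that a constant cannot absorb.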
Article
Full-text available
Shatsky Rise oceanic plateau was emplaced during a period of frequent geomagnetic polarity reversals, allowing reconstruction of its tectonic evolution using magnetic anomalies. Prior studies mainly focused on identifying magnetic isochrons and encountered difficulties in tracing magnetic lineations over high relief. We compiled a large magnetic data set over Shatsky Rise and its environs, using 5.5 × 10⁶ data points from 96 geophysical cruises spanning 54 years. The long time span and heterogeneity of the component data sets made data merger a challenge. Contributions of internal and external fields, and spurious readings, were removed during processing. A "backbone" method, using recent GPS‐navigated data as a foundation, was developed to improve the coherency of the data set. The singular characteristic of the new magnetic anomaly map is that linear magnetic anomalies are ubiquitous. In nearly every place where data are dense enough to delineate anomaly trends, the plateau and surrounding crust are characterized by linear anomalies. Discordant anomalies in some areas imply complex tectonics related to triple junction migration and ridge reorientation. Tamu Massif apparently formed along a segment of the Pacific‐Farallon spreading ridge that rotated by 90° as a triple junction migrated through the edifice. Ori Massif appears to have formed on the Pacific‐Izanagi ridge between triple junctions. Shirshov Massif contains discordant lineations that may indicate a microplate. The pervasive occurrence of linear magnetic anomalies within Shatsky Rise implies that these volcanic edifices must have formed by spreading analogous to mid‐ocean ridges that formed anomalously thick crust.
... If this approach is adopted by the community, after the initial period of assessing each vessel's different data structure issues, conversion into standard data exchange formats such as MGD77T [Hittleman et al., 2010] via the Generic Mapping Tools [Wessel et al., 2013] mgd77 supplement [Wessel and Chandler, 2007] enables thorough data quality assessment and control using along-track [Wessel, 2008, 2012] and crossover analysis. Crossover analysis is supported in that the mgd77 formats are supported by GMT's crossover analysis toolkit, x2sys [Wessel, 2010], for cases in which sufficient ship-track intersections occur. ...
... Widely used for analyzing survey tracks that intersect with other survey tracks, especially in gravity and magnetic surveys, COE analysis can greatly inform the scientist as to the magnitude and source of errors in geophysical data. We perform COE analysis using the method of Wessel [2010], which involves creation of an x2sys database particular to the project and consisting of all involved survey tracks. At sea, COE analysis would typically be internal only given the preliminary nature of the newly acquired magnetic and gravity anomalies, for example, in order to examine instrument drift. ...
Article
Full-text available
We announce a new and integrated system for planning and executing marine geophysical surveys and for scrutinizing and visualizing incoming shipboard data. The system incorporates free software designed for use by scientists and shipboard operators and pertains to underway geophysics and multibeam sonar surveys. Regarding underway data, a crucial first step in the approach is to reduce and merge incoming center beam depth, gravity, and towed magnetic data with navigation, then reformat to the standard exchange format. We are then able to apply established quality control methods including along‐track and cross‐track analyses to identify error sources and to incrementally build the candidate archive file as new data are acquired. Regarding multibeam data, these are subjected to both an automated error removal scheme for quick visualization and to subsequent ping editing in detail. The candidate archive file and sonar data are automatically and periodically updated and adapted for display in Google Earth, wherein survey planning is also carried out. Data layers are also updated automatically in Google Earth, allowing scientists to focus on visual inspection and interpretation of incoming data. By visualizing underway and sonar data together with reference gravity, magnetic, and bathymetry grids in Google Earth, data familiarity is enhanced and the likelihood of noticing extreme errors increased. We hope scientists will embrace these techniques so that each data set being submitted to a data repository is vetted by the seagoing science party.
... We compiled eight previous sea surface magnetic surveys over the Hess Deep rift valley. The average cross-over error between the data sets was calculated using the x2sys program package (Wessel, 1989; Wessel, 2010). Cross-over errors were reduced from 29.2 to 3.7 nT by using a linear spline interpolator to level each of the track lines (Wessel, 2010). The final compiled magnetic data were then interpolated onto a 0.25-arc-minute spaced grid using the GMT software minimum-curvature algorithm with a tension parameter of 0.25 (Fig. 1C). ...
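The study leveled tracks with a linear spline via x2sys; as a simpler, self-contained illustration of the leveling idea, below is a plain least-squares linear drift correction (hypothetical helper `linear_level`, not the spline interpolation the study used): each track value is corrected by a line a + b·d fitted to that track's crossover misfits as a function of along-track distance d.

```python
def linear_level(track_d, track_v, coe_d, coe_misfit):
    """Least-squares linear drift correction for one track.

    track_d, track_v: along-track distances and observed values.
    coe_d, coe_misfit: along-track distances of this track's crossovers
                       and the misfit attributed to this track there.
    Returns corrected values and the fitted (a, b) of misfit = a + b*d."""
    n = len(coe_d)
    sd = sum(coe_d)
    sm = sum(coe_misfit)
    sdd = sum(d * d for d in coe_d)
    sdm = sum(d * m for d, m in zip(coe_d, coe_misfit))
    b = (n * sdm - sd * sm) / (n * sdd - sd * sd)   # slope: drift rate
    a = (sm - b * sd) / n                           # intercept: static offset
    return [v - (a + b * d) for d, v in zip(track_d, track_v)], (a, b)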
Article
Marine magnetic anomalies are a powerful tool for detecting geomagnetic polarity reversals, lithological boundaries, topographic contrasts, and alteration fronts in the oceanic lithosphere. Our aim here is to detect lithological contacts in fast-spreading lower crust and shallow mantle by characterizing magnetic anomalies and investigating their origins. We conducted a high-resolution, near-bottom, vector magnetic survey of crust exposed in the Hess Deep "tectonic window" using the remotely operated vehicle (ROV) Isis during RRS James Cook cruise JC21 in 2008. Hess Deep is located at the western tip of the propagating rift of the Cocos-Nazca plate boundary near the East Pacific Rise (EPR) (2°15′N, 101°30′W). ROV Isis collected high-resolution bathymetry and near-bottom magnetic data as well as seafloor samples to determine the in situ lithostratigraphy and internal structure of a section of EPR lower crust and mantle exposed on the steep (~20° dipping) south-facing slope just north of the Hess Deep nadir. Ten magnetic profiles were collected up the slope using a three-axis fluxgate magnetometer mounted on ROV Isis. We develop and extend the vertical magnetic profile (VMP) approach of Tivey (1996) by incorporating, for the first time, a three-dimensional vector analysis, leading to what we here term the "vector vertical magnetic profiling" approach. We calculate the source magnetization distribution, the deviation from two-dimensionality, and the strike of magnetic boundaries using both the total field Fourier-transform inversion approach and a modified differential vector magnetic analysis. Overall, coherent, long-wavelength total field anomalies are present, with a strong magnetization contrast between the upper and lower parts of the slope. The total field anomalies indicate a coherently magnetized source at depth.
The upper part of the slope is weakly magnetized and magnetic structure follows the underlying slope morphology, including a “bench” and lobe-shaped steps, imaged by microbathymetry. The lower part of the slope is strongly magnetized, with a gradual reduction in amplitude from east to west across the slope. Surface morphology and recent drilling results indicate that the slope has been affected by mass wasting, but the observation of internally coherent magnetization distributions within the upper and lower slopes suggest that the disturbance is surficial. We attribute the spatial differences in magnetization distribution to the combination of changes in in situ lithology and depth to the source. These survey lines document the first magnetic profiles that capture the gabbro-ultramafic and possibly dike-gabbro boundaries in fast-spreading lower crust.
... These cross-over errors are minimised by levelling the ship-track data (e.g. Wessel, 2010). Shiptrack data can also be merged with satellite data to improve coverage and to help overcome the limitations of satellite altimetry data near the coast (e.g. ...
... We thank Dieter Franke, Carina Kemp, Rob Langford and an anonymous reviewer for their comments that improved the paper. Figures were prepared using the Generic Mapping Tools (Wessel and Smith, 1991, 1998). This paper is published with the permission of the Chief Executive Officer, Geoscience Australia. ...
Article
Offshore frontier sedimentary basins are characterised by a lack of constraining geological and geophysical data. This lack of data is generally the result of deep water (>500 m), difficult geology (volcanics and salt), remoteness and harsh met-ocean conditions. These characteristics present significant challenges to marine surveying, which means that frontier basins tend to be underexplored. With continuing interest in exploration for energy resources in frontier regions, many frontier basins around the world have been the focus of increasingly-sophisticated geophysical studies that integrate a range of methodologies, including those based on potential-field (gravity and magnetic) data. Underexplored frontier basins around Australia’s continental margin have received increased attention during the last decade, largely as a result of government-funded programs of precompetitive data acquisition and analysis. The components of this work that have relied heavily on potential-field data include: first-pass depth-to-basement estimation using spectral techniques applied to magnetic data; enhancement of gravity and magnetic images to aid the identification of basin depocentres and to facilitate onshore–offshore geological interpretation of basement structure; multi-scale edge-detection applied to gravity and magnetic data to aid the interpretation of basement structure; 3D forward and stochastic inverse modelling of gravity data to guide seismic interpretation of sediment thickness and basement structure; and using supercomputers for high-resolution, regional-scale 3D inverse modelling of magnetic and gravity data to constrain the physical properties of the crust. Despite the additional insight offered by this work, efforts to understand frontier basins are not without challenges, one of the most fundamental of which is to ensure that non-specialists are not misinterpreting data (e.g. wrongly interpreting artefacts arising from specific processing). 
The other main challenge in Australian frontier basins arises from a lack of constraints on crustal structure. This leads to significant ambiguity when using gravity data to infer sediment thickness or to understand the nature of basement. This ambiguity could be vastly reduced through the acquisition of seismic refraction data that focuses on imaging crustal structure. Further opportunities exist in using alternative methods for automated depth-to-basement estimation, incorporating process-oriented rather than static potential-field modelling, and in applying 3D forward and inverse gravity and magnetic modelling to other Australian frontier basins.
... In order to evaluate the accuracy of the global single-beam depths, we use the x2sys module of GMT (Generic Mapping Tools) to calculate the ECOE of 5464 single-beam track-lines from NCEI (Wessel 2010). We processed each cruise individually. ...
... However, brute-force enumeration is inefficient; therefore, possible crossover locations should first be confirmed to narrow the range of the enumeration (Wessel, 1989). The x2sys toolkit (Wessel, 2010) was used to calculate the elevation discrepancies at the crossovers. Building on x_system, x2sys uses a new algorithm to locate the crossovers, and it can handle various formats of observational data, such as the netCDF format. ...
Article
Full-text available
The lunar orbiter laser altimeter (LOLA) onboard the lunar reconnaissance orbiter has performed high-precision, full-coverage, and high-density laser ranging observations of the entire lunar surface since its launch. Statistics show that LOLA had collected 6.94 billion effective altimeter points up to June 2022. Most of the typical orbits in the LOLA dataset have high quality, exhibiting horizontal offsets of almost 7 m and radial offsets of almost 0.5 m. However, there is still a category of orbits in the dataset that causes apparent noise in the constructed DEM, attributable to orbits with large or anomalous errors. We call such orbits flawed orbits in this paper. The flawed orbits can be identified and screened by the elevation discrepancy at the crossovers of the orbits. The results show that the flawed orbits are caused by significant along-track errors, which also result in radial errors of up to several kilometers. Moreover, most of the flawed orbits are concentrated in several consecutive time intervals. A correction method is then proposed to correct the flawed orbits in a local region. The positions of the flawed orbits are reconstructed by matching feature points of the DEMs before and after the flawed orbits are removed. Experimental analyses show that the apparent terrain artifacts are eliminated and more identifiable terrain details reappear. Identifying and correcting these flawed orbits with significant along-track offsets paves the way for improving the quality of the LOLA data and reconstructing the topography of the Moon.
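The screening step — flagging orbits whose crossover elevation discrepancies are anomalous — might be sketched as follows (a hedged illustration, not the paper's actual criterion). Using the median absolute discrepancy per orbit keeps one bad partner orbit from contaminating the statistics of the good orbits it crosses:

```python
import statistics
from collections import defaultdict

def flag_flawed_orbits(crossovers, threshold):
    """crossovers: (orbit_a, orbit_b, elevation_discrepancy) triples.
    Flag an orbit when the median |discrepancy| over all of its
    crossovers exceeds the threshold (robust to one bad partner)."""
    per_orbit = defaultdict(list)
    for a, b, d in crossovers:
        per_orbit[a].append(abs(d))
        per_orbit[b].append(abs(d))
    return {orbit for orbit, ds in per_orbit.items()
            if statistics.median(ds) > threshold}
```

Because each discrepancy is shared by two orbits, a mean-based statistic would flag every orbit that happens to cross a flawed one; the median attributes the large discrepancies to the orbit that shows them at all of its crossings.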
... Based on the MGD77/MGD77T tool of GMT, the bathymetric data in MGD77T format were converted into the XYZ ASCII data format [18]. According to the IHO criteria, the accuracy of the DBMs was evaluated; error sources exceeding the limit differences were removed, and invalid bathymetric values (corr_ = null, or ≥ 0 m) were removed, to reduce the influence of gross errors on the accuracy estimation results. ...
Article
Full-text available
High-resolution seafloor topography is important in scientific research and marine engineering in regard to marine resource development and environmental protection monitoring. In this study, multi-dimensional comparisons were made between GEBCO_2022, SRTM15_V2.5.5, SRTM30_PLUS, SYNBATH_V1.0, ETOPO_2022, and topo_25.1 in the South China Sea and surrounding waters (SCS). This study found that ETOPO_2022 has the best overall accuracy and reliability. Based on the results of the model accuracy analysis and by considering the topographic slope, ETOPO_2022, GEBCO_2022, and SRTM15_V2.5.5 were weighted and fused to form a fusion model. 94.80% of the fusion model's errors were concentrated within −100 to 100 m. When compared with GEBCO_2022, SRTM15_V2.5.5, SRTM30_PLUS, SYNBATH_V1.0, ETOPO_2022, and topo_25.1, the RMSE was reduced by 2%, 9%, 62%, 15%, 1%, and 73%, respectively. The slope-based weighted fusion method has been shown to overcome the limitations of a single data source and to provide a reference for timely reconstruction and updating of large-scale seafloor topography.
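The slope-based weighted fusion can be illustrated per grid cell (a minimal sketch under the assumption that each model's RMSE in the cell's local slope class is known; inverse-variance weights are one common choice, not necessarily the exact weights used in the study):

```python
def fuse_depth(values, rmse):
    """Inverse-variance weighted average of co-located depth estimates.

    values: depth estimates for one cell from each model.
    rmse:   each model's RMSE for the cell's local slope class, so a
            model that degrades on steep slopes automatically loses weight."""
    weights = [1.0 / (r * r) for r in rmse]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)
```

A model with twice the RMSE receives a quarter of the weight, so the fused depth tracks whichever model performs best in that terrain class rather than a plain average.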
... Therefore, the geometric intersection method first screens for combinations of lines that may intersect and then tests whether these combinations actually intersect. Implementations of this method include the rapid rejection and straddle test (RST) algorithm (Greene et al., 2017); the scanning-for-intersections algorithm, which has a lower time complexity (Sedgewick, 1990; Wessel, 2010); and the rectangular partitioning algorithm, which is better suited to satellite and aircraft tracks (Wessel, 1989). This paper combines the advantages of the above algorithms and studies how the precise geolocation of crossovers from satellite ground tracks can be derived efficiently and precisely. ...
Article
A crossover refers to the intersection of two satellite ground tracks. Crossovers are required for performing crossover-difference adjustment to remove orbit error and for establishing surface elevation time series, which is why crossovers are important for satellite altimetry measurements of ice sheet or ice shelf elevation changes. The objective of this paper is to extract more crossovers precisely. Building on the traditional method of solving for crossovers and on computer graphics, this paper proposes an improved algorithm, called improved rapid rejection and straddle test (IRST), for computing the position of the crossover point. This algorithm efficiently and accurately searches for the two points in the ascending pass and the two points in the descending pass that form a crossover, and then computes the crossover's geolocation. Using CryoSat-2 satellite altimeter data, we conducted our study on the Antarctic Ross Ice Shelf and Filchner–Ronne Ice Shelf using the fixed iteration (FI) algorithm, the rapid rejection and straddle test (RST) algorithm, and the IRST algorithm. Results show that IRST is superior to the two other algorithms in terms of the number of crossovers, geolocation accuracy and computational efficiency. These advantages are most noticeable in the border areas between the ice shelves and mountains with large terrain slope, addressing the problem that crossovers are scarce in these areas of poor-quality data coverage.
... Once the lake outlines are obtained, we conducted further crossover analysis to generate a higher temporal resolution (shorter than ICESat-2's 91-day repeat cycle) elevation change time series over areas of interest (see Fig. 4.4). We utilized the x2sys_cross package (Wessel, 2010), setting a maximum crossover distance threshold of 250 m, with crossover values linearly interpolated from their actual track points. For each crossover point, an elevation anomaly time series is generated by subtracting the first crossover elevation value (h_0 at t_0) from the crossover elevation at any time (h_n at t_n). ...
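The anomaly construction described here is simple enough to state directly (a sketch with hypothetical names): sort the crossover samples at one location by time and subtract the first elevation h_0 from each elevation h_n.

```python
def anomaly_series(crossings):
    """crossings: (time, elevation) pairs at one crossover location.
    Returns (time, h_n - h_0) sorted by time, so the series starts at 0."""
    crossings = sorted(crossings)
    _, h0 = crossings[0]
    return [(t, h - h0) for t, h in crossings]
```

Referencing everything to the first crossing removes the (unknown) absolute surface height, leaving only the change signal used to detect lake filling and draining.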
Thesis
Full-text available
To narrow uncertainties in the Antarctic ice sheet's contribution to sea level rise, we present a collection of novel machine learning and automated satellite remote sensing methods which use ice surface observations to infer the subglacial nature of Antarctica. A super-resolution deep neural network called DeepBedMap was designed and trained to produce a high-resolution (250 m) bed elevation model of Antarctica called DeepBedMap_DEM that preserves bed roughness details useful for catchment- to continent-scale ice sheet modelling. This DeepBedMap_DEM is compared with a smoother, medium-resolution (500 m) BedMachine topography in a basal inversion experiment over Pine Island Glacier, with results motivating more research into the interacting roles of subglacial hydrology which influences skin drag and high resolution bed topographies which influences form drag. Active subglacial lakes in Antarctica were mapped using an unsupervised density-based classification method on ICESat-2 point cloud data from 2018-2020, yielding 194 active subglacial lakes, including 36 new lakes in the 86-88°S area not detected by the previous ICESat (2003-2009) mission. This thesis showcases both the rich diversity in subglacial landscapes and the dynamic nature of subglacial hydrology in Antarctica, forming a foundation enabling the accurate modelling of overland ice flow in critical regions of the vulnerable West Antarctic Ice Sheet. Plain language summary: Antarctica has a lot of ice, but we're unsure how fast ice can slide into the sea and cause water to go up in beaches around the world. So we teach computers to solve hard math problems that tell us how fast sea water might go up. These computers are fed with lots of pictures taken from cameras up in the sky and space. Ice sits on top of rock in Antarctica, and with practice, the computers get pretty good at telling us how high and bumpy the rock is. 
The rock under the ice appears quite bumpy, and ice probably doesn't like sliding over bumpy rocks since it's rough. Sometimes though, ice may not mind sliding over rough bits of rock if the rock moves along with it, or if water gets in between the rock and ice to make things slippery, but we ask our smart computers to be sure. There are also lasers from space shooting down at earth and bouncing back to tell us how ice in Antarctica is going up or down. Once in a while, they tell us that ice in parts of Antarctica moved up or down a bit too fast. Smart people think these are lakes hiding under the ice, filling up with water or draining, and we found many of these lakes over Antarctica, especially in an area called Whillans Ice Stream on the Siple Coast. We hope that the computers can keep learning faster because there's a lot of pictures showing ice moving pretty fast, and it doesn't look like there's much time before a big chunk of ice might break away in Antarctica and flood beaches around the world. Code availability: Python code for reproducing the methods in this thesis is publicly available at https://github.com/weiji14/deepbedmap for Chapter 2 (DeepBedMap), https://github.com/weiji14/pyissm for Chapter 3 (Basal inversion); and https://github.com/weiji14/deepicedrain for Chapter 4 (ICESat-2 subglacial lakes).
... Magnetic diurnal variation was corrected using data from the Gesashi magnetic observatory on Okinawa-Jima operated by the Geospatial Information Authority of Japan. Crossover error, assumed to be mainly caused by ship magnetization, was minimized by using the Generic Mapping Tools software package x2sys (Wessel, 2010). ...
Article
Full-text available
Offshore northern Ishigaki‐Jima Island, in the southern Okinawa Trough, offers outstanding opportunities to explore the rifting stage of a backarc system. We report the results of integrated marine geological and geophysical surveys with high‐density survey lines in this area. We identify a graben bounded by normal faults and extending approximately 59 km in an ENE‐WSW direction off‐axis of the southern Okinawa Trough. Submarine volcanoes with active hydrothermalism and associated intrusive structures lie in the graben. Magnetic anomaly and seismicity data in and around the graben suggest the presence of relatively shallow magma acting as a heat source. All features identified in and around the graben suggest active rifting in the southern Okinawa Trough.
... We built our magnetic map of the Japan Trench by first considering the absolute PPM data, then including the relative total field computed from the STCM data. The PPM data gathered from different cruises and databases are leveled (for instance using X2SYS, a crossover analytic tool available in GMT; Wessel, 2010). In the next step, the corrected PPM data are used as a reference to tie the STCM surveys at their intersections. ...
Thesis
Full-text available
The purpose of this study is to understand the causes of the decaying seafloor spreading magnetic anomalies on subducting oceanic crust. We investigate the magnetization of the oceanic crust both before and after subduction and extend our initial study area from the Japan-Kuril subduction zone to other subduction zones to try to generalize our observations. Before subduction, a 20% loss of magnetization between the outer-rise and the trench occurs in old seafloor, caused by rejuvenated hydrothermal circulations and alteration of magnetic minerals. Conversely, such a loss of magnetization is not observed for the young seafloor because the flexure remains very limited. After subduction, both exhibit a fast decay of magnetization due to thermal demagnetization of titanomagnetite (Tc:150-350°C) in the extrusive basalt, followed by a much slower one due to thermal demagnetization of magnetite (Tc: 580 °C) in the deeper crust. However, the fast decay is more rapidly achieved in the young seafloor due to differences in the thermal structure. Overall, the magnetic anomalies in subducting oceanic crust decay as an effect of flexure, normal faulting and hydrothermal alteration before subduction, and thermal demagnetization of the different magnetic minerals after subduction. The seawater injected in the oceanic crust before subduction is trapped by the sediment cover after entering subduction and may significantly heat up the slab through thermal blanketing, adding to the thermal gradient and possibly heat released by serpentinization of the mantle wedge. The speed of thermal demagnetization is modulated by the lithospheric thickness, hydration rate, and therefore the age of the seafloor.
... Ship-borne magnetic data along all the tracks (Fig. 2) are further analyzed for crossover errors, as the data have been compiled from different cruises. The crossover errors at each intersection are estimated, and appropriate linear models of systematic corrections for each track are determined using the 'x2sys' package of Wessel (2010). The crossover-corrected data are gridded with the minimum-curvature method, and the resultant magnetic grid is used to generate a magnetic anomaly map of the study area (Fig. 4) for further analysis and interpretation. ...
Article
The Laxmi and Laccadive ridges are two major aseismic ridges in the NW Indian Ocean. Bathymetry, gravity and magnetic data and their derivatives are analyzed and modelled to assess the interrelationship between the two ridges as well as to establish their crustal structure, nature and isostatic compensation. Boundaries of the Laxmi and Laccadive ridges are delineated by tilt derivative with constraints from available seismic sections. Integrated gravity and magnetic models reveal that both ridges have almost similar crustal layers and are underplated all along their lengths. Crustal models also depict that both ridges are carpeted with flood basalt and heavily intruded. 3D coherence between Mantle Bouguer Anomaly and residual bathymetry reveals that the elastic plate thickness and subsurface-to-surface load ratio for both ridges vary from 3 to 4 km and 0.7 to 0.8, respectively. Several characteristic similarities, viz. crustal structure and nature, elastic plate thickness, and magmatism, have been observed for both ridges. Seismic sections near the junction of the ridges suggest that the basement high corresponding to the Laxmi Ridge continues towards the Laccadive Ridge. In addition, the northwestern part of the Laccadive Ridge has NW-SE structural lineaments similar to those of the NW-SE segment of the Laxmi Ridge. Based on results of the present and previous studies, we infer that (i) the Laxmi Ridge extends southeastward towards the Laccadive Ridge, (ii) both ridges are continental slivers which are underplated as well as intruded and carpeted by volcanics, and (iii) both ridges are locally compensated.
... It also provides some constraint on crustal thickness variation. To create the ship-derived FAA, cross-over analysis (Wessel, 2010) was performed on the data acquired along each profile within the NG to remove systematic errors, and a datum shift was applied to equate the ship data to the global satellite-derived FAA (Sandwell et al., 2014 - Fig. 10b). The resulting ship-derived FAA is shown in Fig. 11b. ...
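The datum shift mentioned in this snippet amounts to removing the mean ship-minus-satellite difference at common points so the ship data sit on the satellite datum. A minimal sketch with made-up sample values (the arrays are hypothetical, not the study's data):

```python
import numpy as np

# hypothetical ship and satellite free-air anomalies at the same locations (mGal)
ship = np.array([12.0, 15.5, 9.8, 20.1])
sat = np.array([14.0, 17.4, 11.9, 22.0])

shift = np.mean(sat - ship)     # single constant datum shift
ship_adjusted = ship + shift    # ship data equated to the satellite datum
```

After the shift the residual differences are zero-mean; any remaining signal reflects genuine short-wavelength differences between ship and satellite data.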
Article
3-D tomographic modelling of wide-angle seismic data, recorded at the intermediate-spreading Costa Rica Rift, has revealed a P-wave seismic velocity anomaly low located beneath a small overlapping spreading centre that forms a non-transform discontinuity at the ridge axis. This low velocity zone displays a maximum velocity anomaly relative to the ‘background’ ridge axis crustal structure of ∼0.5 km s^{−1} , has lateral dimensions of ∼10 × 5 km, and extends to depths ≥2.5 km below the seabed, placing it within layer 2 of the oceanic crust. We interpret these observations as representing increased fracturing under enhanced tectonic stress associated with the opening of the overlapping spreading centre, that results in higher upper crustal bulk porosity and permeability. Evidence for ongoing magmatic accretion at the Costa Rica Rift ridge axis takes the form of an axial magma lens beneath the western ridge segment, and observations of hydrothermal plume activity and microearthquakes support the presence of an active fluid circulation system. We propose that fracture pathways associated with the low velocity zone may provide the system through which hydrothermal fluids circulate. These fluids cause rapid cooling of the adjacent ridge axis and any magma accumulations which may be present. The Costa Rica Rift exists at a tipping point between episodic phases of magmatic and tectonically enhanced spreading. The characteristics inherited from each spreading mode have been preserved in the crustal morphology off-axis for the past 7 Myr. Using potential field data, we contextualize our seismic observations of the axial ridge structure at the whole segment scale, and find that the proposed balance between magmatic and tectonically dominated spreading processes observed off-axis may also be apparent along-axis, and that the current larger-scale magma supply system at the Costa Rica Rift may be relatively weak. 
Based on all available geophysical observations, we suggest a model for the inter-relationships between magmatism, faulting and fluid circulation at the Costa Rica Rift across a range of scales, which may also be influenced by large lithosphere scale structural and/or thermal heterogeneity.
... The XO differences are used not only to assess the quality of the campaign results but also to assess the accuracy of the drift estimate. The Generic Mapping Tools' x2sys_cross script (Wessel 2010; Wessel et al. 2013) is used to extract the crossover points both between different campaigns (external crossover points) and within the same campaign (internal crossover points), excluding the repeated measurements in the harbour. ...
Article
Full-text available
In 2017 and 2018 GFZ performed two gravimetry campaigns on commercial ferries in the Baltic Sea. The nature of such “non-dedicated” campaigns is different from “dedicated” campaigns that are performed on research vessels with tracks planned according to gravity measurement needs. The non-dedicated campaigns use non-survey vessels or survey vessels running for other purposes such as hydrographic measurements, which may require additional corrections. To assess the usefulness of non-dedicated campaigns, we analysed gravity measurements collected on two commercial ferries as part of the EU funded FAMOS project. Besides the typical marine gravimetry corrections, we also investigated the corrections for the vertical accelerations due to the ship’s movement and the dynamical effect due to the cross-coupling between horizontal and vertical acceleration components. Taking the latter two corrections into account partly leads to slight improvements, but our results also demonstrate that the standard processing without the two corrections, as used in most of the dedicated campaigns, already delivers good quality end products that fulfil the requirements of a typical marine gravimetry survey with an uncertainty of about 1 mGal. Our findings suggest that gravimetry campaigns on commercial ferries can be used to complement dedicated marine gravimetry campaigns and contribute to geodetic purposes.
... Items requiring attention include adherence to the exchange file specification, adequacy of the archive header, validity of the included geophysical data, presence of extreme errors, and adequacy of data digitization. The archivist ensures that header and data records fully characterize field observations and are sufficiently populated for wide-ranging subsequent use, while data and format errors can be detected using existing methods including along-track (e.g., Chandler and Wessel, 2008, 2012) and across-track (e.g., Wessel, 2010) analyses as well as via data visualization techniques such as the Google Earth-based techniques of Hamilton et al. (2019). ...
Article
Preserving costly marine geophysical trackline data is of paramount importance but variability in priorities, funding, personnel, and technology impact our data archival capacity. We have addressed one crucial facet of this dilemma by devising an open source approach to merge and reduce underway geophysical data and to generate marine geophysical archive files using common command line programs along with the Generic Mapping Tools and its mgd77 supplement. Archive files generated using this approach retain full precision and may be converted automatically to MGD77T, MGD77+, as well as MGD77 formats. We successfully applied the approach to 340 geophysical data sets acquired by R/V Kilo Moana from 2002 to 2018 and in the near term we plan to submit the non-proprietary archive files to the National Centers for Environmental Information’s trackline geophysics archive. We encourage international oceanographic communities to explore our methodology as a larger user-base will strengthen the software and the procedures.
... Many existing cruise data were collected before GPS navigation, which poses a problem because older navigation systems are less accurate. Small navigational offsets are not a substantial issue for our modelling and interpretation efforts, but they do inflate the mean cross-over error (COE; the values of data offsets where ship tracks cross) of the dataset [50]. ...
Article
Full-text available
Tamu Massif is an immense Mesozoic submarine volcano, the main edifice of the Shatsky Rise oceanic plateau. It is located at a spreading ridge triple junction, but considered to be a shield volcano formed by effusive volcanism from an emerging mantle plume. However, it is unclear how Tamu Massif eruptions interacted with the spreading ridges, which are enormous linear volcanoes themselves. Here we create a magnetic anomaly map for Tamu Massif, which can provide clues about crustal formation. For Tamu Massif, we find dominantly linear magnetic field anomalies caused by crustal blocks of opposite magnetic polarity. This pattern suggests that Tamu Massif is not a shield volcano, but was emplaced by voluminous, focused ridge volcanism. If the magma source at the Shatsky Rise was a plume, it was closely connected to and controlled by seafloor spreading. By implication, even the largest oceanic plateau edifices can be formed by seafloor spreading. We suggest that the widely accepted analogy between continental flood basalts and oceanic plateaus requires reconsideration.
... The altimetric data were analyzed with Bash scripts on the basis of the GMT tool box (WESSEL, 2010). Before processing the full data sets, two types of interpolation were compared in order to build DEMs (Digital Elevation Models). ...
Thesis
This thesis is a multi-timescale study of the morphodynamics of a barrier island subjected to extreme forcing from monsoons, tropical storms and typhoons. The study first focuses on an extensive field campaign carried out within the KUN-SHEN project. Seven months of measurements (November 2011 to January 2012 and May to September 2012) yielded a data set unprecedented in the literature, exhaustive and of high temporal and spatial resolution. Hydrodynamic measurements, from offshore to the swash zone, were acquired in parallel with topographic monitoring of the subaerial beach. Wave measurements were acquired from an offshore buoy, two wave-and-current gauges on the shoreface (continuous measurements at 2 Hz), and pressure sensors buried in the subaerial beach (continuous measurements at 5 Hz). Topo-bathymetric surveys were carried out with D-GPS (centimetric resolution) once a week during the monsoon season, and immediately before and after each extreme storm event during the typhoon season. The analysis then targets the morphodynamics of the swash zone and the subaerial beach across spatio-temporal scales ranging from instantaneous to annual. At the instantaneous scale (a few seconds), variations in the free water surface and the sandy bed are observed during each phase (rising, apex, falling) of tropical storm Talim (June 2012, Hs = 10.34 m and Ts = 14.6 s). For the same storm event, the morphological response of the entire subaerial beach is then described and quantified in detail at the event scale (a few days).
The sediment budgets of each season are then quantified in order to characterize the seasonal (a few months) to annual dynamics of a sandy barrier subjected to two different types of extreme forcing (monsoon/typhoons). The impact on the beach front of groups of weak to moderate storms during winter is thus compared with the impact of extreme storms during summer. We finally underline (1) the importance of a high-quality in-situ data set for morphodynamic analysis, (2) the necessity of nesting spatio-temporal scales, and (3) the role of the inherited morphological profile in the response of a subaerial beach subjected to extreme wind and wave forcing.
... Ship-borne magnetic data along all tracks (Fig. 2) are further analyzed for crossover error as the data have been compiled from different cruises. The crossover errors at each intersection are estimated and appropriate linear models of systematic corrections for each track are determined using the x2sys package (integrated with the GMT package) of Wessel (2010). A relative weight for each track is assigned following Hsu (1995): W_i = N / Σ_{j=1}^{N} (D_ij)², where W_i is the weight for the ith track, D_ij is the jth crossover error for the ith track, and N is the total number of crossover points in the ith track. ...
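Assuming the Hsu (1995) track weight is the inverse mean-square crossover error built from the quantities defined in the snippet (an assumption, since the snippet's formula did not survive extraction), it could be computed as:

```python
import numpy as np

def track_weight(coes):
    """Weight W_i = N / sum_j D_ij**2, i.e., the inverse mean-square COE.
    Assumed functional form; the symbol definitions follow the snippet."""
    d = np.asarray(coes, float)
    return len(d) / np.sum(d ** 2)

clean = track_weight([0.5, -0.5, 0.5, -0.5])  # mean-square COE 0.25
noisy = track_weight([2.0, -2.0, 2.0, -2.0])  # mean-square COE 4.0
```

A track with large crossover errors thus receives a proportionally small weight in the subsequent adjustment.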
Article
Full-text available
Angria Bank, a submerged plateau with coral reefs, is located off the central-west coast of India. Biologically, the bank attracts the scientific community for its coral formations. But geologically, the origin and tectonic setting of this feature are not yet established due to scarcity of geological and geophysical data. Newly acquired bathymetry and magnetic data along with existing geophysical data have allowed us to determine the crustal structure and tectonic evolution of the Angria Bank. The entire geological setup of the bank consists of a 34-km-long water plateau (average water depth ~ 20 m) and two prominent spurs extending to its western side. The areas of the water plateau and the entire geological setup are 365 km² and 1460 km², respectively. The bank is characterized by two prominent magnetic lows with a relative high in the middle. Integrated gravity and magnetic models revealed that flood basalt has carpeted the entire geological structure, which later acted as a foundation for coral growth. The crust below the bank is continental in nature and underplated by high-density magmatic material. The Moho is almost flat and lies at a depth of ~ 27 km. The results of 3D Euler deconvolution suggest that the study area is characterized by two types of linear trends, viz. NE–SW to NNE–SSW and NW–SE. These trends are interpreted mostly as basement faults, but at a few places they might be associated with sills/dykes. The geological setup of the bank is fault-bounded and comprises two horst structures (interpreted as spurs) trending NE–SW to NNE–SSW. Integrated interpretation of the geophysical data revealed that the Angria Bank is an isolated feature that evolved during rifting between India and the Seychelles-Laxmi Ridge in the Late Cretaceous.
... In addition, using the "x2sys" package in the GMT suite (Wessel, 2010) to determine offsets, we derived a constant value that could be added or subtracted to make some older cruises match recent high-quality survey data. Two steps were used to grid the data. ...
... We conducted a cross-over error analysis at several stages during the data processing flow using the software described by Wessel (2010). The analysis computes the difference between magnetic anomalies at all track intersections. ...
Article
Full-text available
We present new constraints on the opening of the South Atlantic Ocean from a joint interpretation of marine magnetic anomaly grids and forward modelling of conjugate profiles. We use 45,000 km of recently collected commercial ship track data combined with 561,000 km of publicly available data. The new data cover the critical ocean–continental transition zones and allow us to identify and downgrade some poorly navigated older ship tracks relied upon in earlier compilations. Within the final grids the mean cross-over error is 14 nT computed from 8,227 ship track intersections. The forward modelling used uniformly magnetised bodies whose shapes were constrained from coincident deep-seismic reflection data. We find the oldest magnetic anomalies to date from M10r (134.2 Ma, late Valanginian) north of the Falkland-Agulhas Fracture Zone and M3 (129.3 Ma, Barremian) south of the Rio Grande Fracture Zone. Hence, assuming the GPTS used is correct, continental breakup was contemporaneous with the Paraná and Etendeka continental flood basalts. Many of the landward linear anomalies overlap seismically mapped Seaward Dipping Reflectors (SDRs). We interpret this to mean that a significant portion of the SDRs overlay crust formed by subaerial seafloor spreading. Here crustal accretion is envisaged to be similar to that at mid-ocean ridges, but sheet lava flows (that later form the SDRs) rather than pillow basalts form the extrusive component. Segmentation of the linear anomalies generated implies that this stage of continental breakup is organised and parallels the seafloor spreading centre that follows. Our results call into question the common assumption that at volcanic continental margins the first linear magnetic anomalies represent the start of conventional (submarine) oceanic crustal generation.
... Therefore, if a model of ionospheric and magnetospheric currents (such as CM4) cannot be used, leveling methods are required to reduce the effect of temporal variation. Crossover differences (CODs) are calculated in most leveling methods, and a correction in temporal variation is carried out to reduce the differences by assuming that the variation is a linear, polynomial, or sinusoidal function of time (Yarger et al. 1978;Sander and Mrazek 1982;Mittal 1984;Hsu 1995;Wessel 2010). Here, the author has developed a new leveling method, which consists of the calculation of corrections obtained by adjusting each measurement to a weighted average of its neighboring data, and a time-domain filtering calculation of these corrections. ...
Article
Full-text available
The author has developed a new leveling method for use with magnetic survey data, which consists of adjusting each measurement using the weighted spatial average of its neighboring data and subsequent temporal filtering. There are two key parameters in the method: the 'weight distance' represents the characteristic distance of the weight function and the 'filtering width' represents the full width of the Gaussian filtering function on the time series. This new method was applied to three examples of actual marine survey data. Leveling using optimum values of these two parameters for each example was found to significantly reduce the standard deviations of crossover differences, by one third to one fifth of the values before leveling. The obtained time series of correction values for each example had a good correlation with the magnetic observatory data obtained relatively close to the survey areas, thus validating this new leveling method.
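The two-step leveling the abstract describes, a distance-weighted spatial average controlled by the 'weight distance' followed by Gaussian filtering over time controlled by the 'filtering width', can be sketched as below. This is a simplified stand-in for the author's method: the Gaussian weight function, the sigma convention, and the toy data are all assumptions.

```python
import numpy as np

def leveling_corrections(xy, vals, times, weight_dist, filter_width):
    """Sketch of two-step leveling: (1) move each measurement toward a
    distance-weighted average of its neighbours, (2) Gaussian-filter the
    resulting corrections along the time series."""
    xy = np.asarray(xy, float)
    vals = np.asarray(vals, float)
    times = np.asarray(times, float)
    n = len(vals)
    raw = np.zeros(n)
    for i in range(n):
        d2 = np.sum((xy - xy[i]) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * weight_dist ** 2))
        w[i] = 0.0                      # exclude the point itself
        raw[i] = np.sum(w * vals) / np.sum(w) - vals[i]
    sigma = filter_width / 2.0          # assumed: 'filtering width' = full width
    smooth = np.empty(n)
    for i in range(n):
        g = np.exp(-((times - times[i]) ** 2) / (2.0 * sigma ** 2))
        smooth[i] = np.sum(g * raw) / np.sum(g)
    return smooth

# toy example: a 4-unit bump at the middle of a three-point line
flat = leveling_corrections([[0, 0], [1, 0], [2, 0]], [10, 10, 10], [0, 1, 2], 1.0, 0.1)
bump = leveling_corrections([[0, 0], [1, 0], [2, 0]], [10, 14, 10], [0, 1, 2], 1.0, 0.1)
```

A uniform field yields zero corrections, while the bumped point receives a negative correction pulling it toward its neighbours, which is the qualitative behaviour the method relies on.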
... Magnetic diurnal variation was corrected using data from the Gesashi magnetic observatory on Okinawa-jima, operated by the Geospatial Information Authority of Japan. Crossover error, assumed to be mainly caused by the ship's magnetization, was minimized using the x2sys software package of Generic Mapping Tools (GMT) (Wessel 2010; Bullard and Mason 1961). These corrections reduced the standard deviation at crossover points from 12.2 to 5.1 nT. ...
Article
Full-text available
Detailed bathymetry and magnetic anomalies in the southern part of the Central Ryukyu Arc reveal recent volcanic structures in a southwestward extension of the active volcanic front of the Ryukyu Arc. A line of bathymetric highs running subparallel to this recent volcanic front was observed approximately 20 km to the east. A set of small, sharply defined magnetic anomalies extends southward from this line of bathymetric highs to the islands Kume-jima and Aguni-jima, suggesting the former existence of an ancient volcanic front. The ages of volcanic rocks from these islands indicate that magmatic activity along the ancient volcanic front continued until at least approximately 2.1 Ma. The presence of magnetic anomalies between the two volcanic fronts suggests that the volcanic front has moved gradually westward. This shift can be explained by the termination of asthenospheric upwelling and/or the rapid retreat of the Ryukyu Trench after its change in subduction direction.
... The altimetric data were analyzed with Bash scripts on the basis of the GMT tool box (WESSEL & SMITH, 1998; WESSEL, 2010). Before processing the full data sets, two types of interpolation were compared in order to build DEMs (Digital Elevation Models). ...
Conference Paper
Full-text available
The wave-dominated beach barrier of Villeneuve-Lès-Maguelone extends between Palavas and Frontignan (Gulf of Lions, northernmost Mediterranean Sea), along a protected, largely natural sandy area characterized by typical, well-expressed morphologies. In the nearshore, the sea bottom shows one or two approximately rectilinear sand bars. In 2010-2011, we performed a 6-month-long monitoring (autumn 2010/late spring 2011) of a 500-meter-long segment of this sand barrier. We used RTK D-GPS offering centimeter resolution to acquire sets of elevation data on the beach and the upper shoreface. The study is innovative because a full morphologic monitoring was performed each time the wave/wind conditions changed significantly. These changes were tracked in real time thanks to a set of hydrodynamic equipment deployed at sea, including well-known DREAL buoys. From these well-documented data we identify: (1) a beach reconstruction cycle at the shoreline after a moderate storm event (wave heights of 3-4 m) and (2) how a sand bar in the nearshore migrates to the shoreline and drives the shoreline shift.
... I organize MGD77 files as specified by Wessel and Chandler [2007] and additionally create an X2SYS database [Wessel, 2010] specific to MGD77 data. The X2SYS ...
Thesis
Full-text available
Professor Paul Wessel was responsible for my recruitment and also served as my academic adviser throughout my graduate studies. Dr. Wessel's role in this work and in my education cannot be overstated. His was, without exception, a productive and positive work environment where all forms of inquiry were encouraged and where impediments to progress were always speedily addressed. The ~4,275 e-mails that passed between Dr. Wessel and myself were exceeded only by the number of trackline surveys I reviewed and perhaps by his cumulative consumption of caffeinated beverages. I am a very fortunate and appreciative recipient of Dr. Wessel's outstanding guidance. Research funding was provided by the National Science Foundation, J. Watumull Scholarship, International Association for Mathematical Geosciences, Leonida Family Scholarship, Korea Ocean Research and Development Institute and the University of Hawai`i Graduate Student Organization. I thank Brian Taylor, Fernando Martínez, Kiseong Hyeong and the Hawai`i Mapping Research Group for involving me in numerous research topics and seagoing expeditions. Chapter 2 benefited greatly from collaborative contributions by Dietmar Müller, Maria Seton, Brian Taylor, Seung-Sep Kim and Kiseong Hyeong. I thank William Sager for his encouraging review of Chapter 3. Dan Metzger, John Campagnoli and George Sharman of the National Geophysical Data Center provided exceptional support toward the global trackline data review. I thank Edgar Lobachevskiy, Seung-Sep Kim, Seunghye Lee, Todd Bianco, Kolja Rotzoll and Jonathan Weiss and the rest of my contemporaries for making my time away from the computer so interesting. And last but not least, I thank the world-class faculty and staff of the GG department, to whom I promise to donate should I ever get a job.
... [8] We organize MGD77 files as specified by Wessel and Chandler [2007] and additionally create an X2SYS database [Wessel, 2010] specific to MGD77 data. The X2SYS database allows us to quickly query for tracks within specific regions and/or having certain data columns. ...
Article
Full-text available
We announce the completion of 5,230 errata correction tables which remove extreme, obvious errors from the National Geophysical Data Center's marine geophysical trackline archive. Along with a range of error types correctable using along-track analysis, we determined that ~62% of gravity surveys omit raw measurements and that ~89% of magnetic anomalies are outdated and require recomputing. These errata tables reduce median global crossovers from 27.3 m to 24.0 m (bathymetry), 81.6 nT to 29.6 nT (magnetic anomalies) and 6.0 mGal to 4.4 mGal (free air gravity). We consider these methods gentler and more fundamental than the predominant crossover approach in that along-track corrections affect only data flagged as erroneous by a reviewer; the vast majority of data are left unchanged. Removal of obvious errors is an important initial step that should precede crossover analysis, interpolation, and gridding, thereby strengthening the scientific analysis.
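Along-track error detection of the kind referred to in this abstract often reduces to flagging physically implausible gradients between consecutive samples. A toy spike detector, not the reviewers' actual procedure (the threshold and data are hypothetical):

```python
import numpy as np

def flag_spikes(dist, vals, max_gradient):
    """Flag a sample as a spike when the along-track gradients on both
    sides exceed max_gradient with opposite signs."""
    grad = np.diff(vals) / np.diff(dist)
    flags = np.zeros(len(vals), dtype=bool)
    for i in range(1, len(vals) - 1):
        if (abs(grad[i - 1]) > max_gradient and abs(grad[i]) > max_gradient
                and np.sign(grad[i - 1]) != np.sign(grad[i])):
            flags[i] = True
    return flags

# a lone 50-unit spike in an otherwise flat along-track record
flags = flag_spikes([0.0, 1.0, 2.0, 3.0, 4.0], [0.0, 0.0, 50.0, 0.0, 0.0], 10.0)
```

Only the flagged samples would be marked in an errata table; the surrounding data remain untouched, matching the "gentler" philosophy described above.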
Article
Several satellite gravity anomaly models are freely available to calculate the free‐air gravity anomaly in areas where shipborne gravity measurements are scarce. Two models, produced by the Technical University of Denmark (DTU17) and the Scripps Institution of Oceanography (SIOv32.1), respectively, were selected to compute the free‐air anomalies over the Cosmonaut Sea, East Antarctica. A statistical comparison was performed to evaluate the resolution of the satellite gravity anomaly models against the shipborne survey data. The radially averaged energy spectra of the free‐air anomaly from different sources were calculated and compared over two selected regions to further evaluate the reliability of the data derived from the satellite gravity anomaly models. The satellite gravity anomaly models have better resolution in the ocean basin than in the area near the continental shelf. The comparison revealed that the precision of both DTU17 and SIOv32.1 is close to that of the shipborne gravity data, but on average SIOv32.1 is slightly better than DTU17. The spectral analysis showed that the shipborne measurements may provide higher resolution than the satellite gravity anomaly models at wavelengths shorter than 20 km, and the free‐air data derived from SIOv32.1 have better resolution than those from DTU17. These shipborne datasets will contribute to updates of the Antarctic gravity anomaly and enable new high‐resolution combined Earth gravity models to be derived in Antarctica.
Article
The Generic Mapping Tools (GMT) is one of the most used toolsets in the Earth, Ocean, and Planetary sciences, originating as far back as the 1980s. It is an early example of an open‐source software code modeled after contemporaneous UNIX tools, and it was one of the first to employ PostScript as its graphics language and netCDF for binary files to ensure portability across different computing platforms. Here I trace the origin and evolution of GMT to the present day. The additions of MATLAB, Python, and Julia wrappers around the GMT C Application Program Interface (API) are now introducing GMT to numerous new and younger users and the platform shows no sign of diminishing after almost 40 years; in fact, usage continues to expand. Pursuing GMT for fun (and funding) has positively affected other areas of my scientific interests, and my new research modules continue to be added to GMT. The future holds many promises but will require formation and leadership of communities to steer and maintain the essential science tools that have served us well for many decades.
Article
Geomagnetic surveys were conducted to make a marine geophysical map of the northern part of the Tokara Islands. A total magnetic anomaly map was made based on the observed total magnetic field. In addition, a magnetic anomaly map of the whole Tokara Islands was made together with the total magnetic anomaly calculated from the vector magnetic surveys obtained in the southern Tokara Islands in the previous fiscal year. Magnetic dipole anomalies are observed around the island arc area and several bathymetric highs, presumed to be due to volcanic activity. Geomagnetic and published gravity features suggest that the north-south trending ridge on the western side of the survey area forms the eastern edge of the Okinawa Trough, which hosts igneous activity. A positive magnetic anomaly is observed in part of this topographic ridge, presumed to be due to magnetization caused by surface volcanic activity or a deep-seated magnetic body. On the trough, a positive magnetic anomaly without any corresponding bathymetric feature is observed. Based on the published regional magnetic map, this anomaly is considered part of a magnetic dipole anomaly, suggesting subseafloor magmatic activity.
Article
Full-text available
A typical problem for magnetic surveys with small Unmanned Aerial Systems (sUAS) is the heading error caused by undesired magnetic signals that originate from the aircraft. This can be addressed by suspending the magnetometers on sufficiently long tethers. However, tethered payloads require skilled pilots and are difficult to fly safely. Alternatively, the magnetometer can be fixed on the aircraft. In this case, aircraft magnetic signals are removed from the recordings with a process referred to as magnetic compensation, which requires parameters estimated from calibration flights flown, prior to the survey, in an area with low magnetic gradients. We present open-source software fully written in Python to process data and compute compensations for two fundamentally different magnetometer systems (scalar and vector). We used the software to compare the precision of two commercially available systems by flying dense grid patterns over a 135 × 150 m area using different suspension configurations. The accuracy of the magnetic recordings is assessed using both standard deviations over the calibration pattern and tie-line cross-over differences from the survey. After compensation, the vector magnetometer provides the lowest heading error. However, the magnetic field intensity recovered with this system is relative and needs to be adjusted with absolute data if absolute intensity values are needed. Overall, the highest accuracy of all suspension configurations tested was obtained by fixing the magnetometer 0.5 m below the sUAS on a self-built carbon-fiber frame, which also offered greater stability and allowed fully autonomous flights in a wide range of conditions.
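Magnetic compensation as described above fits aircraft-field terms from a calibration flight. The following is a drastically reduced stand-in for a full Tolles-Lawson model, keeping only a constant plus two heading-dependent terms; the model form, field values and calibration pattern are assumptions, not the authors' software:

```python
import numpy as np

def fit_heading_terms(headings_deg, measured):
    """Least-squares fit of measured = B0 + cx*cos(h) + cy*sin(h)."""
    h = np.radians(np.asarray(headings_deg, float))
    A = np.column_stack([np.ones_like(h), np.cos(h), np.sin(h)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(measured, float), rcond=None)
    return coef  # B0, cx, cy

# synthetic calibration pattern: 50000 nT ambient field, 30 nT heading error
h = np.arange(0.0, 360.0, 45.0)
m = 50000.0 + 30.0 * np.cos(np.radians(h))
B0, cx, cy = fit_heading_terms(h, m)
compensated = m - cx * np.cos(np.radians(h)) - cy * np.sin(np.radians(h))
```

Subtracting the fitted heading terms leaves a near-constant field, which is the reduction in heading error that the standard deviation over the calibration pattern measures.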
Article
Full-text available
Marine magnetic data play an important role in understanding the physical properties of oceanic crust and tectonic movements. To collect the data and produce high-resolution magnetic anomaly maps, two different magnetometers are used: the Proton Precession Magnetometer (PPM), towed behind the ship and measuring the absolute intensity of the magnetic field, and the Shipboard Three-Component Magnetometer (STCM), installed at the center of the vessel and measuring relative variations of the three magnetic components. PPM data are generally preferred for compilations because they offer high sensitivity and require simple data processing. However, because magnetic data are scarce relative to the vast area of the ocean, STCM measurements are still needed to fill the gaps despite the complicated corrections involved. After applying the crossover error analysis suggested by Wessel (2010), we found systematic errors showing a linear trend in most of the STCM tracks. In this study, we correct these errors and improve the results by applying a new crossover error analysis method based on linear regression. These errors generally originate from false corrections or viscous magnetization of the vessel. In the future, numerical model analysis is needed for improved correction and for efficient data acquisition and use.
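The linear-regression crossover correction this abstract proposes can be illustrated by fitting a straight-line drift to one track's crossover errors as a function of position along the track (a generic least-squares sketch with synthetic values, not the authors' code):

```python
import numpy as np

def fit_linear_drift(along, coe):
    """Least-squares fit COE = a + b * along; (a, b) define the linear
    correction to subtract from the track."""
    A = np.column_stack([np.ones_like(along), along])
    (a, b), *_ = np.linalg.lstsq(A, coe, rcond=None)
    return a, b

# synthetic STCM-style drift: crossover errors grow linearly along the track
along = np.array([0.0, 1.0, 2.0, 3.0])
coe = 2.0 + 0.5 * along
a, b = fit_linear_drift(along, coe)
residual = coe - (a + b * along)
```

Removing the fitted trend a + b*along leaves zero residual crossover error for this idealized track; real tracks retain a random component.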
Article
To identify the causes of complex magnetic anomalies on the three guyots located in the northeastern part of the Ulleung Basin, East Sea, a linear least-squares inversion method was implemented. Modeling under the assumption that the guyots are uniformly magnetized did not yield reliable results. However, an inhomogeneously-magnetized model shows relatively high goodness-of-fit parameters and low residual magnetic anomalies, indicating that these guyots are considerably nonuniformly magnetized. Our modeling results show that all three guyots have similar magnetic patterns. Magnetic anomalies north of the guyots have polarities different from those at similar depths to the south, indicating that these anomalies were formed at different times. These results suggest that the relatively short-wavelength, complicated magnetic anomalies in the study area are the compound result of different magnetic polarities. Paleomagnetic analysis of trachyandesite lavas and pyroclastic breccia indicates thermo-chemical remagnetization after the Réunion Subchron (2.138–2.122 Ma). This result suggests that remagnetization also contributes to the complex magnetic anomalies to some extent. In addition, upward continuation to 1500 m using an FFT routine shows high positive magnetic anomalies with a linearly segmented shape at the base of the guyots. These linear, positive magnetic anomalies are interpreted as volcanic eruptions along fissures, because mafic lava should have much higher magnetic anomalies than the continental crust on which the guyots are located. Therefore, we interpret the complex magnetic anomalies in the study area as the result of polarity reversals occurring at least twice, formed during the construction of the current guyots by fissure-fed volcanic eruptions.
Article
An understanding of ocean bathymetry is important in marine planning, navigation, military activities, and environmental monitoring. The fusion of spatial data, such as those from multi-source digital bathymetric models (DBMs), can effectively improve the efficiency of large-scale seafloor topographical research and the accuracy of the results obtained therein. On the one hand, with the release of 15 arc-second resolution DBMs, it is now necessary to verify the quality of these new products. On the other hand, in order to generate a new, high-quality, seamless DBM in the South China Sea (SCS) and adjacent areas, an adaptive subregional spatial-domain-weighted fusion framework is proposed. First, seven of the most widely used global DBMs are selected, and multi-source subregional measured depth-sounding data undergo data cleaning and other preprocessing. Next, based on the homogeneity of the terrain features, an adaptive subregional topographical analysis is performed, and the subregional data are weight-fused. Finally, the fusion dataset is post-processed via model smoothing and other procedures. In addition, the advantages and limitations of the DBMs of the SCS are compared. The results show that SRTM15_PLUS V2 is the most reliable of the original DBMs. The updated seamless SCS DBM is void-free, has a resolution of 15 arc-seconds, and is most similar to SRTM15_PLUS V2. The root mean square error (RMSE) of the new model is 99.60 m. Its accuracy is 13%, 40%, 15%, and 1% higher than those achieved by the GEBCO_2019, GEBCO_2014, SRTM30_PLUS, and SRTM15_PLUS models, respectively, and its expression of the topography is more detailed and realistic. The feasibility and limitations of the proposed fusion framework are demonstrated. The present findings provide a useful reference for the timely reconstruction and updating of large-scale seafloor topography from multiple datasets.
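The per-cell weighted-fusion step can be sketched with inverse-variance weights derived from each model's RMSE. The weighting scheme and toy grids below are illustrative assumptions, not the paper's exact framework:

```python
import numpy as np

def weighted_fuse(grids, rmse):
    """Fuse co-registered depth grids using inverse-variance weights
    (weight proportional to 1/RMSE^2, normalized to sum to one)."""
    grids = np.asarray(grids, dtype=float)
    w = 1.0 / np.asarray(rmse, dtype=float) ** 2
    w = w / w.sum()
    # Contract the weight vector against the stack of grids.
    return np.tensordot(w, grids, axes=1)

# Two toy 2x2 "DBMs" of the same area; the one with the lower RMSE
# (hence higher weight) pulls the fused depth toward itself.
g1 = np.array([[100.0, 200.0], [300.0, 400.0]])
g2 = np.array([[110.0, 190.0], [310.0, 390.0]])
fused = weighted_fuse([g1, g2], rmse=[50.0, 100.0])
```

Here g1 receives weight 0.8 and g2 weight 0.2, so the fused cell values sit four times closer to g1.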
Article
The Japanese asteroid explorer Hayabusa-2 will be launched in the mid-2010s to return samples from C-type near-Earth asteroid 1999JU3. During the rendezvous phase (i.e. proximity operation phase), we will conduct scientific observations to estimate physical parameters (e.g. gravity field, shape, pole direction, spin-rate, ephemeris) of the target body, which are crucial not only for its scientific investigation but also for spacecraft navigation. In particular, the mass is essential to perform a stable touchdown sequence to collect samples from the asteroid's surface. We will attempt to estimate the gravity field of the target body using Earth-based radiometric tracking measurements (2-way Doppler and range) and spacecraft-based measurements (information from optical navigation cameras and laser altimeter) using a global parameter estimation technique. As the first step for gravity field estimation, we performed a simulation study on mass estimation under a simple configuration and evaluated the relation between the quality and quantity of measurements and the accuracies of the estimation results. Subsequently, the detectability of the low degree and order gravity field coefficients was also studied. We will also present a method for ephemeris improvement of 1999JU3 using spacecraft relative position data and radiometric tracking measurements.
Article
We document a new high-resolution multibeam bathymetry compilation for the Canada Basin and Chukchi Borderland in the Arctic Ocean–United States Arctic Multibeam Compilation (USAMBC Version 1.0). The compilation preserves the highest native resolution of the bathymetric data, allowing for more detailed interpretation of seafloor morphology than has been previously possible. The compilation was created from multibeam bathymetry data available through openly accessible government and academic repositories. Much of the new data was collected during dedicated mapping cruises in support of the United States effort to map extended continental shelf regions beyond the 200 nm Exclusive Economic Zone. Data quality was evaluated using nadir-beam crossover-error statistics, making it possible to assess the precision of multibeam depth soundings collected from a wide range of vessels and sonar systems. Data were compiled into a single high-resolution grid through a vertical stacking method, preserving the highest quality data source in any specific grid cell. The crossover-error analysis and method of data compilation can be applied to other multi-source multibeam datasets, and is particularly useful for government agencies targeting extended continental shelf regions but with limited hydrographic capabilities. Both the gridded compilation and an easily-distributed PDF map are freely available through the University of New Hampshire's Center for Coastal and Ocean Mapping (ccom.unh.edu/theme/law-sea). The PDF is a full-resolution, small file-size product that supports interpretation of Arctic seafloor morphology without the need for specialized gridding/visualization software.
Article
The plate tectonic revolution of the 1960s and 1970s is said to mark the Earth Sciences' transition from data-driven discovery to hypothesis testing. This is largely the case in marine geoscience as modern research expeditions focus on isolated study areas rather than globe-spanning surveys typical of the past. Although the onus among scientists is generally to explore new problems by gathering new sets of data, I contend that we have not yet fully digested existing data sets. During my doctoral studies, I engaged in research that examined large amounts of previously collected data. I utilized paleolatitude measurements in my attempts to constrain the past movements of the Ontong Java, Manihiki and Hikurangi oceanic plateaus. Through my resultant familiarity, I was able to discover a pattern within the paleolatitudes that suggested significant rotation of the plateaus. This rotation may explain why Ontong Java's paleo-pole does not agree with other coeval Pacific paleo-poles and with the Pacific apparent polar wander path in general. This inference further implies that Ontong Java may have been decoupled from the Pacific plate during the past or that, speculatively, the entire Pacific plate was rotated by ~30°–50° to coincide with Ontong Java's paleo-orientation. I further immersed myself in the entirety of the National Geophysical Data Center's marine geophysical trackline archive in an effort to identify and correct large-scale and systematic errors in marine gravity, magnetic, and single/center beam depth measurements. I produced 5,203 "E77" correction tables pertaining to along-track analysis of each of the archived surveys. Initial inspection of discrepancies at intersecting tracks indicates improvements in median crossover errors from 27.3 m to 24.0 m, 6.0 mGal to 4.4 mGal, and 81.6 nT to 29.6 nT for depths, free air gravity anomalies, and residual magnetic anomalies, respectively.
Article
Full-text available
Version 3.1 of the Generic Mapping Tools (GMT) has been released. More than 6000 scientists worldwide are currently using this free, public domain collection of UNIX tools that contains programs serving a variety of research functions. GMT allows users to manipulate (x,y) and (x,y,z) data, and generate PostScript illustrations, including simple x-y diagrams, contour maps, color images, and artificially illuminated, perspective, and/or shaded-relief plots using a variety of map projections (see Wessel and Smith [1991] and Wessel and Smith [1995], for details.). GMT has been installed under UNIX on most types of workstations and both IBM-compatible and Macintosh personal computers.
Article
Full-text available
We have examined 4918 track line geophysics cruises archived at the U.S. National Geophysical Data Center (NGDC) using comprehensive error checking methods. Each cruise was checked for observation outliers, excessive gradients, metadata consistency, and general agreement with satellite altimetry-derived gravity and predicted bathymetry grids. Thresholds for error checking were determined empirically through inspection of histograms for all geophysical values, gradients, and differences with gridded data sampled along ship tracks. Robust regression was used to detect systematic scale and offset errors found by comparing ship bathymetry and free-air anomalies to the corresponding values from global grids. We found many recurring error types in the NGDC archive, including poor navigation, inappropriately scaled or offset data, excessive gradients, and extended offsets in depth and gravity when compared to global grids. While ~5-10% of bathymetry and free-air gravity records fail our conservative tests, residual magnetic errors may exceed twice this proportion. These errors hinder the effective use of the data and may lead to mistakes in interpretation. To enable the removal of gross errors without over-writing original cruise data, we developed an errata system that concisely reports all errors encountered in a cruise. With such errata files, scientists may share cruise corrections, thereby preventing redundant processing. We have implemented these quality control methods in the modified MGD77 supplement to the Generic Mapping Tools software suite.
Article
Full-text available
The global 5-arcmin gridded topography data ETOPO-5 are based on contour maps rather than original soundings and have large errors. The artificial statistical distribution generated by digitizing contours makes these data unsuitable for use in regression models for depth-age variations. Their amplitude spectrum is bounded by a (frequency)⁻⁴ power law, so that they should not be used when the gravity-topography transfer function is important, such as in studies of flexural isostasy and mantle convection. I assess the accuracy of 14,491,069 digital ship soundings in 2253 cruise surveys collected between 1955 and 1992 in the Lamont-Doherty Earth Observatory on-line data base by analyzing 329,058 crossover errors (COEs) at intersecting ship tracks. Five percent of cruises with internal COEs yield root-mean-square COE amplitudes exceeding 500 m; all of these have errors in digitizing two-way travel time from analog precision depth recorder traces. Twenty-eight cruises were found which had errors caused by misinterpretation of the nominal sound velocity used when travel times were reported as nominal depths. Two nominal sound velocities in common use differ by 2.5%, an amount which is often undetectable, producing uncertainties in depth of this magnitude. Ship data have been acquired at different rates over time, with the peak of activity in the early 1970s. Although present technologies can yield very accurate data, these are acquired at a rate which is small with respect to the total available data; the cumulative median global COE has remained constant at 26 m since the late 1970s. Most recent data acquisition has been in the northern hemisphere oceans, and the oldest and least accurate data are in the southern oceans where median COEs are 100–250 m. The majority of the data in the South Pacific were acquired before the advent of satellite navigation.
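The 2.5% nominal-sound-velocity ambiguity can be illustrated numerically. The two specific speeds used below (1500 m/s versus 800 fathoms/s ≈ 1463.04 m/s) are an assumption consistent with the stated 2.5% difference, not values quoted in this abstract:

```python
# Nominal depth = (one-way travel time) * v / 2, so a depth reported
# under one nominal sound speed rescales linearly to another.
V_METRIC = 1500.0             # m/s, one common nominal speed (assumed)
V_FATHOM = 800 * 1.8288       # 800 fathoms/s expressed in m/s = 1463.04

def rescale_depth(depth, v_reported, v_true):
    """Convert a nominal depth from one assumed sound speed to another."""
    return depth * (v_true / v_reported)

# A 4000 m nominal depth misinterpreted by ~2.5%:
d = rescale_depth(4000.0, V_METRIC, V_FATHOM)
```

For a 4000 m sounding the ambiguity amounts to roughly 100 m, comparable to the worst median COEs quoted for the southern oceans.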
Article
Full-text available
We have developed a new software package, called MB-System, for processing and display of Hydrosweep DS multibeam data on the R/V Maurice Ewing. The new software includes tools for modeling water sound velocity profiles, calculating multibeam bathymetry from travel time values by raytracing through a water sound velocity profile, interactive and automatic editing of multibeam bathymetry, as well as a variety of tools for the manipulation and display of multibeam data. A modular input/output library allows MB-System programs to access and manipulate data in any of a number of supported swath-mapping sonar data formats, including data collected on Hydrosweep DS, Sea-Beam “Classic”, SeaBeam 2000, SeaBeam 2100, H-MR1, Simrad EM12, and other sonars. Examples are presented of the software's application to Hydrosweep data recently collected on the R/V Maurice Ewing.
Article
Full-text available
The accuracy of Lamont-Doherty Geological Observatory's global marine gravity data bank has been assessed by examining the crossover errors (COEs) at intersecting ship tracks. More than 63,000 COEs were found, having a standard deviation of 22.43 mGal. The COEs are used to find and remove linear drifts and DC shifts present in the data set. This adjustment reduces the standard deviation to 13.96 mGal. COEs generally decrease with latitude, which is attributed to uncertainties in the Eotvos correction. High COEs occur in areas of high gravity gradients. These two features point to poor navigation as the principal source of error in marine gravity surveys. COEs have generally been decreasing during the last two decades, reflecting improvements in instrumentation and quality of navigation. A comparison of the shipboard gravity data to Seasat derived gravity data revealed a 9-mGal bias in the terrestrial data, which probably reflects uncertainties in the choice of reference field. The adjusted data set was used to generate a gravimetric geoid for the NW Atlantic Ocean. By removing this geoid from the Seasat sea surface heights, a residual 'geoid' was obtained. A special feature of this map is an elongate ENE trend that appears to correlate with the edge of the Gulf Stream.
Article
Full-text available
Three approaches are used to reduce the error in the satellite-derived marine gravity anomalies. First, we have retracked the raw waveforms from the ERS-1 and Geosat/GM missions resulting in improvements in range precision of 40% and 27%, respectively. Second, we have used the recently published EGM2008 global gravity model as a reference field to provide a seamless gravity transition from land to ocean. Third, we have used a biharmonic spline interpolation method to construct residual vertical deflection grids. Comparisons between shipboard gravity and the global gravity grid show errors ranging from 2.0 mGal in the Gulf of Mexico to 4.0 mGal in areas with rugged seafloor topography. The largest errors of up to 20 mGal occur on the crests of narrow large seamounts. The global spreading ridges are well resolved and show variations in ridge axis morphology and segmentation with spreading rate. For rates less than about 60 mm/a the typical ridge segment is 50–80 km long while it increases dramatically at higher rates (100–1000 km). This transition spreading rate of 60 mm/a also marks the transition from axial valley to axial high. We speculate that a single mechanism controls both transitions; candidates include both lithospheric and asthenospheric processes.
Article
Full-text available
A gridding method commonly called minimum curvature is widely used in the earth sciences. The method interpolates the data to be gridded with a surface having continuous second derivatives and minimal total squared curvature. The minimum-curvature surface has an analogy in elastic plate flexure and approximates the shape adopted by a thin plate flexed to pass through the data points. Minimum-curvature surfaces may have large oscillations and extraneous inflection points which make them unsuitable for gridding in many of the applications where they are commonly used. These extraneous inflection points can be eliminated by adding tension to the elastic-plate flexure equation. It is straightforward to generalize minimum-curvature gridding algorithms to include a tension parameter; the same system of equations must be solved in either case and only the relative weights of the coefficients change. Therefore, solutions under tension require no more computational effort than minimum-curvature solutions.
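In the elastic-plate analogy described above, adding a tension parameter modifies the biharmonic minimum-curvature equation roughly as follows (the notation here is mine, with T the tension parameter):

```latex
% Minimum-curvature surface:
\nabla^2\!\left(\nabla^2 z\right) = 0
% Surface under tension, 0 <= T <= 1:
(1 - T)\,\nabla^2\!\left(\nabla^2 z\right) - T\,\nabla^2 z = 0
```

Setting T = 0 recovers pure minimum curvature, while T → 1 approaches a harmonic (Laplace) surface with no extraneous inflection points; intermediate values trade smoothness against overshoot, and only the relative weights of the finite-difference coefficients change.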
Article
Full-text available
When creating camera‐ready figures, most scientists are familiar with the sequence of raw data → processing → final illustration and with the spending of large sums of money to finalize papers for submission to scientific journals, prepare proposals, and create overheads and slides for various presentations. This process can be tedious and is often done manually, since available commercial or in‐house software usually can do only part of the job. To expedite this process, we introduce the Generic Mapping Tools (GMT), which is a free, public domain software package that can be used to manipulate columns of tabular data, time series, and gridded data sets and to display these data in a variety of forms ranging from simple x‐y plots to maps and color, perspective, and shaded‐relief illustrations. GMT uses the PostScript page description language, which can create arbitrarily complex images in gray tones or 24‐bit true color by superimposing multiple plot files. Line drawings, bitmapped images, and text can be easily combined in one illustration. PostScript plot files are device‐independent, meaning the same file can be printed at 300 dots per inch (dpi) on an ordinary laserwriter or at 2470 dpi on a phototypesetter when ultimate quality is needed. GMT software is written as a set of UNIX tools and is totally self contained and fully documented. The system is offered free of charge to federal agencies and nonprofit educational organizations worldwide and is distributed over the computer network Internet.
Article
A detailed map of the Caribbean region is nearing completion as part of the magnetic map of North America project. A new total intensity magnetic anomaly map of the Colombian Basin has been prepared from the available magnetic data. The magnetic field through much of this basin is characterized by low amplitude anomalies (200-400 nT), hence crossover errors have a more noticeable effect. The paper discusses how the cross-tie errors were minimized using detailed crossover analysis of the data set. -from Authors
Article
A relatively simple method of diurnal drift removal from aeromagnetic data which does not require the use of a base station is presented. The underlying assumption is that diurnal drift during flight is a smoothly varying low-order polynomial in time. The polynomial coefficients are determined by minimizing residuals at flight-line/tie-line intersections using least squares. This procedure is applied to a conventional sensitivity aeromagnetic survey in northeastern Kansas. The results of the drift determinations compare favorably with independent knowledge of actual drifts such as the magnetic K indices and other measurements.
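A toy version of the least-squares drift fit described above, using a single low-order polynomial in time for the whole survey (a simplification of the per-line formulation; note the constant term cancels at intersections and is therefore unobservable from misties):

```python
import numpy as np

def fit_drift_poly(t1, t2, mistie, order=2):
    """Fit drift d(t) = sum_k c_k t^k (k >= 1) by least squares on the
    crossing misties, mistie = d(t1) - d(t2)."""
    powers = np.arange(1, order + 1)
    A = t1[:, None] ** powers - t2[:, None] ** powers
    c, *_ = np.linalg.lstsq(A, mistie, rcond=None)
    return c

# Synthetic intersections: true drift 0.5*t + 0.02*t^2 (nT, t in hours),
# observed only as differences at 40 flight-line/tie-line crossings.
rng = np.random.default_rng(1)
t1 = rng.uniform(0.0, 8.0, 40)
t2 = rng.uniform(0.0, 8.0, 40)
true_drift = lambda t: 0.5 * t + 0.02 * t**2
mistie = true_drift(t1) - true_drift(t2) + rng.normal(0.0, 0.05, 40)

c = fit_drift_poly(t1, t2, mistie)
```

Subtracting the fitted polynomial from each line then prorates the intersection discrepancies smoothly in time, as the abstract describes.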
Article
A method is presented which does not require a model for the source of crossover errors in marine gravity data in order to minimize them. The cruises are divided up into straight line segments and the assumption is made that whatever the sources of error, their net effect will be constant over the length of the track segment. A least‐squares approach is used where the crossover differences in the original data are the observations which it is desired to match. The desired set of constant corrections, one for each segment, is that which will minimize the sum of the squares of the residual crossover errors. This method has the advantage of reducing the crossover errors while simultaneously preserving the relative gravity anomalies along individual ship’s profiles. A data set consisting of gravity measurements made on nine cruises in the region of the Vema fracture zone in the equatorial Atlantic is used as a case study. The resulting least squares solution reduces the root mean square (rms) of 298 crossover...
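The per-segment constant-correction idea can be sketched as a small linear least-squares problem. A gauge constraint is needed because a common constant added to all segments is unobservable from crossover differences; the setup below is my own minimal version, not the paper's implementation:

```python
import numpy as np

def segment_corrections(pairs, diffs, n_seg):
    """Solve c_i - c_j = diff_ij in a least-squares sense, pinning
    mean(c) = 0 so the system has a unique solution."""
    rows = []
    for (i, j) in pairs:
        r = np.zeros(n_seg)
        r[i], r[j] = 1.0, -1.0
        rows.append(r)
    A = np.vstack(rows + [np.ones(n_seg)])   # last row: gauge constraint
    b = np.concatenate([diffs, [0.0]])
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return c

# Three straight-line segments with true constant offsets [2, -1, -1];
# the observed crossover differences determine them up to a constant.
pairs = [(0, 1), (0, 2), (1, 2)]
true_c = np.array([2.0, -1.0, -1.0])
diffs = np.array([true_c[i] - true_c[j] for i, j in pairs])
c = segment_corrections(pairs, diffs, 3)
```

Because each correction is constant along a segment, subtracting it removes crossover discrepancies while leaving the relative anomalies within each profile untouched, which is the stated advantage of the method.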
Article
A quantitative analysis method is developed to objectively identify significant bathymetric changes between multibeam sonar surveys of the same seafloor area. The technique involves mathematical gridding of bathymetric surveys from different years, co-registration of grids based on prominent features covered in both surveys, calculation of slope-weighted depth difference grids, and determination of the boundaries of statistically significant differences. The resulting difference grids are used in interpreting geological processes and calculating the volume of new material added to the seafloor. The method is applied to data from the southern Juan de Fuca Ridge, where a seafloor volcanic eruption of 0.05 km³ is inferred to have occurred between 1983 and 1987. Camera-tow data and submersible observations provide ground truth which confirms the validity of the quantitative analysis method, and indicates empirically that the detection limit of the method allows new seafloor landforms measuring 1×10⁵ m² in area (or about 200-300 m in diameter) and 5-15 m in thickness to be detected through repeated Sea Beam bathymetric surveys.
Article
Cross-over errors (XOEs) from track data may give rise to pseudostructures in a contoured map. Such pitfalls can be reduced by adjusting all the measured values. FORTRAN programs are presented to determine automatically cross-over points (XOPs) and XOEs, to adjust XOEs to zero and to correct the data proportionally between every two XOPs.
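Automatic XOP detection reduces to testing pairs of track segments for intersection. A standard parametric segment-intersection routine (my own sketch, not the FORTRAN of the paper) might look like:

```python
def seg_intersect(p1, p2, p3, p4):
    """Return the intersection point of segments p1-p2 and p3-p4,
    or None if they do not cross (parallel segments return None)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    den = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if den == 0:
        return None                       # parallel or degenerate
    # Parameters along each segment; both must lie in [0, 1].
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / den
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / den
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

# Two crossing diagonals of the unit-square region meet at (1, 1):
xop = seg_intersect((0, 0), (2, 2), (0, 2), (2, 0))
```

Once the XOPs are located, the XOE at each one can be interpolated from the neighboring observations and, as the abstract describes, the data between every two XOPs adjusted proportionally.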
Article
We describe a new technique that can be used to level data collected along regular and irregular line patterns with or without tie-line control. The technique incorporates a moving differential median filter to minimize line-level errors, to level survey-line data, and to micro-level data with no tie-line control. This overcomes the problem of standard leveling methods that lose their effectiveness with irregular flight patterns. To validate the method, we use it to level very-low-frequency (VLF) electromagnetic (EM) data from a helicopter survey where flight lines are parallel. Leveling is also performed on a set of vintage aeromagnetic data from the North Sea, gathered from nonparallel flight lines. Results show that the differential median filter leveling technique is superior to the standard leveling method because it results in fewer line errors and less distortion of high-wavenumber anomalies when processing irregular survey lines, making the method suitable for a wide variety of data sets.
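A much-simplified stand-in for the differential median filter idea: estimate each line's level shift as the median of its differences to a running median over a small window of lines, then subtract it. The real filter is more elaborate; this sketch assumes parallel, co-registered lines and is purely illustrative:

```python
import numpy as np

def median_level(lines, half_width=1):
    """Level each line by subtracting the median of its differences
    to the elementwise median of a moving window of lines."""
    lines = np.asarray(lines, dtype=float)
    out = lines.copy()
    n = len(lines)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        ref = np.median(lines[lo:hi], axis=0)   # window median profile
        out[i] -= np.median(lines[i] - ref)     # constant shift estimate
    return out

# Five parallel lines over the same smooth anomaly; line 2 carries a
# +5 level error, which the window median isolates and removes.
x = np.linspace(0.0, 1.0, 50)
base = np.sin(2 * np.pi * x)
lines = np.stack([base, base, base + 5.0, base, base])
leveled = median_level(lines)
```

Because the median is robust, a single mis-leveled line does not contaminate the reference profile the way a mean would, which is the appeal of median-based leveling.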
Article
A procedure for estimating background-correction terms for the uranium channel of an airborne gamma-ray survey has been developed. The residuals obtained from a multiple linear regression of flight-line means for the uranium channel on the means for thorium and potassium are used to correct the uranium channel for each line. The procedure assumes that, were it not for these background errors, the uranium flight-line means would be a linear function of the means for potassium and thorium. It also assumes that the background correction is the same for the whole of each line. In spite of these limitations, the method produces good background estimates consistent with those found by more sophisticated methods. -Author
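The regression step can be sketched directly: regress the per-line uranium means on the potassium and thorium means and take the residuals as per-line background corrections. The synthetic flight-line means below are invented for illustration, and NumPy least squares stands in for whatever the original implementation used:

```python
import numpy as np

def uranium_background(u_mean, k_mean, th_mean):
    """Regress per-line U means on K and Th means; residuals are taken
    as the per-line uranium background corrections."""
    A = np.column_stack([np.ones_like(u_mean), k_mean, th_mean])
    coef, *_ = np.linalg.lstsq(A, u_mean, rcond=None)
    background = u_mean - A @ coef
    return background, coef

# Six flight lines whose U means are a linear function of K and Th,
# except line 3, which carries a +2.0 background error.
k = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
th = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
u = 0.5 + 0.3 * k + 0.2 * th
u[3] += 2.0

bg, coef = uranium_background(u, k, th)
```

The residual is largest for the contaminated line, flagging it for correction; as the abstract notes, this assumes the background is constant along each line.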
Article
Gravity and magnetic surveys on sea, land, or air invariably exhibit discrepancies at intersections of survey lines. These 'misties' caused by many errors have to be removed before any meaningful contour map can be prepared. A simple algorithm based upon weightage assignments using a variance criterion is presented. It assumes the misties to be random variables. The misties at the crosspoints are invariably reduced to zero, and solutions are unique. The method is adaptable to any conceivable survey configuration. Each individual profile is given a weightage depending upon how its observations compare with those of other profiles at the crosspoints. It is a noniterative algorithm and is easily amenable to processing by computer.-from Author
Article
In its first 15 months of continuous operation, the Mars Orbiter Laser Altimeter (MOLA) instrument aboard Mars Global Surveyor ranged to Mars over 330 million times, generating more than 5000 orbital profiles, with a ranging precision of 0.4 m over smooth terrain. The accuracy of the profiles depends on knowledge of the spacecraft position, orientation, and observation time, which are subject to errors. We model these errors via the analysis of over 24 million altimetric crossovers. A quasiperiodic, once per revolution adjustment of the ground tracks as a function of time in three locally orthogonal directions minimizes the altimetric residuals via least-squares. Using a sparse matrix technique, computational effort scales linearly with the number of crossovers and only marginally with the number of parameters. Orbital errors mainly result from poor modeling of spacecraft thrusting events in the absence of tracking. Seasonal effects, likely due to changing thermal environment, as well as residual miscalibrations, are evident in the pointing solutions. Incorporating multiple parameters per revolution significantly improves crossover residuals, and resolves pointing aberrations during orbital transitions from night to day. Altimetry from the adjusted tracks generates a topographic model whose accuracy is typically better than 1 m vertically with respect to the center of mass of Mars. The centroid position of each MOLA shot is typically accurate to ∼100 m horizontally. Terrain models from accurately located lidar data can be gradient-shaded to illuminate geological structures with 1 in 1000 slopes that are invisible to cameras. Temporal changes in elevation (e.g., frost deposition/ablation) at decimeter levels may also be assessed using crossovers, but results must be interpreted with caution due to uncertainties in range walk correction.
Article
The theoretical model of Tu et al. (1984) describing the radial variation of the power spectra of interplanetary Alfvenic fluctuations has been extended to explain the radial variations of Alfvenic fluctuations between 0.3 and 5 AU and of the solar wind proton temperature observed by Helios between 0.3 and 1 AU. The cascade energy flux function is derived, leading to a -5/3 power law in the high-frequency range of the spectrum and the corresponding analytical solution of the spectral equation. The calculated results concerning the spectrum and other parameters of Alfvenic fluctuations are presented and compared with observations. The radial variations of the heating rate and of the proton magnetic moment corresponding to the radial variation of the power spectra between 0.3 and 1 AU are given. A relationship between the thermal speed of the solar wind protons and the variance of the solar wind velocity fluctuations is presented.
Article
The magnetisation of a ship may be divided into a permanent and an induced part. If the induced part is a linear function of the field components and the permanent part is independent of them, the disturbance of the total force by the ship contains a term independent of magnetic heading and terms proportional to the sine and cosine of the heading and of twice the heading, the sine terms being less than a tenth of the cosine terms.The variation with distance astern of the ship is similar to that due to a pole near the bow and one near the stern plus a line of vertical dipoles.These results are verified by experiment. The disturbance of the field by Discovery II and by Sarsia is less than 10 γ at a distance of 2 ship's lengths astern. It is believed that this result will hold for most ships. A method for the determination of the principal coefficients and their variation with distance is described.
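The heading dependence described here (a constant term plus sine and cosine of the heading and of twice the heading) can be fit by ordinary least squares from a calibration loop. The coefficients and data below are invented for illustration and chosen so the sine terms are under a tenth of the cosine terms, as the abstract indicates:

```python
import numpy as np

def fit_heading_effect(heading_deg, field):
    """Fit F(theta) = a0 + a1*cos(t) + b1*sin(t) + a2*cos(2t) + b2*sin(2t)."""
    t = np.radians(heading_deg)
    A = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t),
                         np.cos(2 * t), np.sin(2 * t)])
    coef, *_ = np.linalg.lstsq(A, field, rcond=None)
    return coef

# Synthetic calibration loop: total field sampled every 15 degrees.
hdg = np.arange(0.0, 360.0, 15.0)
true = np.array([50000.0, 20.0, 1.5, 8.0, 0.5])   # nT; sin << cos terms
t = np.radians(hdg)
f = (true[0] + true[1] * np.cos(t) + true[2] * np.sin(t)
     + true[3] * np.cos(2 * t) + true[4] * np.sin(2 * t))

coef = fit_heading_effect(hdg, f)
```

Subtracting the fitted heading terms from survey readings removes the ship-induced component before any crossover adjustment.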
Article
An updated, new version (3.0) of the Generic Mapping Tools (GMT) has just been released. GMT is a public domain collection of UNIX tools that contains programs to manipulate (x,y) and (x,y,z) data and to generate PostScript illustrations, including simple x-y diagrams, contour maps, color images, and artificially illuminated, perspective, shaded-relief plots using a variety of map projections [Wessel and Smith, 1991]. GMT has been installed on super computers, workstations and personal computers, all running some flavor of UNIX. We estimate that approximately 5000 scientists worldwide are currently using GMT in their work.
Article
Aeromagnetic survey data are subject to error from diurnal variations, navigational inaccuracies, and instrumental drift. There is a corresponding discrepancy between the readings at the point of intersection between any two lines. The main problem in correction procedures is how to prorate this discrepancy between the two lines. We formulate this as a problem in statistical estimation and find all solutions under the assumption that the systematic errors are varying slowly along a line of data. From this class of solutions we select as "optimal" the one that results in least adjustment of the data. The resulting algorithm leads to a simple and efficient computer program.
Article
Helicopter-borne frequency-domain electromagnetic (EM) data are used routinely to produce resistivity maps for geologic mapping, mineral exploration, and environmental investigations. The integrity of the resistivity data depends in large part on the leveling procedures. Poor resistivity leveling procedures may, in fact, generate false features as well as eliminate real ones. Resistivity leveling is performed on gridded data obtained by transformation of the leveled EM channel data. The leveling of EM channel data is often imperfect, which is why the resistivity grids need to be leveled. We present techniques for removing the various types of resistivity leveling errors which may exist. A semi-automated leveling technique uses pseudo tie-lines to remove the broad flight-based leveling errors and any high-magnitude line-based errors. An automated leveling technique employs a combination of 1-D and 2-D nonlinear filters to reject the rest of the leveling errors, including both long- and short-wavelength leveling errors. These methods have proven to be useful for DIGHEM helicopter EM survey data. However, caution needs to be exercised when using the automated technique because it cannot distinguish between geological features parallel to the flight lines and leveling errors of the same wavelength. Resistivity leveling is not totally objective since there are no absolutes to the measured frequency-domain EM data. The fundamental integrity of the EM data depends on calibration and the estimate of the EM zero levels. Zero level errors can be troublesome because there is no means by which the primary field can be determined absolutely and therefore subtracted to yield an absolute measure of the earth's response. The transform of incorrectly zero-leveled EM channels will yield resistivity leveling errors.
Although resistivity grids can be leveled empirically to provide an esthetically pleasing map, this is insufficient because the leveling must also be consistent across all frequencies to allow resistivity to be portrayed in section. Generally, when the resistivity looks correct in plan and section, it is assumed to be correct.
Article
A simple technique is described for removing residual levelling errors from aeromagnetic data. These residual errors have a distinct spectral signature and are easily removed from a grid of the data using existing directional grid-filtering methods. The filtered grid is then used to correct the located data. The method cannot distinguish between levelling errors and real elongate anomalies parallel to the flight-line direction. It should therefore be used selectively.
Article
With the passage of time it is interesting to note how the present generation of geophysicists is absorbed with the mathematical development and substantiation of geophysical observations, whereas earlier, more simple and practical methods predominated. It was with particular interest that I read the paper in the October, 1978 issue of Geophysics regarding diurnal drift removal and the mathematical development of a least-squares procedure to do this.
Article
Using the established procedure, the calibrated covariance matrix of harmonic geopotential coefficients of the new Earth Gravity Model EGM96 (to 70 × 70) is projected to single- and dual-satellite crossover errors, and their spectral latitude lumped coefficient constituents. These results are compared with previous gravity solutions, such as JGM 2 and JGM 3, to assess the strengths and weaknesses of the new solution. This analysis quantifies the level of improvement over previous solutions, as well as suggests areas where further refinements are required to achieve subdecimeter accuracy over a wide range of satellite missions.
Article
A new technique removes the leveling errors in airborne geophysical data based on an assumption that the data are continuous, i.e., not renulled or recalibrated from line to line. A single flight line is selected as a reference to level and tie all survey lines to this continuously varying datum. The leveling errors are determined in a least-squares sense from the reference line and the adjacent line to be leveled. The technique markedly improves the quality of the unleveled, raw data and works best if the reference line overlaps most of the line being leveled. This method cannot distinguish a linear geologic feature in the direction of flight lines from a leveling error. The technique initially was developed for airborne electromagnetic (EM) data in cases where tie-line leveling works poorly. However, it may also be used for airborne magnetic and other data, in which case tie-lines used in conventional leveling techniques are not needed.
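A toy version of the reference-line idea, reduced to the simplest linear correction, a per-line DC shift chained outward from the reference line (the function and synthetic data are invented for illustration; the paper fits richer least-squares corrections over the overlapping portions):

```python
import numpy as np

def level_to_reference(lines, ref_index=0):
    """Chain-level survey lines to one reference line.

    Adjacent lines are assumed to see nearly the same geology, so the
    levelling error of line i is estimated (least-squares; here just a
    DC shift) against its already-levelled neighbour and removed."""
    levelled = [None] * len(lines)
    levelled[ref_index] = lines[ref_index].copy()
    # sweep outwards from the reference line in both directions
    order = list(range(ref_index + 1, len(lines))) + list(range(ref_index - 1, -1, -1))
    for i in order:
        neighbour = levelled[i - 1] if i > ref_index else levelled[i + 1]
        offset = np.mean(lines[i] - neighbour)   # least-squares DC shift
        levelled[i] = lines[i] - offset
    return levelled

# synthetic survey: identical geology on every line, plus a per-line bias
geology = np.sin(np.linspace(0.0, 4.0, 50))
raw = [geology + b for b in (0.0, 2.0, -1.5, 0.7)]
fixed = level_to_reference(raw, ref_index=0)
```

The chaining is also where the stated weakness lives: a geologic gradient shared by all lines is indistinguishable from a slowly accumulating levelling error.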
Article
Total field magnetic values recorded during a survey by RRS Charles Darwin off Ghana yielded large track-crossover errors of up to 120 nT (RMS value of 58.7 nT), which masked the weak magnetic anomalies in this equatorial region. The heading effect of the ship's magnetic field and strong diurnal variation in the Earth's field are likely causes of the errors. A heading effect experiment shows differences of up to 30 nT for Charles Darwin on different headings, which have been corrected for. The diurnal variation has been calculated by using the magnetic field observations themselves, because observatories are either too distant or were inoperative at the time of the survey. A method that uses the anomalies corrected for heading effect and differences at track crossovers was found to produce an acceptable curve, with an amplitude of 120 nT and a shape similar to that of equatorial observatories. Fully corrected anomalies have crossover errors of up to only 40 nT with an RMS value of 17.5 nT. These anomalies reveal a linear magnetic anomaly low along the continental slope off Ghana.
Article
Adjustments to satellite constrained navigation are required to match SeaBeam bathymetric data at track crossings due to errors in dead reckoning and inaccuracies in satellite fixes. By shifting one of the SeaBeam swaths involved in a track crossing relative to the other and calculating the sum of the squares of the differences in bathymetry within the area of overlapping coverage, we map a two-dimensional error surface whose minimum corresponds to the best estimate of the correction to navigation required at the crossing point. Estimates of the covariance of this correction are derived from the error surface. We employ the curve fitting technique of Tarantola and Valette (1982) to invert for a smooth correction function to a starting model of the position of the ship as a function of time. This technique incorporates formal errors assigned to dead reckoning, satellite fixes, and the shifts required to match bathymetric swaths at crossing points in a simultaneous inversion for the correction function for all tracks within the study area. In a test of the method in a study area on the southern Mid-Atlantic Ridge, a data set involving two cruises, 30 days of SeaBeam data, and 753 track crossings, we found that crossing SeaBeam swaths can potentially resolve the relative position of the ship on the two tracks to within 30 to 70 m. The inversion procedure yielded a much better constrained navigation function and much improved match of bathymetry. The final model of the navigation fit crossing shifts about as well as satellite data (with respect to their assigned data errors) with the RMS value of the crossing shifts decreasing from 1200 m in the original satellite-constrained navigation to 200 m in the final solution. 
However, the potential resolution of position using SeaBeam swaths was not fully achieved in the solution because there are systematic bathymetric artifacts in SeaBeam data, multiple local minima in the error surfaces in highly lineated topography, inadequate dead reckoning data, occasional bad satellite fixes, and limitations on the short period corrections allowed in the model.
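The error-surface mapping described above can be sketched on a grid: shift one swath over a range of integer offsets, sum the squared depth differences in the overlap, and take the shift at the minimum (synthetic data; integer grid shifts stand in for the paper's continuous navigation corrections, and the covariance estimate from the surface curvature is omitted):

```python
import numpy as np

def best_shift(swath_a, swath_b, max_shift=5):
    """Map the error surface of squared bathymetry differences over
    integer (row, col) shifts and return the minimising shift."""
    n = max_shift
    best, best_err = (0, 0), np.inf
    b = swath_b[n:swath_b.shape[0] - n, n:swath_b.shape[1] - n]
    for dy in range(-n, n + 1):
        for dx in range(-n, n + 1):
            a = swath_a[n + dy:swath_a.shape[0] - n + dy,
                        n + dx:swath_a.shape[1] - n + dx]
            err = np.sum((a - b) ** 2)   # error-surface value at (dy, dx)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# synthetic bathymetry; swath_b sees the same seafloor through a
# navigation error of 2 rows and -1 column
rng = np.random.default_rng(0)
depth = rng.normal(size=(40, 40)).cumsum(0).cumsum(1)   # smooth-ish surface
swath_b = np.roll(np.roll(depth, -2, axis=0), 1, axis=1)
print(best_shift(depth, swath_b))   # → (2, -1)
```

The caveats in the abstract map directly onto this sketch: strongly lineated topography produces multiple near-equal minima on the error surface, and the grid search then locks onto the wrong one.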
Article
Differences in values obtained at trackline crossovers can be used to give an indication of the overall accuracy of marine geophysical surveys. A study of the statistical validity and interpretative value of several parameters commonly used to summarize crossover discrepancies has been made. It was found that the interpretation of commonly reported statistical parameters such as standard deviation and R.M.S. error can be complicated by the presence of a systematic error in the survey data, by possible non-normalities in the error distribution, or by the use of a distribution of positive differences. A standard procedure for the analysis of survey error on the basis of crossover discrepancies is outlined, including techniques for estimating systematic survey errors and for the evaluation of the normality of the observed distribution of crossover discrepancies.
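The interaction between systematic error and the usual summary statistics can be made concrete: for crossover discrepancies d_i, RMS² equals the squared mean plus the population variance, so a large RMS may reflect a systematic offset, random scatter, or both. A small numpy illustration with invented numbers:

```python
import numpy as np

# Crossover discrepancies d_i = value on track A minus value on track B at
# each intersection (hypothetical values with a clear systematic offset).
d = np.array([4.1, 5.2, 3.8, 4.9, 5.5, 4.4, 5.0, 4.6])

systematic = d.mean()            # estimate of the systematic error
spread = d.std(ddof=1)           # scatter about that offset
rms = np.sqrt(np.mean(d ** 2))   # often quoted, but mixes both effects

# identity: RMS^2 = mean^2 + population variance
print(systematic, spread, rms)
```

Here the RMS (~4.7) is dominated by the systematic part (~4.7) rather than the scatter (~0.6), which is exactly the situation in which quoting RMS alone misleads.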
Article
Marine geophysical data collected by government and academic vessels are archived at the US National Geophysical Data Center in Boulder, Colorado. Data exchanges between NGDC and source institutions use an ASCII, punch-card format known as the MGD77 format, reflecting a style of file design common in the 1970s. We have developed a set of new software tools that can convert between this exchange format and a new COARDS-compliant, netCDF-based, architecture-independent file format that we call the MGD77+ format. The new mgd77 tools allow the data to be manipulated in a variety of ways useful for marine research. These tools are written in POSIX-compliant C, are distributed as a supplement to the Generic Mapping Tools, and can be installed on any computer platform. Because the new format is COARDS and CF-1.0 compliant, the files can be read by any general-purpose program capable of decoding these standards, a welcome side effect. One such program is the Java application ncBrowse developed by NOAA. Furthermore, the more compact netCDF files have file sizes that are, on average, only 30% of the original sizes. Because procedural changes at NGDC and source institutions necessarily occur infrequently, it is expected that the MGD77 format will remain the official exchange format for underway geophysical data for some time, whereas the new MGD77+ format offers users much needed flexibility in how they use the data.
Article
Changes of bathymetry derived from multibeam sonars are useful for quantifying the effects of many sedimentary, tectonic and volcanic processes, but depth changes also require an assessment of their uncertainty. Here, we outline and illustrate a simple technique that aims both to quantify uncertainties and to help reveal the spatial character of errors. An area of immobile seafloor is mapped in each survey, providing a common ‘benchmark’. Each survey dataset over the benchmark is filtered with a simple moving-averaging window and depth differences between the two surveys are collated to derive a difference histogram. The procedure is repeated using different length-scales of filtering. By plotting the variability of the differences versus the length-scale of the filter, the different effects of spatially uncorrelated and correlated noise can be deduced. The former causes variability to decrease systematically as predicted by the Central Limit Theorem, whereas the remaining variability not predicted by the Central Limit Theorem then represents the effect of spatially correlated noise. Calculations made separately for different beams can reveal whether problems are due to heave, roll, etc., which affect inner and outer beams differently. We show how the results can be applied to create a map of uncertainties, which can be used to remove insignificant data from the bathymetric change map. We illustrate the technique by characterizing changes in nearshore bed morphology over one annual cycle using data from a subtidal bay, bedrock headland and a banner sand bank in the Bristol Channel, UK.
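The length-scale argument can be sketched numerically: for spatially uncorrelated noise, averaging w samples shrinks the variability of the survey difference like 1/√w (Central Limit Theorem), and any departure from that trend flags correlated noise. A 1-D toy version with purely uncorrelated synthetic noise (not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
# two 'surveys' of the same immobile benchmark: truth + independent noise
truth = np.zeros(100_000)
s1 = truth + rng.normal(0.0, 0.3, truth.size)
s2 = truth + rng.normal(0.0, 0.3, truth.size)

def smoothed_diff_std(a, b, w):
    """Std of the survey difference after a length-w moving average."""
    kernel = np.ones(w) / w
    return np.std(np.convolve(a - b, kernel, mode='valid'))

# uncorrelated noise: variability should fall like 1/sqrt(w)
for w in (1, 4, 16, 64):
    print(w, smoothed_diff_std(s1, s2, w))
```

With real multibeam data the curve flattens once w exceeds the correlation length of the artifacts (heave, roll, refraction), and that residual floor is the correlated-noise estimate the paper uses.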
Article
Cross-over errors (COEs) are problems encountered when analyzing intersecting data. A substantial portion of the error may have a systematic origin, which can be determined by analyzing the COEs for each cruise. An automatic method for locating COEs by identifying the location of cross-over, and estimating the error with cubic splines has been applied to a marine gravity survey in the western Atlantic Ocean.
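As a hedged illustration of the spline step (track values and crossover locations are invented, and scipy's generic CubicSpline stands in for the survey's spline code): each track's observable is interpolated to the intersection point from its neighbouring along-track samples, and the COE is the difference of the two interpolated values.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical along-track samples for two intersecting tracks:
# distance along track (km) vs. observed free-air gravity (mGal).
dist_a = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
grav_a = np.array([10.0, 12.0, 15.0, 13.0, 11.0])
dist_b = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
grav_b = np.array([11.5, 13.0, 14.2, 14.0, 12.0])

# along-track distances of the geometric intersection on each track
x_cross_a, x_cross_b = 2.4, 1.7

# cubic-spline estimate of each track's value at the crossover, then COE
coe = float(CubicSpline(dist_a, grav_a)(x_cross_a)
            - CubicSpline(dist_b, grav_b)(x_cross_b))
print(coe)
```

Accumulating one such COE per intersection, per cruise, gives the sample from which the systematic (per-cruise) component is then estimated.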
Article
A new mathematical method is developed for interpolation from a given set of data points in a plane and for fitting a smooth curve to the points. This method is devised in such a way that the resultant curve will pass through the given points and will appear smooth and natural. It is based on a piecewise function composed of a set of polynomials, each of degree three, at most, and applicable to successive intervals of the given points. In this method, the slope of the curve is determined at each given point locally, and each polynomial representing a portion of the curve between a pair of given points is determined by the coordinates of and the slopes at the points. Comparison indicates that the curve obtained by this new method is closer to a manually drawn curve than those drawn by other mathematical methods.
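This is Akima's interpolation method, and scipy's Akima1DInterpolator implements it; a quick comparison against a conventional cubic spline on step-like data shows the locally determined slopes suppressing the oscillations the method was designed to avoid:

```python
import numpy as np
from scipy.interpolate import Akima1DInterpolator, CubicSpline

# Flat run followed by a jump: a classic case where a global cubic spline
# rings, while Akima's local slope estimates stay calm.
x = np.arange(11.0)
y = np.array([0., 0., 0., 0., 0., 0., 1., 1., 1., 1., 1.])

akima = Akima1DInterpolator(x, y)
spline = CubicSpline(x, y)

xs = np.linspace(0.0, 10.0, 401)
# both pass through the data points, but only the global spline overshoots
print(akima(xs).max(), spline(xs).max())
```

The relevance to crossover work is practical: along-track geophysical profiles often contain sharp gradients, and an interpolator that rings near them would inject spurious signal into the COE estimates.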
Article
A new method (constrained sinusoidal crossover adjustment) for removing the orbit error in satellite altimetry is tested (using crossovers accumulated in the first 91 days of the Geosat non-repeat era in the tropical Pacific) and found to have excellent qualities. Two features distinguish the new method from the conventional bias-and-tilt crossover adjustment. First, a sine wave (with wavelength equaling the circumference of the Earth) is used to represent the orbit error for each satellite revolution, instead of the bias-and-tilt (and curvature, if necessary) approach for each segment of the satellite ground track. Secondly, the indeterminacy of the adjustment process is removed by a simple constraint minimizing the amplitudes of the sine waves, rather than by fixing selected tracks. Overall the new method is more accurate, more efficient, and much less cumbersome than the old. The idea of restricting the crossover adjustment to crossovers between tracks that are separated by less than a certain number of days, in order to preserve the large-scale long-term oceanic variability, is also tested, with inconclusive results because the orbit error was unusually nonstationary in the initial 91 days of the Geosat mission.
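A toy numpy reconstruction of the sinusoidal model (synthetic amplitudes and crossover times; numpy's minimum-norm lstsq solution stands in for the paper's amplitude-minimizing constraint, which it reproduces whenever the crossover geometry leaves a null space):

```python
import numpy as np

# Orbit error on revolution r is modelled as a_r*sin(w t) + b_r*cos(w t);
# a crossover between revolutions r and s at times t_r, t_s observes
# e_r(t_r) - e_s(t_s).  Stack those differences into a linear system.
rng = np.random.default_rng(2)
n_rev, n_x = 4, 60
w = 2 * np.pi                        # one cycle per revolution (t in revs)
true = rng.normal(size=2 * n_rev)    # [a_0, b_0, a_1, b_1, ...]

A = np.zeros((n_x, 2 * n_rev))
d = np.zeros(n_x)
for k in range(n_x):
    r, s = rng.choice(n_rev, size=2, replace=False)
    tr, ts = rng.uniform(0.0, 1.0, 2)
    A[k, 2 * r:2 * r + 2] = [np.sin(w * tr), np.cos(w * tr)]
    A[k, 2 * s:2 * s + 2] = [-np.sin(w * ts), -np.cos(w * ts)]
    d[k] = A[k] @ true               # noise-free crossover difference

# minimum-norm least squares: with well-distributed crossovers the system
# is full rank and the per-revolution amplitudes are recovered exactly
est, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.abs(est - true).max())
```

Contrast with bias-and-tilt adjustment: there, a common bias across all tracks lies in the null space and some tracks must be fixed; the sinusoidal parameterization plus the amplitude-minimizing constraint avoids singling out reference tracks.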
Crossover analysis of Mars Orbiter Laser Altimeter data
  • G A Neumann
  • D D Rowlands
  • F G Lemoine
  • D E Smith
  • M T Zuber
Neumann, G.A., Rowlands, D.D., Lemoine, F.G., Smith, D.E., Zuber, M.T., 2001. Crossover analysis of Mars Orbiter Laser Altimeter data. Journal of Geophysical Research 106, 23,753–23,768.
Nishimura, C.E., Forsyth, D.W., 1988. Improvements in navigation using SeaBeam crossing errors. Marine Geophysical Researches 9, 333–352.
The Marine Geophysical Data Exchange Format "MGD77"
  • A M Hittelman
  • R C Groman
  • R T Haworth
  • T L Holcombe
  • G McHendrie
  • S M Smith
Hittelman, A.M., Groman, R.C., Haworth, R.T., Holcombe, T.L., McHendrie, G., Smith, S.M., 1977. The Marine Geophysical Data Exchange Format "MGD77". In: Key to Geophysical Records Documentation 10 (revised). National Geophysical Data Center, NOAA, Boulder, CO.
Algorithms in C
  • R Sedgewick
Sedgewick, R., 1990. Algorithms in C. Addison-Wesley, Reading, MA, 657 pp.
Smith, W.H.F., 1993. On the accuracy of digital bathymetric data. Journal of Geophysical Research 98 (B6), 9591–9603.
Simple micro-leveling for aeromagnetic data. Exploration Geophysics 22
  • Minty