Project

Solar System Gravimetry and Gravitational Engineering

Goal: Develop an integrated network of Earth- and space-based sensors to map, in real time, the gravitational potential and acceleration field, and the mass distributions in the solar system. Identify and calibrate existing networks. Adapt existing technologies. Develop new gravitational engineering technologies.


Project log

Richard Collins
added an update
I have never used ResearchGate much for talking to many people, so please excuse me if this is a bit clumsy. I am trying to share some of the things I think about when I read different papers and look at different projects. I have a pretty complete picture of where gravitational and related sensor arrays could be used for global collaborations. But everyone is going in different directions, with mostly local and personal concerns - not the needs of the human species - in mind or heart.
These are comments. Not demands, not criticism, not commands. Just some ideas that might point to future industries, issues and interesting possibilities.
Belhadj Attaouia
I am still getting over a bout of covid, so I am not sure if this holds together very well. But many parts of it might give you some ideas, and I hope to go back through it later. That is why I write things down.
These are the first things that come to mind when reading this.
There are a lot of cell phone GPS sensors now, and those used for ionospheric and atmospheric correlations are slowly organizing for different areas. So GPS/GNSS networks for geo-atmospherics should be more available, data accessible, archives managed and algorithms evolving.
The three axis gravimeters are few and far between, and still expensive. There are a few broadband seismometers but they are mostly stuck inside "geophysics" and "seismic" stations. The ones that are sensitive enough to operate as gravimeters (accelerometers sensitive enough to track the sun moon tidal signal) are mostly tuned to pick up distant earthquakes, since the people running the sites tend to want to do seismology rather than climate modeling. And why would anyone spend tens of thousands of dollars on a seismometer when a handful of GPS/GNSS receivers are cheap? Even adding antennas, they are still not that expensive. I think the constraint for GNSS/GPS is software. And your algorithms certainly are a step in the right direction.
But, for me or anyone to test your algorithms, I might have to code it from scratch, and find some suitable data. I have not looked (when I look for something I almost always find it) for GPS data lately. The data tends to get dumped into large stores - separate from the tools and algorithms. Usually requiring a large investment of time, money and frustration to find all the pieces just to run some simple "hello world" statistical profiles.
I would recommend that you work with the second time derivative of the data. Position is hard to deal with. Even velocity tends to wander all over the place. I think the reason Newton and others picked the second time derivative (acceleration) for simplifying models, is that it is often fairly stable. Me, I almost never stop there, but also use the 3rd ("jerk") and then Taylor series out to as many as 20 terms. It is easy to do with differences and simple approximations. The acceleration term for local earth tides can be calculated with a couple of lines of code from the sun moon earth centers, station location and earth rotation rate to high accuracy. But, I still have not found inexpensive MEMS or atom interferometer gravimeters to recommend. The few groups who make such things are all academic and so know little about growing global markets or starting new industries. They tend to patent first and then maybe think about the needs of society later. Give every person in the world an accelerometer sensitive enough to track the sun and moon, and solve for the position and orientation of the cell phone sensor - then there is also a lot of data for geophysics and regional models of many kinds.
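To make the "couple of lines of code" claim concrete, here is a minimal sketch of the Newtonian tidal acceleration at a ground station. The GM constants are standard values; the snapshot positions are hypothetical placeholders, since in practice they would come from a JPL ephemeris.

```python
import numpy as np

GM_SUN  = 1.32712440018e20   # m^3/s^2
GM_MOON = 4.9048695e12       # m^3/s^2

def tidal_acceleration(gm, body_ecef, station_ecef):
    """Tidal acceleration vector (m/s^2) at the station from one body.

    body_ecef, station_ecef: positions relative to the Earth's center (m).
    The tidal term is the body's pull at the station minus its pull at
    the Earth's center (the free-fall acceleration of the Earth itself).
    """
    d = body_ecef - station_ecef           # station -> body
    r = body_ecef                          # earth center -> body
    return gm * (d / np.linalg.norm(d)**3 - r / np.linalg.norm(r)**3)

# Hypothetical snapshot positions (meters, Earth-centered frame),
# with the station at the sub-body point for simplicity:
moon = np.array([3.84e8, 0.0, 0.0])
sun  = np.array([1.496e11, 0.0, 0.0])
site = np.array([6.371e6, 0.0, 0.0])

a = tidal_acceleration(GM_MOON, moon, site) + tidal_acceleration(GM_SUN, sun, site)
print(a)   # on the order of 1e-6 m/s^2, dominated by the Moon
```

With real ephemeris positions and the station's geodetic coordinates, the same few lines give the full time-varying vector signal.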
I have not had time to read what else you are working on. And you put lots of methods into one paper - but really did not talk about the tools and data requirements. Setting up a problem is often 95% of the effort. The algorithms (once you have a toolkit or development environment and stable data streams) are fun and fairly quick. I cannot tell what directions you are going in from one paper. I like what I see, just hope to have more time.
I think there is a lot more possible with auscultation networks. I check infrasound networks periodically, but I have not checked for cell phone and other microphone networks lately. Any of the software defined radio tools work well at "audio" frequencies (from nanoHertz to 1 GHz or 1 Gsps). And the statistical measures (because there is so much data and the processes are slowly varying) tend to be good summary statistics for correlations and models.
While I know there are lots of uses for "continuous displacement fields", the energy and power are tied to acceleration and "jerk". Not sure of your background in gravitation, but the earliest papers on gravitational radiation were all about the third time derivative. I can't write it here symbolically, but "x triple dot", the jerk, the change in the acceleration. When the jerk changes things are happening. And those tend to be events that get tracked and correlated by many different networks - or could be if people valued correlations more.
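The jerk mentioned above (the third time derivative, "x triple dot") is easy to estimate from sampled data with simple differences. This is a self-contained sketch on a synthetic position series with constant jerk, so the answer is known in advance:

```python
import numpy as np

dt = 1.0                               # sample interval, seconds
t = np.arange(0.0, 100.0, dt)
x = 0.5 * 9.8e-7 * t**3                # synthetic position with constant jerk

# np.gradient uses central differences in the interior,
# one-sided differences at the ends.
velocity     = np.gradient(x, dt)
acceleration = np.gradient(velocity, dt)
jerk         = np.gradient(acceleration, dt)

# For x = (k/2) t^3 the jerk is exactly 3k = 2.94e-6 m/s^3
print(jerk[5:-5].mean())
```

On real sensor streams the same chain of differences works, though noise amplifies with each derivative, so some smoothing or regression over a window is usually needed.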
Not sure if these comments will be useful. I have been through so many data groups and datasets on the Internet in the past 30 years. There are many of these "global sensor networks" that have lots more data that they throw away. The LIGO ignores all the earth based data, which is much stronger and more applicable to human activities. CERN throws away all the "low energy" data and events because a few PI's have zero interest in the billions of people learning about the world, or the hundreds of thousands who should have access to basic training in statistics and machine learning algorithms - but need good solid data streams to grow with.
I am trying to see if there is anything I can do to speed things up. I had hoped the groups building gravimeters and gradiometers would hurry up. But I am getting tired of waiting. If a group gets enough income to pay salaries and stipends, they stop looking. I cannot easily explain it, but I call it one of the "internet pathologies" that stops growth and development dead. Things that ought to get done in weeks take decades. All the tools and data and algorithms (models) are there, but not the will to share and work together.
I think it would be fairly easy to use fine grained 3D time series to track the whole volume of the regions you are monitoring. I just looked to see who else is doing "displacement field modelling". The gravitational sensors are not built yet, but infrasound, magnetotelluric, magnetometers, meteorological, climate, radar, lidar, passive seismic, electric field, and some others would work. But without some kind of global coordination and sharing, the diffusion of results and issues would be so slow. On the Internet, one has to take into account decades of work by many different groups and individuals - many who retire or die. And also many different human languages, organizational purposes and professional languages and purposes.
Sorry. I really don't have terms for all the things that come up when working on global communities.
"displacement field" OR "displacement fields" has 2.02 Million entry points (Google, 26 Jun 2022). That is lots of overlap, duplication, fragments, untraceable mentions, false leads. It adds up to massive duplication, lost time and lost opportunities, wasted decades for thousands of people.
I think I see a way through. I am going to look at LIDAR interferometry and synthetic aperture radar methods again for displacement, but concentrate on the spectrum of time derivatives, and co-located clusters of those data events. There are lots of fun machine vision and "machine learning" tools now, but they tend to have too many dependencies and requirements that most people don't have.
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
I am posting this as a project update. It came as a private message, and I answered it as such. But there are so many people following now that I wanted to record and share some of the background with anyone looking at the project. If you have specific questions, I can try to answer them. Mostly I work on global problems that affect all humans. That is for the Internet Foundation. This gravitational project is sort of my personal project, because I have spent so many decades digging and checking, modeling and testing thousands of variations. Using gravimeters is a powerful method. It allows me to combine any combination of sensors - electromagnetic across all frequencies, acoustic across all frequencies, gravitational across all frequencies - in a fairly seamless and simple framework. The "trick" is to write the algorithms in universal form, get the data so everyone can see and use it, and then combine the efforts, skills and experiences of up to billions of individuals and groups. Solar system colonization is growing fast. I was even surprised that "interstellar projects" is growing so fast.
Anyway, you might want to check out what Yener is doing. I do not have time to help every pair of followers learn how to work together. I can suggest making all the models (algorithms and data) open and globally accessible. The diffusion of scientific and technical knowledge now takes decades for really important changes. The education takes decades. But there are fairly simple global collaboration methods to reduce years to days in some cases. My "average improvement expected by using best practices" is about 200 times. Something that normally takes a hundred work days can be organized and executed in a day, because all the data and models are accessible without years of human-transmitted words and images - rather in the computer and available to all, with all the background information actually available.
Yener Turen,
It is a rather broad project. But, essentially, I am trying to share some of what I have learned in the last five decades studying the gravitational potential, and the technologies for measuring and using it. I am biased by my own experiences and interests. So I want to use gravitational fields for practical things like imaging, communication, levitation, process monitoring and control.
Even though I studied "classical gravitation", I was originally a physical chemist by temperament, and wanted to model ALL phenomena in a common framework. With a solid computer, physics, chemistry, mathematics and artificial intelligence background, I can use models and algorithms from any combination of disciplines. Most of my professional work was on global issues - international development, long range planning of countries and industries, industrial planning and financial modeling, long range planning of companies and global projects. I remember most everything I read, see or think about. So I have been able to combine things like business intelligence, market research, sociology, anthropology, and industrial development.
But this project, I will try to keep focused on gravitational engineering. Those tools and methods, devices and processes where low frequency electromagnetic methods are indistinguishable from gravitational methods.
The gravitational potential gradient is an acceleration field. If you measure acceleration precisely, you are also measuring gravitational acceleration - if the source of the acceleration can be tied to mass movements of things like the sun, moon, earth, earth's atmosphere, earth's oceans, seismic waves, and solar mass motions and ejections. (I am working on that almost full time now, trying to get a good dataset for estimating the real time mass flows by particle type from the sun. I want to see if the energy from a coronal mass ejection can be used to power an interstellar mission. The velocities are hypersonic and reach thousands of kilometers per second; at 3000 km/s, one light year takes 100 years. I keep track of all things happening in society and showing up on the Internet. There is a lot of interest in solar system colonization, and in interstellar projects of any sort.)
I read your paper, "The quality of velocities from GNSS campaign measurements when gaps in data exist." So you understand how hard it is to use GNSS when the electromagnetic noise levels are high. Or where there are storms.
The gravitational signals from the sun and moon tidal signal are extremely stable and reliable. They are almost purely Newtonian GMm/r^2 and only require the positions of the sun, moon and earth (and the earth moon barycenter, readily available from the JPL ephemeris software or many toolkits), the GM values for the sun, moon and earth, the geodetic latitude, longitude, height and orientation of the sensor, and the earth rotation rate. That's it! Since each of the three axes of the vector signal only requires a linear regression, 6 numbers (three sets of offset and multiplier) are sufficient to fit a stationary sensor data stream precisely. It is good enough to use the residuals and solve for the sensor location and orientation. I haven't been able to get anyone to build a three axis sensor, but from what I was able to test, the potential (likely) application is location tracking of ground stations at better than VLBI resolution. Part of why I think it will be better than GNSS for some applications (under the earth, under the ocean, inside shielded facilities, in high electromagnetic noise environments) is that at these lower frequencies there is no shielding, and likely no interference from ionospheric electron density.
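The per-axis calibration described above can be sketched in a few lines. The "model" series here is a synthetic stand-in for the ephemeris-derived tidal vector (the real one comes from the sun moon positions), and the sensor series is simulated, so the recovered offset and multiplier can be checked against known values:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 30 * 86400, 5000)              # ~one month of samples
model = np.stack([np.cos(2*np.pi*t/44714),        # stand-in for the modeled
                  np.sin(2*np.pi*t/44714),        # tidal vector components
                  0.5*np.cos(2*np.pi*t/86164)])   # (periods roughly tidal)

# Simulated stationary sensor: per-axis scale and offset, plus noise.
true_scale  = np.array([1.02, 0.97, 1.05])
true_offset = np.array([0.3, -0.1, 0.02])
sensor = true_scale[:, None] * model + true_offset[:, None] \
         + 0.01 * rng.standard_normal(model.shape)

# One linear regression per axis: 6 numbers total for 3 axes.
for axis in range(3):
    scale, offset = np.polyfit(model[axis], sensor[axis], 1)
    residual = sensor[axis] - (scale * model[axis] + offset)
    print(axis, round(scale, 3), round(offset, 3), residual.std())
```

With a real instrument, the residual after this fit is the "earth local" signal the author refers to elsewhere in this log.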
It is hard for me to read all the papers of all the people who are following this project and guess why they might be interested. I try to spend a few hours each week doing that, but I have the whole Internet to follow and there is a LOT going on.
If you have particular things you want to do, it is better to just ask me, and I will try to summarize some things I have found. I started on the gravitational potential about 1970 when I was working as a scientific programmer for the CIA, tracking all the objects in orbit. Because the data was sparse, and requires forward and inverse modeling methods, I became good at, and interested in constrained optimization methods generally. Later, because of that background with satellites, orbits and orbital mechanics and sensors, I was hired to work on the GEM series of NASA gravitational potential models.
I keep repeating my past experiences. I went to many universities - Case Western Reserve, University of Florida Pensacola, University of Houston Physics, University of Texas Austin statistical mechanics and celestial mechanics, University of Maryland College Park gravitational field detection and astrophysics and partial differential equations, Georgetown University Center for Population Research (USAID global projects), Georgetown chemistry mostly magnetic resonance imaging and chemistry at nuclear energies, Virginia Polytech. I can mostly remember things at a glance, and have so many memories that much of the background needed for any area I already have. I am trying to write algorithms for the Internet central operations so that it is available to everyone.
I guess I cannot tell you specific details about this project. It is more a place for me to talk to people about things I am interested in. If you could see the people I follow on ResearchGate (and many other sites) you would see thousands of global Internet topics and projects I follow. I review thousands of sites each year and write comments, suggestions and recommendations for many organizations. I try to get groups like NASA and ESA, CERN and LIGO to share data and models in useful and efficient ways, because I found the methods they use now on the Internet are thousands of times less efficient than Internet best practices I have found.
Sorry this is so long. All I can suggest, is ask me questions. If I have time, I will give you some hints, usually fairly specific. Me, every day I spend about 16 hours going through topics and groups on the Internet - usually with very specific global projects and long term needs of the human species in mind.
I mostly am comfortable combining gravitational phenomena and electromagnetism. The "gravitational" sources are mostly high spatial frequency (picometer) and from many incoherent signal sources. The signals are electromagnetic, but when there are many combined into complex signals, the "gravitational" part is just sorting out where the signal comes from.
The gravitational speed and the speed of light are IDENTICAL. They share the same underlying potential. That potential has a complex spectrum of signals and fluctuations. The gravitational or electromagnetic character of each is embedded in the local spectrum. And the local spectrum changes from distant sources.
The only way (that I have found) to separate "gravitational" from "electromagnetic" signals is to use global arrays of clusters of three axis, high sampling rate sensors. Then correlate with the gravitational potential and gravitational acceleration models of large processes like earthquakes, atmospheric density variations, motions of planets and moons, motions of oceans and magma.
That image is one month of the superconducting gravimeter vertical signal in Taiwan. It is a pure Newtonian vector signal using only GMsun, GMmoon, GMearth, the station location and the earth rotation. That lets you calibrate to an absolute system (JPL is good enough for me to consider it a stable and "absolute" reference source). The residual is all "earth local" after removing the sun, moon and centrifugal terms. I was checking to see how much earth moon barycenter correction is possible. It does improve the fit, and only takes a couple of extra lines of code.
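The centrifugal term removed above is itself only a line or two of code. A minimal sketch, using a spherical-Earth approximation (the real calculation would use the station's geodetic coordinates):

```python
import math

OMEGA = 7.2921150e-5          # Earth rotation rate, rad/s
R     = 6.371e6               # mean Earth radius, m (spherical approximation)

def centrifugal(lat_deg):
    """Magnitude of the centrifugal acceleration (m/s^2) at a latitude.

    Directed outward from the spin axis: omega^2 * r * cos(latitude).
    """
    phi = math.radians(lat_deg)
    return OMEGA**2 * R * math.cos(phi)

print(centrifugal(0.0))    # equator: about 0.034 m/s^2
print(centrifugal(23.7))   # roughly the latitude of a Taiwan station
```

It is a large, steady term compared to the microgal-level tidal signal, which is why it must be modeled and removed before the residual is meaningful.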
Richard
 
Richard Collins
added an update
John-Erik,
It is relatively easy to measure the vector tidal gravitational signal from the sun and moon at any location on earth. And to check what speed is required for the fields to come to equilibrium. What I found is that it is exactly the speed of light and gravity.
So for your ether particles, which are already at some location and acting on masses there - most of the potential (measured in Joules/kilogram) is already there. I say "near equilibrium" meaning the additional changes at that location are small.
BUT, if you have mass changes elsewhere that modify the local gravitational potential, those effects diffuse to the target region (where you are tracking a test mass) at the speed of light and gravity.
I am strongly influenced by statistical mechanics. So I tend to go a step further. I would say "the ether particles are not identical particles (probably), but rather constitute a spectrum of sizes and energies of fluctuations in a fluid"
There is nothing wrong with an "ether" - except when you use it as a word to throw around with no measurements, properties, calculations and tests. A good solid measurable "ether" is fine.
You cannot imagine how many papers and books I have slogged through in the last 50+ years on the "properties of the vacuum". The first one that got me interested in the problem was one on "birefringence of the vacuum". But even the ordinary "permittivity", "magnetic permeability" of a volume of vacuum on the surface of the earth has measurements, models, calculations, data and tests you can use to see if it is useful.
There are LOTS of tools and groups working on "gravitational potential" and "vacuum" ("properties" OR "property of"). Just now on Google I find 3.42 Million and 161 Million entries respectively. What I have been doing every day for the last 24 years is tracking out searches like that to see WHO is involved, what tools and methods they use, what models they create and maintain and share. A lot more than that, but if you ask 'what are all the ways the gravitational potential affects human society' - that is close to what I attempt to find, document and use.
I am trying to tease and encourage you to not lock down your thinking and descriptions of things. If you just spend a few weeks reading about "gravitational potential", you will find many people actively measuring and using it. With time of flight methods slowly emerging, the use of the gravitational potential (local time dilation, imaging of mass movements, changes in chemical and nuclear processes) is growing faster every year.
So "ether particles" is fine. But if you go look, people have fads and preferences. So expect to also find "quantum gravity", "electromagnetic gravity", "magnetogravity", "thermo gravity" -- ANY place where energy per unit mass is used can be used as part of a measurement of "potential". So can ANY place where an acceleration shows up (the second time derivative of a position, the first time derivative of a velocity, weighted first and second spatial derivatives of a potential). All those kinds of things are going on - practical, with instruments, jobs, careers, new products and industries.
I call the whole of it "gravitational engineering". I am trying to get some of the older people who only learned "gravitational theory" and symbol manipulation to get out of their comfort zones, and go find the developers, designers, startup companies, experimenters, and the practical groups working in "acoustic levitation", "magnetic levitation", "electromagnetic levitation", "microgravity" and many related fields. If something changes the position of something in a way where there is a second time derivative - however small - that can be used to measure "gravitational potential" source size, distribution and characteristics.
Richard
 
Richard Collins
added an update
John-Erik Persson,
For most everyday measurements at a few samples per second, you are right. The gravitational potential at any place is in equilibrium with the masses and events around it. You take the gradient to get the local gravitational acceleration. Things are pretty simple. In the old way of talking about such things, the potential is incompressible, "ideal". There are lots of ways of saying that - for everyday measurements and estimates - the potential does not move on its own.
But I have worked with the potential field in detail for more than 50 years now. And to understand what is happening at higher sampling rates (Mega samples per second, or Giga samples per second) for devices like gravimeters, gradiometers, strain measurements etc., it is necessary to consider that the changes in the potential have to get from wherever they occur to the local potential. It has to reach equilibrium of some sort. Then you can take the gradients, or do the measurements, and figure out what is going on.
For many kinds of situations - where there are a multitude of changes going on everywhere, the best picture and method I found is to allow the gravitational potential to have structure, velocity and properties of its own. To have particles if you want. Then to say "the potential diffuses, on average, at the speed of light and gravity". That allows those who measure really tiny and fast changes to understand why some signals "diffuse" faster than light.
I have had to go through countless models in the last 50 years. Many good partial methods get you close enough to slog through most any problem. But for my everyday dealing with gravitational potential changes coming from each kilogram of the sun to earth based detector arrays - I have to think of two things. One, there is a part of the signal that goes at the speed of light. But the energy density and the potential changes everywhere - so that speed is also varying everywhere - a little. But I know how to calculate and use it precisely.
It doesn't matter if a model is complicated and needs a lot of data - if you have access to the data and model and tools to do the calculation - and some assurance it works, and is reliable.
I let the gravitational potential have all the properties of a fluid. I have used "quantum fluids", "superfluids", "incompressible or nearly incompressible gases", "real gases", Bosonic fluids, Fermi fluids, superconducting fluids of dozens of sorts. There are hundreds of variations, and tens of thousands of examples and fragmented models. I have not checked all of them, but I slogged through and checked a lot of them.
You don't have to trust my internal memories. People don't give a flip about what others see in their minds to organize what they have read and seen, or thought. To deal with "what is known" about gravity is impossible for humans. No one, nor any group has sufficient memory and stamina to keep it all. And the computer versions are mostly fragmented, proprietary, wrong, or simply so tedious to work with - not much changes in human generations. Actually in fives and tens of human generations.
So don't be too rigid in your choice of what is the best. And don't cut off your options because you get tired of trying to remember it all.
The gravitational potential of the earth is about an electron volt per atomic mass unit. From the sun it is 10 electron volts per amu. And from the whole universe, adding up contributions that have diffused from where ever to here where we deal with some current potential fields - the whole is c^2 (speed of light and gravity squared).
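Those figures are easy to check from standard constants: the potential GM/r in Joules per kilogram, converted to electron volts per atomic mass unit:

```python
# Standard values (CODATA / IAU):
GM_EARTH = 3.986004418e14     # m^3/s^2
GM_SUN   = 1.32712440018e20   # m^3/s^2
R_EARTH  = 6.371e6            # m, mean radius
AU       = 1.495978707e11     # m
AMU      = 1.66053906660e-27  # kg
EV       = 1.602176634e-19    # J

# Potential magnitude per amu, expressed in eV:
earth_ev = GM_EARTH / R_EARTH * AMU / EV
sun_ev   = GM_SUN / AU * AMU / EV
print(earth_ev)   # about 0.65 eV per amu at the Earth's surface
print(sun_ev)     # about 9.2 eV per amu at 1 AU from the Sun
```

So "about an electron volt" and "10 electron volts" are the right orders of magnitude, and c^2 per unit mass is by definition the total rest-energy scale the text compares them to.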
There are good workable gravitational potential models for static field descriptions. Those work OK. But if you check what is going on for the last few decades, the potential changes constantly. I use the changes because of the earth's atmosphere, oceans and interior - the changes from the sun moon and other things in the solar system - including changes from solar wind and such - to calculate the total potential at any given point and time. Then with the full field in hand, calculate the gradients to get the expected accelerations.
It is really quite beautiful. I allow the field to have vorticity, turbulence, circulation, density, particle size distributions, fluctuation distributions - and a host of properties. The reason I can do that with assurance is because I found the models that people actually use when they work out practical problems like spacecraft trajectories, time corrections, movements of continental plates, changes in global magnetic potential - and link all the models and data and data streams into one system - for the whole Internet.
I have only been working on indexing and standardizing the Internet, every day, for 24 years. Even a smart dedicated human with nearly perfect memory can only sketch the basics and check a few tens of thousands of groups and methods. But I have a pretty good idea of how it all fits together. And the gravitational potential is just one standard dataset with models and instruments and networks - with hundreds of possible cross-checks and inter-network verifications and correlations.
You might not get to work on data fusion problems at scale. That is pretty much all I have done in my life. But the serious problems are NOT physics and astronomy, but rather economics, agriculture, finance, sociology, anthropology, organizations and human needs. I worked on all those too. But all of the hard problems are just tedious - find the parts, find the people, find the models, find the data, find the measurements and methods - write it down, test it, compare and merge and move on to the next one.
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
Everyone,
I am including this as an update to this project because the routine laboratory and wide area measurement of the speed of the gravitational potential is critical to all the groups working together. You HAVE to be able to measure the signals at high enough sampling rates to get good correlations and comparisons. For large things like tracking coronal mass ejections and mass flows from the sun, there is now enough information on the precise timing and mass distributions involved to solve for the signals that will arrive at the sensors. The reason LIGO could start was that the signal was clear. They had LOTS of stronger signals they could have used, but at 16,384 samples per second and only three detectors, there are not a lot of reference sources that can be used.
On this paper, I am not sure. Any mass that reaches relativistic speeds, where there is additional mass, can be used for signal generation and detection. But macroscopic masses are hard to get to those speeds. Many of the heavy ion accelerators, especially collisions of highly symmetric isotopes, could be used. The laser vacuum experiments that generate particle antiparticle pairs can be used. Anywhere mass is converted to energy, or energy to mass, will change the gravitational potential. With precise timing, orientation and energy levels, the signals involved are often well within reach of SOMEONE's methods. But with everyone going in their own directions, it is like random chance that gets any progress overall. I am hopeful, but I have been at it for 40 years and I am getting a bit tired.
ALL the "kT" noise is actually a mixture of the local wave function, magnetic variations, power system "noise", and gravitational potential changes impacting rate of clocks or accelerations. Most of that can be modeled and calibrated now. But it takes many global networks to each calibrate their systems, then to intercalibrate. Only talking to one sensor type won't get the job done. Pretty much every sensor needs to be compared to every other one. I think that there is enough information on the sun to image its interior, AND to use its mass flows to calibrate sensors on earth and in orbit. I have not found any big holes. But just "solar observing" and "stellar modeling" is large, and all the groups are chasing Nobel prizes and fame for discoveries, not consolidating, calibrating and sharing. It is that sharing and comparing that will have the largest impacts for the human species, not a few hundred new Nobel prizes or new job titles or perks.
Richard Collins, The Internet Foundation
-------------------------
Johan,
I have two hurdles for groups developing gravitational sensors.
First they have to calibrate their instrument over long periods (days and weeks) to the sun moon vector tidal acceleration signal. There are several groups that can do that, but they have not gone further.
Second, they measure time of flight (sampling rates high enough to track laboratory targets) signals from observable masses. To make things easy they can use atmospheric models, earthquake seismic waves, ocean surface waves, ocean current, highway traffic, or even humans. That is not hard, but it takes a level of effort and skills that is a good indication of the abilities of the group.
No one has reached stage II - measuring the speed of gravity (and making sure it is a gravitational, not an electromagnetic near field signal) that travels at the speed of light.
Amrit Šorli, I agree with you that a superfluid quantum vacuum is a good model for the local gravitational potential. The gradient of that potential is the acceleration. There are tens of thousands of groups and individuals hammering away on gravitational engineering problems. Trying to keep the language simple - to fit the common background of the people and groups involved - is needed so the pieces all come together without a lot of chatter and ill feelings and wasted time. The gravitational waves of LIGO type detectors are measuring displacement. It is better to keep that in mind, then differentiate to get the velocity, and again to get acceleration. Those pieces will fit, but groups are somewhat sloppy and don't carefully separate their signals. That is why I keep harping on groups using arrays of time of flight and angle of arrival sensors, so they can determine the source distance and characteristics with mathematical and statistical methods everyone is familiar with.
When groups go riffing on their favorite methods, that only slows things down. Keep it simple, make sure all the steps are easy to follow and testable. Reach out to new groups who are struggling. Get real data. Get ALL the sensors calibrated to the HUGE sun moon tidal signal (anyone who says "earth tides" should be able to do that easily). There are close to 40 groups who could do a time of flight gravitational imaging demonstration of the speed of gravity. I found groups under every possible term - quantum experiments, Bose Einstein experiments, atomic clocks, atom interferometers, atomic microscopy, all kinds of scattering experiments, precise magnetic moment experiments, most latest generation accelerators, Sagnac detectors and many derivatives, nuclear fission statistics, atomic decay statistics, GPS time variations, Mossbauer. And dozens or hundreds of groups for each. LOTS of individuals who have spent decades of hard effort to figure out pieces of the puzzle.
It just has that many parts and ramifications. Changing from electromagnetism to gravitation is not hard, it is just tedious. Much of what we call "electromagnetic" is also gravitational. Much of what we call "gravitational" also applies to electromagnetism. Almost ALWAYS it is a mix that requires patient and tedious separation. But it is not hard.
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
Keyun Tang,
I used the network of superconducting gravimeters over 15 years ago to measure the speed of gravity. Because they are single axis, one sample per second detectors, the precision is limited. I looked for three axis accelerometers sensitive enough to track the sun moon vector tidal signal. In the IRIS.edu seismometer network there are a few broadband long period seismometers where the main part of the signal is the sun moon tidal signal. AND most importantly, it shows that all three axes of the signal only require a linear regression to match the sun moon signal.
The sun moon vector tidal signal can be calculated precisely using the NASA Jet Propulsion Labs solar system ephemeris which gives the positions of the sun, moon and earth and earth moon barycenter. From these and the latitude longitude and height of the station and its orientation, you can calculate the signal that will be recorded by a three axis gravimeter.
The calculation is the vector Newtonian acceleration at the station due to the sun acting on the station, minus the sun's acceleration on the center of the earth. Plus the moon's acceleration on the station, minus the moon's acceleration on the center of the earth. To that acceleration vector, add the centrifugal acceleration from the station's rotation around the earth's axis. The resulting vector, transformed to the station North East Vertical coordinates, only requires a linear regression for each axis.
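That calculation can be sketched in a few lines of Python. This is a minimal illustration under my own naming, with placeholder positions standing in for the JPL ephemeris values; a real implementation would read ephemeris positions and rotate the result into station North East Vertical coordinates, which is omitted here:

```python
import numpy as np

GM_SUN  = 1.32712440018e20   # m^3/s^2
GM_MOON = 4.9048695e12       # m^3/s^2
OMEGA_EARTH = 7.2921150e-5   # earth rotation rate, rad/s

def point_accel(gm, body_pos, target_pos):
    """Newtonian acceleration of a target toward a point mass at body_pos."""
    r = body_pos - target_pos
    return gm * r / np.linalg.norm(r) ** 3

def tidal_accel(station, sun, moon, earth_center):
    """Sun and moon acceleration at the station, minus their acceleration on
    the earth's center, plus the centrifugal term from earth rotation.
    All positions in meters, earth-centered frame, z along the rotation axis."""
    a  = point_accel(GM_SUN, sun, station)   - point_accel(GM_SUN, sun, earth_center)
    a += point_accel(GM_MOON, moon, station) - point_accel(GM_MOON, moon, earth_center)
    omega = np.array([0.0, 0.0, OMEGA_EARTH])
    a -= np.cross(omega, np.cross(omega, station - earth_center))  # centrifugal
    return a

# Placeholder geometry (NOT ephemeris data): station, sun and moon on the x axis
station = np.array([6.378e6, 0.0, 0.0])
sun     = np.array([1.496e11, 0.0, 0.0])
moon    = np.array([3.844e8, 0.0, 0.0])
a = tidal_accel(station, sun, moon, np.zeros(3))  # dominated by the centrifugal term
```

With this toy geometry the sun and moon tidal contributions come out around 10^-7 to 10^-6 m/s^2, the familiar scale of the tidal signal, riding on the much larger constant centrifugal term.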
I also posted some details of requirements for sensor - three axis, high sampling rate (time of flight) accelerometers that can be used in arrays for imaging the interior, oceans and atmosphere of earth. The interior and atmosphere of the sun, and other bodies and targets in the solar system. The reason for time of flight is to allow ease of separation of signals where there are strong magnetic signals, which are nearly indistinguishable from gravity at low frequencies.
Here are some notes at https://hackaday.io/project/164550-low-cost-time-of-flight-gravimeter-arrays That image on the left is one month of superconducting gravimeter data from a station in Japan.
If you look at the image at https://hackaday.io/project/164550/gallery#21f5826751cf7b199a02263bd201e9bb see cells B2 and B3? Those are the ONLY two numbers you need to fit the theoretical Newtonian vector tidal signal from the sun and moon acting on the station. The multiplier is related to the Lamb number and the offset depends on things like the power at the station, the atmospheric pressure, the magnetic field, and the orientation of the sensor. A three axis broadband seismometer is less sensitive, but all three axes will each require only a linear regression (offset and multiplier) to fit the station data to the sun and moon signal.
It only requires GM for the sun, moon and earth; the positions of the centers of the sun, moon and earth over time; the rotation rate of the earth; and the geodetic longitude, latitude and height of the station.
If you have a permanent station, you can use the residuals from the fit on all three axes to solve for the orientation of the three axes, and allow the location (lat long height) to vary to find the best fit for the station location. This is effectively a "gravitational GPS" and a "gravitational compass".
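The fit itself is ordinary least squares per axis, with the theoretical tidal signal as the regressor. A minimal sketch with synthetic numbers standing in for real station readings (the gain and bias values below are invented for illustration, not from any instrument):

```python
import numpy as np

def calibrate_axis(theory, observed):
    """Fit observed = multiplier * theory + offset by least squares.
    Returns (multiplier, offset) - the only two numbers needed per axis."""
    A = np.column_stack([theory, np.ones_like(theory)])
    (multiplier, offset), *_ = np.linalg.lstsq(A, observed, rcond=None)
    return multiplier, offset

# Synthetic example: one day at one-minute cadence, a roughly semidiurnal
# tide-like signal, and an instrument that scales and offsets it
t = np.linspace(0.0, 86400.0, 1440)
theory = 1.1e-6 * np.sin(2 * np.pi * t / 44714.0)   # m/s^2
observed = 0.97 * theory + 3.2e-7                   # invented gain and bias
m, b = calibrate_axis(theory, observed)             # recovers the gain and bias
```

Run per axis, the recovered multiplier and offset are exactly the two numbers in cells B2 and B3, and the residual observed minus (m * theory + b) is the local signal.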
In Aug 2017 the merger of two neutron stars was picked up by both the gravitational and electromagnetic sensor networks. They arrived at the same time after a race of 130 Million Light Years. So the speed of gravity and the speed of light are identical. I know of places where that won't be exactly true, but for everyday cases they have identical speeds. It is the speed of light, and the speed of diffusion of the gravitational potential. The gravitational potential of the sun and moon is already at the earth and nearly in equilibrium with the earth's field at each instant. The changes come at the speed of light and gravity, and only at the margin are they different.
The Japan earthquake was strong enough that its gravitational potential changes, arriving at both the superconducting gravimeter stations and the broadband seismometer stations, were just barely detectable. At GravityNotes.Org I have notes on the "elastogravity" groups looking at the problem of earthquake early warning, but there are many other groups and ways to tackle that issue.
At the top of the page at GravityNotes.org there is a link (second one) that shows a different station. This is the full spreadsheet to show the regression. I chose this method because it ties the station reading directly to the JPL dataset. It puts the local sensor on an absolute reference frame with the sun moon earth and station gravity comparable across a whole network. It is simple to teach and to calculate. Any group can implement the regressions in small computers directly connected or near the sensor to give absolute gravimeter readings referenced on all three axes.
The signal at a sensitive gravimeter is about 98% sun moon vector tidal signal. Subtract the calibrated sun and moon signal and the residual contains the local effects. Those local effects are mainly atmospheric and ocean, magma, rain, humans. A good sensitive high sampling rate detector can be calibrated from that point by using ocean surface waves, traffic on local highways, and objects carefully moved and tracked in the laboratory.
The "game" is to use an an easy model to remove the sun and moon big changes. Leaving all the earth based signals in the residual. Nothing is lost because you keep the raw data. If the algorithm improves and a better "best in the world algorithm" comes up, then you use that, carefully record what you do and ratchet up to a better global sensitivity and cadence.
My goal is three axis instruments (so each can individually be locked to the sun and moon for long times at VLBI accuracies). The statistics get finer and finer. Noting and separating noise sources does two things - it gives you information on new phenomena (rain, atmospheric rivers, winds, ocean waves, rivers, etc.) and it removes those from the baseline sun moon calculations to better match that portion of the signal.
Using "time of flight" or "high sampling rate (MegaSamplesPerSecond = Msps, Gsps, Tsps are all possible) with today's amplifiers and ADCs and gravimeter designs.
Anything below about 40 Hertz has a wavelength, at the speed of light and gravity, comparable to the size of the earth.
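That 40 Hertz figure is just wavelength arithmetic at the speed of light and gravity; a quick check:

```python
C = 299_792_458.0          # speed of light and gravity, m/s
EARTH_DIAMETER = 1.2742e7  # mean diameter of the earth, m

wavelength_at_40hz = C / 40.0            # about 7,500 km - earth sized
freq_one_diameter  = C / EARTH_DIAMETER  # about 23.5 Hz spans one full diameter
```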
I am fairly certain that most of the "gravitational" signal is broadband electromagnetic noise. And that it will refract. The gravitational potential and velocity time dilation (or redshift, or change in vacuum index of refraction) should apply to the high frequency signals. They are electromagnetic, but it requires dealing with extended electromagnetic sources the size of the earth or sun or some part of them. I have a rough handle on those using methods from statistical mechanics, plasma physics, and many fields, but it is not completely integrated. I have to take each case one by one and figure out how best to deal with the complex signals. There are people better at that, but none of them are working with gravimeter signals on a regular basis (there are lots of groups; I can barely keep up with all the new ones).
So these signals that arrive at the gravimeter station by diffusion of the gravitational potential, will be refracted, reflected, absorbed. They tend to have character much like the wind. Sometimes I think to myself "the local gravitational weather".
The strain sensors measure the position. The seismometers measure the velocity. The accelerometers, vibration sensors, gravimeters measure the acceleration. The LIGO and Mossbauer detectors, I think, are mostly measuring changes in the local energy density or the local potential field. That is hard to measure. Like measuring pressure. But the acceleration is a force, and almost ALL the electronic sensors do a good job with that. The atom interferometer gravimeters could be the best, but they usually use vacuum, laser and detection methods that are expensive and bulky. The MEMS gravimeters are good for three axis low cost applications - and for stationary or near stationary applications they can use continuous three axis, high samples per second methods to calibrate all three axes to the sun and moon tidal signal. Once earth based sources are tracked continuously, then even the local climate models can be used to calculate the spatial and temporal three axis signal that will arrive at each sensor in an array of three axis detectors.
But there are so many groups now. And many devices that are sensitive enough to routinely measure gravitational potential variations - either directly like Mossbauer and LIGO, or by gradients by gravimeter, gravity gradiometer, or tensor detectors (all directions and pairs).
One thing I will keep mentioning is that the best gravitational detector is an electron detector. If you say "electron interferometer gravimeter" that is a real thing. But it is much much too narrow to capture the richness and creativity of what people are doing or could be doing.
The electron is well studied and has a vast array of tools. It has charge for Coulomb fields and force. It has magnetic moment for scattering, formation of superconducting and entangled pairs, for precise frequency measurements and direction measurements. And it has a mass that is nicely tied to the local gravitational potential through the relativistic (velocity and potential) relations. There are MANY precise measurements of electron transitions - in atoms, molecules, in small floating gate electron wells of memory chips, in the wells of cameras in dark mode, in purpose built arrays and printed antennas. To measure the change in the current is to measure the acceleration.
Since the electron is so versatile and useful, I normally recommend just use an extended framework where you calculate the full Taylor series.  Monitor the position (0), velocity (1), acceleration (2), jerk (3), snap (4), crackle (5), pop (6).  I did some experiments where I went twenty time derivatives out and there was still data to be found.  Crackle is the 5th time derivative of the position, the 4th time derivative of the velocity, the 3rd time derivative of the acceleration. And those data streams are easy to monitor in real time.  To correlate, and to share.
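A sketch of that monitoring framework, using repeated finite differences on a sampled position stream. `np.gradient` is my stand-in for whatever filtering a real sensor pipeline would use; higher derivatives amplify noise quickly, so real data would need smoothing at each stage:

```python
import numpy as np

def derivative_chain(position, dt, order=6):
    """Successive finite-difference time derivatives of a sampled stream.
    Returns [position, velocity, acceleration, jerk, snap, crackle, pop]."""
    streams = [np.asarray(position, dtype=float)]
    for _ in range(order):
        streams.append(np.gradient(streams[-1], dt))
    return streams

dt = 1e-3
t = np.arange(0.0, 1.0, dt)
x = np.sin(2 * np.pi * t)  # stand-in for one axis of a sensor channel
pos, vel, acc, jerk, snap, crackle, pop = derivative_chain(x, dt)
```

Each of the seven streams can then be monitored, correlated and shared in real time, exactly as the text describes.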
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
I was reading Eric Baird's notes at
ResearchGate has no way for me to manage and share my content in an effective way. So I am copying this to the project. It is pretty rough (just a comment), but when trying to sort out gravitational signals, you have to use ALL the incoming data. And the spectra from stars and distant things are often shifted in frequency from what we measure locally.
(I just realized that monitoring the spectrum of things in the sky precisely might be sufficient to solve for local gravitational potential variations at the receiving site. Will check soon).
------------------
Eric Baird Thanks for posting this. As Abdul Malek indicates, this topic is much more complex and fragmented than it ought to be.
 
I have to deal with gravitational time dilation almost every day, trying to trace out gravitational signals of many sorts. I have come to know that the electromagnetic and gravitational waves share the same underlying potential.
 
Your pathways are good, but not complete enough. My own best guess right now is that much of the cosmological redshift ought to be re-labeled as "gravitational". Particularly since so many galaxies have black holes.
 
Personally, I believe that the "big bang" was a tiny event in a trillion-fold larger universe that is also trillions of times older. (I am looking for gravitational signals that would be related to quark gluon condensation novae, which should happen routinely inside galactic black holes as part of the evolution of stars to larger and larger black holes. They are not black intrinsically, but rather dense, and happen to trap visible light. I say "all black holes are quark gluon stars, and the largest can support core condensations that can be as large or larger than the big bang.")
 
The gravitational time dilation equation depends on the gravitational potential differences between locations, the difference in velocity potential (squared velocity), AND also the magnetic, nuclear and electromagnetic potential of the materials where the light signals originates, and as the signals travel.
 
Doesn't it seem strange to you that the path is not a straight line? Rather the earth and Milky way galaxy have turned many times while the signals take their journeys.
 
So, as a practical matter (I don't have time to argue lots of theories from equations) I assign about 80% of all "cosmological" redshifts to a simple difference in the TOTAL potential of the source and receiving regions. Gravitational potential, velocity potential, magnetic potential, electric potential, chemical potential, nuclear potential. It has many forms and all have common units so they can be interconverted. But it is tedious to translate, and mostly doesn't matter at all for the world that still has to deal with idiots who invade countries, death, sickness, poor education, greed, violence and shameful exploitation of the weak.
 
I made progress in the last 10 years or so by visualizing the actual signals as their spatial and temporal Fourier transform across a range of roughly picoHertz to AttoHertz. That isn't everything, but most of the actual data is in that range. So an "electromagnetic" signal usually has a much larger component of signal in those ranges we call "electromagnetic", BUT it also has "gravitational" parts that are much more fine grained AND larger. All the frequencies of electromagnetism are accessible to gravitational methods. If you want to look at "infrared gravitational signals", there are ways to tackle that. If you want "MegaHertz gravitational signals", there are ways to measure and work with that.
 
I review and try to organize literally tens of thousands of global topics - topics and issues that cross all country and organizational boundaries. That is the routine "job" of someone working on global communities on the Internet. These problems of fragmented research in "cosmology" or "general relativity" or whatever tag people use - they are just groups of organizations and individuals going about their business in their own directions, with their own purposes. And mostly not trying very hard at all to document carefully or to put things in order. NOT by arguing tiny things, but by putting the whole together so that anyone can see what has been done, what tools there are, what data exists. And to put those tools and data into forms accessible to all. And I mean literally ALL humans, not just "all people who studied a particular thing" or took a course somewhere, or have a job with a title.
 
I put the time dilation equations into a form so that the various potentials are all subtracted from the "universal potential" with the average value (C^2 about 9E16 Joules/KiloGram). (Multiply the square root term by C to change "1" into C^2 and all the places where you divide by C^2 are just potentials (Joules/kg) in absolute units.) That makes the time relative to the reference time a ratio of the index of refraction of the vacuum to the index of refraction of the vacuum where the gravitational and electromagnetic fields are accounted for. And the velocity is counted by its effect on the local rate of clocks. Change any of the potentials, and it changes the local "vacuum index of refraction", and that changes slowly enough and uniformly enough it can be shared globally on the Internet in locally useful forms.
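One way to write that rearrangement in LaTeX (my notation: here $\Phi$ collects the potentials in Joules per kilogram, and the usual weak-field form is assumed):

```latex
% Weak-field clock rate relative to a distant reference:
\frac{d\tau}{dt} = \sqrt{1 - \frac{2\Phi}{c^2} - \frac{v^2}{c^2}}
% Multiply under the square root by c^2 (about 9 \times 10^{16}\ \mathrm{J/kg}):
\frac{d\tau}{dt} = \frac{1}{c}\sqrt{c^2 - 2\Phi - v^2}
% Every term under the root is now a potential in absolute units of J/kg,
% and the ratio of rates can be read as a "vacuum index of refraction".
```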
 
The "gravitational weather" at any location on earth is absolutely critical now that so many industries and tools have reached the nano, pico, femto regions where to ignore the changing gravitational potential and the changing gravitational potential gradient variations with time is to let small errors creep in. I go so far as to say to myself - "These "quantum" or "nuclear" or "photonic" or "astrophysical" or "entanglement" or "Bose Einstein" or whatever experiment are sensitive enough to be affected by the changing gravitational field at all frequencies, so their "noise" is likely to have significant "gravitational components". Any time that happens, I try to flip those experiments and methods to use those experiments and methods as gravitational imaging array detectors. Not just simple single axis amplitude estimators.
 
It is not perfect, but who has time to spend years on each problem? Education, nutrition, clothing, housing, clean water, protection, jobs, resources for 7.9 Billion humans is a full time job. Will adding "gravitational engineering" to the mix make it easier? Some places, not others.
 
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
Just writing out some of my experiences measuring the speed of gravity, some trends in gravitational detector arrays, a hope that I will see the interior of the sun and earth using gravitational imaging array methods in my lifetime.
It is possible to measure the speed of gravity by using the network of superconducting gravimeters. They record the vector tidal gravitational acceleration due to the sun, moon and earth at the station. I did this almost two decades ago now. And repeated the same with the three axis broadband seismometers. The MEMS gravimeters now show the same. To match the very precise calculated Newtonian vector tidal gravity to the station data simultaneously at all stations requires that the gravitational potential reaching the stations arrives there at the speed of light. A difference of a second is detectable. It takes light about 499 seconds to reach the earth from the sun; after a coronal mass ejection on the sun, the change in the gravitational potential would take the same time to reach the earth and come to equilibrium.
To be clear, it is NOT the gravitational acceleration (1/r^2) that is flowing or diffusing, but the gravitational potential.
The Japan earthquake was strong enough to register signals at the speed of light and gravity on both the superconducting gravimeter and the broadband seismometer networks. These signals, due to the spreading seismic waves from the earthquake, travel as changes in the gravitational potential at the speed of light and gravity. There are groups trying to improve gravitational potential and gravitational potential gradient detectors (gravimeters) to allow for an effective earthquake early warning system.
In August 2017, the merger of two neutron stars was recorded as a gravitational wave in the LIGO network as event GW170817, and also on many simultaneous electromagnetic sensor networks at many frequencies of electromagnetic waves. And, racing the gravity signal and electromagnetic signals over a distance of 130 Million light years, they arrived at the same time. After many years spent going over the implications of the speed of gravity measurements connecting sun, earth, moon and stations as a routine thing, this event, and its demonstration that light and gravity travel together (gravitational potential changes are what LIGO measures), made me realize that the only way that can happen is if the electromagnetic field and the gravitational field share the same underlying potential field.
When I was studying gravitational wave detection at the University of Maryland at College Park in the latter half of the 1970's, I was working part time with Steve Klosko at Wolfe Research and Development on the NASA GEM series of gravitational potential models - using satellite orbits, C-band radar tracking, laser ranging and other methods to carefully track things in orbit to extract the gravitational potential field necessary to fit the motions observed. So I had also gotten a good intuition for how the potential field works.
The gravitational potential (in my view) is a fluid or dense gas that is already at every location in space. It fills the universe. Changes in the density of voxels, because of moving masses, change the gravitational potential. And this is by diffusion. So I say "the speed of diffusion of the gravitational potential is the same as the speed of light." Now are there signals that can be transmitted at much higher speeds? At many times the speed of light? I would say that is very possible. And it would be a different kind of wave than that due to the flow of potential fluid or gas or boson superfluid (depending on the model you choose).
I have followed most every group trying to make more sensitive gravimeters. Because the gravimeter is measuring the second derivative of the strain, at higher frequencies each term of the spectrum is multiplied by the square of the angular frequency. So the absolute value of the signal gets larger as you go to higher and higher frequencies. If you, as I have done many times, walk through how the gravitational potential field comes to equilibrium when observing at high sampling rates (MegaSamplesPerSecond (Msps), Giga, Tera and higher), those squared frequencies make gravimeters, and jerk detectors (jerk is the third time derivative), or higher time derivative detectors, likely the better choice for following what is happening to the gravitational potential at high frequencies.
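The squared-frequency point in numbers: for a sinusoidal displacement of amplitude A at frequency f, the acceleration amplitude is A(2*pi*f)^2, so the same displacement looks twelve orders of magnitude larger in acceleration at 1 MHz than at 1 Hz (the values here are purely illustrative):

```python
import math

def accel_amplitude(displacement_amp, freq_hz):
    """Acceleration amplitude A * (2*pi*f)^2 for a sinusoidal displacement."""
    return displacement_amp * (2 * math.pi * freq_hz) ** 2

# The same 1 picometer displacement, seen as acceleration:
low  = accel_amplitude(1e-12, 1.0)  # ~4e-11 m/s^2 at 1 Hz
high = accel_amplitude(1e-12, 1e6)  # ~40 m/s^2 at 1 MHz
```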
The ONLY way I have found to separate gravitational and electromagnetic signal sources is to take samples at high enough sampling rates to use time of flight methods, using arrays of detectors to correlate and identify the places in space and time where the signals could originate and arrive at the detectors at the speed of light. So I have looked at atom interferometer gravimeters, and MEMS gravimeters (there are two that I know have been tested, at Liang Cheng Tu's group in China and Richard Middlemiss' group in Glasgow, Scotland). There are electron interferometer gravimeters, seismometers optimized to work as gravimeters, a whole range of sensitive high sampling rate "vibration sensors" that could be adapted. There are electrochemical gravimeters. You can use interferometry to precisely track the motion of masses down to the nano and pico meter range reliably at low cost. You can use atomic force methods to do the same.
If you search "atomic force" "gravimeter" on Google you will find several groups trying to adapt the atomic force methods to high sampling rate gravimeters. I worked with those enough to know that the higher order resonances in the cantilevers are significantly more sensitive.
Without going through all the groups working on different detectors, I will encourage you to stop wasting more time trying to work things out from old theories. They can be a guide, but measuring things is much more satisfying and practical. The MEMS gravimeter is just a MEMS accelerometer. I coined the rule: "if an accelerometer is sensitive enough to measure the vector tidal gravity signal on all three axes, it can be called a gravimeter". Do you understand? The gravimeter is just a sensitive accelerometer measuring the accelerations locally. Measuring the gradient of the gravitational potential locally. The gravitational potential is changing.
I am rather tired just now. I tried for many years to encourage groups to use the Jet Propulsion Labs solar system ephemeris, which gives precise positions of the planets, sun, moons (and I think a lot of the asteroids and comets and things) and solves for the gravitational potential of the whole solar system. The calculation of the vector signal at a superconducting (or MEMS or electrochemical or atomic force or atom interferometer) gravimeter only requires the positions of the earth, sun, moon and station. The masses of the sun, moon and earth. And the earth rotation rate (for the calculated vector centrifugal acceleration at the station) to calculate the vector tidal acceleration at the station. Then a LINEAR regression (offset and multiplier) to fit each of the three axes, to calibrate the station in absolute units (locked to the solar system ephemeris and the JPL network of sensors and calculations). You can "weigh" things locally against the sun and moon signal.
In words it is "the sun's gravitational acceleration at the station minus the sun's gravitational acceleration on the center of the earth" plus "the moon's gravitational acceleration at the station minus the moon's gravitational acceleration on the center of the earth" plus "the centrifugal acceleration of the station due to earth rotation". That vector, rotated into station North East Vertical coordinates, ONLY requires a linear regression to match the signal. The offset (the constant) is tied to local variations at the station. For a superconducting gravimeter you can take the daily regressions (hourly, or every minute if you want) and trace out every event in the station logs - changes in the helium level, changes in the power supply, outages, temperature changes. I did that a few times, but no one seemed interested in combining all the gravimeters on earth into a single gravitational detector array.
This simple "Newtonian vector tidal gravity signal" allows the sun and moon portion of the signal to be removed is a globally standard way. Not much quibble about the model or what it is tied to. I tried using the traditional earth tide models first, but the Wentzel died and his widow would not let me go through his notes to improve ETERNA. So I used my old orbital mechanics background and guessed the form of the signal. Then spent a year getting it to fit precisely to the superconducting gravimeter network. I did all the data they had at that time. Because the SGs (superconducting gravimeters) are only single axis instrument you cannot take data from a single station and solve backwards for the sun and moon. But the broadband seismometers are three axes, can operate as gravimeters (albeit not very sensitive ones) and the three axes calibrated to the sun moon signal. A Newtonian calculation.
I am writing this out here because I am tired of explaining this. I spent much of the last two decades looking for high sampling rate, low cost gravimeters to combine to image the interior of the earth, the atmosphere of earth, the magma inside the earth, the oceans of the earth in real time using the residual after locking all the detectors to the sun and moon. The local residual is primarily atmospheric density and wind, atmospheric and soil moisture, earth tides and "earth local" effects. AND they all arrive at the station at the speed of diffusion or flow of the gravitational potential. "The speed of light and gravity" So you can use an array around a city to constrain models of the local atmosphere, correlate with radar tracking of storms, model flows of moisture in what are called "atmospheric rivers".
I have worked out how to do the time of flight (speed of gravity) correlations to focus arrays on any voxel in the sun earth moon or space. Or to look outwards. For all practical purposes it matches well with low frequency magnetic and electromagnetic methods in the nanoHertz to GigaHertz range of sampling frequencies. If you measure using software defined radio direct sampling methods you say "samples per second" or "measurements per second" or "frames per second" or "records per second".
Jan Harms at LIGO solved for the gravitational potential changes due to seismic waves. He did it for the strain data at the LIGO network. But the gravimeters can run at much higher sampling rates, so they can form images more easily than those few giant, expensive, not accessible sites.
Now I have an odd history of jobs. I worked in research at Phillips Petroleum on 3D models and visualization tools using microcomputers. This is because I was considered an expert in GIS on microcomputers from my earlier work on the Famine Early Warning System (it is still going, after I set it up in the mid 1980's) FEWS.net. My point is that I met people doing subsurface imaging and exploration using magnetic and gravitational and electromagnetic methods. And I tried to improve those devices and combine those kinds of data streams, as I had been combining data from many sources before.
Well, I could write for months and not cover everything. But I will mention that I did check to see if the vector tidal signal needs to be corrected for the station's rotation about the sun moon barycenter. And the answer is yes, but it is not clear. Remember the gravitational potential has more in common with a fluid at these high sampling rates. So it is better to think of "flows", "fluctuations", "currents", "compressibility", "composition", "fluctuation spectrum" than some idealized rigid and unchanging field. I have been working on the theory of gravitational signals and the properties of the gravitational potential ever since Joe Weber told me that gravitational signals can be used for communication, and Robert Forward wrote that the gravitational and electromagnetic fields are based on the same foundation, you just need to have a common set of units. I worked that out for everyday things.
Readers probably won't care what I did. To me it is my memories and things I felt were important. Now I am getting older, I can track a few things, but I don't have the resources to pay people to make the kinds of sensors that I know can be used for this kind of imaging and communication. Robert Forward wrote about how to do lab measurements of gravitational signals. I have not done that, but it is well within the capabilities of most groups doing atom interferometer work now - if they would go to higher sampling rates.
The global lightning detection network evolved from Msps (mega samples per second) analog methods to global Gsps digital methods. And in the process allowed for global real time tracking of lightning. The same kind of approach will work to track earthquakes and volcanoes. Now if you set up global arrays, the sensitivity goes up, so the requirements for correlations go up, so the cost of computers and processing goes up. But the correlation nodes and methods for radio astronomy and VLBI will work just fine with gravimeter or gravitational detectors.
Now the LIGO and Mossbauer, atomic clock and any "gravitational time dilation" detectors are what I call "direct gravitational potential detectors". They measure the change in the "pressure", "density" and composition of the gravitational potential. The combined gravitational and velocity time dilation equations that are used routinely for GPS and orbital corrections can have terms added for time dependency. And they can be put in a form that uses the energy density, so that magnetic and electric and pressure fields can be used. The gravitational energy density at the earth's surface is roughly equivalent to a magnetic field of 380 Tesla. If you use the blackbody temperatures it is a bit over a million degrees, and the radiation temperature is about 1200 electron volts. I am writing from memory and I don't use those often. I do keep in mind that the energy density of the gravitational field at the surface of the earth is in the "soft x-ray" region, so I read EVERY x-ray paper that comes out, as I have time. Because at some point it will be possible to map and measure the local gravitational potential and its gradients at high 3D and temporal resolution.
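The 380 Tesla figure can be spot checked. I am assuming the common Maxwell-style analogy u = g^2 / (8 pi G) for the gravitational field energy density at the surface, mirroring u = B^2 / (2 mu0) for the magnetic case:

```python
import math

G   = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T m / A
g   = 9.81                # surface gravity, m/s^2

u = g ** 2 / (8 * math.pi * G)    # field energy density, ~5.7e10 J/m^3
B_equiv = math.sqrt(2 * MU0 * u)  # magnetic field with the same density, ~380 T
```

Under that convention the equivalent field comes out just under 380 Tesla, consistent with the figure quoted above.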
I have worked hard for the last 24 years to study how groups use the Internet for global collaborations. Particularly for the cross-cutting issues that affect all countries and all people. There are tens of thousands of these. EVERY country now is duplicating research, because every small group wants fame and fortune from anything they do. Rather than sharing what has already been found as living tools and workplaces with all the data that is available, each group tries to do the whole thing themselves. Now I have been writing for almost two hours and it is already past midnight. So I want to share what I have learned, but I just don't have much energy left.
I would not have spent more than 40 years trying to sort out "What is the gravitational potential" if Joe Weber had not been so adamant that it was for communication. And Robert Forward wanted very much to combine the gravitational data and tools with electromagnetic data and tools. Now I would say "combine gravitational and electromagnetic sensors in global arrays with all the global sensor networks". For the Internet Foundation I found them all, learned how these sensors work, how to use the data, how to combine and correlate the data, so that any "purely gravitational" portions can be extracted.
But if you set the expression for the gravitational force between two masses equal to the electrostatic force between the same two masses (given some charge per unit mass) and ask "how many electron charges per kilogram would be enough for the gravitational force to be replaced by the electrostatic force" - it is just a few hundred million. Not Avogadro sized. I wrote that for someone here the other day. Here is the link https://theinternetfoundation.net/?p=3556 and it was 537,851,028.228 electrons per kilogram - about half a billion electron charges per kilogram. And that is probably temperature dependent. I have other ways to model gravity, but it bothers me that people solve these same problems over and over again, and don't simply measure, correlate, share data (not words about data), share algorithms (not words about algorithms).
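The half-billion figure is easy to reproduce. Setting G m1 m2 / r^2 equal to k (n e m1)(n e m2) / r^2 and solving for the charge density n gives n = sqrt(G/k) / e. A quick check (the exact digits depend on the value of G used):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
k = 8.9875517873e9   # Coulomb constant, N m^2 / C^2
e = 1.602176634e-19  # elementary charge, C

# G*m1*m2 = k*(n*e*m1)*(n*e*m2)  =>  n = sqrt(G/k) / e
n = math.sqrt(G / k) / e
print(f"{n:,.0f} electron charges per kilogram")  # about 5.4e8
```

This lands at roughly 538 million electrons per kilogram, consistent with the figure above.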
If you go to https://hackaday.io/project/164550-low-cost-time-of-flight-gravimeter-arrays on the left side you can see one month of a calibrated dataset at a superconducting gravimeter station. This is just the vertical signal. The ONLY PARAMETERS needed are two numbers - an offset and a multiplier. That complex signal that varies in direction and intensity is a one line equation, as I mentioned above: the sun at the station minus the sun at the center of the earth, plus the moon at the station minus the moon at the center of the earth, plus the centrifugal acceleration of the station on an ellipsoidal earth (WGS84 is what I used). Other than that the only data needed is the station latitude, longitude and height, and the sun, moon and earth GM values. That's it! Nice, easy Newtonian gravitational acceleration. You can get the vector positions of the sun, moon and earth in station centered coordinates for any time period from the NASA Jet Propulsion Labs Horizons system at https://ssd.jpl.nasa.gov/horizons/app.html#/
The constants you need are under "extras" "astrodynamical parameters" at https://ssd.jpl.nasa.gov/astro_par.html
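As a sketch of the one-line equation, here is the moon's term alone, for the simplified case of a station at the sub-lunar point, with round mean-distance constants standing in for the JPL vectors:

```python
# Lunar tidal acceleration for a station directly under the moon
# (collinear case; real use needs the full JPL vectors, the sun and
# centrifugal terms, dotted into the station N/E/V unit vectors).
GM_moon = 4.9028e12    # m^3/s^2
d_center = 3.844e8     # mean earth-moon distance, m
R_earth = 6.371e6      # station at the sub-lunar point, m
d_station = d_center - R_earth

# moon at the station minus moon at the center of the earth
a_tide = GM_moon * (1 / d_station**2 - 1 / d_center**2)
print(f"{a_tide * 1e9:.0f} nm/s^2")  # roughly 1100 nm/s^2
```

That magnitude is consistent with the +/- 1000 nm/s^2 daily variation quoted for the combined sun-moon signal elsewhere in these notes.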
I wrote the original model in Turbo Pascal and calculated the positions from the JPL Chebyshev polynomials. It is not hard, just really tedious. If you want to use MatLab or Python, they have those calculations built in now. AstroPy has them. I think Mathematica and Maple have them. I wrote Javascript programs to take the data from Horizons and do the correlations automatically. But JPL recently updated their Horizons site and I am not sure those work now.
Putting data online with all the required tools, where the programs can run on your computer would be the best solution. I asked JPL about taking data from global networks of gravimeters to give data back to JPL about the positions of the sun and moon.
When I was going through literally EVERY seismometer dataset to see which ones might be able to detect the sun moon vector tidal signal, it was brutal hard work. I had to use a minimum of a month at a time to see the signal at all, and I did not know how to use IRIS.edu that well. I worked blind for close to six months, already knowing how to calibrate a single axis SG (superconducting gravimeter), before I had a "first light" view of all three axes of a seismometer, with just the two numbers (offset and multiplier) per axis to lock that instrument to the sun and moon signal. What I found was a set of instruments called the "Transportable Array", placed all over the United States. And, because they moved, sometimes there were instruments where the North East Vertical orientation of the instrument was not clear, or not correct, or not written down. So, with only a linear regression needed, there is a LOT of data to work with for three axes. And if you run regressions allowing the orientation and position to vary, you can, over time, put bounds on the position and orientation of the instrument in earth centered and station centered coordinates. The JPL positions are good to 1 meter, 10 meters, or 100 meters - it depends on the body. But they are doing lunar and Mars and all-over-the-solar-system projects - more and more every year. And I don't think they miss their targets by much.
I call this method of using gravitational signals for precise position AND orientation "gravitational GPS". It works in caves, or under water, or in bore holes, or in places where electromagnetic signals are not going to work. It is mainly a long term thing, with the current sensors. I think both Liang Cheng Tu and Richard Middlemiss used cell phone MEMS accelerometer technology as a starting point for their "gravimeters" (something that can measure the sun moon tidal signal). Could that be pushed, or some other technology, so that everyone can carry around a three axis gravimeter and use it to know where they are from the local gravitational gradient changes? Kind of a stretch, but probably possible. I do know that there was a contract announcement for methods to update the magnetic orientation network to levels needed for cities and the military now. I am just too tired to see now after three hours writing from memory.
I cannot just tell people what to do. I cannot pay them. I cannot afford most of the detectors and I cannot run an array of them myself, because it requires spreading them to every part of the globe. But time of flight (high sampling rate) gravimeters and other gravitational detectors can focus inside the earth, or on the surface of the sun, or at the surface of the core of the earth, or on the atmosphere or the oceans - and image them, the data used to improve climate and gravity models. It is not hard, just tedious. But I think the rewards and returns are worth it. Because in the process we could measure the speed of gravitational field adjustments as precisely as electromagnetic ones, use such signals for imaging, and use them for communication. I have some examples.
Sorry to write so long. I am a cautious person. I spent most of my life in public service in health, education, international development, energy, transportation - planning, modeling, gathering and organizing data for global problems.
Oh, I did also track most all of the levitation methods - field methods for moving masses - acoustic, electromagnetic, magnetic, beams, lasers, explosions. There are microgravity groups nullifying gravity that way. I would say, as I have for 40 years now, that if you create an acceleration field that matches what gravity does, then it is equivalent. Groups can move things with fields now in programmed and precise paths (orbits, even if they are arbitrary human chosen paths). That is good enough for me. Can we measure and exactly match gravitational acceleration inside whole volumes? Probably. Yes. It is not hard, just tedious. I call that "gravitational engineering". Hard work, lots of measurements, lots of data, and eventually nearly absolute control over matter and energy. Probably not in my lifetime.
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
Ileana-Christina, and others,
I saw you have a series of these including some for fluctuations. This is long but I hope helpful to do or keep in mind.
Please, if you can, resolve the fluctuations on three axes and run it for several days. A once per second, or once per minute, average based on counts of the sizes of the fluctuations is sufficient to see if these are fluctuations in the gravitational potential. The signal measured by a superconducting gravimeter is the acceleration (gradient of the gravitational potential). It varies by +/- 1000 nanometers/second^2 over a day. It is very precise. Even in the presence of a lot of noise, because you only need to fit 6 constants to a month of data, you can tell if you are seeing it.
I have been tracking progress in gravimeters for a couple of decades. I used it to calibrate the network of superconducting gravimeters to measure the speed of gravity. Those are single axis devices with 0.1 nm/s^2 resolution at 1 sample per second. I was lucky: there are a few broadband seismometers in the IRIS.edu seismometer database, and they are three axis, 100 sps instruments. They report velocity, but a simple differentiation gets them to report acceleration. At quiet stations and in quiet periods you can see the sun moon tidal signal. It only requires a linear regression to fit each axis. There is a lot of data cleaning, and some stations are just too noisy.
There are now MEMS gravimeters, atom interferometer gravimeters and gradiometers, electrochemical gravimeters, Bose Einstein gravimeters, and other "quantum" experiments starting to pick it up. The sun moon vector tidal signal is just the Newtonian acceleration of the sun at the station minus the sun at the center of the earth, plus the moon's acceleration at the station minus the moon at the center of the earth. Add the centrifugal acceleration of the station from rotation and dot that into the station North, East and Vertical unit vectors. With the motion of the sun and moon, and the precise JPL ephemeris, you cannot "game" the signal. It is precise in time and strength. With only a linear scale and offset to get the instrument locked to a common reference anywhere in or on the earth, it can be used for solving for the position or orientation of the instrument. That is helpful for portable devices (MEMS gravimeters could be used in cell phones; same technology, slightly different readouts).
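The two-parameter lock is ordinary least squares per axis. A minimal sketch, with a synthetic sinusoid standing in for the JPL tidal model and invented offset, scale and noise values:

```python
import math
import random

random.seed(1)

# Synthetic "model" tidal signal (a sinusoid standing in for the JPL
# ephemeris) and a noisy "instrument" reading offset + scale * model.
t = [i * 60.0 for i in range(24 * 60)]  # one day at one sample per minute
model = [800e-9 * math.sin(2 * math.pi * s / 44712) for s in t]  # ~semidiurnal
true_offset, true_scale = 3.2e-6, 1.87
meas = [true_offset + true_scale * m + random.gauss(0, 50e-9) for m in model]

# Ordinary least squares for the two parameters (closed form)
n = len(t)
mx, my = sum(model) / n, sum(meas) / n
sxx = sum((x - mx) ** 2 for x in model)
sxy = sum((x - mx) * (y - my) for x, y in zip(model, meas))
scale = sxy / sxx
offset = my - scale * mx
print(f"offset = {offset:.3e}  scale = {scale:.3f}")
```

Even with noise comparable to the signal, a day of data recovers the offset and multiplier closely; a month, as above, does much better.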
About 98.5% of the variation of the signal at an SG station is the sun moon signal; most of the residual is atmospheric, and then other things. I am trying to get arrays of these. After you remove the sun moon variation, the residual signal is mostly atmospheric, so it is a good strong signal for correlating with climate models, weather radar, lidar, and a rapidly growing set of electromagnetic and acoustic 3D imaging methods for monitoring the structure of the atmosphere. The correlation between sensor networks gives a reference signal, and they can gradually converge and agree on what they are measuring.
Because the low frequency electromagnetic signals, the magnetic variations at low frequencies, the low frequency variations in the thermal radiation fields (use the temperature of the air and ground, and estimates of the intensity (Watts/m^2) from every voxel, and relate that to the energy density variations), and infrasound are all hard to shield, they have similar requirements for determining which signal is actually received, or what mix.
They all seem to be part of one field. I just call it the gravitational potential. I have been specifying three axis, time of flight instruments (mega samples per second and higher preferred) so that both the three axis signal (to lock to the sun and moon for a global reference) and the correlation methods for high sensitivity, high frequency time of flight localization and characterization can be used to bootstrap global networks and devices making comparisons. I want to use global gravitational sensor networks to scan the near regions, atmospheres, oceans or convection regions, and cores of the earth, sun, moon and planets.
Atom interferometers (I am also tracking electron interferometer methods) can get to sensitivities sufficient to track the gravitational potential changes from spreading seismic waves, which propagate to sensors at the speed of light and gravity. All the attention is on black holes and neutron stars. But the Japan earthquake registered gravitational signals on both the superconducting gravimeter and broadband seismometer networks. There are a few people trying to build detectors. So, even if gravitational fluctuations and signals are not your main purpose, can you take a few minutes to consider the needs of earth-based networks for faster, more sensitive and lower cost detectors for earthquake early warning, and for calibration of atmospheric and ocean models (GRACE Follow-On satellites can image ocean currents from orbit)?
The current gravitational potential model is about 2160x2160, but it keeps improving. I did orbits with 8x8 potentials in 1970, and then worked on the NASA GEM series (satellites and other sensors) about 1978, when they were 32x32. I spent almost a year understanding the SG signals and noise. The best I could do was to say that the speed of gravity matched the speed of light to within +/- 0.2%, because the SGs only have one axis.
Anyway, if you don't want to do your own calculations: if you can take readings on a regular schedule over several days or longer, and tell me when and where, I can calculate the signal and send it to you. I have been using Excel or Sheets because I like to look at the other signals. It is common to find five or more sources in each data stream. I can often separate them by using data that is shared on the Internet, but purpose-built sensors would be easier.
I keep telling Jan Harms and others at LIGO that it is not "Newtonian noise" or "gravitational (potential) noise" but "Newtonian signal" and "gravitational signal". I went to the University of Maryland at College Park, where LIGO started. But I followed the Joe Weber and Robert Forward track and focused on the earth's gravitational potential and the solar system. They wanted to build gravitational reference signals and detectors, and to connect things. The three axis sensors can be calibrated with a fairly simple lab setup, by doing what I have been doing with the sun and moon - move the masses in well characterized and verifiable paths so that the timing and precise shape of the signal on all three axes, or the gradients of a gradiometer, all match. I want to measure the mass of the sun and the moon that way. Nothing so lofty. Just lock the big G experiments to the GM values for the sun and moon and other things by direct measurements.
I have been at this for just over 40 years now, though the actual time on it has only been two or three years full time equivalent. It took me 6 months to go through all the seismometers in the IRIS.edu system before I knew which specific models, in which specific locations and operating conditions, were getting clean enough signals to work with.
The LIGO strain data is only 16384 samples per second, which is about 18 km resolution. If all three were running you could take the signals and look at larger earthquakes or the sun. But it is too ambiguous, which is why I want to get the purpose built gravimeters and seismometers working as well. Liang Cheng Tu in China and the University of Glasgow have MEMS gravimeters. They have had them for two years now. I know their signals wander, but that is why I recommended the lock to the sun moon signal. It is a reference with a solid pedigree (JPL); you get started and then improve. They probably need some engineering and manufacturing help. There are plenty of people now.
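The 18 km figure follows from the distance light (and gravity) travels in one sample period, which I take to be the resolution estimate intended here:

```python
c = 299_792_458   # speed of light (and of gravity), m/s
fs = 16_384       # LIGO strain sample rate, samples/s

resolution = c / fs   # meters traveled per sample period
print(f"{resolution / 1000:.1f} km")  # about 18.3 km
```

Higher sampling rates shrink that cell size proportionally, which is why Msps and Gsps instruments are interesting for time of flight work.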
I keep wanting to write everything I can think of.
Pretty much every experiment that is getting down into the nanovolt range is going to pick up fluctuations. You cannot separate them from thermal noise, or power system noise, or spherics, or a host of human and natural noises. You can put multiple sensors in clusters. You can stack different kinds of sensors together (gravimeters, seismometers, infrasound, meteorological, radiation field (3D temperature), magnetometers, lightning detectors, VLF (electromagnetic), human electromagnetic interference networks of all kinds, nonproliferation networks). There are more. I am getting tired of writing.
GravityNotes.Org has some of the earthquake early warning work. The GW170817 neutron star merger showed that the speed of gravity and the speed of electromagnetism are identical. Not close, identical. That means they share the same underlying potential. In my mind I classify the devices as "direct potential detectors" that track the pressure or energy density in the potential by its effect on the local speed of light and gravity. Mossbauer is in that class, and atomic clocks, and atom and light interferometers like LIGO and some proposed follow-ons. All can be operated as gravimeters, and there are so many now it is hard to list them all.
The second link at the top is a spreadsheet that fits the station data for a month from a superconducting gravimeter to the sun moon vector tidal signal. I wrote a Javascript program to use the JPL Horizon system. Then did the linear regression and plots in Excel. I am trying to keep things simple and traceable so that anyone can do it. I am encouraging high schools and colleges to use it as a simple introduction to gravimeter design and the effects of the changing gravitational potential on clocks, and the potential gradient on gravimeters and gradiometers.
When I hear that a system has to adjust for "earth tides", I know they are reaching the level of precision in their experiments where they can use the whole instrument as a sun moon detector on three axes (or a gradiometer) and lock it to the sun and moon. The new accelerator designs, and most fusion projects that hope to reach stability, are in that category.
Hope this helps a bit.
ANY electronic device where you measure single electron, or electron or hole, well voltages to nanovolt or finer resolution, and where you run it for days at at least 1 sample per minute, can usually pick up a three axis signal from the sun and moon. I have not been able to do that yet with cameras (tens of megapixels at frame rates in the hundreds of frames per second) when they are darkened and measuring "noise"; some of that is magnetic, some is gravitational, some is lattice, some is local electromagnetic and other sources. But they can be sorted out, bit by careful bit. Also the electron wells (basically any nano or pico sized conductor) in floating gate devices can be instrumented like pixels in cameras, with precision control over the reference voltage, amplifiers, and ADCs. For electron escape probability measurements, or quantum Hall effect, or many "quantum" experiments, you can work out the acceleration of the electrons too. In the electrochemical and plasma and metal vapor devices you tag things.
I am trying to get all the different models into a set of online tools for people doing different experiments. I can do them one by one, but I can't do it for everyone. It is not really that complicated. Look at the acceleration of the electron or ion, and then use statistics to improve the precision of whatever sensor you choose to try. Run it long enough to lock to the sun moon signal. Use that to guide improvements. If you have to use FFT impulse response methods, or be a bit creative with nonlinear spectral response, there are lots of tools out there and groups doing that for other purposes.
If you make a one shot experiment and post it on paper on the Internet, it will be slow to develop. If you run a continuous experiment and share your raw data and archives and models (in symbolic mathematical and computer program form, so they can be immediately verified by anyone and improved; if someone verifies, they share it too - NOT hidden away but shared), you can bootstrap quickly and get global communities helping. That kind of sharing is what I do for the Internet Foundation (my full time job). And it can transform an industry or subject in days and weeks, where otherwise it takes years or decades.
I see every day groups that jump to hoard or limit what they share. It enriches a few people and can delay verification and practical applications by years.
If you do want to publish, do not use PDF. It flattens everything. You can set up sharing yourself and let people work alongside you, but learning how to use the Internet for that is troublesome. NOT a single website in the world has good sharing practices for everyone. They might serve a few ten thousand insiders and people already in the field, but prevent anyone else from seeing. For publicly funded projects, or those that say they are for the whole world and then only really serve a few ten thousands, I only feel sadness and disappointment.
Richard Collins, Director, The Internet Foundation
 
Richard Collins
added an update
A message and comments regarding Maurice's project to localize gravity.
My own visualizations and ways of doing things. They work for me. I have to look at every group on the Internet, and this seems close to the common denominator for most people in practical situations wanting to move things around with fields. Or to use high spatial frequency imaging with 3D video outputs.
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
I will try to keep this short. I happened onto the phrase "pixel access time" while looking for anyone who is using optical sensors or nano sensor arrays that might be adapted to time of flight use for gravitational signals.
This led to "time resolved detection", "time resolved fluorescence" and a wide range of time resolved methods across all spatial and time frequencies, and their corresponding energies. So x-ray, microwave, and all the photons above kT, while I continue to develop methods for waves below kT. They will be there regardless, as part of a linearly independent set of sources on a nearly incompressible potential.
I have been using a classification scheme for all knowledge on the Internet that covers the powers of 1000 from 10^-30 to 10^30. We use micro, nano, pico, Giga so casually, but if they were used as a coordinate system, and everyone located their experiments properly there, we could more easily locate similar experiments. We do that, sort of.
I have no doubts now that all these "electromagnetic" and "acoustic" tools, models, experiments and measurements can all be simply transformed to gravitational notation.
I am taking it literally that gravity and acceleration fields are identical, where the gravitational/electromagnetic potential is the intermediary. So if a field transmits force and creates acceleration, it is gravitational. This is part of a general rule that says ALL variables and measurements must be labelled properly with units and dimensions.
And, further, they must be tied -- in context and immediately everywhere used -- to their derivations and connected data, groups and networks. I will try to make that clearer with time. For now I will keep collecting all the electromagnetic phenomena and tools that can be adapted to the full range of gravitational sensing, imaging, communication, and control. And, of course, force generation.
Richard Collins, The Internet Foundation
 
Richard Collins
added an update
I made a video a couple of days ago.
Using Optical Sensors to Measure Magnetic Electric Gravitational and Acoustic Fields
I have been going over many fields, trying to fill in the details of the relation between gravitational fields and electromagnetic fields. This is part of a note I just wrote to someone summarizing my current progress. So I modified it a bit to share with anyone who might be interested in solar system gravitational and low frequency magnetic imaging.
I was reading George Feher's 1957 paper and then searching for anyone interested in such things. He writes about the sensitivity of paramagnetic resonance experiments. I am trying to go through every instrument and experiment that is reaching part per trillion in ANY measurement. Or because of high sampling rates and large bit sizes has the potential, even if it requires running for months or years at a time, to get to the levels where one can image using magnetic fields or gravitational fields. Particularly in the solar system, and the whole sky.
I have been working on gravimeter and magnetic imaging arrays. Balancing sensitivity, cost and capabilities is a constant part of trying to piece together global imaging arrays. But I am getting older and tired, so translating into yet another technical language is getting harder. I am up to mid GHz and THz methods that might be useful for detecting and correlating gravitational signals; that would help a lot. ANY electron device can be solved for the electron accelerations involved, and then evaluated for its sensitivity to gravitational potential, and gravitational potential gradient, measurement. Bose Einstein, "quantum", communications systems, optical sensors, nuclear reactor neutron statistics, updated Mossbauer measurements, single atom techniques of all sorts, magnetic resonance measurements of all sorts. Anywhere a charged particle is involved, and the instrument is capable of gathering close to a gigasample per second (Gsps), is likely a candidate.
I found that if you want to trade measurements for sensitivity, people normally average, and essentially throw away (1 - 1/sqrt(N)) of the measurements. But if you keep ALL the data, you can separate it into its individual sources and use pretty much all of it. I am working on that right now. There are many lossless methods of summarizing large data streams. That helps when you do not know what the signal looks like. I have not scanned the interior of the earth with a gravimeter or magnetometer array, but I expect the first images to be chaotic. Though by the time I get there it probably will make sense. I think of Fermi standing there doing the calculations by the pile, and later dropping small bits of paper. He had to go through a lot to develop the skills needed for those moments. I try hard, but I am getting old.
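The averaging trade is easy to see numerically. One way to read the (1 - 1/sqrt(N)) figure: averaging blocks of N samples cuts the noise by roughly sqrt(N), so in that sense the rest of the raw information is discarded. A seeded check:

```python
import random
import statistics

random.seed(0)
N = 100  # samples averaged per output point
raw = [random.gauss(0, 1) for _ in range(100 * N)]  # 10000 raw samples

# Block-average: each output point is the mean of N raw samples
means = [statistics.mean(raw[i * N:(i + 1) * N]) for i in range(100)]

ratio = statistics.stdev(raw) / statistics.stdev(means)
print(f"noise reduced about {ratio:.1f}x (sqrt(N) = {N ** 0.5:.0f})")
print(f"fraction of raw samples 'discarded': {1 - 1 / N ** 0.5:.2f}")
```

For N = 100 the averaged stream is about 10x quieter, but 99% of the samples are gone; keeping all the raw data preserves whatever other sources are mixed in.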
The gravimeters developed out of accelerometers. I tell people designing new gravimeters that when they can monitor the sun moon vector tidal signal, they can call it a "gravimeter". The next test is when they can do a basic speed of gravity experiment. Only a few people are thinking of transmission and detection tests, even though Robert Forward and others wrote down the basics 40 years ago. Sorry, I have it all in memory.
I am talking to people who use optical sensors, particularly high speed cameras where the sensor is capable of regions of interest. By monitoring the emission rate from the electron wells when the sensor is covered (blocking visible light and some infrared), the data stream is a rich source of data on many different noise sources, some close and some more distant. Among them are gravitational acceleration signals from changing densities on the earth and further away. I know that time of flight and correlations can help locate and characterize the sources, where there are regions of stronger signals that many different sensors and arrays can all see. So I am trying to get data to start the long process of identifying and characterizing some source of magnetic variations that can be used for calibration - just as I have used the sun moon tidal signal to calibrate the superconducting gravimeter and broadband seismometer arrays (used to measure acceleration). The MEMS gravimeters (Glasgow and China) won't be available until later this year, and they have no market yet.
The large number of samples at Msps from these devices, where there is control over the gain and threshold for the wells (fairly simple to model), allows sorting out the electric, magnetic, gravitational, acoustic and radiation field sources arriving at the sensor cluster. The Msps gives km spatial resolution and I am trying to find common signals for comparisons to get started.
I say "cluster" when the sensors are next to each other, and "array" when they are spread over the earth and space to get uniform or specific coverage.
I am slowly making my way up the frequencies. The gravitational and electromagnetic fields share a common potential. My original reason for calibrating the SG network was to measure the speed of gravity. It gave the "trivial" result that the speed of gravity is the speed of light. The GW170817 event showed that the speeds are identical. That can only happen (or is very likely only to happen) if they share the same underlying potential. I am investigating what it would mean if they were part of one underlying process. As I have gone at it, the last three years in particular, I am finding that all the sensor networks use a core of common methods - including ways to estimate sensitivity, sample size for a given sensitivity, sampling rates and noise levels. EVERY sensor's noise comes from somewhere. And most of the global (geophysics, astrophysics) sensors get noise of all types.
I am re-reading James Melcher's book on Field-Coupled Surface Waves. I read it a few years after it came out, and was struck by the intent to simplify the terminology and the problems of using mixed electromagnetic, acoustic, mechanical, thermodynamic and acceleration fields. Now I can condense that down to "electrons", "gravitons" and "ions" and all their interactions.
This is long. I do not have much time or energy. So I try to be as complete as possible.
The Bose Einstein Condensate gravimeter seems to allow sensitivities of 10^-15 at GHz analog sampling rates. I do not have the background, at least at the instrument and experimental detail level, to know exactly how to take that insight and make a Gsps/GHz gravimeter sensitive enough to monitor the sun and moon, and to track the seismic waves from mid sized earthquakes. There are plenty of people working on direct gravity signals from earthquakes, and plenty using other methods for earthquake early warning and characterization. But it is an available global data community with a large amount of data for correlations.
But my simple minded idea is that the electron can be used as the test mass for gravimetry, when there are enough independent samples to use statistical methods to improve the sensitivity. I have had to go through many, many super resolution techniques looking for ways to build low cost, high sampling rate gravimeters suitable for imaging arrays that can scan the interior of the earth and sun, the atmosphere and ocean, the planets, and the whole sky. I realized it is using ALL the information, not throwing away any data, that leads to super resolution. And that is why I want to use ALL the data from these devices, because that "noise" is just a linear combination of signals from places where the signals can be characterized by other methods.
This is not pretty yet, nor simplified. I am trying to make some calculators and simulations to show how to trade off to get the most for the least cost. I was just going through to see what I could get from using a dozen cheap audio chips that can run at 24 bits and 96 ksps. That should be equivalent to 96 ksps * 2^14 = 1.572864 Gsps at 10 bits for a lossless process. When you have ways to gather and store at Gsps, then you can use lossless methods throughout.
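Checking that arithmetic: trading 14 of the 24 bits for rate gives exactly 2^14 extra samples per original sample, keeping 10 bits each:

```python
sps, bits_total, bits_kept = 96_000, 24, 10
extra_bits = bits_total - bits_kept   # 14 bits traded for sampling rate

# Lossless rate-for-depth trade: 96 ksps * 2^14 samples at 10 bits
equivalent_sps = sps * 2 ** extra_bits
print(f"{equivalent_sps:,} sps")  # 1,572,864,000 = 1.572864 Gsps
```

So the 1.572864 Gsps figure above is exact.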
I like checking each instrument by running it for minutes, then hours, then days, then months. You learn a lot about the noise in a device when you run it that long. At first the "noise" seems overwhelming, and you just look at minute by minute statistical summaries and counts. If I could just get a few people running identical sensors at different places, we could begin to compare signals and notes and methods.
Sorry for such a rambling note. If people ask specific questions I will try to answer. Or I will keep going on alone.
Richard Collins, The Internet Foundation
 
Richard Collins
added a project reference
Richard Collins
added a project reference
Richard Collins
added an update
VLBI and precision GPS Stations can constrain their positions and orientation with gravitational observations. Specifically with multiaxis gravimeters of high sensitivity, carefully calibrated to track the sun, moon and planets.
The tidal signal from solar system bodies cannot be damped. The dynamic variations from solar mass changes should not be attenuated by the ionosphere and troposphere. The regular sun, moon and planetary signals will not be attenuated, nor distorted. Instruments in caves and buildings will see the signal with no attenuation. The instruments are relatively low cost, and new generations of atom interferometer gravimeters are pushing out several orders of magnitude in sensitivity.
While the radio telescopes can see specific sources occasionally, every station can track the sun, moon and planets continuously, then correlate with electromagnetic observations.
What are the requirements to give position and orientation to 0.1 mm and 0.1 mas?
There should be astronomical sources, beyond GW sources, for gravitational acceleration field variations.
 
Richard Collins
added a project goal
Develop an Integrated network Of earth and space based sensors to map, in real time, the gravitational potential and acceleration field, and mass distributions in the solar system. Identify and calibrate existing networks. Adapt existing technologies. Develop new gravitational engineering technologies.
 
Richard Collins
added an update
Superconducting Gravimeters
MEMS Gravimeters
Three axis Broadband Seismometers as three axis Gravimeters
Ephemeris Tidal Gravitational Signal
Earth Moon Barycenter Corrections
Earth Tide Sensitivity
Gravimeter Arrays
Solar System Gravimetry
Gravimeter Imaging Arrays
Time of Flight Imaging
Big G Experiments
Vibration Isolation Systems for Gravitational Wave Experiments