Exploring Global Consciousness
Submitted to Explore: The Journal of Science & Healing
Roger Nelson1,# and Peter Bancel2,3
Abstract
The Global Consciousness Project (GCP) is a long-term experiment which investigates the
proposition that direct correlations of mind and matter may occur on a global scale. The Project
is motivated by numerous experiments which suggest that the behavior of random systems can
be altered by directed mental intention. Since 1998, the GCP has maintained a global network of
random number generators (RNGs), recording parallel sequences of random data at over 60 sites
around the world. In a novel experimental approach to the question of mind-matter interaction,
the GCP proposes that data from the RNG network will deviate from expectation during times of
“global events,” defined as transitory episodes of widespread mental and emotional reaction to
major world events. An on-going replication experiment tests this hypothesis by measuring
correlations across the network during the designated events. The result of over 300 formal
hypothesis tests is highly significant. A composite statistic for the replication rejects the null
hypothesis by more than 5 standard deviations. Further analysis reveals evidence of temporal and
spatial structure in the data associated with the events. Controls exclude conventional physical
explanations or experimental error as the source of the measured deviations. The results suggest
that some aspect of human consciousness may be involved as a source of the effects. The paper
presents a comprehensive review of experimental methods and results after more than 11 years
of continuous operation.
1 Global Consciousness Project, Princeton, NJ, USA
2 Global Consciousness Project, Princeton, NJ, USA
3 Institut Métapsychique International, Paris, France
# Corresponding Author. Address: rdnelson@princeton.edu
Introduction
The universe is of the nature of a thought or sensation in a universal Mind... To
put the conclusion crudely — the stuff of the world is mind-stuff. As is often the
way with crude statements, I shall have to explain that by "mind" I do not exactly
mean mind and by "stuff" I do not at all mean stuff.
-- Arthur Eddington1
Where is the mind? Is it wholly in the brain? If not, what are its extended qualities? Are there
direct effects of mind in the physical world? Is there such a thing as collective mind? Could there
be a global consciousness?
These are difficult yet deeply interesting questions. They demand not only scientific clarity, but
an inclination for adventure in uncharted intellectual waters. Since early in the 20th century,
researchers working at the edges of physics and psychology have addressed questions like these
by looking at the extraordinary capacities of human consciousness. The Global Consciousness
Project (GCP) was created to broaden these efforts. With contributions from scientists, engineers,
artists, and business people from around the world, its purpose is to study the possibility of a
subtle reach of consciousness in the physical world on a global scale.
The GCP maintains a world-spanning network of instruments designed to produce continuous
random data and asks if these data may be altered during special instances of collective human
activity. The instruments produce data every second at each of 65 locations around the globe,
creating a record of random data that can be compared with the history of major events on the
world stage. The hypothesis we test proposes that streams of data from these random sources will
display non-random behavior during times of “global events.” Specifically, we predict systematic
deviations in the data streams when there is a widespread sharing of mental and emotional
responses. An on-going experimental test of the hypothesis, using operational definitions in a
replication protocol, finds significant evidence of characteristic anomalies in the data for a wide
range of events. The results indicate that something remarkable may be happening when people
are drawn into a community of common attention or emotion. In this review paper we present the
background, methods, and findings of the decade-long experiment, and address certain
implications of the results.
The Edge of Consciousness Science
Over much of the modern scientific era, questions concerning the nature of human consciousness
have largely been ignored by mainstream science. Nevertheless, for nearly a century, a small
number of laboratory researchers have persisted in exploring questions at the margins of our
understanding, developing over the years the experimental methods needed to study potential
interactions between mind and matter.2 This area of research offers a unique window into the
nature of consciousness by proposing direct manifestations of consciousness in the physical
world. Evidence of these “impossible” phenomena gathered under controlled conditions raises
puzzling questions. How could it be possible to obtain information from distant locations with no
physical or sensory connection? What could explain correlations between physical processes and
the purely mental attention of human subjects? Can there be direct effects of intention in the
physical world? Is there a sense in which mind is present in the world beyond the brain?
Laboratory experiments which address these questions often exploit a loophole in the causal
framework of physical theory by focusing on the behavior of random systems. Although physical
theory takes causality as a guiding principle, it also admits truly random phenomena (that is,
phenomena which are in principle indeterminate, and not merely statistically uncertain). A truly
random physical process may be influenced by causes, but it is not wholly determined by them.
Quantum transitions are a familiar example of this weaker causality which is accepted in physics,
and is potentially of relevance to mind-matter research. Random phenomena are interesting for
experimentation precisely because, in our current understanding, they are not fully explained by
deterministic causes, and because research on mind-matter interactions also challenges the
completeness of conventional views of causality.
Among the early experiments which investigated the interplay of randomness and conscious
activity were studies in which subjects were asked to influence macroscopic systems.3 Since the
1960s, experiments have largely used the high-speed generation of random numbers employing
quantum electronic or radioactive sources. With the advent of the computer, automatic recording
helped to ensure experimental control. Improved experiments asked whether the random output
of quantum sources could be biased by the mental intentions of subjects.4 In the latter part of the
20th century, replications of random number generator (RNG) experiments were carried out in
laboratories around the world.5,6
One prominent research program, the Princeton Engineering Anomalies Research (PEAR)
laboratory,7 was founded by Robert Jahn in 1979 at Princeton University. In carefully controlled
RNG experiments, the PEAR lab demonstrated a small, persistent effect equivalent to a few parts
in 10,000. Compounded over the full database, the effect is highly significant and cannot be
adequately explained by chance fluctuation or methodological error.8 The research extended the
seminal early work of Schmidt4 and motivated replication experiments in several independent
laboratories. While many experimental questions about the RNG experiments remain (most
notably the role of psychological variables), the research carefully documents anomalous
departures from expectation associated with human consciousness, and specifically, with directed
intention.
Later versions of the RNG experiments used portable random sources and by the early 1990s
field work was feasible. In the field experiments, rather than instructing a participant to focus his
or her intention on a laboratory RNG, the device was brought to locations where groups of
people, blind to the experiment, were engaged in communal events and activities such as rituals,
ceremonies, meetings and musical concerts. The experiments asked whether continuously
recorded sequences of random data might show structure during periods of group interaction
which involved shared emotions or deep interest.9,10 These experiments were subsequently
replicated by other researchers.11,12 The results indicated that deviations in the random data
correlated with periods of group activity, especially when the people involved reported a sense of
coherence or resonance with the group. Tests in which data were collected in mundane or
unfocused situations conformed to expected random behavior.
The field work raised a number of issues which became the basis of the Global Consciousness
Project. Among these are questions about the consequences of running multiple devices in a
distributed network: would multiple, simultaneous data streams reveal different effects?13,14
Would the RNGs correlate with each other and would this be a function of their proximity to the
event or their mutual separation? Other questions concerned the impact of various qualities that
characterize events: their size, emotional tone, importance, human vs. natural origin, etc.
In 1997 an effort was launched to study these questions using a permanent, world-wide network
of RNGs. The result was the Global Consciousness Project, which began data collection in
August, 1998 and continues to this day.15,16 The GCP network is an instrument designed to
capture indications of mind-matter correlations manifesting on a global scale. A fanciful
conception of the network is that of an electroencephalogram or EEG for the world. In practical
terms, the project makes a conceptual leap from the single-device laboratory and field
experiments which examined intention and group consciousness, respectively, to a multi-device
experiment designed to look for related effects on a global scale.
An Experimental Hypothesis
To proceed, the proposition of global mind-matter correlations needs to be translated into an
experimental hypothesis. Since we are breaking new ground, there is little history to guide
hypothesis specification. Nevertheless, we can infer from the laboratory and field research
described above that the effect would most likely span a broad range of physical, social, and
emotive conditions and would be small compared to the intrinsic noise scale of the data. We
therefore make a general hypothesis describing a range of conditions rather than a narrow set of
parameters:
Periods of collective attention or emotion in widely distributed populations
will correlate with deviations from expectation in a global network of
physical random number generators.
The hypothesis avoids premature over-specification, but includes the main elements we wish to
test for: global correlations between collective conscious activity and the material world, as
represented by the physical RNG network. Experimentally, this general hypothesis is instantiated
in a series of specific, rigorously defined hypothesis tests, each of which is compatible with the
general statement. To use technical language, we propose a composite hypothesis which
formulates our broadest guess of how global mind-matter correlations might be defined for the
RNG network. We then proceed experimentally with a series of replications using simple
hypotheses which are fully specified and can be compared quantitatively against the null
hypothesis.
To set up a formal test, we first identify an engaging event. The criteria for event selection are
that the event provides a focus of collective attention or emotion, and that it engages people
across the world. We thus explore events of global character, but allow for variability in their
type, duration, intensity and emotional tone. Once an event is identified, a test hypothesis is
constructed by fixing the start and end times for the event and specifying a statistical analysis to
be performed on the corresponding data. These details are entered into a formal registry before
the data are extracted from the archive. The analysis for an event then proceeds according to the
registry specifications, yielding a test statistic relative to the null hypothesis. These individual
results become the series of replications that address the general hypothesis and ultimately are
combined to estimate its likelihood. To eliminate a frequent misconception, we note that we do
not look for “spikes” in the data and then try to find what caused them. Such a procedure, given
the unconstrained degrees of freedom, is obviously inappropriate.
A central experimental problem for the GCP is how best to study a conjectured global
consciousness effect in data dominated by random noise. The experiment must treat an effect
which is not only small in size, but also incompletely specified. The solution we adopt is to
implement a two-stage experimental program. First, the replication series, which we refer to as
the formal experiment, yields an aggregate score which estimates the overall significance of the
composite hypothesis against the null hypothesis. The formal experiment is ongoing and the
aggregate result can be likened to a continuing meta-analysis which updates the significance of a
measured effect size with each new event. Second, the formal experiment identifies a data set for
further analysis since it provides a level of confidence that the hypothesized effect is indeed
represented in the event data. This approach allows us to explore a range of factors in secondary
analyses without imposing constraints prematurely.
How it Works
The GCP is Internet-based and employs a network of RNG devices installed at host sites (nodes)
around the world. A central server receives data from the distant nodes via the Internet and
incorporates them into a continually growing database archive. Each local node comprises a
research quality RNG which is connected to a host computer running custom software. The
software collects one data trial each second, a trial being the sum of 200 consecutive random bits
of RNG output. The bit-sum is equivalent to tossing a fair coin 200 times and counting the heads,
yielding random values with a theoretical average of 100. The bits are generated from physical
random processes (specifically, quantum tunneling) in the RNG circuitry and are not created by a
computer algorithm. The GCP data are thus at once truly random and derived from natural
physical processes.
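As a concrete illustration, the following minimal sketch (our illustration, not GCP software; all names are ours) generates trial values as described above under the null hypothesis: sums of 200 fair bits, which follow a binomial distribution with mean 100 and variance 50.

```python
# Minimal sketch (illustrative, not GCP code): one "trial" under the null
# hypothesis is the sum of 200 fair random bits, i.e. binomial(200, 0.5),
# with theoretical mean 100 and variance 50.
import numpy as np

rng = np.random.default_rng(seed=0)

def null_trial(n_bits: int = 200) -> int:
    """Return one trial value: the count of 1s in n_bits fair bits."""
    return int(rng.integers(0, 2, size=n_bits).sum())

trials = np.array([null_trial() for _ in range(10_000)])
print(trials.mean(), trials.var())  # approximately 100 and 50
```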
The trials are time-stamped, written to the local disk and then uploaded from the local hosts to
the network server in Princeton, NJ at 5-minute intervals. Custom software on the server stores
the data in permanent archives with all data synchronized at one-second resolution. The result is
an accumulating database of continuous parallel data sequences. The synchronous data
generation means that we can treat the network as a single instrument, using statistical measures
that address the whole network rather than treating the RNGs individually.
Figure 1 shows the location of nodes in the current network, which has grown to approximately
65 nodes since the start of the Project. We rely on volunteers to host and maintain the RNG
device and software at each node. The geographical distribution of nodes is opportunistic in the
sense that we are constrained by infrastructure limitations of the Internet. While we aim for a
world-spanning network – ideally a deployment representative of world population densities –
network coverage is poor in areas where Internet access is limited. For example, we do not have
coverage in many parts of Africa and Asia.
The GCP website at http://noosphere.princeton.edu describes all aspects of the project, ranging
over its history, context, and technology. One of the important features defining the Project is
transparency, and the website is a public access repository of information, including the entire
archive of raw trial data, which is freely available for download. We maintain a complete record
of the formal hypothesis tests and preliminary results from ongoing analyses, as well as
contributions and critiques by independent, third-party investigators.
Figure 1. Location of nodes (RNGs) in the world-spanning GCP network
as of late 2009. Internet infrastructure constrains the distribution. An
interactive map and table available on the GCP website provides details
for each node.
The Formal Replication Experiment
Through January 2010, over 300 rigorously vetted, pre-specified events have been registered in
the formal replication series, including tragedies and celebrations, disasters of natural or human
origin, and planned or spontaneous gatherings involving great numbers of people. The events
generally have durations ranging from a few hours to a full day. The Project registers about 30
formal events per year, and the data taken during these events comprise less than 2% of the 11-
year, 22-billion trial database. The cumulative experimental result attains a level of 5.3σ
(standard deviations) relative to the null hypothesis. The odds of a chance deviation of this
magnitude are about 10 million to 1.
The formal result is obtained by first converting the test statistic for each event to a standard
normal Z-score. The scores are averaged and the confidence level against the null hypothesis is
given by the deviation of this average from zero. We find an average event Z-score of 0.311 ±
0.059, which yields the 5.3σ composite deviation cited above. The calculations assume that the
RNGs have stable output distributions, and this has been extensively verified across the 11-year
database. All analyses are checked for errors by running simulations on pseudo-random data sets.
The primary result has also been confirmed by bootstrap estimates using random re-sampling
from the entire database. A demonstration of the bootstrap analysis is shown in Figure 2.
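The arithmetic behind the composite statistic can be restated compactly (our sketch, assuming the standard Stouffer-type combination of independent Z-scores implied by the averaging procedure above; the function name is ours). Under the null hypothesis, the mean of N standard-normal event scores has standard error 1/sqrt(N), so the composite deviation is that mean divided by its standard error.

```python
# Sketch of the composite statistic: average the event Z-scores and express
# the average in units of its null standard error, 1/sqrt(N).
import numpy as np

def composite_deviation(event_z: np.ndarray) -> float:
    n = len(event_z)
    return float(event_z.mean() * np.sqrt(n))  # mean / (1/sqrt(n))

# With a mean event Z of about 0.311 over roughly 290 events (as implied by
# the quoted standard error of 0.059), 0.311 * sqrt(290) is about 5.3,
# the composite deviation cited above.
```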
Figure 2. Bold curve shows cumulative total deviation of results for all
formal hypothesis tests. The cloud of gray curves shows the results of
500 re-sampling “control” tests with the same specifications but
randomly offset start times. Expectation is the horizontal line at zero.
The figure compares the cumulative deviation of the actual event Z-scores with the cumulative
traces of Z-scores for identical event periods sampled at random start times. The bold line shows
the event data and the gray traces show 500 re-sampled data series. The graph plots the
chronological deviations from expectation. The endpoint of the bold trace corresponds to the
composite result for the replication. It is clear from Figure 2 that the event data have a positive
bias which is not present in the database as a whole. A full bootstrap analysis finds an empirical
deviation of 5.5σ for the real data against the re-sampling distribution, in close agreement with
the theoretical result. The re-sampling analysis provides a rigorous confirmation that the GCP
database as a whole conforms to expected null behavior, whereas the behavior at the times of
events displays a persistent deviation. It also verifies that our analytical procedures do not
introduce spurious correlations.16
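A schematic version of this re-sampling control might look like the following (our sketch; the real analysis applies each event's registered statistic, whereas here a simple standardized window mean stands in for it, and all names are illustrative).

```python
# Sketch of one re-sampled "control" replication: for each formal event,
# draw a window of the same length at a random start time from the full
# database, recompute a Z-score, then combine as for the real events.
import numpy as np

def resampled_composite(database: np.ndarray, event_lengths: list[int],
                        rng: np.random.Generator) -> float:
    z_scores = []
    for length in event_lengths:
        start = int(rng.integers(0, len(database) - length))
        window = database[start:start + length]
        # Stand-in statistic: standardized mean of the surrogate window.
        z = window.mean() / (window.std(ddof=1) / np.sqrt(length))
        z_scores.append(z)
    return float(np.mean(z_scores) * np.sqrt(len(z_scores)))

# Repeating this (e.g., 500 times) yields the cloud of gray traces in Figure 2.
```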
The experimental trace in Figure 2 reveals several other important conclusions about the event
data. First, although the trend is fairly steady, it fluctuates randomly about the average slope, as
is expected for a weak effect dominated by random noise. Second, it is evident by inspection that
the deviation is distributed smoothly over events; the cumulative rise is not dominated by a few
outlier events. Third, the average contribution of events is small. This is a crucial point, as it tells
us that a single event cannot discriminate against the null hypothesis; many events are required
in order to reliably detect and measure the effect. From the measured effect size of 0.311, an
estimated 90 events are needed to attain a significance of 3σ (p-value 0.001), which is at the
lower bound for a comfortable confirmation of the hypothesis. Even with a less demanding
criterion, it is obvious that many replications are needed for an effect to be discriminated.
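The estimate of roughly 90 events follows from back-of-envelope arithmetic on the effect size (our restatement of the figure quoted above):

```latex
Z_{\mathrm{composite}} \approx \bar{Z}\sqrt{N}
\quad\Longrightarrow\quad
N \approx \left(\frac{3}{\bar{Z}}\right)^{2}
  = \left(\frac{3}{0.311}\right)^{2} \approx 93
```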
Statistical Noise and Effect Size
Although the effect is, on average, too small to allow for the analysis of individual events, we
can still ask if there are events which might yield to individual analysis and provide us with
further insight. For instance, if the registered event times exclude some nearby periods of
substantial deviation, then including the unexamined data might augment the confidence level of
an individual event. One case that stands out in this respect is the terrorist attack of September
11, 2001.
The original analysis for the September 11 event yields an event Z-score of 1.87 (p = 0.031).17
However, the formally specified duration of four hours and 10 minutes hardly reflects the full
impact of the 9/11 attacks and the worldwide reactions to them. In a series of post hoc analyses,
data extending beyond the formal event time were examined to see if the network deviations
persisted while the world-wide reaction of shock, grief, compassion and anger unfolded. The
analyses show that the deviations did indeed persist for an extended period of over two days at
roughly the same level as measured in the formal event. However, when corrected for multiple
analysis and informed choice, the probability of the deviation measured in the post hoc test is
roughly the same as the registered event period. Thus, the extended 2-day deviation around the
September 11 event does not confirm the GCP hypothesis at a higher level of statistical
confidence. This conclusion is supported indirectly by comparing the 9/11 event with the rush-
hour bombings in Madrid (2004) and London (2005), which were similar in character. Neither the
Madrid nor the London event showed significant deviations in similar analyses.
We conclude from these post hoc assessments that the significance of single event data, even
when data-mined outside the times of the formal specification, remains ambiguous or marginal.
This is an unavoidable consequence of the small effect size and reinforces our conclusion from
the replication experiment that many events are needed to confirm the general hypothesis, even
for events which involve the largest numbers of people or have the greatest emotive impact. Only
in combinations of many separate tests do the effects achieve clear statistical significance.
This situation is, of course, common across scientific disciplines. Psychological and clinical
studies often are designed to assimilate thousands of trials over years of study. An example from
the physical sciences describes similar conditions for data collection at the international Large
Hadron Collider (LHC) experiment. The consortium released the following statement in 2008:
“... the LHC is due to begin testing and collecting data this September. It will be six to seven
years, however, before any results can be analyzed. The reason for this is that the particle events
under investigation will only occur in a minority of the interactions and, even then, their
presence will be masked by 'noise' generated from other interactions. The events are also very
short-lived, lasting only for fractions of a second. As a result, data of sufficient statistical power
will take the best part of a decade to collect.”18
Defining Global Consciousness
It is essential to clarify what we mean by “global consciousness” because the term evokes many
ideas that differ from our intended usage. Because our approach to the GCP hypothesis is strictly
empirical, we adopt an operational definition, stating clearly what we do in the experiment,
thereby defining pragmatically the object of investigation. That is, we treat global consciousness
as a set of operations, rather than as an intellectual construct. We want to study X and we do so
by performing operations Y and Z. This yields a precise definition of global consciousness for
the purposes of this experiment. Thus, when we say “operational global consciousness” (OGC)
we refer to the operations constituting the formal replication series that is used to evaluate the
general hypothesis. In other words, we implement hypothesis tests by giving prescriptions for
measuring correlations between data deviations and world events. The degree to which the
hypothesis is valid is measured by performing the replication experiment. Although we denote
the correlation as global consciousness, we do not propose an underlying mechanism for it, nor
do we attribute it to a conscious agency.
The operational definition of global consciousness has a number of advantages. First, it avoids
confusing our experimental proposal with a theoretical conjecture. The GCP hypothesis is not
intended to describe a theoretical position, but is an experimental question motivated by prior
research findings.* Second, it allows us to specify a confidence level for experimentally
established deviations prior to further analysis. Finally, the replication series at the core of our
operational definition is well-suited to an effect with low signal-to-noise ratio. As we have
shown, single events are not amenable to analysis because the small effect size limits statistical
power. From the point of view of our operational definition, single events are not taken as
instances of OGC since they do not confirm the general hypothesis at a high confidence level.
A Research Program
The scientific mind does not so much provide the right answers as ask the right
questions.
-- Claude Levi-Strauss19
The formal experiment can be summarized as follows. The general hypothesis is addressed
experimentally by replicating explicit tests of events. The outcome of each test is expressed as a
Z-score which represents its deviation from the null hypothesis. Confirmation of the general
hypothesis is tested by comparing the average value of the event Z-scores to zero, the null
expectation. The average Z over nearly 300 registered events is 0.311 ± 0.059, a deviation of more
than 5 standard deviations, equivalent to roughly one chance in 10 million. Extensive analyses
confirm that this value is not skewed by outliers and is a reliable estimate of the effect size. We
conclude that the event experiment successfully measures global consciousness in the
operational sense discussed earlier, and that the general hypothesis is confirmed to a high level of
confidence.
The formal experiment is part of a broader experimental strategy in which models are proposed
and tested to gain insight about the nature of the effect. The first step is to characterize the
structural details of the event data. To achieve this, the formal event Z-scores need to be
expressed directly in terms of the more fundamental RNG trials. Whereas the event Z-scores
concisely summarize the formal result, the RNG trials index a complete description of the
experiment: trial values with their time-stamps, the geographical position of the RNGs, and the
event labels. A trial-level description thus permits analysis of any aspect of the experiment. In
addition, models which advance explanations can be tested against any trial-level structure
shown to be present in the data. The goal of the data characterization is thus to determine an
accurate statistical description of the event data in a form appropriate for model testing. We show
in the following section how this is achieved with a single trial-level statistic.

* Our approach differs from research which employs a fixed significance criterion to test hypotheses, such as the P-value < 0.05 often used in the social sciences. Since there is no explanatory theory or precedent for setting expectations, we simply calculate a level of confidence that the replication shows non-chance variation. In terms of our operational approach, the confidence level of OGC for the event data is 99.99999% against the null hypothesis.
Models may attempt to expand on the general hypothesis or propose alternate explanations for
the formal result. For example, an obvious approach suggested by the general hypothesis is a
field-type model in which RNG behavior is predicted by the value of a field present at the RNG
locale. The field might depend on the character and distribution of mental and emotional activity
in the world population, or other appropriate variables. A proposal of this type predicts that
temporal and spatial field variations will result in corresponding structure in the data. Empirical
evidence of such data structure would thus be supportive of these models. We will return to the
topic of modeling later in the paper.
Conventional explanations can also be proposed for the anomalous findings. One might suppose
that the result is due to experimental flaws such as the inadequate shielding of RNGs from
background electromagnetic fields or bias due to methodological errors. The GCP design
addresses these eventualities by physically shielding the RNGs from EM fields and by logical
operations in software which cancel output bias arising from environmental influences. The
replication protocol ensures that data remain archived until an event is fully specified so that
methodological “leaks” leading to biased data selection are precluded. In general, a spurious
result will only obtain if the network produces systematic deviations of precisely the kind we
measure, or if the deviations are introduced by a flawed analysis procedure.
An important design feature of the GCP is that data are generated continuously, so data that do
not correspond to events are available for baseline comparisons. This de facto control database
will necessarily contain any systematic non-ideal behavior also present in the event data. Since it
exceeds the size of the event database by nearly two orders of magnitude, the off-event control
data allow us to check for spurious effects with high precision. The re-sampling analysis shown
earlier in Figure 2 is an example of a control which uses off-event data.
Data Characterization
In this section we briefly describe the trial-level statistic which will be the basis for analyses and
model testing. Details are presented in a previous publication.16 There we show that analytical
expressions of the formal result can be reduced to synchronized correlations between the RNG
trials. The correlation elements are expressed as the products of pairs of trial values, C1 = z_i z_j,
where z_i is the (normalized) trial value of the i-th RNG for one second. The elements of C1 include
all possible combinations of RNG pairs, subject to the restriction that the pair-products have
identical time-stamps. It can be shown that the average value of C1 is proportional to the average
linear (Pearson) correlation between RNGs.
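As an illustration, the pair-product elements for one synchronized second can be computed as in the sketch below (ours; function and variable names are illustrative). Each trial is first normalized to zero mean and unit variance under the binomial null, z = (trial - 100)/sqrt(50).

```python
# Sketch of the C1 correlation elements for a single synchronized second.
# `trials` holds one trial value per reporting RNG at that second.
import numpy as np

def c1_products(trials: np.ndarray) -> np.ndarray:
    """All pair products z_i * z_j (i < j) from normalized trial values."""
    z = (trials - 100.0) / np.sqrt(50.0)   # normalize the 200-bit trial sums
    i, j = np.triu_indices(len(z), k=1)    # all RNG pairs sharing a time-stamp
    return z[i] * z[j]

# The event-level statistic is the average of these products over every
# second of an event; its expectation under the null hypothesis is zero.
```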
Under the null hypothesis, the expected average value of C1 is zero and, in this reformulation, a
deviation in the mean value of C1 corresponds to the non-zero average of the event Z-scores. The
pair-product formulation yields a slightly reduced significance (4.9σ versus 5.3σ for the formal
event tests) due to different weighting procedures in the two formulations, but this difference is
not statistically meaningful.

† For these secondary analyses, the full database of 300+ formal events is reduced to 280 by excluding events longer than 24 hours and some that are incompatible with the C1 analysis.
The event and trial-level formulations lend themselves to different interpretations. The event
formulation tells us that the formal predictions are successful in identifying OGC. The pair-
product formulation provides more information. It yields evidence that OGC is associated with a
precise trial-level statistic, namely the synchronized correlations of RNGs in the network. Both
formulations confirm the general hypothesis at a high confidence level, but the formulation in
terms of C1 provides physical insight into how OGC arises during events.
While the bi-linear form of C1 may be familiar to some readers, it is perhaps useful to provide
an intuitive picture of the synchronized correlations it represents. Imagine that the network of
RNGs is replaced by buoys tethered at scattered locations across the ocean, and that the data
acquisition consists of monitoring the height of each buoy, at each second, as it bobs up and
down with the waves. The null hypothesis for C1 describes buoys which bob randomly, without
apparent correlation in their instantaneous heights. A significant positive value of C1 describes a
situation in which the buoys – or at least a substantial number of them – bob up and down in
unison. This corresponds to a measurement of OGC. It represents an unusual occurrence in the
context of this image because we do not expect the detailed motion of buoys (or waves) at distant
ocean locations to be correlated.
There is no reason, a priori, to assume that the formal experiment, or equivalently, the statistic
C1, captures all anomalous deviations present in the event data. While there is in principle an
uncountable number of statistics we could investigate, the simple expression for C1 suggests a
few forms to test.
First, and most obvious, is the value of individual trials, z_i, or more generally, the single trial
moments of the form z_i^n which, taken together, represent the full statistical distribution of
individual trials. We find that the single trial statistics conform to null behavior. This is an
important result since it says that, within the accuracy of the experiment, direct perturbations of
the individual trial scores are too small to measure. The formal experiment provides evidence of
significant correlations among RNGs, but we do not see evidence of anomalous deviations in the
trial values themselves.
Second, the C1 statistic suggests a class of correlation products, z_i^n z_j^m. A straightforward (albeit
tedious) algebraic analysis shows that, for integer (m,n), only the case z_i^2 z_j^2 is independent of C1.
We refer to this correlation statistic as C2. This statistic is particularly interesting because it has
exactly the same structural form as C1, but represents a unique, orthogonal correlation
“channel”. The identification of C2 comes solely from analytical considerations, and it is not
measured by the formal replication. As with C1, the average value of C2 is zero under the null
hypothesis, and a positive value indicates the presence of correlations. A calculation of C2 yields
an effect size of C2 = (3.8 ± 1.8) × 10^-5. Interestingly, this is statistically indistinguishable from the C1 effect size. Details for the C2 calculation are shown in the last row of Table 1. Re-sampling analyses on the entire database empirically confirm to high precision that C2 conforms to null expectation for off-event data, and that C1 and C2 are uncorrelated.

‡ Strictly speaking, the mutual correlation of C1 and C2 is identically zero under the null hypothesis. For convenience in calculations we employ a modified form which uses the pair-products of zero-mean quantities: C2 = (z_i^2 - 1)(z_j^2 - 1). Integer powers of C2 are also uncorrelated with C1, but not with C2 itself, so we need only examine the lowest order, C2. Details will be presented in a forthcoming publication.
Statistic          N                  Mean value            Std. error     Deviation    P-value
Event Z-scores     280 events         Zevent = 0.31         0.059          5.29σ        0.6×10^-7
C1                 1.3×10^10 trials   C1 = 4.25×10^-5       0.88×10^-5     4.85σ        6.2×10^-7
C2                 1.3×10^10 trials   C2 = 3.79×10^-5       1.74×10^-5     2.18σ        0.015

Table 1. Comparison of Z-score and correlation formulations. Row one gives the statistics
for the event-based analysis. Rows 2 and 3 give trial-based statistics for the C1 and C2
correlations, respectively (see text). Columns show the number of events or trials, the
mean, standard error, total deviation, and probability against chance.
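For comparison with the sketch given earlier for C1, the parallel C2 channel can be written the same way, using the zero-mean squared values described in the footnote above (again an illustrative sketch with our own names, not the Project's code).

```python
# Sketch of the C2 correlation elements for one synchronized second, built
# from zero-mean squared trial values (z^2 - 1) rather than z itself.
import numpy as np

def c2_products(trials: np.ndarray) -> np.ndarray:
    """Pair products (z_i^2 - 1) * (z_j^2 - 1) for all RNG pairs."""
    z = (trials - 100.0) / np.sqrt(50.0)
    w = z**2 - 1.0                      # zero-mean under the null hypothesis
    i, j = np.triu_indices(len(w), k=1)
    return w[i] * w[j]
```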
In the image of ocean buoys, C1 corresponds to a correlation of the changing heights of distant
buoys. A non-zero value of C2 corresponds to a discovery that the buoys are also correlated as
they tilt from side to side.
Our characterization analysis thus finds that the RNG network exhibits two orthogonal trial-level
correlation channels. The C1 statistic underlies the formal result, while C2 is revealed by analysis
to be a unique, alternate correlation channel, not measured by the formal experiment. The finding
that the two effect sizes are of the same magnitude is important for interpretations of the
experiment. It suggests that global consciousness, when defined in terms of pair correlations, is a
more general effect than is indicated by the formal experiment alone.
Figure 3 presents the two correlation statistics in the same cumulative deviation format used for
the formal event Z-scores. They both show approximately the same slope, corresponding to their
roughly equal effect sizes. Note also that C2 has a greater intrinsic variance than C1. This is
visible in Figure 3 and shown in the Error column of Table 1. The C2 variance leads to a smaller
significance level for the C2 statistic, relative to mean expectation.
Figure 3. Chronological cumulative deviation from expectation of two
measures of correlation in GCP event data. Black (lower) curve is C1.
Gray (upper) curve is C2. Statistics related to C1 and C2 have been
referred to in previous publications as the netvar and covar,
respectively.
Spatial and Temporal Structure
So far, we have shown that operationally defined global consciousness, OGC, corresponds to
correlations in the RNG network, and that independent correlations also appear in a parallel
channel. We would like to know if the event data contain further structure and if the structure
might relate C1 and C2 more directly. Two important questions to consider are whether the
correlations depend on the location of RNGs, and whether the correlation strength evolves in
time as an event unfolds. The trial-level description provides a basis for spatial and temporal
analyses since the correlation statistics contain the RNG locales and trial times as parameters.
The GCP hypothesis anticipates structure of this kind because it posits an effect that is both
dependent on the timing of events and geographically diffuse. Our general analytical strategy is
to approach issues like these with as little theoretical overlay as possible. In this section we
describe tests of structure in C1 and C2 which are based on minimal, physically intuitive
assumptions about the effect. The tests yield positive evidence for spatial and temporal structure
in the event data and illustrate the utility of our two-stage research strategy.
An immediate challenge is the choice of an appropriate measure for the tests. In the case of
spatial structure, even events with a definite location, such as earthquakes or catastrophic
accidents, lack a ready parametric description of the distribution of global reactions. Consider the
terrorist attacks of September 11, 2001. Although the attacks occurred at three precise locations
in the eastern United States, the response to the news of the event was widespread and complex.
Moreover, it is not clear what aspects of the reactions pertain to OGC and how these might
impact different regions of the network. Similarly, while the GCP hypothesis tacitly implies that
effects will correspond to the event timing, it does not provide a metric for actual durations.
Despite these difficulties, both spatial and temporal structure are in principle detectable. Arguing
from minimal assumptions based on the GCP hypothesis we can conclude that a characteristic of
structure in the data correlations will be its smooth variation, both in time and across the
network. These smooth, large-scale heterogeneities in the data are detectable signatures of OGC
because they are not characteristic of excursions which occur purely by chance.
In the case of spatial structure, a test can be devised from a linear regression of the correlation
strength against distance. The test avoids the issue of defining an event’s location by considering
only the distance between RNGs. A general observation from the physics of spatially distributed
complex systems is that correlations among interacting constituents tend to weaken as their
separation grows. Thus a prediction based on physical intuition suggests that the correlation
strength will decrease as a function of RNG pair separation. A test of this conjecture is
constructed as follows.
The geometrical separations of the RNG pairs are calculated for each of the ~10^10 elements of C1
in the event data. The elements are sorted by distance into bins (in the presentation below, the bin
widths are 250 km). The average values of the correlation strengths are calculated for each bin,
and a regression of correlation against distance is performed. A non-zero regression slope
provides evidence of smoothly varying spatial structure, and the expectation is that the slope will
be negative. The broad deployment of the GCP network allows us to perform the test over
distances which range from a few meters out to the earth’s diameter.
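The binned regression just described can be sketched as follows (our illustration; array names and the helper function are ours, and the real analysis includes permutation and re-sampling controls not shown here).

```python
# Sketch of the distance-structure test: average pair correlations in 250-km
# bins of RNG separation, then fit a straight line to correlation vs distance.
import numpy as np

def distance_regression(separations_km: np.ndarray, pair_products: np.ndarray,
                        bin_km: float = 250.0) -> tuple[float, float]:
    """Return (slope, intercept) of binned mean correlation against distance."""
    bins = (separations_km // bin_km).astype(int)
    centers, means = [], []
    for b in np.unique(bins):
        in_bin = bins == b
        centers.append((b + 0.5) * bin_km)          # bin midpoint in km
        means.append(pair_products[in_bin].mean())  # mean correlation in bin
    slope, intercept = np.polyfit(centers, means, deg=1)
    return float(slope), float(intercept)           # negative slope expected
```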
We find that a linear regression of C1 versus pair separation yields a negative slope, consistent
with the prediction. The Z-score of the slope parameter relative to the null hypothesis is ~1.9, for
a P-value of 0.015. Here, the null refers to a situation in which anomalous correlations are
distributed homogeneously throughout the network. This would result in a complete lack of
distance structure and a zero slope parameter in the regression analysis. The negative slope of the
regression fit gives a zero intercept at a distance of roughly the earth’s diameter, indicating that
the correlations decrease gradually with pair separation, and that RNG pairs correlate over
thousands of kilometers. Permutation and re-sampling analyses show that the distance structure
does not occur for off-event data and that it is not due to an accident of the network geography.
The correlation C2 permits a similar, independent check of the distance structure. We find that a
regression on C2 also yields a negative slope, with Z ~ 1.85 and a comparable range of decrease.
The C2 regression lends additional support to the evidence for distance dependent correlations. It
also suggests that C1 and C2 exhibit not only correlations of comparable strength, but that they
share details of spatial structure. A joint regression of C1 and C2, shown in Figure 4, results in a
slope parameter with Z ~ 2.8 (P-value 0.003).
Figure 4. Regression of correlation strength on distance for a
composite of the two measures, C1 and C2. The huge numbers of
correlations are averaged in 250-km bins. The heavy black line is the
fitted linear regression. A test of the negative slope parameter against
the null yields a Z-score of 2.8.
The regressions give empirical evidence for spatial structure and indicate that models will need
to incorporate distance-dependent correlations in order to adequately describe the event data. The
form of the dependence (linear, exponential, etc.), and whether the dependence applies to OGC
uniformly or only for certain kinds of events, are issues that remain to be resolved. These are
challenging questions for analysis, as the weak effect size evident in the scatter in the plot of
Figure 4 attests. However, simulations of a numerical model demonstrate that a linear
dependence does provide a good initial representation of the data. Specifically, we model the
distance dependence by a pure linear decrease which declines to zero at the earth’s diameter. The
simulation takes the measured total OGC correlation as input and returns the distribution of
expected model slopes. The left-hand plot in Figure 5 shows the model, which exhibits a
distribution that is well distinguished from a null model with distance-independent correlations
(right). The vertical bar in the figure shows the slope value for the actual event data, which
agrees with the linear slope model, but is incompatible with a null model that excludes distance
dependence.
Figure 5. Model of the combined regression of C1 and C2. The left-hand
curve shows the distribution of regression slopes for a decline in
correlation strength which is linear in the RNG pair distance (H1). The
vertical line indicates the actual regression slope for the event data. The
right-hand curve plots the distribution of slopes for the null hypothesis
(H0) of OGC correlations without distance structure.
For the analysis of temporal structure, we propose that the OGC correlations correspond to the
human response to events, which first grows as an event becomes the focus of global attention,
then persists for a time as people attend to the focus, and finally dissipates as attention wanes.
The actual event data are likely to incorporate sections of null data before or after the
correlations because the formally specified periods make generous estimates of the event
durations in order to maximize the likelihood that the full response is included in an event. The
expected temporal characteristic of event data will thus be substantial periods of correlation
during the actual effect, bracketed by extended null sections (see Figure 6). The
straightforward, yet non-trivial assumption we make is that temporal variations of global
attention will correspond to time variations of OGC deviations.
The time structure can be tested by examining changes in the variance of the data. The test is
constructed by concatenating all events into a single data vector which is then divided into time
blocks of equal length. The variance of each block is calculated, and these are averaged to yield a
“block variance” for the data set. Repeating this calculation for a range of block sizes gives the
variance as a function of block length.
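A compact version of the block-variance calculation reads as follows (our sketch; names are illustrative, and the input would be the concatenated per-second correlation data for all events).

```python
# Sketch of the block-variance test: cut the concatenated event data into
# non-overlapping blocks of equal length, compute each block's variance, and
# average; repeat over a range of block lengths.
import numpy as np

def block_variance(data: np.ndarray, block_len: int) -> float:
    """Average sample variance over consecutive non-overlapping blocks."""
    n_blocks = len(data) // block_len
    blocks = data[: n_blocks * block_len].reshape(n_blocks, block_len)
    return float(blocks.var(axis=1, ddof=1).mean())

def block_variance_curve(data: np.ndarray, block_lengths: list[int]) -> list[float]:
    """Block variance as a function of block length."""
    return [block_variance(data, L) for L in block_lengths]
```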
Figure 6. An exaggerated schematic of possible time structure. The
intrinsic variance is constant throughout. An “effect” is shown extending
from roughly 3 to 6 on the schematic time scale.
The temporal feature shown in Figure 6 leads to an increase in the block variance, and this can
be employed in a statistical test for time structure. To see this, consider the total variation within
a single data block. In the absence of an effect, the block’s variance is determined by the intrinsic
fluctuations of the random data alone. (For example, see the right half of the plot in Figure 6.) A
similar result will obtain if a deviation is present in the data, but evenly (homogeneously)
distributed throughout. However, for deviations which alternate with null periods, data blocks
can straddle the cross-over region between deviating and null data sections. For these blocks, the
change in deviation within the block will make an extra contribution to the block variance. When
the block length is small, the excess contribution to the variance is small. As the block size
approaches the length of the deviations, the excess contribution is more substantial and the block
variance increases. The expected behavior in the presence of the proposed time structure is thus a
block variance which rises smoothly to a maximum value and then levels off. The block length at
which the variance attains its maximum value gives an indication of the time-scale of the effect.
Figure 7 plots the excess block variance of C1 versus block length. The experimental trace (in
bold) exhibits a gradual rise to a maximum at about four hours, consistent with the expected
behavior in the presence of OGC time structure. This suggests that correlations typically persist
on this time-scale. The robustness of the calculation has been extensively checked with pseudo-
random and re-sampled data. However, the confidence level is modest due to the low power of
the test, which is limited by the intrinsic variance of C1. This can be seen by comparing the
block variance calculation with a specific simulation model. Applying again the approach used
for the analysis of distance, the total correlation of the event data is taken as input to a model.
The predicted time structure is simulated by distributing the correlation into time blocks
interspersed with null data periods. To keep the model simple and definite, the correlations are
distributed into 3-hour periods. The block variance is computed for 1000 simulations, yielding
the average variance for the model (gray curve, Figure 7). The low power of the test is evident
from the size of the error bars in the plot. The simulation lies within two experimental standard
deviations of zero, indicating that the power is limited by the amount of event data available for
analysis.
Figure 7. The heavy, black trace shows the excess block variance of C1
as a function of block size. Error bars show the 1σ uncertainties. The
gray trace is a simulation of the same function, with the correlation data
concentrated in randomly placed 3-hr blocks, interspersed with null data.
The horizontal line at zero is the expectation for null time structure.
The block variance of the event data agrees with the model to within one standard deviation and
yields a time-scale of several hours. These results are consistent with our intuitive expectations
for OGC time structure, but this preliminary conclusion will need additional confirmation from
refined or independent analyses. As in the case of distance structure, the temporal test can be
applied independently to the C2 statistic. However, a simulation for C2 shows that the greater
intrinsic variance of this second correlation completely dominates the test, rendering an
assessment of time structure in the C2 statistic untenable. We are working to develop more
powerful tests to overcome these limitations.
Models and Theory
We have demonstrated the existence of unexpected correlations and structure in the event data,
and these results can serve as input for theoretical models of the deviations. To the extent that
models are successful, they will not only describe the empirical findings, but will also refine our
understanding of the structure and lead to testable predictions. Ultimately we seek a theory that
provides a bridge from the empirical findings to a deeper understanding of the role mind or
consciousness plays in the material world.
In keeping with our empirical and operational approach, we consider a variety of explanatory
directions. For example, models might attribute the measured effects to:
1. Methodological errors or leaks which bias the formal replications
2. Conventional perturbations of RNG output due to ambient electromagnetic (EM) fields
3. A fortuitous selection of events and parameters through experimenter intuition
4. Retroactive information from future results determining the character of present data
5. A field represented as a linear superposition of individual human minds
6. An emergent field arising from a dynamical interaction among minds
The list is not exhaustive, but it spans a range of ideas from conventional approaches to
speculations which explicitly attempt to include consciousness. These represent classes of
models to investigate as we move forward. Let us consider each of the proposals briefly, asking
how well they might address the anomalous data correlations.
Explanations of the formal experiment based on spurious effects can be rejected for the reasons
detailed in the discussion of the GCP research program. Methodological leaks and systematic
biases are precluded, respectively, by the event specification procedure, which effectively blinds
the analysis, and the re-sampling controls which find no evidence of biases in the off-event data.
Proposals based on electromagnetic perturbations are among the most frequently advanced
conventional explanations of the GCP results. However, such proposals can be challenged on a
number of points. Design features of the RNGs and the network protect the data generation from
biases, as previously described. Even if these protections should fail, it is unlikely that local EM
fields could give rise to distant correlations among the RNGs. Lastly, direct analysis shows no
evidence of diurnal variation in the RNG outputs, whereas ambient electromagnetic fields arising
from the daily cycle of human activity would presumably induce a corresponding variation in the
data. It should be emphasized that, while we do not see current proposals based on EM fields as
viable explanations for the measured global correlations and data structure, it would be
premature to exclude entirely the possibility of subtle EM effects.
The third and fourth proposals, intuitive selection and retroactive information, are variants of a
theoretical position from parapsychology which has been advanced to explain psi functioning.20-22
The general idea is that expectations and attitudes about the experiment play a role in
determining the outcome. In the data selection case, the key notion is that deviations result from
a fortuitous choice of timing rather than an actual change in the data. The measured anomalies
are attributed to the selection of data excursions in a naturally varying sequence. The fortuitous
selection is assumed to derive from the experimenter's intuition, which informs the choice of
events, their timing and the test procedures.23 The C1 data deviations have been analytically
tested against an explicit version of this model.24 The tests nominally reject the proposal, but at present they are not powerful enough to support definitive conclusions. Nevertheless, this preliminary rejection is reinforced by the model's failure to accommodate the spatial and temporal structure found in the data.
The retroactive information idea in proposal four is based on time symmetry arguments.22 It
proposes that experimental outcomes are linked to the future in a manner that is analogous to the
apparently causal past. It implicates consciousness directly by claiming that unexpected data
correlations can be explained as a desired future actualizing in the present. Retrocausal models
are not developed to the point where they can be tested quantitatively against the GCP data and,
like the selection proposals, they cannot easily explain the varieties of structure seen in the event
data.
The last two options propose distinct field-type models associated with human consciousness.
Proposal five bears a similarity to models based on conventional fields in that it posits a field
generated by a distribution of sources. The connection to consciousness is made by associating
the field sources with conscious humans, while the field dynamics, which explain the RNG
correlations, derive from the coherence of human activity during events. The proposal can
accommodate the inter-node correlations and structure seen in the data, but it remains
phenomenological since it does not explain how the field arises in terms of underlying principles.
The final proposal suggests that individual minds are mutually interactive. In this view,
interactions among the minds of individuals are responsible for an emergent field or property
which depends on individual consciousness but is not wholly reducible to it. The proposal
suggests that the dynamic and interactive qualities of consciousness also involve subtle
interactions with the physical world and that these interactions are responsible for certain
anomalous phenomena, such as are found in the GCP event experiment. The proposal can be
construed as embodying in a formal way the ideas of such thinkers as Teilhard de Chardin25 or
Arthur Eddington.1 While it represents possibilities that are likely beyond the reach of our
current scientific tools, continued analysis of the GCP data will help us to determine whether we
need to look to proposals of this type for an adequate explanatory theory.
Discussion
Our overview of modeling demonstrates that proposals need to be examined for consistency with
the data structure. It is clear that current proposals either fail to explain the experiment or need
further development to produce strong tests. At the same time, the diversity of approaches
highlights the value of our empirical stance. Typically, theory and experiment work together to
guide and advance research. However, the interplay between theory and experiment breaks down
when experimental hypotheses lack a well-developed theoretical basis. This is evidently the case
for the GCP event experiment, despite its robust 5σ result. From this point of view, OGC is an
extreme example of a scientific anomaly in that it calls for both physical and psychological
explanations, without providing a clear theoretical link to either one.26 Of course, anomalies are
not off-limits to scientific study, but they require a period of empirical effort before theoretical
tools can be brought to bear on the problem.
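For a sense of the scale behind the 5σ figure cited above, suppose the composite is formed as a Stouffer-type combination of per-event z-scores (an illustrative form; the formal definition is given in the analysis references):

\[ Z_{\text{composite}} = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} z_i . \]

With an illustrative N = 300 formal events, a composite near 5 implies a mean per-event z of roughly 5/√300 ≈ 0.29, that is, a small but consistent shift across many replications rather than a handful of dramatic outliers.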
We have followed this strategy by adhering to an operational definition of global consciousness.
The search for data structure has produced key results that will help to determine which classes
of model are more likely to be viable. For example, preliminary assessments indicate that a
phenomenological field model can in principle accommodate all the structure we measure (C1, C2, and the time and distance parameters), while models based on selection or on EM interactions face serious challenges.
More fundamentally, the empirical results lay the groundwork for a progressive investigation of
the hypothesis and of OGC, which we summarize in the three questions below. We have partial
answers to two of these questions, and future research will test and elaborate our provisional
conclusions. Two distinctions frame the discussion. The first addresses whether the measured
deviations are natural fluctuations or are due to changes in the behavior of the RNG network. We
name an effect physical in the latter case. Second, we ask if an explanation of the effect requires
new theoretical principles. We refer to an effect of this kind as anomalous. With these
distinctions in mind, we can state three questions which will guide our thinking:
1. Is the effect physical?
Our provisional answer is yes. An effect that does not alter the RNG behavior must result from
the fortuitous selection of naturally occurring data segments. But we have argued that models
based on selection bias, whether from intuition or methodological flaws, are unlikely. A caveat is
that our arguments, which rely in part on the evidence for data structure, are limited by the power
of the statistical tests we employ. We are devising more powerful tests to address this limitation.
It should be noted that intuitive selection can account for any structure in the data when models
are open-ended, or not fully specified. However, the models then risk becoming teleological
propositions about the data, without predictive status, and are problematic for their lack of
closure and parsimony.
In addition, a physical basis for the effect is indirectly supported by the character of the data
structure. The tests of temporal and spatial structure, as well as the C2 correlations, derive from
simple, straightforward physical and analytical considerations, and are not the result of an
unconstrained search for statistical excursions.
2. Is the effect anomalous?
Our provisional answer is again, yes. Models based on conventional physical causes such as EM
fields must explain how the RNG shielding can be circumvented and why effects are not seen in
off-event data, where the quantity of data augments the sensitivity of tests by an order of
magnitude. A model might propose that ambient fields increase during events due to exceptional
telecommunications activity, for example, but the OGC correlations are synchronized over
thousands of kilometers (the mean RNG pair separation of the network is ~ 6500 km). Surges in
ambient field amplitude may cover large regions, but such fields will not be coherent. Surge
fields during events would thus generate unsynchronized data correlations, contrary to what is
measured for OGC. The synchronization of correlations is both a strong argument against
conventional proposals and a challenge for any detailed model of an anomalous effect.
Accordingly, we continue to refine the data characterization, and particularly the timing of
correlations, since this factor will play a key role in model building.
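The synchronization argument can be made concrete with a simple pairwise measure. In the sketch below, simulated stand-in data and a mean pairwise product of same-second node z-scores (chosen for illustration, not necessarily the Project's exact correlation statistic) show how network-wide synchrony is quantified; a genuinely common, synchronized influence shifts the statistic positive, while independent local disturbances leave it at chance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-second z-scores for 60 nodes during a 3-hour event window.
n_seconds, n_nodes = 10800, 60
z = rng.standard_normal((n_seconds, n_nodes))

# Mean pairwise product of same-second z-scores across all node pairs.
pair_sum = (z.sum(axis=1) ** 2 - (z ** 2).sum(axis=1)) / 2.0
n_pairs = n_nodes * (n_nodes - 1) / 2
mean_pair_product = pair_sum.mean() / n_pairs

# Approximate standard error under the null (independent standard-normal z's).
se = 1.0 / np.sqrt(n_seconds * n_pairs)
print("mean pairwise product:", mean_pair_product, "z ≈", mean_pair_product / se)
```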
3. What characterizes an event?
With the data characterization in hand, an important next step is to undertake a similar analysis
for the events. The goal is to define more precisely the criterion of “collective attention or
emotion,” and thereby provide a basis for distinguishing event characteristics that underlie the
effect. As with the structure analyses, the approach is empirical and begins with general
considerations. For example, the events can be classified into different psychological and
sociological categories, and the categories’ relative importance for OGC can be tested. One early
study has shown distinctions among event Z-scores when the events are sorted by emotional
type.27 Analyses like this can now be augmented to include tests for data structure. An important
question of immediate interest is whether different types of events have discernibly different
signatures in the data.
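A category comparison of the kind described above can be sketched with a permutation test on per-event z-scores; the category labels, counts, and values below are hypothetical placeholders, and the simple difference-of-means test stands in for whatever formal test is ultimately adopted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-event z-scores sorted into two emotional categories.
z_cat_a = rng.normal(0.35, 1.0, size=160)   # e.g. "compassion/empathy" events
z_cat_b = rng.normal(0.10, 1.0, size=140)   # e.g. "fear/anger" events

# Permutation test on the difference of category means.
observed = z_cat_a.mean() - z_cat_b.mean()
pooled = np.concatenate([z_cat_a, z_cat_b])
n_perm, count = 10000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    diff = perm[:z_cat_a.size].mean() - perm[z_cat_a.size:].mean()
    count += abs(diff) >= abs(observed)
print("observed difference:", round(observed, 3), "two-sided p ≈", count / n_perm)
```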
Conclusions
The thing that doesn’t fit is the thing that is most interesting.
-- Richard Feynman28
The GCP is a long-term experiment that asks fundamental questions about human consciousness.
Our review describes evidence for synchronized effects of collective attention on a world-
spanning network of physical devices. Careful analysis reveals multiple indicators of anomalous
data structure which are correlated specifically with moments defined as important to humans.
The findings suggest that some aspect of consciousness may be a source of anomalous effects in
the material world. This is a provocative notion, but it is arguably the best of several alternative
explanatory directions.
Although we are still in the early stages of the full research program, substantial progress has
been made in understanding the GCP replication experiment. The analysis of data structure
allows us to begin discriminating between theoretical approaches, and it provides tools for the
essential job of refining our general hypothesis. To this end, our next efforts will emphasize the
human and participatory aspects of OGC events.
We have argued that the GCP experiment is not easily explained by conventional or spurious
sources and provisionally conclude that OGC is correlated with qualities or states of collective
consciousness activity. While social and psychological variables are challenging to characterize,
an obvious suggestion is to look for changes in the level of “coherence” among the people
engaged by the events. Defining this construct and developing it empirically will be important
for further progress.
In sum, the evidence suggests an interdependence of consciousness and the environment, but the
mechanisms for this remain obscure. Substantial work remains before we can usefully describe
how consciousness relates to the experimental RNG results beyond the empirical correlations.
These findings do not fit into our current scientific understanding of the world, but facts at the
edges of our understanding can be expected to direct us toward fundamental questions.
It is important to consider different theoretical scenarios. Quantum entanglement, retrocausation,
and other ideas have been discussed in this context, but these notions from physics have only
tenuous connections to the GCP experiment, and it is currently hard to see an entry point to any
physical model. Here, the Project’s research provides much needed input by establishing
parameters that may help discriminate models. For example, quantitative modeling can ask
whether a linear composition of sources incorporating the known parameters can produce the
field-like data structure, or whether a more complex model is needed.
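As a minimal example of the kind of quantitative question this raises, the toy model below adds a small common component to independent node noise and recovers the uniform pairwise correlation it induces; the coupling strength and all other numbers are arbitrary illustrative choices, not fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear-superposition model: each node's per-second z-score is independent
# noise plus a small component common to the whole network during an event.
n_seconds, n_nodes, coupling = 10800, 60, 0.05
common = rng.standard_normal(n_seconds)            # shared "field" term
noise = rng.standard_normal((n_seconds, n_nodes))  # per-node noise
z = noise + coupling * common[:, None]

# A shared component of amplitude eps induces a pair correlation of
# eps^2 / (1 + eps^2), uniform across node pairs.
corr = np.corrcoef(z, rowvar=False)
off_diag = corr[np.triu_indices(n_nodes, k=1)]
print("mean pair correlation:", round(off_diag.mean(), 4),
      "predicted:", round(coupling ** 2 / (1 + coupling ** 2), 4))
```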
More broadly, the GCP results are of relevance for the study of mind and brain because they bear
directly on fundamental questions of consciousness. The starting point for much research in
conventional brain science is: What are the neural correlates which give rise to consciousness?
This question assumes that consciousness reduces to brain activity. The riskier starting point of
the GCP is to ask: Are there correlates or a presence of consciousness to be found outside the
brain? The question is highly challenging because it posits phenomena that are anomalous from a
conventional standpoint. The search for neural correlates is unquestionably important for a
comprehensive understanding of human consciousness. But the context and the meaning of that search change completely if direct correlates of consciousness are found in the broader world.
Finally, the GCP results inspire deeper questions about our relation to the world and each other.
Might we find that the best explanation, after all, resembles a coherent, extended consciousness
akin to Teilhard de Chardin's aesthetic vision of a noosphere? While this is a possibility beyond
the supply lines of our scientific position, the experimental results are consistent with the idea
that subtle linkages exist between widely separated people, and that consciousness is implicated.
What should we take away from this scientific evidence of interconnection? If we are persuaded
that the subtle structuring of random data does indicate an effect of human attention and emotion
in the physical world, it broadens our view of what consciousness may mean. One implication is
that our attention matters in a way we may not have imagined possible, and that cooperative
intent can have subtle consequences. This is cause for reflection about our responsibilities in an
increasingly connected world. It is clear that our future holds challenges of planetary scope that
will demand the full openness and clarity of science. On that, we will want to be of one mind.
Acknowledgments
The Global Consciousness Project would not exist except for the contributions of Greg Nelson and John
Walker, who created the architecture and the sophisticated software. Paul Bethke ported the software to
Windows, thus broadening the network. Dean Radin, Dick Bierman, and others in the planning group
contributed ideas and experience. Rick Berger helped to create a comprehensive website to make the
project available to the public. The Project also would not exist but for the commitment of time,
resources, and good will from all the hosts of network nodes. Our financial support comes from
individuals including Charles Overby, Tony Cohen, Reinhilde Nelson, Marjorie Bancel, Michael Heany,
Alexander Imich, Richard and Connie Adams, Richard Wallace, Anna Capasso, Michael Breland, Joseph
Giove, J. Z. Knight, Hans Wendt, Jim Warren, John Walker, Alex Tsakiris, and the Lifebridge Foundation.
We also gratefully acknowledge online donations from many individuals. Finally, there are very many
friends of the project whose good will, interest, and empathy open a necessary niche in consciousness
space. The GCP is affiliated with the Institute of Noetic Sciences, which is our non-profit home.
References
1. Eddington AS. The Nature of the Physical World. MacMillan; 1928. (1926–27 Gifford lectures.)
2. Irwin HJ, Watt CA. Introduction to Parapsychology, Fifth Edition. Jefferson, NC: McFarland; 2007.
3. Rhine JB, Pratt JG, Stuart CE, Smith BM, Greenwood JA. Extra-sensory Perception after Sixty Years. Boston: Bruce Humphries; 1966. (Original work published 1940.)
4. Schmidt H. PK Tests with a High-Speed Random Number Generator. J Parapsych 1973;37:105-118.
5. Radin DI, Nelson RD. Evidence for consciousness-related anomalies in random physical systems. Found Phys 1989;19:1499-1514.
6. Radin DI, Nelson RD. Meta-analysis of mind–matter interaction experiments: 1959–2000. In: Jonas W, Crawford C, eds. Healing, Intention and Energy Medicine. London: Harcourt Health Sciences; 2003.
7. Jahn RG, Dunne BJ, Nelson RD, Dobyns YH, Bradish GJ. Correlations of random binary sequences with pre-stated operator intention: A review of a 12-year program. J Sci Explor 1997;11:345-368.
8. Jahn RG, Dunne BJ, Nelson RD, et al. Special Issue on PEAR Lab, Explore: J Sci Heal 2007;3:191-346.
9. Nelson RD, Bradish GJ, Dobyns YH, Dunne BJ, Jahn RG. FieldREG anomalies in group situations. J Sci Explor 1996;10:111-141.
10. Nelson RD, Bradish GJ, Dobyns YH, Dunne BJ, Jahn RG. FieldREG II: Consciousness Field Effects:
Replications and Explorations. J Sci Explor 1998;12:425-454.
11. Bierman DJ. Exploring correlations between local emotional and global emotional events and the behavior of
a random number generator. J Sci Explor 1996;10:363-374.
12. Radin DI, Rebman JM, Cross MP. Anomalous organization of random events by group consciousness: Two
exploratory experiments. J Sci Explor 1996;10:143-168.
13. Nelson RD. Multiple Field REG/RNG Recordings during a Global Event. Electronic Journal Anomalous
Phenomena (eJAP) 1997; accessed Jan 26 2010 http://noosphere.princeton.edu/ejap/gaiamind/abstract.html.
14. Nelson RD, Boesch H, Boller E, et al. Global Resonance of Consciousness: Princess Diana and Mother
Teresa. Electronic Journal Anomalous Phenomena (eJAP) 1998; accessed Jan 26 2010
http://noosphere.princeton.edu/ejap/diana/1998_1.html.
15. Nelson RD. Correlation of Global Events with REG Data: An Internet-Based, Nonlocal Anomalies
Experiment. J Parapsych 2001;65:247-271.
16. Bancel PA, Nelson RD. The GCP Event Experiment: Design, Analytical Methods, Results. J Sci Explor
2008;22:309-333.
17. Nelson RD, Radin DI, Shoup R, Bancel PA. Correlations of Continuous Random Data with Major World
Events. Found Phys Letters 2002;15:537-550.
18. Williams M. The end of the world is pretty. Skeptic, Skeptical Digest 2008;21.3.
19. Lévi-Strauss C. The Raw and the Cooked: Introduction to a Science of Mythology: 1. Translated from the
French. Penguin Books; 1966.
20. May EC, Utts JM, Spottiswoode SJP. Decision Augmentation Theory: Toward a Model of Anomalous Mental
Phenomena. J Parapsych 1995;59:195-220.
21. Schmidt H. Comparison of a Teleological Model with a Quantum Collapse Model of Psi. J Parapsych
1984;48:261-276.
22. Shoup R. Anomalies and Constraints - Can clairvoyance, precognition and psychokinesis be accommodated
within known physics? J Sci Explor 2002;16:3-18.
23. Schmidt H. A Puzzling Aspect of the "Global Consciousness Project", Letter to the Editor. J Sci Explor 2009;23:507-509.
24. Nelson RD, Bancel PA. Response to Schmidt's commentary on the Global Consciousness Project, Letter to
the Editor. J Sci Explor 2009;23:510-516.
25. Teilhard de Chardin P. Le Phénomène Humain, written 1938–40, French publication 1955; translation, The
Phenomenon of Man. Harper Perennial; 1976.
26. Atmanspacher H. Scientific Research between Orthodoxy and Anomaly. J Sci Explor 2009;23:273-298.
27. Nelson RD. The Emotional Nature of Global Consciousness. Behind and Beyond the Brain: 7th Symposium of the Bial Foundation, Emotions, Proceedings. Porto: Fundação Bial; 2008.
28. Feynman R. The Pleasure of Finding Things Out. Interview: BBC television program Horizon; 1981;
accessed Jan 28 2010 http://www.scribd.com/doc/23587979/