Q&A

ResearchGate Q&A lets scientists and researchers exchange questions and answers relating to their research expertise, including areas such as techniques and methodologies.

Browse by research topic to find out what others in your field are discussing.

  • Prasad Srikakulapu asked a question in Mice:
    Does anyone have an idea about differences in immune deficiencies between ApoE-/- mice and LDLR-/- mice?

    In atherosclerosis, secretory IgM (sIgM) is important for protection. Atherosclerosis was significantly greater in sIgM-/-LDLR-/- mice than in sIgM+/+LDLR-/- mice after 12 weeks of Western diet (WD) feeding. However, there is no significant difference in atherosclerosis after 12 weeks of WD feeding in sIgM-/-ApoE-/- mice compared with their control group, sIgM+/+ApoE-/- mice.

  • Julia B. Smith added an answer in Multivariate Data Analysis:
    If the data are not multivariate normal, can I ignore normality and continue the analysis?

    Excuse me, I need literature to support the claim that, if the data are not multivariate normal, the researcher can ignore normality and continue the analysis.

    Thanking you

    Julia B. Smith · Oakland University

    James's question is key here - it is not as important that the data are not normally distributed as it is that you have a handle on why you have those results.  If, for example, the data are continuous but skewed, SEM is fairly robust in its estimation (although, you may want to consider Bayesian estimations).  But if your data are not continuous, drawing inferences from SEM will lead to strange results unless you have carefully articulated your measurement parameters.  If you can share your area of study, I can come up with some articles (for example, this is a topic in both marketing research and biostatistics, but the literature doesn't cross over particularly well).
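
    Before deciding how to handle the non-normality, it can help to quantify it. The sketch below (Python, with placeholder data and column names of my own invention) runs the univariate skewness/kurtosis rule of thumb alongside Mardia's multivariate skewness and kurtosis statistics; it is only a diagnostic sketch, not a prescription for any particular SEM package.

    # Minimal sketch: univariate and Mardia-style multivariate normality checks.
    # `df` and its column names are placeholders for your own data.
    import numpy as np
    import pandas as pd
    from scipy import stats

    df = pd.DataFrame(np.random.default_rng(0).normal(size=(200, 4)),
                      columns=["x1", "x2", "x3", "x4"])  # placeholder data

    # Univariate rule of thumb: |skewness| and |excess kurtosis| below 1
    print(df.apply(stats.skew))
    print(df.apply(stats.kurtosis))

    # Mardia's multivariate skewness and kurtosis
    X = df.to_numpy()
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    D = Xc @ np.linalg.inv(np.cov(X, rowvar=False, bias=True)) @ Xc.T
    b1p = (D ** 3).sum() / n ** 2              # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n          # multivariate kurtosis

    skew_stat = n * b1p / 6                    # ~ chi2, df = p(p+1)(p+2)/6
    kurt_stat = (b2p - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)  # ~ N(0,1)
    print("Mardia skewness p =", stats.chi2.sf(skew_stat, p * (p + 1) * (p + 2) / 6))
    print("Mardia kurtosis p =", 2 * stats.norm.sf(abs(kurt_stat)))

    Non-significant p-values are consistent with multivariate normality; if they are significant, the choice of a robust or Bayesian estimator is the field-specific question discussed above.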

  • Shelley Burgin added an answer in Mark-Recapture:
    Does anyone have information they could share on recapture rates in anuran amphibians, preferably in the tropics?

    I have found some papers reporting recapture rates in amphibian mark-recapture studies, but they are few, and if anyone has additional data they could share, particularly from tropical areas, I would really appreciate it. If anyone knows of mark-recapture studies specifically with tadpoles, I would love to hear about those as well. Thank you in advance!

    Shelley Burgin · Bond University

    Hi Jessica, although marginal to your question, there is a capture-mark-recapture study (Chapter 5: Population characteristics and growth of Limnodynastes tasmaniensis) in the PhD thesis by C. B. Schell (2002) titled 'Ecology and life-history variation within a population of the frog Limnodynastes tasmaniensis (Anura: Myobatrachidae) from a remnant woodland of the Cumberland Plain in North-Western Sydney' (University of Western Sydney, Australia, recently renamed Western Sydney University). This area is temperate rather than tropical, but temperatures can approach 40 degrees C in summer and winters are relatively short. The thesis will be available online; unfortunately, this work has not been published. Hope this helps, Shelley

  • Hamza Imran added an answer in Correlation Analysis:
    Should you implement correlation before or after factor analysis?

    I have a questionnaire to analyze. The questionnaire consists of 72 questions: the dependent variable is one question, while the independent variables are the remaining 71 questions. I would like to know which of these questions have a significant correlation with the dependent variable. So should I implement factor analysis before or after the correlation analysis to find the underlying factors that have a relationship with the dependent variable?

    Hamza Imran · Oklahoma State University - Stillwater

    Thank you for all the answers.
    My independent variables are of different types: some are nominal, others dichotomous, others ordinal, and some are measured on a 1-5 Likert scale. My dependent variable is measured on a 1-5 Likert scale. I was thinking of using Spearman correlation. After discovering which of these independent variables have a significant relationship with the dependent variable, I will do the factor analysis.
    Is this a practical solution?
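
    As a rough illustration of the screening step described above, here is a hedged Python sketch: it computes Spearman correlations between each item and the outcome and keeps the significant ones. The data frame, file name and column names are hypothetical, and the sketch treats every item as at least ordinal; the nominal and dichotomous items would need a different association measure (e.g. chi-square or point-biserial).

    # Sketch of the proposed two-step approach: Spearman screen, then factor
    # analysis on the retained items. `survey` and its columns are hypothetical.
    import pandas as pd
    from scipy.stats import spearmanr

    survey = pd.read_csv("questionnaire.csv")      # hypothetical file
    outcome = survey["q72"]                        # the single dependent item
    items = survey.drop(columns=["q72"])

    results = []
    for name in items.columns:
        rho, p = spearmanr(items[name], outcome, nan_policy="omit")
        results.append({"item": name, "rho": rho, "p": p})

    screen = pd.DataFrame(results).sort_values("p")
    retained = screen.loc[screen["p"] < 0.05, "item"].tolist()
    print(screen.head(10))
    print(len(retained), "items pass the screen")
    # The retained items could then go into an exploratory factor analysis
    # (e.g. with the factor_analyzer package), bearing in mind that screening
    # on bivariate significance before factoring is itself a debatable choice.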

  • Sounderya Nagarajan added an answer in Bioimaging:
    Non-luminescent particles in the batch and accounting for them in bioimaging?

    In the case of fluorescent nanocrystals, not all crystals fluoresce. This can be seen when we do EM with cathodoluminescence: some crystals do not emit, whereas in fluorescence we only see the crystals that do emit. If we are quantifying uptake using luminescence, how does one account for the particles that don't luminesce?

    Sounderya Nagarajan · Université Paris-Sud 11

    I am looking for papers on the biological implications of the dark nanoparticles. Especially in the case of using multi-colored nanoparticles to understand cellular processes, there is no discussion of the possible presence of dark particles in the organelles or structures where they are targeted in cells. If we have surface modifications on the particles and use color to track and predict dominant localization, one could be wrong if the dark particles are not accounted for. I haven't found any papers that discuss this issue, so if anyone has a link it would be great.

  • Konstantin Vlasov added an answer in Polyelectrolyte:
    What is a good computer simulation software for electrostatic interactions within the flow?

    Hello everyone.

    I have only started with the theory for simulations, but I already have a question. Say I have a system: a laminar flow with two components in it, a polyelectrolyte solution and oppositely charged particles. They interact within the flow. I understand the mathematical model of this situation, but as I don't have any practice I have a question: what software can be used for this type of simulation?

    Thank you.

    Konstantin Vlasov · National Research Nuclear University MEPHI

    Iuliia, thank you! Ask if you need to know more.

  • Daniel S Park added an answer in FigTree:
    Does anyone know how I can view a populations output file in MEGA?

    I have generated a phylogenetic tree using populations and I am trying to view and edit the output tree in MEGA or FigTree.

    Daniel S Park · Harvard University

    If the tree is in Newick format, it can be imported and edited in MEGA as well. 
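
    If MEGA or FigTree refuses the populations output as-is, one workaround is to read it with a scripting toolkit and re-save it as plain Newick. A small sketch using Biopython, assuming the file is already Newick-like (the file names are hypothetical):

    # Sketch: read a tree file and re-export it as plain Newick so that MEGA or
    # FigTree can open it. Input and output file names are hypothetical.
    from Bio import Phylo

    tree = Phylo.read("populations_output.tre", "newick")  # try "nexus" if this fails
    Phylo.draw_ascii(tree)                                 # quick sanity check
    Phylo.write(tree, "tree_clean.nwk", "newick")          # open this in MEGA/FigTree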

  • Matthew J. Clement added an answer in Bats:
    What are the best research journals for the general behavior and social interactions of bats?

    Any details would be extremely helpful.

    Matthew J. Clement · United States Geological Survey

    Good answers already, but 1) Journal of Ecology is a plant journal and 2) I usually start with the papers I referenced in my own manuscript. If I have multiple references from journal X, I will put that on the short list. If the single most relevant paper is from journal Y, that goes on the short list too.

  • Ramzan Nazim Khan added an answer in Data:
    How to compare 10000 estimated data sets to one original data set?

    We have one original data set and we used an algorithm to produce 10000 data sets.

    How to compare the 10000 data sets to the original data set and select the best data set that matches with the original data set?

    Ramzan Nazim Khan · University of Western Australia

    RMSE is perhaps the best measure of how "close" two data sets are. But ALL of them should have the same distribution, and given the nature of randomness, ALL of them should be representative of your original data even though they will not be exactly the same.

    The more important question is: What is the point of this? Why are you doing this? Is there some other reason you haven't revealed yet?
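
    For what it's worth, the RMSE ranking described above is only a few lines of code. This is a generic Python sketch assuming the original data and the 10,000 generated sets are numeric arrays of the same shape; the variable names and placeholder data are made up.

    # Sketch: rank 10,000 generated data sets by RMSE against the original set
    # and pick the closest one. `original` and `generated` are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    original = rng.normal(size=500)                                    # placeholder
    generated = original + rng.normal(scale=0.1, size=(10_000, 500))   # placeholder

    rmse = np.sqrt(((generated - original) ** 2).mean(axis=1))
    best = int(np.argmin(rmse))
    print("best data set:", best, " RMSE =", round(float(rmse[best]), 4))
    # A distributional check (e.g. a two-sample KS test per data set) can
    # complement RMSE, since all sets should share the original's distribution.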

  • Taylor Orr added an answer in Western Blot Analysis:
    Does anybody have a suggestion regarding an alternative for phosphatase inhibitor cocktails?
    The cocktail is expensive and we sometimes run out of it. I'm interested in phosphoprotein analysis.
    Taylor Orr · Duke University

    This is from Abcam's website: 

    Sodium orthovanadate preparation

    All steps to be performed in a fume hood.

    Prepare a 100 mM solution in double distilled water. 
    Set pH to 9.0 with HCl.
    Boil until colorless. Minimize volume change due to evaporation by covering loosely.
    Cool to room temperature.
    Set pH to 9.0 again.
    Boil again until colorless.
    Repeat this cycle until the solution remains at pH 9.0 after boiling and cooling.
    Bring up to the initial volume with water.
    Store in aliquots at -20°C. Discard if samples turn yellow.

    @Valerie, do you make aliquots of the prepared orthovanadate to avoid repeated freeze-thaw cycles?

  • Ashok Kumar Mallik added an answer in Phylogenetic Analysis:
    How do I generate a consensus DNA sequence (contig) from forward and reverse sequences? Which software should I use?
    After generating the sequence, how can I confirm that the desired contig sequence is correct? Currently I am trying to use BioEdit to generate my desired sequence from the forward and reverse sequences I have. Please suggest any other free and easy-to-use software available online. Can MEGA 5 or PAUP solve this problem?
    Ashok Kumar Mallik · Indian Institute of Science

    Align your sequences in some alignment software (e.g. MEGA 6) with available sequences of closely related species (sister species) from GenBank and chop out ambiguous sites.
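
    As a rough programmatic alternative to assembling the contig by hand in BioEdit, the forward read and the reverse-complemented reverse read can be aligned and merged in a script. This is a hedged Biopython sketch with made-up file names and a deliberately naive consensus rule; it does not replace inspecting the chromatogram traces at disagreeing positions.

    # Sketch: build a consensus (contig) from a forward and a reverse read by
    # reverse-complementing the reverse read and aligning the two reads.
    from Bio import SeqIO, Align

    fwd = SeqIO.read("forward.fasta", "fasta").seq
    rev = SeqIO.read("reverse.fasta", "fasta").seq.reverse_complement()

    aligner = Align.PairwiseAligner()
    aligner.mode = "global"
    aligner.match_score, aligner.mismatch_score = 2, -1
    aligner.open_gap_score, aligner.extend_gap_score = -2, -0.5
    aln = aligner.align(fwd, rev)[0]

    # Naive consensus: keep the base the two reads agree on, fill gaps from the
    # other read, and mark real disagreements as N for manual checking.
    a, b = str(aln[0]), str(aln[1])
    consensus = "".join(
        x if x == y or y == "-" else (y if x == "-" else "N")
        for x, y in zip(a, b)
    )
    print(consensus.replace("-", ""))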

  • Debojit Sarker asked a question in Concrete:
    How can I ensure the quality of fresh concrete for the construction of a radiation-shielding concrete room with a slab thickness of 8.5 feet?

    PPC cement, coarse aggregate (a mixture of 3/4-inch and 1/2-inch downgraded stone chips), fine aggregate (coarse sand), and an admixture (MasterPolyheed) are used for this construction.
    The fresh density is kept above 2.4 g per cc; the hardened density requirement is above 2.35 g per cc.
    Ice is being used to decrease the mixing water temperature (to avoid hairline cracks in the future).

    The key concern is to avoid radiation leakage.

    The rooms are being constructed in the basement, with wall thicknesses of 5 to 8 feet and slab thicknesses of 4 to 8.5 feet. The rooms will be used for oncology treatment and/or tomotherapy.

  • Nick Schaum added an answer in Quantitative RT-PCR:
    Why do my amplification curves appear dampened with the Fluidigm gene expression array?

    I am using the 96.96 Fluidigm gene expression array with samples from various mouse tissues (not single cell), as well as a set of purchased fetal cDNA. In the attached image, the group of amplification curves with plateaus of ~0.02 consists of my mouse tissue samples, whereas those that reach a plateau of 0.04-0.06 are the fetal cDNA samples. Can anyone explain the discrepancy? It appears PCR is inhibited in my mouse tissue samples.

    Mouse tissue RNA was isolated with TRIzol, phase-separated with phase-lock gel, and run through RNeasy columns. 260/230 and 280/230 are >2 for most samples; RINs are 8.5-10. I made cDNA with 200 ng RNA and 20 pre-amp cycles, all with Fluidigm kits. Pre-amp'd samples were serially diluted from 5-fold to 120-fold.

    Fetal cDNA (20 ng) was pre-amp'd with 14 cycles and serially diluted from 3-fold to ~5 million-fold. This follows Appendix 1 of the single-cell analysis document attached.

    Nick Schaum · Stanford University

    I ran a plate-based qPCR and all samples looked good. I've since run two more 96.96 chips and all samples showed good amp curves, and expression matches published results. I still don't know why I had dampened curves for my original run, but it appears everything is working. I settled on 10 pre-amp cycles with 10-fold dilution after Exo I treatment.

  • Dk Matai asked a question in Cognitive Computing:
    Why's IBM Investing $3 Billion In Quantum Computing & Synth Brains? Is It The Trillion Dollar Humanoid Market?

    1. Silicon technology has taken humanity a long way forward from 1947 when the first transistor was invented by the Nobel prize winners Shockley, Bardeen & Brattain.

    2. From the smart mobile telephones we rely on to the sophisticated satellite navigation systems guiding our cars, much of the techno-magic we see around us results from our ability to scale silicon technology, turning what was hitherto science fiction into everyday reality at affordable prices.

    3. All the Nobel laureates, scientists and engineers we liaise with at Quantum Innovation Labs http://QiLabs.net collectively realise that the silicon-scaling era is coming to an end as Moore's Law for silicon-based computers finally concludes.

    4. There will come a point in the medium-term future where microchips will no longer be made just out of silicon because other materials such as diamond, carbon nanotubes, graphene and neo-material hybrid chips will allow for faster and more complex computation.

    5. Over the next five years, IBM is likely to invest a significant amount of their total revenue in technologies like non-silicon computer chips, quantum computing and computers that mimic the human brain, thereby staking Big Blue’s long-term survival on big data and cognitive computing.

    6. IBM’s investment is one of the largest for quantum computing to date and the company is one of the biggest researchers in this nascent field.

    7. The point of this latest visionary and bold three-billion-dollar announcement is to underscore IBM's commitment to the future of computing, just as it has historically been a pioneer in the computing field since its early days in the mid-1940s, during and immediately after the Second World War.

    8. This $3 billion funding round will go towards a variety of projects designed to catapult semiconductor manufacturing past the "end of silicon scaling" in microchips, i.e., the end of Moore's Law.

    9. Most major semiconductor and computing players see an end to silicon scaling within the next three to four tech generations.

    10. Beyond that, Quantum Innovation Labs http://QiLabs.net sees a new way to make money from diamond, carbon nanotubes, graphene and neo-material hybrid chips via long-term returns from holding valuable, potentially lucrative patents and intellectual property in cognitive and quantum computing.

    11. The new R&D initiatives embraced by semiconductor manufacturing and computing players such as IBM fall into two categories:

    a. Developing nanotech components for silicon chips for big data and cloud systems; and
    b. Experimentation with "post-silicon" microchips to include:

    i. Research into quantum computers, which compute with qubits (Q-bits) as opposed to binary code;
    ii. Neurosynaptic computers, which mimic the behavior of living brains;
    iii. Carbon nanotubes;
    iv. Graphene tools; and
    v. A variety of other neo-material hybrid technologies.

    12. Even after the end of silicon scaling, performance scaling in computer systems to access the trillion dollar new market in artificial intelligence, machine learning & human-like robots is likely to continue in various ways. To this end the R&D efforts of IBM, Google, Microsoft, Amazon, NASA, CIA and NSA amongst other government and private players in Europe, Russia, Asia -- including China, Japan and India -- Australia and South Africa are focused on different ways and means by which this might be done.

    [ENDS]

    What are your thoughts, observations and views? Please visit http://QiLabs.net

  • Harrie A Verhoeven added an answer in NeuN:
    How do I gate nuclei from debris using flow cytometry?

    How do I gate a population of nuclei from debris in an FSC vs. SSC plot, or any other plot for that matter? I am analyzing a live single-nucleus suspension from rat brain tissue, stained with a Hoechst DNA dye and the neuronal marker NeuN.

    Harrie A Verhoeven · Wageningen University

    Hello Johanna,

    Isolated nuclei from rat brain, if the tissue is fresh, should be easy to gate from an FSC/SSC diagram. Since brain cells generally do not cycle, nuclei should all be the same size/structure and stand out in the scatter plots. If the Hoechst stain really works (and it works best on unfixed nuclei), then it should enable you to combine any scatter signal with the Hoechst fluorescence intensity signal, which should be (approximately) proportional to the DNA content of the nuclei. Comparison, e.g. with chicken erythrocytes, should give the absolute DNA content of the rat brain nuclei. The neuronal marker should be labelled with a dye that does not interfere with the Hoechst dye, so a number of two-parameter plots can be obtained: FSC/SSC, Ho/FSC, Ho/Marker, etc. It depends on the brand and type of your FCM.
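
    If a scripted version of those gates is useful for batch processing, the same logic can be applied to exported FCS files offline. This is only a sketch: it assumes the third-party fcsparser package, and the channel names and thresholds are hypothetical and have to be read off your own instrument's plots.

    # Sketch: rectangular gating of nuclei away from debris on FSC/SSC plus a
    # Hoechst intensity cut-off. Channel names and thresholds are hypothetical.
    import fcsparser

    meta, events = fcsparser.parse("brain_nuclei.fcs", reformat_meta=True)

    fsc_lo, ssc_lo = 5_000, 2_000   # debris sits below these (instrument-specific)
    hoechst_lo = 1_000              # sub-nuclear debris cut-off on the Hoechst channel

    gated = events[(events["FSC-A"] > fsc_lo) &
                   (events["SSC-A"] > ssc_lo) &
                   (events["Hoechst-A"] > hoechst_lo)]
    print(len(gated), "of", len(events), "events kept")
    # A two-parameter plot such as gated.plot.scatter(x="Hoechst-A", y="NeuN-A")
    # then reproduces the Ho/Marker view described above.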

    Hope this helps, but feel free to ask for more.

    Harrie

  • Burak Omer Saracoglu asked a question in Citations:
    Why do academics give so much importance to journals in Science Citation Index, Science Citation Index-Expanded and Social Sciences Citation Index?

    Dear Researchers;
    I have a few questions on the same topic, as follows.
    Why do academics give so much importance to the journals in the Science Citation Index (SCI), the Science Citation Index-Expanded (SCI-Expanded) and the Social Sciences Citation Index (SSCI)?
    Visit http://ip-science.thomsonreuters.com/mjl/ for Journal Lists for Searchable Databases, and https://en.wikipedia.org/wiki/Science_Citation_Index, https://en.wikipedia.org/wiki/Social_Sciences_Citation_Index
    Do you think that the scientific review processes of the journals in these indexes are better than the journals in the other indexes?
    Do you think that the publication decisions of the journals in these indexes are more appropriate than the publication decisions of the other indexes?
    What are your scientific review experiences as an author in the journals in these indexes?
    What are your scientific review experiences as an author in the journals in the other indexes?
    I would like to thank you for your answers and contributions in advance.
    Best Regards

  • Debra Sharon Ferdinand added an answer in Conscience:
    What is your comment about the following statements?

    If the heart is dead, the mercy is gone.

    If the mind is dead, the wisdom is gone.

    If conscience is dead, everything is gone.

    Debra Sharon Ferdinand · The University of the West Indies, Trinidad and Tobago

    If I may add a spiritual aspect to the topic: The soul is the subject of human consciousness and freedom; soul and body together form one unique human nature. Each human soul is individual and immortal, immediately created by God. The soul does not die with the body, from which it is separated by death, and with which it will be reunited in the final resurrection (Catholic Answers.com). Therefore, even if the heart, mind, and conscience are warped and can potentially lead one to death, the soul does not die. As such, there's always hope that someone's prayers can touch the person's soul either in life or death to be reconciled with the Creator.

    many thanks,

    Debra

  • Sounderya Nagarajan added an answer in DCFH-DA Assay:
    What type of plate should we use for a DCFH-DA assay measuring overall ROS fluorimetrically: a clear- or a dark-bottom 96-well plate?

    What is the main difference between dark- and clear-bottom plates for measuring fluorescence?

    Sounderya Nagarajan · Université Paris-Sud 11

    There are different types of plates available and you should choose one to match the device configuration. If you are using a system which allows fluorescence excitation and detection from the top, then a dark-bottom plate would be great. If that's not the case and the sample is excited from the top while detection is from the bottom, you would need a dark plate that has a transparent bottom. Usually dark plates are preferred for fluorescence because there is less scattering from the well walls, better collection of the emitted light, and, if you have a lot of samples and don't leave space between wells, less crosstalk than in a clear plate. If your samples are intense and you can load them a few wells apart, you could still get a decent result from clear plates after running a blank to evaluate the background fluorescence in an empty well or with diluent but no fluorophore.

  • Leonel Linares added an answer in Personality Psychology:
    Is there an instrument to measure the level of humiliation?

    Also of interest: work concerning reactions to humiliations. 

    Leonel Linares · Universidad Juárez del Estado de Durango

    Hello. Does anyone know if there is such a scale in Spanish?

  • Rodrigo Sychocki da Silva added an answer in Mathematical Concepts:
    How does the use of technology influence the subject in the construction of mathematical concepts?

    Through empirical testing, checking procedures, changing parameter settings, and exploring relations between objects, it is possible that technology will help the subject in the process of reflective abstraction. My central hypothesis about how mathematical knowledge is constructed is that the coordination of the subject's actions (physical and mental) evolves towards the understanding and construction of concepts involving the objects of knowledge.

    Rodrigo Sychocki da Silva · Federal Institute of Education, Science and Technology from Rio Grande do Sul

    Hi Ana,
    I agree with many points in your submitted article. It is an evolution of the construction of knowledge by the subject. To understand the objects in their maximum completeness, I believe that technology is a fundamental and essential partner. But the cognitive work is (and always will be) the subject's! Hug.

  • Sergio Kogikoski jr added an answer in Electrochemical Methods:
    Is there an optimum range for CNT concentration in aqueous solutions with the aim of creating composite coatings by electrochemical methods?

    We are going to do an experimental design and need to know about the general  CNT concentration range.

    Sergio Kogikoski jr · Universidade Federal do ABC (UFABC)

    I would work from 1 to 5 mg/mL in solution; maybe above this it starts to agglomerate and loses part of its electrochemical advantages.

  • Amritlal Mandal added an answer in Epithelial Cell Culture:
    How do you prevent holes from forming in air-liquid airway epithelial cell culture after removal of apical media?

    Shortly after changing from submerged culture to ALI, holes have formed in the previously confluent cell layer. This is not due to mechanical damage; it looks as though the cells are contracting away from each other, which is letting basal media through the membrane and compounding the problem. This doesn't happen every time and our incubator set-up is working properly. The cells aren't infected or unhealthy.

    Amritlal Mandal · The University of Arizona

    Hi Nadeene,

    We have also seen this issue while culturing primary lens epithelial cells. Frankly speaking, we were not able to understand the cause behind this observation, but with a few small modifications to the culture protocol we were able to overcome this unwanted hole formation.

    1. Try keeping the seeding density at ~15,000/sq cm and always try to work with lower-passage cells (p2-p3). Try determining the epithelial phenotype of the cells by immunocytochemical staining for E-cadherin, cytokeratin 5/8, etc. If you see less expression of E-cadherin and more expression of N-cadherin, this is an indication of transformation of the epithelial cells towards a connective-tissue type of mesenchymal origin.

    2. Do not culture the cells for too long, as that makes them elongate and retract from each other. Try trypsinizing cells when 80% confluent. The holes and retraction are common when the cells are confluent or over-confluent.

    3. Use appropriate cell culture supplements (EGF etc.) to keep cells healthy.

    Though I must admit that lens epithelial cells and airway epithelial cells may not behave similarly in culture, and the lens epithelium has an intrinsic tendency to form clusters of small colonies called lentoids (similar to spheroids) due to lens physiology and anatomy.

  • Sergio Kogikoski jr added an answer in Electrochemical Impedance Spectroscopy:
    Can anyone help with a contamination problem in EIS measurements of thin-film DLC electrodes on Si/Ti substrates?

    Our set-up is: electrolyte (sulfuric acid), working electrode of DLC thin film (from 4 to 50 nm) on top of a Si/Ti substrate, counter electrode of graphite rod, and Ag/AgCl in KCl as reference. It appears that the system is not stabilizing to OCP very easily and keeps drifting as a function of time. We also sometimes get negative solution resistances and otherwise weird results. We suspect there may be contamination in the cell, since this behavior appears to point towards adsorption of something on the electrode surface. We were also thinking of putting a Pt wire between the solution and the reference, with a small capacitor connected. Would it be best to also change the counter electrode to a Pt wire to reduce contamination? Any other suggestions? Thank you already in advance for your help.

    Sergio Kogikoski jr · Universidade Federal do ABC (UFABC)

    Maybe the film that is deposited is not stable... If you are trying to use the OCP of the film, you have to remember that this is an equilibrium potential, so your species can oxidize or reduce and in that way degrade the film, changing the OCP as a result... I would try to add an electrochemical probe, such as ferricyanide, to the solution and use the ferro/ferri OCP... Hope this helps you!

  • Charles Francis added an answer in Theoretical Physics:
    Can the Quantum 'Mechanical' Description of Physical Reality Be Considered Complete?

    Simplicity is the key to the interpretation of physics. Nothing is simpler, in the analysis, than supposing the existence of some "hidden" parameter, invisible and not measurable, which is an integral part of a pair of photons and which tells them at the time of their creation: "you are oriented east" or "you are oriented west". This analysis requires us to introduce "hidden variables", a move which in physics is debatable but which allows us, in a very elegant way, to explain everything in realistic terms. The pair of photons has its own objective reality that describes them completely. Part of this reality is unknowable, but never mind: the problem is only human; nature is safe.

    We have two options: 1) quantum mechanics is inherently probabilistic; 2) quantum mechanics is not inherently probabilistic, but deterministic. The first position is that of the so-called "Copenhagen interpretation", still widely credited among physicists, while the second was that of Einstein-Podolsky-Rosen (EPR) and of the "hidden variables". Subsequently, Bell showed that local hidden variables cannot be there. In 1964 John Bell pointed the way to an experimental test of the existence of hidden variables, and subsequent experiments, especially those of Alain Aspect's French group, have shown the full validity of quantum mechanics.

    The second theoretical position, then, is no longer sustainable. It is, however, if we consider the fact that "ontological materiality" turns out to be broader than the "physical". There are no additional variables that can enter into the physical calculation, but there are material factors that physics fails to consider and which have an impact on theorizing. These factors determine the overall behavior of matter, which therefore appears inherently probabilistic. It can be said that Einstein was right: the hidden variables exist, only they lurk outside of physics, in ontology.

    Many physicists (Einstein foremost) have always refused to accept that indeterminacy is an inherent feature of physical reality. Consequently, they preferred to assume that the description provided by quantum mechanics was simply incomplete. Their reasoning, in practice, consists in saying: even at the microscopic level physical reality continues to be deterministic, only we cannot know the exact values of the state variables, and so we are forced into an indeterministic description. To explain this failure, many proponents of determinism (starting with Einstein himself) introduced the so-called "hidden variables". At the microscopic level there would be some factor, not yet known, which prevents us from giving a deterministic description. The moment we knew it, we could provide a completely deterministic description.

    For many years the debate between the advocates of hidden variables and the proponents of intrinsic indeterminism remained on a purely metaphysical level. In 1964, however, the physicist J. S. Bell derived a famous inequality (Bell's theorem) that made it possible to move to the experimental plane what until then had been a metaphysical discussion. This inequality, in practice, led us to expect different experimental results depending on whether the hypothesis of hidden variables (at least restricted to the so-called "local" theories) was true or not.
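
    For readers who want to see the quantitative content of the inequality, its CHSH form can be evaluated in a few lines: local hidden-variable theories bound the combination S by 2, while the quantum singlet-state correlation E(a, b) = -cos(a - b) reaches 2*sqrt(2) at suitably chosen angles. A minimal numerical sketch using the standard textbook angles, not tied to any particular experiment:

    # Sketch: CHSH combination S for spin measurements on the singlet state,
    # E(a, b) = -cos(a - b), at the angles that maximize the quantum violation.
    import numpy as np

    def E(a, b):
        return -np.cos(a - b)   # quantum correlation for the spin-1/2 singlet

    a, a2 = 0.0, np.pi / 2            # Alice's two settings
    b, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print("quantum |S| =", round(abs(S), 4), " (2*sqrt(2) =", round(2 * np.sqrt(2), 4), ")")
    print("local hidden-variable bound: |S| <= 2")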

    Now, the Heisenberg principle would not merely establish our inability to learn the values of the position and momentum of a particle at the same time. These values are not established before a measurement is made; they are absolutely and inherently indeterminate.

    Einstein's objections to quantum mechanics made sense because he was perfectly aware that quantum mechanics is incompatible with determinism. However, his obstinately deterministic views and his attempts to defend them (hidden variables) have not stood the test of the facts.

    Microscopic reality is inherently indeterminate. What is surprising, however, is that macroscopic reality is instead largely deterministic. Explaining this apparent contradiction is a fascinating challenge in theoretical physics. An interesting attempt at a solution appears to be the one provided by three Italian physicists, G. Ghirardi, A. Rimini and T. Weber (Physical Review D 34, 470, 1986).

    So, in this context it became obvious that the description of the states of a physical system offered by quantum mechanics was incomplete, and that this incompleteness was responsible for the indeterministic character of the theory. In other words, it was assumed that quantum mechanics is indeterministic only because our level of knowledge does not put us in a position to "see" some additional variables able to "complete" the description of the physical system provided by quantum mechanics. According to this conjecture, if we were able to identify these new, currently "hidden" variables, we would reach a level of description deeper than the quantum level, and at that level determinism could be recovered.

    In fact, the enigma of the "hidden variables" was not solved by a logical-deductive approach, as Popper might have wished, or only partially so.

    As already said, "in 1964 the issue reached a crucial turning point: J. Bell showed that for a large family of hidden-variable theories, the so-called local theories, it is impossible to reproduce by averaging over the hidden variables all the predictions of quantum mechanics." "Bell's result had the great merit of bringing the theme of possible deterministic completions of quantum mechanics to the experimental ground, and it aroused great interest in the realization of experiments sensitive to discrepancies between the predictions of quantum mechanics and those of the local hidden-variable theories." (Enrico Beltrametti)

    In 1981, Alain Aspect was able to carry out the first of a series of high-quality experiments. In practice, these experiments showed that Einstein had been wrong in suggesting the idea of hidden variables.

    As for Popper, we could say that he lost a game: the one with LQ (quantum logic).

    Popper's criticism was wrong from a logical point of view, but in many ways it had some basis. Popper did not want to admit the weakness of logic made explicit in the theory LQ. For Popper, logic was to remain an 'a priori' science, having as its main feature absolute independence from any content. Therefore, he refused to consider the possibility of choosing logics different from classical logic, more suitable than it to the empirical character of particular situations.

    Already in The Logic of Scientific Discovery, finished in 1934 and thus prior to the paper of Birkhoff and von Neumann, Popper anticipated: "... replacing the word 'true' with the word 'likely' and the word 'false' with the word 'unlikely', nothing is gained."

    However, Popper scored another, no less important point. The revolutionary discovery of Bell and Aspect did not come from pure inductivism, but from experiments carried out in the light of a theory already formulated 'a priori', that is, from a hypothesis subjected to strict scrutiny, identifying the elements and data that could refute it. At least on this ground, Popper took an important rematch.

    At the time of the articles written on Einstein's death, the controversy was still strong and "philosophical" issues carried great weight, so much so that an American physicist fell victim to McCarthyism and lost his job for supporting a deterministic model with hidden variables. Today we tend to minimize the importance of our imperfect knowledge of the subject; theories are used as they are, reaping their fruits without worrying about a coherent understanding of the underlying laws. Most physicists no longer interpret the principle of indeterminacy in a metaphysical way. It is considered a simple impossibility of knowing at the same time the position and momentum of the particles in a system still felt to be completely deterministic. After all, beyond the supposed wave-particle duality, in the macroscopic world too there is a kind of uncertainty: for example, I cannot measure my speed with an accuracy better than my reaction time in pressing the button on the timer.

    Charles Francis · Jesus College, Cambridge

    Sebastion, I have been very confused by your position. I think you do not understand the argument in the EPR paper. In a reductio ad absurdum argument one proves the premise false. If you say you disagree with the argument, then you are asserting the false premise.

  • John Christopher Guenther asked a question in Australian Literature:
    Can anyone suggest readings that challenge the belief that geographical remoteness is a disadvantage for people who live in rural or isolated areas?

    In Australia, remoteness indicators have often been used to measure 'disadvantage'. That disadvantage is rarely critically discussed in the Australian literature; rather, it is described with a range of indicators and, from my view, just taken as a given. Remoteness is one of them. I'm interested in hearing from researchers who have considered these issues in other places.

  • Graham Allan Partis added an answer in Black Hole Thermodynamics:
    What is information? Is it physical? Are there any information particles? Are there any information bosons which carry information?

    Mathematically, a system’s information content can be quantified by the so-called information entropy H, introduced by Claude Shannon in 1948. The larger the information entropy, the greater the information content. Consider the simplest possible information-storage device: a system with two distinct states, for example, up and down, left and right, or magnetized and unmagnetized. If the system is known with certainty to be in a particular state, then no new information can be gained by probing the system, and the information entropy is zero. An interesting question, then, is whether the thermodynamic consequences of the second law extend to information. Is it possible to extract useful mechanical work from a system just by observing its state? If so, how much? And at a more fundamental level, are the thermodynamic and information entropies related?
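
    To make the two-state example concrete, Shannon's H = -sum(p_i * log2 p_i) can be evaluated directly: it is 0 bits when the state is certain and 1 bit when both states are equally likely. A small illustrative sketch:

    # Sketch: Shannon information entropy H (in bits) of a two-state device
    # for a few probabilities of the "up" state.
    import numpy as np

    def shannon_entropy(p_up):
        probs = np.array([p_up, 1.0 - p_up])
        probs = probs[probs > 0]             # treat 0*log(0) as 0
        return float(-(probs * np.log2(probs)).sum())

    for p in (1.0, 0.9, 0.5):
        print("P(up) =", p, " ->  H =", round(shannon_entropy(p), 3), "bits")
    # Certain state: H = 0 (probing gains nothing); maximal uncertainty: H = 1 bit.

    Landauer's principle is the usual bridge to thermodynamics here: erasing one bit of information costs at least kT ln 2 of dissipated work.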

    Are there any information particles? Are there any information bosons which carry information? Is information fundamental?

    Graham Allan Partis · University of Southern Queensland 

    Sean Carroll says the world is made of waves that just seem like particles when they interact with something. You could argue that, in a similar way, physicality is really just information influencing minds.

  • Zeynab Faraji asked a question in Electrophoresis:
    Why is the size of my gene band different from that of my positive control?

    I've faced a problem during my work.
    In electrophoresis of the ureC gene PCR product from H. pylori, my gene band is 100 bp larger than the positive control band, even though I observe a single sharp band on the gel.
    Why is the size of my gene band different from that of my positive control?

  • Harrie A Verhoeven added an answer in Stock Solution:
    Where to store stock solution of plant extract dissolved in DMSO?

    I'm dissolving a plant extract (powder form, stored at room temperature) in DMSO for a stock solution. I will later perform serial dilutions of the stock in cell medium to get 1:100 and 1:1000 dilutions for a cell culture experiment.

    Where are stock solutions made with DMSO typically stored, and in what type of vials or tubes are they typically made?

    Harrie A Verhoeven · Wageningen University

    Dear Y.D.

    A bit of a disturbing question. The extract you want to test is provided in a dry, powdered form, stored at room temperature. Why do you want to dissolve it in DMSO? Do you know what the original volume was, and whether it was a water extract or another solvent? If you dissolve it in water-free DMSO, storage at -20 °C will last for ages. But take care with thawing, and let it come to room temperature before opening to prevent moisture contamination of the cold sample. It is better, though, to know the nature of the extract before diluting it and testing it on the cell lines. DMSO in itself is deleterious to most cells at concentrations above 0.5%.

    Regards,

    Harrie

  • Shuichi Shinmura added an answer in Basic Statistical Analysis:
    How do we know which test to apply for testing normality?

    Statistical methods include diagnostic hypothesis tests for normality, and a rule of thumb that says a variable is reasonably close to normal if its skewness and kurtosis have values between –1.0 and +1.0. 

    If the sample size is larger than 50, we use the Kolmogorov-Smirnov test. If the sample size is 50 or less, we would use the Shapiro-Wilk statistic instead. (Reference: www.utexas.edu/courses/.../AssumptionOfNormality_spring2006)

    For datasets smaller than 2000 elements, we use the Shapiro-Wilk test; otherwise, the Kolmogorov-Smirnov test is used.
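
    Both rules of thumb are easy to apply in practice. The following Python sketch, with simulated data standing in for a real sample, runs the skewness/kurtosis check, Shapiro-Wilk, and a Kolmogorov-Smirnov comparison side by side; note the caveat on estimating the normal's parameters from the same data.

    # Sketch: the normality checks mentioned above, run on a simulated sample.
    # Replace `x` with your own variable.
    import numpy as np
    from scipy import stats

    x = np.random.default_rng(1).normal(loc=10, scale=2, size=80)  # placeholder data

    # Rule of thumb: skewness and excess kurtosis between -1 and +1
    print("skewness:", stats.skew(x), " kurtosis:", stats.kurtosis(x))

    # Shapiro-Wilk (commonly recommended for small to moderate samples)
    print("Shapiro-Wilk:", stats.shapiro(x))

    # Kolmogorov-Smirnov against a normal with parameters estimated from the data.
    # Estimating them this way strictly calls for the Lilliefors correction
    # (available e.g. in statsmodels) rather than the plain KS p-value.
    print("KS:", stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))))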

    Shuichi Shinmura · Seikei University

    Ramon's idea is an easy check of normality for one variable.

    But there is no good test for multiple variables.

    My idea breaks out of this situation.