Questions related to Mixed Methods
Research methods textbook
Creswell's Research Design: Qualitative, Quantitative, and Mixed Methods Approaches (http://www.sagepub.com/books/Book232401) gives a short introduction to the three methods, but I don't know if it's elaborate enough for your purpose. If not, you can choose a book that discusses/compares qualitative and quantitative research methods, and another book that focuses on mixed methods.
I am starting content analysis to determine the dimensions of a construct and would love advice on coding instructions, and how to ensure reliability and validity as I will not have a team of researchers (so no inter-coder reliability or intra-coder reliability).
Any resources I should read?
I'd like to share my perspective as a single coder. Not having a team of researchers does not preclude you from assessing intra-rater reliability (though you would definitely need someone to help you establish inter-rater reliability). As Mackey and Gass (2005) explain, the researcher first codes all the data. Then, after some lapse of time (a few weeks or months), he or she re-codes the data, or some part of it. The scores produced by the same researcher at different points in time (hence, "intra-rater") can then be compared through standard inter-rater reliability check procedures.
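To make that comparison concrete, here is a minimal sketch (in Python, with invented codes) of how two coding passes by the same rater could be compared using Cohen's kappa, one standard agreement statistic; the excerpts and code labels are purely illustrative:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa between two coding passes of the same items."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: fraction of items given the same code both times
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement, from each pass's marginal code frequencies
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: the same 10 excerpts coded twice, weeks apart
first_pass  = ["A", "A", "B", "C", "B", "A", "C", "B", "A", "C"]
second_pass = ["A", "A", "B", "C", "A", "A", "C", "B", "A", "B"]
print(round(cohens_kappa(first_pass, second_pass), 2))  # prints 0.69
```

Values above roughly 0.6 are conventionally read as substantial agreement, but the thresholds you apply should come from your field's methodological literature.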
Content analysis is a data-driven process, so your data will determine your codes or labels. To ensure the validity of your coding process, there are procedures such as an "external audit", aka "peer checking" (Creswell, 2012). Writing an article about your study and submitting it to a reputable scholarly journal counts as such an approach. In the article, you would need to describe your coding process (and the logic behind it) as well as your findings. Based on the feedback you receive from the reviewers (hence, "peer checking"), you can make a judgement about the validity and, if needed, improve your coding approach. The bonus is that you can get a publication too!
Hope this helps. Good luck with your studies!
Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). Boston, MA: Pearson Education.
Mackey, A., & Gass, S. M. (2005). Second language research: Methodology and design. Mahwah, NJ: Lawrence Erlbaum Associates.
The research uses mixed methods, framed by a pragmatic theoretical framework, with a convergent design using triangulation. In using this design, it was noted that some researchers code qualitative data so that there is some 'dialogue' between the two methods used in the design (Hesse-Biber, 2012). I believe there will be some convergence between the two sets of data, but not complete convergence (which I expect). So that I can explain these findings, and not risk my paper being thrown out, how should I interpret this?
But terminology follows trends - and the term triangulation at the moment is understood in a specific way. That's what I was trying to say.
Since you are using two methods for two different aspects of the same phenomenon, I think it is completely reasonable to have two separate strands of analysis and to discuss them in relation to each other afterwards. After all, there are reasons to believe that either method might not really achieve what it is meant to, so the separate analyses can also be useful for validation (which is more difficult if you quantitize your qualitative data). Divergence is not necessarily a bad thing, especially if it means you have a richer picture.
What would be the purpose of quantitizing? If you have a clear idea about this, there is nothing wrong with adding a few variables from coding of your interviews to your questionnaire dataset. You could then check, for example, if certain kinds of narration lead to certain response patterns in the questionnaire. But from what you write I would see that only as an additional option and keep the two strands of analysis apart.
Do you need to use only qualitative methods in collecting data on indigenous/Traditional Knowledge in the area of water governance? And if you use mixed methods, is that appropriate?
I need information or some articles on data collection with Traditional Knowledge in water resources governance, policy and practices.
Indigenous knowledge of water governance is often collected through surveys, focus group discussions, and interviews. This information often will not have a baseline to compare against, so it is hard to deduce whether indigenous knowledge is indeed providing better governance. Also note that water governance in rural societies is not always knowledge-based but instead rule-based: the society agrees on what is acceptable and what is not. The best way to gain confidence in information or knowledge derived from rural communities is to combine it with an independent analysis of the outcomes attributed to the perceived governance structure. For example, the health of the vegetation in a rural area can be estimated with remote sensing techniques. One could correlate indices derived from remotely sensed images with 'perceived' good or bad governance.
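As a minimal sketch of that correlation step, assuming hypothetical per-village NDVI values and perceived governance scores (all numbers invented for illustration):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: mean NDVI per village vs. perceived governance score (1-5)
ndvi   = [0.31, 0.45, 0.52, 0.38, 0.61, 0.44]
scores = [2, 3, 4, 2, 5, 3]
print(round(pearson_r(ndvi, scores), 2))  # prints 0.98
```

A strong positive correlation would lend independent support to the community's perceptions, although with real data one would also need a significance test and controls for confounders.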
Empirical literature studies, in which academic literature is studied as a shortcut to learning about academic practice, are common in many fields.
Bryman, Alan. "Why do researchers integrate/combine/mesh/blend/mix/merge/fuse quantitative and qualitative research." Advances in mixed methods research (2008): 87-100.
(we used the technique in)
van Turnhout, K, et al. "Design patterns for mixed-method research in HCI." Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational. ACM, 2014.
However, are there any academic standards for this approach? Is there a good article describing rules for sampling, analysis, conclusions, etc.?
You may want to check this book as it offers an extensive advice on writing various types of literature reviews:
Cooper, H. (2010). Research synthesis and meta-analysis: A step-by-step approach (4th ed.). SAGE.
I am going to attempt to use mixed methods as I feel it will add more to the research project, but the attitudes towards this seem to be either 'steer clear and avoid it' or 'go for it'. My question is why... or rather, why not?
There's no general answer to this. As always (no matter whether it's quantitative, qualitative, or mixed methods), it depends on your research project, your goals, and your resources (time, money, staff, etc.).
Mixed methods can of course add some (or a lot of) value to a research project, while in other projects a single method (e.g. a simple questionnaire) may answer all the research questions.
So you have to think carefully at the beginning of your project about which research questions you want to answer, and then choose your methods. That means you should choose the best methods to answer the questions, no matter whether qualitative, quantitative, or mixed.
There are many articles using econometrics to measure the impact of foreign education on economic growth. Is there any article or Ph.D. thesis that used mixed methods (e.g. a questionnaire) to examine the relationship between foreign education and economic growth?
Brain circulation and economic growth could be important for countries in Africa and Asia. The number of students going abroad for higher studies and the share of foreign-trained skilled workers in total employment could be reasonable proxies. Alternatively, the share of foreign PhDs in total PhDs could also serve as a measure. The channel of influence on long-term growth is human capital accumulation via the knowledge embodied in foreign training. The former two suggestions are also quite helpful to pursue. Thanks.
I have been observing that doctoral students prefer to opt for quantitative methods over qualitative ones despite, in some cases, understanding well that the quality of their research could be better using qualitative/mixed methods. When probed, they generally opine that quantitative methods are easier than qualitative ones and that it takes longer to complete the thesis in the case of the latter.
I am not so sure that conducting quantitative research is really easier compared to qualitative. I am talking about a solid study which attempts at originality / novelty.
For one thing, quantitative research can be time-consuming and there are many potential pitfalls. In a quantitative study one would expect the research instrument to be solid and validated in the researcher's socio-cultural or educational context. This requires an extensive reading of the literature before developing and validating the instrument, not to mention a potential situation where the questionnaire might need to be modified or overhauled. Besides, a good understanding of the appropriate and best statistical procedures for each study is required. The researcher must also ensure that the data do not violate the necessary assumptions for statistical tests (which cannot be taken for granted) before the tests can even be conducted. In addition, in quantitative analysis there is also the argument that random sampling (which is not achievable in a small research project, such as a PhD) is preferable to nonrandom sampling. Of course, once a good instrument is developed (after weeks or months of research) and all the preliminary assumptions are fulfilled, the analysis of the data takes just a few clicks of the computer mouse.
When we are doing a study involving quantitative and qualitative methods, how do we triangulate the results and present them in our dissertation or manuscript?
Please give a standard reference, and support it with a PDF if possible.
In mixed methods research, the term triangulation has been the source of much confusion. Originally, it had a specific meaning when it was introduced in the 1970s, but that definition was widely ignored during the next decade or so, when anything and everything that used a combination of qualitative and quantitative methods was called triangulation, simply because the broader term "mixed methods" did not exist yet. The key publication on this is:
Greene JC, Caracelli VJ, Graham WF. 1989. Toward a conceptual framework for mixed method evaluation designs. Educ. Eval. Policy Analysis 11(3):255–74.
The defining feature for classical triangulation is the comparison of the results from different methods, to assess the extent to which they agree. The analogy is to two separate lines converging at the point of a triangle. I thus refer to this kind of design as "convergence" in my own work, to avoid the confusions associated with triangulation as an overly broad term.
Morgan, D. 2013. Integrating Qualitative and Quantitative Methods. SAGE.
I am currently doing an exercise as a preparation for my bachelor graduation project. I want to include methodological triangulation and mix qualitative and quantitative research methods by including a questionnaire and semi-structured in-depth interviews.
Some of the literature I have found so far indicates that using more than two methods would be considered triangulation, but others imply that mixing qualitative with quantitative is already enough and that it depends on the type of research approach you have.
Basically, I am trying to determine the perceived activity in some areas of a company via a Likert-scale questionnaire (from employees), and I want to cross-reference it with the expert interviews (some of the experts are regular employees as well), which are aimed at answering the questions of what activities are being done and what could be done in some areas. The final "triangulated" result should be the answer to the question "what to do to increase profit".
It is rather generic at the moment, but this is meant as a first step towards my research design for later. Am I on the right track here? Is there some clear theory in regard to my issue?
Any help is very much appreciated, thank you!
"Triangulation" is a term which comes from geometry: it refers to observing an object from different points so as to gain different perspectives on it. Social research has adopted the concept in this sense: studying a social process through different methodologies so that you can have an enriched vision or perspective of it. Qualitative methods very often let us discover unknown dimensions of our object, and quantitative ones help us consider how those dimensions are distributed across collectives or social groups. I think that any time you use several methodological paradigms to observe a social object, you are putting triangulation into practice.
Qualitative research or quantitative research alone cannot answer some research questions. For such projects it is better to go for mixed methods.
Use the method appropriate to your research question. Don't fixate on the method.
Scandinavian countries have shifted in a big way from quantitative to qualitative research in management. In the US, too, this trend is visible, but Asian countries are still wedded to quantitative research. It appears that knowledge of qualitative research is lacking in these countries, or appropriate software is still not available to process the data, or researchers are not yet comfortable with such software.
There is a growing understanding that human intuition cannot be replaced by quantitative methods. That is why much research in the developed world is qualitative or ethnographic research. You might like to read "What data can't tell you about customers" published in Harvard Business Review (2012). Several writers have written on the role of human judgement - you might like to read wsj blogs and an article in NYT about data and intuition. I have covered these aspects in my new book on Consumer Behaviour which is due to be published by OUP in 2015.
In India, we are enamoured of graphs and charts, even if they are drivel. That is why it will take some time before research shifts from quantitative obsession to something more meaningful. It is not availability of computer software, but the software in the mind of professors that is to blame.
I'd like to know your point of view about issues that often appear in methodology. Yin (2003) mentions that questions of *how* and *why* should be studied qualitatively. I'd like to know: if we raise a HOW question and the process has already ended, is it possible to quantify it?
Thank you Dear Paul.
I'd like to comment that we can also lose some information by doing quantitative research.
Trying to learn the best-practice tools for the analysis, organization, coding, categorization, and inference of qualitative interviews.
I'm using qualitative and quantitative research in my study. My philosophy is post-positivism at the first quantitative stage and then I move to social constructionism at the second, more qualitative stage.
What are the dangers of this 'methodological eclecticism'?
I don't think you have to buy into philosophy-of-knowledge assumptions about the nature of knowledge in order to use a method. My reading of the current state of affairs in mixed methods is a consensus around an emphasis on research design rather than paradigms. So, if you have a good justification for how the results from your methods will fit together, that should be sufficient.
Alternatively, if you do feel like you need a philosophical paradigm, pragmatism departs in major ways from the older notion that you need to rely on concepts such as ontology and epistemology. I have two articles on that subject, the most recent of which is:
I recently read about mixed methods research and I want to read more about this approach. Are there any references I can begin with?
Mixed-methods research is currently being recognised as the third major research approach and those in the field are working towards a definition of this paradigm (Giddings & Grant 2007; Johnson et al. 2011). Whether the search for a single definition is necessary, or even desirable, remains unclear. For the sake of clarity therefore, I use the term as defined by Tashakkori and Creswell (2007 p 4) as ‘research in which the investigator collects and analyses data, integrates the findings, and draws inferences using both qualitative and quantitative approaches or methods in a single study or a program of inquiry’. There are many and immediate benefits to be gained from not separating quantitative and qualitative research into distinct categories but, instead, acknowledging and understanding their interrelated nature and processes. The important thing is that researchers do not restrict themselves to a limited range of conventional research approaches or methods.
Mixed-methods research offers a way of making research more meaningful, complete and purposeful than is the case when using either a singular qualitative or quantitative approach, and provides the researcher with other valuable tools to add to their research resources. It is, therefore, based on the principles of 'triangulation'.
Historically, methodological triangulation (or pluralism), from its social science origins in the 1950s, was limited to just parts of a whole study. Denzin (1978) later sought to expand the scope of mixed-methods research to the whole research design. His intention was to reduce the incidence of research error often associated with studies that used single methods, single researchers or single theories. In today's context, methodological triangulation/pluralism are terms used to denote a single research study that uses a combination of research approaches, paradigms and/or methods. Essentially both terms refer to the same process, although it is more common to see the term triangulation, rather than pluralism, used. As the position of mixed-methods research becomes more established, it is becoming more commonplace to see studies that are of a mixed-methods design but do not use the term, or associated terms, to describe this fact. The assumption is that the consumer of such studies will understand when a study uses mixed methodology. Where this is the case, the things to look for, and the main rationales proposed for conducting a 'blended' mixed-methods study, are: triangulation; completeness; off-setting weaknesses and providing stronger inferences; answering different research questions; wider explanation of findings; broader illustration of data; potential hypotheses development and testing; and possible instrument development and testing (Doyle et al. 2009).
It is necessary to have a good understanding of different types, categories and combinations of mixed methods before commencing or reviewing this type of research. Depending on what the main aims of any research study are, certain triangulation methods will work better than others. There are a number of different ‘types’ of triangulation.
I am currently in the phase of writing my Phd proposal and in the methodology part i will be using mixed methods, i wanted to know what articles i can read to help me in writing a justification as to why i will be using mixed methods
Health system planners pursue the three goals of the Triple Aim: reduce health care costs, improve population health, and improve the care experience. A study has been conducted to explore the novel application of a case-mix adjustment method in order to measure, and help improve, health system performance on the Triple Aim. The dashboard, developed via case-mix methods, measures two of the Triple Aim goals and can help health system planners better manage their health delivery systems. Can developing a dashboard help measure and achieve the Triple Aim of reducing health care costs, improving population health, and improving the care experience?
How do we know that a particular research project actually needs a mixed methods design?
I want to review and analyze all papers/studies (approx. 90) in my research topic (business models) with regard to the methodology used. The objective of this review/analysis is to get an overview of the research status and to identify patterns, problems, or priorities.
My idea is to start with general distinction:
- concept paper/ empirical paper/ literature review
Only empirical papers are interesting for further analysis (guided by the research onion from Saunders et al. 2010).
- Approaches: Deductive/ Inductive
- Strategy: Experiment/ Survey/ Case Study/ Action Research/ Grounded Theory …
- Choices: Mono Method/ Mixed Method/ Multi Method
- Time Horizon: Cross Sectional/ Longitudinal
- Data Collection and Data Analysis: Secondary Data, Observation, Questionnaires, Interviews …
- Is this procedure useful?
- Are there any additions?
- How can I handle studies/papers which cannot be classified, or where information about the methodology is not available?
- Do you know any paper/study that did a similar review already (in other areas)?
Hi, could you help me decide which mixed methods design to use in my study? I started with questionnaire data from 1800 students to identify the writing strategies of students in different bands. From that data I identified the patterns of the good students and used them as the basis for developing my teaching intervention. The intervention-phase data were a mixture of qualitative (essays) and quantitative data. I feel that the Exploratory Sequential Design would be most suitable, because the results of the quantitative phase are used to build the subsequent intervention phase, but this contradicts Creswell's classification since in my case the study did not start with a qualitative phase leading to a quantitative one. Please help.
Warda - It is necessary to have a good understanding of different types, categories and combinations before commencing or reviewing this type of research. Depending on what the main aims of any research study are, certain triangulation methods will work better than others. There are a number of different ‘types’ of triangulation. Before commencing mixed-methods based research then, the first step is considering what type of triangulation will best suit the task at hand. Each one is important in its own right and has the potential to produce different perspectives and outcomes from the next — hence the importance of choosing wisely.
As well as different types of triangulation, there are also options for different paradigm combinations to consider. For instance, simultaneous (parallel, concurrent) triangulation is the combination of qualitative and quantitative methods in one study at the same time. Sequential triangulation separates the two paradigms in time but combines them in the overall findings. It doesn't matter which paradigm comes first, quantitative or qualitative.
With mixed-methods research one is faced with a potential conundrum — which research approach best addresses and answers the research question? Where this is the case, the two studies are triangulated if they both relate to the same topic area, they are both planned prior to the research program commencing, one informs the other and, as a final outcome, they both equally expand the related field of inquiry.
Please consider submitting a proposal for this exciting conference, featuring John Creswell as keynote speaker.
Date: June 19th
See attached for Call for Papers
I need to conduct six interviews, 3 with rural and 3 with urban university students, as part of my mixed methods research. The interviews will focus on the students' past literacy background, which might be a major cause of their poor English reading performance at university level. Kindly suggest which of the 5 qualitative research strategies I may use to conduct these interviews: ethnography, phenomenology, case study, grounded theory, or narrative research (Creswell 2007). Kindly also share your views on whether, if I don't use any of these, I can conduct semi-structured interviews and analyse them using thematic analysis in a mixed methods study.
I agree with a lot of the above. CCM and other aspects of GT have been used in more generic research; similarly, people often keep some field notes without doing a full ethnography. Many people have labeled this a generic qualitative approach or interpretive description. There is quite a lot in the literature using those terms. Some of the GT folks have moved over to a more generic method that does not promise a theory as its outcome.
I am conducting a two-faceted study, with a quantitative questionnaire package and highly structured open-ended questions, as an attempt at triangulation. I am not very familiar with qualitative analysis software such as NVivo or MAXQDA. Would Excel be an option for organizing and coding highly structured open-ended responses? I provide an example data Excel file and my data collection tool. It's in Turkish, yet you may understand the structure, I guess. Or would you suggest learning some basic features of standard qualitative analysis software packages?
A free alternative might be to use RQDA, an R package. The main advantage is that it uses a relational database (SQLite), so you can restructure your data much more easily for analysis. I've used it for a variety of purposes, and it's not too difficult to learn and use.
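For readers unfamiliar with the relational idea behind such tools, here is a minimal sketch in Python using the standard library's sqlite3 module; the table and column names are illustrative only, not RQDA's actual schema:

```python
import sqlite3

# Minimal relational coding store: segments of text plus codes attached to
# them, so the data can be restructured with ordinary SQL queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE segments (id INTEGER PRIMARY KEY,"
             " respondent TEXT, text TEXT)")
conn.execute("CREATE TABLE codings (segment_id INTEGER, code TEXT)")

# Invented example responses and codes
conn.execute("INSERT INTO segments VALUES (1, 'R01', 'The training helped me a lot')")
conn.execute("INSERT INTO segments VALUES (2, 'R02', 'Management never listens')")
conn.executemany("INSERT INTO codings VALUES (?, ?)",
                 [(1, "training"), (1, "positive"),
                  (2, "management"), (2, "negative")])

# Restructure easily: count coded segments per code
for code, n in conn.execute(
        "SELECT code, COUNT(*) FROM codings GROUP BY code ORDER BY code"):
    print(code, n)
```

The same join-and-count pattern extends to cross-tabulating codes against respondent attributes, which is exactly the kind of restructuring that is awkward in a flat spreadsheet.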
I had submitted a manuscript based on a psychosexual disorder, "Dhat syndrome", to a journal. In that study I did in-depth interviews using an open-ended questionnaire, and it was cross-sectional in nature. It did have quantitative outcomes, such as physical and somatic symptoms (loss of appetite, loss of sleep, etc.) and depression, with the corresponding frequencies for these measures.
Hence, in the Materials and Methods section I used the term "mix study design".
But the reviewers are putting a question " Does such study design exist?"
From this perspective, I request that anyone share free-access material (a PDF, link, or attachment) on why we use mixed methods or mixed study designs in psychiatry, so that I can show the journal's reviewer that using the term is not incorrect and that such a design does in fact exist.
In general it is not a good idea to argue with reviewers unless you have a strong justification for doing so, which seems to be missing in this case.
The problem with the phrase "mix methods" is that this label does not exist in the literature, in contrast to "mixed methods", which has an extensive body of work behind it. Since you had both quantitative data in the form of frequencies and qualitative data, that would meet the general requirements for calling your study mixed methods.
Here are two basic sources about mixed methods research that are available on ResearchGate:
I am conducting a mixed methods study, convergent design, that contains both quantitative survey data and qualitative data from semi-structured interviews. I wonder if MAXQDA software could be useful in data analyses, particularly in helping to corroborate the qualitative and quantitative findings? I would appreciate any suggestions, comments and information. Thank you!
Yes, I tried to use MAXQDA for different mixed methods designs. And no, I personally think it is not useful to corroborate the qualitative and quantitative findings.
If I understood Creswell right, using a convergent design means having the quantitative data collection and analysis and the qualitative data collection and analysis done during the same stage of the research process, while the strands remain independent during analysis.
MAXQDA's function for incorporating quantitative material is more like adding a specific set of quantitative information (e.g. age, gender, etc.) to each (qualitative) case. From my point of view, that function doesn't help you with the quantitative analysis, the qualitative analysis, or the comparison of both.
From my experience I would recommend (for convergent designs): use statistical software for your quantitative data and write about your findings; use software (MAXQDA, another package, or none, as you like) to support your qualitative analysis and write about your findings; and then bring these two sets of writing together.
Some believe that triangulation is an integral part of mixed methods, but I want to know more about the internal characteristics of these two approaches in order to identify the differences between them.
This is a good question with a number of possible answers.
The first paper (R. Poole et al., A mixed methods and triangulation model....) posted by @Ravichandran Panchanathan is very good.
See the section on triangulation, page 4, for a good overview of the notion of triangulation, which is an import from surveying and navigation, where the basic goal is to find the position of a fixed point so that we can get a bearing on different objects relative to the fixed point.
You may want to broaden the notion of triangulation to include the computational geometry view. For example, if we know the coordinates of points of interest in a data set (e.g., nodes in a social network), then we can compute a Delaunay triangulation. This is done by connecting each pair of neighboring (nearest) points with a straight edge. See, for example, the triangle of social network nodes shown in the attached image.
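As an illustration of the "connect nearest points" idea, here is a pure-Python sketch that links each node to its nearest neighbor; a full Delaunay triangulation would normally be computed with a library such as scipy.spatial.Delaunay, and the node names and coordinates here are invented:

```python
import math

# Hypothetical 2-D coordinates for social network nodes
nodes = {"a": (0.0, 0.0), "b": (1.0, 0.2), "c": (0.4, 1.1), "d": (3.0, 3.0)}

def nearest_neighbor_edges(points):
    """Connect each point to its nearest neighbor (a simplified stand-in
    for a full Delaunay triangulation)."""
    edges = set()
    for name, (x, y) in points.items():
        nearest = min((n for n in points if n != name),
                      key=lambda n: math.dist((x, y), points[n]))
        # Store edges as sorted pairs so (a, b) and (b, a) coincide
        edges.add(tuple(sorted((name, nearest))))
    return sorted(edges)

print(nearest_neighbor_edges(nodes))  # prints [('a', 'b'), ('b', 'c'), ('c', 'd')]
```

The resulting edge list gives the "bearings" between nearby nodes that the surveying analogy describes; a true Delaunay triangulation additionally guarantees well-shaped triangles over the whole point set.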
I am proposing a mixed methods approach, and my sub-questions require different approaches. Hence, in the first phase of my study, I have a relativist ontological approach, and in the next phase, my ontology is realist. Is that alright? Does this change need to be justified?
Approaches are just that: approaches, not religions. And remember that an approach is an initial orientation, and needs to be ready to adapt to what you find.
Will using a survey and a randomized controlled trial in one study be called triangulation? Here it is important to note that both surveys and randomized controlled trials are taken as quantitative methods.
The term "triangulation" is based on the idea that two or more methods are used to verify the results. This provides more support for findings derived via methods that may be more bias-prone, or that are not all-encompassing. So in your case, you have data derived via an RCT (which most researchers would argue is bias-free, so the results should stand on their own) supplemented by (self-reported?) survey data. If the outcomes from both approaches agree (that is, both approaches provide similar results in direction, if not in magnitude of effect), then you can have greater confidence that the results are "real".
As per Wikipedia, triangulation:
- can be used in both quantitative (validation) and qualitative (inquiry) studies;
- is a method-appropriate strategy for founding the credibility of qualitative analyses;
- becomes an alternative to traditional criteria like reliability and validity;
- is the preferred line in the social sciences.
By combining multiple observers, theories, methods, and empirical materials, researchers can hope to overcome the weaknesses and intrinsic biases that come from single-method, single-observer, and single-theory studies.
I hope this helps
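A minimal sketch of the direction-agreement check described above, using invented effect estimates, might look like:

```python
def directions_agree(effects):
    """Check whether effect estimates from different methods point the
    same way: a minimal convergence check on direction of effect."""
    signs = {estimate > 0 for estimate in effects.values()}
    return len(signs) == 1  # True when all estimates share a sign

# Hypothetical estimates of the same outcome from an RCT and a survey
results = {"rct_mean_difference": 2.4, "survey_regression_coef": 0.8}
print(directions_agree(results))  # prints True
```

This only checks direction, not magnitude; a fuller convergence analysis would also compare effect sizes and their confidence intervals.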
I am planning research on how nurses' knowledge of basic life support improves general emergency care, using mixed methods research.
Dear Alex, there are many examples of mixed methods questionnaires, such as standard student response questionnaires and the UK National Student Survey. They combine demographics with closed and open questions. Open question responses can lead to qualitative analysis provided that they are not too short. Closed question responses are always analyzed quantitatively.
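As a minimal illustration of analyzing closed (Likert-type) responses quantitatively, with invented item names and scores:

```python
from statistics import mean

# Hypothetical 5-point Likert responses per closed question
responses = {
    "q1_teaching_quality": [4, 5, 3, 4, 5],
    "q2_facilities":       [2, 3, 2, 3, 2],
}

# Per-item mean is the simplest quantitative summary of closed responses
for item, scores in responses.items():
    print(item, round(mean(scores), 2))
```

Real questionnaire analysis would of course go further (distributions, reliability checks, subgroup comparisons), but the closed/open split this answer describes starts from summaries like these.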
I am doing research in two phases: investigating the practice and coordination of integrated marketing communications among advertising agencies, then proposing a viable system model to diagnose the current structure of the agencies and to provide a better structure for better coordination in the future. Thank you.
I am formulating and defining my research question for a systematic review (subject: dementia and singing) and expect to obtain both qualitative/quantitative/mixed methods studies. I don't think there will be enough to limit the review to one type or the other.
What type of tool would you use to formulate your question? The more I look at it, PICO seems too rigid, but would SPIDER be appropriate even when mixed methods studies are included?
PICO(S) and SPIDER are good tools for building a good research question (qualitative or quantitative method). Nevertheless, I think we don't have to be strict about that when conducting a systematic review and meta-analysis. Keywords are the most important. With good keywords, we will have a good chance of collecting all relevant evidence for our research question; then we can classify it into sub-groups to perform further analysis specifically.
This may be a good example for you:
When you approach an organization to assess its processes, which methods are best suited for that purpose? In the literature, studies can be found that report the use of quantitative and qualitative methods, as well as mixed methods. What is your opinion on this issue?
In my opinion it is good if both approaches are used. The reason is that numbers are necessary to show facts: the organisation's results, quantifying various aspects of the organisation, and comparing past and current situations in a quick and dynamic manner. However, numbers can be cold and do not always express reality from a human perspective. For that, a qualitative approach is better, collecting people's opinions, difficulties, and even their own experiences and ideas for the organisation's development. In this way participants can feel more involved and valued in the process, making it easier to accept future changes and implementations resulting from the assessment.
I prepared polyaniline nanotubes by the rapid mixing method, and this gives me a non-homogeneous tubular structure. What is the problem?
Possibly due to the rapid reaction. You have to allow sufficient nucleation (reaction) time to achieve homogeneity, either by slow addition of the reagents, or by vigorous stirring, sonication, or microwave irradiation to regulate the ordered approach between the reagents.
So we are trying to do kinetic characterization of an alcohol dehydrogenase. We've decided to go with a reaction buffer consisting of 2 mM TCEP, 10 mM ZnCl2, 2 mM EDTA, 0.15 M NaCl, and 50 mM Tris at pH 7. We are also adding in 1 mM NADP, 2 uM protein, and a varying quantity of substrate (for now, n-propanol) at the time of the experiment (Currently attempting a stopped-flow 1:1 mixing method, so all the concentrations wind up getting halved at the time of the reaction except for the NaCl and Tris.) Every time I have made up the buffer, a white precipitate has formed. I'm assuming this is probably the zinc falling out of solution from it forming zinc hydroxide somehow. I tried to get around the potential for it to form Zn(OH)2 by pH balancing the mix of TCEP, EDTA, NaCl, and Tris before adding the ZnCl2 and not messing with the pH after. However, over a period of about five minutes a white precipitate started falling out of solution. I tested the pH later, and it read around 5.6, which really stumped me. I've attached a picture of that batch to show what the precipitate looks like. Does anyone have any advice on how I should proceed? This data is badly needed, so I'd be incredibly grateful if someone could help us out.
EDIT: So I'm finding that it is definitely pH dependent. If I shoot in some concentrated HCl it clears right up. However, when I re-balance to even pH 6.5 it starts precipitating again. I have no idea how our reference paper managed to carry this kinetics assay out at pH 7.7. The only difference is they used beta-mercaptoethanol instead of TCEP. I think what I'll wind up doing is a fusion of the answers. I'm going to lower the NaCl concentration to probably 50 mM, cut the EDTA and zinc concentrations to a tenth, and see how that pans out. I will also try using both DTT and BME to see if it's the TCEP that's causing the issues.
ZnCl2 is an acidic salt, which is why the pH came out too low. Also, the buffering capacity of Tris for acids is weak at pH 7 (pKa = 8.1).
You are probably using an unnecessarily high Zn concentration. It might be soluble enough at 10 µM, which is probably all you need, if you need any at all.
You may want to leave out EDTA, which chelates Zn.
I found this paper useful:
Would it be effective to conduct a likert scale survey to find how people respond to something, and then use interviews to go into more depth on the common responses received from the Likert scale? So, to not compare the results between both methods, but to use each method to supplement one another?
Robert, regardless of all the statements above, which I do agree with, one more consideration: after you come up with your results/findings via the Likert scale, you may wish to conduct interviews with qualified experts to support, corroborate or validate your findings.
I am currently conducting mixed method research. My research will comprise of multiple phases as the following:
The first phase will be quantitative: focus groups (students) to obtain more information about the topic, with the gained information used in the next phase (survey).
The second phase will be quantitative: an online survey to examine factors for developing a framework.
The last phase will be quantitative: focus groups with admin and lecturers to confirm and get their perceptions of the outcome of the quantitative phase.
I was thinking it's embedded, but after doing more research I thought it might be the exploratory sequential design.
It's confusing to choose the research design method: triangulation, embedded, exploratory, or explanatory. Can anyone advise me?
I think it is a Multiphase Design (Creswell & Plano Clark, 2011, p. 72). Your phase 1 should be qualitative research, as you will be using focus group interviews to solicit information from participants to build up your propositions (note: in this phase, besides getting input from the focus group, you must have solid underpinning theories through a literature review to support your study, unless it is a totally new area of research). Phase 2 is quantitative, because you will develop a conceptual framework / research model and hypotheses based on the phase 1 propositions, and then conduct an online survey, data analysis, etc. (again, you should have a strong underpinning theoretical framework to support your study). Phase 3 is qualitative research, as you want to dig deeper by interacting with some participants via a focus group study (I am not sure whether these are new participants or were involved before as phase 1 participants or phase 2 respondents). So I think your research is a multiphase design: Qual -> Quan -> Qual. Exploratory is only Qual -> Quan, then stop. Explanatory is only Quan -> Qual, then stop. Wishing you all the best.
I would like to explore methods for re-ranking result sets retrieved using a term-based query against a database of bibliographic records. I believe that this additional layer of processing could improve a user's information-seeking experience by helping them more easily find articles relevant to their need.
An alternative implementation is to exclude records from the result set which, although they contain the search term, fail to meet other criteria.
In either case, I am looking for existing literature which could help me identify a suitable method of analysis for comparing one set of ranked results to another. I have found studies in which a subject matter expert codes each individual record returned in a result set as relevant or not, in order to compute precision and recall. This may be one strategy, but I am not sure if this alone will really be able to describe and express the differences between two result sets, or the differences in how they are ranked (at least beyond some number of results returned; it could become infeasible for a human to evaluate thousands of results, for example). I am also considering the value of a mixed methods approach, in which I integrate more qualitative assessments of user satisfaction with what they feel to be the quality of the results retrieved.
I would appreciate any suggestions for literature or methods to consider for this type of research. Thank you!
If you are interested in comparing ranks, then look for measures like MRR (Mean Reciprocal Rank) or rank correlations like Kendall's tau or Spearman's rho. These measures will only help you find the rank correlation, if you take a ranking created by some state-of-the-art method as a baseline and then re-rank using your approach. An alternative could be to use a generative method like language models to see how likely it is to generate the given query from a retrieved set of documents, which will hint at the notion of relevance of certain documents when you don't have relevance assessments available for your test collection. If your experimental collection already comes with relevance assessments, then the standard TREC-based evaluation measures will be the preferred choice. Moreover, if the relevance judgements are graded, then I suggest going for Normalized Discounted Cumulative Gain (NDCG), etc.
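The metrics named above need no special tooling to try out. Below is a minimal, self-contained Python sketch of MRR, Kendall's tau, and NDCG; the document IDs, rankings, and relevance grades are invented purely for illustration, not taken from any real test collection:

```python
import math

def mean_reciprocal_rank(first_relevant_ranks):
    """MRR over a set of queries; each value is the 1-based rank of the
    first relevant document retrieved for that query."""
    return sum(1.0 / r for r in first_relevant_ranks) / len(first_relevant_ranks)

def kendall_tau(rank_a, rank_b):
    """Kendall rank correlation between two rankings of the same items.
    rank_a and rank_b map item -> rank position (0 = top)."""
    items = list(rank_a)
    concordant = discordant = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            a = rank_a[items[i]] - rank_a[items[j]]
            b = rank_b[items[i]] - rank_b[items[j]]
            if a * b > 0:
                concordant += 1      # pair ordered the same way in both
            elif a * b < 0:
                discordant += 1      # pair ordered oppositely
    n_pairs = len(items) * (len(items) - 1) // 2
    return (concordant - discordant) / n_pairs

def ndcg(grades, k):
    """Normalized Discounted Cumulative Gain at rank k, for graded
    relevance judgements listed in retrieved order."""
    dcg = lambda g: sum(rel / math.log2(i + 2) for i, rel in enumerate(g[:k]))
    return dcg(grades) / dcg(sorted(grades, reverse=True))

# Hypothetical data: a baseline ranking vs. a re-ranked result set.
baseline = {"doc1": 0, "doc2": 1, "doc3": 2, "doc4": 3}
reranked = {"doc2": 0, "doc1": 1, "doc3": 2, "doc4": 3}
print(kendall_tau(baseline, reranked))   # ~0.667: one swapped pair out of six
print(ndcg([3, 2, 3, 0, 1], k=5))        # ~0.97: near-ideal ordering
print(mean_reciprocal_rank([1, 2, 4]))   # ~0.58
```

In practice you would compute these over many queries; for published work, a library implementation (e.g. the ones in standard IR evaluation toolkits) is preferable to hand-rolled code.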
I am conducting a mixed methods observational study with interviews performed at baseline before starting a treatment (n=30) and one year after baseline (n=30, of which 18/30 were the same participants interviewed at baseline). I finished analyzing the baseline qualitative data using the framework approach quite a while ago, and I am nearly halfway through analyzing my follow-up interview data.
Unsurprisingly, there are a lot of similar themes generated from those 'two studies', though the research aims of the two are different. But there is a new category with many new themes generated in the follow-up interview, closely linked with the aims of follow-up interview.
I am aware that there are papers published either way, and I have seen more research articles that analyzed them separately. Since I have been analyzing the data for a while, I prefer to keep them separate.
Any suggestions and ideas on how to better defend my choice and how to present results are much appreciated. Thank you very much!
I think to be consistent with qualitative research principles such as emergence and not overly imposing 'a priori' theorising/prediction that you analyse pre and post data separately. As you say in your post 'there is a new category with many new themes generated in the follow-up interview' - the 'new category' is therefore emergent from your follow up interviews and can be defended as needing to be separated from your initial analysis.
Mixed methods models carry different titles by different authors but mean the same thing conceptually. One model that appears to be very useful when using both closed and open-ended questions on a survey for research is the Within-Stage mixed methods model. However, do researchers use this model frequently in survey research?
Since you are looking for concrete studies that use a specific mixed methods design that you identify as "within-stage" (though I agree with David that "within-stage" is not a widely accepted terminology) I would like to share my own research papers that could fit the kind of method you are referring to.
In these studies, which are in the field of applied linguistics, I collected qualitative data (images, representations, beliefs, etc.) through a free-response approach and then asked the respondents to assign their own favourability ratings to each response, thus obtaining quantitative data:
As to your question whether researchers often use this particular design in surveys, this would depend on academic discipline. The method I used in the papers above is well-accepted in psychology research; it is rare in applied linguistics.
Good luck in your academic endeavours!
There are some mixed methods to do research in educational leadership. Please provide some ideas.
It depends on your research aim, but I used a focus group and an instrument I developed.
Today, I have read that using two different sampling methods in a mixed method study is possible. For example, a random sampling method for quan and purposive sampling for qual.
I am conducting a two-faceted study, with a quantitative questionnaire package and highly structured open-ended questions, as an attempt at triangulation. I plan to structure the open-ended responses (basic content analysis) and conduct quantitative analyses. I collect 58 units of data per subject, and the average responses fill two A4 pages. I have 27 subjects. The group is a very narrow one: successful (GPA over 3.00) senior counseling students. I am convinced that the study is just fine; but, according to some "objective" standards, does this design seem to be okay? If you were a referee for this paper, would you suggest gathering more data?
It depends on what kind of paper you want to produce. If you are interested in giving more attention to qualitative analysis, 27 subjects is okay. However, if you want quantitative results with statistical power you should get more subjects (the size of the sample, if you want to generalize your results, will depend on the parameters of your population).
In my master's thesis I used a quantitative questionnaire with open-ended questions, and my sample was 598 subjects. I conducted the content analysis for all participants (a tough task!) and reported the results qualitatively (I described and discussed each category, giving examples of patients' answers, etc.), and then I tried to publish in a qualitative journal. Result: the referees said I had misunderstood the concept of my work, that it was a quantitative survey and I had "just" analyzed it qualitatively, so I should perform quantitative analysis with my qualitative results, and that is what I did.
But, in your case, if you try to publish the qualitative result, I think you would have no problem, since you have a small sample. On the other hand, it is going to be hard to publish a quantitative analysis with only 27 subjects. I've seen papers with both qualitative and quantitative results, most of them with a small sample and just descriptive analysis, without a statistical approach. I attached a paper with qualitative and quantitative results with statistical analysis (sample of 50 participants).
The other file attached is about quali-quanti research designs and how to transform qualitative data into quantitative results. It helped me find out what kind of analysis I should report with my data. Hope it's helpful for you too.
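The point about statistical power can be made concrete with an a priori sample-size calculation. The sketch below uses the standard normal-approximation formula for comparing two group means; the effect size and power targets are illustrative assumptions, not values taken from the question:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size_d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means, via the normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2
    where d is Cohen's standardized effect size."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the test
    z_power = z.inv_cdf(power)           # quantile for the desired power
    return math.ceil(2 * ((z_alpha + z_power) / effect_size_d) ** 2)

# Detecting a "medium" effect (Cohen's d = 0.5) at 80% power:
print(n_per_group(0.5))   # 63 per group under the normal approximation
```

By this rough calculation, 27 subjects in total would only give adequate power for quite large effects, which supports the advice above to either enlarge the sample or keep the analysis qualitative/descriptive.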
Please, I need your comments regarding the selection of the respondents who will participate in the qualitative method (interviews). Do you think purposive sampling is a good method for choosing the sample for the qualitative method, by asking the teacher to recommend students who will answer the interview questions, or should we avoid using this method because of its potential for bias?
For purposeful (or purposive) sampling, what you need is a clear purpose. If you need to define which students fit with your goals, then you are using purposeful sampling. (The alternative would be random sampling, which would mean asking each teacher for a list of all their students, and then randomly selecting students from that list.)
If the interviews with the students are a qualitative study, that means you will have a relatively small N. If so, statistical generalization is not a relevant concern, and you should indeed use purposeful rather than random sampling. The concept of "bias" only makes sense in terms of making accurate statistical conclusions, so if that is not your goal, then you should reframe your thinking to ask which sets of research participants will be most useful for answering your research question, which is the fundamental principle in purposive sampling.
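For completeness, the random-sampling alternative described above is trivial to operationalize. A minimal sketch (the roster names and sample size are hypothetical; in practice the list would be compiled from each teacher's full class list):

```python
import random

# Hypothetical combined roster from all teachers.
students = ["Amal", "Bilal", "Chen", "Dana", "Elif",
            "Farah", "Gita", "Hassan", "Ines", "Jonas"]

random.seed(2024)                            # fixed seed for a reproducible draw
interviewees = random.sample(students, k=3)  # simple random sample, no repeats
print(interviewees)
```

Whether to do this at all is exactly the judgment discussed above: for a small-N qualitative study, deliberately choosing information-rich participants usually serves the research question better than a random draw.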
I am using a mixed methods approach in my study. I have chosen a sample size of 296 out of a whole population of 770. How can I determine the sample size for the interviews and for the pilot study?
I agree that we could use some more information, especially what you mean by the size of the "interview" sample versus the full sample size of 296. By interview, do you mean the qualitative portion of your mixed methods study, and if so, can you say more about the design of the overall study?
There are Sequential Transformative Design & Concurrent Transformative Design (Creswell, 2009) and Transformative Design (Creswell & Plano Clark, 2011). Can anyone share your thoughts what is the meaning of “Transformative” in these various mixed method designs? Can you also share some examples?
You seem to have posted two versions of this question.
As I noted in the other version, "transformation" is a motivation for doing research (e.g., transforming society) and as such it is not really a design. The citations that Mary provided are a nice introduction to transformative (or emancipatory) justifications for doing research.
As far as I am concerned, one can apply just about any qualitative, quantitative, or mixed methods design to a transformative goal, so long as one can make a clear argument for using that particular method to answer that particular question. In other words, mixed methods are just one of several different ways to do transformative research.
Do you think that we learn most from Quantitative, Qualitative, or mixed method research? Why?
It may be unhelpful, but I don't think that the question is useful as posed. At best, people will tend to respond according to their philosophical and historical baggage. I strongly believe that the most important ingredient is the question. What is important about a particular method is that you can trust the information that it brings with respect to the question you care about. The tendency of people to trust certain methods without regard to what they actually wish to know has led to, and continues to lead to, a lot of poor research and ignorance.
We are conducting mixed methods using sequential exploratory design in two stages, the first stage consisted of three exploratory focus groups to inform the development of a cross-sectional survey. The second stage is a cross sectional survey with a large randomly selected sample. Could you please suggest any helpful reference or article?
The authors Creswell, Teddlie, and Tashakkori have written quite a bit on this topic (not necessarily always together). Many articles and chapters of books are available online.
I am pasting one reference below and it is available online. This chapter describes the different types of mixed methods.
Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E.
(2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie
(Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240).
Thousand Oaks, CA: Sage.
I was considering using a mixed methods approach for a future research topic. I would appreciate the views of others in relation to their experiences and views about mixed methodology.
One thing you should think about from the very beginning is the integration of the qualitative and quantitative results. Too often, it seems that people are attracted by the value of having more data from different sources, without enough attention to how to bring that data into meaningful contact.
I tell my graduate students that mixed methods can sometimes be three times as hard as using a single method, because you not only have to do solid research with two different methods, but it also can take just as much effort to integrate what you learn from those different methods.
Of course, there is always the option of "minimal integration" where you simply have separate Results sections for each method, and possibly some Discussion of their mutual implications. This is still a common way of doing things, but I would treat it as a fall back strategy rather than a goal.
I am conducting my doctoral dissertation mixed methods study (convergent design) and wonder if the QDA Miner is a good tool to analyse qualitative data from semi-structured interviews? I also wonder if this software can be instrumental in comparing quantitative survey data (compatibility with SPSS?) and qualitative data? I would appreciate if you share your observations, comments and suggestions. Thank you!
QDA Miner looks great, but if your question is about one analytic application that could easily be used to analyze both qualitative and quantitative data inputs, I would rather suggest that you look into NVivo. The newer version of NVivo has features, added since previous releases, that allow you to analyze quantitative, qualitative, and even geospatial datasets.
Why are mixed methods preferred in social as well as health science research nowadays? Because they can give clearer and more realistic answers to the research question(s). They may have other benefits as well.
In much of the research on the human condition, whether it focuses on health or other aspects, such as economic processes or religious beliefs, it is advisable to use both qualitative and quantitative methods, because each provides different kinds of information about the topic at hand. In qualitative research, the perspective tends to be up-close, holistic, and capable of characterizing process. It also teaches us how to ask the most relevant questions about the topic, or, as Agar says, "the hip questions." Qualitative investigation cannot, however, tell us how prevalent a behavior is, or how generalizable a pattern of behavior is. That is the province of quantitative inquiry, which usually relies on short answers administered to large samples of respondents. The most productive way of joining the two kinds of inquiry involves "front-loading" the qualitative study so that it can inform the formulation of the quantitative interview instrument. Both kinds of research have to be as efficient as possible, and both have strategies for assuring efficiency: quantitative by means of power analysis, and qualitative by means of saturation of categories (grounded theory) or redundancy (ethnography). This is just a thumbnail sketch of a series of lectures on the question of qual and quant.
We are intending to acquire a licence for an online survey tool at our institute. It will be applied in different research projects for both qualitative and quantitative surveys. It should have no limitation regarding the number of questions, have a decent variety of question types (incl. filter questions), and should allow extracting the data as a CSV or Excel file.
I would really appreciate your suggestions, ideally with a short reason why you suggest a particular tool (or why not). Btw.: open source tool suggestions are also welcome.
Our school uses REDCap (http://project-redcap.org/) and Qualtrics (http://qualtrics.com/). Both are browser-based software solutions for electronic data collection. Data can be exported from REDCap into SPSS, R, SAS, and XLS, and from Qualtrics to SPSS, XLS, CSV, HTML, and TXT. I have not used REDCap but am finishing up a project using Qualtrics. Survey development was intuitive, filter options were available, and question types were plentiful. As to open source: both require "partnerships."
Do I start with my research design? What else to say?
Start with the research question, aim and objectives. Then go through the various components of the method: study design, target population, inclusion/exclusion criteria, intervention, outcome measures, etc.
Thanks in advance for your help!
If you are working with a large "corpus" of text and your primary approach is automated searching of that text, then your best bet among the standard QDA programs is probably MAX because it includes an additional text searching package called MAX Dictio.
If your needs go beyond that, you should look at software that is primarily designed for word mining (also known as word crunching). QDA Miner is a high-quality program for that kind of work, with a number of interesting features that do not exist in the standard QDA packages.
I am conducting a mixed methods study and one of my data sources is documents. I am wondering if it is possible to mix the quantitative (content analysis) and qualitative (semiotic) at the same phase of document analysis? Any references that may guide me in this area will be appreciated.
I can't really offer you any references, with the exception of Mayring and his content analysis, where he tries to combine classical "word counting" with truly qualitative methods. Of more interest to me is how to assess the impact of what I analyzed in a qualitative way: how are the new "contents" or "meanings" or "concepts" accepted, and how do they spread in discourse? But seemingly, there are hardly any methods to do so, with the exception of bibliometrics (which usually has nothing at all to do with qualitative research or discourse analysis).
I am conducting a multiple case study within a mixed methods design. I aim to recruit 20 participants and will have treatment and control groups with random assignment to one of the two. My sample size is within what is suggested for a multiple case study. My question is around data analysis since this is not just a case study. I am doing baseline, pre and post tests, as well as qualitative interviews at the end. I am having a hard time finding support around the analysis.
Toula - Other resources that might be useful for you in your study include:
Niglas, K. (2009). How the novice researcher can make sense of mixed methods designs. International Journal of Multiple Research Approaches, 3(1), 34-46.
Kwok, L. (2012). Exploratory-triangulation design in mixed methods studies: A case of examining graduating seniors who meet hospitality recruiters' selection criteria. Tourism and Hospitality Research, 12(3), 125-138.
Molina-Azorín, J.F. (2011). The use and added value of mixed methods in management research. Journal of Mixed Methods Research, 5(1), 7-24.
Jogulu, U.D., & Pansiri, J. (2011). Mixed methods: a research design for management doctoral dissertations. Management Research Review, 34(6), 687-701.
Cameron, R.A. (2011). Mixed Methods Research: the Five Ps Framework. Electronic Journal of Business Research Methods, 9(2), 96-108.
Tashakkori, A., & Teddlie, C. (2010). Sage Handbook of Mixed Methods in Social and Behavioral Research (2nd ed.). Thousand Oaks, CA: Sage.
Mason, P., Augustyn, M., & Seakhoa-King, A. (2010). Exploratory study in tourism: Designing an initial, qualitative phase of sequenced, mixed methods research. International Journal of Tourism Research, 12(5), 432-448.
Castro, F.G., Kellison, J.G., Boyd, S.J., & Kopak, A. (2010). A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses. Journal of Mixed Methods Research, 4(4), 342.
Some researchers use mixed research and mixed method synonymously in their scientific papers. I think that they would not be the same.
Both terms are vague when decontextualised. I think it would depend on how researchers clarify such terms and approaches in their scientific papers. On face value, I don't see much of a difference between the two constructs (although "research" goes beyond mere "method" and would thus require further elaboration).
I need a valid appraisal tool for cross-sectional studies for my systematic review. There seem to be adapted versions of tools online, but their reliability can be questioned. The attachment below is one I found online, but I am not sure whether it can be used to appraise quantitative research (cross-sectional method). With regard to mixed methods, I plan to use the Mixed Methods Appraisal Tool (MMAT).
Any suggestions, then please let me know!
A good starting point might be the Methods for the development of NICE public health guidance (third edition): http://publications.nice.org.uk/methods-for-the-development-of-nice-public-health-guidance-third-edition-pmg4/
Look at appendix G for Quality appraisal checklist – quantitative studies reporting correlations and associations.
Put simply: no. All methodological approaches have their strengths and weaknesses, and the same is true for mixed methods. Sure, mixed methods studies have an obvious advantage over single-method studies (most notably comprehensiveness), but they don't always work for all situations, they are not always correctly applied, and not everyone favours them. The intention of mixed methods is to offer diversity, not to restrict by 'replacing' other paradigmatic options.
For mixing two methods in order to provide an in-depth understanding, I planned to use two concurrent methods: First method is quantitative (weight 50 %) for one sample, and the second is qualitative method (weight 50%) for a different sample. It may be easy to mix the data, but the question is: how to use both deductive logic and inductive logic in mixed methods?
Thank you so much in advance.
Interesting question, Abdulellah. The first thing that comes to mind is using grounded theory: discovering theory through the analysis of data. This is an inductive approach in a sense, going from what the data 'tell' you. This does not mean you won't have (deduced) hypotheses before you start collecting data; it does mean you keep the freedom of retrospectively formulating new hypotheses to fit the data.
What is mixed methods research or study?
Dear Dean -Thank you so much. I am happy to know that it can be used in a single study or a program of inquiry. I got a bit confused after participating in a workshop where it was argued that it only applies to a single study. Once again, thank you for your answer.
Quite often, I see action research studies that say they are 'participatory' when they don't appear to be and, even more so, action research studies that clearly are participatory but do not claim to be. Is it as simple as 'participation' in action research makes it 'participatory', or is it more complex than that? For instance, based on social critical research theory, do organisational and individual emancipation and empowerment need to be in place for it to be 'truly' participatory?
Action research requires the researcher(s) to use the results of the research to act upon and change the situation being studied. The action intervention may be part of the research design.
Participatory action research requires the people who are the subject of the study both to take part in making the analysis and then to use the findings to (a) understand and interpret the situation (b) make decisions and (c) act upon and change the situation they have analyzed.
Some colleagues (not on RG) were asking why many of us choose mixed methods for our research in the social sciences, instead of purely qualitative or quantitative methods. Would anyone like to respond?
The mixed design approach (Quantitative & Qualitative) could be applied in three sequences:
1- Conducting (Quantitative & Qualitative) concurrently for the sake of triangulation.
2- Conducting the quantitative research first and then the qualitative research. This sequence enables researchers to better understand and interpret the quantitative results.
3- Conducting the qualitative research first and then the quantitative research. The qualitative research can help researchers explore some interesting variables, and then they can investigate the relationships among these variables using quantitative methods.
My supervisor somehow suggests that I use Mixed methods for the study investigating users' perceptions in Facebook use. He suggests that I begin with a qualitative study, interview these users, get some perceptions and incorporate those elements in the survey. What is the effectiveness of this?
Definitely, yes. Perhaps the greatest value of mixed methods research is the potential to offer wider scope for constructive, contained and appropriate research, with the potential to present a more complete and comprehensive research opportunity. Mixed methods not only expand your toolbox; they also provide the opportunity for synthesis of research traditions and give the investigator additional perspectives and insights that are beyond the scope of any single technique. Mixed methods also assist in resolving the issue of methodological dominance and order, and enable a rich and comprehensive picture of the issue under investigation. Another argument for triangulation of methods assumes that weaknesses in one method can be counterbalanced by strengths in another. This situation has challenged researchers to develop 'conceptual triangulation' as part of their planning. Here, each research approach that is incorporated into the overall research design is evaluated separately according to its own methodological criteria. Each component can stand alone while also being linked conceptually to other parts. This is of great value when researchers want to understand how parts of the issues they are investigating relate to the whole picture, again adding to the comprehensiveness of the studies.
Starting qualitatively and leading to quantitative is the most common form, but it can work the other way around as well: if something interesting or unexpected emerges from an initial survey, you might then decide to interview participants to expand on the phenomenon.
I am looking to complete a mixed methods synthesis which will combine the results from a systematic review which evaluates the predictive ability of certain determinants, and a meta-ethnography of qualitative studies which explores stakeholders' perceptions of those factors. I would be interested in exploring various models available for such a mixed synthesis. Thanks in advance for any guidance.
Qualitative and quantitative data will address different research questions; however, those questions can be related, and this relationship should guide your synthesis. For example, if you look at the number of people in a particular situation, and their perception of that situation, the former can be addressed by a meta-analysis of quantitative works and the latter by a meta-ethnography of qualitative studies. These two cannot be mixed as a single piece of synthesis, but they can be presented as two related and complementary summaries of information.
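If the quantitative strand does proceed to a meta-analysis, the core computation is straightforward. As a rough, hypothetical sketch (the effect sizes and variances below are made up purely for illustration), fixed-effect inverse-variance pooling looks like this:

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooling of study effect sizes."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))   # standard error of the pooled estimate
    return pooled, se

# Hypothetical effect sizes (e.g., log odds ratios) and variances from three studies
effects = [0.40, 0.25, 0.55]
variances = [0.04, 0.09, 0.16]
pooled, se = fixed_effect_pool(effects, variances)
print(round(pooled, 3), round(se, 3))
```

A random-effects model (e.g. DerSimonian-Laird) would add a between-study variance component; in practice, dedicated tools such as the metafor package in R implement both.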
I have been invited to go to Maharashtra and meet with farmers who have been practicing Sustainable Yogic Agriculture (SYA). I'm leaving in ten days so I have limited time to get a solid research proposal together. Therefore, I'd like some guidance from others on how to make the most of this research opportunity (my training is in anthropology and ethnography).
SYA is a process where farmers combine meditation with traditional organic farming. Quantitative data regarding crop yield, nutrition and economic benefits are available (see attachment).
The UN (where I work representing an NGO) is interested in local innovative projects that are improving the lives of - and making tangible contributions to - local farming communities. They are also seeking disaggregated data. So there is a need for both qualitative and quantitative data.
I have a week with the farmers, so participant observation will be the foundation of my time there, along with focused interviews. Areas of interest I'm considering at the moment are the effects of SYA on farmer health and wellbeing, within individuals, families and communities. There are also broader issues and challenges such as market access, attitudes of local and regional government, problems with seed sharing, political pressure/support, and the effect on farmer suicide.
I wonder if some of you who have experience and expertise in the area of agricultural research could make some suggestions as to how to make the best use of the farmers' time, and my time too, so I can return with something that can make a real contribution to this important area of food security, climate change, agricultural innovation and farmer wellbeing.
With my thanks and appreciation.
Article Systems Approach to Agriculture
We are doing a questionnaire-based survey in Uganda to get an idea of community health and wellbeing, so that we can use this data to make informed decisions towards developing agricultural programs. You can get more information about the group at www.deeprootsconsultants.com. We are necessarily focusing on nutrition and agriculture. I do not mind sharing some of this information if you ask me specific questions, or even reviewing some of your questions if you make a basic worksheet.

From the information you gave above, a few things to consider: collect information from before and after the implementation of SYA, and look at how it has affected the farmers' wellbeing. You will have to sub-section the survey into health status, nutrition status, access to food/changes in food systems, agricultural progress, and socio-economic changes. In the health status section it may be important to collect information about both physical and mental wellbeing, since the practice incorporates yoga-meditation. In terms of agriculture, seed source, fertilization, irrigation methods, agricultural inputs, cost of inputs, outcomes (yields), new markets (if any), profitability and the impact on the local economy may be things to consider.

Some of this data will be qualitative, but it is important to develop a quantitative scale to measure most, if not all, impacts. FYI, we are working in Uganda as UN volunteers and would love to work with you in this adventure. I am originally from India (Maharashtra), so if you happen to have any cultural questions or roadblocks, please feel free to ping me.
What type of research am I conducting if it takes place in a single institution but uses mixed methods of data collection?
Can I say my research is both a case study and mixed methods? Can anyone give me direction on how to label my research?
I will be analyzing qualitative data (interviews and field notes) with survey derived quantitative data.
All the major CAQDAS packages have facilities to handle that kind of mixed design, e.g. using the "Casebook" in NVivo. MaxQDA has some very nice facilities for what Creswell calls joint displays, e.g. a "Quotation Matrix". Atlas.ti is IMHO the most intuitive to use, and is generally agreed to have the best modelling functions. Both MaxQDA and Atlas have very good licensing for student users, if that works for you. And I think Atlas have some discount offer this month? Dedoose, which is less well-known (developed by medical researchers at UCLA), claims to have specific facilities for Mixed Methods analysis. It's also available as a pay-as-you-go Cloud service, so worth a look.
Upon reading focus group and video-taped transcripts, I have been sensitized to how technology plays an important role in Mexican American teens' experiences with romantic jealousy. There is little out there on this topic, and I'd like to analyze the data qualitatively using both observational and focus group methods since together the data tells more than either source alone (e.g., how a topic raised tangentially in focus groups played out in their observed dyadic discussions of chosen conflict issues). I have not found many examples or information on how to do this type of data analysis well. Moreover, it is a bit complicated since some of the teens that were videotaped were in focus groups, while others were not. Does anyone have any thoughts/recommendations? Your input is much appreciated!!
Some would say that you are not mixing methods; you are using a variety of data-making techniques. You could get drowned in literature and approaches, but what you are basically going to do is read, watch, think, write, read again, watch again, think again, etc., until it makes sense to you. If it doesn't, you go back to the data or back to the kids. Perhaps a thread that could run through your analysis and give you light is the one you offer as justification for the varied techniques: what would this theme look like if I did not have two different sources of data? How is one enriching the other? What are their coincidences and discrepancies, and where do the latter seem to come from?
Jealousy is often related to the limerent phases of romantic love described by Dorothy Tennov. I think she's worth more than a look.
Best of luck.
I am trying to complete my doctoral dissertation. I need 10 physicians and 10 nurses to discuss treatment of Myelodysplastic Syndromes (MDS) in my mixed methods study. How can I inspire them to participate without a financial incentive?
Be aware that busy health professionals might not have much time to spare. In my experience, you might not be able to get more than 45 minutes of their time, and this is likely to be first thing in the morning, (before they start clinic), at lunch time or in the afternoon/evening when their work is done. Thus, your questions may need to be more focused than you might like. It can help to send them a copy of your interview schedule in advance.
When I was having problems recruiting professionals to a study, I went and made a presentation about my proposed research in the department where many of the doctors I wanted to interview were based. I also presented some of the findings from patient interviews. So far as nurses and other professions allied to medicine are concerned, we organised a local 'expert' conference for them (at no cost to themselves) which was also an adjunct to recruiting interviewees.
A good strategy is to befriend one senior insider and use them to 'snowball' other respondents. Some clinicians may doubt your motives to start with, thus once a colleague can vouch for you it makes things a lot easier.
In order to provide an analytical and structured form to experiences that are perceived or experienced by participants in a holistic dimension and in terms of flow, what are the most effective research methods?
Hi Mauro, I would say qualitative research methods are the most appropriate; IPA, grounded theory, content and thematic analysis, or meta-ethnography could be used to analyse such data. Cheers, Paul
Can somebody recommend good literature on pre-post comparisons in studies?
I am searching for a scientific explanation of how pre-post comparisons are best done.
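In the simplest pre-post case (one group measured at two time points), a standard analysis is a paired t-test on the difference scores. A minimal sketch in plain Python, with fabricated scores just to make the computation concrete:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t statistic on difference scores (post - pre)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)           # sample SD (n - 1 denominator)
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1                          # t statistic and degrees of freedom

# Hypothetical pre/post scores for 6 participants
pre  = [10, 12, 9, 14, 11, 13]
post = [13, 14, 10, 17, 12, 15]
t, df = paired_t(pre, post)
print(round(t, 2), df)
```

The resulting t is compared against the t distribution with n - 1 degrees of freedom; in practice you would use a library routine such as scipy.stats.ttest_rel, which also returns the p-value. Note that pre-post designs raise further issues (regression to the mean, lack of a control group) that the literature you are looking for should cover.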
I used a quantitative, cross-sectional, correlational design in my PhD study. However, while interviewing the participants to complete the questionnaires, some of them extended their responses to the closed questions with more details about their experience with the disease. The participants were Arabic speakers, so I translated what they said into English.
I found that these data are a type of qualitative data that actually increased my insight into these people's experience. So, my question is: can I use some of these qualitative data purely for reflection in the discussion chapter (for example as a quote), even though I did not present them in the results chapter? And if I use these data, will it affect the validity of my study?
I agree that one doesn't bring forth new findings in the discussion, and as a reviewer I attack that all the time, as well as big descriptive results about samples in the methods. Having said that, sometimes illustrative quotes from qualitative data are seen as a way of illuminating the quantitative results and thus are treated as part of the discussion. The point there would be that they are not really being treated as data in such a case. So you would have to reflect on that. Bob
Is there any software for 1) interviews, 2) observation, 3) questionnaires and 4) journal writing?
QDA can be done as a computer-assisted process, but it ultimately needs your personal attention and judgement.
Some software are:
I am working on a meta-synthesis of qualitative findings about interventions for improving wellbeing in cancer patients. Thanks in advance for any input!
Hi Ronán, brilliant, thank you so much - is there some special trick that allowed you to search my question? Seems too good to be true, but then Google are pretty amazing at that sort of thing.
If I work with a quantitative method, should the output be a model? And with mixed methods, is a framework the suitable output? Is that right?
Although it should ideally be "research driven", there are alternatives. For example, if you have large data sets, they can be "mined" for the content hidden within them (e.g., by using cluster analysis or other data-mining techniques). Obviously, it's not that simple, but both are places to begin your search. Good luck.
I carried out a multiphase mixed methods study for my doctorate to develop a specific EAP (English for Academic Purposes) course. It was participatory action research, and the different phases were (i) a pilot, (ii) a needs analysis, (iii) course development, (iv) course evaluation, (v) reflection.
So the different phases led to each other. I have 2 questions:
(a) Do I show the notation for each phase separately? If no, how do I indicate the progression from the one phase to the other? Some phases were depending on the previous one, while others took place concurrently.
(b) The questionnaires collected qualitative as well as quantitative data. The qualitative data was collected by open-ended questions and also by providing a space after some questions for elaboration. Are those examples of embedded QUAN(qual) or explanatory QUAN-qual? Any input would help.
You can write into your methods that you are following an action research approach (thus justifying why you are presenting it sequentially) but still indicate that your choices were influenced or inspired by mixed methodological principles. One of the great things about qualitative research is that the methodological traditions can inform each other or even be combined to create new approaches.
Which theory is best for mixed-methods (qualitative and quantitative) research on socio-cultural and institutional barriers to accessing adolescent sexual and reproductive health?
Dear Giri, with all due respect, in spearhead research it is no longer true that there are only two methods or approaches to any given phenomenon, qualitative and quantitative. As a consequence, a centered perspective of a sort of mixed or hybrid qualitative-and-quantitative methodology is not tenable any more.
To tell you the truth, there are nowadays three kinds of science, and hence three general kinds of methods: empirical science, deductive science, and science via modelling and simulation. Along the same lines, besides the two classical methodologies we have a third, namely modelling and simulation.
The third type certainly shares some features with the two classical ones, but it is quite distinct and different.
I would most cordially suggest taking into consideration the importance of modelling and simulation for such a marvellously complex system as reproductive health services. Scientists and researchers should step into 21st-century science and methods, discussions and approaches.
I am currently conducting a mixed-methods survey on caregivers' burden and coping responses in caring for people with HIV. The quantitative part will measure the burden and coping responses of caregivers, while qualitative in-depth interview sessions will explore burden and coping issues. The aim of using mixed methods is to get a complete, clear picture of the burden and coping issues, and also to support the quantitative findings. It is confusing to choose the research design: triangulation, embedded, exploratory or explanatory. Can anyone advise me?
In choosing a mixed-methods approach I would be guided by the nature of the research problem as well as resource availability and constraints (funding, maximum length of funding, access to participants, etc). The research design depends largely on:
- the nature of the research problem;
- previous studies: if there is very little literature on the problem, consider an exploratory design;
- whether you are testing theory (consider explanatory) or building theory (consider exploratory).
You may want to refer to the following for some good guide on mixed methods research:
- Creswell, JW 2009, Research design: qualitative, quantitative, and mixed methods approaches, 3rd edn, Sage Publications, Thousand Oaks.
- Creswell, JW, Clark, P & Vicki, L 2011, Designing and conducting mixed methods research, 2nd edn, Sage Publications, Los Angeles.
- Groger, L, Mayberry, PS & Straker, J 1999, 'What we didn’t learn because of who would not talk to us', Qualitative Health Research, vol. 9, no. 6, pp. 829-35.
- Sproull, NL 1995, Handbook of research methods: a guide for practitioners and students in the social sciences, 2nd edn, Scarecrow Press, Metuchen, N.J.
- Teddlie, C & Yu, F 2007, 'Mixed methods sampling', Journal of Mixed Methods Research, vol. 1, no. 1, pp. 77-100.
I hope this helps. All the best in your quest.
I want to get to know more about mixed-method analysis tools. Can anyone suggest any tools for my research?
Many times these analyses are used with longitudinal data. If that's your plan, consider Applied Longitudinal Data Analysis by Singer & Willett (ISBN: 0195152964). SAS for Mixed Models (ISBN: 1590475003) can be helpful for learning to conduct these analyses.
Does the embedded mixed-methods design differ from a nested design?
I wish things were as clear as Senthilvel states, but if you look at most of the examples that Creswell & Plano Clark give for embedded designs, they turn out to have a sequential component where a dominant QUANT design is the source of the qual component. This amounts to: QUANT --> qual.
Personally, I think the term embedded design leads to more trouble than it is worth, since you can get more clarity from determining whether there is a primary method and whether the combination of methods is sequential or not.
FYI, what Senthilvel describes would amount to QUANT + qual or QUANT --> qual, so why not just address that issue directly?
Research using triangulation or mixed methods/approaches, combining qualitative and quantitative, has both strengths and weaknesses. What do you think?
Some interesting questions were asked in the area of mixed research and could be helpful for the followers of this thread!
I am planning to conduct semi-structured interviews with up to 40 parents of severely disabled people across two countries (20 in India and 20 in New Zealand). In the end, if I end up using numbers to compare their responses (e.g. 'X' was reported as an issue by 15 NZ parents but only 8 Indian parents), does such 'quantification' make my study mixed methods research, or is it still a qualitative study?
I'd say it depends on how you use that data. If it is simply descriptive, then I wouldn't count it as mixed methods. For example, lots of qualitative studies include a "Table One" that provides descriptive information about the participants.
Alternatively, if you do analyses on that data and produce substantive results, then you are getting closer to mixed methods. In that case, the issue would be how you "integrated" those quantitative results with your primary qualitative study.
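To make the distinction concrete: if the counts from the question (15 of 20 NZ parents vs 8 of 20 Indian parents) were actually tested rather than just reported, that would be a substantive quantitative analysis. A minimal sketch of a chi-square test of independence on that hypothetical 2x2 table, in plain Python:

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 table [[a, b], [c, d]] (no continuity correction)."""
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    observed = [a, b, c, d]
    expected = [rows[0] * cols[0] / n, rows[0] * cols[1] / n,
                rows[1] * cols[0] / n, rows[1] * cols[1] / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Reported issue: 15 of 20 NZ parents (5 did not) vs 8 of 20 Indian parents (12 did not)
chi2 = chi_square_2x2(15, 5, 8, 12)
print(round(chi2, 2))   # compare against the 3.84 critical value (df = 1, alpha = .05)
```

Here the statistic exceeds 3.84, so the group difference would be statistically significant at the .05 level, which is precisely the kind of substantive quantitative result that pushes a study toward mixed methods rather than mere description.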
What can be a good strategy when the quantitative and qualitative results differ in a mixed-methods setting? Keep them in the same article, or separate the results and prepare two different manuscripts for different outlets?
Interesting topic! The intention of mixed methods research is that one method should complement the other. Quantitative and qualitative studies do not measure the same thing; therefore, their findings cannot be reported in the same manner. Generally, the quantitative findings can be made sense of through your qualitative findings, but not always. If something is not clear enough, it is worth exploring further.
I did come across two papers (by the same author) that had contradictory findings and were reported as two different papers. They had measured the impact of an intervention both quantitatively and qualitatively. To me, that was the error: you cannot measure the impact of an intervention qualitatively, but you can explore the perceptions of the participants. Luckily, I got to read both of them and could draw my own conclusions, which were not in agreement with the authors' conclusions but were important to me.
I am very curious to learn a bit more about the methodology and the research questions, so I can give a better opinion. My email address is firstname.lastname@example.org
What do you consider to be the top topics needed in health service delivery that could be answered by systematic reviews ( either quantitative with or without meta analysis, or qualitative, or mixed methods)?
Dear Craig and Dean,
If I have not missed your question, the way we grade healthcare evidence lies in the methodology we use to generate it. However, there is no single top method for generating evidence, despite some ideologies declaring that qualitative evidence is superior to quantitative evidence. We cannot get evidence regarding meaningfulness from a meta-analysis; similarly, we cannot get evidence of effectiveness from a meta-synthesis. Therefore we need both qualitative and quantitative evidence. What makes JBI a champion is that it has an approach for grading evidence generated from both qualitative and quantitative analyses and syntheses, not putting one on top of the other but giving them parallel grades. More importantly, JBI uses a mixed method to analyse both qualitative and quantitative evidence.
I collected my data from three medical settings in a single phase, during which I conducted non-participant observation (i.e. medical consultations were recorded), followed by a questionnaire that patients had to fill in. The questionnaire contains embedded qualitative, open-ended questions. I collected documents at the end of my fieldwork. My question is:
Can I say that I used two mixed-methods designs, namely convergent parallel and embedded? I want to make sure that I am using the correct designs.
I would be so grateful if any expert in mixed methods research could help in this regard. Am I using the correct design, or are other designs better than the ones I have used? From my readings, these seemed to be the best designs to use. What do you think?
It seems that you are talking about two different types of design within MMR (mixed methods research). Usually, having few qualitative questions within quantitative questionnaire does not constitute a mixed methods study of the embedded design; rather it is a quantitative survey with qualitative elements.
To be able to call the study MMR, both the qualitative and quantitative parts of the study need to be of equal weight. That is one of the basic criteria for quality mixed methods studies. In the convergent parallel design, you collect both quantitative and qualitative data in parallel (say, qualitative interviews and an online quantitative survey) during the same period of time. Perhaps the qualitative data from your non-participant observations, together with the quantitative questionnaire you mentioned, could better fit the definition of the convergent MM design, if both parts have equal weight. This type of design has certain methodological limitations, though.
I would recommend the following reading for quick reference of MMR designs and their pros and cons:
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research, (2nd ed.) Thousand Oaks, CA: Sage.
Hope it helps.
The questionnaire covers work-related health from the perspective of the worker, the organization and the management, investigated from the physical, psychological and environmental points of view. The mentioned pragmatic validity is tested by applying the questionnaire to several workplaces, performing different interventions and following the results of the specific interventions. I am also looking for references concerning pragmatic validity, and references where this concept has been used in research.
Thank you, I look forward to reading the article.
I am having an issue regarding research methodology, especially the mixed-methods approach. Briefly, I used survey questionnaires with 200 respondents, and semi-structured interviews with 8 private companies and 12 government officials respectively. All of them must be included as they are vital stakeholders in my research (same research objective), and they fall into 3 different groups. For the analysis, I employed simple descriptive frequency and correlation analysis on the 200 respondents (SPSS software), and deductive content analysis with theme-based coding on both the government and private interviews. In short, I have results for 3 groups of stakeholders. My intention is that after the analysis I will reach a conclusion for each group, so overall I will have 3 different conclusions. I then intend to establish linkages between the three conclusions; in other words, based on the similarities found across them, I will draw an overall conclusion to explain them. My question is that I am not sure whether my methodology, especially the part where I integrate/converge the 3 different and somewhat interrelated conclusions and generalise them into 1 conclusion, is valid or not. Therefore, I humbly ask for assistance if anyone knows whether what I am doing is valid and relevant.
Very interesting discussion on mixed-methods research with "3 different conclusions". The book by Vicki L. Plano Clark and John W. Creswell (2008) titled "The Mixed Methods Reader", pages 288-294, may help answer your concerns, Gabriel, especially regarding meta-inference. Page 893 provides some good information on "multiple validities legitimation".
I am considering coding video data for metacognition during facilitated learning experiences and group learning activities/problem solving
Find attached a paper in which the authors used video coding of metacognitive processes in a similar context. I have found it helpful in designing my own observation protocol in a not-too-dissimilar context.
I'm a pragmatist at heart and after exploring mixed methods, I'm more interested in qualitative research. So far, I can't find much literature or arguments that link pragmatism and qualitative research. Any suggestions, advice or comments appreciated.
Yes indeed, grounded theory is a pragmatic research methodology used extensively in qualitative research. Pragmatism is at the heart of grounded theory and can be traced to Charles Sanders Peirce, a philosopher working in the US in the second half of the 19th century.
Pragmatists believed that knowledge was a coping strategy for achieving human ends, and therefore that if humans construct something that works for them, then it was right. They used the term "true" to refer to something that works. This idea underpins grounded theory and points to the practical value of academic research in working experience.
I am doing my doctoral research on training strategies that have worked in the corporate world for shortening the time-to-expertise of employees. There are no research studies in this area, so this is possibly first-time research.
There are some experts worldwide in different organizations who have done practical work. Therefore I am trying to tap their opinions and cases to compile a knowledge base.
I am using case study methodology and wanted to use semi-structured in-depth interviews as the primary and only method to collect data from this handful of participants. However, I came across an issue: several potential participants are only comfortable with survey questionnaires, where they can think through their responses before submitting.
Does anyone know if I could use semi-structured in-depth interviews (with some pre-defined questions) for one set of participants, based on their preference/willingness, and capture the opinions of another set of participants using qualitative questionnaires containing the same set of questions I planned for the interviews? Both data collection methods would be concurrent. Would this be called methodological triangulation?
Has anyone used this kind of method? Is there any support in the literature for an approach like this that still stays rigorous when assessed by examiners? I do not have the luxury of going back to the participants a second time, so I need to capture the data from an interested participant at a single timepoint.
Mdumiseni: here is what I learned from my research experience. If you have a bounded case, then you could use a combination of several different data collection methods to gather data within that case. If you have only one case, you cross-check the outcomes from one method against the other; this will help you establish the truthfulness of the findings. If you have multiple cases, then you could combine the data from different sources within a case to make a full picture of the case before you compare it with the next case. In that situation, you would not need to do data analysis by source; you could just do data analysis by case.
If, however, you are not using bounded cases, then the data quality and depth differ between interviews and questionnaires, even with the same set of questions. In that case, you may want to keep the data sets from these two methods separate and analyze them separately. If the findings from one set match the other, you could use this as 'data triangulation' to establish the validity of your research findings.
I will have to go back to my database for references and literature. I will try to find and provide you.
Any ideas about exploratory mixed methods? Creswell (2014) suggests moving from the qualitative data analysis to scale development. Is it better to develop a scale, or questionnaire, or survey?
A lot depends on how much survey based content you already have. For example, I worked on a project where the innovative aspect was the creation of a new dependent variable. We had everything else that we needed in terms of background (demographic) variables and pre-existing measures for the independent variables. So, we concentrated on doing a series of focus groups that produced a 15-item scale to measure the dependent variable.
Alternatively, you may be in a much more "discovery" oriented mode, where you are unsure about what topics your survey should cover. In that case, you might need to start by generating hypotheses, followed by further qualitative work to create the measures to operationalize the key concepts you discovered in the earlier phase of your work.
Overall, the key point is that a qual --> QUANT exploratory sequential design is a general template that you have to apply to a specific set of circumstances. Once you are clear about your research goals, then you can work out the specific implementation of the design.
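Once a draft scale exists (like the 15-item scale mentioned above), a common quantitative follow-up in the QUANT phase is checking internal consistency with Cronbach's alpha. A minimal sketch in plain Python, using fabricated 1-5 ratings for a short hypothetical 3-item scale:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score lists over the same respondents."""
    k = len(items)
    total_scores = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    item_vars = sum(statistics.variance(scores) for scores in items)
    total_var = statistics.variance(total_scores)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Fabricated ratings: 3 items x 5 respondents
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

Values above roughly .70 are conventionally taken as acceptable internal consistency, though the appropriate threshold depends on how the scale will be used; factor analysis on a larger pilot sample would be the usual next step.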
In mixed-methods strategy, we find articles that either:
1) First do the survey type study and then carry out a few case studies to further deepen understanding of the issue
2) Begin with the exploratory case study and then do the survey for generalization.
What could be the better way to do?
It all depends on how you want to use the different strengths of the methods. I would interpret the two designs you describe as:
qual --> QUANT (your option 2)
QUANT --> qual (your option 1)
In the first case, you are usually using the preliminary qualitative method to develop the content for the survey. If you already have a set of hypotheses and measures to operationalize them, then this design does not have much to offer.
In the second case, you would get the survey results and then as you say "deepen" your understanding of the results. For example, if your results match your hypotheses, you could get good illustrations of how and why things work the way they do. Or if some of your hypotheses are not supported, you could explore why this was so.
What research approach/design would you use to provide a comprehensive understanding of the efficacy of problem based learning in relation to student cognitive and emotional development?
Problem based learning or application based learning?
I'm trying to compose a guide on methods for (1) generating and (2) verifying theories about rare phenomena. I am interested in any sort of ideas and experiences.
Dear Chris, I am more daring about it: if you truly do research on rare phenomena, you have to develop new methods. By definition, innovation entails a real challenge, so standard methods, for instance, are useless vis-à-vis new problems. I would be glad and proud if you worked on developing those new methods in accordance with the fields, problems and heuristics you work with.
As part of my thesis, I have gathered data from a survey from 300 individuals from various automotive divisions using sequential mixed methods (quan to qual) and then interviewed 11 individuals to address some questions from the quan findings. I had problems in interviewing any of the 300 individuals for the following reasons:
1) They did not want their views documented and refused to be interviewed.
2) They feared they could lose their jobs over their comments.
3) Their English is poor.
As a result, I interviewed 11 middle managers who each have more than 20 years of experience across various divisions, but who were not among the 300 individuals who participated in the survey.
My question is as follows:
1) Are these considered acceptable reasons? How do I phrase them in a way suitable for academic writing?
2) Is the interview with the 11 middle management employees acceptable?
3) What are your views on the above approach?
The quantitative versus qualitative debate has taken place within our RG research team, and significant steps towards reconciling the initial differences are being made. By way of introduction to the following considerations, I recall noticing a concise but effective message about our task as research designers:
As a starting observation, I'd like to point out that where data analysis begins and ends depends on the type of data collected, which in turn depends on the sample size, which in turn depends on the research design, which in turn depends on the purpose.
Nevertheless, far more insidious discrimination remains. Systematic review within the topic of methodology exhibits all the characteristics of "institutionalised quantitativism" in that criteria for a "good" review are almost entirely determined by the quantitative methods promoted. Nobody who understands qualitative research would insist that its primary studies demonstrate alien concepts such as "sample size" or "statistical power". Yet comparably fundamental incongruities persist with regard to qualitative syntheses. Why should systematic reviewers of qualitative research pursue a "gold standard" comprehensive literature search when concepts such as "data saturation" have an established pedigree? Why shouldn't they apply systematic, explicit and reproducible principles of thematic or concept analysis to create syntheses that advance our understanding of qualitative issues and highlight research gaps?
Due weight should be given to the methodological options at our disposal that are most appropriate to the differing circumstances and the nature of the study. Accordingly, we should choose methods that are both philosophically defensible and, at the same time, practicable and responsive. For instance, when researchers become aware that the purposes of their study involve both quantitative and qualitative aspects, it follows that it is more appropriate to explore mixed-method research designs that are better matched to those investigative purposes.
Then, following the ideal thread of earlier observations, I choose to concentrate on the ‘mixed methods’ methodological approach employed as a research configuration in social sciences at large especially in cases of integration or connections of quantitative and qualitative data.
At first approach, I observe that the growing use of mixed methods research is commonly accepted as a distinctive feature of contemporary research designs, profiting from the inclusion of both quantitative and qualitative sources of information, especially when generalization of results and evaluative feedback are the purposes pursued. Indeed, as has been remarked, qualitative data can help researchers illuminate relationships emerging from quantitative data. Similarly, quantitative data can help compensate for the fact that qualitative findings normally cannot be generalized.
According to the prevailing literature on mixed research designs, combining quantitative and qualitative analyses has been advocated when the process offers evident complementary strengths, as in what Denzin (1978) dubbed triangulation of different data sources, i.e. the process of testing the consistency of findings obtained through different instruments for studying the same occurrence. Specifically, the combination of the two approaches seems useful when:
1) results from qualitative interviews can help to identify unobserved heterogeneity in quantitative data, as well as previously unknown explanatory variables and misspecified models;
2) results from the qualitative part of mixed methods design can help to understand previously incomprehensible statistical findings;
3) qualitative research can help to discover quality problems of quantitative measurement instruments;
and 4) quantitative research can be used to examine the scope of results from a qualitative study and support the transfer of such findings to other domains (Kelle, 2005).
For these characteristics, mixed methods research has been labeled "the third major research paradigm" and has gained legitimacy as a stand-alone research design. However, besides its strengths, controversial issues still hinder its potential, and some of the details remain to be worked out by research methodologists (e.g., problems of paradigm mixing, how to qualitatively analyze quantitative data, how to interpret conflicting results). A word of caution therefore seems appropriate for researchers (and, better, research teams) who plan to design inclusive, complementary methods capable of embracing diverse perspectives, data and values within and across studies.
Denzin, N. K., "Triangulation". In N. K. Denzin (ed.), The Research Act: An Introduction to Sociological Methods, 1978, McGraw-Hill, New York.
Kelle, U., "Mixed Methods as a Means to Overcome Methodological Limitations of Qualitative and Quantitative Research", workshop on mixed methods held on October 26-27, 2005, at the University of Manchester.
I have more than a dozen videos recorded from class observations. Are there any simpler and faster ways to analyse and transcribe these recordings?
Dear Fatimah, here are some ideas below, plus references.
I would also recommend the following authors and the issues they explore: Sarah Pink (visual ethnography), Marcus Banks (visual data in qualitative research), and Robert Kozinets (netnography).
In our lab we are working on a small group study where we are observing people's behaviors. Participants played a game in small groups of 3-6 people and their interactions were video recorded. We are now in the process of developing a code book and are looking to find an efficient way of doing it so that we retain rich information as well as making it possible to conduct mixed-method analyses. Any methodological/statistical suggestions for resources would be welcome.
I have two suggestions: 1) Kathy Charmaz (2006), Constructing Grounded Theory, published by Sage, is a great resource for developing codes; I used it in my dissertation on a distributed small group. 2) I have a paper examining different levels of group interaction (attached here, and also available in my profile).
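Once a draft code book exists, agreement between two coders (or two passes by the same coder) is often summarized with Cohen's kappa, which discounts agreement expected by chance. Below is a minimal, hand-rolled sketch; the behaviour codes and segment data are purely hypothetical illustrations, not from the study described above:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coding passes over the same segments.

    observed = proportion of segments where the two passes agree
    expected = chance agreement implied by each pass's label frequencies
    (Undefined when expected == 1, i.e. both passes use a single label.)
    """
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    observed = sum(x == y for x, y in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned to 8 video segments on two passes
pass1 = ["on-task", "on-task", "off-task", "on-task",
         "social", "social", "on-task", "off-task"]
pass2 = ["on-task", "off-task", "off-task", "on-task",
         "social", "on-task", "on-task", "off-task"]
print(round(cohens_kappa(pass1, pass2), 3))  # 0.6
```

Dedicated implementations (e.g. in statistics packages) also report confidence intervals, which matter with small segment counts like this one.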
How many participant-observations are enough to achieve saturation?
There are couple of studies which discussed possible number of interviews to achieve saturation in data; however, I am interested in knowing the number of participant-observations needed for this purpose.
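Saturation is usually judged by when additional observations stop yielding new codes, rather than by a fixed count. One lightweight way to make that judgement auditable is to log how many previously unseen codes each successive observation contributes and watch the count flatten to zero. A minimal sketch, with hypothetical code names:

```python
def new_codes_per_observation(observations):
    """For each observation (a set of codes), count how many codes
    were not seen in any earlier observation. A sustained run of
    zeros suggests the data may be approaching saturation."""
    seen = set()
    new_counts = []
    for codes in observations:
        fresh = set(codes) - seen
        new_counts.append(len(fresh))
        seen |= fresh
    return new_counts

# Hypothetical codes from five participant-observations
sessions = [
    {"turn-taking", "conflict"},
    {"conflict", "humour"},
    {"turn-taking", "humour"},
    {"conflict"},
    {"turn-taking"},
]
print(new_codes_per_observation(sessions))  # [2, 1, 0, 0, 0]
```

This only tracks code discovery, not the depth or variation within codes, so it supplements rather than replaces the researcher's judgement about saturation.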
I have been reading about mixed methods synthesis techniques (Sandelowski et al 2006, 2012 and Joanna Briggs Institute 2014) for systematic review.
I think a segregated synthesis approach will suit my subject well but can find references only to qualitative/quantitative studies - how do I deal with mixed methods studies? Would I separate the data into the above categories and place that data into the appropriate synthesis? I'm not sure if that would work.
Mixed-method approaches to research are tricky all along the research pathway. I've always analyzed the qualitative data separately from the quantitative data. However, in the report or publication, try to show how the results complement, support or contradict each other. As David said, a lot depends on the design.