Questions related to Designs
As you know, women and men perceive architectural space differently, and this difference can also be seen in their designs.
Can the gender difference in the perception of architecture students be reduced and brought to balance? Is this the right thing to do?
For example, should an architectural product designed by a female student follow the same principles as one designed by a male student, or vice versa?
In my recent research I found that men pay attention to form while women pay attention to space. How can I balance this perceptual difference?
I need to understand whether the focus should be on one person at a time in such experimental designs. Kindly help me with references if possible. Thank you.
I am working with a lot of timelapse data, and I've been switching between Timelapse2 and plain VLC (strictly because VLC isn't as laggy as Timelapse2 when jumping back a few seconds) to analyze these files, but it's a very time-consuming process. I was wondering whether anyone here has a lot of experience going through timelapse files and tips on how to do it in a more efficient manner.
The goal of my project is to compare detection rates for a slow-moving species across a few different camera-trapping designs.
Technical debt is described as choices made for fast initial gains that are counterproductive in the long term, while code smells are defined as signs of weak design and coding.
I am just confused about the difference. Is it the reason behind them, the time it takes to resolve them, or something else? Please help!
Is there any existing technology (Robot, AI-Climate techs, non-human agential, etc.) which can be used to capture and/or convert greenhouse gases before they are released into the atmosphere due to underground activities of mining and/or oil exploration?
I am planning to conduct experimental studies with various HMI designs. I want to have a thorough understanding of the conventional HMI design process. For example, guidelines for text size, color, and placement of various process parameters and controls.
I am a postgraduate student presently writing my thesis in the department of curriculum and instructional design.
I am seeking your help to choose the suitable MM design that fits my study. My study consists of three phases: the first phase is a Delphi technique to design the research intervention and questionnaire; the second phase was qualitative, where I explored the participants' views about their work in the first phase; and the last one was implementing the intervention and running the questionnaire. The second and third phases were conducted simultaneously.
Now, I am confused about which one best fits my study: the convergent design or the sequential one, since I ran the Delphi technique first?
I am planning to conduct an experimental research on a single group. I read the within group design from Educational Research by J.W. Creswell.
A repeated-measures design involves the same group receiving different treatments. I would like to discuss the following points.
Is the content different after each intervention?
Is the measure different after each intervention?
If someone has used this design in their research, please share your experiences.
I have seen on the net that some conventional BOTDA designs use two different laser sources. Isn't that more expensive than using one laser? Could there be another reason for this besides the difficulty of using a loop design?
Thanks in advance,
First of all, the different roles of the ESP practitioner should be known to the ESP practitioner. As course designer, the ESP practitioner gets to know some basic information and knowledge of the discipline the learners are in. For example, if you are teaching engineering students, you need to know engineering terms, texts related to engineering, and the contexts in which engineers will interact with stakeholders within their domain. Based on the needs of the learners, the ESP practitioner then designs the course, frames the curriculum, and prepares the syllabus.
Secondly, the role of the ESP practitioner as collaborator involves cooperating with subject specialists (Dudley-Evans and St John, 1998) in which ESP teacher finds out about the subject syllabus in an academic context or the tasks that students have to carry out in a specialized situation.
Thirdly, it is found that the ESP practitioner has different and complicated roles. As teacher, he/she has to help learners to learn. As course designer and material provider, his/her role is to choose handmade or authentic materials that best suit the learners' needs.
Fourthly, as a materials provider, he/she chooses tasks and activities connected with the discipline the learners are in. Thus the tasks and activities motivate the learners to practice and learn.
Fifthly, as an evaluator, the ESP practitioner understands each learner's output and evaluates accordingly.
Finally, the ESP practitioner is indeed a researcher. As a researcher, he/she continuously looks for different methods that are approachable and bring success in learning the needed language skills.
I have a question regarding the appropriate way to analyse my data using ANOSIM or PERMANOVA. In particular, I have gathered data on stomach contents of fish in the North Sea from different sources, rather than collecting them myself. That leaves me with quite a patchy dataset, with some stations at which 5 fish or more have been sampled, but many with only 1 sample per station. Therefore, my data is partly with and partly without replication per station. Since the stations themselves are located in different habitats which I have defined a priori, I am now wondering whether I will need to analyse my data in a nested design or not? In the answers to similar questions regarding ANOSIM and PERMANOVA, I have repeatedly read that these tests are fairly robust towards unbalanced designs, but these cases concerned designs which all included replicates (and varying numbers thereof).
Any suggestions on how to proceed with my data are highly appreciated!
In many designs I have seen, the grounds of the various MIMO elements are joined together by a thin line.
Is it mandatory to join the grounds of the various MIMO elements, or is it not necessary?
I collected data from 14 participants using a Likert-type scale (1-5) related to workload measurement. The same participants rated six different-sized keyboard designs for workload. To check for a significant difference among the six keyboard designs, I applied the non-parametric Friedman test. I found a statistically significant difference, so I applied the Wilcoxon signed-rank test with Bonferroni correction for pairwise comparisons.
My question: while I found no significant difference between LL-SS, there is a significant difference between ML-SS, even though LL and ML have the same mean. Can this be true? The adjusted alpha value is 0.05 / 15 = 0.00333, and the Wilcoxon result for ML-SS is 0.00306, so I considered it a significant difference. Is this correct?
Looking forward to your support!
I attached the workload results and spss outcome.
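For reference, the Friedman-then-Wilcoxon procedure described above can be sketched in Python with SciPy; the ratings below are randomly generated stand-ins for the attached data:

```python
import numpy as np
from itertools import combinations
from scipy import stats

# Hypothetical ratings: 14 participants x 6 keyboard designs (values 1-5);
# stand-in data only, in place of the attached SPSS file.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(14, 6))

# Friedman test across the six related samples
stat, p = stats.friedmanchisquare(*ratings.T)

# Pairwise Wilcoxon signed-rank tests with Bonferroni correction
n_pairs = 15  # C(6, 2)
alpha_adj = 0.05 / n_pairs  # = 0.00333...
for i, j in combinations(range(6), 2):
    try:
        w, p_pair = stats.wilcoxon(ratings[:, i], ratings[:, j])
    except ValueError:  # raised if all paired differences are zero
        continue
    if p_pair < alpha_adj:
        print(f"designs {i} vs {j}: p = {p_pair:.5f} (significant)")
```

Note that since 0.00306 < 0.00333, the ML-SS comparison does clear the Bonferroni threshold; equal means do not rule this out, because the signed-rank test depends on the ranks of the paired differences, not on the means.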
I am a qualitative research newbie. This study is for my doctoral dissertation. I am interviewing music professors about their experiences working with student musicians who have an occupational injury. I wish to understand and discuss the themes that emerge most frequently from the interviews of my 15 subjects. Does analysis of my interview data using the code frequency method mean that my research is both qualitative and quantitative?
I am doing a systematic review of a psychological intervention in child populations, and the currently published studies are all feasibility studies of varying sizes and designs, conducted with children with different psychological problems and diagnoses.
It seems that the majority of these studies use reliable change index in their statistical analysis (although they do use them a bit differently). I am thinking that it could be good to compare how many patients achieved reliable/clinically significant change across these studies...
I have tried to find studies that have done this without much luck, does anyone know if such comparisons would be a good idea, or if anyone has done this before? (or how this could be done better in other ways?).
The heterogeneity of designs etc of the studies I have found make proper meta-analysis impossible, but I am trying to find a good way to present their findings, and I am not sure how this could be done properly (I have not found many/any reviews of feasibility studies).
Any advice or references to studies would be greatly appreciated.
I'm trying to apply the Taguchi method to improve the injection parameters of a plastic article.
I was planning to start with a Plackett-Burman design to identify the most influential parameters and then use a Box-Behnken design to find the optimum parameter values.
Is it possible to build a Taguchi design based on the most important parameters identified by the PB design instead?
In a field experiment, if both a factorial RBD and a split-plot design are possible, and both qualify in terms of degrees of freedom, which design should be chosen, and why?
I asked 14 participants, "What is your preference among six keyboard designs?" The data resulted in the attached picture. In that case, which statistical method should I use? I thought Fisher's exact test might be suitable, but I could not be sure.
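One common option for single-choice count data like this is a chi-square goodness-of-fit test against equal preference; a minimal sketch with hypothetical counts (the real counts would come from the attached figure):

```python
from scipy import stats

# Hypothetical preference counts for six keyboard designs
# (14 participants, one choice each) -- illustrative numbers only.
counts = [4, 3, 3, 2, 1, 1]

# Chi-square goodness-of-fit against equal preference across designs
chi2, p = stats.chisquare(counts)
print(chi2, p)
```

With only 14 participants, the expected count per cell (14/6, about 2.3) falls below the usual rule of thumb of 5, so an exact multinomial test, or collapsing categories, may be more defensible than the asymptotic chi-square.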
Dear all, I have simulated two 4x14 patch antenna arrays in HFSS 3D Layout at 24.2 GHz. Coaxial feeding through vias is used for both designs. I cross-checked the simulated designs using CST, and both performed very well. The main difference between the two designs is the dimension of the patch antenna element. A professional firm was hired to fabricate the designs. When the fabricated designs were tested, the measured results of one design matched the simulated ones closely, while the other design's results were quite different.
The results are attached
This explicitly refers to designs or concepts that aim to convey or foster knowledge about barriers and the concept of accessibility.
I am looking for literature discussing the designs of paper-based microfluidic devices, especially literature that highlights flow rates, channel widths, etc. If there is any, please list the titles.
Lange (2001) asks African designers to “embark on a quest to reflect... diversity, to challenge modernist conventions and produce a graphic design that is essentially local in its components while also being internationally competitive” in order to change the esthetic standards defined by Western culture and previously considered the “aesthetic standard”.
Ricoeur (2007: p. 52) invites us to go back to our own origins to face the expansion of universal culture. He declares to this effect that to confront a foreign culture, one must first have one’s own culture and identity.
Modern graphic designers must develop more designs that portray their nation’s style and create stylish designs with new meaning. We have the responsibility and obligation to represent our country’s culture in the contemporary graphic design scene, and to promote it. This will bring a breakthrough in modern graphic design in the future. Today, national culture (and the elements it provides) is undoubtedly an important aspect of a nation’s development. In the context of design or art, its use allows the development of national esthetic specificities: a cultural identity.
How can I make a fair utilized-area comparison between three designs, one of which uses 4-input LUTs, the second 6-input LUTs, and the last, which is Altera-based, ALUTs (Adaptive LUTs)? Can I calculate an equivalent gate size/count for each case?
I have data from multiple-environment trials (MET); the design used was an augmented design at each location. I would like to analyze the data with RStudio or other free software.
Does anyone know an R package, guideline, or free software to analyze (ANOVA) augmented design data?
Thanks in advance
I have been tasked with designing a 4 (maybe 5) degree-of-freedom robotic arm that will be installed in the back of a van. Being in a van, it's important for it to be as lightweight as possible while drawing as little energy as possible. Does anyone have any experience with this and can pass along some ideas? Any sources for designs, maybe?
Can any one of you suggest some papers related to the material selection in fabricating industrial robotic grippers?
I know that researchers widely cover soft robotic grippers, but I want to find more papers to extend my understanding of material selection for rigid grippers.
So please, if you have a paper relevant to this field, please share it with me.
What type of valves are BRU-A valves?
Are there different designs depending on the VVER design?
Are they motorized valves powered by direct or alternating current?
How long are BRU-A valves available during an SBO sequence?
This Project continues as an examination of ALL-PASS Band-Pass circuits.
Project Paper, updated Feb 01, 2022: "... in Active-Band-Pass Circuits", emphasizing the use of All-Pass filters.
- - - Here, we continue our earlier "AFX" Project, which was presented in RGN at :
Introduction for the "AFC" project :
We examine the "ALL-PASS FILTER" and develop an Analog Narrow-Band-Pass Audio Filter, which has immediate application in receiving Morse Code signals in an Amateur Radio Station.
Our resulting model is an experiment to gather this data.
A Proper Analysis of this design may aid in understanding the nature of All-Pass Filtering. Once an adequate system equation is achieved, then resulting models may be useful in designing Band-Pass Filters for Audio applications which can be based on Non-Resonant Phase-Filtered circuits, similar to our "AFX" design.
All-Pass (phase-shifting) filters have frequency responses which must be "zero at w=0 and at w=pi". From the research, this means that All-Pass filters cannot be used for (1) Low-Pass, (2) High-Pass, or (3) Band-Pass designs, because the resulting combinations of waveforms are homogeneous; i.e., the combinations are always simple phase shifts, producing no frequency or amplitude changes. *** The authors have developed working Dual-Notch Band-Pass circuits which (1) perform a BAND-PASS function peaked at f(0) = 700 Hz and (2) generate DUAL NOTCHES around f(0) at approximately plus/minus 200 Hz. The current All-Pass project is titled "AFC".
*** First Experimental Target: (1) Utilize All-Pass stages to replace resonance-tuned Active-BandPass stages.
(2) Reduce the number of MFB active filter stages required to align signal phases (a) in order to support Dual-Notch generation around f(0); (b) in support of our previous project "AFX" "AFV-3RL-v4F-D-vQ-Man".
... Continued Project now uses the Schematic in the groups:
AFC_1R-1A-12A-2F-Sum-S-451 and AFC-3R-2F-8A-Dif-S-451 .
The Bode plot and Magnitude plot are in the pre-paper.
The problem to be resolved is why this design, (1) using one All-Pass Lo-Pass paralleled with twelve All-Pass Hi-Pass filters, (2) produces the waveform output seen in the Bode plot; that is, why one APF Lo-Pass paralleled with twelve APF Hi-Pass stages interacts in an unfamiliar manner.
This "AFC" project is derived from our previous "AFX" project.
Our long series of projects in Analog Narrow Band-Pass Filters has been presented on our website at : http://www.geocities.ws/glene77is/
2021 Oct 12: see the related discussion at https://www.researchgate.net/post/Are-there-any-Analog-Active-Audio-Filters-that-match-any-Digital-Signal-Processing-filters
We have a paper attached: "AFC_All-Pass_Phase-Filter_Paper.pdf" (latest upload: 2021 Nov 29).
If I want to know how many Americans have certain mental health concerns and whether that proportion is increasing or decreasing, (a) what would be a focused research question using PICO, and (b) which study design should I choose, and why? Why not one of the other designs?
I have a few designs that were constructed computationally using Rosetta. Almost all papers using Rosetta design use high-throughput yeast display, encoding the designs in oligo pools, amplifying, and transforming into yeast.
How expensive is the process of obtaining apparent KDs from this approach, used as a first-line screen to identify whether any of the proteins bind the intended target before employing other experiments such as BLI?
It is only a few designs, and I am not testing thousands of designs like many of the Rosetta papers.
I would always be willing to collaborate with experimental groups for authorship.
Some of the economic benefits arise from the use of cultural patterns and motifs by local businesses to improve their products in terms of design and appearance.
What kind of scientific research dominates in the field of new ideas and new concepts in science, art, and business?
The new idea often contains something innovative in relation to what was previously invented, created, designed and manufactured.
New ideas take different forms of new solutions, new concepts, new models, new designs, new axioms, new directions of scientific thought, and many other forms that incorporate any aspects of novelty.
While conducting scientific research, new ideas, concepts, inventions, techniques, technologies, innovations, etc. are created. Thanks to this, the economy develops, and civilizational and cultural progress is realized. Therefore, it is necessary to create good standards and conditions, including financing for research projects.
Please reply. I invite you to the discussion.
Need to clarify something,
If I have two factors, F1 (4 levels: a, b, c, d) and F2 (4 levels: i, ii, iii, iv), which statistical design would be the most suitable? Let's suppose F1 is 4 levels of potash fertilizer and F2 is 4 spiked Cd soil concentrations.
Can we randomize all 16 treatment combinations together, and are both designs the same in terms of the physical arrangement of pots and the statistical explanation?
That is, is a two-way ANOVA under CRD the same as a factorial ANOVA with a two-way interaction? CRD describes the physical arrangement of the experimental units, while the two-way ANOVA under CRD (or factorial ANOVA with a two-way interaction) is used to explain the statistical significance of the data (of course, the data arrangement can differ).
Please give your valuable suggestions
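To illustrate that the factorial structure lives in the model rather than in the physical layout, here is a minimal sketch (with purely simulated data) of the two-way ANOVA computed directly from cell and marginal means under a balanced CRD:

```python
import numpy as np

# Hypothetical balanced CRD: 4 potash levels (F1) x 4 Cd levels (F2),
# r = 3 replicate pots per cell, all 48 pots randomized together.
rng = np.random.default_rng(1)
a, b, r = 4, 4, 3
y = rng.normal(10.0, 1.0, size=(a, b, r))  # y[i, j, k]

grand = y.mean()
cell = y.mean(axis=2)             # cell means
m1 = y.mean(axis=(1, 2))          # F1 marginal means
m2 = y.mean(axis=(0, 2))          # F2 marginal means

# Sums of squares for the two main effects, interaction, and error
ss_f1 = b * r * ((m1 - grand) ** 2).sum()
ss_f2 = a * r * ((m2 - grand) ** 2).sum()
ss_int = r * ((cell - m1[:, None] - m2[None, :] + grand) ** 2).sum()
ss_err = ((y - cell[:, :, None]) ** 2).sum()

df_f1, df_f2 = a - 1, b - 1
df_int = (a - 1) * (b - 1)
df_err = a * b * (r - 1)

# F statistics against the pooled error term
f_f1 = (ss_f1 / df_f1) / (ss_err / df_err)
f_f2 = (ss_f2 / df_f2) / (ss_err / df_err)
f_int = (ss_int / df_int) / (ss_err / df_err)
print(f_f1, f_f2, f_int)
```

The decomposition is identical whether one calls it "two-way ANOVA under CRD" or "factorial ANOVA with a two-way interaction"; the CRD only determines how the 48 pots are randomized, not the model terms.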
Teachers are now facing problems in providing proper feedback on the designs of architecture students.
So, what new/alternate methods of "Engagement/Discussion/Crits" have you discovered with your students during online teaching-learning of studio-based subjects e.g. Architectural Design, etc.?
Please share your experiences here.
Thanks in anticipation.
I have conducted a survey which included several Likert-type questions. For example, I ask participants to evaluate the strength of two different study designs, design A and design B. For each design, respondents can choose between eight different options, including very strong, rather strong, moderate, rather weak, very weak, and no opinion.
I would like to compare respondents' evaluations for design A and design B: are there any differences in how they view the strength of these two designs? Considering that my data are ordinal (having converted each answer option to a number, e.g. very weak = 1, very strong = 7), and since every respondent evaluated both designs, I wanted to use the Wilcoxon signed rank test. Would this be the correct test to use in this case?
Additionally, I was wondering what to do with the 'no opinion' answers. I have not assigned them a number yet. They are few in number (around 20 out of a total sample size of approximately 550). Should I assign them the number 0, or should I treat them as missing data? For my Wilcoxon test, if I treat 'no opinion' answers as missing data, the associated pairs would be excluded from the analysis (e.g. if someone answered 'very strong' for design A and 'no opinion' for design B, no pairing would be possible, and both answers would be excluded). What would be the best approach here?
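For what it's worth, a minimal sketch of the paired Wilcoxon test with 'no opinion' treated as missing (pairwise deletion), using hypothetical ratings in place of the survey data:

```python
import numpy as np
from scipy import stats

# Hypothetical paired ratings for design A and design B on a 1-7 scale;
# np.nan marks a 'no opinion' answer treated as missing data.
a = np.array([5, 6, 4, 7, 3, np.nan, 6, 5, 4, 6], dtype=float)
b = np.array([4, 5, 4, 6, 2, 5, np.nan, 4, 3, 5], dtype=float)

# Pairwise deletion: drop any pair with a missing value on either side
mask = ~np.isnan(a) & ~np.isnan(b)
stat, p = stats.wilcoxon(a[mask], b[mask])
print(stat, p)
```

Coding 'no opinion' as 0 would wrongly place it below 'very weak' on the ordinal scale, so pairwise deletion (as above) is usually the safer of the two options described.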
I am a PhD student. I want to integrate the results of systematic review, qualitative, and quantitative studies. The designs I see are mainly QUAN and QUAL. How do I address SYST + QUAL + QUAN? They are separate studies with different aims, and I conducted them concurrently due to time and funding constraints.
I reached out to John W. Creswell. He referred me to his book: "Please go to my book, 'A Concise Introduction to Mixed Methods Research' (SAGE, 2015) to see the mixed methods designs possible. Thanks. John W Creswell".
I'm conducting a statistical study of user performance for different base station topologies and array designs. Can anyone suggest some materials/papers on this? For example, when I increase the number of array elements at the base station, my EVM values are still higher than in the case with fewer elements at the BS. A scattering-based channel model is used.
What quality assessment tools are available for a meta-analysis when the papers contain case-control, cohort, cross-sectional, and prospective designs?
Many SIW designs (if not all) employ metallic posts with some gaps between them. As I understand it, this might lead to leakage radiation.
My questions: Why can't we design solid sidewalls instead of periodic metallic posts/vias? Would this reduce the leakage losses? Or are the posts just there to facilitate the fabrication process?
Thanks in advance.
I have been struggling for many years to educate software researchers about two simple facts: (i) there are two kinds of engineering paradigms, Component-based and non-Component-based, as illustrated at http://real-software-components.com/raju/TwoKindsOfParadigms.pdf, and (ii) today no engineering discipline, whether Civil, Chemical, or Software, is employing the Component-based paradigm. I shall not relent until the unknown fact (i.e. that today's software engineering is not employing the Component-based paradigm) is understood and accepted.
Until the above unknown fact is known, no one in software industry will try to invent tools, methods and technologies essential for transforming software engineering paradigm from (a) complex, inefficient, and error-prone non-Component-based paradigm (that is infested by notorious spaghetti code) to (b) Component-based paradigm that is ten-times more efficient (e.g. by eliminating the notorious spaghetti code from design and development of each large or complex software product).
History of industrial engineering proves that Component-based paradigm can increase manual productivity, quality and agility (where the agility can reduce total cost of ownership such as maintenance and redesign) by ten times: http://real-software-components.com/raju/Briefs/BenifitsOfRealCBE.pdf.
Today each large software product (i.e. an executable code) is built as a big monolith, as illustrated by FIG-1, which is certainly not a Component-based paradigm. Today software experts insist that FIG-1 represents the Component-based paradigm, since it uses reusable parts or modules that are composed but not assembled. Please refer to the figures in the attached PDF.
Essential condition for Component-based paradigm is building each product by assembling multiple modules or components as illustrated in FIG-2, which requires invention of real software components that can be assembled, and essential tools and mechanisms for plugging-in all the components to build the software product.
For example, even in case of Civil engineering, it is possible to increase manual productivity by ten times (by employing even a rudimental and primitive Component-Based paradigm) such as: https://www.youtube.com/watch?v=AhLk7L1B_fE. But such component-based paradigm requires very expensive material or components.
That is, Civil engineering has a huge drawback – Cost or material for building all the components is 5 to 7 times more, which increases the total cost of each building 3 to 4 times (compared to existing method that uses cheaper material cement, bricks, concrete, sand and steel). The cost of material is about 80% of the cost of the building.
In case of software, over 90% of the cost is spent on manual effort for designing and building large code base of each software product, which includes design and development of code for each of the modules/parts used in the software product and more code for integrating the modules/parts to build the product. In fact, cost of material (i.e. code) to build each module is much cheaper in case of software products that are designed and built by employing real-Component-based paradigm.
The following three inventions are essential for achieving effective CBE-paradigm for software and we already secured patents for the 3 inventions: http://real-software-components.com/raju/pdfs/PatentedInventions.pdf
1. Simple and effective methodologies for partitioning each large software product in FIG-1 into multiple self-contained modules or components in FIG-2,
2. Inventions of missing technologies or tools necessary for creating and using self-contained modules or components in FIG-2 for building the product, and
3. Inventions of tools and mechanisms that can automate various tasks and activities to create, redesign, and manage communication code that is essential for allowing communication between the modules and components.
In summary, it is not hard to prove this simple rule: if an engineering discipline designs and builds large or complex products without using the Component-based paradigm, then manual productivity for designing and building each large or complex product can be increased ten-fold by transforming that discipline to the Component-based paradigm: http://real-software-components.com/raju/Briefs/BenifitsOfRealCBE.pdf.
Software engineering is most certainly not an exception to the above rule. But no one in the software community is willing to explore the possibility of transforming software engineering to the Component-based paradigm (as illustrated in FIG-2), since everyone has falsely concluded that software engineering already employs components and the Component-based paradigm.
Therefore, I am forced to expose this unknown heretical fact that, software engineering is not employing Component-based paradigm: http://real-software-components.com/raju/Briefs/InventionBriefly.pdf
I have a set of de novo designed miRNA sequences and I would like to present the different elements related to the secondary structures. Would you please let me know the suitable tool to reach my target?
I am currently writing a manuscript of a cohort study. In order to strengthen the discussion and better provide the evidence, I would like to incorporate the current cohort study data into a meta-analysis. This would be possible because the demographic of my present study subjects is different from any published studies. However, I wonder whether I should publish the cohort study first and then conduct the meta-analysis, or whether I can directly combine both designs in one article.
I have been working on the mathematical modeling, design, and simulation of clarifier systems for wastewater treatment. The clarifier is envisaged to handle 40,000 litres of wastewater at a time. Which of circular or rectangular clarifier models and designs would you recommend for handling industrial wastewater?
I am working on a sensor design for the detection of a specific range of concentrations. There are two designs, a primary one and an optimized one. Both devices were tested, and the optimized sensor shows higher sensitivity, as its linear calibration slope is 20 times higher. After statistical analysis of both devices through ANOVA and post hoc pairwise analysis, it was established that both devices can distinguish every pair of tested concentrations with statistically significant differences. Now, I want to discuss how the higher sensitivity of the optimized model would be beneficial, but I am not sure what method would be suitable for this purpose, as both designs are capable of statistically differentiating between all the experimented concentrations. I appreciate your help and suggestions.
I am looking for literature on preferably ASD research focusing on behavioural interventions with single case designs. Specifically where children or adolescents receiving the intervention are self-reporting on their weekly or daily progress (whatever the frequency of the measurements).
For a research project we are investigating whether we would like our clinical group to assess their own progress, or to let parents/teachers/caregivers rate. All methods have their own downsides so we are simply seeing what the best options are.
Thank you in advance!
One of the critical tasks of a researcher is to choose an appropriate blueprint to interrogate the issues of a research study. The multi-methods design appears to be gaining currency in social science research in recent times. Since the choice of design is driven by a researcher's paradigmatic orientation, which of the research orientations are associated with the multi-methods design?
I am currently conducting a systematic review of interventions; in my inclusion criteria for study designs, I included single-group pre-post studies under quasi-experimental designs.
I conducted a meta-analysis on the studies that used an RCT or CCT design. However, is it possible to conduct a separate meta-analysis on the single-group pre-post studies and pool the effect sizes together? Or should I report the individual studies and their respective effect sizes?
Any help would be much appreciated!
I want to know what an architect's priorities are when designing an apartment building. Is it the appearance of the building, the proper siting of common areas, better returns on the developer's money, and so on?
The two designs seem to have some common characteristics, as both of them correlate variables. How, then, could one differentiate between them?
I am a research scholar, working on wearable antenna design for wireless and biomedical application. I have 2 questions.
1. Simulations have been done for some designs, obtaining a gain of less than 7 dB for all my designs (slotted patch with partial ground plane). In an on-body scenario, this simulated 7 dB may decrease, so I want to know typical gain values in a practical wearable environment (especially in WBANs).
2. In some cases, I am getting negative gain and directivity, or a negative value for radiation efficiency, with good S11 values (-30 to -40 dB). I am unable to find the cause. The substrate material is polyimide.
I have been reading this paper on how to analyze linear relationships using a repeated measure ANOVA:
I was wondering, though: once you establish a linear relationship across your categorical variables (A, B, C, D), how can you check whether the differences across conditions A vs. B vs. C vs. D are also significant?
I have been using pairwise t-tests (A vs. B; B vs. C; C vs. D), but is there a better test to look at this?
Just for completeness, I have been using "ols" from "statsmodels" to check for the linear relationship, and "pairwise_ttests" from "pingouin" to run post-hoc tests in Python.
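For completeness, the adjacent-pair comparisons can also be run with plain SciPy paired t-tests under a Bonferroni correction; a minimal sketch with simulated data standing in for the real measurements:

```python
import numpy as np
from scipy import stats

# Simulated repeated-measures data: 20 subjects x 4 conditions (A-D),
# with a built-in linear trend across conditions (illustrative only).
rng = np.random.default_rng(2)
conds = ["A", "B", "C", "D"]
data = rng.normal(0.0, 1.0, size=(20, 4)) + np.arange(4)

# Adjacent-pair paired t-tests with a Bonferroni correction (3 comparisons)
alpha_adj = 0.05 / 3
for i in range(3):
    t, p = stats.ttest_rel(data[:, i], data[:, i + 1])
    verdict = "significant" if p < alpha_adj else "ns"
    print(f"{conds[i]} vs {conds[i + 1]}: t = {t:.2f}, p = {p:.4f} ({verdict})")
```

A Holm correction is a less conservative alternative to Bonferroni here, and a single polynomial (linear) contrast is another way to test the trend itself rather than all adjacent pairs.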
As you know, the nature of the split-mouth design is paired. However, when we have both split-mouth RCTs and parallel RCTs:
1) Can we summarize and merge both study designs in one analysis?
2) Should we determine the amount of correlation for the split-mouth design?
3) Generally, how does one do a meta-analysis with split-mouth designs?
I am working with a small group of subjects (n=6), and I intend to employ multiple ABA single-subject designs for each of my subjects. I would appreciate it if someone could guide me about the statistical test/s that can be used to study the impact of the intervention.
Conditional logistic regression is used to manage sparse data due to confounders, using matching to improve the precision of estimation. Most statisticians and researchers recommend that I use Stata.
1. Can I conduct conditional logistic regression using SPSS? Why or why not?
2. What is the advantage of Stata over SPSS in analyzing matched case-control study designs (1 case : 1 control approach)?
Looking for materials (basic to advanced) on randomized Response Surface Methodology designs for optimization problems in engineering.
Please share your views: is it compulsory to design the basic patch, substrate, and ground of a fractal antenna as per the mathematical calculations for a microstrip antenna? There may be a lot of designs that cannot be made with those particular dimensions, so can we instead choose dimensions according to the requirements of the design, in order to achieve the required antenna performance at the required frequency of operation?
My hypothetical experiment is a pretest-posttest nonequivalent-control-group quasi-experimental design. I am testing the effectiveness of a math anxiety intervention between two middle school pre-algebra classes from the same school. I understand that internal validity is an issue with nonequivalent control group designs. What tests are suitable for determining whether the two groups are comparable at baseline? Also, what is the best approach to analyzing the data? If I am looking at mean scores for pretest and posttest, do I use ANOVA? I have read that this might not be the best choice.
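One commonly suggested alternative to an ANOVA on gain scores in this situation is ANCOVA, i.e. regressing the posttest on the pretest plus a group indicator, so the group coefficient is the pretest-adjusted treatment effect. A minimal sketch with entirely hypothetical simulated data (two intact classes with a small baseline difference and a true effect of about 5 points), using only numpy:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data for two intact classes (nonequivalent groups):
# pretest scores differ slightly at baseline; the intervention adds ~5 points.
n = 30
pre_ctrl = rng.normal(50, 10, n)
pre_treat = rng.normal(53, 10, n)                     # nonequivalent baseline
post_ctrl = 0.8 * pre_ctrl + rng.normal(10, 5, n)
post_treat = 0.8 * pre_treat + rng.normal(15, 5, n)   # +5 intervention effect

# ANCOVA as a regression: posttest ~ intercept + pretest + group,
# so the group coefficient estimates the pretest-adjusted treatment effect.
pre = np.concatenate([pre_ctrl, pre_treat])
post = np.concatenate([post_ctrl, post_treat])
group = np.concatenate([np.zeros(n), np.ones(n)])     # 0 = control, 1 = treatment
X = np.column_stack([np.ones(2 * n), pre, group])
coef, *_ = np.linalg.lstsq(X, post, rcond=None)

print(f"adjusted treatment effect: {coef[2]:.2f} points")
```

This is only a sketch of the adjustment idea, not a full analysis; with nonequivalent groups the usual caveats about ANCOVA on non-randomized data still apply.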
Does anybody know how many similarity judgments I need for each pair of stimuli in multidimensional scaling? For clarification: I am not talking about the number of comparisons. I know there is the option of reduced designs, but that is not what I mean. I mean how many subjects have to indicate their perceived similarity between two stimuli. Does anybody know a source that provides an answer to this question?
Hello to all,
while reading some scientific articles I came across experiments conducted with orthogonal designs. I think I have grasped their great potential, especially regarding the reduction in the number of samples.
It seems that many authors use these designs to rank the different factors (decided a priori) that influence a certain response variable. For example, I might want to evaluate how temperature (20, 25, 30 °C), soil type (clayey, sandy, silty), and fertilizer (A, B, C) influence the microbial respiration of the soil.
I have seen that many authors generate the design (and there are programs, like SPSS, that do it automatically) and then draw up a ranking of these factors (for example, temperature is the factor that least influences respiration, while fertilizer is the one with the greatest effect). Some then report that the differences are tested with ANOVA, but how is it possible to conduct such a test with this experimental design? I don't have the "classic" replicates, and this confuses me...
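To make the ranking step concrete, here is a small sketch using a standard L9 orthogonal array for the three hypothetical 3-level factors above. The response values are invented for illustration (fertilizer is simulated as the dominant factor). The ranking comes from the range ("delta") of level means per factor; in a saturated design like this, an ANOVA typically gets its error term by pooling the smallest effects rather than from classic replicates.

```python
import numpy as np

# L9 orthogonal array: 9 runs, three 3-level factors (levels coded 0, 1, 2).
# Every level of every factor appears in exactly 3 runs, and each pair of
# factor levels co-occurs exactly once, which is what makes effects separable.
L9 = np.array([
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])

# Hypothetical microbial respiration response for each of the 9 runs,
# constructed so that fertilizer (column 2) has the largest effect.
y = np.array([5.0, 6.1, 7.0, 6.2, 6.9, 5.1, 7.1, 4.9, 6.0])

factors = ["temperature", "soil type", "fertilizer"]
deltas = {}
for f, name in enumerate(factors):
    # Mean response at each level of this factor (3 runs per level).
    level_means = np.array([y[L9[:, f] == lvl].mean() for lvl in range(3)])
    deltas[name] = level_means.max() - level_means.min()  # Taguchi-style delta
    print(f"{name}: level means = {np.round(level_means, 2)}, "
          f"delta = {deltas[name]:.2f}")
```

Because each level mean averages over all levels of the other factors, the deltas give the main-effect ranking directly, with no within-cell replication needed.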
Does anyone have experience or practical knowledge of using Fresnel lenses on photovoltaic panels? Are there techniques, designs, or applications where their use is advantageous? Discussion is welcome...
Switching frequency is usually a free parameter in power electronic converter designs. Some example criteria are:
1 - Resonance frequency or soft-switching criterion
2 - Efficiency criterion
3 - Volume and component sizing criterion
4 - Power quality criterion
5 - Control and stability criterion
If you have studied a research paper in this regard please let me know.
I have seen unit cell designs that only perform at normal incidence, while some designs perform over a wide range of incidence angles. What is the reason or explanation behind this? What makes them angularly stable?