Question

I have around 180 responses to 56 questions. Each respondent was asked to rate each question on a scale of -1 to 7. The scale follows Schwartz's (1992) theory, so I decided to keep it unchanged.
I used Principal Components extraction with oblique (Promax) rotation, and had to raise the maximum iterations for convergence from 25 to 29 to get the rotation to converge.
Looking at the Pattern Matrix table in SPSS, I noted some cross-loading between different factors/components: loadings greater than 0.3 in some instances, and sometimes two or more factors with similar values of around 0.5.
What do I do in this case? Do I remove such variables altogether to see how this affects the results? My initial attempt showed little change, and the number of factors remained the same.
To clarify: as I have 56 variables, I am trying to reduce them to underlying constructs to better understand my results. Using factor analysis, I got 15 factors with 66.2% cumulative variance.

2nd Jul, 2020
Simon Dang
Griffith University
None of the responses above, or others found online, appears to be backed by scientific references. This response therefore aims to equip readers with guidance from the textbook of a leading authority in multivariate statistics, Joseph F. Hair, Jr.
First, note that the term cross-loading stems from the situation in which one variable has moderate-sized loadings on several factors, all of them significant, which makes interpretation more arduous.
A loading is considered significant (i.e., above a certain threshold) depending on the sample size needed for significance [1], as follows:
-----------------------------
Loading   Sample size needed
.30       350
.35       250
.40       200
.45       150
.50       120
.55       100
.60        85
.65        70
.70        60
.75        50
-----------------------------
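The table can be read in the other direction as well: given your sample size, look up the smallest loading you should treat as significant. A minimal Python sketch of that lookup (the function name is my own, not from Hair's book):

```python
# Hair et al.'s (2009) guideline: minimum sample size needed for a
# factor loading of a given magnitude to be considered significant.
HAIR_THRESHOLDS = [  # (loading, sample size needed)
    (0.30, 350), (0.35, 250), (0.40, 200), (0.45, 150), (0.50, 120),
    (0.55, 100), (0.60, 85), (0.65, 70), (0.70, 60), (0.75, 50),
]

def min_significant_loading(n):
    """Smallest loading treated as significant for a sample of size n.

    Returns None if n is below 50, where the guideline gives no value.
    """
    for loading, needed_n in HAIR_THRESHOLDS:
        if n >= needed_n:
            return loading
    return None
```

For the 180 respondents in the question above, this returns 0.45, so loadings of about .45 and higher would count as significant.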
When a variable has more than one significant loading (given the sample size), it is termed a cross-loading. This makes it troublesome to label all the factors that share the variable, and hard to keep those factors distinct and representative of separate concepts. The ultimate goal is to reduce the number of significant loadings in each row of the factor matrix, i.e., to associate each variable with only one factor. One solution is to try different rotation methods to eliminate the cross-loadings and thus define a simpler structure. If a cross-loading persists, the variable becomes a candidate for deletion. Another approach is to examine each variable's communality to assess whether the variables meet acceptable levels of explanation: variables with communalities below .50 are considered to lack sufficient explanation.
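Both checks in the paragraph above (counting significant loadings per row, and screening communalities) are mechanical once you have the loading matrix. A minimal sketch with NumPy, assuming an orthogonal solution where communality equals the row sum of squared loadings (for an oblique rotation such as Promax, read the communalities from SPSS instead):

```python
import numpy as np

def diagnose(loadings, threshold=0.45, comm_cutoff=0.50):
    """Flag cross-loading rows and low-communality rows of a
    variables-by-factors loading matrix (orthogonal case assumed)."""
    L = np.asarray(loadings, dtype=float)
    significant = np.abs(L) >= threshold            # which loadings count
    cross = np.flatnonzero(significant.sum(axis=1) > 1)
    communality = (L ** 2).sum(axis=1)              # row sum of squares
    low_comm = np.flatnonzero(communality < comm_cutoff)
    return cross.tolist(), low_comm.tolist()

# Hypothetical example: variable 0 loads cleanly, variable 1
# cross-loads, variables 1 and 2 have low communalities.
L = [[0.80, 0.10],
     [0.50, 0.48],
     [0.40, 0.20]]
cross, low = diagnose(L)
```

Here `cross` is `[1]` and `low` is `[1, 2]`, so variable 1 fails both screens and would be the strongest deletion candidate.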
RESPECIFY THE MODEL IF NEEDED
What if we should not eliminate a variable based on rigid statistics, because of the true meaning that the variable carries? Problems include: (1) a variable has no significant loadings; (2) even with a significant loading, a variable's communality is deemed too low; (3) a variable has a cross-loading. In these cases, researchers can take any combination of the following remedies:
+ Ignore the problematic variables and interpret the solution as is, while noting that the variables in question are poorly represented in the factor solution.
+ Consider deletion, depending on the variable's overall contribution to the research as well as its communality. If the variable is of minor importance to the study's objective and also has an unacceptable communality value, delete it and derive a new factor solution with it omitted.
+ Employ an alternative rotation method: for example, an oblique method if only orthogonal rotation had been used.
+ Decrease or increase the number of factors retained, to see whether a smaller or larger factor structure solves the problem.
+ Modify the type of factor model used (component versus common factor), to assess whether varying the type of variance considered affects the factor structure.
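Several of these remedies can also be tried quickly outside SPSS. As a minimal sketch (my own illustration, not from Hair's book), assuming scikit-learn is available — note that its FactorAnalysis offers only orthogonal rotations such as varimax, so the oblique-rotation remedy would need another tool — one can refit with different numbers of retained factors and count how many variables end up with a single salient loading:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(180, 12))   # stand-in for 180 respondents x 12 items

def count_clean_variables(loadings, threshold=0.45):
    """Number of variables with exactly one significant loading."""
    sig = (np.abs(loadings) >= threshold).sum(axis=1)
    return int((sig == 1).sum())

# Remedy sketch: vary the number of retained factors and compare
# how simple the resulting structure is.
for k in (3, 4, 5):
    fa = FactorAnalysis(n_components=k, rotation="varimax").fit(X)
    loadings = fa.components_.T          # variables x factors
    print(k, count_clean_variables(loadings))
```

On real data you would pick the factor count that maximizes clean loadings while remaining conceptually defensible, rather than on the statistic alone.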
Note:
No matter which options are chosen, the ultimate objective is to obtain a factor structure with both empirical and conceptual support. As we can see, many tactics can be used to improve the structure, but the ultimate responsibility rests with the researcher and the conceptual foundation underlying the analysis. Indeed, some empirical studies chose to preserve cross-loadings to support the argument that a certain variable genuinely affects several factors [2]. So, ultimately, it is your call whether or not to remove a variable, based on your empirical and conceptual knowledge and experience.
References:
[1] Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2009). Multivariate Data Analysis (7th ed.). Pearson Prentice Hall.
[2] Le, T. C., & Cheong, F. (2010). Perceptions of risk and risk management in Vietnamese Catfish farming: An empirical study. Aquaculture Economics & Management, 14(4), 282-314. https://doi.org/10.1080/13657305.2010.526019

19th Mar, 2017
Christian Vollmer
Pädagogische Hochschule Tirol
If you used the same scale as Schwartz (1992) then maybe you can confirm the latent model that was found by Schwartz (1992) with your data with confirmatory factor analysis.
If you don't get a good model fit for the detailed model, then try to fit the higher-order factors. I am assuming you are talking about the Human Values developed by S. H. Schwartz?
20th Mar, 2017
Wali Ur Rehman
University of Essex
You need to check the Communalities table after looking at the Pattern Matrix. If you see any item cross-loading, examine those items; if the communality is less than 0.5, try removing them from further analysis. Remember that deleting items should not affect the factor theoretically — it should be theoretically justified. If an item is important and its deletion could affect the content validity of the construct, you may need to retain it.
20th Mar, 2017
Loughborough University
Many thanks Christian. I considered doing CFA but do not appear to meet its minimum sample-size criteria. However, I wondered what you meant by trying to fit a higher-order factor? I am indeed talking about the Human Values.
20th Mar, 2017
Loughborough University
Thank you Wali. I checked the Communalities table and all the figures are considerably larger than 0.5. What would you try next?
20th Mar, 2017
Wali Ur Rehman
University of Essex
Now try deleting the cross-loaded items. Repeat the process until you get the maximum number of items loading on only one factor at a time.
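The iterative delete-and-refit procedure described above can be sketched as a loop. This is only the bookkeeping side (the function name and the drop rule are my own choices): in practice you would re-run the factor analysis after each deletion rather than merely dropping rows from the old loading matrix.

```python
import numpy as np

def prune_cross_loaders(loadings, names, threshold=0.45):
    """Iteratively drop the most ambiguous cross-loading variable until
    every remaining variable has at most one significant loading."""
    L = np.asarray(loadings, dtype=float)
    names = list(names)
    while True:
        sig = (np.abs(L) >= threshold).sum(axis=1)
        offenders = np.flatnonzero(sig > 1)
        if offenders.size == 0:
            return L, names
        # Drop the offender whose two largest loadings are closest
        # together, i.e. the most ambiguous item.
        def ambiguity(i):
            top2 = np.sort(np.abs(L[i]))[-2:]
            return top2[1] - top2[0]
        worst = min(offenders, key=ambiguity)
        L = np.delete(L, worst, axis=0)
        names.pop(worst)

# Hypothetical 3-item, 2-factor pattern: item "b" cross-loads.
pattern = [[0.7, 0.1],
           [0.5, 0.5],
           [0.1, 0.8]]
kept_L, kept_names = prune_cross_loaders(pattern, ["a", "b", "c"])
```

Here `kept_names` is `["a", "c"]`: item "b" is removed, and each surviving item loads on only one factor.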
21st Jun, 2019
Aurelius arlitha Chandra
Universitas Sebelas Maret
Hello, my friends. I have one question.
I am currently doing research with factor analysis methods using SPSS.
What I want to ask is: when viewing the "Rotated Component Matrix", one variable has a value below 0.5. How do I handle this situation?
Thank you.