Literature DB >> 33870039

The Use of Research Methods in Psychological Research: A Systematised Review.

Salomé Elizabeth Scholtz1, Werner de Klerk1, Leon T de Beer2.   

Abstract

Research methods play an imperative role in research quality and in educating young researchers; however, how they are applied is unclear, which can be detrimental to the field of psychology. This systematised review therefore aimed to determine which research methods are being used, how they are being used, and for which topics in the field. Our review of 999 articles from five journals over a period of 5 years indicated that psychology research is conducted across 10 topics, predominantly via quantitative research methods. Of these 10 topics, social psychology was the most popular. The remainder of the methodology used is described. We also found that articles lacked rigour and transparency in the methodology used, which has implications for replicability. In conclusion, this article provides an overview of all reported methodologies used in a sample of psychology journals. It highlights the popularity and application of methods and designs throughout the article sample, as well as an unexpected lack of rigour with regard to most aspects of methodology. Possible sample bias should be considered when interpreting the results of this study. It is recommended that future research utilise these results to determine their possible impact on the field of psychology as a science and to further investigate the use of research methods. The results should prompt future research into a lack of rigour and its implications for replication, the use of certain methods above others, publication bias, and the choice of sampling method.
Copyright © 2020 Scholtz, de Klerk and de Beer.

Keywords:  psychological research; research approach; research designs; research methods; research topic; research trends; systematised review

Year:  2020        PMID: 33870039      PMCID: PMC8028395          DOI: 10.3389/frma.2020.00001

Source DB:  PubMed          Journal:  Front Res Metr Anal        ISSN: 2504-0537


Introduction

Psychology is an ever-growing and popular field (Gough and Lyons, 2016; Clay, 2017). Due to this growth and the need for science-based research on which to base health decisions (Perestelo-Pérez, 2013), the use of research methods in the broad field of psychology is an essential point of investigation (Stangor, 2011; Aanstoos, 2014). Research methods are viewed as important tools used by researchers to collect data (Nieuwenhuis, 2016) and include the following: quantitative, qualitative, mixed-method and multi-method approaches (Maree, 2016). Additionally, researchers also employ various types of literature reviews to address research questions (Grant and Booth, 2009). According to the literature, which research method is used, and why, is complex, as it depends on various factors that may include paradigm (O'Neil and Koekemoer, 2016), research question (Grix, 2002), or the skill and exposure of the researcher (Nind et al., 2015). How these research methods are employed is also difficult to discern, as research methods are often depicted as having fixed boundaries that are continuously crossed in research (Johnson et al., 2001; Sandelowski, 2011). Examples of this crossing include adding quantitative aspects to qualitative studies (Sandelowski et al., 2009), or stating that a study used a mixed-method design without the study having any characteristics of this design (Truscott et al., 2010). The inappropriate use of research methods affects how students and researchers improve and utilise their research skills (Scott Jones and Goldring, 2015), how theories are developed (Ngulube, 2013), and the credibility of research results (Levitt et al., 2017). This, in turn, can be detrimental to the field (Nind et al., 2015), journal publication (Ketchen et al., 2008; Ezeh et al., 2010), and attempts to address public social issues through psychological research (Dweck, 2017). 
This is especially important given the now well-known replication crisis the field is facing (Earp and Trafimow, 2015; Hengartner, 2018). Due to this lack of clarity on method use and the potential impact of inept use of research methods, the aim of this study was to explore the use of research methods in the field of psychology through a review of journal publications. Chaichanasakul et al. (2011) identify reviewing articles as an opportunity to examine the development, growth and progress of a research area and the overall quality of a journal. Studies such as those of Lee et al. (1999) and Bluhm et al. (2011), which reviewed qualitative methods, have attempted to synthesise the use of research methods and indicated the growth of qualitative research in American and European journals. Research has also focused on the use of research methods in specific sub-disciplines of psychology; for example, in Industrial and Organisational psychology, Coetzee and Van Zyl (2014) found that South African publications tend to consist of cross-sectional quantitative research methods, with longitudinal studies underrepresented. In a similar study, O'Neil and Koekemoer (2016) found that qualitative studies made up 21% of the articles published from 1995 to 2015. Other methods, such as mixed-methods research in health psychology, have also reportedly been growing in popularity (O'Cathain, 2009). A broad overview of the use of research methods in the field of psychology as a whole is, however, not available in the literature. Therefore, our research focused on answering what research methods are being used, how these methods are being used, and for what topics in practice (i.e., journal publications), in order to provide a general perspective of method use in psychology publications. 
We synthesised the collected data into the following format: research topic [areas of scientific discourse in a field or the current needs of a population (Bittermann and Fischer, 2018)], method [data-gathering tools (Nieuwenhuis, 2016)], sampling [elements chosen from a population to partake in research (Ritchie et al., 2009)], data collection [techniques and research strategy (Maree, 2016)], and data analysis [discovering information by examining bodies of data (Ktepi, 2016)]. A systematised review of recent articles (2013 to 2017) collected from five different journals in the field of psychological research was conducted.

Methods

Design

Grant and Booth (2009) describe systematised reviews as the review of choice for post-graduate studies, which is employed using some elements of a systematic review and seldom more than one or two databases to catalogue studies after a comprehensive literature search. The aspects used in this systematised review that are similar to that of a systematic review were a full search within the chosen database and data produced in tabular form (Grant and Booth, 2009).

Sampling

Sample sizes and timelines vary in systematised reviews (see Lowe and Moore, 2014; Pericall and Taylor, 2014; Barr-Walker, 2017). With no clear parameters identified in the literature (see Grant and Booth, 2009), the sample size of this study was determined by the purpose of the sample (Strydom, 2011) and by time and cost constraints (Maree and Pietersen, 2016). Thus, a non-probability purposive sample (Ritchie et al., 2009) of the top five psychology journals from 2013 to 2017 was included in this research study. According to Lee (2015), the American Psychological Association (APA) recommends using the most up-to-date sources for data collection, with consideration of the context of the research study. As this research study focused on the most recent trends in research methods used in the broad field of psychology, the identified time frame was deemed appropriate. Psychology journals were only included if they formed part of the top five English journals in the miscellaneous psychology domain of the Scimago Journal and Country Rank (Scimago Journal & Country Rank, 2017). The Scimago Journal and Country Rank provides a yearly updated list of publicly accessible journal and country-specific indicators derived from the Scopus® database (Scopus, 2017b) by means of the Scimago Journal Rank (SJR) indicator, developed by Scimago from the Google PageRank™ algorithm (Scimago Journal & Country Rank, 2017). Scopus is the largest global database of abstracts and citations from peer-reviewed journals (Scopus, 2017a). The Scimago Journal and Country Rank list was developed to allow researchers to assess scientific domains, compare country rankings, and compare and analyse journals (Scimago Journal & Country Rank, 2017), which supported the aim of this research study. Additionally, the goals of the journals had to focus on topics in psychology in general, with no preference for specific research methods, and had to offer full-text access to articles. 
The following list of top five journals in 2018 fell within the abovementioned inclusion criteria: (1) Australian Journal of Psychology, (2) British Journal of Psychology, (3) Europe's Journal of Psychology, (4) International Journal of Psychology and, lastly, (5) Journal of Psychology Applied and Interdisciplinary. Journals were excluded from this systematised review if no full-text versions of their articles were available, if journals explicitly stated a publication preference for certain research methods, or if the journal only published articles in a specific discipline of psychological research (for example, industrial psychology, clinical psychology, etc.).

Procedure

The researchers followed a procedure (see Figure 1) adapted from that of Ferreira et al. (2016) for systematised reviews. Data collection and categorisation commenced on 4 December 2017 and continued until 30 June 2019. All the data was systematically collected and coded manually (Grant and Booth, 2009) with an independent person acting as co-coder. Codes of interest included the research topic, method used, the design used, sampling method, and methodology (the method used for data collection and data analysis). These codes were derived from the wording in each article. Themes were created based on the derived codes and checked by the co-coder. Lastly, these themes were catalogued into a table as per the systematised review design.
Figure 1

Systematised review procedure.

Rigour

According to Johnston et al. (2019), “literature screening, selection, and data extraction/analyses” (p. 7) are specifically tailored to the aim of a review. Therefore, the steps followed in a systematic review must be reported in a comprehensive and transparent manner. The chosen systematised design adhered to the rigour expected from systematic reviews with regard to a full search and data produced in tabular form (Grant and Booth, 2009). The rigorous application of the systematic review is therefore discussed in relation to these two elements. Firstly, to ensure a comprehensive search, this research study promoted review transparency by following a clear protocol outlined according to each review stage before collecting data (Johnston et al., 2019). This protocol was similar to that of Ferreira et al. (2016) and approved by three research committees/stakeholders and the researchers (Johnston et al., 2019). The eligibility criteria for article inclusion were based on the research question and clearly stated, and the process of inclusion was recorded on an electronic spreadsheet to create an evidence trail (Bandara et al., 2015; Johnston et al., 2019). Microsoft Excel spreadsheets are a popular tool for review studies and can increase the rigour of the review process (Bandara et al., 2015). Screening for appropriate articles for inclusion forms an integral part of a systematic review process (Johnston et al., 2019). This step was applied to two aspects of this research study: the choice of eligible journals and of articles to be included. Suitable journals were selected by the first author and reviewed by the second and third authors. Initially, all articles from the chosen journals were included. Then, by process of elimination, those irrelevant to the research aim (e.g., interview articles or discussions) were excluded. 
To ensure rigorous data extraction, data was first extracted by one reviewer, and an independent person verified the results for completeness and accuracy (Johnston et al., 2019). The research question served as a guide for efficient, organised data extraction (Johnston et al., 2019). Data was categorised according to the codes of interest, along with article identifiers for audit trails, such as the authors, title, and aims of articles. The categorised data was based on the aim of the review (Johnston et al., 2019) and synthesised in tabular form under methods used, how these methods were used, and for what topics in the field of psychology.

Results

The initial search produced a total of 1,145 articles from the 5 journals identified. Inclusion and exclusion criteria resulted in a final sample of 999 articles (Figure 2). Articles were co-coded into 84 codes, from which 10 themes were derived (Table 1).
Figure 2

Journal article frequency.

Table 1

Codes used to form themes (research topics).

Theme | Code amount | Codes
Social Psychology | 31 | Aggression SP, Attitude SP, Belief SP, Child abuse SP, Conflict SP, Culture SP, Discrimination SP, Economic, Family illness, Family, Group, Help, Immigration, Intergeneration, Judgement, Law, Leadership, Marriage SP, Media, Optimism, Organisational and Social justice, Parenting SP, Politics, Prejudice, Relationships, Religion, Romantic Relationships SP, Sex and attraction, Stereotype, Violence, Work
Experimental Psychology | 17 | Anxiety, stress and PTSD, Coping, Depression, Emotion, Empathy, Facial research, Fear and threat, Happiness, Humor, Mindfulness, Mortality, Motivation and Achievement, Perception, Rumination, Self, Self-efficacy
Cognitive Psychology | 12 | Attention, Cognition, Decision making, Impulse, Intelligence, Language, Math, Memory, Mental, Number, Problem solving, Reading
Health Psychology | 7 | Addiction, Body, Burnout, Health, Illness (Health Psychology), Sleep (Health Psychology), Suicide and Self-harm
Physiological Psychology | 6 | Gender, Health (Physiological psychology), Illness (Physiological psychology), Mood disorders, Sleep (Physiological psychology), Visual research
Developmental Psychology | 3 | Attachment, Development, Old age
Personality | 3 | Machiavellian, Narcissism, Personality
Psychological Psychology | 3 | Programme, Psychology practice, Theory
Education and Learning | 1 | Education and Learning
Psychometrics | 1 | Measure
Code Total | 84 |
These 10 themes represent the topic section of our research question (Figure 3). All these topics, except for the final one (psychological practice), were found to concur with the research areas in psychology as identified by Weiten (2010). These research areas were chosen to represent the derived codes as they provided broad definitions that allowed for clear, concise categorisation of the vast amount of data. Article codes were categorised under particular themes/topics if they adhered to the research area definitions created by Weiten (2010). It is important to note that these areas of research do not refer to specific disciplines in psychology, such as industrial psychology, but to broader fields that may encompass sub-interests of these disciplines.
Figure 3

Topic frequency (international sample).

In the case of developmental psychology, researchers conduct research into human development from childhood to old age. Social psychology includes research on behaviour governed by social drivers. Researchers in the field of educational psychology study how people learn and the best way to teach them. Health psychology aims to determine the effect of psychological factors on physiological health. Physiological psychology, on the other hand, looks at the influence of physiological aspects on behaviour. Experimental psychology, while not the only theme that uses experimental research, focuses on the traditional core topics of psychology (for example, sensation). Cognitive psychology studies the higher mental processes. Psychometrics is concerned with measuring capacity or behaviour. Personality research aims to assess and describe consistency in human behaviour (Weiten, 2010). The final theme of psychological practice refers to the experiences, techniques, and interventions employed by practitioners, researchers, and academics in the field of psychology. Articles under these themes were further subdivided into methodologies: method, sampling, design, data collection, and data analysis. The categorisation was based on information stated in the articles and not inferred by the researchers. Data were compiled into two sets of results presented in this article. The first set addresses the aim of this study from the perspective of the topics identified. The second set of results represents a broad overview of the results from the perspective of the methodology employed. The second set of results is discussed in this article, while the first set is presented in table format. The discussion thus provides a broad overview of method use in psychology (across all themes), while the table format provides readers with in-depth insight into the methods used in the individual themes identified. 
We believe that presenting the data from both perspectives allows readers a broad understanding of the results. Due to the large amount of information that made up our results, we followed Cichocka and Jost (2014) in simplifying our results. Please note that the numbers indicated in the tables in terms of methodology differ from the total number of articles, as some articles employed more than one method/sampling technique/design/data collection method/data analysis in their studies. What follows are the results for what methods are used, how these methods are used, and which topics in psychology they are applied to. Percentages are reported to the second decimal in order to highlight small differences in the occurrence of methodology. Firstly, with regard to the research methods used, our results show that researchers are more likely to use quantitative research methods (90.22%) than all other research methods. Qualitative research was the second most common research method but made up only about 4.79% of the general method usage. Reviews occurred almost as often as qualitative studies (3.91%), as the third most popular method. Mixed-methods research studies (0.98%) occurred across most themes, whereas multi-method research was indicated in only one study and amounted to 0.10% of the methods identified. The specific use of each method in the topics identified is shown in Table 2 and Figure 4.
Table 2

Research methods in psychology.

Research Method | Social Psychology | Cognitive Psychology | Psychometrics | Experimental Psychology | Physiological Psychology | Health Psychology | Developmental Psychology | Psychological Practice | Personality | Education and Learning
Quantitative | 401 | 162 | 69 | 60 | 52 | 52 | 48 | 28 | 38 | 13
Qualitative | 28 | 4 | 1 | 0 | 5 | 2 | 3 | 5 | 0 | 1
Review | 11 | 5 | 2 | 0 | 3 | 4 | 1 | 13 | 0 | 1
Mixed Methods | 7 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0
Multi-method | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Total | 447 | 171 | 72 | 60 | 61 | 58 | 53 | 47 | 39 | 15
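As a quick sanity check, the overall method shares quoted in the text can be recomputed from the per-topic counts in Table 2. This is a minimal sketch; the count lists below are transcribed from the table, one list per research method across the ten topics:

```python
# Recompute the overall method shares reported in the text from the
# per-topic counts in Table 2 (one list per research method, ten topics each).
method_counts = {
    "Quantitative": [401, 162, 69, 60, 52, 52, 48, 28, 38, 13],
    "Qualitative": [28, 4, 1, 0, 5, 2, 3, 5, 0, 1],
    "Review": [11, 5, 2, 0, 3, 4, 1, 13, 0, 1],
    "Mixed Methods": [7, 0, 0, 0, 1, 0, 1, 1, 0, 0],
    "Multi-method": [0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
}
# 1,023 method instances in total (some articles used more than one method).
grand_total = sum(sum(counts) for counts in method_counts.values())
shares = {m: round(100 * sum(c) / grand_total, 2)
          for m, c in method_counts.items()}
print(grand_total, shares)
```

This reproduces the percentages quoted above: 90.22% quantitative, 4.79% qualitative, 3.91% review, 0.98% mixed-methods, and 0.10% multi-method (the last rounding to 0.1).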
Figure 4

Research method frequency in topics.

Secondly, in the case of how these research methods are employed, our study indicated the following. Sampling: 78.34% of the studies in the collected articles did not specify a sampling method. From the remainder of the studies, 13 types of sampling methods were identified. These sampling methods included the broad categorisation of a sample as, for example, a probability or non-probability sample. General samples of convenience were the methods most likely to be applied (10.34%), followed by random sampling (3.51%), snowball sampling (2.73%), and purposive (1.37%) and cluster sampling (1.27%). The remainder of the sampling methods occurred to a more limited extent (0–1.0%). See Table 3 and Figure 5 for the sampling methods employed in each topic.
Table 3

Sampling use in the field of psychology.

Sampling Method | Social Psychology | Cognitive Psychology | Psychometrics | Experimental Psychology | Physiological Psychology | Health Psychology | Developmental Psychology | Psychological Practice | Personality | Education and Learning
Not stated | 331 | 153 | 45 | 57 | 49 | 43 | 43 | 38 | 31 | 14
Convenience sampling | 55 | 8 | 10 | 1 | 6 | 8 | 9 | 2 | 6 | 1
Random sampling | 15 | 3 | 9 | 1 | 2 | 2 | 0 | 2 | 1 | 1
Snowball sampling | 14 | 4 | 4 | 1 | 2 | 0 | 0 | 3 | 0 | 0
Purposive sampling | 6 | 0 | 2 | 0 | 0 | 2 | 0 | 3 | 1 | 0
Cluster sampling | 8 | 1 | 2 | 0 | 0 | 2 | 0 | 0 | 0 | 0
Stratified sampling | 4 | 1 | 2 | 0 | 1 | 1 | 0 | 0 | 0 | 0
Non-probability sampling | 4 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Probability sampling | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Quota sampling | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Criterion sampling | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Self-selection sampling | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Unsystematic sampling | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Total | 443 | 172 | 76 | 60 | 60 | 58 | 52 | 48 | 40 | 16
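The sampling percentages quoted in the text can likewise be reproduced from Table 3. A minimal sketch, using the per-topic counts transcribed from the table for the five named sampling methods:

```python
# Recompute the sampling-method percentages quoted in the text from Table 3.
# Each list holds the ten per-topic counts for one sampling method.
sampling_counts = {
    "Convenience": [55, 8, 10, 1, 6, 8, 9, 2, 6, 1],
    "Random": [15, 3, 9, 1, 2, 2, 0, 2, 1, 1],
    "Snowball": [14, 4, 4, 1, 2, 0, 0, 3, 0, 0],
    "Purposive": [6, 0, 2, 0, 0, 2, 0, 3, 1, 0],
    "Cluster": [8, 1, 2, 0, 0, 2, 0, 0, 0, 0],
}
overall = 1025  # sum of the "Total" row of Table 3
shares = {m: round(100 * sum(c) / overall, 2)
          for m, c in sampling_counts.items()}
print(shares)
```

This yields 10.34% (convenience), 3.51% (random), 2.73% (snowball), 1.37% (purposive), and 1.27% (cluster), matching the figures reported above.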
Figure 5

Sampling method frequency in topics.

Designs were categorised based on the articles' statement thereof. It is therefore important to note that, in the case of quantitative studies, non-experimental designs (25.55%) were often indicated due to a lack of experiments and any other indication of design, which, according to Laher (2016), is a reasonable categorisation. Non-experimental designs should thus be compared with experimental designs only in the description of data, as they could include the use of correlational/cross-sectional designs that were not overtly stated by the authors. For the remainder of the research methods, “not stated” (7.12%) was assigned to articles without design types indicated. Of the 36 identified designs, the most popular were experimental (25.64%) and cross-sectional (23.17%) designs, which concurred with the high number of quantitative studies. Longitudinal studies (3.80%), the third most popular design, were used in both quantitative and qualitative studies. Qualitative designs consisted of ethnography (0.38%), interpretative phenomenological designs/phenomenology (0.28%), as well as narrative designs (0.28%). Studies that employed the review method were mostly categorised as “not stated,” with the most often stated review design being the systematic review (0.57%). The few mixed-method studies employed exploratory, explanatory (0.09%), and concurrent designs (0.19%), with some studies referring to separate designs for the qualitative and quantitative methods. The one study that identified itself as a multi-method study used a longitudinal design. Please see how these designs were employed in each specific topic in Table 4 and Figure 6.
Table 4

Design use in the field of psychology.

Research Design | Social Psychology | Cognitive Psychology | Psychometrics | Experimental Psychology | Physiological Psychology | Health Psychology | Developmental Psychology | Psychological Practice | Personality | Education and Learning
Experimental design | 82 | 82 | 3 | 60 | 10 | 12 | 8 | 6 | 4 | 3
Non-experimental design | 115 | 30 | 51 | 0 | 13 | 17 | 13 | 13 | 14 | 3
Cross-sectional design | 123 | 31 | 12 | 1 | 19 | 17 | 21 | 5 | 13 | 2
Correlational design | 56 | 12 | 3 | 0 | 10 | 2 | 2 | 0 | 4 | 2
Not stated | 37 | 7 | 3 | 0 | 4 | 2 | 4 | 14 | 1 | 3
Longitudinal design | 21 | 6 | 2 | 1 | 1 | 2 | 2 | 0 | 2 | 3
Quasi-experimental design | 4 | 1 | 0 | 0 | 0 | 0 | 2 | 1 | 0 | 0
Systematic review | 3 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0
Cross-cultural design | 3 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0
Descriptive design | 2 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0
Ethnography | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Literature review | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0
Interpretative Phenomenological Analysis (IPA) | 2 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0
Narrative design | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0
Case-control research design | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0
Concurrent data collection design | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0
Grounded Theory | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0
Narrative review | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0
Auto-ethnography | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Case series evaluation | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Case study | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Comprehensive review | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Descriptive-inferential | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Explanatory sequential design | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Exploratory mixed-method | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0
Grounded ethnographic design | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Historical cohort design | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Historical research | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Interpretivist approach | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Meta-review | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Prospective design | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Qualitative review | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Qualitative systematic review | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0
Short-term prospective design | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Total | 461 | 175 | 74 | 63 | 63 | 58 | 56 | 48 | 39 | 16
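The design percentages quoted in the text can be checked against the row totals of Table 4 (each total below is a design's counts summed across the ten topic columns; a minimal sketch):

```python
# Recompute the design percentages quoted in the text from the row totals
# of Table 4 (summed across the ten topic columns).
design_totals = {
    "Experimental": 270,
    "Non-experimental": 269,
    "Cross-sectional": 244,
    "Not stated": 75,
    "Longitudinal": 40,
}
overall = 1053  # sum of the "Total" row of Table 4
shares = {d: round(100 * t / overall, 2) for d, t in design_totals.items()}
print(shares)
```

This reproduces the 25.64% (experimental), 25.55% (non-experimental), 23.17% (cross-sectional), 7.12% (not stated), and 3.80% (longitudinal) figures reported above, the last rounding to 3.8.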
Figure 6

Design frequency in topics.

Data collection and analysis: data collection included 30 methods, with the method most often employed being questionnaires (57.84%). The experimental task (16.56%) was the second most preferred collection method and included established or unique tasks designed by the researchers. Cognitive ability tests (6.84%) were also regularly used, along with various forms of interviewing (7.66%). Table 5 and Figure 7 represent data collection use in the various topics. Data analysis consisted of 3,857 occurrences of data analysis, categorised into approximately 188 data analysis techniques shown in Table 6 and Figures 1–7. Descriptive statistics were the most commonly used (23.49%), along with correlational analysis (17.19%). When using a qualitative method, researchers generally employed thematic analysis (0.52%) or different forms of analysis that led to coding and the creation of themes. Review studies presented few data analysis methods, with most studies categorising their results. Mixed-method and multi-method studies followed the analysis methods identified for the qualitative and quantitative studies included.
Table 5

Data collection in the field of psychology.

Data Collection | Social Psychology | Cognitive Psychology | Psychometrics | Experimental Psychology | Physiological Psychology | Health Psychology | Developmental Psychology | Psychological Practice | Personality | Education and Learning
Questionnaire | 364 | 113 | 65 | 42 | 40 | 51 | 39 | 24 | 37 | 11
Experimental task | 68 | 66 | 3 | 52 | 9 | 5 | 11 | 5 | 5 | 1
Cognitive ability test | 9 | 57 | 1 | 12 | 6 | 1 | 5 | 1 | 1 | 0
Physiological measure | 3 | 12 | 1 | 6 | 2 | 5 | 3 | 0 | 1 | 0
Interview | 19 | 3 | 0 | 1 | 3 | 0 | 2 | 2 | 0 | 1
Online scholarly literature | 10 | 4 | 0 | 0 | 3 | 4 | 0 | 10 | 0 | 0
Open-ended questions | 15 | 3 | 0 | 1 | 3 | 1 | 2 | 3 | 0 | 0
Semi-structured interviews | 10 | 3 | 0 | 0 | 3 | 2 | 1 | 2 | 0 | 1
Observation | 10 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0
Documents | 5 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 2 | 0
Focus group | 6 | 1 | 2 | 0 | 1 | 0 | 0 | 0 | 0 | 0
Not stated | 2 | 1 | 1 | 0 | 0 | 0 | 1 | 4 | 0 | 1
Public data | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 1
Drawing task | 0 | 2 | 0 | 1 | 1 | 1 | 0 | 2 | 0 | 0
In-depth interview | 6 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0
Structured interview | 0 | 2 | 0 | 0 | 1 | 2 | 0 | 0 | 1 | 0
Writing task | 1 | 0 | 0 | 0 | 4 | 0 | 0 | 1 | 0 | 0
Questionnaire interviews | 1 | 0 | 1 | 0 | 2 | 0 | 1 | 0 | 0 | 0
Non-experimental task | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Tests | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Group accounts | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Open-ended prompts | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Field notes | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Open-ended interview | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Qualitative questions | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1
Social media | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Assessment procedure | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
Closed-ended questions | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0
Open discussions | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Qualitative descriptions | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Total | 551 | 273 | 75 | 116 | 79 | 73 | 65 | 60 | 50 | 17
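The leading data-collection percentages quoted in the text can be checked against Table 5's row totals (each total below is a collection method's counts summed across the ten topic columns; a minimal sketch):

```python
# Recompute the leading data-collection percentages from the row totals of Table 5.
collection_totals = {
    "Questionnaire": 786,
    "Experimental task": 225,
    "Cognitive ability test": 93,
}
overall = 1359  # sum of the "Total" row of Table 5
shares = {k: round(100 * v / overall, 2) for k, v in collection_totals.items()}
print(shares)
```

This reproduces the 57.84% (questionnaires), 16.56% (experimental tasks), and 6.84% (cognitive ability tests) figures reported above.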
Figure 7

Data collection frequency in topics.

Table 6

Data analysis in the field of psychology.

Data AnalysisSocial PsychologyCognitive PsychologyPsychometricsExperimental PsychologyPhysiological PsychologyHealth PsychologyDevelopmental PsychologyPsychological PracticePersonalityEducation and Learning
Not stated5120011501
Actor-Partner Interdependence Model (APIM)4000000000
Analysis of Covariance (ANCOVA)17813421001
Analysis of Variance (ANOVA)112601629151715653
Auto-regressive path coefficients0010000000
Average variance extracted (AVE)1000010000
Bartholomew's classification system1000000000
Bayesian analysis3000100000
Bibliometric analysis1100000100
Binary logistic regression1100141000
Binary multilevel regression0001000000
Binomial and Bernoulli regression models2000000000
Binomial mixed effects model1000000000
Bivariate Correlations321030435111
Bivariate logistic correlations1000010000
Bootstrapping391623516121
Canonical correlations0000000020
Cartesian diagram1000000000
Case-wise diagnostics0100001000
Casual network analysis0001000000
Categorisation5200110400
Categorisation of responses2000000000
Category codes3100010000
Cattell's scree-test0010000000
Chi-square tests52201756118743
Classic Parallel Analysis (PA)0010010010
Cluster analysis7000111101
Coded15312111210
Cohen d effect size14521323101
Common method variance (CMV)5010000000
Comprehensive Meta-Analysis (CMA)0000000010
Confidence Interval (CI)2000010000
Confirmatory Factor Analysis (CFA)5713400247131
Content analysis9100210100
Convergent validity1000000000
Cook's distance0100100000
Correlated-trait-correlated-method minus one model1000000000
Correlational analysis2598544182731348338
Covariance matrix3010000000
Covariance modelling0110000000
Covariance structure analyses2000000000
Cronbach's alpha61141865108375
Cross-validation0020000001
Cross-lagged analyses1210001000
Dependent t-test1200110100
Descriptive statistics3241324349414336282910
Differentiated analysis0000001000
Discriminate analysis1020000001
Discursive psychology1000000000
Dominance analysis1000000000
Expectation maximisation2100000100
Exploratory data Analysis1100110000
Exploratory Factor Analysis (EFA)145240114040
Exploratory structural equation modelling (ESEM)0010000010
Factor analysis124160215020
Measurement invariance testing0000000000
Four-way mixed ANOVA0101000000
Frequency rate20142122200
Friedman test1000000000
Games-Howell post hoc2200010000
General linear model analysis1200001100
Greenhouse-Geisser correction2500001111
Grounded theory method0000000001
Grounded theory methodology using open and axial coding1000000000
Guttman split-half0010000000
Harman's one-factor test13200012000
Herman's criteria of experience categorisation0000000100
Hierarchical CFA (HCFA)0010000000
Hierarchical cluster analysis1000000000
Hierarchical Linear Modelling (HLM)762223767441
Huynh-Felt correction1000000000
Identified themes3000100000
Independent samples t-test38944483311
Inductive open coding1000000000
Inferential statistics2000001000
Interclass correlation3010000000
Internal consistency3120000000
Interpreted and defined0000100000
Interpretive Phenomenological Analysis (IPA)2100100000
Item fit analysis1050000000
K-means clustering0000000100
Kaiser-meyer-Olkin measure of sampling adequacy2080002020
Kendall's coefficients3100000000
Kolmogorov-Smirnov test1211220010
Lagged-effects multilevel modelling1100000000
Latent class differentiation (LCD)1000000000
Latent cluster analysis0000010000
[Table, continued: data-analysis methods reported in the article sample. The per-journal frequency counts and column totals were garbled in extraction and are omitted; the methods listed in this portion of the table are:]

Latent growth curve modelling (LGCM)
Latent means
Latent Profile Analysis (LPA)
Linear regressions
Linguistic Inquiry and Word Count
Listwise deletion method
Log-likelihood ratios
Logistic mixed-effects model
Logistic regression analyses
Loglinear model
Mahalanobis distances
Mann-Whitney U tests
Mauchly's test
Maximum likelihood method
Maximum-likelihood factor analysis with promax rotation
Measurement invariance testing
Mediation analysis
Meta-analysis
Microanalysis
Minimum significant difference (MSD) post hoc comparison
Mixed ANOVAs
Mixed linear model
Mixed-design ANCOVA
Mixed-effects multiple regression models
Moderated hierarchical regression model
Moderated regression analysis
Monte Carlo Markov chains
Multi-group analysis
Multidimensional Random Coefficient Multinomial Logit (MRCML)
Multidimensional scaling
Multiple-Group Confirmatory Factor Analysis (MGCFA)
Multilevel latent class analysis
Multilevel modelling
Multilevel Structural Equation Modelling (MSEM)
Multinomial logistic regression (MLR)
Multinomial regression analysis
Multiple Indicators Multiple Causes (MIMIC)
Multiple mediation analysis
Multiple regression
Multivariate analysis of covariance (MANCOVA)
Multivariate analysis of variance (MANOVA)
Multivariate hierarchical linear regression
Multivariate linear regression
Multivariate logistic regression analyses
Multivariate regressions
Nagelkerke's R square
Narrative analysis
Negative binomial regression with log link
Newman-Keuls
Nomological validity analysis
One-sample t-test
Ordinary least-squares regression (OLS)
Pairwise deletion method
Pairwise parameter comparison
Parametric analysis
Partial least squares regression (PLS)
Path analysis
Path-analytic model test
Phenomenological analysis
Polynomial regression analyses
Post-hoc Fisher LSD
Principal axis factoring
Principal component analysis (PCA)
Pseudo-panel regression
Quantitative content analysis
Receiver operating characteristic (ROC) curve analysis
Relative weight analysis
Repeated-measures analyses of variance (rANOVA)
Ryan-Einot-Gabriel-Welsch multiple F test
Satorra-Bentler scaled chi-square statistic
Scheffé's test
Sequential multiple mediation analysis
Shapiro-Wilk test
Sobel test
Squared multiple correlations
Squared semi-partial correlations (sr2)
Stepwise regression analysis
Structural Equation Modelling (SEM)
Structure analysis
Subsequent t-test
Systematic coding (Gemeinschaft-oriented)
Task analysis
Thematic analysis
Three (condition)-way ANOVA
Three-way hierarchical loglinear analysis
Tukey-Kramer corrections
Two-paired-sample t-test
Two-tailed related t-test
Unadjusted logistic regression analysis
Univariate generalized linear models (GLM)
Variance inflation factor (VIF)
Variance-covariance matrix
Wald test
Ward's hierarchical cluster method
Weighted least squares with corrections to means and variances (WLSMV)
Welch and Brown-Forsythe F-ratios
Wilcoxon signed-rank test
Wilks' Lambda
Word analysis
Word Association Analysis
z scores
[Tables: Data collection in the field of psychology; Data collection frequency in topics; Data analysis in the field of psychology.]

Results of the topics researched in psychology can be seen in the tables, as previously stated in this article. Notably, of the 10 topics, social psychology accounted for 43.54% of the studies, with cognitive psychology the second most popular research topic at 16.92%. Each of the remaining topics appeared in only 4.0–7.0% of the articles considered. A list of the 999 included articles is available under the section "View Articles" at https://methodgarden.xtrapolate.io/. This website was created by Scholtz et al. (2019) to visually present a research framework based on this article's results.
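The percentages above follow directly from the coding-and-tallying step the review describes. As a purely illustrative sketch (not the authors' actual tooling; the topic labels and the breakdown of the "other" category are hypothetical beyond the figures reported above), the frequencies can be computed from a coded article log like this:

```python
from collections import Counter

# Hypothetical coded log: one topic label per reviewed article.
# The 435 and 169 counts reproduce the reported 43.54% and 16.92% of 999;
# the remaining labels are collapsed into "other" for illustration.
coded_topics = (
    ["social psychology"] * 435
    + ["cognitive psychology"] * 169
    + ["developmental psychology"] * 70
    + ["other"] * 325
)

counts = Counter(coded_topics)
total = len(coded_topics)  # 999 articles in the review's sample
for topic, n in counts.most_common():
    print(f"{topic}: {n} ({100 * n / total:.2f}%)")
```

Running this prints each topic with its share of the sample, e.g. social psychology at 43.54%.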

Discussion

This systematised review categorised full-length articles from five international journals across a span of 5 years to provide insight into the use of research methods in the field of psychology. Results indicated what methods are used, how these methods are being used, and for what topics (why) in the included sample of articles. The results should be seen as providing insight into method use, not as a comprehensive representation of the field, given the limited sample. To our knowledge, this is the first research study to address this topic in this manner. Our discussion attempts to promote a productive way forward in terms of the key results for method use in psychology, especially in the field of academia (Holloway, 2008). With regard to the methods used, our data were consistent with the literature, revealing only common research methods (Grant and Booth, 2009; Maree, 2016) that varied in the degree to which they were employed. Quantitative research was found to be the most popular method, as indicated by the literature (Breen and Darlaston-Jones, 2010; Counsell and Harlow, 2017) and by previous studies in specific areas of psychology (see Coetzee and Van Zyl, 2014). Its long history as the first research method in the field of psychology (Leech et al., 2007), as well as researchers' current application of mathematical approaches in their studies (Toomela, 2010), might contribute to its popularity today. Whatever the case may be, our results show that, despite the growth in qualitative research (Demuth, 2015; Smith and McGannon, 2018), quantitative research remains the first choice for article publication in these journals, even though the included journals indicate openness to articles applying any research method. This finding may be due to qualitative research still being seen as a new method (Burman and Whelan, 2011) or to reviewers' standards being higher for qualitative studies (Bluhm et al., 2011).
Future research is encouraged into possible bias in the publication of research methods; additionally, further investigation into the proclaimed growth of qualitative research with a different sample may yield different results. Review studies were found to surpass multi-method and mixed-method studies in frequency. To this effect, Grant and Booth (2009) state that increased awareness, journal calls for contributions, and efficiency in procuring research funds all promote the popularity of reviews. The low frequency of mixed-method studies contradicts the view in the literature that it is the third most-utilised research method (Tashakkori and Teddlie, 2003). Its low occurrence in this sample could be due to opposing views on mixing methods (Gunasekare, 2015), to authors preferring to publish in dedicated mixed-methods journals when using this method, or to its relative novelty (Ivankova et al., 2016). Despite its low occurrence, the application of the mixed-methods design was methodologically clear in all articles that used it, which was not the case for the remaining research methods. Additionally, a substantial number of studies used a combination of methodologies that are not mixed-method or multi-method studies. According to the literature, perceived fixed boundaries are often set aside, as confirmed by this result, in order to investigate the aim of a study, which could create a new and helpful way of understanding the world (Gunasekare, 2015). According to Toomela (2010), this is not unheard of and could be considered a form of "structural systemic science," as in the case of qualitative methodology (observation) applied in quantitative studies (experimental design), for example. Based on this result, further research into this phenomenon, as well as its implications for research methods such as multi- and mixed methods, is recommended. Discerning how these research methods were applied presented some difficulty.
In the case of sampling, most studies, regardless of method, did mention some form of inclusion and exclusion criteria, but no definite sampling method. This result, along with the fact that samples often consisted of students from the researchers' own academic institutions, can contribute to the literature and to debates among academics (Peterson and Merunka, 2014; Laher, 2016). Samples of convenience, and students as participants in particular, raise questions about the generalisability and applicability of results (Peterson and Merunka, 2014); attention to sampling is important, as inappropriate sampling can undermine the legitimacy of interpretations (Onwuegbuzie and Collins, 2017). Future investigation into the possible implications of this reported popular use of convenience samples for the field of psychology, as well as the reasons for this use, could provide interesting insight, and is encouraged by this study. Additionally, as indicated in Table 6, articles seldom reported the research designs used, which highlights a pressing lack of rigour in the included sample. Rigour with regard to the applied empirical method is imperative in promoting psychology as a science (American Psychological Association, 2020). Omitting parts of the research process from publication, when they could have been used to inform others' research skills, should be questioned, and the influence on the process of replicating results should be considered. Publications are often rejected due to a lack of rigour in the applied methods and designs (Fonseca, 2013; Laher, 2016), calling for increased clarity and knowledge of method application. Replication is a critical part of any field of scientific research and requires the "complete articulation" of the study methods used (Drotar, 2010, p. 804).
The lack of thorough description could be explained by the requirements of certain journals to report only on certain aspects of the research process, especially with regard to the applied design (Laher, 2016). However, naming aspects such as sampling and design is a requirement of the APA's Journal Article Reporting Standards (JARS-Quant) (Appelbaum et al., 2018). With very little information on how a study was conducted, authors lose a valuable opportunity to enhance research validity, enrich the knowledge of others, and contribute to the growth of psychology and methodology as a whole. In the case of this research study, it also restricted our results to the reported samples and designs, which indicated a preference for certain designs, such as cross-sectional designs for quantitative studies. Data collection and analysis were for the most part clearly stated. A key result was the versatile use of questionnaires: researchers applied questionnaires in various ways, for example in questionnaire interviews, online surveys, and written questionnaires, across most research methods. This may highlight a trend for future research. With regard to the topics these methods were employed for, our research study found a new field named "psychological practice." This result may show the growing consciousness of researchers as part of the research process (Denzin and Lincoln, 2003), psychological practice, and knowledge generation. The most popular of these topics was social psychology, which is generously covered in journals and by learned societies, a testament to the institutional support and richness social psychology enjoys in the field of psychology (Chryssochoou, 2015). The APA's perspective on 2018 trends in psychology also identifies an increased focus on how social determinants influence people's health (Deangelis, 2017). This study was not without limitations, and the following should be taken into account.
Firstly, this study used a sample of five specific journals to address the aim of the research study; despite the journals' general aims (as stated on their websites), this selection biased the results towards the research methods published in these specific journals only and limited generalisability. A broader sample of journals over a different period of time, or a single journal over a longer period of time, might provide different results. A second limitation is the use of Excel spreadsheets and an electronic system to log articles, a manual process that left room for error (Bandara et al., 2015); to address this potential issue, co-coding was performed to reduce error. Lastly, this article categorised data based on the information presented in the article sample; there was no interpretation of what methodology could have been applied or of whether the stated methods adhered to the criteria for those methods. Thus, the large number of articles that did not clearly indicate a research method or design could influence the results of this review; however, this was in itself a noteworthy result. Future research could review the research methods of a broader sample of journals with an interpretive review tool that increases rigour. Additionally, the authors encourage the future use of systematised review designs as a way to promote a concise procedure in applying this design.
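The co-coding check mentioned above can be quantified. The article does not state which agreement statistic, if any, was computed, so the following is a minimal sketch under the assumption that two coders' labels for the same articles are available as parallel lists; the labels themselves are hypothetical:

```python
# Hypothetical method labels assigned to the same five articles
# by two independent coders during co-coding.
coder_a = ["quantitative", "quantitative", "qualitative", "mixed", "quantitative"]
coder_b = ["quantitative", "qualitative", "qualitative", "mixed", "quantitative"]

# Simple percent agreement: the share of articles both coders labelled identically.
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = 100 * agreements / len(coder_a)
print(f"Inter-coder agreement: {percent_agreement:.1f}%")  # 4 of 5 labels match
```

A chance-corrected statistic such as Cohen's kappa would be the stricter choice for formal reporting, since raw percent agreement can be inflated when one category dominates.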

Conclusion

Our research study presented the use of research methods in published articles in the field of psychology, as well as recommendations for future research based on these results. Insight was gained into the complex questions identified in the literature regarding what methods are used, how these methods are being used, and for what topics (why). This sample preferred quantitative methods, used convenience sampling, and presented a lack of rigorous accounts of the remaining methodologies. All methodologies that were clearly indicated in the sample were tabulated to give researchers insight into the general use of methods, not only the most frequently used ones. The lack of a rigorous account of research methods in articles was represented in depth for each step of the research process and can be of vital importance in addressing the current replication crisis within the field of psychology. Recommendations for future research aim to motivate research into the practical implications of these results for psychology, for example publication bias and the use of convenience samples.

Ethics Statement

This study was cleared by the North-West University Health Research Ethics Committee: NWU-00115-17-S1.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
References (10 of 19 shown)

1. Drotar D. Editorial: a call for replications of research in pediatric psychology and guidance for authors. J Pediatr Psychol. 2010.

2. Sandelowski M. When a cigar is not just a cigar: alternative takes on data and data analysis. Res Nurs Health. 2011.

3. Gough B, Lyons A. The future of qualitative research in psychology: accentuating the positive. Integr Psychol Behav Sci. 2016.

4. Counsell A, Harlow LL. Reporting practices and use of quantitative methods in Canadian journal articles in psychology. Can Psychol. 2016.

5. Dweck CS. Is psychology headed in the right direction? Yes, no, and maybe. Perspect Psychol Sci. 2017.

6. Appelbaum M, Cooper H, Kline RB, Mayo-Wilson E, Nezu AM, Rao SM. Journal article reporting standards for quantitative research in psychology: the APA Publications and Communications Board task force report. Am Psychol. 2018.

7. Ezeh AC, Izugbara CO, Kabiru CW, Fonn S, Kahn K, Manderson L, Undieh AS, Omigbodun A, Thorogood M. Building capacity for public and population health research in Africa: the Consortium for Advanced Research Training in Africa (CARTA) model. Glob Health Action. 2010.

8. Sandelowski M, Voils CI, Knafl G. On quantitizing. J Mix Methods Res. 2009.

9. Toomela A. Quantitative methods in psychology: inevitable and useless. Front Psychol. 2010.

10. Earp BD, Trafimow D. Replication, falsification, and the crisis of confidence in social psychology. Front Psychol. 2015.
