
Research

Our group's research addresses response processes in self-ratings and performance measures in psychological assessment, as well as cognitive processes in human memory, judgment, and decision making in the realm of experimental psychology. The common focus of our research projects is the statistical modeling of psychological processes with psychometric, discrete stochastic, and multivariate methods.

Research projects

  • Analysis of multiple response processes in rating data

    Self-ratings in questionnaires with ordinal response formats are widely used to assess individual differences, for example in research on personality, social cognition, and attitudes, or in the clinical domain. It is well known, however, that observed rating responses are affected by multiple response processes over and above the to-be-measured traits, including general response styles, social desirability, and interactions between item wording and person characteristics. To separate the intended traits from the potentially biasing effects of such further response processes, latent variable models from item response theory or structural equation modeling can be used. In our research, we extend these psychometric models, examine the statistical properties and psychological validity of their components and parameters, and apply them to data from different areas of psychological research (a schematic model equation is sketched after the project list).

  • Modeling heterogeneity in psychometric models

    In the measurement of person attributes by means of questionnaires or performance tests, it is commonly assumed that the structural features of items and the parameters of psychometric models are constant across persons. This assumption implies, for example, that the items are equally difficult for all test takers and that the relative weighting of response processes is identical for all persons. If the assumption is violated, the validity of the measurement is at risk and the fairness of person comparisons is jeopardized. Our research extends psychometric models so that heterogeneity of item characteristics and response processes across persons can be identified and accounted for in the data analysis. For this purpose, we use mixture-distribution models and model-based partitioning methods for multidimensional psychometric decision trees to examine sources of heterogeneity and to tailor the assessment of psychological constructs (see the mixture sketch after the project list).

  • Cognitive processes in episodic memory

    Human episodic memory comprises the encoding, storage, and retrieval of events and related context information, such as information on the temporal and physical context of an event or on the subjective state during the event experience. The joint encoding and storage of event and context information yields a bound memory representation and thereby enables episodic recollection. For the retrieval and report of episodic recollections, metacognitive processes and reconstructive inferences are relevant beyond genuine memory processes. In our research, we analyze episodic memory representations as well as retrieval and judgment processes by means of experimental designs and statistical models. The models include multinomial processing tree models for separating memory processes from reconstructive inferences, and generalized multilevel models for analyzing dynamic cognitive processes at the level of individual events (see the processing tree sketch after the project list).

  • Cognitive processes and heuristics in contingency learning, judgment and decision making

    Learned contingencies are essential for judgment formation and decision making, such as contingencies between social groups and behaviors in stereotyping or contingencies between options and resulting outcomes in decision making. Contingencies that are inductively inferred from information in the social environment, however, can be biased or wrong. For example, the information can originate from selective samples or be of limited validity for the judgment at hand, and individuals can overlook relevant moderating variables in their judgments and decisions. Heuristics can exert additional influences on judgments and decisions, such as the use of perceived processing fluency in truth judgments of statements or in familiarity judgments of events. Our research uses different experimental paradigms to investigate potential biases in judgments and decisions that arise from inductively inferred contingencies and from heuristics. Combined with statistical models of the data, these paradigms allow us to study the underlying processes as well as individual differences in judgment and decision effects (see the contingency sketch after the project list).
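
Model sketches for the projects above. For the analysis of multiple response processes in rating data, a minimal divide-by-total sketch (an illustration, not the exact parameterization of our published models) adds an extreme response style dimension gamma to the target trait theta for an item i with response categories x = 0, ..., K:

    P(X_{pi} = x) = \frac{\exp[\, x\,(\theta_p - \beta_i) + w_x\, \gamma_p \,]}{\sum_{k=0}^{K} \exp[\, k\,(\theta_p - \beta_i) + w_k\, \gamma_p \,]}

Here, \beta_i is the item location, and scoring weights such as w_x = 1 for the two endpoint categories (and 0 otherwise) let \gamma_p absorb extreme responding separately from the trait \theta_p.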
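
For modeling heterogeneity, a minimal mixture IRT sketch assumes C latent classes with class weights \pi_c and class-specific item parameters \beta_i^{(c)}, so that the marginal probability of a response vector x_p is

    P(\mathbf{x}_p) = \sum_{c=1}^{C} \pi_c \int \prod_{i} P(x_{pi} \mid \theta, \beta_i^{(c)}) \, g_c(\theta) \, d\theta

Heterogeneity then appears as classes in which, for example, item difficulties or the relative weighting of response processes differ; model-based partitioning replaces the latent classes by subgroups defined through observed covariates.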
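
For episodic memory, a deliberately reduced multinomial processing tree sketch (a simplification, not one of the published models) separates genuine context memory (probability m) from reconstructive guessing (probability g) in the context report for a recognized event with two context alternatives:

    P(\text{correct context}) = m + (1 - m)\, g, \qquad P(\text{incorrect context}) = (1 - m)(1 - g)

Extending such trees to several context features and embedding the parameters in hierarchical (multilevel) structures allows binding effects and individual differences to be estimated from observed category frequencies.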
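
For contingency learning, the relation between a cue C (e.g., group membership) and an outcome O (e.g., a behavior) is commonly summarized by the \Delta P contrast

    \Delta P = P(O \mid C) - P(O \mid \neg C)

A pseudocontingency can arise when both the cue and the outcome have skewed base rates in a sample: the jointly frequent and the jointly infrequent levels then co-occur often, and observers tend to infer a positive relation from these marginal frequencies even when \Delta P = 0.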

Publications

    2024

    • Seitz, T., Wetzel, E., Hilbig, B. E., & Meiser, T. (2024). Using the multidimensional nominal response model to model faking in questionnaire data: The importance of item desirability characteristics. Behavior Research Methods. Advance online publication. https://doi.org/10.3758/s13428-024-02509-x
    • Hasselhorn, K., Ottenstein, C., Meiser, T., & Lischetzke, T. (2024). The effects of questionnaire length on the relative impact of response styles in ambulatory assessment. Multivariate Behavioral Research. Advance online publication. https://doi.org/10.1080/00273171.2024.2354233
    • Merhof, V., & Meiser, T. (2024). Co-occurring dominance and ideal point processes: A general IRTree framework for multidimensional item responding. Behavior Research Methods. Advance online publication. https://doi.org/10.3758/s13428-024-02405-4
    • Petras, N., Dantlgraber, M., & Reips, U.-D. (2024). Illustrating psychometric tests, scales, and constructs: An R package for Item Pool Visualization. Behavior Research Methods, 56, 639–650. https://doi.org/10.3758/s13428-022-02052-7
    • Petras, N., & Meiser, T. (2024). Problems of domain factors with small factor loadings in bi-factor models. Multivariate Behavioral Research, 59(1), 123–147. https://doi.org/10.1080/00273171.2023.2228757
    • Reiber, F., & Ulrich, R. (2024). Exploring effects of age on conflict processing in the light of practice in a large-scale dataset. Experimental Aging Research, 50(4), 422–442. https://doi.org/10.1080/0361073X.2023.2214051
    • Schreiner, M. R., Bröder, A., & Meiser, T. (2024). Agency effects on the binding of event elements in episodic memory. Quarterly Journal of Experimental Psychology, 77(6), 1201–1220. https://doi.org/10.1177/17470218231203951

    2023

    • Alagöz, Ö. E. C., & Meiser, T. (2023). Investigating heterogeneity in response strategies: A mixture multidimensional IRTree approach. Educational and Psychological Measurement. Advance online publication. https://doi.org/10.1177/00131644231206765
    • Bott, F. M., & Meiser, T. (2023). Information sampling in contingency learning: Sampling strategies and their consequences for (pseudo-)contingency inferences. In K. Fiedler, P. Juslin & J. Denrell (Eds.), Sampling in judgment and decision making (pp. 245–265). Cambridge University Press.
    • Frick, S. (2023). Estimating and using block information in the Thurstonian IRT model. Psychometrika, 88, 1556–1589. https://doi.org/10.1007/s11336-023-09931-8
    • Henninger, M., & Meiser, T. (2023). Quality control: Response style modeling. In R. Tierney, F. Rizvi & K. Ercikan (Eds.), International encyclopedia of education (4th edition, pp. 331–340). Elsevier. https://doi.org/10.1016/b978-0-12-818630-5.10041-7
    • Henninger, M., Plieninger, H., & Meiser, T. (2023). The effect of response formats on response style strength: An experimental comparison. European Journal of Psychological Assessment. Advance online publication. https://doi.org/10.1027/1015-5759/a000779
    • Meiser, T., & Reiber, F. (2023). Item-specific factors in IRTree models: When they matter and when they don’t [Commentary on Lyu, Bolt & Westby (2023)]. Psychometrika, 88, 739–744. https://doi.org/10.1007/s11336-023-09916-7
    • Merhof, V., Böhm, C. M., & Meiser, T. (2023). Separation of traits and extreme response style in IRTree models: The role of mimicry effects for the meaningful interpretation of estimates. Educational and Psychological Measurement. Advance online publication. https://doi.org/10.1177/00131644231213319
    • Merhof, V., & Meiser, T. (2023). Dynamic response strategies: Accounting for response process heterogeneity in IRTree decision nodes. Psychometrika, 88, 1354–1380. https://doi.org/10.1007/s11336-023-09901-0
    • Schreiner, M., & Hütter, M. (2023). The influence of social status on memory: No evidence for effects of social status on event element binding. Social Cognition, 41(5), 447–466. https://doi.org/10.1521/soco.2023.41.5.447
    • Schreiner, M., & Meiser, T. (2023). Measuring binding effects in event-based episodic representations. Behavior Research Methods, 55, 981–996. https://doi.org/10.3758/s13428-021-01769-1
    • Schreiner, M., Meiser, T., & Bröder, A. (2023). The binding structure of event elements in episodic memory and the role of animacy. Quarterly Journal of Experimental Psychology, 76(4), 705–730. https://doi.org/10.1177/17470218221096148
    • Schreiner, M. R., Mercier, B., Frick, S., Wiwad, D., Schmitt, M. C., Kelly, J. M., & Pütter, J. Q. (2023). Measurement issues in the many analysts religion project. Religion, Brain & Behavior, 13(3), 339–341. https://doi.org/10.1080/2153599X.2022.2070260

    2022

    2021

    • Bott, F. M., Kellen, D., & Klauer, K. C. (2021). Normative accounts of illusory correlations. Psychological Review, 128(5), 856–878. https://doi.org/10.1037/rev0000273
    • Frick, S., Brown, A., & Wetzel, E. (2023). Investigating the normativity of trait estimates from multidimensional forced-choice data. Multivariate Behavioral Research, 58(1), 1–29. https://doi.org/10.1080/00273171.2021.1938960
    • Henninger, M., & Plieninger, H. (2021). Different styles, different times: How response times can inform our knowledge about the response process in rating scale measurement. Assessment, 28(5), 1301–1319. https://doi.org/10.1177/1073191119900003
    • Plieninger, H. (2021). Developing and applying IR-Tree models: Guidelines, caveats, and an extension to multiple groups. Organizational Research Methods, 24(3), 654–670. https://doi.org/10.1177/1094428120911096

    2020

    • Bott, F. M., Heck, D. W., & Meiser, T. (2020). Parameter validation in hierarchical MPT models by functional dissociation with continuous covariates: An application to contingency inference. Journal of Mathematical Psychology, 98, Article 102388. https://doi.org/10.1016/j.jmp.2020.102388
    • Bott, F. M., & Meiser, T. (2020). Pseudocontingency inference and choice: The role of information sampling. Journal of Experimental Psychology: Learning, Memory, and Cognition, 46(9), 1624–1644. https://doi.org/10.1037/xlm0000840
    • Henninger, M., & Meiser, T. (2020a). Different approaches to modeling response styles in divide-by-total item response theory models (Part I): A model integration. Psychological Methods, 25(5), 560–576. https://doi.org/10.1037/met0000249
    • Henninger, M., & Meiser, T. (2020b). Different approaches to modeling response styles in divide-by-total item response theory models (Part II): Applications and novel extensions. Psychological Methods, 25(5), 577–595. https://doi.org/10.1037/met0000268
    • Meiser, T. (2020). Illusorische Korrelationen [Illusory correlations]. In L.-E. Petersen & B. Six (Eds.), Stereotype, Vorurteile und soziale Diskriminierung (2nd rev. and expanded ed., pp. 54–62). Beltz.

    2019

    • Arnold, N. R., Heck, D. W., Bröder, A., Meiser, T., & Boywitt, C. D. (2019). Testing hypotheses about binding in context memory with a hierarchical multinomial modeling approach: A preregistered study. Experimental Psychology, 66(3), 239–251. https://doi.org/10.1027/1618-3169/a000442
    • Meiser, T., Plieninger, H., & Henninger, M. (2019). IRTree models with ordinal and multidimensional decision nodes for response styles and trait-based rating responses. British Journal of Mathematical and Statistical Psychology, 72(3), 501–516. https://doi.org/10.1111/bmsp.12158