When to Think Like a Scientist: Balancing Scientific Rigor and Satisfying Business Needs in HR Analytics

Greta Ontrup, Jana Moeschke, Ralf Buechsenschuss, and Torsten Biemann

Zeitschrift für Arbeits- und Organisationspsychologie (2023)

Greta Ontrup (Department of Work, Organizational and Business Psychology, Ruhr University Bochum, Universitätsstraße 150, 44801 Bochum, Germany, [email protected], https://orcid.org/0000-0003-4720-1494), Jana Moeschke (Communardo Software GmbH, Bonn, Germany), Ralf Buechsenschuss (Zurich Insurance Company Ltd., Zürich, Switzerland; Department of Business Administration, Human Resource Management and Leadership, University of Mannheim, Germany), and Torsten Biemann (Department of Business Administration, Human Resource Management and Leadership, University of Mannheim, Germany, https://orcid.org/0000-0003-1728-6765)

Published online: July 20, 2023. https://doi.org/10.1026/0932-4089/a000418

Presentation of the Problem

Human resource analytics (HR-A) refers to a data-driven HR practice based on analyzing data related to employees, HR processes, and organizational performance (Marler & Boudreau, 2017). The results serve as a basis for decision-making on strategically relevant business issues (Falletta & Combs, 2020). HR-A further concerns the aspiration to show how HR measures can create (monetary) value by linking "soft" people factors (e.g., employee engagement) to strategically relevant outcomes such as customer satisfaction (Falletta & Combs, 2020). A prerequisite for using HR-A to improve decision-making is increased research and experimentation within the organization (Falletta & Combs, 2020; Peeters et al., 2020).
HR-A thus constitutes an intersection between a science-oriented operational practice and an application-oriented HR science. We argue that a key challenge for successful HR-A is to meet high scientific standards while also satisfying business needs in a direct way.

Concern: Balancing Scientific Rigor and Satisfying Business Needs

On the one hand, the use of scientific methods for generating value-adding insights fosters the expectation to use rigorous methods. Successful science is characterized by (a) rigorous research questions and hypotheses that derive logically from theory, (b) rigorous methods and research designs that allow valid conclusions to be drawn and eliminate alternative hypotheses, and (c) rigorous measures (Dipboye, 2007). Psychometric quality and fitness for use of the data are discussed as critical prerequisites for HR-A (Cai & Zhu, 2015; Levenson & Fink, 2017). Expressed poignantly by the phrase "garbage in, garbage out," good-quality data are a necessary condition for valuable insights (Levenson & Fink, 2017; Peeters et al., 2020).

On the other hand, stakeholders expect HR-A to solve business problems in a direct and quick way. A recent case study highlights how organizations prioritize "impression management and speed of output" of HR-A (Jörden et al., 2021, p. 11). Due to work overload, short deadlines, and tight budgets, HR-A needs to satisfy business needs quickly rather than rigorously (Jörden et al., 2021; Levenson & Fink, 2017). In line with this, a preoccupation with data quality distracts attention away from the essentials of HR-A (Boudreau & Ramstad, 2004). Following this line of argumentation, an academic mindset (i.e., an emphasis on scientific rigor) even hinders HR-A, which should serve to deliver practical "good enough" solutions to complex problems (Rasmussen & Ulrich, 2015). This adds to the belief that rigor comes at the cost of relevance.
The two are often discussed as mutually exclusive, implying that rigorous research comes at the expense of practical relevance, rendering scientifically derived results useless for the "real world" (Palmer et al., 2009). Given this apparent tension between scientific rigor and the satisfaction of business needs, the question arises: Does HR-A need more or less scientific rigor to deliver value?

In this dialogue article, we argue that HR-A can profit from the use of rigorous methods, but that an academic mindset does not facilitate the satisfaction of business needs throughout all project phases. By comparing process steps in research and HR-A projects, we delineate similarities and differences and suggest in which phases HR-A should adhere to high scientific rigor and what negative consequences might arise if scientific standards are not met in these phases. We do not believe the scientific process to be the optimal blueprint for HR-A in general but propose that the satisfaction of business needs should be prioritized in the other project phases. Thus, we argue that different capabilities are needed during different HR-A project phases.

The goal of this article is to stimulate a dialogue about similarities and differences between HR-A and research processes so that science and practice can achieve a shared mental model regarding potentially differing goals. We believe this to be a prerequisite for successful science–practice collaboration. Further, we aim to provide (a) researchers with starting points for capabilities that might explain the impact and success of HR-A and (b) practitioners with an overview of necessary capabilities that can translate into trainable competencies for successful HR-A.

Presentation of the Differences Between HR-A and Research Projects

Table 1 shows a simplified process of HR-A and research projects.
HR-A projects can range from descriptive to prescriptive analyses (Marler & Boudreau, 2017); here, we refer to sophisticated HR-A that goes beyond HR controlling.

HR-A differs significantly from research projects regarding the identification of the problem. As the novelty of the research question is not important for HR-A (e.g., companies might copy a project another company has done before), HR-A is phenomenon-driven rather than theory-driven. Stakeholders might point at potential problems (e.g., management forms a hypothesis and the HR-A team must investigate it), or HR might detect anomalies in a report and form a hypothesis concerning a potential problem. An academic mindset would hinder effective projects, as HR-A teams are expected to contribute to strategy execution rather than to offer incremental insights on theory (Levenson, 2018; Rasmussen & Ulrich, 2015). In this project phase, the capability to prioritize strategically important questions is of importance.

Table 1. A comparison between HR-A and research projects in terms of a simplified project process

(1) Identification of problem. HR-A project: short-, mid-, and long-term impact for strategic business challenges very important; phenomenon-driven. Research project: novelty of research question very important; theory-driven.
(2) Derivation of research question. Both: theory-based derivation of research question and hypotheses.
(3) Data collection and analysis. Both: data quality assessment and use of appropriate analytical procedures.
(4) Results. HR-A project: generalizability of findings beyond the organizational context does not matter. Research project: generalizability of findings very important.
(5) Implementation and evaluation. HR-A project: highly relevant to the organization. Research project: usually not part of a research project.

Similarly, the interpretation of the results and the implementation and evaluation of outcomes differ.
For HR-A, the generalizability of the findings does not matter; it might even be counterproductive for organizations to publish results that other companies could benefit from (cf. the resource-based view; Barney, 1996). Further, the implementation of derived implications and their evaluation is usually not part of research projects.

By contrast, the derivation of research questions and the data collection and analysis should not differ between HR-A and research projects. Although the defined problem for an HR-A project is phenomenon-driven, the subsequent research questions/hypotheses can either be developed based on theory or investigated exploratorily. A process with high theoretical and methodological rigor would place an emphasis on a thorough literature review for deriving research questions and testable hypotheses and would ensure rigorous data quality assessment. However, due to short deadlines, exploratory data analysis is often favored, and "interesting patterns" in the data are used for gaining insights (Levenson & Fink, 2017). The need for quick insights promotes the use of easily available data, since psychometric data assessment takes time. Yet the exploratory investigation of "good enough" data without a theoretical rationale and data quality assessment entails several drawbacks. Thus, we suggest that during these project phases, high scientific rigor should be a key priority, as otherwise severe negative consequences might arise, which are detailed in the following.

Justification of a Scientific Approach for Deriving Research Questions and Collecting and Analyzing Data

The importance of high scientific rigor for the derivation of research questions, data collection, and analysis can be illustrated with a hypothetical example. An international insurance company experiences considerable turnover among customer consultants and sets out to answer the question of what explains these high turnover rates.
The HR-A team starts by reviewing the scientific literature on individual (e.g., job involvement), job-related (e.g., autonomy), and structural (e.g., training opportunities) predictors of turnover. Shortly afterwards, the team is pressed to present its analyses and recommended actions. Due to the time pressure, the HR-A team starts the analysis immediately, which leads to a focus on already available data. Individual and job-related factors from the HR information system (personnel, structural, contractual data) and annual feedback reports (commitment, job satisfaction) are available. Exploratory analyses show that personnel and contractual factors explain most variance: Employees who work from home (WFH) and men are less likely to leave the company than employees working from the office and women. Based on the results, management decides to implement a new WFH policy and a female mentoring program. The approach can be deemed reasonable given the time restrictions. Nevertheless, skipping the derivation of a theory-based hypothesis and a thorough data quality assessment entails severe drawbacks.

A lack of theoretical underpinnings for the expected effects entails ethical and legal challenges. If data are mined with the primary purpose of maximizing predicted outcomes (Braun & Kuljanin, 2015), the interpretation of and recommendations based on the results can be challenging. A theoretical rationale for including variables prevents a focus on irrelevant (not job-related) and potentially biased (e.g., gender) factors (Tippins et al., 2021). In the example, there is no theoretical rationale (a theory that explains[1] why gender might predict turnover). Yet the statistical relation ("interesting pattern") can become the reason for personnel decisions (recruiting, mentoring, etc.). If effects cannot be explained by theory, deriving actions remains guesswork and is accompanied by potential ethical and legal conflicts.
Regarding the example, the effect of gender might be explained in multiple ways (less support, a discriminatory culture, etc.). As there is no theoretical framework to help explain the effect, recommended actions might not be effective (a mentoring program will likely not remedy a discriminatory culture) and bear potential for ethical and legal concerns (e.g., if it leads to fewer promotions of women). Therefore, HR-A should aim not only at finding significant effects but at understanding the reasons why effects occur (Tippins et al., 2021).

Additionally, a theoretical framework ensures that all relevant factors are considered. Without a theoretical foundation, it is unclear why WFH and gender emerge as predictors. It is conceivable that WFH leads to more autonomy and that women receive less support than men. In these cases, the actions taken by management would be purposeful. It is also conceivable that other factors explain the effect. Perhaps employees working from home and men have better access to (virtual) trainings offered at off-peak times, that is, times at which office workers commute and women are more likely to have care responsibilities. Although the actions taken by the organization would not be detrimental in this case, it would be more effective to invest in more flexible training opportunities.

Another drawback concerns statistical artifacts produced by exploratory analyses. Especially if big data are processed, small amounts of shared confounding variance can produce artifactual associations between otherwise unrelated variables (Smith & Nichols, 2018). Without a theoretical rationale for an expected effect, the risk of statistically significant but practically irrelevant findings increases. In this case, actions derived from the results will not lead to an improvement of the outcomes at stake.

Lastly, the omission of a psychometric assessment can have a major impact on the results. In the example, information on commitment and job satisfaction is derived from annual feedback reports.
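Before such repurposed survey items enter an analysis, a basic reliability check is a minimal first step. The following is a sketch in plain Python, not part of the original example: the item responses are hypothetical, and the common rule of thumb that Cronbach's alpha below about .70 signals an unreliable scale is only a heuristic.

```python
# Cronbach's alpha for a set of survey items
# (rows = respondents, columns = items).
def cronbach_alpha(items: list[list[float]]) -> float:
    k = len(items[0])                      # number of items
    def var(xs):                           # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_scores = [sum(row) for row in items]
    return (k / (k - 1)) * (1 - sum(item_vars) / var(total_scores))

# Hypothetical "commitment" items extracted from feedback reports:
responses = [
    [4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 3, 2], [4, 4, 5], [1, 2, 1],
]
alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}")  # → Cronbach's alpha = 0.95
```

A high alpha only indicates internal consistency; it does not show that the items measure commitment rather than, say, team satisfaction, which is why the content-validity check discussed next remains necessary.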
Although it is desirable to integrate data from various sources for HR-A, doing so necessitates a verification of their content validity (do they measure what they are supposed to measure?), reliability (do they assess the phenomenon of interest with precision?), and fitness for use (are they up to date and exhaustive?). When extracting information from existing data (i.e., data that were primarily collected for a different purpose), particular emphasis should be placed on ensuring content validity, that is, that concept and measure align (McAbee et al., 2017). If remarks from feedback reports are interpreted as commitment but instead reflect team satisfaction, the conclusion that commitment is not a significant driver of turnover is false. Actions taken based on the results might waste resources (e.g., valid data might show an effect of commitment, indicating a need for interventions) or might even be harmful (e.g., when commitment-supporting practices are discontinued).

Consequences for Research and Theory Building

We argued that HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science, which necessitates balancing scientific rigor with the satisfaction of business needs. We suggest that for the derivation of research questions, data collection, and analysis, high scientific rigor can enhance HR-A success significantly. However, for identifying the problem, interpreting the results, and implementing and evaluating outcomes, the satisfaction of business needs should be prioritized. Thus, we posit that different capabilities predict HR-A success, depending on the project phase.

Table 2 offers suggestions for operationalizing the capabilities of satisfying business needs and of scientific rigor for HR-A as predictors to understand their impact on HR-A success.
Theorizing about necessary capabilities and how these might be operationalized is important, as there is a need to explain how HR-A works and which success factors contribute to successful HR-A (McCartney & Fu, 2022a, 2022b). The proposed capabilities also offer starting points for discussing what competencies HR analysts need. Based on the empirical investigation of the relation between competencies and HR-A success, implications for the training of HR analysts or for teaching in degree programs, such as work and organizational psychology or HR management, can be derived. Work on the competencies necessary for the role of HR analysts is only emerging. We suggest that it is indispensable to offer rigorous statistical training as well as knowledge and skills training that ensures that business needs can be identified and stakeholder communication can be initiated. One important competency in that regard could be storytelling. The importance of storytelling is increasingly emphasized (Fu et al., 2022), as it enables HR analysts to present analytical findings in a way that allows stakeholders to understand their importance. Storytelling competencies could be the link between methodological rigor and the simultaneous communication of the relevance of the findings for the satisfaction of business needs. We encourage theoretical work on the relation between these competencies and a dialogue on best-practice teaching methods for preparing students for the role of HR analysts.

Consequences for Practice

Despite the value propositions of HR-A, adoption rates in practice are low (McCartney & Fu, 2022b). Most organizations are far from a strategic use of HR-A (Weibel et al., 2019), and organizations in Europe feel less ready to apply HR-A than organizations in non-European countries (Guenole et al., 2017). We argue that the success of HR-A can be enhanced by the capability to balance the satisfaction of business needs and the adherence to scientific standards depending on the project phase.
HR-A teams might use the operationalizations specified in Table 2 to check whether their capabilities need to be developed in this regard.

Table 2. Operationalization of HR-A capabilities

Satisfaction of business needs:
- A strategically important question is addressed
- The focus is on strategic capabilities rather than on short-term incremental improvements
- Results enable decision support for stakeholders and justify a business case and people-related investments

Rigorous derivation of research questions:
- There are theoretical assumptions about the investigated effects
- Testable hypotheses are formulated that derive logically from theory

Rigorous data quality assessment:
- Psychometric rigor is tested (reliability, validity, objectivity)
- Fitness for use is ensured (accuracy, timeliness, usability, completeness)

Science–practice collaborations have been proposed as a means to ensure the scientific rigor of HR-A, since organizational researchers can contribute methodological and analytical expertise (Angrave et al., 2016). However, for science–practice collaborations to be successful, there needs to be a shared mental model regarding the goals of the project. As presented in Table 1, we suggest that reaching a shared goal for an HR-A and a research project can be challenging due to significant differences in the problem definition, implementation, and evaluation of the project. Thus, a focus should be placed on collaborative project execution (derivation of testable hypotheses, data quality assessment), for which academic expertise can be a valuable addition to HR-A execution. Such collaborations can also help leverage scientific results in the sense that results from scientific studies can be tested by HR-A teams in different settings and contexts.
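The fitness-for-use criteria in Table 2 (e.g., timeliness and completeness) can also be turned into a simple automated check that runs before any analysis. The following is a minimal sketch, not an operationalization proposed in the article itself; the field values, dates, and thresholds are hypothetical.

```python
from datetime import date

# Minimal fitness-for-use audit for one HR data field:
# completeness (share of non-missing values) and
# timeliness (age of the last update in days).
def audit_field(values, last_updated, today,
                min_completeness=0.95, max_age_days=365):
    non_missing = [v for v in values if v is not None]
    completeness = len(non_missing) / len(values)
    age_days = (today - last_updated).days
    return {
        "completeness": completeness,
        "complete_enough": completeness >= min_completeness,
        "age_days": age_days,
        "timely": age_days <= max_age_days,
    }

# Hypothetical survey scores with two missing entries:
report = audit_field(
    values=[3, 4, None, 5, 4, 4, None, 3, 5, 4],
    last_updated=date(2022, 6, 1),
    today=date(2023, 3, 1),
)
print(report)  # completeness = 0.8, below the 0.95 threshold
```

Such a check does not replace a psychometric assessment, but it makes the "good enough data" judgment explicit instead of implicit.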
In this way, we believe it is possible for practice to profit from science and for science to profit from practice.

References

Angrave, D., Charlwood, A., Kirkpatrick, I., Lawrence, M., & Stuart, M. (2016). HR and analytics: Why HR is set to fail the big data challenge. Human Resource Management Journal, 26(1), 1–11. https://doi.org/10.1111/1748-8583.12090
Barney, J. B. (1996). The resource-based theory of the firm. Organization Science, 7(5), 469. https://doi.org/10.1287/orsc.7.5.469
Boudreau, J. W., & Ramstad, P. M. (2004). Talentship and human resource measurement and analysis: From ROI to strategic organizational change. Marshall School of Business.
Braun, M. T., & Kuljanin, G. (2015). Big data and the challenge of construct validity. Industrial and Organizational Psychology, 8(4), 521–527. https://doi.org/10.1017/iop.2015.77
Cai, L., & Zhu, Y. (2015). The challenges of data quality and data quality assessment in the big data era. Data Science Journal, 14(2), 1–10. https://doi.org/10.5334/dsj-2015-002
Dipboye, R. L. (2007). Eight outrageous statements about HR science. Human Resource Management Review, 17(2), 96–106. https://doi.org/10.1016/j.hrmr.2007.04.001
Falletta, S. V., & Combs, W. L. (2020). The HR analytics cycle: A seven-step process for building evidence-based and ethical HR analytics capabilities. Journal of Work-Applied Management, 13(1), 51–68. https://doi.org/10.1108/JWAM-03-2020-0020
Fu, N., Keegan, A., & McCartney, S. (2022). The duality of HR analysts' storytelling: Showcasing and curbing. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12466
Guenole, N., Feinzig, S., Green, D., & Zhang, H. (2017). HR analytics readiness: How does Europe compare to the rest of the world? IBM Smart Workforce Institute.
Jörden, N. M., Sage, D., & Trusson, C. (2021). 'It's so fake': Identity performances and cynicism within a people analytics team. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12412
Levenson, A. (2018). Using workforce analytics to improve strategy execution. Human Resource Management, 57(3), 685–700. https://doi.org/10.1002/hrm.21850
Levenson, A., & Fink, A. (2017). Human capital analytics: Too much data and analysis, not enough models and business insights. Journal of Organizational Effectiveness: People and Performance, 4(2), 145–156. https://doi.org/10.1108/JOEPP-03-2017-0029
Marler, J. H., & Boudreau, J. W. (2017). An evidence-based review of HR analytics. The International Journal of Human Resource Management, 28(1), 3–26. https://doi.org/10.1080/09585192.2016.1244699
McAbee, S. T., Landis, R. S., & Burke, M. I. (2017). Inductive reasoning: The promise of big data. Human Resource Management Review, 27(2), 277–290. https://doi.org/10.1016/j.hrmr.2016.08.005
McCartney, S., & Fu, N. (2022a). Bridging the gap: Why, how and when HR analytics can impact organizational performance. Management Decision, 60(13), 25–47. https://doi.org/10.1108/MD-12-2020-1581
McCartney, S., & Fu, N. (2022b). Promise versus reality: A systematic review of the ongoing debates in people analytics. Journal of Organizational Effectiveness: People and Performance, 9(2), 281–311. https://doi.org/10.1108/JOEPP-01-2021-0013
Palmer, D., Dick, B., & Freiburger, N. (2009). Rigor and relevance in organization studies. Journal of Management Inquiry, 18(4), 265–272. https://doi.org/10.1177/1056492609343491
Peeters, T., Paauwe, J., & van de Voorde, K. (2020). People analytics effectiveness: Developing a framework. Journal of Organizational Effectiveness: People and Performance, 7(2), 203–219. https://doi.org/10.1108/JOEPP-04-2020-0071
Rasmussen, T., & Ulrich, D. (2015). Learning from practice: How HR analytics avoids being a management fad. Organizational Dynamics, 44(3), 236–242. https://doi.org/10.1016/j.orgdyn.2015.05.008
Smith, S. M., & Nichols, T. E. (2018). Statistical challenges in "big data" human neuroimaging. Neuron, 97(2), 263–268. https://doi.org/10.1016/j.neuron.2017.12.018
Tippins, N. T., Oswald, F. L., & McPhail, S. M. (2021). Scientific, legal, and ethical concerns about AI-based personnel selection tools: A call to action. Personnel Assessment and Decisions, 7(2), 1–22. https://doi.org/10.31234/osf.io/6gczw
Weibel, A., Schafheitle, S. D., & Ebert, I. L. (2019). Goldgräberstimmung im Personalmanagement? Wie Datafizierungs-Technologien die Personalsteuerung verändern [Gold-rush atmosphere in personnel management? How datafication technologies are changing human resource management]. Zeitschrift für Organisationsentwicklung, 3, 23–29. https://www.alexandria.unisg.ch/server/api/core/bitstreams/1063c47b-b26a-46cd-9e87-812db9a7dbf7/content

[1] The explainability of an effect (logical framework) is not the same as explainable analytics (is the output of complex analyses understandable, and are decisions by models transparent?). Although explainable analytics are of great importance for ethical and legal HR-A, this footnote refers to the logical reasoning behind an expected effect.