In this research we propose a web-based adaptive self-explicated approach for multi-attribute preference measurement (conjoint analysis) with a large number (ten or more) of attributes. In the empirical application reported here, the proposed approach provides a substantial and significant improvement in predictive ability over current preference measurement methods designed for handling a large number of attributes. Our approach also overcomes some of the limitations of previous self-explicated approaches. Two methods are commonly used to estimate attribute importances in self-explicated studies: ratings and constant-sum allocation. A common problem with the ratings approach is that it does not explicitly capture the tradeoff between attributes; it is easy for respondents to say that every attribute is important. The constant-sum approach overcomes this limitation, but with a large number of product attributes it becomes difficult for the respondent to divide a constant sum among all the attributes. We developed a computer-based self-explicated approach that breaks down the attribute importance question into a sequence of constant-sum paired comparison questions. We first use a fixed design, in which the set of questions is chosen from a balanced orthogonal design, and then extend it to an adaptive design, in which the questions are chosen adaptively for each respondent to maximize the information elicited from each paired comparison question. Unlike the traditional self-explicated approach, the proposed approach provides (approximate) standard errors for the attribute importances.
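The estimation logic behind a constant-sum paired comparison design can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes that a respondent splits 100 points between two attributes in proportion to their importances, so each question yields the log-ratio of two log-importances, which can be recovered by least squares (one attribute's log-importance is fixed at zero to identify the model). The attribute count, noise level, and use of all pairs are hypothetical simplifications; in the proposed approach the question set is a balanced orthogonal or adaptively chosen subset.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5                                   # hypothetical number of attributes
true_w = np.array([5.0, 4.0, 3.0, 2.0, 1.0])   # hypothetical true importances

# For simplicity, ask all K*(K-1)/2 paired comparison questions
pairs = [(i, j) for i in range(K) for j in range(i + 1, K)]

X, y = [], []
for i, j in pairs:
    # Simulated constant-sum allocation: points split in proportion to the
    # importances, with multiplicative response noise
    points_i = 100 * true_w[i] / (true_w[i] + true_w[j])
    ratio = (points_i / (100 - points_i)) * np.exp(rng.normal(0, 0.1))
    # Each question contributes one row: +1 for attribute i, -1 for j;
    # the model is identified by fixing log w_{K-1} = 0 (column dropped)
    row = np.zeros(K - 1)
    if i < K - 1:
        row[i] += 1
    if j < K - 1:
        row[j] -= 1
    X.append(row)
    y.append(np.log(ratio))

X, y = np.array(X), np.array(y)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
log_w = np.append(beta, 0.0)
w_hat = np.exp(log_w) / np.exp(log_w).sum()    # normalized importance estimates

# Approximate standard errors of the log-importances from the OLS covariance
dof = len(y) - (K - 1)
sigma2 = ((y - X @ beta) ** 2).sum() / dof
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
```

Because the estimates come from a regression, the same machinery yields the approximate standard errors mentioned above, something a one-shot constant-sum allocation over all attributes cannot provide.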
In a study involving digital cameras described on twelve attributes, we find that the predictive validity (correctly predicted top choices) of the proposed adaptive approach is 35%-52% higher than that of Adaptive Conjoint Analysis, the Fast Polyhedral approach, and the traditional self-explicated approach, irrespective of whether the part-worths were estimated using classical or hierarchical Bayes estimation. Additionally, the proposed adaptive approach reduces respondents' burden by keeping the number of paired comparison questions small without significant loss of predictive validity.