Research Highlights: How do we replicate entrepreneurship research?

(posted on behalf of @Aviel Cogan and @Tobias Pret)

It is widely acknowledged that we are experiencing a “replication crisis.” As research results are often difficult or impossible to replicate, the credibility of scientific work has been questioned. Entrepreneurship scholars have long acknowledged the importance of replication studies, but these rarely appear in top journals. Consequently, scholars largely avoid replicating entrepreneurship research to the detriment of our field’s ability to test, confirm, and challenge extant theories. In an effort to encourage and facilitate replication studies, @Christopher Crawford and colleagues published an article in Entrepreneurship Theory and Practice entitled “Advancing Entrepreneurship Theory Through Replication: A Case Study on Contemporary Methodological Challenges, Future Best Practices, and an Entreaty for Communality.”
 
Their study was conceived in response to a call for papers on knowledge accumulation. As Crawford points out, “the field has come a long way toward understanding and refining concepts,” but he and his co-authors wondered, “What if we could tweak the variables and models in about 30 older, highly influential studies with representative sample data … and see what happens?” As it turned out, this was easier said than done.
 
One significant issue they encountered was that "it took an inordinate amount of time just to try to run two or three studies … and we already had eight highly competent co-authors [Vitaliy Skorodziyevskiy, Casey Frid, Thomas Nelson, Zahra Booyavi, Diana Hechavarria, Xuanye Li, Paul Reynolds, and Ehsan Teymourian]." Luckily, the author team was able to get support to run more studies: "Dr. Lynn Agre stepped in and … helped me find students from our [Rutgers University's] Masters in Statistics program … who were already really good at the methods in question. We held a realistic job preview interview … where we gave 22 applicants the Davidsson and Honig 2003 JBV paper and said, 'Here, replicate it over a weekend.'" Only eight applicants were successful, so the research team hired them as data analysts. As Crawford explains, "With eight co-authors and eight analysts, it was a lot like herding cats. Though we were systematic in our process … we were still only able to attempt 19 replications."
 
The primary problem was the low success rate: "Of those [19], we only managed to successfully replicate six … about 32%, which is exactly what other domains come up with when they try replication. That's where the replication crisis comes in. They say, 'We can't replicate two-thirds of the published papers … even though all the data are available, even though all the authors are available, even though all the scripts are available, we still can't do it.'"
 
Consequently, the authors had some difficulty writing the paper, as they "didn't want to throw anybody under the bus." As Crawford shares, "We really didn't want to say, 'Hey, we were able to successfully replicate these, but not these,' and so we acknowledge a bunch of authors, because we pretty much reached out to everybody to say, 'Look … can you help us with a script or with a data set or anything?' … but there were so many authors who replied with, 'Sorry … I've been to six different schools since this 2004 paper was published and I just don't have it. I'm really sorry.' … But then there were others who said, 'I understand that this is an important project, so I'm going to dig through my old external hard drives, figure it out for you, and then I'll talk you through some of it.' And some of those worked and some of those still didn't."
 
These challenges showed the authors that "It's all an interdependent system where, if one component falls off, the rest suffer. In order to do these replications successfully, you have to involve everybody … the whole village of stakeholders … and everyone has to understand how important the work is for the domain." Based on their experience, the authors see the need for significant institutional changes to support replication work: "Administrators, department chairs, and resource providers need policies to value these studies and count [them] toward promotion and tenure. Journal editors … need special issues or ongoing, open invitations for replication studies … as well as extra online space for methodological appendices. Authors need to systematically document every part of the data preparation and analysis process … and PhD students need to be trained how to … produce replicable work."
 
The bottom line, Crawford adds, is that "Every entrepreneurship scholar has a part to play … Being able to replicate increases the legitimacy of the domain. It increases the ability for others to build upon that knowledge [and] to identify boundary conditions of different theories. Most importantly, when replications fail, … you're able to develop new theory from those failures so it's important to at least try … So, in our paper, we espouse the idea that everyone needs to be on board with all of this." It seems that it does, indeed, take a village to raise a paper. Hopefully, the best practices set forth in their article will help entrepreneurship scholars make it easier for others to replicate their research.