Abstract

Web-based crowdsourced citizen science is an efficient method for scientists to collect and process data. Although lay persons gain opportunities to participate in research and engage with scientists, these crowdsourced projects generally maintain the traditional hierarchy of academic science. Lay persons have little say in project or platform governance, and institutional tools to hold project investigators accountable are almost nonexistent. This article examines how existing institutional policies address the question of distribution in crowdsourced citizen science, which may further affect lay participants’ role in the institution of scientific knowledge production and their access to research resources. The article begins by comparing the norms developed by citizen-science institutions. It then discusses examples from Galaxy Zoo to see how the results of research projects are distributed, both in the form of access to research outcomes and in the form of authorship. The article also discusses the potential conflicts that arise when crowdsourced projects are organized by for-profit companies and why citizen-science platforms should develop institutional norms to avoid such conflicts.

1 Access, Credits, and Governance: Distribution Questions in Crowdsourced Citizen Science

In the past ten years, web-based crowdsourced citizen-science projects have mushroomed, especially after the iconic project Galaxy Zoo (GZ). Yet lay participation in scientific research is not new and has existed in a variety of disciplines. In the nineteenth century, naturalists relied on tradesmen, missionaries, colonial officials, or professional collectors to gather specimens and local knowledge on the frontiers (Fan 2004; Endersby 2008). Internet-based citizen engagement in science projects is not new either. Since 1999, through SETI@home, computer users around the world have contributed their computing power to analyze radio signals in the search for extraterrestrial intelligence (Anderson et al. 2002). Nevertheless, this wave of web-based crowdsourced science differs from SETI@home in that the lay public contribute not merely their surplus computing power but their surplus cognitive power (Shirky 2010).

The story of GZ excites both fellow researchers and the public at large—facing an enormous number of telescope images, two Oxford researchers enlisted dispersed individuals on the Internet to help categorize galaxies. Many responded enthusiastically to their call, and the lay participants recruited in this way later proved competent for the assigned tasks (Galaxy Zoo, a).1 With their help, GZ scientists were able to process data at an unprecedented speed. Since the initial project, GZ has continued to engage lay participants in subsequent studies. The credentials accumulated from earlier successes help the scientists receive new grants, curate new research materials, or gain access to existing datasets. The project’s reputation and the network of lay participants also attract more individuals to help GZ scientists. GZ has been extremely productive in terms of publications (fifty-three papers between 2008 and 2016). As academic science has increasingly followed a market logic and scientists are evaluated by their performativity, including their publications (Pasii 2005), it is no surprise that GZ’s success has drawn attention to crowdsourced citizen science as an efficient production model for scientific research. GZ has led to the birth of Zooniverse, a platform hosting web-based citizen-science projects across several disciplines (Zooniverse, a).

GZ’s production model finds affinity with what Yochai Benkler (2006: 62) has called “commons-based peer production” (CBPP), which refers to “production systems that depend on individual action that is self-selected and decentralized, rather than hierarchically assigned.” Notable CBPP examples include free software projects and Wikipedia. By noting that these peer-production projects are “commons-based,” Benkler underscores the institutional structure of access to and control over resources. A commons is the opposite of a proprietary system, which allocates the power to control resources to the owner. Our institutional norms that govern important areas of knowledge production and distribution, in both copyright and patent law, conventionally rely on a utilitarian model that provides economic incentives by allocating proprietary rights to individual owners, in order to encourage quality work that would benefit the public at large. Free software projects have proved that some programmers can be motivated by incentives other than economic rewards, and that proprietary control may hinder meaningful forms of exchange and collaborative production. Contrary to the notion of the romantic author who works in solitude, which is the default model in the copyright system (Chen 2012), free software developers regularly share code and modify each other’s code. As sharing and modifying code are both among the exclusive rights conferred on authors by copyright law, these exclusive rights inadvertently interfere with developers’ daily practice. To be able to continue their collaboration and reproduce their own community, these programmers, as a recursive public, developed free licenses as tools to work around the legal obstacles posed by the copyright system (Kelty 2008). Moreover, since the resources are not under anyone’s proprietary control, when unresolvable governance issues arise within the community, members have the option to “fork,” that is, to build an independent project with the resources in the commons. Even though forking is rarely exercised, the potential threat to fork can keep project administrators in check and ensure good governance (Weber 2004; Tkacz 2015).

While not every CBPP project is able to “scale,” or prosper by attracting enough participants to sustain mature project development, high-profile examples such as Linux, Firefox, and Wikipedia have shown viable alternatives to the proprietary model. These high-profile projects disrupted their respective markets and even displaced powerful players in their fields. Since the first decade of this millennium, scholars have revisited the theoretical assumptions of intellectual property law, considered institutional reforms, and advocated for openness in culture and knowledge production (Biagioli, Jaszi, and Woodmansee 2011; Chen 2011; Suber 2012). Scholars have also paid attention to lay contributions to invention, such as users’ role in designing and improving surgical equipment and surfboards (von Hippel 2005). Together, these bodies of literature have helped to challenge the dominance of the proprietary model of cultural production and to push back against proprietors’ continuous demands for greater control over cultural resources.

Announced in 2007, GZ fits well with this zeitgeist: the pursuit of a more collaborative and inclusive environment for a knowledge commons through peer production (Madison 2014). With their porous structure and voluntary nature, web-based crowdsourced citizen-science projects follow CBPP design principles, such as modularity and granularity. Their flexible task designs help to attract people with different levels of capacity or motivation to participate, and to continuously cultivate their interest in the projects (Benkler 2006; Schrier 2016). Nevertheless, web-based crowdsourced citizen-science projects are not exactly CBPP. Rather, they are generally designed and operated by scientists within research institutions, and their organizational structure is more centralized and hierarchical. As will be discussed below, both the peer-production model and the Mertonian norm of communitarianism in scientific research have helped to move the research results of crowdsourced science toward the “commons.” Nevertheless, lay participants’ contribution to crowdsourced projects does not necessarily challenge the current hierarchy of knowledge production, nor does it guarantee a more democratized “republic of science” in which scientists assert less authority over the lay public (Polanyi 1962).2

Crowdsourced citizen science may be conceived as an efficient production model for scientific research, if not merely an answer to the lack of resources (funds, manpower, time). Scientists can process more data, at a faster pace; publish more papers; and gain access to more resources (Sauermann and Franzoni 2015). Yet what are the gains and rewards, if any, for lay participants? This is not merely a question about motivation, as many have asked after seeing the surge of crowdsourced science (Raddick et al. 2013; Schrier 2016). Rather, it is also a question about how empowered lay persons can become through their participation, both in terms of obtaining access to scientific research results and in terms of gaining symbolic resources such as reputational prestige, which may grant them access to more power and resources in the institution of knowledge production (Fuchs and Turner 1986). Or, more simply, as some lay participants have asked, is this model of science production exploitative, unfair, or unethical (Nowell 2016)?

Below I begin with a brief review of the existing institutional policies on citizen science. Based on my own observations and the existing literature, I discuss two aspects of how research results are distributed within crowdsourced citizen-science projects: access to research results in the form of data and publications, and the forms of credit received by citizen scientists. Asteroid Zoo, a Zooniverse project, raises a third aspect of distribution that deserves attention: can for-profit companies administer citizen-science projects? The tension between lay participants and the platform administrators in this case also hints at important questions about platform and project governance, although a full examination of both issues is beyond the scope of this article.

2 Existing Institutional Policies on Citizen Science

Zooniverse, the platform that grew out of GZ, defines a simple set of policies with which projects must comply if they are to be promoted to the Zooniverse community via its platform: 1) projects should be “producing useful research” and aim to turn efforts into a formal publication; 2) projects should make classification data open after a proprietary period; 3) results should be communicated to the community, for example, via open-access publications or blogs (Zooniverse, b).3 These policies address the intended “publicness” of citizen-science projects—scientific research is not just to benefit professional scientists but is for the common good, and the research results should eventually be accessible to all. Such publicness implicitly justifies the plea for lay participants to contribute to scientific endeavors. Similarly, the European Citizen Science Association (ECSA) lists ten principles of citizen science, among them requiring projects to ask research questions and have genuine science outcomes (principles 1 and 2); giving citizen scientists feedback (principle 5); making data and metadata publicly available (principle 7); and publishing in an open-access format whenever possible (principle 7) (ECSA 2015).

Compared to the Zooniverse policies, the ECSA principles declare a stronger commitment to the democratization of science (principle 6). ECSA makes it clear that citizen scientists can play more important roles in science—they can be collaborators or project leaders (principle 1) and can take part not only in data collection and analysis but also in the problem-setting and research-design process (principle 4). ECSA believes that citizen scientists should benefit from their participation (principle 3) and should be acknowledged in project results and publications (principle 8). While the Zooniverse is focused on research science and asks projects to have formal publications as their primary outcome, ECSA also notes the impacts citizen science can have on policy (principles 3, 9).

Overall, ECSA bestows more power on citizen scientists, in their participation in both research and regulatory science. One reason could be that the Zooniverse is focused on web-based crowdsourced academic science, whereas the ECSA principles acknowledge more forms of citizen engagement in science. Zooniverse actually does not use the term citizen science in its website description but calls itself “people powered research”—anyone can volunteer to assist professional scientists and contribute to “real academic research” (Zooniverse, a). Nevertheless, one cannot really disassociate Zooniverse from citizen science, as it is the web portal of the Citizen Science Alliance (CSA). Chris Lintott, cofounder of GZ and Zooniverse, acknowledges that they use the term citizen science to describe their projects in academic papers. The term was intentionally chosen over crowdsourcing because they want participants to ask and answer their own questions (Lintott 2013a).4

3 Access to Data and Research Results in Crowdsourced Citizen Science

Scientific research is presumably intended to enrich human knowledge. This assumption forms the basis of the Polanyian republic of science, in which scientists are collaboratively putting together one big puzzle in order to benefit mankind (Polanyi 1962). Research activities do not produce results that are of direct monetary value for the scientists. The scientific claims—facts or natural laws—are part of nature and stay in the public domain. Rather, scientists receive credits, which are symbolic capital that can be converted into something of value in the economy of academic science, such as career promotions, prizes, or grants (Biagioli 2000). Scientists who conduct successful crowdsourced science may receive resources to carry out future research, which also presumably benefits the public, including those who have contributed to the previous projects. Hence, theoretically, crowdsourced citizen science is a win-win situation for both lay participants and the professional scientists, and the question of exploitation (Resnik, Elliott, and Miller 2015) is not so acute.

Yet, if lay participants seek to go beyond the assigned task in a crowdsourced project and develop their own research agenda, they have to overcome various access control points: to the raw research materials, to the intermediate research outcomes, and to the final publications. This is because academic knowledge production is complicated by intersecting economic systems and ownership regimes. For example, the raw research materials can be costly to acquire and maintain. As these materials form the basis for scientific research, scientists and institutions tend to treat both physical objects (e.g., specimens) and raw data as private property and lock them away from the public or other researchers, even though they are often funded by public money. In web-based crowdsourced projects, the data to be analyzed is digitized and can be transmitted or accessed with fewer physical constraints. Nevertheless, following the design principles of modularity and granularity, lay participants are typically presented with fragments of the whole data set. Neither the Zooniverse policies nor the ECSA principles address the issue of giving lay participants access to the raw data.

Both ECSA and Zooniverse require projects to provide certain access to research results, including intermediate results (e.g., classification data) and the publications. Even though scientific claims cannot be owned, the academic papers that embody them are copyrightable. As authors of academic papers usually have to assign their copyright to the publishers and are not financially rewarded for their publications, they make a strong case against the conventional rationale that supports copyright and proprietary control. Researchers are motivated both by intrinsic incentives (e.g., producing good scholarship to benefit society) and by extrinsic incentives (reputation and impact), which can be converted into currency in the academic economic system. In fact, academic authors may find the copyright system counterproductive for their main purposes, as rent-seeking publishers may overcharge individuals and research institutions, making their scholarship less accessible and less influential.

Nowadays, not only are academic papers frequently prohibitively expensive for lay individuals, but research institutions also often find subscriptions unaffordable. As part of a larger critique of copyright law, the open access (OA) movement embodies the zeitgeist that champions less restrictive access to knowledge and scholarship in the age of the Internet (Budapest Open Access Initiative 2002).5 The OA movement has addressed this issue by asking researchers to make their publications available in institution-based archives or asking journals to lift the paywall. The movement went beyond journal articles to include research data and supplemental materials (Open Access Max-Planck-Gesellschaft 2003). One should note that “data” is not copyrightable per se, but the arrangement of data can be protected by either copyright law or sui generis database law. Advocates for open access to data argue that a mandate to make data freely available both makes research more efficient for the science community and enables better scientific practices, as data can be reused and verified by other researchers (Pampel and Dallmeier-Tiessen 2014). Another argument for the OA movement is rather political and distributional: for scholarship that is funded by tax money, the public has already paid for the research and should not be charged again for accessing the results (Suber 2012). This argument works for both open access to journal articles and open access to research data (Pampel and Dallmeier-Tiessen 2014).

With the larger critique of copyright and open-access advocacy looming in the background, it is no surprise that some citizen-science participants are aware of these issues. They have all the more reason to demand open access in these projects, since they are not only taxpayers but also contributors to the data analysis or data collection. The Zooniverse policies and the ECSA principles both address the issue of open access to research data and publications. Yet these principles are not always self-evident to researchers who have solicited contributions from the public. GZ’s lay participants brought the issue of open access to the project investigators’ attention and advocated for this principle. In 2013, six years after GZ began, a contributor asked on the project’s forum, “Why must zooites who created the GZ data ‘pay’ to access the key paper on it?” GZ did not seem to have a considered position on this issue at the time, and a forum moderator initially responded by noting that scientists (institutions) also have to pay for their access. Unsatisfied with the answer, the same contributor pointed out that GZ’s papers were behind a paywall even though GZ had published with publishers that offer an open-access option. The science team then argued that since they post preprints on arXiv.org, a popular open-access archive, it seemed unnecessary to pay the substantial fee that publishers ask to remove the paywall. This contributor then verified GZ’s statement by surveying all the published GZ papers and found that two of them did not offer open-access preprints (Tate 2013). Soon, Lintott announced that Oxford University Press would remove the paywall for the Zooniverse papers published in Monthly Notices of the Royal Astronomical Society, where the GZ team frequently submits for publication (Lintott 2013b).6 In terms of data, as mentioned, the Zooniverse policy does ask projects to provide open access to classification data, although it permits a proprietary period and does not set a limit on its length.7

Scholars have argued that lay participants, although they lack formal training, can bring insights from their knowledge and interests, and offer opportunities for discoveries (Lukyanenko, Parsons, and Wiersma 2016). But when lay participants notice scientific questions that are not addressed by the project design and attempt to answer them or to confirm the feasibility of their own research question, they need access to the raw research materials. GZ ran into this situation soon after the project started—a group of participants noticed a category of objects that was unknown to the science team (and was later confirmed as the discovery of a new kind of galaxy, the Green Peas; see below). As Lintott aspires for citizen scientists to be able to ask and answer their own questions, the GZ team did offer these participants access to the database to run an automated search for similar objects, which was an important step toward that discovery. Nevertheless, access to the raw research data is not addressed by the Zooniverse policies and seems to depend on the goodwill of the individual science team.

4 Authorship in Citizen-Science Projects

What motivates scientists more than copyright ownership is prestige and reputation, the symbolic resources controlled by the publication system. While scientists may not acquire copyright ownership over the publications made possible by citizen scientists’ efforts, they do garner reputations as productive and prolific researchers. As GZ itself notes, one important sign of its success is the time granted to it on some of the largest telescopes in the world (Galaxy Zoo, a). The competition for access to these large telescopes is intense (McCray 2004). For GZ, citizen scientists’ collaborative efforts were instrumental to its reputation as a project capable of putting data to good use, and hence strengthened its bid for access to this extremely scarce resource.

The distribution of reputational prestige in citizen-science projects is a key question for those who see the inclusion of lay participants as a catalyst for democratizing science. Only a small number of lay participants have received more valuable forms of credit, such as having a celestial body named after them as its discoverer or acquiring the status of coauthor of an academic paper. While not every lay participant cares about credit, the symbolic power underlying such formal recognition can matter if they intend to go beyond the roles designated by scientists. In the Polanyian republic of science, scientists exercise authority over the lay public. Citizen science suggests a more inclusive republic, but are lay participants entitled to challenge their assigned role, dispute judgments made by the scientists, or question the boundary between themselves and those with full membership in the Polanyian republic? This lack of entitlement in the scientific community can affect lay participants’ claims to access existing literature or research materials that are currently behind publishers’ paywalls or institutions’ concrete walls. Even when citizen-science projects make their publications open access, lay participants need access to many other paywalled papers in order to pursue their own inquiries. Lawmakers, when considering stronger copyright laws, are more willing to grant scientists exceptions as a way to balance private and public interests. Because lay persons lack the credentials and reputations of persons capable of producing valid knowledge, past legislation has granted them less access to existing scholarship and research materials (Chen 2015; Malcolm 2018).

The problem of determining whose contribution deserves coauthorship is a long-standing one. Being an author typically means that the person has made a significant contribution to the result, participated in the writing and reviewing of the paper, and hence is responsible for the scientific claim. Nevertheless, in some cases labs have developed their own norms for assigning authorship—for example, having a standard author list instead of tracing the contribution of individual researchers in order to determine authorship (Biagioli 2000). Coauthorship can also serve as a token of recognition for certain interactions rather than actual collaboration (Katz and Martin 1997), for example, as a form of compensation for material exchange (Rodriguez et al. 2008). Coauthorship is thus a vital form of symbolic capital. Scholars have found that some collaborators are listed in acknowledgments rather than as coauthors and that a better way to study collaboration patterns is to look beyond authorship and explore the acknowledgments (Paul-Hus et al. 2017). Nevertheless, being acknowledged is less valuable than coauthorship as a form of academic credit and currency.8

In general, most coauthors of papers resulting from citizen-science projects are still professional scientists from relevant departments in academic institutions, with personnel from information-technology consulting firms or computer scientists as a recurring exception. Almost every GZ paper published between 2008 and 2012 listed information technologists as coauthors.9 In fact, these experts have become more than occasional collaborators, forming an integral part of web-based citizen-science research. The affiliations of information technologists in recent GZ papers show that their positions have become formalized (Zooniverse, c).

There is no unified practice in terms of how citizen scientists are credited or recognized for their contributions. In Galaxy Zoo, for example, lay participants are generally acknowledged as a collective in a footnote containing a URL that directs readers to the project webpage listing registered users. This page is titled “project volunteers” but uses “authors” as a subdomain name (Galaxy Zoo, b). Yet, since GZ does occasionally recognize individual lay participants as coauthors of a paper, it would seem that the other participants are more “volunteers” than “authors.” For astronomy-related projects, the discovery of a new celestial object seems to be a common standard for recognizing coauthorship on the paper announcing the finding. Hanny van Arkel, the Dutch schoolteacher who identified a new object (Hanny’s Voorwerp) while participating in GZ in 2007, is a coauthor of the paper (hereafter HV paper) reporting the discovery (Lintott et al. 2009). Stardust@home makes it a principle on its project website that the “discoverer of an interstellar dust particle appears as a coauthor of any scientific paper” by the team announcing the discovery.10 In a 2016 paper, Radio Galaxy Zoo lists two contributors as coauthors for discovering a new galaxy cluster (University of Minnesota 2016), with “Zooniverse Citizen Scientists, c/o Oxford Astrophysics” as their affiliation (Banfield et al. 2016).

Other kinds of contribution to “discoveries” may not guarantee coauthorship. For GZ, another early major achievement was the discovery of the Green Peas, a new class of galaxy. Via the online forum, a group of enthusiastic contributors (referring to themselves as the “Peas Corps”) discussed the new class of objects they had noticed while categorizing galaxies (Zooniverse 2009a). They raised the research question—what is this repeatedly appearing type of object?—and soon began to tackle it. They collected more potential examples, explored the existing literature, proposed criteria based on galaxy spectra, and used automated methods to search the database for more evidence (Zooniverse 2009b). The citizen scientists involved did not have formal training and lacked an authoritative way to discount borderline examples and resolve ambiguities; therefore, they did not push the research all the way through to claim the discovery. In the last stage, GZ scientists stepped in and completed the research by narrowing down the Peas candidates, performing analyses, and publishing the paper announcing the discovery (hereafter GP paper) (Straub 2016). The GP paper individually acknowledged several members of the Peas Corps instead of listing them as coauthors (Cardamone et al. 2009). The Peas Corps did not dispute the decision not to make them coauthors. This could be partly because the GZ scientists made sure to communicate the research process to the community and maintained a good collaborative relationship throughout (Straub 2016), and partly because these citizen scientists did not find attribution to be their main concern (Nowell 2016).

From various accounts of the discovery of Hanny’s Voorwerp, it seems that scientists stepped in once van Arkel asked on the discussion board what it was that she had spotted. Her role was rather passive while the scientists identified the object. The discovery of the Green Peas involved far more effort from citizen scientists (van Arkel among them), yet the GP paper did not name them as coauthors the way the HV paper named van Arkel. One possible reason for the different treatment is that the former situation—identifying a new celestial object—has a clear rule of authorship to follow. Kevin Schawinski, cofounder of GZ and a coauthor of the GP paper, acknowledges that the Green Peas example was “a genuine citizen science project, where the users were directly involved in the analysis” (Space Daily 2009). If, as in Straub’s (2016) account, the main obstacle for members of the Peas Corps was that they lacked the capability and access to the tools necessary to rule out borderline examples, then the story speaks to the importance of the distribution of symbolic resources for citizen scientists in the long run. For engaged citizen scientists who wish to take a larger role in future research through their participation, this is a catch-22 situation. If only scientists with the right credentials can push a project far enough to claim discoveries, can lay participants ever accumulate enough credits through citizen-science projects to conduct science themselves? If different labs and disciplines have been able to develop their own norms of assigning authorship, perhaps citizen-science projects can also consider new ways to credit citizen scientists, which would gradually empower them to perform more active roles, gain access to the necessary research tools and materials, and eventually blur the line between professional scientists and the lay public.

The science team in a citizen-science project may not have the final word on how to credit citizen scientists. Journals not only serve as the gatekeepers of scholarship but also have their own institutional reputation. In 2016, editors of the Journal of Molecular Biology put on hold a paper submitted by the EteRNA project, citing “ethical concerns” because three of the paper’s coauthors used screen names. Before submission, the EteRNA science team had engaged participants in communal discussions about whether these coauthors’ real names were required. The science team was willing to carry out the communal decision and considered submitting the paper elsewhere, but the three citizen scientists relented and gave their real identities. The journal editor who put the paper on hold argued that it is inappropriate to use a screen name because in academic publishing, authors must take responsibility for the integrity of a paper and must be reachable if accountability issues arise (Bohannon 2016). The question of whether each author can be held accountable for the integrity of the whole paper is not new and certainly not specific to citizen science. Here, the science team had no difficulty reaching these three coauthors. It is also possible for projects to demand verifiable contact information when citizen scientists register with screen names. The editor’s real concerns are likely to be something other than pseudonymous participants’ reachability.

As noted by some journal editors, real-world identity is important in order to examine potential conflicts of interest in scientific research (Bohannon 2016). Even if a pseudonymous author is reachable through the contact information he or she provided, it is extra work and cost for editors and readers to confirm the author’s identity, and it would be more difficult to rely on the public to spot potential violations of research ethics (not that journals particularly welcome public-review projects such as PubPeer). When scientists guarantee the integrity of the research, their own membership in the community and career opportunities are at stake.11 Even when lay coauthors are credited using their real names, editors may still be concerned about the accountability that accompanies authorship. Without an institutional affiliation or a prospective career in academia, what cost would lay persons bear if they made false claims?

Although journals have legitimate concerns about accountability, citizen-science projects also have their own concerns about asking participants for their real identities. Whether online collaborative projects should allow anonymous or pseudonymous participants has long been debated. In fact, anonymous contributors often make high-quality contributions. Asking participants to register and log in every time they contribute can create disincentives such as user privacy concerns and extra entry costs. The collection of real names may deter participation from the very beginning. Requiring a screen name or a persistent identity already mitigates the accountability issue by adding a layer of reputational cost within the community of participants (Anthony, Smith, and Williamson 2009). If citizen-science projects eventually need to provide journals with the real names of all coauthors, will project coordinators have to ask for real identities when participants begin to contribute? The collection and maintenance of such personal information are likely to impose an extra burden on the scientists. Would it be possible for journals to settle on making the coauthoring professional scientists accountable for the citizen scientists who choose to remain pseudonymous? As in the EteRNA paper, the coauthoring professional scientists have no less at stake than journal editors and publishers when worrying about the accountability of their pseudonymous colleagues. Yet the scientists in the EteRNA paper clearly had no problem letting them remain pseudonymous. The EteRNA case could be seen as an example of scientists experimenting with new forms of credit attribution in a citizen-science project, but it was not approved by the journal, an institutional gatekeeper in knowledge production.

Overall, determining coauthorship in citizen science can be complicated—it involves the usual difficulties in determining an individual’s contribution within a large team, as well as the common problem of weighing contributions made by collaborators from different disciplines. Moreover, as an organization that exercises normative control in science, a journal can decide whether to approve the evaluation made by the science team or the collective decisions made by the science team and lay contributors. Besides the evaluation of an individual’s actual contribution, other thorny issues are the general perception of lay persons’ capability to perform science, whether they have a (prospective) career, status, or honor to lose if they are unable to defend their claims, and how else they can be held accountable. Arguably, the perception of what lay persons are capable of has also evolved with the development of citizen science and within individual citizen-science projects. GZ gave lay participants relatively simple tasks in its first phase. Boosted by the “newfound confidence in the ability of . . . volunteer classifiers,” the investigators gradually made the tasks more challenging (Galaxy Zoo, b). In 2013, Galaxy Zoo Quench began as an experiment to invite lay participants to “experience the full scientific process”: to “classify, analyze, discuss and collaboratively write a paper” (Galaxy Zoo, c). This project sounds inclusive but also disciplining—with professional scientists showing and tutoring lay persons in how science is (traditionally) done throughout the process. Participants completed the first stage (classification) within two months; nevertheless, the project became dormant in 2014 and has not led to any formal results. It is still unclear how the authorship practice in such an experiment would differ from that of previous GZ publications.

5 Citizen Science versus For-Profit Companies? Lessons from Asteroid Zoo

The Zooniverse policies and the ECSA principles use slightly different language, but both ask projects to be “real”—to ask genuine science questions. Zooniverse, with its stronger emphasis on research science, particularly asks projects to aim to publish their results as academic papers. As mentioned earlier, academic research results presumably benefit the public. Even though there are still distribution problems such as open access and authorship, at least scientists generally do not receive proprietary gains from the research results. The following case is a rare example of a citizen-science project run by a for-profit company, which presumably had goals beyond simply doing science, if there was indeed a science project at all.

In May 2016, Zooniverse paused the Asteroid Zoo (AZ) project, stating that the community had exhausted the data currently available. The project would undergo further examination; in particular, there seemed to be a gap between the data quality offered by the project and what was required by the institution (the Minor Planet Center) that has the authority to confirm the results (Zooniverse 2016).12 After the project was paused, a participant questioned the real reason behind the decision, noting that the data did not seem to have been exhausted. But as this participant noted, it is hard to argue when the other side—the science team—is generally silent (MvGulik 2016). Indeed, his question remained unanswered after several months, and there were no updates from AZ’s science team or Zooniverse. This could simply mean that the examination and analysis required more time. However, digging deeper into the project, one notices that the lack of communication between AZ’s science team and the community had been a long-standing problem.

AZ was launched in June 2014 by Planetary Resources (PR), a for-profit company focused on asteroid mining. Only five months into the project, the lack of communication from the science team had already led to complaints: contributors’ queries and reports were left unanswered, and the science team did not post updates on its own findings. Zooniverse helped the community to nag the science team, but the responses were sparse (Hightower73 et al. 2014). In February 2016, several community members lost patience and questioned PR’s actual agenda. One participant claimed to have communicated the situation to Zooniverse and pleaded with fellow contributors to be patient. In the following weeks, participants’ doubts increased, and they finally called out the project for violating Zooniverse’s ethical rule (“don’t waste people’s time”) and policies (the team did not turn the work into a formal paper, make the data open, or communicate with the community), and resented AZ’s lack of recognition and the potential exploitation (Hightower73 et al. 2016). In May 2016, the project was paused abruptly. Aside from the blog post announcing the news, there was not much public information about this decision.

To be fair, PR disclosed itself as an asteroid-mining company in the press release launching AZ in 2014 (Planetary Resources 2014). In the previous year, PR had gained a lot of media attention for running a very successful crowdfunding campaign.13 PR had been designing a space telescope, named Arkyd, to detect exploitable asteroids. Through this campaign, PR would raise funds to set one Arkyd aside as “the first space telescope for public use.” After raising US$1.2 million in four weeks, PR added a citizen-science project (later known as Asteroid Zoo) as a stretch goal in the last stage to further boost the crowdfunding campaign—gaining an extra US$300,000 in the last forty hours.14 PR later made a grant to Adler Planetarium/Zooniverse to run AZ, with the goal of detecting potentially hazardous asteroids and advancing “automated asteroid-searching technology for telescopes on Earth and in Space, including [PR’s own Arkyd]” (Planetary Resources 2013). AZ received much publicity as a project calling on citizen scientists to help defend Earth from hazardous asteroids and “to provide real benefit to Earth.” Nevertheless, the part about how human classifications could improve automated detection for telescopes, including Arkyd, was not picked up as often (Wall 2014). Coincidentally, around the same time AZ was paused in 2016, PR announced a full refund for the crowdfunded public space telescope plan (Planetary Resources 2016a). In 2016, the company received a US$21.1 million investment for its new business operation, which aims to send ten Arkyd telescopes into Earth orbit for observation (Boyle 2016).

Neither the Zooniverse policies nor the ECSA principles require citizen science to be led by researchers at academic institutions. The existing Zooniverse policies only guarantee the publicness of citizen science by asking projects to be serious about science and to publish their findings. The fact that PR is a for-profit company naturally raises the question of PR’s real agenda for the claimed citizen-science project, and whether there is a way to separate the science, especially if it is made possible by the public, from PR’s for-profit activities. Unlike scientific research, whose main outcome—knowledge, scientific claims—is supposed to go into the public domain, an asteroid-mining company seeks to enclose the tangible resources it exploits in outer space and turn them into its own property. Although asteroid mining still sounds like sci-fi to most people, the company plans to launch its first commercial prospecting mission in 2020 (Planetary Resources 2016b). PR also lobbied for the legal infrastructure permitting such enclosure—when PR launched the crowdfunding campaign in 2013, it also hired a lobbying firm to push for a bill to legitimize the asteroid-mining industry (Shaer 2016). The US Congress passed the Commercial Space Launch Competitiveness Act in 2015, which allows US citizens to own the minerals they extract from the Moon or asteroids. PR also made deals with Luxembourg, which passed a law permitting foreign companies established in the country to own minerals from asteroid mining, and whose government invested US$28 million to become one of PR’s largest business partners (Nowakowski 2016).15

AZ, the citizen-science project, claimed to defend Earth from hazards for the benefit of mankind and to detect new asteroids as scientific discoveries. Yet, as identifying resource-rich asteroids is important for PR’s business, the citizen-science project also potentially improved the company’s technology by providing useful data and increasing the probability of successful exploitation. PR’s prospective space-mining business and the relationship it might have with AZ were never clearly explained. Had it been made clear that AZ might contribute to the enrichment of the US space-mining company and its investors, many might not have seen it as a citizen-science project and might not have participated.

6 Platform Governance and Projects Proposed by For-Profit Companies

AZ participants blamed PR alone for noncommunication, nonperformance, insincerity, and potential deceit, but expressed trust in and appreciation for Zooniverse. Nevertheless, Zooniverse as a citizen-science platform also played an instrumental role in soliciting contributions to the project, which is precisely why Zooniverse bothers to establish policies for projects to comply with if they are to be promoted on its platform. AZ not only bore a name similar to GZ’s, it also had a legitimate appearance—when the project was live on Zooniverse, it did list an academic publication (although only a conference abstract, not a published paper) and hence complied with the Zooniverse policies (Beasley et al. 2013). Moreover, one coauthor of the paper was Zooniverse’s own Lintott. In other words, AZ appeared to be a project blessed with Zooniverse’s endorsement if not its direct involvement. AZ may be an outlier among Zooniverse projects, but the trust in and appreciation for Zooniverse could be harmed or lost in the long run if Zooniverse allowed similar situations to recur. Even if the AZ science team later came back with more analyses and data, Zooniverse as a platform should help contributors demand more transparency about the project agenda and an adequate disclosure of conflicts of interest.

Should Zooniverse make it a policy to host only projects initiated by research institutes? For-profit companies may be able to produce formal publications and contribute to science. They may also be capable of designing suitable projects for harnessing the wisdom of the crowd—many have done so through Amazon Mechanical Turk (MTurk), a crowdsourcing marketplace where one can recruit and coordinate workers to perform tasks that require human intelligence. Turkers generally walk away with their payments and do not demand that the company disclose its agenda or communicate the results.16 But Zooniverse and citizen scientists are different from MTurk and turkers. When a for-profit company conducts a citizen-science project whose research results are not mere scientific discovery and symbolic capital, participants are more likely to question the distribution of the company’s tangible and proprietary gains and will need to be convinced to volunteer their time and efforts. Without a disclosure of the potential conflict of interest, participants may feel deceived and exploited. Also, even if for-profit companies do follow the Zooniverse policy of making their data and research open access after a proprietary period, they could have already gained an advantage as the first mover in a competitive market. Moreover, neither the company nor its scientists face pressure to publish and communicate their findings to the public. Even if a for-profit company that conducts citizen science has the manpower to communicate with participants for mutual benefit, it may not prioritize such efforts. The scientific community cannot effectively sanction for-profit companies and their scientists for violating its norms, as they are beyond its normative control.

It may be true that, given this track record, PR’s reputation will prevent it from starting another Zooniverse project or crowdfunding campaign. But with the legal infrastructure for outer-space mining already in place, PR can attract investors more effectively and may not need to do either again. In other words, aside from removing such projects from the platform, Zooniverse and the scientific community really have no effective way to sanction for-profit companies that violate the ethics and principles of citizen science. Zooniverse and the citizen-science community should take AZ as a hard lesson, reconsider whether to allow projects proposed by for-profit companies, and develop adequate policies. After all, a platform’s reputation and its ability to attract future participants are at stake. When a citizen-science platform opens its doors to lay participants, it should be responsible to them by ensuring its norms and policies are capable of addressing issues important to citizen scientists, as well as general ethical questions in science, such as conflicts of interest and research ethics (Zooniverse 2018), or it may lose enthusiastic participants in the long run.

7 Project Governance in Citizen Science

Although AZ participants had been expressing their concerns about the project on its “Talk” forum as early as November 2014 (Hightower73 et al. 2014), I find that AZ participants generally expressed an exceptional level of patience, understanding, and trust over the course of the project. After the early complaints, the science team remained largely incommunicative, yet the participants continued to contribute for another year and a half—both as citizen scientists carrying out the task and as community members who demanded that AZ comply with the “norms” of citizen science. Both PR and Zooniverse should be accountable for the trust vested in the project, the emotional distress, and the many hours of work spent on the project. AZ contributors were repeatedly told by Zooniverse that the AZ scientists might be busy. While this statement could be true and was repeatedly accepted by the community, the delayed responses and the sudden decision to pause the project raise questions about project and platform governance: Are there appropriate norms and policies to deal with such a situation? Should there be more transparency about such decisions? Could contributors be more involved in finding out what actually happened? Can contributors question the decisions made by the science team of an individual project or by the platform administrator? And can citizen scientists disagree with the decision and decide to move the project forward on their own?

As mentioned earlier, in CBPP, forking is an important way for community members to keep the project administration in check (Weber 2004; Tkacz 2015). When internal disagreements become critical to the project’s goals or values, one can (threaten to) copy the freely licensed source code or content and start a parallel project. The Spanish Wikipedia community forked in 2002 over a disagreement with Jimmy Wales, who was considering allowing advertising on Wikipedia to offset its growing operational costs (Enyedy and Tkacz 2011). This incident helped the larger Wikipedia community’s push to set the no-advertising principle in stone and eventually to transfer Wikipedia from a for-profit company to a nonprofit organization (Chen 2011). To start a fork, one of course has to resolve capacity and technical issues in order to host and process the amount of data involved, but the tougher challenge is whether contributors will follow the fork and whether the forked project can achieve a working model (Tkacz 2015). Yet a fundamental difficulty for forking citizen-science projects is data access. As mentioned earlier, citizen-science projects are not exactly CBPP. Existing norms only address open access to publications and to the analyzed data. Raw research data is largely controlled by research institutions, and access to it depends on the goodwill of each science team. Without access to the same or equivalent research materials, even if a science team’s ability to carry out the research is questioned, lay participants or other professional scientists interested in the topic will not be able to fork, and the possibility of contributors using the threat of forking to compel good project governance is also slim.

8 Conclusion

Crowdsourced citizen science is an efficient method for research scientists to collect and process data. While contemporary web-based citizen science may not have been foreseen by Polanyi, these projects match his ideal of spontaneous coordination in science—scientists are able to engage a large number of helpers to put together a big puzzle, reducing the amount of time needed. Yet the Polanyian republic is reserved for the scientists, who, as a collective, uphold scientific authority over the lay public and determine whether a novice can be admitted to the community. As citizen-science projects now provide new ways to participate in producing scientific knowledge, will lay participants ever be recognized as independent scientists through this process? The ultimate concern of this research is whether citizens can challenge the existing boundary of the institution of science or technocracy with the knowledge and persuasive skills they acquire through participating in citizen science, and thereby engage more effectively in technical decision making in society (Jasanoff 2003). While what Ottinger (2016) calls “social movement–based citizen science” may more directly address these questions, such activist groups also often rely on credentialed scientists to increase the legitimacy of their research and translate their results in regulatory proceedings. By focusing on citizen science employed in academic science, this article asks whether citizens can advance their positions within the institution of science itself and gain legitimacy as qualified knowledge producers.

This article examines existing norms in web-based crowdsourced citizen-science projects and discusses the distribution of project results, such as data and publications, as well as of symbolic resources, such as credit, authorship, and prestige. Admittedly, citizen-science projects have some implications for democratizing the republic of science, as lay persons are given more access to the data and publications they help to produce, and the science teams constantly answer inquiries from lay participants and communicate intermediate results before the final publication. Whether such communication is carried out from a commitment to engage lay participants in the knowledge-making process or as a way to retain lay participation by cultivating relationships, citizen-science projects have allowed more access to the research process and have provided participants with some disciplinary training. Nevertheless, the normative structure of scientific organizations remains largely unchallenged, as the assignment of coauthorship and access to raw data remain largely the decisions of scientists and scientific institutions. Authorship assignment affects citizen scientists’ long-term ability to build reputational prestige, and access to data affects their ability to challenge project governance, as we have seen in the Asteroid Zoo case. The phenomenal success of prominent projects has led to the institutionalization of citizen science, and existing institutional policies already address important questions. Yet these policies are still insufficient to make citizen-science platforms and projects accountable to their contributors.

Acknowledgments

The initial version of this article was presented at the conference Empowering or Disciplining the Citizenry through Citizen Science: Historical and Normative Perspectives on Knowledge and Power in 2016. I express my gratitude to the Institutum Iurisprudentiae, Academia Sinica, for supporting the research and the conference; to Wen-Tsong Chiou, Chuan-Feng Wu, Kuang-Chi Hung, and Anne S. Y. Cheung for co-organizing; and to Ellie Yu-Hui Huang for providing research assistance. I also thank Tilman Bayer, Mario Biagioli, Tyng-Ruey Chuang, Hsiang-Fu Huang, Ming-Li Wang, the peer reviewers, and the journal editors for their thoughtful suggestions and comments.

Notes

1. Within Galaxy Zoo’s first year, 150,000 people made 50 million classifications. Considering the estimate that a graduate student working full-time makes 50,000 classifications per month, it would take a single student about 1,000 months to complete the same amount of work. The scale and the quality of volunteer contributions are both crucial to the project’s success. As each image receives multiple categorizations, GZ scientists are able to discount the few questionable categorizations through statistical methods (Galaxy Zoo, a).
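To make the arithmetic and the aggregation idea concrete, the sketch below works through the 50,000,000 ÷ 50,000 estimate and a simple vote-fraction consensus rule; the 0.8 agreement threshold is a hypothetical value chosen for illustration, and Galaxy Zoo’s actual debiasing and user-weighting scheme is more elaborate than what is shown here.

```python
# Illustrative sketch only: the note's back-of-the-envelope arithmetic plus a
# simple vote-fraction consensus rule. The 0.8 threshold is hypothetical;
# Galaxy Zoo's actual weighting scheme is more elaborate and not reproduced here.
from collections import Counter

def workload_in_months(total=50_000_000, per_student_month=50_000):
    """Note 1's estimate: 50 million classifications / 50,000 per month ~= 1,000 months."""
    return total / per_student_month

def consensus(votes, threshold=0.8):
    """Return the most common label if its vote fraction clears the threshold;
    otherwise mark the image as uncertain so it can be set aside for review."""
    label, count = Counter(votes).most_common(1)[0]
    fraction = count / len(votes)
    return (label, fraction) if fraction >= threshold else ("uncertain", fraction)

if __name__ == "__main__":
    print(workload_in_months())                                     # 1000.0
    print(consensus(["spiral"] * 9 + ["elliptical"]))               # ('spiral', 0.9)
    print(consensus(["spiral", "elliptical", "merger", "spiral"]))  # ('uncertain', 0.5)
```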

2. The backgrounds of citizen-science participants vary. There may be well-trained individuals among them, though they may be contributing to a field different from their own profession or not contributing in their institutional capacity.

3. One additional policy asks projects to acknowledge Zooniverse in any publication and to report publications back to Zooniverse.

4. Margaret Kosmala (2016), a scientist who runs web-based citizen-science projects, takes “crowdsourced science” to refer to projects that pay people for their contributions, as with Amazon Mechanical Turk. Lintott’s use of crowdsourcing is broader and would include unpaid volunteers.

5. The Budapest Open Access Initiative aims to make all research articles freely available on the Internet.

6. Oxford University Press might have been friendlier to GZ and easier to negotiate with, as GZ was born and is still housed at Oxford University. GZ still relies on the preprint open-access model when it publishes with other journals.

7. Zooniverse suggests that the normal proprietary period is two years after project launch but does not impose an upper limit (Zooniverse, b).

8. When a paper has an extended list of coauthors, however, such symbolic recognition may carry too little weight to be converted into more tangible assets (Woolston 2015).

9. These coauthors' affiliations include consulting firms and web design companies.

10. "In recognition of the critical importance of the Stardust@home volunteers, the discoverer of an interstellar dust particle appears as a co-author on any scientific paper by the Stardust@home team announcing the discovery of the particle" (Stardust@home).

11. Along this line of thinking, journal editors and publishers concerned with holding coauthors effectively accountable are likely to welcome the integration, mentioned earlier, of information technology into institutions hosting citizen-science projects, which offers technologists new career paths within academic science instead of leaving them as outsiders.

12. Although the authority vested in an institution to confirm a discovery is itself an important question of power allocation and symbolic capital in science, this topic is beyond the scope of this article.

13. The crowdfunding campaign began on 30 May 2013 and is also known as the "space-selfie" crowdfunding project. For twenty-five dollars, a backer could send media to the spacecraft and have a selfie taken with Earth in the background (Lakdawalla 2013).

14. This last stretch goal ($1.7 million) was added right before the campaign closed, asking backers for an additional $450,000 in forty hours (Kemsley 2013). The campaign closed at $1.5 million.

15. In October 2018, Planetary Resources was acquired by ConsenSys, a blockchain firm (Planetary Resources 2018).

16. However, there could still be other ethical problems, for example, if the results are used to manufacture weapons.

References

Anderson, David P., Jeff Cobb, Eric Korpela, Matt Lebofsky, and Dan Werthimer. 2002. “SETI@home: An Experiment in Public-Resource Computing.” Communications of the ACM 45, no. 11: 56–61. cacm.acm.org/magazines/2002/11/6966-setihome/fulltext.
Anthony, Denise, Sean W. Smith, and Timothy Williamson. 2009. “Reputation and Reliability in Collective Goods: The Case of the Online Encyclopedia Wikipedia.” Rationality and Society 21, no. 3: 283–306.
Australian National University. 2016. “Citizen Scientists Discover Huge Galaxy Cluster.” June 15. www.anu.edu.au/news/all-news/citizen-scientists-discover-huge-galaxy-cluster.
Banfield, Julie K., et al. 2016. “Radio Galaxy Zoo: Discovery of a Poor Cluster through a Giant Wide-Angle Tail Radio Galaxy.” Monthly Notices of the Royal Astronomical Society 460, no. 3: 2376–84. doi.org/10.1093/mnras/stw1067.
Beasley, M., C. A. Lewicki, A. Smith, C. Lintott, and E. Christensen. 2013. “AsteroidZoo: A New Zooniverse Project to Detect Asteroids and Improve Asteroid Detection Algorithms.” Abstract for American Geophysical Union Fall Meeting. http://adsabs.harvard.edu/abs/2013AGUFMED51A0592B (accessed February 16, 2019).
Benkler, Yochai. 2006. The Wealth of Networks: How Social Production Transforms Markets and Freedom. New Haven: Yale University Press.
Biagioli, Mario. 2000. “Rights or Rewards? Changing Contexts and Definitions of Scientific Authorship.” Journal of College and University Law 27, no. 1: 83–108.
Biagioli, Mario, Peter Jaszi, and Martha Woodmansee, eds. 2011. Making and Unmaking Intellectual Property: Creative Production in Legal and Cultural Perspective. Chicago: University of Chicago Press.
Bohannon, John. 2016. “Fight Over Author Pseudonyms Could Flare Again.” Science 351, no. 6276: 902.
Boyle, Alan. 2016. “Planetary Resources’ Asteroid Miners Focus on Earth Observation with $21 Million in New Funding.” Geekwire, May 26. www.geekwire.com/2016/planetary-resources-asteroid-mining-earth-observation-21-million-funding/.
Budapest Open Access Initiative. 2002. “Read the Budapest Open Access Initiative.” February 14. www.budapestopenaccessinitiative.org/read.
Cardamone, Carolin N., et al. 2009. “Galaxy Zoo Green Peas: Discovery of a Class of Compact Extremely Star-Forming Galaxies.” Monthly Notices of the Royal Astronomical Society 399, no. 3: 1191–205. mnras.oxfordjournals.org/content/399/3/1191.
Chen, Shun-Ling. 2011. “The Wikimedia Foundation and the Self-Governing Wikipedia Community: A Dynamic Relationship under Constant Negotiation.” In Critical Point of View: A Wikipedia Reader, edited by Geert Lovink and Nathaniel Tkacz, 351–69. Amsterdam: Institute of Network Cultures.
Chen, Shun-Ling. 2012. “Collaborative Authorship: From Folklore to the Wikiborg.” Journal of Law, Technology, and Policy 2011, no. 1: 131–67.
Chen, Shun-Ling. 2015. “Exposing Professionalism in United States Copyright Law: The Disenfranchised Lay Public in a Semiotic Democracy.” University of San Francisco Law Review 49, no. 1: 57–121.
Citizen Science Alliance. n.d. “What Is the Citizen Science Alliance?” www.citizensciencealliance.org/index.html (accessed February 16, 2019).
Endersby, Jim. 2008. Imperial Nature: Joseph Hooker and the Practices of Victorian Science. Chicago: University of Chicago Press.
Enyedy, Edgar, and Nathaniel Tkacz. 2011. “‘Good Luck with Your WikiPAIDia’: Reflections on the 2002 Fork of the Spanish Wikipedia.” In Critical Point of View: A Wikipedia Reader, edited by Geert Lovink and Nathaniel Tkacz, 110–18. Amsterdam: Institute of Network Cultures.
European Citizen Science Association. 2015. “Ten Principles of Citizen Science.” September. ecsa.citizen-science.net/engage-us/10-principles-citizen-science.
Fan, Fa-Ti. 2004. British Naturalists in Qing China: Science, Empire, and Cultural Encounter. Cambridge, MA: Harvard University Press.
Fuchs, Stephan, and Jonathan H. Turner. 1986. “What Makes a Science ‘Mature’? Patterns of Organizational Control in Scientific Production.” Sociological Theory 4, no. 2: 143–50.
Galaxy Zoo. n.d.a. “The Story So Far.” web.archive.org/web/20170330154312/https://www.galaxyzoo.org/#/story (accessed February 16, 2019).
Galaxy Zoo. n.d.b. “Project Volunteers.” authors.galaxyzoo.org/authors.html (accessed February 16, 2019).
Galaxy Zoo. n.d.c. “Galaxy Zoo Quench.” quench.galaxyzoo.org/ (accessed February 16, 2019).
Hightower73 et al. 2014. “Are We Wasting Our Time Here?” Asteroid Zoo Discussion Board (Help), November 24. talk.asteroidzoo.org/?&ga=1.132986702.1910454792.1459904649#/boards/BAZ0000001/discussions/DAZ000070l.
Hightower73 et al. 2016. “REALLY REALLY REALLY GOOD FANTASTIC NEWS !!!!!!!!!! PLEASE READ.” Asteroid Zoo Discussion Board (Chat), February 1. talk.asteroidzoo.org/?&ga=1.132986702.1910454792.1459904649#/boards/BAZ0000002/discussions/DAZ0000arj.
Jasanoff, Sheila. 2003. “Breaking the Waves in Science Studies: Comment on H. M. Collins and Robert Evans, ‘The Third Wave of Science Studies.’” Social Studies of Science 33, no. 3: 389–400.
Katz, J. Sylvan, and Ben R. Martin. 1997. “What Is Research Collaboration?” Research Policy 26, no. 1: 1–18.
Kelty, Christopher M. 2008. Two Bits. Durham, NC: Duke University Press.
Kemsley, Tamarra. 2013. “Kickstarter Campaign Adds New Goal of Allowing Individuals to Hunt for Asteroids.” Nature World News, June 28. www.natureworldnews.com/articles/2707/20130628/kickstarter-campaign-adds-new-goal-allowing-individuals-hunt-asteroids.htm.
Kosmala, Margaret. 2016. “Is Citizen Science Ethical?” Ecology Bits, January 13. ecologybits.com/index.php/2016/01/13/is-citizen-science-ethical/.
Lakdawalla, Emily. 2013. “Planetary Resources’ Crowdfunded Space Telescope.” Planetary Society (blog), May 31. www.planetary.org/blogs/emily-lakdawalla/2013/05310914-planetary-resources-arkyd-100.html.
Lintott, Chris J., et al. 2009. “Galaxy Zoo: ‘Hanny’s Voorwerp,’ Quasar Light Echo?” Monthly Notices of the Royal Astronomical Society 399, no. 1: 129–40.
Lintott, Chris. 2013a. “Putting the ‘Citizen’ in ‘Citizen Science.’” Zooniverse (blog), June 24. blog.zooniverse.org/2013/06/24/putting-the-citizen-in-citizen-science/.
Lintott, Chris. 2013b. “(Many) Zooniverse Papers Now Open Access.” Zooniverse (blog), August 2. blog.zooniverse.org/2013/08/02/many-zooniverse-papers-now-open-access/.
Lukyanenko, Roman, Jeffrey Parsons, and Yolanda F. Wiersma. 2016. “Emerging Problems of Data Quality in Citizen Science.” Conservation Biology 30, no. 3: 447–49.
Madison, Michael J. 2014. “Commons at the Intersection of Peer Production, Citizen Science, and Big Data: Galaxy Zoo.” In Governing Knowledge Commons, edited by Brett M. Frischmann, Michael J. Madison, and Katherine J. Strandburg, 209–54. Oxford: Oxford University Press.
Malcolm, Jeremy. 2018. “The Fate of Text and Data Mining in the European Copyright Overhaul.” Electronic Frontier Foundation, April 27. www.eff.org/deeplinks/2018/04/text-and-data-mining-european-copyright-overhaul.
McCray, W. Patrick. 2004. Giant Telescopes: Astronomical Ambition and the Promise of Technology. Cambridge, MA: Harvard University Press.
MvGulik. 2016. “Asteroid Zoo Paused (May 19, 2016).” Asteroid Zoo Discussion Board (Chat), May 19. talk.asteroidzoo.org/#/boards/BAZ0000002/discussions/DAZ0000b7p.
Nowakowski, Tomasz. 2016. “Luxembourg Invests $28 Million in Asteroid Mining.” Spaceflight Insider, November 8. www.spaceflightinsider.com/missions/commercial/luxembourg-invests-28-million-asteroid-mining/.
Nowell, Rick. 2016. “Is Citizen Science Ethical?” Zooniverse Talk, March 25. www.zooniverse.org/talk/15/46052.
Open Access Max-Planck-Gesellschaft. 2003. “Berlin Declaration on Open Access to Knowledge in Science and Humanities.” October 22. openaccess.mpg.de/Berlin-Declaration.
Ottinger, Gwen. 2016. “Social Movement-Based Citizen Science.” In The Rightful Place of Science: Citizen Science, edited by Darlene Cavalier and Eric B. Kennedy, 89–104. Tempe, AZ: Consortium for Science, Policies, and Outcomes.
Pampel, Heinz, and Sünje Dallmeier-Tiessen. 2014. “Open Research Data: From Vision to Practice.” In Opening Science, edited by Sönke Bartling and Sascha Friesike, 213–24. Heidelberg: Springer.
Pasii, Anssi. 2005. “Globalization, Academic Capitalism, and the Uneven Geographies of International Journal Publishing Spaces.” Environment and Planning A: Economy and Space 37, no. 5: 769–89.
Paul-Hus, Adèle, Philippe Mongeon, Maxime Sainte-Marie, and Vincent Larivière. 2017. “The Sum of It All: Revealing Collaboration Patterns by Combining Authorship and Acknowledgements.” Journal of Informetrics 11, no. 1: 80–87.
Planetary Resources. 2013. “Update #17: Mystery Goal Revealed, Planetary Defense . . . Planetary Annihilation!” Kickstarter (posting), June 27. www.kickstarter.com/projects/arkydforeveryone/arkyd-a-space-telescope-for-everyone-0/posts/522673.
Planetary Resources. 2014. “Asteroid Zoo Is Live! Now You Can Hunt for Asteroids.” June 24. www.planetaryresources.com/2014/06/asteroid-zoo-live/.
Planetary Resources. 2016a. “Update #39: Final Update and FULL Refund.” Kickstarter (posting), May 26. www.kickstarter.com/projects/arkydforeveryone/arkyd-a-space-telescope-for-everyone-0/posts/1584844.
Planetary Resources. 2016b. “Planetary Resources and the Government of Luxembourg Announce €25 Million Investment and Cooperation Agreement.” November 3. www.planetaryresources.com/2016/11/planetary-resources-and-the-government-of-luxembourg-announce-e25-million-investment-and-cooperation-agreement/.
Planetary Resources. 2018. “ConsenSys Acquires Planetary Resources.” October 31. www.planetaryresources.com/2018/10/consensys-acquires-planetary-resources/.
Polanyi, Michael. 1962. “The Republic of Science: Its Political and Economic Theory.” Minerva 1, no. 1: 54–73.
Raddick, M. Jordan, Georgia Bracey, Pamela L. Gay, Chris J. Lintott, Carie Cardamone, Phil Murray, Kevin Schawinski, Alexander S. Szalay, and Jan Vandenberg. 2013. “Galaxy Zoo: Motivations of Citizen Scientists.” Astronomy Education Review 12, no. 1: 1–27.
Resnik, David B., Kevin C. Elliott, and Aubrey K. Miller. 2015. “A Framework for Addressing Ethical Issues in Citizen Science.” Environmental Science and Policy 54 (December): 475–81.
Rodriguez, Victor, Frizo Janssens, Koenraad Debackere, and Bart De Moor. 2008. “On Material Transfer Agreements and Visibility of Researchers in Biotechnology.” Journal of Informetrics 2, no. 1: 89–100.
Sauermann, Henry, and Chiara Franzoni. 2015. “Crowd Science User Contribution Patterns and Their Implications.” PNAS 112, no. 3: 679–84.
Schrier, Karen. 2016. Knowledge Games: How Playing Games Can Solve Problems, Create Insight, and Make Change. Baltimore: Johns Hopkins University Press.
Shaer, Matthew. 2016. “The Asteroid Miner’s Guide to the Galaxy.” Foreign Policy, April 28. foreignpolicy.com/2016/04/28/the-asteroid-miners-guide-to-the-galaxy-space-race-mining-asteroids-planetary-research-deep-space-industries/.
Shirky, Clay. 2010. Cognitive Surplus: Creativity and Generosity in a Connected Age. New York: Penguin Press.
Space Daily. 2008. “Astronomers Discover Rare ‘Green Pea’ Galaxies.” July 28. www.spacedaily.com/reports/Astronomers_Discover_Rare_Green_Pea_Galaxies_999.html.
Stardust@home. n.d. Stardust@home (about page). stardustathome.ssl.berkeley.edu/about/stardusthome/ (accessed February 16, 2019).
Straub, Miranda C. P. 2016. “Giving Citizen Scientists a Chance: A Study of Volunteer-Led Scientific Discovery.” Citizen Science: Theory and Practice 1, no. 1: 1–10.
Suber, Peter. 2012. Open Access. Cambridge, MA: MIT Press.
Tate, Jean. 2013. “Why Must Zooites Who Created the GZ Data *Pay* to Access the Key Paper on It?” Galaxy Zoo Discussion Board (Science), June 27. talk.galaxyzoo.org/#/boards/BGZ0000001/discussions/DGZ00002ga.
Tkacz, Nathaniel. 2015. Wikipedia and the Politics of Openness. Chicago: University of Chicago Press.
University of Minnesota. 2016. “U of M Astronomers Help Citizen Scientists Discover Rare Galaxy Cluster.” June 28. twin-cities.umn.edu/news-events/u-m-astronomers-help-citizen-scientists-discover-rare-galaxy-cluster.
Von Hippel, Eric. 2005. Democratizing Innovation. Cambridge, MA: MIT Press.
Wall, Mike. 2014. “Asteroid Zoo Asks Public to Find Dangerous Space Rocks.” Space.com, June 24. www.space.com/26349-asteroid-zoo-zooniverse-planetary-resources.html.
Weber, Steven. 2004. The Success of Open Source. Cambridge, MA: Harvard University Press.
Woolston, Chris. 2015. “Fruit-Fly Paper Has 1,000 Authors.” Nature 521, no. 7552: 263. www.nature.com/news/fruit-fly-paper-has-1-000-authors-1.17555.
Zooniverse. n.d.a. “What Is the Zooniverse?” www.zooniverse.org/about (accessed February 16, 2019).
Zooniverse. n.d.b. “How to Launch Your Project and Zooniverse Policies.” www.zooniverse.org/help/lab-policies (accessed February 16, 2019).
Zooniverse. n.d.c. “About: Publications.” www.zooniverse.org/about/publications (accessed February 16, 2019).
Zooniverse. 2009a. “The First Volunteer-Inspired Galaxy Zoo Paper Is Submitted!” Galaxy Zoo (blog), April 14. blog.galaxyzoo.org/?s=The+First+Volunteer-Inspired+Galaxy+Zoo+Paper+is+Submitted.
Zooniverse. 2009b. “Peas in the Universe, Goodwill, and a History of Zooite Collaboration on the Peas Project.” Galaxy Zoo (blog), July 7. blog.galaxyzoo.org/2009/07/07/peas-in-the-universe-goodwill-and-a-history-of-zooite-collaboration-on-the-peas-project/.
Zooniverse. 2016. “Asteroid Zoo Paused.” Zooniverse (blog), May 19. blog.zooniverse.org/2016/05/19/asteroid-zoo-paused/.
Zooniverse. 2018. “Experiments on the Zooniverse.” Zooniverse (blog), July 25. www.zooniverse.org/talk/14/695488.