December 2015

Abstract

Around the globe, crowdsourcing initiatives are emerging and contributing in a diversity of areas, such as crisis management and product development, and to micro-tasks such as translations and transcriptions. The essence of crowdsourcing is to acknowledge that not all the talented people work for you; hence, crowdsourcing brings more perspectives, insights, and visions to, for instance, an innovation process. In this article, we analyze how crowdsourcing can contribute to the different stages of innovation processes carried out in living labs and thus strengthen the core role of living labs as innovation process facilitators. We have also identified benefits and challenges that managers of living labs need to grapple with to make it possible for the crowd to fully support their cause.

Introduction

Today, there is a growing trend of organizations tapping into the wisdom of the crowd to contribute to their innovation processes and create value (Ye & Kankanhalli, 2013). This trend has been fuelled by IT that enables companies to reach and engage a crowd on a global scale (Ye & Kankanhalli, 2013). Examples can be seen in LEGO's use of crowdsourcing to develop new models (Schlagwein & Andersen, 2014), Dell's use of crowdsourcing for their IdeaStorm initiative (Di Gangi & Wasko, 2009), and Procter & Gamble's Connect+Develop program, which has been relying on external sources for more than half of its innovation tasks (Huston & Sakkab, 2006). These crowds can contribute to activities such as collecting data, identifying problems, carrying out tedious work, rendering ideas, engaging in co-creative activities, voting for an idea, and developing solutions to a problem (Prpić et al., 2015). In addition, crowdsourcing has also proven very efficient for activities such as developing marketing videos, translating, mapping information, interpreting photos, and developing software. However, the impact and full potential of crowdsourcing initiatives do, to a large extent, remain to be seen, given that the understanding of crowdsourcing is in its infancy (Ye & Kankanhalli, 2013). Currently, many organizations do not have sufficient insights regarding how the crowd can be engaged in innovation processes and how the results from the crowd can be used to support their cause (Boudreau & Lakhani, 2013).

In Europe, there is another evolving concept that also strives to support the development of innovation and create value by involving users. That concept is called the "living lab", and it aims to support user-centered innovation processes in real-world contexts; hence, it often acts as an open innovation network (Leminen et al., 2012) and innovation intermediary organization (Cleland et al., 2012). In this article, we align our approach with Bergvall-Kåreborn and colleagues (2009), who defined a living lab as a user-centric innovation milieu built on everyday practice and research, with an approach that facilitates user influence in open and distributed innovation processes engaging all relevant partners in real-life contexts, aiming to create sustainable values. Due to the focus on carrying out innovation activities in real-life contexts, living lab processes involve a plethora of stakeholders, both locally and globally, and thus require supportive innovation processes. Therefore, living labs need to be well equipped with processes to support the development of various types of innovations in a diversity of contexts, with a variety of users, and in different countries.

In living labs, the innovation processes generally consist of four main phases: i) exploration, ii) design, iii) implementation, and iv) test and evaluation (e.g., Almirall et al., 2012; Ståhlbröst & Bergvall-Kåreborn, 2008). In this article, we argue that these four phases could be supported by different crowdsourcing initiatives, thus making it possible for the living lab to remain specialized in a core area. Our view is that a living lab is one instantiation of innovation processes, meaning that the usage of crowdsourcing initiatives could apply to innovation processes carried out in other settings as well. Hence, the purpose of this article is to relate contemporary crowdsourcing initiatives to living lab innovation processes and subsequently analyze the potential benefits and challenges this approach could raise for living labs.

Research Methodology

The methodology for this research started with a literature review focused on crowdsourcing and innovation processes. In this study, we searched journals within the areas of information systems and innovation management, looking for papers published between 2006 and 2014, following the recommendations of Hart (2003) and von Brocke and colleagues (2009). Using the search terms "crowdsourcing" and "innovation process", we found relevant articles that we then examined for evidence of crowdsourcing contributing to innovation processes. Based on that, we applied a snowballing approach, searching both backwards and forwards to find relevant articles. The literature was then combined with desktop research on different crowdsourcing initiatives with the objective of analyzing the activities and the mode of each initiative as well as how the initiative could be labelled. We started by analyzing the most common crowdsourcing sites, such as InnoCentive, Amazon Mechanical Turk, and Quirky, and then continued to dig further into a variety of initiatives, focusing on initiatives that could support innovation processes and that were driven by third parties, hence excluding company-centered initiatives such as Dell's IdeaStorm. We used a qualitative and reflective approach, meaning that we reflected on the results from one paper or platform and then looked further. To guide our analysis, we started by categorizing the described activities and then analyzing what the crowd is actually doing by means of the platform for each initiative. We categorized the initiatives according to what the crowd contributes and the essence of the initiative. To support our categorization, we examined the role of the crowd, asking, for instance, whether the crowd members were primarily problem owners, solvers, creators, data providers, or testers. Thereafter, we labelled each initiative according to the existing categories of crowdsourcing suggested by Howe (2009).
He defines four basic categories of crowdsourcing applications: i) crowd wisdom; ii) crowd creation or user-generated content; iii) crowd voting; and iv) crowd funding. In this process, we discovered that these categories did not cover all the different aspects of crowdsourcing that we had identified; hence, labels such as "crowd innovation", "crowd engagement", and "crowd testing" emerged. The term "crowd testing" stems from the literature (Zogaj & Bretschneider, 2013). The other terms result from our analysis of the essential elements of the crowdsourcing initiatives and our interpretation of the four existing categories as inadequate to capture the kernel of each initiative and the motivators related to it. For instance, with crowd engagement, even though the crowd jointly creates the content, and thus could be related to crowd creation, the essence of what the crowd contributes and creates is more strongly related to wanting to change society and contribute to a common good. This can, for instance, be seen in initiatives such as HarassMap, where crowd members mark the geographical locations where they have been sexually harassed in a city.

Mapping Crowdsourcing Initiatives to Living Lab Innovation Processes

The concept of crowdsourcing was first coined in 2006 by Jeffrey Howe (2006a), who defined crowdsourcing as follows:

"Simply defined, crowdsourcing represents the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer–production (when the job is performed collaboratively), but is also often undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential labourers."

Even though the term was coined recently, actions to engage crowds have been ongoing for a long time. For instance, engaging citizens in research activities such as gathering weather data has been done for at least 50 years, and involving people outside an organization in idea generation has a long history. The main differences between these earlier initiatives and today's crowdsourcing trend are that today, the process can be facilitated by an ICT-based platform and it can have a global reach (Boudreau & Lakhani, 2013).

In the beginning of the development of crowdsourcing as a concept, many organizations largely engaged the crowd in micro-tasks, as suggested by Howe (2006b). Today, however, the concept of crowdsourcing has been broadened; it does not only refer to situations where an open call is used, but also includes situations where people join forces and create value. One example followed the Fukushima Daiichi nuclear disaster, where the crowd built Geiger counters and installed them on cars, bicycles, and other vehicles to obtain more useful and accurate radiation measurements than the Japanese government provided. This crowd activity yielded more than 150 million data points, compared with the 30,000 provided by the government (Burns, 2014; Massung et al., 2013). Other situations where the crowd creates the content and core value of a service can be seen in initiatives such as Airbnb or Uber (Hamari et al., 2015). The main aim of crowdsourcing is to mobilize the distributed and diverse competences and expertise held by the crowd (Zhao & Zhu, 2014). It is driven by meta-trends such as the rise of the entrepreneurial startup culture, the growth of freelancers or independent employees, an expanded global marketplace, and the friction between transparency and monetization.

A crowd can be engaged in many different ways and with different purposes, each appealing to certain motivators for the crowd. Thus, to facilitate the engagement of crowds in innovation activities, the task may be divided into smaller sub-tasks depending on the complexity of the task and the variety of the outcomes (Ye & Kankanhalli, 2013). Members of the crowd are also motivated differently to participate in crowd initiatives: some are driven by the desire to collaborate and contribute their small part to a larger cause, as seen, for instance, in community activism initiatives (Massung et al., 2013) or other societal challenges, as in OpenIdeo, where they are motivated by a sense of collectiveness (Hajiamiri & Korkut, 2015). Other crowds are more motivated by a challenge and solving a problem, and thus having the opportunity to win a prize, as in InnoCentive or NineSigma, two initiatives that focus on connecting companies with experts to solve complex problems (Ye & Kankanhalli, 2013), or by a desire to spend free time on meaningful activities (Kaufmann et al., 2011). Participation may be perceived as being fun (Lakhani & Wolf, 2005; Rotman et al., 2014), entertaining, or enjoyable, or as a learning opportunity (Maher et al., 2011; Nov, 2007). Other relevant motivators are reputation building (Rotman et al., 2014), career building (Casalo, 2009), rewards (Ye & Kankanhalli, 2013), and recognition (Hajiamiri & Korkut, 2015). In sum, crowds are motivated differently depending on the essence of the crowd's efforts. It is therefore important to understand what triggers the specific crowd that is expected to contribute to a specific process, to encourage the development of a vigorous and lively crowd that is willing to do the work expected of it in the innovation process.

In the following sections, we have aligned the different crowdsourcing initiatives to the four different phases of living lab innovation processes (Almirall et al., 2012; Ståhlbröst & Bergvall-Kåreborn, 2008):

  1. Exploration (or contextualization): refers to gaining understanding of the situation and the potential it offers for innovation
  2. Design (or concretization): refers to the design of the innovation in all its different maturity levels
  3. Implementation/Realization: focuses on exposing the innovation to the real-world context
  4. Evaluation and test (or feedback): refers to the process of using and reflecting on the use of the innovation in the real world context

Exploration

In living labs, one of the core activities is to develop innovations centred on human needs and values (e.g., Ståhlbröst, 2012). Thus, in the living lab, the starting point for innovation is a real-world situation, where there is an opportunity to improve people's lives. A deep understanding of human needs and values is needed as well as deep insights into contemporary problems and challenges from a societal perspective. To support this process, a variety of stakeholders need to be involved to gain as comprehensive and rich a picture of the situation as possible, and we see that different crowdsourcing initiatives can contribute to this process.

The focus for many living lab projects has been to engage end users, or potential users, of an innovation in the process to gain their insights (e.g., Bergvall-Kåreborn et al., 2010; Dell'Era & Landoni, 2014; Svensson et al., 2010). We argue that broadening the scope and including crowds that want to accomplish changes by, for instance, contributing to, shedding light on, or investigating societal issues could contribute valuable insights and real-world experiences to the innovation process in living labs. Given that a crowd-based approach differs from the current focus on end users in living labs and that crowds include a broad range of people beyond only potential users, input from crowds would make it possible for a living lab to obtain a good view of trends and issues that are important to solve in society. Crowds could also be involved in smaller investigations that could contribute to a deeper understanding of a situation or they could be involved in simpler tasks such as idea generation and brainstorming, which can render many ideas quickly. Thus, we see that involving a diversity of crowds in the exploration phase (see Table 1 for examples) could create value for living labs by offering both support in carrying out tasks as well as facilitating better insights.

Table 1. Crowdsourcing initiatives supporting exploration

| Label | Mode/Type of Activity | Action in Exploration | Examples of Initiatives |
|---|---|---|---|
| Crowd creation | Collaboration | Generating ideas | OpenIdeo |
| Crowd engagement | Collaboration | Influencing society and communities (e.g., as activists) | HarassMap; Urban water mappers |
| Crowd science | Collaboration | Investigating and researching background information; providing data | Zooniverse; Kaggle |
| Crowd work | Compensation | Generating ideas; brainstorming | Amazon Mechanical Turk; Freelancer |

Design

In living labs, the design process is always viewed as a co-creative process in which many stakeholders should be involved to influence the innovation in focus (e.g., Krogstie, 2012). Innovations are co-created in interaction between users (or user representatives), developers, and designers. This interaction often takes place in a physically co-located arena where the team can jointly design ideas, concepts, and prototypes by means of different methods and tools (Bergvall-Kåreborn et al., 2010). On some occasions, parts of the process are carried out online and the team can share a collaborative workspace where suggestions for different design solutions are posted and comments/suggestions for improvements are given (e.g., Følstad & Karahasanovic, 2012).

From our perspective, we see that different crowd initiatives that are innovative, creative, and diverse could contribute to the design phase in living labs. Potential assignments for the crowd could be, for instance, to carry out programming tasks, to develop design suggestions, or to contribute to solving complex problems. In this process, design competitions could be used, which would make it possible to generate a great variety of creative ideas while at the same time externalizing the risk of failure (Ye & Kankanhalli, 2013). Hence, opening up the design process and involving different types of crowds can be beneficial for the living lab as well as for the innovation as such: heterogeneous skills and insights come together and thus leverage the innovation potential. In this process, strong motivational factors for the crowd are to have fun and to receive recognition for their efforts; hence, it is important to make the winning solution visible and to recognize the winners. In sum, involving crowds in the design phase (see Table 2 for examples) could create value for living labs in terms of increased insights and perspectives as well as increased efficiency due to the co-creative activities.

Table 2. Crowdsourcing initiatives supporting design

| Label | Mode/Type of Activity | Action in Design | Examples of Initiatives |
|---|---|---|---|
| Crowd labour | Collaboration | Programming software | SourceForge; Topcoder |
| Crowd creation | Collaboration | Innovating in collective design | OpenIdeo |
| Crowd wisdom | Competition | Solving complex problems as experts | NineSigma; XPRIZE |
| Crowd creation | Competition | Creative creation | 99designs; eYeka |

Implementation/Realization

The implementation phase is of vital importance for living labs, where the innovation activities are to be carried out in real-world contexts; thus, this phase needs to be handled efficiently and effectively. In this phase, the focus is to expose the innovation to the complexity of the context, with different users, competing systems, contextual factors, and the users' experiences of using the innovation "for real". To support this process, it is important for the living lab to have an extensive network that can offer implementation contexts suitable for different innovations. The implementation context can be very diverse, including, for instance, private households, public buildings, city contexts, or even smartphones, because the implementation must be carried out in the context in which the innovation is expected to operate. In this process, dynamic and large crowds such as those used by Amazon Mechanical Turk or Freelancer (which focus on matching workers with micro-tasks from requesters) could contribute by offering private contexts in which crowd members can install the innovation, if, for instance, an ICT-based innovation is being implemented. However, based on our analysis of current crowdsourcing initiatives, implementation in public contexts is not usually supported. Hence, it is still important to maintain the network of partners surrounding the living lab, but developing and maintaining their own user panels or communities becomes less important.

In the implementation stage, some funding might also be needed to make it possible for the innovation to reach the market and become fully implemented. Including the crowd in this process, through crowdfunding, could give a hint of how interesting the crowd members find the innovation: if they are not willing to contribute to its financing, the market potential of the innovation might be weak. If, on the other hand, the crowd wants to fund the development and implementation of the innovation, this finding could contribute significantly to the living lab process, and it might also give indications of the innovation's market potential. Hence, involving the crowd in the implementation phase (see Table 3 for examples) can create value for the living lab in terms of access to different contexts and use situations. Involving the crowd can also give a first insight into how the innovation is valued by the crowd, which gives the living lab the opportunity to decide how to proceed with the innovation and thus be a living lab in a dynamic sense.

Table 3. Third-party crowdsourcing initiatives supporting implementation/realization

| Label | Mode/Type of Activity | Action in Implementation/Realization | Examples of Initiatives |
|---|---|---|---|
| Crowdfunding | Collaboration | Funding | Kickstarter; GoFundMe |
| Crowd work | Compensation | Working for hire to perform micro-tasks | Amazon Mechanical Turk; Freelancer |

Test and Evaluation

The living lab test and evaluation phase is often applied in innovation projects as a means to obtain user insights on the experiences of using the innovation (e.g., Wendin et al., 2015). In this process, users are usually involved in their own real-world context to test or evaluate the innovation to ensure that the solution answers their needs and creates value for them (Almirall & Wareham, 2011). If the innovation is in its early stages, this process might be performed in a physical meeting during which the innovation is demonstrated and the users give their feedback; in later stages, the tests can be performed in real-world contexts where the users integrate the innovation into their own context and then follow instructions and answer questions related to their experiences of using it. In this process, we see that different crowd initiatives can contribute, depending on the innovation to be tested. For instance, initiatives such as Testbirds, which focuses on supporting tests of applications and websites with crowd members, could be used for testing the usability of an IT system, or Freelancer or Amazon Mechanical Turk could be used to test the usefulness of the innovation and evaluate experiences of using the system. Here, a structured test process with clearly defined tasks and goals is important, as is having a large number of potential testers. To motivate the crowd to carry out this type of task, some form of compensation may help, depending on the time and effort expected from the crowd members. Thus, the value created for the living lab by involving crowds in test and evaluation is increased knowledge and understanding of how the innovation is being used, which gives direction to future changes and adjustments of the innovation. In addition, involving third-party crowds in the test and evaluation phase makes the process more efficient because the living lab does not need to recruit, maintain, or communicate with its own crowds.

Table 4. Crowdsourcing initiatives supporting test and evaluation

| Label | Mode/Type of Activity | Action in Test and Evaluation | Examples of Initiatives |
|---|---|---|---|
| Crowd testing | Compensation | Testing | Testbirds; Testbats |
| Crowd work | Compensation | Working for hire to perform micro-tasks | Amazon Mechanical Turk; Clickworker; Freelancer |

Discussion and Conclusions

In this section, we summarize the main issues identified that we argue will be of importance for the future understanding of how living lab innovation processes can utilize the power of crowds and leverage their innovation process. When reflecting on the potential benefits for living lab initiators of engaging in contemporary crowdsourcing initiatives, three core areas emerged. First, it is likely that the administration within the living lab organization can be decreased, because time and money can be saved on the management of participants in living lab activities; it can also provide access to a broader network during the innovation process. Second, utilizing contemporary crowdsourcing initiatives presents good opportunities for keeping track of current trends and emerging issues, and also for connecting with the people engaging in these ideas. Potentially, this benefit is of specific value for smaller living lab initiatives that might not otherwise have the resources to access this kind of knowledge and these networks. Last, we see that the overall innovation capacity could be leveraged. Being able to utilize innovative ideas from an international perspective (such as OpenIdeo, which focuses on developing and involving a global community) while at the same time engaging a context-aware crowd from the region (e.g., Botnia Living Lab, which mainly has a local crowd), and then infusing ideas from the global community into the local community, can boost the innovation capacity and thus widen the range of possible successful innovations. It is also possible to more carefully target important characteristics such as usability knowledge (e.g., Testbirds), design skills (e.g., 99designs), or true user experience (e.g., HarassMap).

Closely linked to the benefits are the corresponding challenges we identified in making them happen, that is, the challenges in leveraging the innovation possibilities. We argue that, when different crowds are involved in the innovation process, living lab managers might need to embrace a dynamic approach that focuses on following the crowd from a close distance and being prepared to take action to, for instance, motivate or stimulate the crowd. Working with people and crowds also makes it difficult to predict exactly what will come out of the crowd, even if the requirements of the expected outcome are clearly defined and communicated through, for instance, micro-tasks to be carried out. The solution is to a large extent determined by the participants' interpretations and previous experience related to the task. In addition, interacting with different crowds in the process makes it possible to work with many ideas in parallel, which can boost the innovation process given that ideas from one crowd can be introduced into another crowd, thus stimulating new perspectives and discussions.

Next, we argue that the innovation interaction with crowds might have an effect on how we see the end solution or, more specifically, how we view the crowd participants' role in the solution. This is a matter of value capture and value creation. When crowds are involved, they might be the ones creating the value of an innovation, such as by gathering data or designing the innovation. But the value of the innovation might still be captured by the initiator of the innovation. Here, it is important that the living lab manager consider how to assign ownership of the final innovation: should it be co-owned among all contributors, or should the initiator of the process own it? Traditionally, the innovations brought forward by living labs are owned by the actor initiating the living lab process or by the initiator of the problem, usually a company. Crowds are also involved in different ways: some crowds are mainly opinion leaders, whereas others offer important resources to make the innovation come to life, as in Kickstarter, a platform for crowdfunding innovations. This change of role will likely challenge the way participants and their contributions are viewed, and it will likely affect how the ownership of the innovation is discussed and realized.

Finally, we argue that the way a living lab engages with a crowdsourcing initiative will be of utmost importance. Judging by the motivators for the exemplified crowds, we conclude that a living lab must make efforts to identify the proper incentives and ways to communicate with – and engage – crowds. Some crowds are driven by the sheer enjoyment of contributing to an issue they view as important, and for them it is important to feel that they are doing exactly that. A likely consequence is that it can become a challenge to obtain sufficient insights if the problem in focus is presented as a predetermined solution that the living lab wants some response to, because this will limit the crowd's creativity and innovativeness. Other crowds are motivated by more individualistic drivers, such as receiving attention for their contribution or receiving some micropayment for their efforts, and, as mentioned before, others see themselves as part of the solution. In all cases, making sure each crowd is engaged in the proper manner will be key to leveraging the innovation potential. In the end, success will depend on the mindset of the living lab managers, who must be brave enough to follow the power of the crowd and live with the process, and thus truly become part of a "living lab".

In Figure 1, we have depicted the essence of our discussion in a matrix where the level of engagement of the crowd and the crowd perspective render four different crowd roles. Here, the crowd perspective represents the different views of the crowd, where crowd members can be seen as actors that mainly provide data (e.g., sensor data, energy consumption data) or sponsor initiatives (e.g., with money). The level of crowd engagement also influences which role the crowd takes. When the level of engagement is high, the crowd puts in its own resources to co-create or sponsor the innovation, whereas if the level of engagement is low, the crowd mainly contributes its ideas or its data – crowd members might test and give input, but their efforts do not require a high level of engagement.

By applying this matrix in living lab processes, living lab managers can gain support in determining which crowdsourcing initiative they should focus on using in their innovation process. For instance, if the living lab manager wants people who are active and have a high level of engagement in the process – which also requires a high level of engagement from the living lab manager – they might want the crowd members to play the role of co-creators, as in OpenIdeo. In a similar vein, if the living lab manager wants a crowd that mainly contributes data and has a low level of engagement, they might want to engage a crowd focused on being providers, as in, for instance, Testbirds, where the crowd mainly contributes to a small, well-defined, and structured task.


Figure 1. Four roles of the crowd in crowdsourcing initiatives, as derived from a matrix of crowd engagement and perspectives in crowdsourcing

This study represents a first step towards understanding, from a theoretical perspective, how crowds can be engaged in, and contribute to, living lab innovation processes. In future research, it would be interesting to study the actual impact and contribution crowds can have in living lab innovation processes. In this study, we report on a limited number of crowdsourcing initiatives, but due to the rapid growth within the area, and the emerging challenges the area faces, a more comprehensive study of the different initiatives and their potential would be valuable for living labs to reap the benefits of crowdsourcing.

Acknowledgements

An earlier version of this paper was presented at the XXVI International Society for Professional Innovation Management (ISPIM) Conference – Shaping the Frontiers of Innovation Management, Budapest, Hungary, 14–17 June, 2015.

 


References

Almirall, E., & Wareham, J. 2011. Living Labs: Arbiters of Mid- and Ground-Level Innovation. Technology Analysis & Strategic Management, 23(1): 87–102.
http://dx.doi.org/10.1080/09537325.2011.537110

Almirall, E., Lee, M., & Wareham, J. 2012. Mapping Living Labs in the Landscape of Innovation Methodologies. Technology Innovation Management Review, 2(9): 12–18.
http://timreview.ca/article/603

Bergvall-Kåreborn, B., Howcroft, D., Ståhlbröst, A., & Melander-Wikman, A. 2010. Participating in Living Lab: Designing Systems with Users. Paper presented at the Human Benefit through the Diffusion of Information Systems Design Science Research: IFIP WG 8.2/8.6 International Working Conference, Perth, Australia, March 30 – April 1.

Bergvall-Kåreborn, B., & Ståhlbröst, A. 2009. Living Lab: An Open and Citizen-Centric Approach for Innovation. International Journal of Innovation and Regional Development, 1(4): 356–370.
http://dx.doi.org/10.1504/IJIRD.2009.022727

Boudreau, K. J., & Lakhani, K. R. 2013. Using the Crowd as an Innovation Partner. Harvard Business Review, 91(4): 60–69.

Burns, R. 2014. Moments of Closure in the Knowledge Politics of Digital Humanitarianism. Geoforum, 53: 51–62.
http://dx.doi.org/10.1016/j.geoforum.2014.02.002

Casalo, L. V., Cisneros, J., Flavian, C., & Guinaliu, M. 2009. Determinants of Success in Open Source Software Networks. Industrial Management & Data Systems, 109(4): 532–549.

Cleland, B., Mulvenna, M., Galbraith, B., Wallace, J., & Martin, S. 2012. Innovation of eParticipation Strategies Using Living Labs as Intermediaries. Electronic Journal of e-Government, 10(2): 120–132.

Dell'Era, C., & Landoni, P. 2014. Living Lab: A Methodology between User-Centred Design and Participatory Design. Creativity & Innovation Management, 23(2): 137–154.
http://dx.doi.org/10.1111/caim.12061

Di Gangi, P. M., & Wasko, M. 2009. Steal My Idea! Organizational Adoption of User Innovations from a User Innovation Community: A Case Study of Dell IdeaStorm. Decision Support Systems, 48(1): 303–312.
http://dx.doi.org/10.1016/j.dss.2009.04.004

Følstad, A., & Karahasanovic, A. 2012. Online Applications for User Involvement in Living Lab Innovation Processes. In Proceedings of e-Society 2012: 257–264. IADIS Press.

Hajiamiri, M., & Korkut, F. 2015. Perceived Values of Web-Based Collective Design Platforms from the Perspective of Industrial Designers in Reference to Quirky and OpenIDEO. ITU AZ, 12(1): 147–159.

Hamari, J., Sjöklint, M., & Ukkonen, A. 2015. The Sharing Economy: Why People Participate in Collaborative Consumption. Journal of the Association for Information Science and Technology. Forthcoming.
http://dx.doi.org/10.1002/asi.23552

Hart, C. 2003. Doing a Literature Review – Releasing the Social Science Research Imagination. London: Sage Publication Inc.

Howe, J. 2006a. Crowdsourcing: A Definition. Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business, June 2, 2006. Accessed December 1, 2015:
http://crowdsourcing.typepad.com/cs/2006/06/crowdsourcing_a.html

Howe, J. 2006b. The Rise of Crowdsourcing. Wired, 14(6), June 6, 2006. Accessed December 1, 2015:
http://www.wired.com/2006/06/crowds/

Howe, J. 2009. Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business. New York: Crown Business.

Huston, L., & Sakkab, N. 2006. Connect and Develop: Inside Procter & Gamble's New Model for Innovation. Harvard Business Review, 84(3): 58–66.

Kaufmann, N., Schulze, T., & Veit, D. 2011. More than Fun and Money. Worker Motivation in Crowdsourcing – A Study on Mechanical Turk. In Proceedings of AMCIS2011, Paper 340.

Krogstie, J. 2012. Bridging Research and Innovation by Applying Living Labs for Design Science Research. In C. Keller, M. Wiberg, P. Ågerfalk, & J. Z. Eriksson Lundström (Eds.), Nordic Contributions in IS Research, 124: 161–176. Berlin: Springer Berlin Heidelberg.
http://dx.doi.org/10.1007/978-3-642-32270-9_10

Lakhani, K. R., & Wolf, G. 2005. Why Hackers Do What They Do: Understanding Motivation and Efforts in Free/Open Source Software Projects. In J. Feller, B. Fitzgerald, S. Hissam, & K. R. Lakhani (Eds.), Perspectives on Free and Open Source Software. Cambridge, MA: MIT Press.

Leminen, S., Westerlund, M., & Nyström, A.-G. 2012. Living Labs as Open-Innovation Networks. Technology Innovation Management Review, 2(9): 6–11.
http://timreview.ca/article/602

Maher, M. L., Paulini, M., & Murty, P. 2011. Scaling Up: From Individual Design to Collaborative Design to Collective Design. Design Computing and Cognition, 10: 581–599.
http://dx.doi.org/10.1007/978-94-007-0510-4_31

Massung, E., Coyle, D., Cater, K. F., Jay, M., & Preist, C. 2013. Using Crowdsourcing to Support Pro-Environmental Community Activism. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: 371–380. Paris: ACM.
http://dx.doi.org/10.1145/2470654.2470708

Nov, O. 2007. What Motivates Wikipedians. Communications of the ACM, 50(11): 60–64.
http://dx.doi.org/10.1145/1297797.1297798

Prpić, J., Shukla, P. P., Kietzmann, J. H., & McCarthy, I. P. 2015. How to Work a Crowd: Developing Crowd Capital through Crowdsourcing. Business Horizons, 58(1): 77–85.
http://dx.doi.org/10.1016/j.bushor.2014.09.005

Rotman, D., Hammock, J., Preece, J., Hansen, D., Boston, C., Bowser, A., & He, Y. 2014. Motivations Affecting Initial and Long-Term Participation in Citizen Science Projects in Three Countries. In iConference 2014 Proceedings: 110–124.
http://dx.doi.org/10.9776/14054

Schlagwein, D., & Andersen, N.-B. 2014. Organizational Learning with Crowdsourcing: The Revelatory Case of LEGO. Journal of the Association for Information Systems, 15(11): 754–778.

Ståhlbröst, A. 2012. A Set of Key Principles to Assess the Impact of Living Labs. International Journal of Product Development, 17(1-2): 60–75.
http://dx.doi.org/10.1504/IJPD.2012.051154

Ståhlbröst, A. 2013. A Living Lab as a Service: Creating Value for Micro-enterprises through Collaboration and Innovation. Technology Innovation Management Review, 3(11): 37–42.
http://timreview.ca/article/744

Ståhlbröst, A., & Bergvall-Kåreborn, B. 2008. FormIT – An Approach to User Involvement. In J. Schumacher, & V.-P. Niitamo (Eds.), European Living Labs - A New Approach for Human Centric Regional Innovation: 63–76. Berlin: Wissenschaftlicher Verlag.

Svensson, J., Esbjörnsson, E., & Eriksson, C. I. 2010. User Contribution in Innovation Processes – Reflections from a Living Lab Perspective. In HICSS 2010, the 43rd Hawaii International Conference on System Sciences: 1–10. Maui, Hawaii. New York: IEEE.

vom Brocke, J., Simons, A., Niehaves, B., Reimer, K., Plattfaut, R., & Cleven, A. 2009. Reconstructing the Giant: On the Importance of Rigour in Documenting the Literature Search Process. ECIS 2009 Proceedings: Paper 161. Verona, Italy.

Wendin, K., Åström, A., & Ståhlbröst, A. 2015. Exploring Differences between Central Located Test and Home Use Test in a Living Lab Context. International Journal of Consumer Studies, 39(3): 230–238.
http://dx.doi.org/10.1111/ijcs.12171

Ye, H., & Kankanhalli, A. 2013. Leveraging Crowdsourcing for Organizational Value Co-Creation. Communications of the AIS, 33(1): Article 13.

Zhao, Y., & Zhu, Q. 2014. Evaluation on Crowdsourcing Research: Current Status and Future Directions. Information Systems Frontiers, 16: 417–434.
http://dx.doi.org/10.1007/s10796-012-9350-4

Zogaj, S., & Bretschneider, U. 2013. Crowdtesting with Testcloud – Managing the Challenges of an Intermediary in a Crowdsourcing Business Model. Paper presented at ECIS2013, Utrecht, Germany. 

Keywords: citizen, crowdsourcing, ICT, innovation process, Living lab, user