Scientometrics CSCW
3 authors, including Benjamim Fonseca, Institute for Systems and Computer Engineering, Technology and Science (INESC …). Uploaded by António Correia on 18 August 2019.
Abstract Over the last decades, CSCW research has undergone significant structural
changes and has grown steadily with manifested differences from other fields in terms of
theory building, methodology, and socio-technicality. This paper provides a quantitative
assessment of the scientific literature for mapping the intellectual structure of CSCW
research and its scientific development over a 15-year period (2001–2015). A total of 1713
publications were subjected to examination in order to draw statistics and depict dynamic
changes to shed new light upon the growth, spread, and collaboration of CSCW devoted
outlets. Overall, our study characterizes top (cited and downloaded) papers, citation pat-
terns, prominent authors and institutions, demographics, collaboration patterns, most fre-
quent topic clusters and keywords, and social mentions by country, discipline, and
professional status. The results highlight some areas of improvement for the field and a lot
of well-established topics which are changing gradually with impact on citations and
downloads. Statistical models reveal that the field is predominantly influenced by funda-
mental and highly recognized scientists and papers. A small number of papers without
citations, the growth of the number of papers by year, and an average number of more than
39 citations per paper in all venues ensure the field a healthy and evolving nature. We
discuss the implications of these findings in terms of the influence of CSCW on the larger
field of HCI.
Introduction
CSCW research has grown steadily for much of the past quarter century, disseminated by
devoted and non-specific publication venues from North America and Europe (Grudin and
Poltrock 2012). As an established, practice-based field of research committed to understanding
the socially organized nature of cooperation and its characteristics in order to inform the
design of computer artifacts in cooperative settings, Computer Supported Cooperative
Work (CSCW) has adopted a variety of theoretical constructs, descriptive methods, con-
ceptual frameworks, and hybrid forms to characterize technology-driven waves that shape
digitally mediated communication and collaboration in increasingly complex, networked
settings (Schmidt and Bannon 2013). As a result of these advances, socially embedded
systems and technologies have pervaded our everyday life settings as driving forces for
contemporary research (Crabtree et al. 2005). With this in mind, we contribute to
Schmidt and Bannon's (2013) aim of assessing the ''complex physiognomy'' of this field
of research, acknowledging that measuring individual and collaborative outcomes within
an interdisciplinary community like CSCW can be a complex task, owing to its polymorphic
nature and the inherent difficulty to ''judge the quality of its members' contributions'' (Bartneck and Hu 2010). There is a clear contrast between CSCW and other communities and disciplines that
promote up-to-date introspection and history-aware research agendas. For
instance, ‘‘CSCW researchers remain unreflective about the structure and impact of their
own collaborations’’ (Horn et al. 2004). More recent evidence (Keegan et al. 2013) sug-
gests that they ‘‘have been relatively quiet, with only a few attempts to understand the
structure of the CSCW community’’, which strongly affects the course of innovation and
theory building in this field.
Some researchers have emphasized the ability of quantitative analysis to understand the
nature of a field and its evolution over time. Kaye (2009) goes even further by claiming
that ‘‘it seems reasonable that a certain amount of introspection is healthy for a field’’.
Scientometrics methods and tools ‘‘can be used to study sociological phenomena associ-
ated with scientific communities, to evaluate the impact of research, to map scientific
networks and to monitor the evolution of scientific fields’’ (Jacovi et al. 2006). This means
that such analyses not only help researchers understand how research communities
around the world are evolving, by reflecting on the publication data of specific
conferences and journals (Gupta 2015; Barbosa et al. 2016; Mubin et al. 2017), but also serve
faculty members, students, the general public, administrators, conference organizers, journal
editors, and publishers interested in advances and trends as a basis for knowledge
acquisition, consensus building, and decision making in a vast set of situations, such as
hiring staff or targeting venues and publications to examine (Holsapple and Luo 2003;
Henry et al. 2007). Notwithstanding some efforts using statistical methods to measure data
from the literature, none of the earlier known studies provided a cross-sectional
bibliometric analysis portraying the field of CSCW and its evolution across the twenty-first
century through widely accessible data sources. There is a need to reflect upon the
contributions published annually by CSCW authors, both continuously and systematically,
to shape the future by uncovering new gaps while quantifying the underlying intellectual
structure of CSCW research. Prior quantitative examinations serve as the
starting point for this study, which focuses on CSCW-devoted venues through a formal,
data-driven approach.
The present paper aims to reflect on the authorship and impact of the published papers
in the field of CSCW, studying its main trends with the ultimate goal of providing future
insights and recommendations to a research community fully committed to understanding
cooperative work practices and designing digital and other material artifacts in cooperative
settings. We examine in depth the authorship profile and how it has changed
over time, the prominent institutions, and the evolution of research topics (Pritchard 1969;
Glänzel 2009). Conceived as integral units of the ‘‘science of measuring’’ (Price 1963) for
analyzing scientific and technological literature, bibliometric indicators are used as for-
mative instruments to characterize structural gaps and trends of CSCW research. In
addition, we expanded our research to identify the geographic origin of publications by
detecting the country of each author's institutional affiliation. In the course of these pursuits,
the present study also sheds new light on the altmetrics literature by providing an
overview of the coverage of alternative metrics in CSCW research. We are particularly
interested in gaining insight into the following Research Questions (RQ):
1. Was there an increase in the amount of literature in CSCW research? Is there a pattern
to how citations fluctuate over the years?
2. What are the differences in the country, affiliation, and individual research output?
What does that tell us about the field profile?
3. How has the scientific collaboration among CSCW researchers evolved? Do
collaboration types influence the overall impact of publications?
4. What research topics were of concern to the authors in the field of CSCW? How are
they related with respect to their history and future progress?
5. What is the impact of CSCW publications on social media? How much and what kind
of altmetrics data exist for the papers?
6. How can our observations on the structural changes of CSCW research be related to
the larger field of Human–Computer Interaction (HCI)?
Together, these questions represent a first step toward portraying the CSCW literature.
The remainder of the paper is organized as follows. ‘‘Background’’ section discusses some
background on the field of Scientometrics, providing an overview of the research dedicated
to the bibliometric study of CSCW and contiguous disciplines. A conceptual framework
comprising core and peripheral concepts in the field of CSCW is also presented. ‘‘Method’’
section summarizes our data selection, collection, and preparation processes. ‘‘Findings’’
section describes the main findings obtained with the scientometric analysis. Thereafter,
we discuss what these findings could mean for the CSCW community looking ahead,
by performing a scientometric comparison with HCI-related outlets. The paper
concludes with some reflections on the issues raised and a list of possible shortcomings of
our analysis.
Background
scientific and technological domains, the improvement processes related to social factors,
the study of cognitive and socio-organizational structures in scientific disciplines, and the
design of information systems for knowledge acquisition, analysis and dissemination (van
Raan 1997). A recent review of the literature (Mingers and Leydesdorff 2015) covered the
disciplinary foundations and historical milestones of this problem-oriented field both
conceptually and empirically.
A somewhat similar body of work appeared under the banner of bibliometrics as a
descriptive and evaluative science focused on the mathematical and statistical examination
of patterns in documentation and information (Pritchard 1969). Hertzel (1987) was among
the first to trace the development of bibliometrics from its origins in statistics and bibliography.
The term often overlaps with concepts such as cybermetrics and webometrics,
defined as the quantitative analysis of scholarly and scientific communication in
the Internet ecosystem (Diodato 1994). At their core, bibliometric techniques and tools are
based on mathematical patterns intended to measure trends and citation impact from the
literature (Tague-Sutcliffe 1992). Another set of recent contributions concerns the study of
the altmetrics literature for measuring a different kind of research impact, ‘‘acting as a
complement rather than a substitute to traditional metrics’’ (Erdt et al. 2016). The paradigm
of ‘scientometrics 2.0’ (Priem and Hemminger 2010) emerged as a way of using non-
traditional metrics for evaluating scholars, recommending publications, and studying sci-
ence to monitor the growth in the size of scientific literature using Web 2.0 tools. Altmetric
data sources include but are not limited to social bookmarking, social news and recom-
mendations, blogs, digital libraries, social networks, and microblogs. The number of papers
with available alternative metrics has been growing for most disciplines in the last 5 years
(Erdt et al. 2016). Such Web-based metrics enable the assessment of public engagement
and general impact of scholarly materials (Piwowar 2013) through different perspectives
and levels of analysis (e.g., views, tweets, and shares).
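To make the aggregation of such metrics concrete, the sketch below shows how per-paper counts from several altmetric sources might be combined into a simple coverage statistic. The records, field names, and thresholds are entirely hypothetical, for illustration only; they are not drawn from the studies cited here.

```python
from collections import Counter

# Hypothetical altmetric records: one dict per paper, with counts per source.
papers = [
    {"id": "p1", "tweets": 12, "mendeley_readers": 40, "blog_posts": 1},
    {"id": "p2", "tweets": 0, "mendeley_readers": 3, "blog_posts": 0},
    {"id": "p3", "tweets": 0, "mendeley_readers": 0, "blog_posts": 0},
]

def altmetric_coverage(papers):
    """Fraction of papers with at least one altmetric mention of any kind."""
    covered = sum(
        1 for p in papers if any(v > 0 for k, v in p.items() if k != "id")
    )
    return covered / len(papers)

def mentions_by_source(papers):
    """Total mentions aggregated per altmetric source across all papers."""
    totals = Counter()
    for p in papers:
        for k, v in p.items():
            if k != "id":
                totals[k] += v
    return totals

print(altmetric_coverage(papers))   # 2 of 3 papers have some mention
print(mentions_by_source(papers))
```

Coverage ratios of this kind are what altmetric studies such as Erdt et al. (2016) report when comparing disciplines, although real analyses draw on live data sources rather than hand-built records.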
The origins of the term CSCW can be traced to a workshop organized by Irene Greif and
Paul Cashman in 1984 intended to better understand how computer systems could be used
to improve and enhance group outcomes taking into account the highly nuanced, flexible,
and contextualized nature of human activity in cooperative work settings (Greif 1988;
Ackerman 2000; Schmidt and Bannon 2013). The enthusiasm for the topic continued with
the first open conference under the CSCW label convened in Austin, Texas in 1986
(Krasner and Greif 1986). The GROUP conference (formerly the Conference on Office Information
Systems) was held two years later in Palo Alto, California. The first European CSCW conference
took place in London in 1989, and the CSCW Journal began to appear in 1992
(Schmidt and Bannon 2013).
Several authors have attempted to define CSCW, but there seems to be no general definition
(Suchman 1989; Schmidt and Bannon 1992). In broad terms, CSCW can be defined as ‘‘an
endeavor to understand the nature and characteristics of cooperative work with the
objective of designing adequate computer-based technologies’’ (Bannon and Schmidt
1989). Kuutti (1991) went even further by claiming that CSCW involves ‘‘work by mul-
tiple active subjects sharing a common object and supported by information technology’’.
Such ways of viewing CSCW encompass an explicit concern for the socially organized
practices related to the use of computer support systems (Bannon 1993). Interested readers
can investigate other sources for accounts of the field (e.g., Hughes et al. 1991; Bannon
1992; Schmidt and Bannon 1992; Grudin 1994; Schmidt 2011).
Fig. 1 Socio-technical classification model of CSCW. Adapted from Cruz et al. (2012)
in the literature are: planning, control models, task/subtask relationship and information
management, mutual adjustment, standardization, coordination protocol, and modes of
operation. To support coordination effectively, collaborative computing needs to
fulfill three important requirements: the management of time, resources, and shared artifacts
produced along the activity chain. Cooperative work arrangements appear and dissolve
again. In contrast to conflict (McGrath 1984), cooperation occurs when a group works
toward a common goal (Malone and Crowston 1994) with high degrees of task
interdependencies, sharing the available information through some kind of shared space (Grudin 1994).
Cooperation categories can range from production (co-authoring), storage or manipulation
of an artifact, to concurrency, access or floor control. Technically, cooperation is supported
by systems with capabilities to send or receive messages, synchronously and/or asyn-
chronously (Mentzas 1993), and also develop or share documents (Grudin and Poltrock
1997).
Collaboration can occur in a specific time (synchronous, asynchronous) and place (co-
located, remote) with some level of predictability. A set of temporal and spatial subdo-
mains can be distilled, including: session persistence, delay between audio/video channels,
reciprocity and homogeneity of channels, delay of the message sent, and spontaneity of
collaboration. The application-level category identifies a set of typologies for collaborative
systems and technologies (Ellis et al. 1991). Some examples include shared editors, video
conferencing, shared file repositories, instant messaging, workflow systems, and social
tagging systems. As a subcategory of social computing, regulation refers to the
mechanisms that enable participants to organize themselves into a collaborative
working environment, where the regulation of collaboration activities concerns the
definition and evolution of work rules to ensure conformity between the activity and group
goals (Ferraris and Martel 2000). System properties can range from the architecture to the
collaboration interface (portal, physical workspace, devices), content (text, links, graphic,
data-stream), supported actions (receive, add, edit, move, delete), alert mechanisms, and
access controls. In addition, social processes in work ‘‘strongly influence the ways in which
CSCW applications are adopted, used, and influence subsequent work’’ (Kling 1991).
Groups are social aggregations of individuals who are aware of each other's presence, supported
by task interdependencies, and guided by their own norms towards a common goal
(Pumareja and Sikkel 2002). The collaboration cycle is bounded by awareness, which can
be understood as the group's perception of what each member is doing, and the
contextual knowledge members have about what is happening within the group when
executing activities in shared communication systems (Ackerman 2000; Mittleman et al.
2008). Cooperative work ensembles are characterized by size, composition, location,
proximity, structure, formation, cohesiveness, autonomy, subject, behavior, and trust. The
group members have a personal background (work experience, training, educational),
motivation, skills, attitude towards technology, satisfaction, knowledge, and personality.
There is a specific complexity associated with each task. In addition, group tasks can be
subdivided into creativity, planning, intellective, mixed-motive, cognitive-conflict,
decision-making, contests/battles/competitive, and performances/psychomotor (McGrath 1984).
They are supported by cultural impact, goals, interdependency and information exchange
needs, bottlenecks, and process gain. The contextual or situational factors range from an
organizational perspective (rewards, budget, training) to the cultural context (trust, equity),
physical setting, and environment (competition, uncertainty, time pressure, etc.).
Interaction variables are related to individual outcomes such as expectations and satisfaction with
system use, group factors (e.g., quality of group performance), and system outcomes
(enhancements and affordances). Some independent variables that characterize a socio-
A cross-sectional analysis of CSCW and related research requires the consideration of
previous studies with similar purposes to identify gaps and reduce duplicated
effort. Research advances in CSCW have spread across journals, conference proceedings,
posters, tutorials, book series, technical reports, presentation videos, slides, and social
media platforms developed for scientists and researchers. Prior approaches to evaluating
CSCW bibliography emphasized a need for ‘‘building and maintaining a reference col-
lection of relevant publications’’ (Greenberg 1991), describing its evolution in terms of
systems and concepts deployed since the origins of the field. Jonathan Grudin also explored
historical changes and demographics in 1994. According to the author, North American
foci witnessed a paradigmatic shift from mainframe computing and large minicomputer
systems (developed to support organizational tasks and goals in the 1960s) to applications
conceived primarily for individual users in the 1990s. Similarities between the ACM
Conference on Computer-Supported Cooperative Work and Social Computing (ACM
CSCW) and the ACM CHI Conference on Human Factors in Computing Systems (CHI)
characterized the North American CSCW community as a consequence of HCI research,
‘‘fueled by computer companies moving beyond single-user applications to products
supporting small-group activity’’. This contrasted with a considerable attention from
European research labs on designing large-scale systems, often in user organizations. A
less visible but also a critical source of contributions stemmed from Asian researchers
affiliated with product development and telecommunications companies (Grudin 1994).
A review of prior groupware evaluations (Pinelle and Gutwin 2000) preceded the
turn of the twenty-first century with an examination of the ACM CSCW conference
proceedings (1990–1998). Analyzing a sample of 45 publications that only introduced
and/or evaluated a groupware system, findings pointed to a small number of deployments
in real world settings when comparing with academic or research implementations. Lab-
oratory experiments and field studies were the primary types of evaluation, but ‘‘almost
forty percent of the articles contained no formal evaluation at all''. The lack of formal
evaluation was further corroborated by the fact that only a very small number of studies examined the
impact on work practices in a user group when a piece of groupware was introduced.
CSCW has evolved in response to constant socio-technical advances, and the first
known bibliometric study devoted to CSCW research in this century was published by
Holsapple and Luo in 2003. The authors applied citation analysis to examine variations
among highly influential journals. Tracking contributions, a total of 19,271 citations from
journal papers, conference series, technical reports, and books were analyzed across an
8-year period (1992–1999). An appraisal of the most significant CSCW outlets indicated
their presence in journals from computer science, business computing, library science,
and communications related disciplines. Reasonably, the interdisciplinary nature of CSCW
subject was characterized by a research focus on four prominent journals, including an
emphasis on design and use issues by the CSCW Journal, whose founding coincided
with the initial wave of critical-mass interest in CSCW research. Technically, the CSCW
Journal had a higher number of citations from conference proceedings, contrasting with the
lowest number of journal citations and papers cited. Results also revealed that its citation
count remained constant through the last century.
Earlier studies of CSCW research demonstrated the value of examining its configuration
by analyzing topics of interest, citation indicators, co-authorship networks, etc. Results
achieved by Horn et al. (2004) denoted a ‘‘high volatility in the composition of the CSCW
research community over decade-long time spans''. This view corroborates a 'punctuated
equilibrium' in which periods of evolutionary stasis are disrupted by short
bursts of sudden technological waves (Grudin 2012). Co-authorship network analysis
revealed that CSCW researchers maintained a high proportion of collaborations with
authors from outside fields of research between 1999 and 2003 (Horn et al. 2004). Col-
laboration robustness and co-authoring partnership were identified for HCI and CSCW
cumulative networks (1982–2003), and a top 20 of HCI and CSCW prolific authors was
identified using co-authorship networks. Physical proximity has represented an important
factor in establishing collaborations between HCI and CSCW researchers. This falls
short of other fields, where researchers cooperate actively across distinct institutions. It is
also relevant to note that a considerable number of highly central authors in CSCW were
also highly central in the HCI community. Oddly enough, there were quite a few cases in
which co-authorship happened between married couples. The authors went even further by
performing a citation data analysis from ACM CSCW conference series (2000–2002),
displaying a preference for citing journal papers, followed by CSCW devoted conferences
(i.e., ACM CSCW, GROUP, and ECSCW), books, CHI, other conferences, book chapters,
URL, and other reports.
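Co-authorship analyses of this kind begin by turning each paper's author list into edges between author pairs, which are then weighted and analyzed (e.g., with centrality measures, as in Horn et al. 2004). A minimal sketch of the edge-building step follows; the author lists are invented purely for illustration and are not data from any of the studies discussed.

```python
from itertools import combinations
from collections import Counter

# Hypothetical bibliographic records: each paper lists its authors.
papers = [
    ["Ackerman", "Grudin"],
    ["Grudin", "Poltrock"],
    ["Schmidt", "Bannon"],
    ["Grudin", "Bannon", "Schmidt"],
]

def coauthorship_edges(papers):
    """Count how often each unordered author pair co-authored a paper."""
    edges = Counter()
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            edges[(a, b)] += 1
    return edges

def degree(edges):
    """Number of distinct collaborators per author (network degree)."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

edges = coauthorship_edges(papers)
print(edges[("Bannon", "Schmidt")])  # 2: this pair co-authored two papers
print(degree(edges)["Grudin"])       # 4 distinct collaborators
```

On the resulting weighted graph, measures such as betweenness centrality (reported by Horn et al. and others) can then be computed with standard network-analysis libraries.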
A citation graph analysis of the ACM CSCW conference series (1986–2004) mapped the
field (Jacovi et al. 2006). Results pointed to a relatively low number of papers that were never
cited. Eight prominent clusters emerged from the structural analysis, exhibiting ''a
stable presence of a computer science cluster throughout the years’’ with decreasing
indicators in 2004 given the rise of multimedia tools and media spaces. This cluster was
characterized by prototypes and toolkits for collaborative tools, architectures of computer
systems, including other subjects devoted to computer science. The evolution of the social
science cluster in the ACM CSCW revealed periods of decomposition into sub-topics and
small clusters, followed by a trend toward convergence in the present century. Results from
Convertino et al.’s (2006) study on the structural and topical characteristics of ACM
CSCW conference (1986–2002) showed a high number of contributions by academics
from the USA (70–90%) compared with Europe (10–30%). A focus on group issues was
predominant throughout the period of analysis, contrasting with organizational and indi-
vidual levels. Theoretical approaches declined gradually, whilst design contributions
focused on design aspects and/or system architectures remained consistently high.
Ethnography and experimental approaches increased steadily, while communication and
information sharing were considered the main functions of cooperative work presented in
the ACM CSCW conference, followed by coordination and awareness. In addition, the
main focus on design issues (system prototypes and architectures) was clearly visible.
Curiously, the conference also showed an emphasis on topics fully oriented towards CSCW and HCI
instead of expanding its scope to topics from other disciplines.
Corroborating previous results, Wainer and Barsottini (2007) examined four ACM
CSCW biannual series (1998–2004) indicating a remarkable decline of non-empirical
papers, a stable portion of design and evaluation studies, a high number of descriptive
studies, and a constant growth in the number of publications testing hypotheses about
group dynamics and collaboration settings through experiments. Groupware design and
evaluation papers usually describe system details instead of evaluation results, an indicator
suggesting that collaboration systems have been created without major concern for
evaluating contexts of use. Keegan et al. (2013) studied the dynamics of ACM
CSCW proceedings (1986–2013) by applying social network analysis and bibliometrics.
The authors suggested that the impact of CSCW authors and publications is
strongly related to their position within the community's collaboration and citation networks. From
this point of view, they presented an analytical framework in which the ‘‘impact is sig-
nificantly influenced by structural position, such that ideas introduced by those in the core
of the CSCW community (e.g., elite researchers) are advantaged over those introduced by
peripheral participants (e.g., newcomers)''. Recently, researchers have returned to
exploring the ACM CSCW conference, this time over a 25-year period (Wallace et al. 2017). The
authors draw our attention to how work practices have been impacted by
changes in technology from 1990 to 2015, taking into account synchronicity and
distribution, research and evaluation type, and the device hardware targeted by each publication.
CSCW studies are primarily concerned with revealing the nature of particular work
practices within which collaboration occurs. However, a somewhat distinct but interrelated
body of work has sought to understand collaboration dynamics in learning settings. Kienle
and Wessner (2006) examined the Computer-Supported Collaborative Learning (CSCL)
community and suggested some insights about the growth and maturity of its principal
conference (1995–2005). The findings revealed a stable ratio of new and recurrent
members and an increasing number of citations and co-authorship indicators. Recently, this
study was updated by combining social network analysis, document co-citation analysis
and exploratory factor analysis for gauging contemporary CSCL research (Tang et al.
2014). In the meantime, a bibliometric exercise was also made for the CRIWG Interna-
tional Conference on Collaboration and Technology (2000–2008), a publication venue
‘‘initially thought to be a meeting to exchange research approaches in the field of group-
ware for a few groups'' (Antunes and Pino 2010). CSCW represents the main research area
referenced in CRIWG conference proceedings. The primary focus of this community
is on building prototypes and designing collaborative systems and tools, followed by
theory and model development. In addition, a large number of CRIWG papers do not
present an emphasis on evaluation, addressing architectures, frameworks, prototypes and
design issues without an articulated focus on building guidelines for developers.
Recent scientometric studies have also focused on CHI-related conference series
held in specific regions or countries, including but not limited to the UK (Padilla et al.
2014), Korea (Lee et al. 2014), India (Gupta 2015), New Zealand (Nichols and Cun-
ningham 2015), Brazil (Barbosa et al. 2016), and Australia (Mubin et al. 2017). Such
studies applied techniques that range from topic modeling to trend analysis, author gender
analysis, institutional-level analysis and co-authorship networks, citations, and authoring
frequency (Mubin et al. 2017). In most instances, scientometric examinations exhibit a
quantitative emphasis on citation analysis and authorship data (Table 1). It is also worth
noting that such studies in the field of HCI exhibit low citation rates compared
with other types of publications (e.g., system design papers).
Table 1 An overview of some of the bibliometric studies on HCI and contiguous literature in the twenty
first century
Years Author(s) Temporal Sample size Analytical dimensions
coverage
2003 Holsapple and 1992–1999 7364 publications Citation count; number of papers;
Luo rankings of influential journals; research
approach
2004 Horn et al. 1982–2003 22,887 publications Co-authorship networks; betweenness
centrality; author rankings; number of
members; citations by publication
source and date; visibility and stability
of the CSCW community
2005 Chen et al. 1980–2004 3620 authors and 2309 Co-authorship networks; hybrid networks
publications from of topical terms; co-citation analysis
the field of HCI
2006 Convertino 1986–2002 300 publications Author affiliation; geographical location;
et al. level of analysis; type of study; CSCW
characteristics; analysis of references;
Social Network Analysis
2006 Wania et al. 1990–2004 64 authors from the Co-citation count; cluster analysis; author
field of HCI co-citation map
2006 Jacovi et al. 1986–2004 465 publications Most quoted papers; modularity;
clustering; betweenness centrality;
chasm analysis
2006 Oulasvirta 1990–2006 CHI proceedings Most cited first authors; influential sites of
research; productive authors; cited
papers; expected citations
2006 Kienle and 1995–2005 CSCL proceedings Continuity of active and passive
Wessner membership; geographical distribution;
international connectivity of the
community; qualitative study of
policies and motives
2007 Wainer and 1998–2004 ACM CSCW Number of papers; research approach;
Barsottini proceedings references
2007 Henry et al. 1983–2006 CHI, UIST, AVI, and Most prolific authors; most cited
InfoVis proceedings researchers; betweenness centrality;
citation count; number of papers;
number of references; acceptance rate;
sources of key papers; keywords; co-
authorship networks; macro structure;
HCI communities
2007 Barkhuus and 1983–2006 CHI proceedings Research approach and methods; number
Rode of papers; average number of
participants
2008 Meho and 1996–2007 22 researchers from Number of citations by document type;
Rogers the field of HCI top sources of citations by database;
citation counts and rankings of
researchers; differences between
citation indexes in terms of top citing
entities of the three most cited
researchers; h-index scores of
researchers; rankings between citation
indexes
2009 Kaye 1983–2008 CHI proceedings Author count; gender analysis; repeat
authorship; conference growth
123
Scientometrics
Table 1 continued
2009 | Bartneck and Hu | 1981–2008 | CHI proceedings | Number of papers; pages per year; citation count; geographical distribution; number of authors; h-index; g-index; top organizations; affiliations; best paper awards
2010 | Antunes and Pino | 2000–2008 | CRIWG proceedings | Citation count; number of papers; top cited papers; top authors; country distribution; co-authorship; special issues; analysis of references; main research areas; research objectives; topics; evaluation methods; SWOT analysis
2010 | Bartneck and Hu | 1983; 1985–1991; 1993–2008 | 5697 publications | Network graph of continents and countries; collaboration conditioning factors; average number of organizations; citation count
2011 | Bartneck | 2006–2010 | HRI proceedings | Keyword analysis; citation count; ranking of countries; top organizations; top authors
2013 | Keegan et al. | 1986–2013 | ACM CSCW proceedings | Co-authorship; citation count; top authors; top papers
2013 | Correia et al. | 2003–2010 | 1480 publications | Citation count; type of research; representativeness by country
2014 | Padilla et al. | 2008–2014 | BHCI and CHI proceedings | Topic modeling; trend analysis
2014 | Liu et al. | 1994–2013 | CHI proceedings | Citation count; number of papers; keyword analysis; size; co-word frequency; cohesion coefficient; centrality; density
2014 | Kumar | 2006–2011 | 63,137 publications | Distribution of author publication productivity; Lotka's exponent and constant
2014 | Tang et al. | 2006–2013 | ijCSCL and CSCL proceedings | Document co-citation analysis; exploratory factor analysis; social network analysis
2014 | Lee et al. | 1991–2014 | HCI Korea proceedings | Institution-level analysis; internal and external collaborative networks
2015 | Nichols and Cunningham | 2001–2012 | CHINZ proceedings | Citation count; type of research; most cited papers; authoring frequency
2015 | Gupta | 2010–2014 | IndiaHCI proceedings | Demographics; citation count; number of papers; number of authors; keyword analysis
2016 | Barbosa et al. | 1998–2015 | IHC proceedings | Number of papers and acceptance rate per year; mean number of authors per year; author gender analysis; co-authorship networks; countries of paper authors; institution analysis; mean number and percentage of references per paper; histogram of papers with self-citations; citation analysis; bibliographic coupling; co-citation networks; keyword analysis
Table 1 continued
2017 | Wallace et al. | 1990–2015 | ACM CSCW proceedings | Synchronicity and distribution; research type; evaluation type; devices
2017 | Mubin et al. | 2006–2015 | OzCHI proceedings | Submissions and acceptance rates; popular authors and affiliations; popular keywords and paper tracks; citation analysis
Method
The dimension of a field is measured by the size of its literary core, rather than the number
of researchers (Garfield 1972). Researchers and educators can benefit from the application
of quantitative methods for estimating the impact and coverage of publications and
information dissemination services while detecting prolific authors, affiliations, countries,
and trending topics (Pritchard 1969). The quantitative approach to research is also con-
cerned with objective assessment of data for clarifying inconsistencies and conflicts while
contributing to formulate facts, uncover patterns, and generalize results by manipulating
pre-existing statistical data (Mulrow 1994). Moreover, quantitative data collection methods
and analytical processes are more structured, explicit, and strongly informed than would be
the norm for qualitative approaches (Pope et al. 2000). Qualitative research is constructivist by nature and is driven by the need to understand a concept or phenomenon for which prior research is insufficient or novel (Bird et al. 2009). Quantitative science, on the other hand, is grounded in postpositivist knowledge claims and can be applied to generate statistically reliable data from a typically larger sample (Sandelowski 1995).
Despite the usefulness of Systematic Literature Reviews (SLR), defined as ‘‘a form of
secondary study that uses a well-defined methodology to identify, analyze and interpret all
available evidence related to a specific question in a way that is unbiased and (to a degree)
repeatable'' (Keele 2007), they are arduous and require more time than other review methods. In addition, solid literature coverage of the observed phenomenon is essential for this method, which is more adequate when the data come from situations where the results can be broadly expected to be similar. It is very difficult to follow an up-to-date research agenda using such methods due to the large number of papers published every year. Large volumes of information must be reduced for digestion, and it is also hard to document the whole process in an SLR, even for experienced researchers (Stapić et al. 2016).
A systematic mapping study should be used when a topic is either barely or very broadly covered, while tertiary reviews are only suitable if several reviews in the domain of analysis already exist and should be summarized. A meta-analysis is a technique that statistically combines evidence from different studies to provide a more precise estimate of an effect (Glass 1976). Substantial differences between fields and years in the relationship between citations and many other variables (e.g., co-authorship and nationality of authors) can be expected, so it is problematic to average results across fields or years (Bornmann 2015). As an objective and systematic means of describing phenomena, content
analysis can be understood as ‘‘a research method for making replicable and valid infer-
ences from data to their context, with the purpose of providing knowledge, new insights, a
representation of facts and a practical guide to action’’ (Krippendorff 1980). The aim is to
attain a condensed and broad description of the phenomenon, and the outcome of the
analysis is concepts or categories describing the phenomenon. Usually the purpose of those
concepts or categories is to build up a model, conceptual system, conceptual map or
category (Elo and Kyngäs 2008). Quantitative instruments are used for measuring publications, researchers, venues, institutions, and topics (Glänzel 2009). We are particularly interested in measuring the impact of publications at this first stage, so content analysis will be part of future work. We linked the various types of analyses carried out in this paper with the research questions addressed in the introductory section.
Sample
This work relies on the analysis of a 15-year period (2001–2015) shaped by successive technological deployments to support or enhance collaborative practices. Our corpus of study comprises a total of 1713 publications, including 985 papers published at ACM CSCW, 298 papers from the International Conference on Supporting Group Work (GROUP), 165 papers from the European Conference on Computer-Supported Cooperative Work (ECSCW), and 265 articles published in Computer Supported Cooperative Work: The Journal of Collaborative Computing and Work Practices (JCSCW). When using scientometrics, it is necessary to consider several
issues related to the selection of papers and data normalization (Holsapple and Luo 2003).
Evaluating contributions in the field of CSCW is itself a challenging exercise
since ‘‘any heuristic chosen to identify which venue or publication belongs to CSCW field
is error prone and will be subject to criticism and arguments'' (Jacovi et al. 2006). Regarding the publication venues selected for this study, we chose to cover them mainly because their scientific committees and editorial boards include some of the most cited authors in CSCW. As such, one may expect these venues to provide a representative (although limited) sample of the work published in this century.
The specificity criterion relies primarily on Horn et al.'s (2004) categorization, which
acknowledged ACM CSCW, GROUP, and ECSCW as devoted venues in the field of
CSCW. The authors also recognized JCSCW as a CSCW-related outlet, while Holsapple
and Luo (2003) considered JCSCW articles as extremely specialized and influential con-
tributions to the body of CSCW knowledge for shaping research. This selection criterion
was also discussed in other studies that considered the included sources as flagship venues
of the field for regular publication by CSCW authors from North American and European
communities (Jacovi et al. 2006; Grudin and Poltrock 2012). On the other hand, Horn et al.
(2004) considered CHI and the ACM Symposium on User Interface Software and Tech-
nology (UIST) as non-CSCW publication outlets. We chose to include only peer-reviewed
documents, so book chapters were excluded in order to bring consistency to the data
analysis.
A vast number of studies have focused on CHI proceedings (as shown in Table 1), sketching a comprehensive portrait of this conference. The factors chosen to delimit the sample of this study also include the recurrent presence of CSCW topics in all selected venues, complementarity with previous bibliometric studies, and free (online) access to the proceedings of ECSCW (Grudin and Poltrock 2012). Moreover, we
can access the entire sample electronically in the ACM Digital Library (ACM-DL) and
SpringerLink database for extracting publicly available bibliometric indicators (Jacovi
et al. 2006). This is not possible for venues such as the IEEE International Conference on
Computer Supported Cooperative Work in Design (CSCWD) and International Journal of
Cooperative Information Systems (IJCIS), which were initially examined but the findings
were not considered in this study due to their low representativeness when compared to the
selected outlets. Moreover, we chose to limit the data collection and analysis by selecting
only full and short papers, excluding plenary sessions, panels, posters, keynotes, intro-
ductions to special issues, workshops, extended abstracts, tutorials, and video presenta-
tions. Furthermore, editorials and tables of contents without a linear structure for analysis
were also omitted.
Seeking publications devoted to CSCW was a starting point for data retrieval and indexing.
We used Google Scholar (GS) as the main source of our analysis, following previous authors such as Bartneck and Hu (2009) and Mubin et al. (2017), since it offers
the widest coverage of scientific literature in comparison to Web of Science (WoS) and
Scopus (Bar-Ilan et al. 2007; Meho and Yang 2007). The inclusion of ACM-DL and
SpringerLink was primarily due to their freely accessible altmetrics and wide coverage of
CSCW records. Although SCImago provides some indicators from Scopus at the journal and country levels (Jacsó 2010), this service does not contain bibliometric data for some CSCW
conferences. More details on the comparison of data sources for citation analysis will be
given in the ‘‘Appendix’’.
For the purpose of this study, we extracted the publication metadata for every work
listed in the tables of contents provided by DBLP, ACM-DL, and SpringerLink for every
CSCW venue from 2001 through 2015. Bibliometrics and altmetrics were gathered from a
total of 1520 full papers and 193 short papers between May 28, 2016 and July 3, 2016.
Self-citations were not removed from the dataset. For each record, the following data was
manually inspected and recorded in a spreadsheet:
1. ID
2. Year of publication
3. Publication outlet
4. Paper title
5. Author(s)
6. Per-author affiliation
7. Country of authors’ affiliation
8. Author listed keywords
9. Citations of the paper (as available on GS, ACM-DL, and SpringerLink)
10. Altmetrics (downloads, readers, and social mentions)
11. Notes (e.g., errors identified in the study, and additional comments)
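For reference, one row of the spreadsheet described above can be sketched as a simple data structure; the field names below are ours and purely illustrative, not the actual column names used in the study:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PaperRecord:
    """One record of the manually inspected spreadsheet (illustrative names)."""
    paper_id: int
    year: int
    outlet: str            # e.g., "ACM CSCW", "GROUP", "ECSCW", "JCSCW"
    title: str
    authors: list          # ordered list of author names
    affiliations: list     # per-author affiliation
    countries: list        # country of each author's affiliation
    keywords: list         # author-listed keywords (empty for ECSCW papers)
    citations_gs: Optional[int] = None   # Google Scholar
    citations_acm: Optional[int] = None  # ACM Digital Library
    citations_sl: Optional[int] = None   # SpringerLink
    downloads: Optional[int] = None      # altmetrics
    notes: str = ""

# Example record populated from one of the papers in the sample
record = PaperRecord(
    paper_id=1, year=2006, outlet="ACM CSCW",
    title="A Face(book) in the Crowd: Social Searching versus Social Browsing",
    authors=["Cliff Lampe", "Nicole B. Ellison", "Charles Steinfield"],
    affiliations=["Michigan State University"] * 3,
    countries=["USA"] * 3,
    keywords=["social networking"],
    citations_gs=1230,
)
```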
Data processing
Once all the metadata were collected and stored in a spreadsheet, we used several methods
for data cleaning and processing. In fact, these processes occurred in numerous stages and
cycles. A visual exploration of the resulting datasets allowed us to become more familiar with the data while revealing faults in the data cleaning. In addition, such visualizations unveiled new data to collect, as well as combinations and calculations that would be advantageous to explore. Error detection and de-duplication efforts were performed at different levels to
avoid misunderstandings. Before we could examine author and affiliation information, we fixed misspellings, typos, and cases where the same entity appeared in different
formats. Such problems are challenging for both humans and computer algorithms. Our
sample contained numerous variations in the way one’s name is reported. For example,
‘‘David W. Randall’’, ‘‘Dave Randall’’ and ‘‘David Randall’’ referred to the same author.
As discussed by Horn et al. (2004), this array of analogous identities must be resolved to a
single identifier (otherwise each identity will be treated as a separate entity). We manually
standardized the authors, affiliations, countries, and keywords. Thus, 3509 keywords were selected for the period 2001–2015 using name matching and the merging of synonyms and of singular and plural forms of gerunds, nouns, abbreviations, and acronyms (Liu et al. 2014).
This covered 1329 papers (around 85.85% of the total number of documents published in
this period, since ECSCW publications do not contain author keywords).
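Resolving analogous identities to a single identifier, as described above, can be sketched as a small normalization step; the alias table and function names below are hypothetical, not the actual scripts used in the study:

```python
import unicodedata

# Hypothetical alias table built during manual inspection: each reported
# variant maps to one canonical author identifier.
ALIASES = {
    "dave randall": "David Randall",
    "david w. randall": "David Randall",
    "david randall": "David Randall",
}

def normalize(name: str) -> str:
    """Lowercase, strip accents, and collapse whitespace for matching."""
    name = unicodedata.normalize("NFKD", name)
    name = "".join(ch for ch in name if not unicodedata.combining(ch))
    return " ".join(name.lower().split())

def canonical_author(name: str) -> str:
    """Resolve a reported name to a single identifier, else keep it as-is."""
    return ALIASES.get(normalize(name), name)
```

With this mapping, "Dave Randall" and "David W. Randall" resolve to the same entity instead of being counted separately.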
After performing manual standardization, a total of 5416 individuals and 703 institu-
tions were identified and included for analysis. A lot of authors worked for distinct
organizations. For instance, Liam Bannon worked (at least) for University of Limerick and
Aarhus University, and Kjeld Schmidt worked for IT University of Copenhagen and
Copenhagen Business School. Last but not least, 40 authors’ affiliation countries were
crawled based on demographic data. We considered all the contributory institutions while
assuming that the country of each organization does not necessarily match the nationality
of the author. Once the input data was stored, we counted occurrences in the dataset and combined calculations between them. We also compared publications by their citation patterns and re-read the papers with the highest citation values to understand the factors influencing them. Cluster analysis
was applied for grouping terms based on their similarity. Moreover, different weights were given to each set of keywords according to their word frequency. Thus, we were able
to identify core research concepts from combined corpora of the four venues. In addition to
citation and authorship data also analyzed in earlier studies, we examined alternative
metrics of impact with an ‘‘additional ecological validity’’ (Keegan et al. 2013). These
altmetrics constitute a first step towards understanding who reads CSCW work and how the literature and current databases help people accomplish their search needs. It should
be noted that the techniques used here are not without problems, both in the data collection
and interpretation phases. Some problems include but are not limited to treating short
papers in the same way as long papers and treating self-citations with the same level of importance as external citations.
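The keyword merging and frequency weighting applied before clustering can be sketched as follows; the synonym table is an invented example of the kind of mapping used, not the study's actual table:

```python
from collections import Counter

# Illustrative synonym table merging plural/singular forms and acronyms
SYNONYMS = {
    "cscw": "computer-supported cooperative work",
    "computer supported cooperative work": "computer-supported cooperative work",
    "social networks": "social network",
    "wikis": "wiki",
}

def merge(keyword: str) -> str:
    """Lowercase, collapse whitespace, and map to a canonical form."""
    kw = " ".join(keyword.lower().split())
    return SYNONYMS.get(kw, kw)

def keyword_weights(papers):
    """Frequency of merged keywords across papers; the counts serve as
    term weights when grouping keywords into topic clusters."""
    return Counter(merge(k) for kws in papers for k in kws)

weights = keyword_weights([["CSCW", "wikis"], ["Wiki", "social networks"]])
# weights["wiki"] == 2
```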
Findings
In this section we report on the results obtained from our scientometric analysis of 15 years
of CSCW research and discuss the main findings. We considered the total and average
number of citations (and quantity of publications without citations) according to GS,
ACM-DL, and SpringerLink. Due to the limitations of the databases in the citation retrieval process, we compared only those devoted venues and years that present the required indicators for this exercise, discarding null values caused by missing data.
Some conferences occur every 2 years, which affects citation counts and influences the
overall results. In our initial exploration of the resulting data, the average number of citations in GS is higher than in the other citation indexes, representing a more comprehensive view of what is happening worldwide by comprising not only citations in scholarly papers but also references in other documents such as PhD theses, MSc dissertations, book chapters, and technical reports. Looking at the results, a total of 74,846
citations were retrieved from GS, while ACM-DL allowed counting 19,966 citations and
SpringerLink showed only 425. Table 2 presents a summary of the quantitative indicators
calculated and organized by publication venue and year.
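The per-index totals and averages reported in Table 2 reduce to simple aggregation that skips records for which an index reports no value; a minimal sketch, with assumed field names:

```python
def citation_stats(papers, index):
    """Total citations, average per paper, and zero-citation count for one
    citation index, ignoring papers with no value (None) for that index."""
    values = [p[index] for p in papers if p.get(index) is not None]
    total = sum(values)
    avg = total / len(values) if values else 0.0
    uncited = sum(1 for v in values if v == 0)
    return total, avg, uncited

# Toy sample: the third paper is not indexed by this source at all
papers = [{"gs": 55}, {"gs": 0}, {"gs": None}, {"gs": 25}]
total, avg, uncited = citation_stats(papers, "gs")
# total == 80, uncited == 1
```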
Table 2 Overall outputs of CSCW research (2001–2015) organized by: Total Number of Publications (A), Total Number of Citations (Google Scholar) (B), Average Number
of Citations (Google Scholar) (C), Total Number of Citations (ACM Digital Library) (D), Average Number of Citations (ACM Digital Library) (E), Total Number of Citations
(SpringerLink) (F), Average Number of Citations (SpringerLink) (G), Total Number of Authors (H), Average Number of Authors (I), Total Number of Countries/Institutions
(J), Average Number of Countries/Institutions (K), Total Number of Keywords (L), Average Number of Keywords (M), Total Number of References (N), and Average
Number of References (O)
Source A B C D E F G H I J K L M N O
GROUP 298 12,175 (10)a 40.86 3059 (37) 4.23 – – 852 2.86 875 2.94 1146 3.85 – –
2001 32 1760 (0) 55 377 (0) 11.78 – – 78 2.44 81 2.53 97 3.03 – –
2003 41 2247 (0) 54.8 556 (0) 13.56 – – 109 2.66 112 2.73 168 4.1 – –
2005 41 2503 (1) 61.05 677 (4) 16.51 – – 112 2.73 113 2.76 164 4 – –
2007 47 2304 (1) 49.02 615 (3) 13.09 – – 147 3.13 150 3.19 238 5.06 – –
2009 40 2499 (0) 62.48 596 (1) 14.9 – – 111 2.78 115 2.88 157 3.93 – –
2010 36 524 (2) 14.56 152 (6) 4.22 – – 112 3.11 112 3.11 128 3.56 – –
2012 35 268 (1) 7.66 66 (10) 1.89 – – 102 2.91 102 2.91 105 3 – –
2014 26 70 (5) 2.69 20 (13) 0.77 – – 81 3.12 90 3.46 89 3.42 – –
ACM CSCW 985 41,879 (22) 42.52 12,414 (102) 12.6 – – 3344 3.39 3436 3.49 4298 4.36 – –
2002 39 6151 (0) 157.72 1695 (1) 43.46 – – 122 3.13 127 3.26 190 4.87 – –
2004 75 8137 (0) 108.49 2239 (2) 29.85 – – 258 3.44 263 3.51 323 4.31 – –
2006 62 6266 (0) 101.06 1655 (1) 26.69 – – 207 3.34 212 3.42 279 4.5 – –
2008 86 5530 (0) 64.3 1722 (4) 20.02 – – 278 3.23 284 3.3 366 4.26 – –
2010 58 3503 (0) 60.4 1136 (1) 19.59 – – 174 3 174 3 296 5.1 – –
2011 67 2363 (0) 35.27 746 (4) 11.13 – – 223 3.33 231 3.45 273 4.07 – –
2012 164 4267 (3) 26.02 1329 (11) 8.1 – – 568 3.46 583 3.55 657 4.01 – –
2013 139 2796 (2) 20.12 964 (11) 6.94 – – 475 3.42 492 3.54 624 4.49 – –
2014 134 1959 (1) 14.62 659 (13) 4.92 – – 464 3.46 473 3.53 593 4.43 – –
2015 161 907 (16) 5.63 269 (54) 1.67 – – 575 3.57 597 3.71 697 4.33 – –
ECSCW 165 6468 (8) 39.2 1449 (0) 22.29 [65]b 160 (98) 0.97 516 3.13 541 3.28 – – 4210 29.65 [142]
2001 21 2030 (0) 96.67 579 (0) 27.57 11 (16) 0.52 62 2.95 65 3.1 – – 494 23.52
2003 20 1881 (0) 94.05 517 (0) 25.85 52 (7) 2.6 64 3.2 72 3.6 – – 559 27.95
Table 2 continued
Source A B C D E F G H I J K L M N O
2005 24 1151 (0) 47.96 353 (0) 14.71 35 (9) 1.46 66 2.75 70 2.92 – – 687 28.63
2007 23 656 (1) 28.52 – – 26 (11) 1.13 79 3.43 79 3.43 – – – –
2009 23 372 (0) 16.17 – – 18 (13) 0.78 69 3 70 3.04 – – 635 27.61
2011 22 218 (1) 9.91 – – 8 (16) 0.36 80 3.64 84 3.82 – – 713 32.41
2013 15 132 (1) 8.8 – – 7 (11) 0.47 44 2.93 44 2.93 – – 595 39.67
2015 17 28 (5) 1.65 – – 3 (15) 0.18 52 3.06 57 3.36 – – 527 31
JCSCW 265 14,324 (11) 54.05 3044 (29) 11.89 [256] 265 (31) 13.58 704 2.66 720 2.71 1637 6.18 13,848 52.26
2001 13 926 (1) 71.23 157 (1) 12.07 233 (1) 17.92 26 2 26 2 83 6.38 627 48.23
2002 17 3058 (0) 179.88 606 (0) 35.65 668 (0) 39.29 41 2.41 41 2.41 103 6.06 788 46.35
2003 17 1866 (0) 109.76 297 (0) 17.47 427 (0) 25.12 43 2.53 44 2.59 106 6.24 636 37.41
2004 20 1677 (0) 83.85 427 (0) 22.47 [19] 358 (0) 17.9 49 2.45 50 2.5 118 5.9 811 40.55
2005 16 1805 (0) 112.81 417 (1) 26.06 503 (0) 31.44 40 2.5 41 2.56 92 5.75 927 57.94
2006 19 1048 (1) 55.16 221 (1) 11.63 250 (2) 13.16 51 2.68 52 2.74 104 5.47 813 42.79
2007 19 1069 (0) 56.26 237 (1) 12.47 277 (0) 14.58 62 3.26 64 3.37 110 5.79 813 42.79
2008 19 506 (0) 26.63 103 (1) 5.42 135 (0) 7.11 55 2.89 58 3.95 114 6 736 38.74
2009 20 438 (0) 21.9 108 (1) 5.4 191 (0) 9.55 48 2.4 49 2.45 134 6.7 1035 51.75
2010 20 682 (0) 34.1 166 (1) 8.3 206 (2) 10.3 48 2.4 50 2.5 149 7.45 1085 54.25
2011 16 248 (0) 15.5 67 (4) 4.19 69 (2) 4.31 38 2.38 38 2.38 86 5.38 952 59.5
2012 16 322 (0) 20.13 52 (0) 4 [13] 106 (2) 6.63 44 2.75 44 2.75 101 6.31 954 59.63
2013 16 497 (1) 31.06 145 (2) 9.06 122 (3) 7.63 49 3.06 51 3.19 107 6.69 1731 108.19
2014 18 161 (0) 8.94 32 (1) 2.46 [13] 47 (5) 2.61 60 3.33 60 3.33 118 6.56 976 54.22
2015 19 21 (8) 1.11 9 (15) 0.47 7 (14) 0.37 50 2.63 52 2.74 112 5.89 964 50.74
a Number of publications without citations
b Number of publications considered for calculating the average number of occurrences
Fig. 2 Evolution of the number of publications, authors, and average citations (by year)
a mature community. There was also a considerable gap between the average number of
citations and the total number of authors per venue in ACM CSCW compared with other venues.
Taking a more granular perspective on JCSCW citations, they remained constant between 2001 and 2015. A similar trend had already been recognized
for the 1992–1999 period when Holsapple and Luo (2003) identified a total of 1246
publications and 239 journals cited, from which 28.21% symbolized journal citations,
38.85% represented book citations, 6.04% reflected technical report citations, and 24.9%
were citations from conference proceedings (in which 21% represented CSCW devoted
conferences) in a total of 4417 citations. In our study, JCSCW had 14,324 citations in GS,
3044 citations in ACM-DL (with an average of 11.89 citations and 29 papers without any
citation), and 265 citations in SpringerLink (with an average number of 13.58 citations per
paper and 31 papers without citations). The most cited papers in our sample were published
in ACM CSCW, GROUP, and JCSCW (Table 3). Their main topics ranged from social
networks and online communities to crowd work, Wikipedia, and instant messaging. It
should be noted that the type of topics addressed by the community has a significant effect
on the number of citations.
The dispersion of cited studies in the ACM CSCW proceedings presented a ‘‘strong bias
toward citing recent papers within CSCW’’ (Horn et al. 2004), which is corroborated by the
deviation from trend in the number of citations for this century. If we look at the number of
citations received per year, it shows a typical birth–death process beginning with few
citations, then the number increases, and finally they die away as the content becomes
obsolete. Mingers and Leydesdorff (2015) also discussed other variants to this pattern,
including ‘‘shooting stars’’ that are highly cited but die quickly and ‘‘sleeping beauties’’
that are ahead of their time. As others have highlighted (e.g., Bartneck 2011; Gupta 2015),
it usually takes 1 year to get the first citations and they tend to increase annually thereafter.
The reasons leading authors to cite a paper derive from multiple factors, including the pertinence of its theme to the topic they want to address, the quality of the work, and the proximity of members within a scientific community (Jacovi et al. 2006); conversely, a research approach may be largely neglected through partial lack of awareness or empathy.
2006 | ACM CSCW | Cliff Lampe, Nicole B. Ellison, Charles Steinfield | A Face(book) in the Crowd: Social Searching versus Social Browsing | 1230
2002 | JCSCW | Carl Gutwin, Saul Greenberg | A Descriptive Framework of Workspace Awareness for Real-Time Groupware | 912
2009 | GROUP | Dejin Zhao, Mary Beth Rosson | How and Why People Twitter: The Role that Micro-blogging Plays in Informal Communication at Work | 842
2004 | ACM CSCW | Bonnie A. Nardi, Diane J. Schiano, Michelle Gumbrecht | Blogging as Social Activity, or, Would You Let 900 Million People Read Your Diary? | 749
2004 | ACM CSCW | Gerard Beenen, Kimberly S. Ling, Xiaoqing Wang, Klarissa Chang, Dan Frankowski, Paul Resnick, Robert E. Kraut | Using Social Psychology to Motivate Contributions to Online Communities | 725
2005 | GROUP | Susan L. Bryant, Andrea Forte, Amy Bruckman | Becoming Wikipedian: Transformation of Participation in a Collaborative Online Encyclopedia | 676
2010 | ACM CSCW | Mor Naaman, Jeffrey Boase, Chih-Hui Lai | Is it Really About Me?: Message Content in Social Awareness Streams | 642
2008 | ACM CSCW | Joan Morris DiMicco, David R. Millen, Werner Geyer, Casey Dugan, Beth Brownholtz, Michael J. Muller | Motivations for Social Networking at Work | 584
2002 | ACM CSCW | Rebecca E. Grinter, Leysia Palen | Instant Messaging in Teen Life | 582
2004 | ACM CSCW | Stacey D. Scott, M. Sheelagh T. Carpendale, Kori M. Inkpen | Territoriality in Collaborative Tabletop Workspaces | 509
2001 | ECSCW | Rebecca E. Grinter, Margery Eldridge | Y Do Tngrs Luv 2 Txt Msg? | 498
2008 | ACM CSCW | Aniket Kittur, Robert E. Kraut | Harnessing the Wisdom of Crowds in Wikipedia: Quality through Coordination | 490
2002 | ACM CSCW | Ellen Isaacs, Alan Walendowski, Steve Whittaker, Diane J. Schiano, Candace A. Kamm | The Character, Functions, and Styles of Instant Messaging in the Workplace | 486
2003 | ECSCW | Stacey D. Scott, Karen D. Grant, Regan L. Mandryk | System Guidelines for Co-located, Collaborative Work on a Tabletop Display | 484
2009 | GROUP | Meredith M. Skeels, Jonathan Grudin | When Social Networks Cross Boundaries: A Case Study of Workplace Use of Facebook and LinkedIn | 457
2008 | ACM CSCW | Cliff Lampe, Nicole B. Ellison, Charles Steinfield | Changes in Use and Perception of Facebook | 436
2002 | ACM CSCW | David Frohlich, Allan Kuchinsky, Celine Pering, Abbe Don, Steven Ariss | Requirements for Photoware | 404
Table 3 continued
2001 | ECSCW | Mark O'Connor, Dan Cosley, Joseph A. Konstan, John Riedl | PolyLens: A Recommender System for Groups of Users | 404
2003 | ECSCW | Barry Brown, Matthew Chalmers | Tourism and Mobile Technology | 401
2004 | ACM CSCW | Carl Gutwin, Reagan Penner, Kevin A. Schneider | Group Awareness in Distributed Software Development | 391
2003 | JCSCW | Alex S. Taylor, Richard H. R. Harper | The Gift of the Gab?: A Design Oriented Sociology of Young People's Use of Mobiles | 387
2006 | ACM CSCW | Marcelo Cataldo, Patrick Wagstrom, James D. Herbsleb, Kathleen M. Carley | Identification of Coordination Requirements: Implications for the Design of Collaboration and Awareness Tools | 377
2006 | ACM CSCW | Bonnie A. Nardi, Justin Harris | Strangers and Friends: Collaborative Play in World of Warcraft | 374
2004 | ACM CSCW | Nicolas Ducheneaut, Robert J. Moore | The Social Side of Gaming: A Study of Interaction Patterns in a Massively Multiplayer Online Game | 374
2013 | ACM CSCW | Aniket Kittur, Jeffrey V. Nickerson, Michael S. Bernstein, Elizabeth Gerber, Aaron D. Shaw, John Zimmerman, Matt Lease, John Horton | The Future of Crowd Work | 355
2005 | JCSCW | Wil M. P. van der Aalst, Hajo A. Reijers, Minseok Song | Discovering Social Networks from Event Logs | 347
2007 | GROUP | Reid Priedhorsky, Jilin Chen, Shyong K. Lam, Katherine A. Panciera, Loren G. Terveen, John Riedl | Creating, Destroying, and Restoring Value in Wikipedia | 333
2006 | ACM CSCW | Paul Dourish | Re-space-ing Place: "Place" and "Space" 10 Years On | 333
2007 | GROUP | Joan Morris DiMicco, David R. Millen | Identity Management: Multiple Presentations of Self in Facebook | 326
2006 | ACM CSCW | Shilad Sen, Shyong K. Lam, Al Mamunur Rashid, Dan Cosley, Dan Frankowski, Jeremy Osterhouse, F. Maxwell Harper, John Riedl | Tagging, Communities, Vocabulary, Evolution | 326
In addition, Hu and Wu (2014) have already noted an impact of paper length on citation
likelihood. However, this tendency may require further scrutiny (Mubin et al. 2017).
Table 4 presents the number of publications that had 0–1, 2–10, 11–25, 26–50, and >50 citations. As the table shows, JCSCW had the highest ratio of papers with more
than 50 citations, followed by GROUP, ACM CSCW, and ECSCW. On the other hand,
ECSCW presented the most expressive ratio in terms of average number of papers without
citations, while the ACM CSCW provided the lowest number of papers that were never
cited, a tendency corroborated by Jacovi et al. (2006). This kind of publication data can be compared with non-exclusive CSCW conferences such as CHI (Henry et al. 2007), whose large impact and prestige are related to a balance between low acceptance rates and a considerable number of attendees and papers cited in distinct venues over the years.
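The banding used in Table 4 reduces to a simple binning routine; a minimal sketch:

```python
def bin_citations(citation_counts):
    """Count papers falling into the Table 4 bands:
    0-1, 2-10, 11-25, 26-50, and >50 citations."""
    bands = {"0-1": 0, "2-10": 0, "11-25": 0, "26-50": 0, ">50": 0}
    for c in citation_counts:
        if c <= 1:
            bands["0-1"] += 1
        elif c <= 10:
            bands["2-10"] += 1
        elif c <= 25:
            bands["11-25"] += 1
        elif c <= 50:
            bands["26-50"] += 1
        else:
            bands[">50"] += 1
    return bands

bands = bin_citations([0, 3, 12, 49, 51, 600])
# bands[">50"] == 2
```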
Author analysis
Table 5 Top authors ranked by individual productivity and their accumulated contribution to the CSCW literature
Authors | Number of publications | Citation count^a | Average number of citations | Percentage (number of publications) (%) | Percentage (citation count) (%) | h-Index (entire career)
Table 5 continued
Table 5 continued
specialized venues of CSCW, while the total number of authors has been increasing
considerably as a result of the policy changes in the ACM CSCW. As expected, the number
of authors per paper follows the curve of the total number of publications (Fig. 2). In order
to characterize the evolution and impact of CSCW research through social network
analysis, Horn et al. (2004) revealed a total of 22,887 papers in the HCI community with an
average of 3.7 collaborators, which can indicate that CSCW researchers had a high proportion of co-authors outside the CSCW community upon the birth of the field in the 1980s while
maintaining a large number of connections to co-authors from non-CSCW devoted outlets
(e.g., CHI). An average number of 5.1 CSCW co-authors and 4.5 non-CSCW collaborators
were identified in a total of 188 authors (1999–2003). In addition, we are also able to
conclude that the CSCW devoted venues have similar values in terms of average number of
authors per paper when comparing with HCI dedicated conferences (Gupta 2015).
To further investigate RQ2, we analyzed the authors’ affiliations and the countries in which
they are located. The publications come from a total of 40 countries. As expected, our
findings demonstrated a prevalence of institutions from the USA in all venues (54.47%), the country where the first CSCW conference took place. Researchers
from institutions in the UK also produced a vast set of papers with more expressive values
in JCSCW, ECSCW, and ACM CSCW. Table 7 presents a detailed perspective of the
distribution of countries by venue. These indicators can be crossed with CHI conference
for the 1981–2008 period, where the majority of its authors were affiliated with institutions
Table 6 Count of papers that have 1–2, 3–4, 5–6, 7–8, and >8 authors
Source Number of publications 1–2 3–4 5–6 7–8 >8
from the USA, Canada, and the UK (Bartneck and Hu 2009). In addition, Convertino et al.'s
(2006) results on the structural and topical characteristics of ACM CSCW conference
(1986–2002) also presented a high number of contributions by academics from USA
(70–90%) compared to Europe (10–30%) and Asia (0–10%). More recent evidence (Bartneck and Hu 2010) reveals that the authors' institutional affiliation is an influential factor in citation counts.
At a narrower level of analysis, Carnegie Mellon University led the way as the most
representative institution with 261 authors, followed by University of California at Irvine,
as shown in Table 8. If we take a closer look at the top 30 institutions ranked by research productivity, we find that they include big companies such as IBM, Microsoft, and Xerox.
At this stage, we excluded the department level from our analysis. Some of the most
influential institutions in CSCW devoted outlets were also significant in CHI for the
1990–2006 interval, as stated by Oulasvirta (2006). Most of these institutions have an
outstanding overall reputation, historical background in CSCW, and appropriate funding
and personnel resources. This provides the basis for encouraging new generations of highly qualified scientists and enables these institutions to employ several scientists working on particular topics.
Research funds are critical to shape the quantity of science generated by universities and
research labs. Nevertheless, such studies are also guided by the demand for science at the
regional level. For instance, several ethnographic studies have been conducted taking into
account the sociocultural aspects related to the location in which the cooperative work
arrangement occurs.
The advent of the twenty-first century brought a paradigmatic shift in the relationships
between North American and European CSCW communities, which seem to change every
decade in terms of program committees, covered topics, research methods, experimental
and analytic approaches (Grudin 2012). Collaboration is an intrinsically social process and
often a critical component in scientific research (Lee and Bozeman 2005; Jirotka et al.
2013; Wang et al. 2017; Qi et al. 2017). In its basic form, scientific collaboration plays a
significant role on the production of scientific knowledge (Ziman and Schmitt 1995) and
occurs as a synergetic vehicle for resource mobilization due to the fact that a researcher
may not possess all the expertise, time availability, information, and ideas to address a
particular research question. More details on the predictors of scientific collaboration will
be given in the next section. To investigate how collaboration has evolved over time in
CSCW research (RQ3), we adopted Melin and Persson’s (1996) criteria to analyze the
different types of collaboration and their influence on scientific production (Table 9).
According to the authors, each paper can be internally (e.g., within a university), nationally,
or internationally co-authored. It can reasonably be assumed that a paper is co-authored if
it has more than one author, and institutionally co-authored if it has more than one author
address, suggesting that the authors come from different institutions, departments, or other
research units. Scientists have used co-authorship analysis to assess collaborative activity.
Nonetheless, we should sound a note of caution with regard to such findings, since this
indicator can underestimate the presence of less visible forms of collaboration in
disciplinary fields (Katz and Martin 1997).
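As a rough illustration, Melin and Persson's (1996) typology can be operationalized as a simple classification over author affiliations. This is a minimal sketch, not the study's code: the `Author` record and the sample affiliations are illustrative.

```python
# Hedged sketch of the Melin and Persson (1996) collaboration typology;
# the Author record and the sample affiliations below are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Author:
    institution: str
    country: str

def collaboration_type(authors):
    """Classify a paper's authorship as individual, local, domestic, or international."""
    if len(authors) == 1:
        return "individual"
    if len({a.country for a in authors}) > 1:
        return "international"  # authors from different countries
    if len({a.institution for a in authors}) > 1:
        return "domestic"       # same country, different institutions
    return "local"              # all authors share one organization

# A two-author paper spanning CMU and UC Irvine counts as a domestic collaboration.
paper = [Author("Carnegie Mellon University", "USA"),
         Author("University of California, Irvine", "USA")]
print(collaboration_type(paper))  # domestic
```

Note that, as the caveat above implies, this rule-based scheme sees only what affiliations record; informal collaboration leaves no trace in it.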
The effects of collaboration in CSCW were primarily expressed at a local level, between
researchers from the same organization working together, followed by domestic
collaborations among authors from distinct institutions within the same country, and
international connections between authors from institutions situated in different countries
123
Scientometrics
Argentina 0 2 0 0
Australia 9 43 22 34
Austria 16 20 11 14
Belgium 0 4 0 1
Brazil 10 18 6 9
Canada 63 204 27 33
China 12 88 5 0
Colombia 0 0 0 1
Croatia 1 0 0 0
Denmark 32 42 47 39
Finland 17 26 2 13
France 27 49 35 18
Germany 50 67 58 38
Greece 4 0 1 2
India 1 23 2 0
Ireland 1 10 4 13
Israel 6 47 10 3
Italy 15 1 9 25
Japan 9 64 16 4
Malaysia 0 0 0 1
Mexico 0 2 0 3
Netherlands 10 24 11 27
New Zealand 1 6 0 2
Nigeria 0 0 0 1
Norway 11 25 12 23
Portugal 2 14 0 0
Qatar 0 10 0 0
Singapore 6 47 0 0
South Africa 1 1 0 0
South Korea 3 57 0 1
Spain 2 16 0 2
Sweden 26 31 17 16
Switzerland 13 37 4 12
Taiwan 0 5 0 0
Thailand 0 1 0 0
Turkey 1 0 0 0
Uganda 0 1 0 0
UK 26 282 113 147
United Arab Emirates 0 1 0 0
USA 500 2168 129 238
(Fig. 3). Bartneck and Hu's (2010) findings appear to support a similar pattern regarding
the CHI conference proceedings. Very close associations are visible in our entire sample,
an indicator corroborated by Keegan et al. (2013). Such evidence is visible for
collaborations occurring between institutions from the same country, with a reduced number
of outside connections. Generally, international collaborations involve only one author from
an external university (commonly from nearby countries, such as the USA and Canada).
Repeat authorships and familiar interactions between authors do occur, and can be
particularly observed in the core of each CSCW-devoted outlet (e.g., Saul Greenberg,
Carl Gutwin). Nevertheless, some of the international authorships derive from funded
projects or from conference editions held in a particular country (except GROUP, which has
been held on Sanibel Island, Florida, USA since 2003). Our
Table 9 Collaboration types and their influence on citation count organized by: Total Number of Publications (A), Total Number of Citations (Google Scholar) (B),
Collaboration at a Local Level (Authors from the Same Organization Working Together) (C), Average Number of Collaborations at a Local Level (D), Average Number of
Citations (Collaboration at a Local Level) (E), Domestic Co-Authorships (Authors from Different Organizations within the Same Country) (F), Average Number of Domestic
Co-Authorships (G), Average Number of Citations (Domestic Co-Authorships) (H), International Collaborations (Co-Authorship between Authors from Different Countries)
(I), Average Number of International Collaborations (J), Average Number of Citations (International Collaborations) (K), Individual Authorships (L), Average Number of
Individual Authorships (M), and Average Number of Citations (Individual Authorships) (N)
Source A B C D E F G H I J K L M N
GROUP 298 12,175 160 0.52 21.24 67 0.23 9.4 39 0.13 5.31 31 0.11 4.49
2001 32 1760 9 0.28 15.40 11 0.34 18.7 5 0.16 8.8 7 0.22 12.10
2003 41 2247 23 0.56 30.69 6 0.15 8.22 4 0.1 5.48 8 0.2 10.96
2005 41 2503 28 0.68 41.51 8 0.2 12.21 2 0.05 3.05 2 0.05 3.05
2007 47 2304 28 0.6 29.41 6 0.13 6.37 10 0.21 10.29 3 0.06 2.94
2009 40 2499 26 0.65 40.61 9 0.23 14.37 2 0.05 3.12 3 0.08 5
2010 36 524 16 0.44 6.4 12 0.33 4.8 6 0.17 2.47 2 0.06 0.87
2012 35 268 20 0.57 4.36 7 0.2 1.53 5 0.14 1.07 3 0.09 0.69
2014 26 70 10 0.38 1.02 8 0.31 0.83 5 0.19 0.51 3 0.12 0.32
ACM CSCW 985 41,879 489 0.52 22.11 276 0.27 11.48 158 0.15 6.38 60 0.06 2.55
2002 39 6151 23 0.59 93.05 10 0.26 41.01 2 0.05 7.89 4 0.1 15.77
2004 75 8137 48 0.64 69.44 16 0.21 22.78 8 0.11 11.93 2 0.03 3.25
2006 62 6266 30 0.48 48.51 17 0.27 27.29 11 0.18 18.19 4 0.06 6.06
2008 86 5530 53 0.62 39.87 17 0.2 12.86 12 0.14 9 4 0.05 3.22
2010 58 3503 32 0.55 33.22 13 0.22 13.29 9 0.16 9.66 4 0.07 4.23
2011 67 2363 31 0.46 16.22 19 0.28 9.88 14 0.21 7.41 3 0.04 1.41
2012 164 4267 70 0.43 11.19 59 0.36 9.37 25 0.15 3.9 10 0.06 1.56
2013 139 2796 58 0.42 8.45 40 0.29 5.83 28 0.2 4.02 13 0.09 1.81
2014 134 1959 66 0.49 7.16 39 0.29 4.24 22 0.16 2.34 7 0.05 0.73
2015 161 907 78 0.48 2.7 46 0.29 1.63 27 0.17 0.96 9 0.06 0.34
ECSCW 165 6468 80 0.48 18.82 37 0.22 8.62 34 0.21 8.23 14 0.09 3.53
2001 21 2030 11 0.52 50.27 4 0.19 18.37 5 0.24 23.2 1 0.05 4.83
Table 9 continued
Source A B C D E F G H I J K L M N
2003 20 1881 10 0.5 47.03 5 0.25 23.51 4 0.2 18.81 1 0.05 4.7
2005 24 1151 14 0.58 27.82 5 0.21 10.07 3 0.13 6.23 2 0.08 3.84
2007 23 656 9 0.39 11.12 7 0.3 8.56 6 0.26 7.42 1 0.04 1.14
2009 23 372 10 0.43 6.95 5 0.22 3.56 6 0.26 4.21 2 0.09 1.46
2011 22 218 12 0.55 5.45 4 0.18 1.78 4 0.18 1.78 2 0.09 0.89
2013 15 132 7 0.47 4.14 2 0.13 1.14 3 0.2 1.76 3 0.2 1.76
2015 17 28 7 0.41 0.68 5 0.29 0.48 3 0.18 0.3 2 0.12 0.2
JCSCW 265 14,324 93 0.35 18.92 69 0.26 14.05 44 0.17 9.19 59 0.23 12.43
2001 13 926 4 0.31 22.08 3 0.23 16.38 2 0.15 10.68 4 0.31 22.08
2002 17 3058 5 0.29 52.17 4 0.24 43.17 2 0.12 21.59 6 0.35 62.96
2003 17 1866 5 0.29 31.83 3 0.18 19.76 3 0.18 19.76 6 0.35 38.42
2004 20 1677 7 0.35 29.35 6 0.3 25.16 4 0.2 16.77 3 0.15 12.58
2005 16 1805 4 0.25 28.2 4 0.25 28.2 3 0.19 21.43 5 0.31 34.97
2006 19 1048 4 0.21 11.58 7 0.37 20.41 2 0.11 6.07 6 0.32 17.65
2007 19 1069 9 0.47 26.44 6 0.32 18 1 0.05 2.81 3 0.16 9
2008 19 506 7 0.37 9.85 5 0.26 6.92 5 0.26 6.92 2 0.11 2.93
2009 20 438 10 0.5 10.95 5 0.25 5.48 1 0.05 1.1 4 0.2 4.38
2010 20 682 7 0.35 11.94 9 0.45 15.35 2 0.1 3.41 2 0.1 3.41
2011 16 248 5 0.31 4.81 6 0.38 5.89 0 0 0 5 0.31 4.81
2012 16 322 6 0.38 7.65 2 0.13 2.62 5 0.31 6.24 3 0.19 3.82
2013 16 497 5 0.31 9.63 2 0.13 4.04 8 0.5 15.53 1 0.06 1.86
2014 18 161 5 0.28 2.5 3 0.17 1.52 3 0.17 1.52 7 0.39 3.49
2015 19 21 10 0.53 0.59 4 0.21 0.23 3 0.16 0.18 2 0.11 0.12
study provides further evidence that distance is no longer the barrier it was in the past
(Wagner et al. 2016), despite the heterogeneity between regions in their propensity to
collaborate.
Unsurprisingly, most scientific collaboration takes place in the university sector.
Accordingly, scholars have paid increasing attention to its underlying laws and patterns
(Wang et al. 2017). Some of these collaborations also reflect a researcher's past history
at a particular university. For instance, both Cleidson de Souza and Pernille Bjorn spent
time in the USA before returning to institutions in their countries of origin. Following
previous evidence on the decreasing role of practitioners when compared with academics in
the field of HCI (Barkhuus and Rode 2007), our sample shows a main authorship component
constituted by scholars/researchers from universities and research laboratories (N = 5212)
in comparison with corporate professionals (N = 170). This finding suggests that the low
number of interactions between academia and industry might mean that the industrial
sector ''is not broad enough to satisfy the collaborative needs of the universities'' (Melin
and Persson 1996). A closer look at the industrial co-authorships shows that such
professionals came mainly from software companies (e.g., Google Inc.) and the aerospace
industry (e.g., the Boeing Company). A more sporadic set of co-authorships was identified,
comprising 28 health care professionals working in research units and hospitals, 5
researchers from the aerospace industry, 4 independent researchers, 3 bank collaborators, 2
climate researchers, 1 library worker, and 1 AI researcher. However, such results must be
interpreted with caution. As presented in Fig. 4, only JCSCW showed a higher share of
individual authorships than international collaborations, while a prevalence of local
collaborations was noted in all venues. When considering the type of collaboration and its
influence on the average number of citations (Fig. 5), a similar pattern is visible across
all venues, denoting a strong impact of internal (local) collaborations on citation
indicators, followed by domestic co-authorships, international collaborations, and
individual authorships.
Keyword analysis
Keywords have been used to abstractly classify the content of a paper, providing the basis
for examining the key topics and aspects in a particular field of research (Heilig and Voß
2014). By considering co-occurrences, it is possible to identify interconnections between
topics through word-reference combinations within the keyword list of publications (Van
den Besselaar and Heimeriks 2006). A closer inspection of the keywords the CSCW
authors have chosen to characterize their papers (RQ4) revealed a total of 83 main clusters.
The clusters originated from a manual keyword analysis (Heilig and Voß 2014). To
obtain an overview, we grouped keywords under similar topics from a total of 1329
papers, owing to the lack of access to, or absence of, keywords in the remaining
publications. We identified core and major clusters and analyzed their content by adopting
Jacovi et al.'s (2006) methodological guidelines. Instead of using clustering algorithms
such as hierarchical clustering (Jain and Dubes 1988), manual clustering was used to
aggregate keyword data, identifying the main topics according to their frequency. The data
was preprocessed, and the clusters were identified on the basis of the transformed values
rather than the original information. The ranking of the top clusters with a high frequency
of keywords (f ≥ 13) is provided in Table 10.
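The frequency and co-occurrence counts on which such a ranking rests can be sketched in a few lines. This is an illustrative sketch only; the sample keyword lists are invented, not drawn from the corpus.

```python
# Sketch of the keyword frequency and co-occurrence counting that underpins
# the manual clustering step; the sample keyword lists below are invented.
from collections import Counter
from itertools import combinations

papers_keywords = [
    ["CSCW", "awareness", "groupware"],
    ["CSCW", "crowdsourcing"],
    ["awareness", "groupware", "CSCW"],
]

# Keyword frequency across the corpus (the f values ranked in Tables 10 and 11).
freq = Counter(kw for kws in papers_keywords for kw in kws)

# Pairwise co-occurrences within each paper's keyword list: the raw material
# for spotting interconnections between topics.
cooc = Counter()
for kws in papers_keywords:
    for a, b in combinations(sorted(set(kws)), 2):
        cooc[(a, b)] += 1

print(freq["CSCW"])                 # 3
print(cooc[("CSCW", "awareness")])  # 2
```

The same counts feed automatic clustering algorithms; here they merely support the manual grouping described above.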
Theories and models constitute the core cluster in CSCW research. For our initial
exploration, we identified terms such as activity theory, ethnography and
ethnomethodology, social network analysis, distributed cognition, workplace studies, and
survey. CSCW is the second most representative cluster, constituted by general terms
introduced or already established in the field. Examples include but are not limited to
Fig. 5 Influence of collaboration types on the average number of citations at each venue
teamwork and group dynamics (addressing issues such as group identity, size, and
performance), awareness, security and privacy (e.g., adolescent online behavior), and
community building. Some of these research topics map onto the socio-technical
classification model of CSCW (Fig. 1). A considerable number of papers have explored the
situated role of information within cooperative work settings, including knowledge
management and collaborative information behavior. As put forward by Ackerman et al.
(2013), this is a key intellectual achievement of the CSCW research community. A further
important set of contributions has been provided in the domain of games and simulations,
spanning virtual worlds, haptics and head-mounted displays, exergames, Massively
Multiplayer Online Games (MMOG), and 3D telepresence. A growing trend of studies on
crowdsourcing has been established since 2006, when the term was introduced. Topics of
intensive research in recent years include citizen science, human computation, collective
intelligence, the wisdom of crowds, and crowdfunding.
Research has also tended to focus on CSCL, including informal learning, peer learning,
situated learning, and Massive Open Online Courses (MOOC). Software engineering, in turn,
has been one of the most addressed areas in recent years. In addition to the established
topics of global and open software development and code search on the Web (e.g., GitHub
studies), recent work has focused on crowdsourcing software development (Mao et al. 2017b).
Publications on coordination (e.g., Rolland et al. 2014) and cyberinfrastructure (e.g., Lee
et al. 2006) have covered research findings on coordinating centers, scientific
collaboration, and scientific software development. Collaborative computing thus affects
how scientific knowledge is constructed, verified, and validated (Jirotka et al. 2013),
significantly accelerating the publication of scientific results and easing access to
scientific papers (Qi et al. 2017).
Enterprise/organizational issues such as business process management, inter-organizational
crisis management, and organizational memory have also been addressed. As reported
before, groupware publications appeared mainly in the first decade of this century,
including toolkits, frameworks, and system prototypes. Another set of relevant
contributions regards the study of websites that use a wiki model, allowing the
collaborative modification of their content and structure directly from a Web browser.
Wikipedia has represented an attractive ecosystem for behavioral studies (e.g., Panciera
et al. 2009). Crisis management and emergency response also show an important trend, while
the use of machine learning and natural language processing has been explored in
combination with crowd-based human computation. An example is the Flock system (Cheng and
Bernstein 2015), an interactive machine learning prototype based on hybrid crowd-machine
classifiers. Additional themes that can be distilled from this broad literature include,
but are not restricted to, tangible and surface computing, ethics and policy, sentiment
analysis, visual analytics, office and workspace, interruptions, context-aware computing,
decision making, smart cities and urban computing, ubiquitous computing, and gender
studies. Table 11 summarizes the most addressed keywords in CSCW by frequency range.
Social media platforms have become integrated into the scholarly communication
ecosystem. Previous work on altmetrics has addressed their usefulness as sources of impact
assessment (Sugimoto et al. 2017). Nonetheless, there is still ''a lack of systematic
evidence that altmetrics are valid proxies of either impact or utility although a few case
studies have reported medium correlations between specific altmetrics and citation rates for
individual journals or fields’’ (Thelwall et al. 2013). An altmetric exercise in the field of
CSCW was undertaken to address RQ 5, using different sources that have been proposed in
the literature as alternatives for measuring the impact of scholarly publications (Priem and
Hemminger 2010). Apart from the download data extracted from both the ACM-DL and
SpringerLink, the latter was used to obtain reader counts from Mendeley, CiteULike, and
Connotea. In addition, SpringerLink was also adopted for gathering mentions in blogs,
Wikipedia, Twitter, Facebook, Q&A threads, Google+, and news outlets.
Table 12 presents an overview of the alternative metrics retrieved for analysis.
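The per-venue totals and averages of the kind reported in Table 12 can be derived from per-paper altmetric records roughly as follows. This is a hedged sketch: the records below are invented placeholders, not the study's data.

```python
# Hedged sketch of deriving per-venue altmetric totals and averages (the kind
# of figures shown in Table 12); the records below are invented placeholders.
from collections import defaultdict

# (venue, downloads, readers, social_mentions) for each paper
records = [
    ("JCSCW", 1200, 45, 3),
    ("JCSCW", 800, 30, 1),
    ("ECSCW", 600, 20, 0),
]

totals = defaultdict(lambda: {"papers": 0, "downloads": 0, "readers": 0, "mentions": 0})
for venue, downloads, readers, mentions in records:
    t = totals[venue]
    t["papers"] += 1
    t["downloads"] += downloads
    t["readers"] += readers
    t["mentions"] += mentions

# Totals plus per-paper averages for each venue (columns B-G of Table 12).
for venue, t in sorted(totals.items()):
    n = t["papers"]
    print(venue, t["downloads"], t["downloads"] / n,
          t["readers"], t["readers"] / n, t["mentions"], t["mentions"] / n)
```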
It is possible to observe that the GROUP proceedings provided the highest ratio of
(cumulative) downloads, followed by ACM CSCW and ECSCW. Matching these indicators
with our bibliometric analysis, a slight similarity between the most cited and the most
downloaded papers is reported in Table 13. In this context, 17 of the most downloaded
papers were also among the most cited papers in the current century of CSCW research.
Furthermore, ACM CSCW had the highest number of papers with more than 5000
downloads (Table 14), while the GROUP proceedings had the fewest.
Readership in ECSCW and JCSCW came mainly from computer and information
scientists, followed by social scientists. In addition, both venues presented a high number
of readers from disciplines such as design, electrical and electronic engineering, business
and management, education, and psychology. Table 15 shows a complete view of
readership by discipline. PhD students are the most recurrent readers in ECSCW and
JCSCW, followed by M.Sc. students and researchers (Table 16). Such readers are mostly
from the USA, Germany, and the UK, as shown in Table 17.
A smaller number of social mentions was gathered for scrutiny. As presented in
Table 18, they come mostly from the USA and UK. Concerning the professional status of the
authors of social mentions, they are mainly members of the public, followed by scientists,
science communicators (journalists, bloggers, editors), and practitioners (doctors and other
health care professionals). Finally, most readers use Mendeley to access publications,
while Twitter is the most used platform for social mentions (Table 19).
Despite the growing interest in altmetrics research (Erdt et al. 2016), the field of CSCW
still lacks solid sources for measuring the impact of its publications. For instance, only the
publication venues covered by SpringerLink (i.e., ECSCW and JCSCW) provided metrics
on the impact of CSCW research on social media. In addition, we identify a lack of
conversation flow in social media about CSCW publications. The community appears to
be somewhat ‘‘silent’’ in terms of expressiveness, acting mostly in ‘‘offline’’ mode through
views and downloads.
Discussion
The investigation of CSCW research over a period of 15 years has reinforced the
importance of gathering quantitative evidence to reflect on publication data and thus
understand the field and the ways it is evolving over time. As suggested by Wania et al.
(2006), a bibliometric study can be suitable to determine how design and evaluation
communities are related in the field of HCI and how it can be used to build a
comprehensive theory. To answer RQ6, we discuss the aspects that emerged from the analysis
while grounding our findings in previous related work. Concerning the
publication and citation patterns in CSCW research (RQ1), remarkable indicators of
scientific production expressed at the beginning of the new century and again during this
CSCW (108), collaboration (102), ethnography (70), CMC (68), social media (66), ≥ 15
awareness (58), crowdsourcing (50), coordination (46), online communities (42), health
care (41), social computing (40), communication (39), Wikipedia (38), privacy (36),
social networks (33), social networking sites (32), design (32), Facebook (31), Twitter
(30), groupware (24), instant messaging (23), cyberinfrastructure (23), ethnomethodology
(22), empirical studies (22), work practices (20), crisis informatics (18), cooperative work
(17), communities of practice (17), learning (17), infrastructure (17), trust (17), activity
theory (16), electronic mail (16), knowledge management (16), evaluation (15), field
studies (15), video conferencing (15), community (15), HCI (15), wikis (15)
social networking (14), ubiquitous computing (14), e-Science (14), articulation work (13), < 15 and ≥ 10
distributed cognition (13), conversation analysis (13), participatory design (13), children
(13), software development (13), collaborative work (12), cooperation (12), social
network analysis (12), CSCL (12), education (12), creativity (12), decision making (12),
games (12), interruptions (12), mobile (12), scientific collaboration (11), consistency
management (11), distributed work (11), distributed teams (11), information sharing (11),
participation (11), social interaction (11), social capital (11), user-generated content (11),
case studies (10), communities (10), collaborative systems and tools (10), collaborative
design (10), collaborative virtual environments (10), teams (10), boundary objects (10),
culture (10), emotion (10), information seeking (10), integration (10), motivation (10),
operational transformation (10)
user studies (9), chat (9), blogs (9), microblogging (9), health (9), social software (9), socio- < 10 and > 5
technical systems (9), mobile phones (9), social search (9), disasters (9), place (9), peer
production (9), social support (8), video (8), social awareness (8), citizen science (8),
human computation (8), eletronic health records (8), health informatics (8), context-
aware computing (8), identity (8), usability (8), information infrastructures (8), software
engineering (8), virtual worlds (8), avatars (8), groups (8), virtual teams (8), interaction
(8), language (8), affect (8), co-located collaboration (8), enterprise (8), family (8), home
(8), gender (8), workplace (8), recommender systems (8), sustainability (8), visualization
(8), policy (8), rhythms (8), qualitative research (7), survey (7), video-mediated
communication (7), Amazon Mechanical Turk (7), coordination mechanisms (7), ICTD
(7), public displays (7), science (7), mobile computing (7), mobility (7), teamwork (7),
information management (7), Web search (7), shared workspaces (7), adoption (7),
appropriation (7), concurrency control (7), documentation (7), expertise location (7),
expertise sharing (7), museums (7), workflow (7), tagging (7), values (7), sensemaking
(7), volunteered geographic information (7), classification (6), workplace studies (6),
meetings (6), SMS (6), collective intelligence (6), crowdfunding (6), hospitals (6),
collaboratories (6), single-display groupware (6), e-Research (6), collaborative software
development (6), group dynamics (6), socialization (6), artifacts (6), space (6),
telepresence (6), temporality (6), remote collaboration (6), families (6), leisure (6),
augmented reality (6), collaborative filtering (6), accountability (6), ambiguity (6),
attention (6), deception (6), social navigation (6), impression management (6), machine
learning (6), representations (6), negotiation (6), politics (6), reflection (6), reputation (6)
Table 11 continued
common ground (5), diary studies (5), qualitative methods (5), theory (5), qualitative 5
studies (5), annotation (5), audio conferencing (5), crowd work (5), coordinative artifacts
(5), virtual communities (5), electronic medical records (5), electronic patient records (5),
online health communities (5), context (5), tabletops (5), public and situated displays (5),
group editors (5), access control (5), intimacy (5), user-centered design (5), audience (5),
creativity support tools (5), deliberation (5), mobile applications (5), group work (5),
information exchange (5), sharing (5), legitimate peripheral participation (5), cross-
cultural (5), cross-cultural communication (5), collaborative search (5), collaborative
information seeking (5), synchronization (5), organizational memory (5), information
visualization (5), expertise (5), impression formation (5), data mining (5), information
retrieval (5), social translucence (5), social Q&A (5), collaborative writing (5), assistive
technology (5), configuration (5), multitasking (5), feedback (5), flexibility (5), human–
robot interaction (5), network delay (5), leadership (5), web 2.0 (5), performance (5),
media spaces (5), photos (5), critical mass (5), urban informatics
work (4), technical support (4), practice (4), actor-network theory (4), categorization (4), 4
content analysis (4), empirical methods (4), ethnographic studies (4), fieldwork (4),
methodology (4), methods (4), grounded theory (4), gamification (4), interaction analysis
(4), teleconferencing (4), informal communication (4), presence (4), emergency medicine
(4), hospital work (4), large displays (4), tangible user interfaces (4), geowikis (4),
security (4), interaction design (4), system design (4), user eXperience (4), groupware
design (4), groupware usability (4), real-time groupware (4), distributed groupware (4),
informal learning (4), MOOC (4), credibility (4), global software development (4), online
games (4), multiplayer games (4), MMOG (4), World of Warcraft (4), smartphone (4),
distributed collaboration (4), distributed meetings (4), data sharing (4), knowledge
sharing (4), information reuse (4), metadata (4), sociability (4), social (4), China (4),
emergency (4), peer-to-peer (4), organizations (4), social visualization (4), paper (4),
social norms (4), natural language processing (4), machine translation (4), Q&A (4),
collaborative editing (4), WWW (4), productivity (4), common information spaces (4),
experiments (4), eye tracking (4), gestures (4), activism (4), urban computing (4),
brainstorming (4), commitment (4), collective memory (4), civic engagement (4), conflict
(4), digital libraries (4), death (4), invisible work (4), ecology (4), news (4),
personalization (4), redundancy (4), replication (4), storytelling (4), tie strength (4), craft
(4), embodiment (4)
Table 11 continued
ad-hoc collaboration (3), open collaboration (3), social practices (3), workplace 3
collaboration (3), science and technology studies (3), quantitative (3), social cognition
(3), video analysis (3), interpersonal communication (3), multilingual communication (3),
mediated communication (3), messaging (3), reddit (3), texting (3), online social
networks (3), microblogs (3), situation awareness (3), situational awareness (3), social
presence (3), community building (3), autism spectrum disorder (3), chronic illness (3),
elderly (3), medical informatics (3), medication (3), nutrition (3), pervasive healthcare
(3), stress (3), wellness (3), context awareness (3), interactive tabletops (3), tabletop
interfaces (3), architectural work (3), ICT (3), whiteboards (3), anonymity (3),
information privacy (3), safety (3), collaboration architectures (3), interface design (3),
interfaces (3), value sensitive design (3), groupware performance (3), multi-display
environments (3), e-Social Science (3), digital volunteers (3), teenagers (3), amateurs (3),
eXtreme programming (3), open source (3), online creative collaboration (3), decision
support (3), game design (3), video games (3), virtual reality (3), notification systems (3),
notifications (3), nomadic work (3), mobile technology (3), telephones (3), consistency
(3), consistency control (3), group identity (3), support groups (3), knowledge (3),
information diffusion (3), social exchange (3), casual interaction (3), intercultural
collaboration (3), cultural differences (3), sentiment analysis (3), transformation (3),
emergency department (3), emergency management (3), location (3), time (3), spatiality
(3), synchronous communication (3), peer support (3), organizations (3), virtual
organization (3), family communication (3), office (3), shared visual spaces (3),
technology adoption (3), experience (3), workflow management (3), social bookmarking
(3), maps (3), social computing and social navigation (3), transparency (3), group editing
(3), customization (3), power (3), routines (3), calendars (3), photographs (3), voice (3),
group brainstorming (3), events (3), war (3), speech recognition (3), speech acts (3),
social relationships (3), affordances (3), availability (3), backchannel (3), connectedness
(3), data quality (3), discourse (3), engineering (3), formalization (3), mixed reality (3),
modeling (3), online dating (3), ontology (3), project management (3), ownership (3),
application sharing (3), dependability (3), disruption (3), economics (3), free riding (3),
innovation (3), maintenance (3), management (3), measurement (3), misinformation (3),
moderation (3), optimization (3), persuasion (3), reciprocity (3), repair (3), sensors (3),
standardization (3), stigmergy (3), tailoring (3), visibility (3), turn-taking (3), file sharing
(3), translation (3)
decade are in line with previous evidence of a possible convergence between research
communities in the field (Grudin 2012). Seminal contributions provided by ethnographic
(e.g., Hughes et al. 1994) and conceptual work (e.g., Schmidt and Bannon 1992; Grudin
1994) remain among the most cited publications, influencing many studies that followed
the launch of collaborative technologies such as Wikipedia and Facebook. From our initial
examination of the results, we also noted that the core of CSCW papers has an average of
11–25 citations. A relatively small number of papers without citations (51) points to the
field's healthy and evolving nature.
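The venue-level citation averages implied by Table 9 (column A, publications; column B, Google Scholar citations) can be recomputed directly; a minimal sketch using only those published totals:

```python
# Recomputing venue-level citation averages from Table 9's totals:
# column A (number of publications) and column B (Google Scholar citations).
table9 = {
    "GROUP":    (298, 12175),
    "ACM CSCW": (985, 41879),
    "ECSCW":    (165, 6468),
    "JCSCW":    (265, 14324),
}

averages = {venue: cites / papers for venue, (papers, cites) in table9.items()}
for venue, avg in averages.items():
    print(f"{venue}: {avg:.1f} citations per paper")
# Each venue averages more than 39 citations per paper, consistent with the
# figure of more than 39 citations per paper reported in the abstract.
```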
Meho and Rogers (2008) placed JCSCW (4th among journals) and ACM CSCW (2nd among
conference proceedings) within the top sources of citations in HCI, taking into account
the intersection between WoS and Scopus. Publication metadata covering CHI were also
examined by Henry et al. (2007), indicating the conference's large impact and prestige,
related to a balance between low acceptance rates and a considerable number of attendees
and of publications cited in distinct conferences over the years. The average number of
citations for all CHI publications (1990–2006) was 3.2, and 61% of full papers received
fewer than 6 citations (Oulasvirta 2006). Mubin et al. (2017) provided data on the Nordic Conference on
Table 12 Alternative metrics (altmetrics) of CSCW research (2001–2015) organized by: Total Number of
Publications (A), Total Number of Downloads (Cumulative) (B), Average Number of Downloads (C), Total
Number of Readers (D), Average Number of Readers (E), Total Number of Social Mentions (F), and
Average Number of Social Mentions (G)
Source A B C D E F G
Year | Venue | Authors | Title | Downloads
2006 | ACM CSCW | Cliff Lampe, Nicole B. Ellison, Charles Steinfield | A Face(book) in the Crowd: Social Searching versus Social Browsing | 13,146
2009 | GROUP | Meredith M. Skeels, Jonathan Grudin | When Social Networks Cross Boundaries: A Case Study of Workplace Use of Facebook and LinkedIn | 11,802
2009 | GROUP | Dejin Zhao, Mary Beth Rosson | How and Why People Twitter: The Role that Micro-blogging Plays in Informal Communication at Work | 11,332
2004 | ACM CSCW | Bonnie A. Nardi, Diane J. Schiano, Michelle Gumbrecht | Blogging as Social Activity, or, Would You Let 900 Million People Read Your Diary? | 11,187
2002 | ACM CSCW | Rebecca E. Grinter, Leysia Palen | Instant Messaging in Teen Life | 10,454
2008 | ACM CSCW | Joan Morris DiMicco, David R. Millen, Werner Geyer, Casey Dugan, Beth Brownholtz, Michael J. Muller | Motivations for Social Networking at Work | 10,033
2007 | GROUP | Joan Morris DiMicco, David R. Millen | Identity Management: Multiple Presentations of Self in Facebook | 8303
2008 | ACM CSCW | Cliff Lampe, Nicole B. Ellison, Charles Steinfield | Changes in Use and Perception of Facebook | 8149
2004 | ACM CSCW | Nicolas Ducheneaut, Robert J. Moore | The Social Side of Gaming: A Study of Interaction Patterns in a Massively Multiplayer Online Game | 7822
2006 | ACM CSCW | Bonnie A. Nardi, Justin Harris | Strangers and Friends: Collaborative Play in World of Warcraft | 5619
2004 | ACM CSCW | Dennis C. Neale, John M. Carroll, Mary Beth Rosson | Evaluating Computer-Supported Cooperative Work: Models and Frameworks | 5009
2003 | GROUP | A. Agostini, S. Albolino, G. De Michelis, F. De Paoli, R. Dondi | Stimulating Knowledge Discovery and Sharing | 4799
2004 | ACM CSCW | Gerard Beenen, Kimberly S. Ling, Xiaoqing Wang, Klarissa Chang, Dan Frankowski, Paul Resnick, Robert E. Kraut | Using Social Psychology to Motivate Contributions to Online Communities | 4448
2015 | ACM CSCW | Bogdan State, Lada A. Adamic | The Diffusion of Support in an Online Social Movement: Evidence from the Adoption of Equal-Sign Profile Pictures | 3829
2010 | ACM CSCW | Mor Naaman, Jeffrey Boase, Chih-Hui Lai | Is it Really About Me?: Message Content in Social Awareness Streams | 3616
2002 | ACM CSCW | Andreas Girgensohn, Alison Lee | Making Web Sites Be Places for Social Interaction | 3554
Table 13 continued
Year | Source | Authors | Title | Downloads
2002 | ACM CSCW | Erin Bradner, Gloria Mark | Why Distance Matters: Effects on Cooperation, Persuasion and Deception | 3415
2008 | ACM CSCW | Aniket Kittur, Robert E. Kraut | Harnessing the Wisdom of Crowds in Wikipedia: Quality through Coordination | 3369
2002 | ACM CSCW | Ellen Isaacs, Alan Walendowski, Steve Whittaker, Diane J. Schiano, Candace A. Kamm | The Character, Functions, and Styles of Instant Messaging in the Workplace | 3318
2004 | ACM CSCW | Carl Gutwin, Reagan Penner, Kevin A. Schneider | Group Awareness in Distributed Software Development | 3306
2002 | ACM CSCW | Mark Handel, James D. Herbsleb | What Is Chat Doing in the Workplace? | 3195
2006 | ACM CSCW | Laura A. Dabbish, Robert E. Kraut | Email Overload at Work: An Analysis of Factors Associated with Email Strain | 3192
2001 | GROUP | Anne P. Massey, Yu-Ting Caisy Hung, Mitzi M. Montoya-Weiss, Venkataraman Ramesh | When Culture and Style Aren't About Clothes: Perceptions of Task-Technology "Fit" in Global Virtual Teams | 3121
2002 | ACM CSCW | Jonathan J. Cadiz, Gina Danielle Venolia, Gavin Jancke, Anoop Gupta | Designing and Deploying an Information Awareness Interface | 3075
2002 | ACM CSCW | David R. Millen, John F. Patterson | Stimulating Social Engagement in a Community Network | 3019
2011 | ACM CSCW | Yan Qu, Chen Huang, Pengyi Zhang, Jun Zhang | Microblogging After a Major Disaster in China: A Case Study of the 2010 Yushu Earthquake | 3009
2004 | ACM CSCW | Christine Halverson, Thomas Erickson, Mark S. Ackerman | Behind the Help Desk: Evolution of a Knowledge Management System in a Large Organization | 3006
2004 | ACM CSCW | Barry Brown, Marek Bell | CSCW at Play: 'There' as a Collaborative Virtual Environment | 2979
2005 | GROUP | Susan L. Bryant, Andrea Forte, Amy Bruckman | Becoming Wikipedian: Transformation of Participation in a Collaborative Online Encyclopedia | 2978
2006 | ACM CSCW | Paul Dourish | Re-space-ing Place: "Place" and "Space" 10 Years On | 2938
Table 14 Count of papers that have 0–100, 101–500, 501–1000, 1001–5000, and >5000 downloads (cumulative)
Source | Number of publications | 0–100 | 101–500 | 501–1000 | 1001–5000 | >5000
Table 15 Readership by discipline
Discipline | ECSCW | JCSCW
Computer and Information Science | 1275 | 1456
Social Sciences | 153 | 426
Psychology | 55 | 58
Electrical and Electronic Engineering | 66 | 87
Earth and Planetary Sciences | 15 | 2
Design | 149 | 69
Arts and Humanities | 15 | 38
Philosophy | 1 | 0
Management Science/Operations Research | 12 | 37
Biological Sciences | 12 | 10
Physics | 0 | 1
Linguistics | 2 | 2
Medicine | 7 | 24
Environmental Sciences | 5 | 13
Law | 1 | 0
Education | 55 | 101
Business, Management and Accounting | 57 | 129
Economics | 10 | 8
5. high level of competition within organizations and research groups (Iglič et al. 2017);
6. desire of scholars to increase their scientific visibility, recognition, and popularity
(Beaver and Rosen 1978);
7. economic benefits (Katz and Martin 1997);
8. changing communication patterns, and increasing mobility of researchers (Glänzel and
Schubert 2004);
9. increasing specialization in science (Bush and Hattery 1956) and escalating demand
for the rationalization of resources (Price 1963);
10. advancement of scientific disciplines and increasing difficulty to make significant
advances (Manten 1970);
11. internationalization of the societal sectors (Melin and Persson 1996).
At the level of research topics (RQ4), an author keyword analysis revealed interesting insights into stability and change, with a vast set of topics recurring in all years while others have been declining or even disappearing. The topics addressed in the most cited papers range from social networks to crowdsourcing, Wikipedia, and instant messaging. This constitutes a reliable indicator of maturity and suggests a possible pattern of repetition in citation practices in the coming years. It is worth noting that works dealing with topics of common interest attract more citations (Gupta 2015), which often influences the course of investigations. For instance, real-time search brought new ways of mining social networking sites while monitoring streams of data (e.g., tweets, status updates) from services such as blogs, Facebook, and Twitter. Research on social networking and microblogging has been more prominent in the North American community. Publications on instant messaging and electronic mail have been less visible than at the beginning of the twenty-first century. Fewer Wikipedia studies have been presented in recent years, but CSCW researchers have maintained an active focus on tagging activities, quality control, participation, coordination, and policy issues. Crowd work has also affected social mobility, productivity, and the global economy by engaging volunteers and paid workers in performing complex tasks (Kittur et al. 2013). Ethnography has been applied to examine the work performed by contributors on marketplaces such as Amazon Mechanical Turk. Variables analyzed in these studies range from demographics
Table 17 continued
Country | ECSCW | JCSCW
Turkey | 1 | 0
Switzerland | 9 | 4
to motivation, bias, reputation, preferences, feedback, identity, and privacy. The literature also offers some examples of studies addressing the integration of crowdsourcing and AI (e.g., Cheng and Bernstein 2015). Last but not least, crowdfunding became a force in financing technology start-ups while supporting creative and scientific projects (e.g., Experiment).
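The kind of author keyword analysis that underlies these RQ4 observations can be sketched in a few lines. The records, window boundaries, and function names below are hypothetical illustrations, not the authors' actual pipeline; in the study, the keywords would come from the harvested bibliographic metadata.

```python
from collections import Counter

# Hypothetical (year, author keywords) records; real input would be
# extracted from the 1713 publications' metadata.
papers = [
    (2002, ["instant messaging", "awareness"]),
    (2003, ["awareness", "email"]),
    (2012, ["crowdsourcing", "social networks"]),
    (2013, ["crowdsourcing", "wikipedia"]),
    (2014, ["social networks", "crowdsourcing"]),
]

def keyword_counts(records, start, end):
    """Count author-keyword occurrences for papers published in [start, end]."""
    counts = Counter()
    for year, keywords in records:
        if start <= year <= end:
            counts.update(keywords)
    return counts

early = keyword_counts(papers, 2001, 2007)
late = keyword_counts(papers, 2008, 2015)

# A topic is 'rising' if it occurs only in the late window, and
# 'declining' if it occurs only in the early one.
rising = [k for k in late if k not in early]
declining = [k for k in early if k not in late]
```

Comparing the two windows flags, for this toy input, crowdsourcing as rising and instant messaging as declining, mirroring the trends described above.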
Another set of technologies that influenced the type and volume of studies in the CSCW research community concerns the establishment of smartphones and gestural interfaces pervading our daily life settings. Some companies have also made their services freely available in recent years, enabling their exploitation by researchers. Hundreds of computer-simulated 3D virtual environments have already been deployed and evaluated, and simulation experiments have been conducted in hospitals, universities, museums, and many other cooperative work environments (Correia et al. 2016). Health care has been examined recurrently and extensively in order to understand the health information sharing practices of medical staff and patients, home health management, health games, participant behavior in online health support groups, mobile health, etc. A review of CSCW research in health care (1988–2011) demonstrated an increasing focus on medical, biomedical, and nursing informatics, envisaged in the political agendas of most Western countries and reflected in scientific production (Fitzpatrick and Ellingsen 2013). Some approaches have also examined emergency response and crisis informatics to provide better support for ad hoc teams in complex scenarios. Developments in ubiquitous and context-aware computing have originated novel ways of augmenting workspaces and urban environments. Despite the difficulty of characterizing all the research done in the field of CSCW, we see a widening of topics requiring further examination, including accessibility, gestural interfaces, intelligent systems, and distributed software development.
If we look closely at Liu et al.'s (2014) examination of CHI keyword networks in 2004–2013, the HCI topics related to research on social networks, design and evaluation, games with a purpose, education/learning, and participatory design are concomitant with the topics approached by CSCW researchers. Participatory design emerged from Scandinavian culture and represents one of the most addressed topics in NordiCHI (Mubin et al. 2017). Broadly cited papers in CHI addressed research topics such as social information filtering, email overload, and tangible user interfaces (Oulasvirta 2006). Chen et al. (2005) explored hybrid networks of topical terms and co-cited papers to visualize the field of HCI, including user-centered design, ubiquitous and context-aware computing, usability evaluation, knowledge representation and problem-solving, perceptual control, and enterprise resource planning. Symmetry between publications proposing a new system and empirical studies providing informative observations on the use of technology remains unachieved, and its absence seems to continue impairing the effectiveness of HCI research at a macro, disciplinary level (Oulasvirta 2006). These findings support an understanding of the ecological and structural aspects of human-centred computing, correlating indicators to analyze how the HCI and CSCW communities have evolved over time (Wania et al. 2006).
Alternative metrics have gained momentum as a useful approach for measuring aspects related to the output and impact of research, but they are subject to numerous limitations. The coverage of scientific documents on social media, understood as "the percentage of papers with at least one social media event or citation" (Haustein et al. 2015), has been found to be rather low despite recent growth, varying by discipline and specialty (Sugimoto et al. 2017). Publications in computer science and engineering seem to receive lower altmetric scores (Haustein et al. 2015) than articles from the social sciences and humanities (Zahedi et al. 2014). Coverage also tends to be higher for more recent papers (Bar-Ilan et al. 2012) and for high-impact journals (Li et al. 2011). Mendeley is one of the most comprehensive altmetrics data sources, with 60–80% of papers having at least one reader (Thelwall and Wilson 2016). In addition, the use of Twitter provides a way of measuring the impact of a publication very soon after its appearance (Bornmann 2015).
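Haustein et al.'s coverage definition quoted above reduces to a simple computation, sketched below. The function name and the per-paper mention counts are hypothetical; real counts would come from an altmetrics provider.

```python
def social_media_coverage(event_counts):
    """Coverage as defined by Haustein et al. (2015): the percentage of
    papers with at least one social media event or citation."""
    if not event_counts:
        return 0.0
    covered = sum(1 for n in event_counts if n > 0)
    return 100.0 * covered / len(event_counts)

# Hypothetical per-paper social media event counts for one venue:
# 4 of the 10 papers have at least one event, so coverage is 40%.
mentions = [0, 0, 3, 1, 0, 7, 0, 0, 2, 0]
coverage = social_media_coverage(mentions)
```

Computed per venue and per year, this is the quantity whose discipline- and recency-dependent variation the paragraph above describes.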
Matching this set of findings with our altmetric examination of CSCW devoted venues (RQ5), the number of downloads is small compared with other fields. Moreover, the scholarly content attracting the most readers concerns social networking sites and microblogging, MMOGs, crowdsourcing, Wikipedia, and instant messaging. A combination of strengths between computer and social scientists is a trend maintained since the emergence of the field in the 1980s. CSCW has also attracted researchers and professionals from other disciplines such as education, psychology, business and management, medicine, and design. PhD students are the most dynamic readers, followed by MSc students, researchers, and professors. According to our results, readers from the USA and Germany are the most active. Although there are important insights concerning the use of altmetrics in CSCW, social networks have not been widely used for the dissemination of knowledge in the area.
Concerning RQ6, CSCW has been established as a subfield of HCI with manifested differences from other fields and disciplines. Differences already reported include theory building (mainly affected by the instability shaped by technology change) (Grudin and Poltrock 2012), methodology (Bannon 1992), and socio-technicality (Ackerman 2000). A vast set of studies has addressed the significance of CSCW and its historical relation to HCI (e.g., Bannon 1992; Schmidt and Bannon 2013). CSCW research is mainly published in highly selective, largely accessible conference proceedings (Grudin and Poltrock 2012). This is a differentiating factor compared with fields and disciplines where publication in peer-reviewed journals with high impact factors is the recurrent practice. Most CSCW devoted conferences occur annually or biennially in the USA and Europe, influencing the nature of their authorships through factors such as financial support and mobility. The funding schemes are very different on the two continents, influencing the course of action in CSCW research. While the North American CSCW community has been more focused on providing services and IT artefacts for large groups, and has thus attracted more people around the globe, the European tradition of CSCW has tried to understand work and life practices more deeply in order to design cooperation technologies adequately. ECSCW is a conference focused on practice-centred computing, recently combined with JCSCW, a journal primarily "oriented toward design, implementation, and use of collaborative computing", growing out of the computer science field with a strong sociological orientation (Holsapple and Luo 2003). By changing its policy to an annual submission/publication process (as ACM CSCW already did in 2010), the ECSCW conference will give the field a new kind of configuration in the upcoming years.
Behavioral researchers and system builders, coming mainly from computer science departments and industry research labs, have been working on different ways of supporting communication, coordination, and cooperation (Grudin and Poltrock 2012). Experimental system prototypes supported by socio-technical infrastructures have been introduced exhaustively in the CSCW literature, but there is a need for continual analysis of technology development and research evolution. As IT research diversifies in its domains of application, "CSCW might respond to the changing technological landscape" (Crabtree et al. 2005). Thus, researchers need to inform the design of computer systems appropriately while identifying patterns and predicting trajectories, since many underlying forces grounded in the environment, context, and culture remain constant as technology moves on.
Final remarks
We come to the end of our preliminary examination of the field of CSCW. Papers dedicated to the CSCW subject have appeared in varied journals and conferences that reflect its interdisciplinary nature (Holsapple and Luo 2003). In this paper, a quantitative analysis was carried out to comprehensively investigate the development and current state of CSCW devoted outlets, with emphasis on a large bibliographic data basis provided by GS, ACM-DL, and SpringerLink. We have outlined the growth, spread, collaboration, and focus areas of CSCW along the lines of authorship, institutional distribution, demographics, key topics, and their respective impact in terms of number of publications, citations, and coverage on social media channels. In contrast to structured literature reviews, a scientometric exercise can be valuable for easily obtaining a general overview of a particular field of research by allowing the assessment of large numbers of papers. The purpose of the paper was to assist the reader in understanding the nature and evolution of this field, as a starting point for academics, practitioners, and the general public to identify some of the main insights behind the existing knowledge.
The results reveal that CSCW is rapidly developing as an independent research community where researchers can establish their identity. The beginning of the new millennium brought profound changes reflected in a wide-ranging body of CSCW research, and technological advances have driven paradigmatic shifts while shaping (to some extent) the way people work. As a probable consequence of a technological wave characterized by the massive use of personal computers, which allowed CSCW researchers to put social dynamics under the microscope by examining distinct kinds of human behavior, CSCW communities have been converging during the last decade in terms of content and program committees (Grudin 2012). The field of computer-aided collaboration has also undergone structural changes, with some outlets moving from a biennial to an annual format and thus changing the composition of the community (Keegan et al. 2013). A small number of papers without citations, the growth of the number of publications per year, and an average of more than 39 citations per paper across all venues attest to the field's healthy and evolving nature. ECSCW presented the most expressive ratio in terms of the average number of papers without citations, while JCSCW had the highest ratio of papers with more than 50 citations. ACM CSCW provided the lowest number of papers that were never cited, a tendency of maturity corroborated by Jacovi et al. (2006).
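The venue-level indicators summarized above (mean citations per paper, share of uncited papers, share of papers with more than 50 citations) can be sketched as follows. The function name and the sample citation counts are hypothetical; the study's actual figures come from the full 1713-publication dataset.

```python
def citation_indicators(citations):
    """Compute simple health indicators of the kind reported per venue:
    mean citations per paper, fraction of uncited papers, and fraction
    of papers with more than 50 citations."""
    n = len(citations)
    mean = sum(citations) / n
    uncited = sum(1 for c in citations if c == 0) / n
    highly_cited = sum(1 for c in citations if c > 50) / n
    return mean, uncited, highly_cited

# Hypothetical per-paper citation counts for one venue.
mean, uncited, high = citation_indicators([0, 12, 55, 3, 130, 40])
```

A low uncited fraction combined with a high mean is what the text above reads as a sign of a healthy, maturing field.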
From the authorship analysis of the field of CSCW, we can see that research activities in CSCW are predominantly influenced by fundamental and highly recognized scientists and publications, as previously suggested by Horn et al. (2004). The most prolific authors contributed to almost half of the publications and more than 50% of the total citations. Keegan et al. (2013) went even further by claiming that the "ideas introduced by those in the core of the CSCW community (e.g., elite researchers) are advantaged over those introduced by peripheral participants (e.g., newcomers)". Nevertheless, a deeper look into contribution patterns revealed an equal gender distribution, a vast number of authors who published 10 or more publications, and a greater number of authors per article in North American outlets. Given the results of the keyword analysis, it is clear that the past and current focus of the North American research community lies mainly on the technology itself, while sociocultural issues have been explored more in Europe. Some topics have remained consistent across the years, including ethnography, social networks, and crowdsourcing. On the other hand, well-established topics such as awareness and video conferencing have been losing prevalence in this century.
In summary, the CSCW and scientometrics research communities should be aware of the data presented in this study as a basis for future decisions that require prior knowledge about scientific publications. It can be argued that we need to pay much more attention to bibliometric studies in the field of HCI. Such publications are largely unknown to the CSCW community, and their low citation rates support this claim. Thus, the results of the scientometric analysis may help the field of CSCW to redefine itself in order to provide a clear direction. Based on these findings, it is possible to clarify research goals while identifying trends and other important factors for directing individual research activities, understanding patterns, tracking research networks, identifying main contributors and driving forces (e.g., themes), and organizing scientific data in more appropriate ways. We hope that this study will inspire other research communities to examine themselves by promoting regular quantitative monitoring of their scientific production.
the divide (apparently continental) between research that is reactive (speculating on the social and organizational impact of certain existing technologies) and research committed to developing new technologies through in-depth studies (by means of ethnography, conceptual work, experiments, and evaluation studies). A qualitative analysis based on content and interviews with researchers can help to recognize how practice-oriented research has evolved and will continue to do so in response to new collaboration systems, supporting researchers in understanding their research endeavor through a history-aware research agenda.
Appendix
Some preliminary work has described GS as a freely available service that covers more publications than the "very expensive, subscription-based" WoS and Scopus databases. These sources constitute useful and scientifically valid data sources for bibliometric analysis (Mikki 2009) but only comprise citations related to journal papers and conference proceedings. Comparatively, GS additionally indexes book chapters, books, theses and dissertations, workshops, and technical reports, amongst a vast set of scholarly documents. GS allows a researcher with limited access to commercial databases to perform a bibliometric exercise without geographic or linguistic barriers (Meho and Rogers 2008). It is also worth mentioning that some advantages of the GS algorithm lie in document detection and filtering, free access to the files and websites of institutions and researchers, and indexing of the same set of documents covered by proprietary databases. In this context, GS can be "perhaps one of the largest scientific bibliographic databases" (Aguillo 2011), with fast and very broad full-text search capabilities (Jacsó 2012). Notwithstanding these advantages, a lack of control over its contents defines this service as a "noisy database" that implies a complex and time-consuming data cleaning task for evaluation purposes. GS's negative aspects are mainly associated with its software resources, including inadequate clustering of identical citations that results in duplications, inability to detect all authors, inflated citation counts, and other problems caused by automatic indexing (Jacsó 2008). Its search engine is designed to return only the most significant results, offering little control for systematic searching (Mikki 2009).
Compared with WoS, GS yields similar results while covering a broader universe of metadata for multi-language documents, and books receive high citation rates. GS extracts citations automatically from reference lists, whereas citation data is handled manually to some extent in the WoS database (Mikki 2009). Some older papers (not published on the Web) are not amenable to indexing by GS (Neuhaus et al. 2006). In addition, the lower coverage of books and conference papers in WoS can be a limitation for bibliometrics. This is particularly notable in the sample chosen for this study, where only GS provided data for all publications. Looking at the results provided by Meho and Yang (2007), all these services are valuable for performing bibliometric studies, with a small overlap in citations and an analogous mode of operation. GS and WoS rank groups of scholars in a similar way, and both services search quickly. GS provides more citations than WoS and Scopus, identifying a higher number of unique citations, which can be helpful for presenting evidence of broader intellectual and international impact. In addition, Bauer and Bakkalbasi (2005) did not find significant differences between WoS and Scopus. A positive relationship between their rankings was documented by
Archambault et al. (2009), who argued that "the outputs (papers) and impacts (citations) of countries obtained from the two databases are extremely correlated" despite content and coverage differences in terms of scope and volume. Criticisms of Scopus in the literature have centered on its systematic coverage of Elsevier's journals and on the fact that it only presents citation data from 1996 onward (Aguillo 2011). Despite some findings suggesting that, for HCI, more valid citation data can be obtained using Scopus than WoS (Meho and Rogers 2008), the depth and length of coverage have been disappointing for various Scopus journals. Furthermore, Scopus comprises a short time span and critical gaps in the coverage of lower-quality publications (Jacsó 2012).
The ACM-DL contains an archive with over 400,000 full-text articles and more than 18,000 new full-text entries added each year, ranging from journals and technical magazines to conference proceedings published by the Association for Computing Machinery. In addition, the ACM-DL provides easy access to bibliometric data (e.g., citation counts) and altmetrics (e.g., number of downloads). Recently, SpringerLink also introduced altmetrics to measure the scientific impact of its covered publications on social media. Both services remain updated and provide mechanisms for information seeking. Another comparison between GS and ResearchGate revealed that the latter "found less citations than did Google Scholar but more than both Web of Science and Scopus" (Thelwall and Kousha 2017). At the same time, Mendeley was characterized as a "useful tool for tracking the impact of both conference papers and journal articles in computer science" (Aduku et al. 2017).
References
Abt, H. A. (2017). Citations and team sizes. Publications of the Astronomical Society of the Pacific,
129(972), 024008.
Ackerman, M. S. (2000). The intellectual challenge of CSCW: The gap between social requirements and
technical feasibility. Human–Computer Interaction, 15(2), 179–203.
Ackerman, M. S., Dachtera, J., Pipek, V., & Wulf, V. (2013). Sharing knowledge and expertise: The CSCW
view of knowledge management. Computer Supported Cooperative Work, 22(4–6), 531–573.
Aduku, K. J., Thelwall, M., & Kousha, K. (2017). Do Mendeley reader counts reflect the scholarly impact of
conference papers? An investigation of computer science and engineering. Scientometrics, 112(1),
573–581.
Aguillo, I. F. (2011). Is Google Scholar useful for bibliometrics? A webometric analysis. Scientometrics,
91(2), 343–351.
Antunes, P., & Pino, J. A. (2010). A review of CRIWG research. In Proceedings of the international
conference on collaboration and technology (pp. 1–15). Berlin: Springer.
Archambault, É., Campbell, D., Gingras, Y., & Larivière, V. (2009). Comparing bibliometric statistics
obtained from the Web of Science and Scopus. Journal of the American Society for Information
Science and Technology, 60(7), 1320–1326.
Bannon, L. (1992). Perspectives on CSCW: From HCI and CMC to CSCW. In Proceedings of the inter-
national conference on human–computer interaction (pp. 148–158). St. Petersburg: BCS HICOM.
Bannon, L. (1993). CSCW: An initial exploration. Scandinavian Journal of Information Systems, 5(2), 3–24.
Bannon, L., & Schmidt, K. (1989). CSCW: Four characters in search of a context. In Proceedings of the first
european conference on computer supported cooperative work, Gatwick, London, 13–15 September
1989 (pp. 358–372).
Barbosa, S. D. J., Silveira, M. S., & Gasparini, I. (2016). What publications metadata tell us about the
evolution of a scientific community: The case of the Brazilian Human–Computer Interaction confer-
ence series. Scientometrics, 110(1), 275–300.
Bar-Ilan, J., Haustein, S., Peters, I., Priem, J., Shema, H., & Terliesner, J. (2012). Beyond citations:
Scholars’ visibility on the social web. arXiv:1205.5611.
Bar-Ilan, J., Levene, M., & Lin, A. (2007). Some measures for comparing citation databases. Journal of
Informetrics, 1(1), 26–34.
Barkhuus, L., & Rode, J. A. (2007). From mice to men-24 years of evaluation in CHI. In Proceedings of the
ACM SIGCHI conference on human factors in computing systems (pp. 1–16).
Bartneck, C. (2011). The end of the beginning: A reflection on the first five years of the HRI conference.
Scientometrics, 86(2), 487–504.
Bartneck, C., & Hu, J. (2009). Scientometric analysis of the CHI proceedings. In Proceedings of the ACM
SIGCHI conference on human factors in computing systems (pp. 699–708).
Bartneck, C., & Hu, J. (2010). The fruits of collaboration in a multidisciplinary field. Scientometrics, 85(1),
41–52.
Bauer, K., & Bakkalbasi, N. (2005). An examination of citation counts in a new scholarly communication
environment. D-Lib Magazine, 11, 9.
Beaver, D., & Rosen, R. (1978). Studies in scientific collaboration: Part I. The professional origins of
scientific co-authorship. Scientometrics, 1(1), 65–84.
Bird, S., Wiles, J. L., Okalik, L., Kilabuk, J., & Egeland, G. M. (2009). Methodological consideration of
story telling in qualitative research involving Indigenous Peoples. Global Health Promotion, 16(4),
16–26.
Blomberg, J., & Karasti, H. (2013). Reflections on 25 years of ethnography in CSCW. Computer Supported
Cooperative Work, 22(4–6), 373–423.
Bornmann, L. (2015). Alternative metrics in scientometrics: A meta-analysis of research into three alt-
metrics. Scientometrics, 103(3), 1123–1144.
Bush, G. P., & Hattery, L. H. (1956). Teamwork and creativity in research. Administrative Science Quar-
terly, 1(3), 361–372.
Chen, C., Panjwani, G., Proctor, J., Allendoerfer, K., Aluker, S., Sturtz, D., Vukovic, M., & Kuljis, J. (2005).
Visualizing the evolution of HCI. In Proceedings of the international BCS human computer interaction
conference (pp. 233–250). London: Springer.
Cheng, J., & Bernstein, M. S. (2015). Flock: Hybrid crowd-machine learning classifiers. In Proceedings of
the 18th ACM conference on computer supported cooperative work & social computing (pp. 600–611).
Convertino, G., Kannampallil, T. G., & Councill, I. (2006). Mapping the intellectual landscape of CSCW
research. In Poster presented at ACM conference on computer supported cooperative work, 6.
Correia, A., Fonseca, B., & Paredes, H. (2013). Exploiting classical bibliometrics of CSCW: Classification,
evaluation, limitations, and the odds of semantic analytics. In Proceedings of the first international
conference on human factors in computing and informatics (pp. 137–156). Berlin: Springer.
Correia A., Fonseca B., Paredes H., Martins P., & Morgado L. (2016). Computer-simulated 3D virtual
environments in collaborative learning and training: Meta-review, refinement, and roadmap. In Y.
Sivan (Ed.), Handbook on 3D3C Platforms. Progress in IS (pp. 403–440). Cham: Springer.
Crabtree, A., Rodden, T., & Benford, S. (2005). Moving with the times: IT research and the boundaries of
CSCW. Computer Supported Cooperative Work, 14(3), 217–251.
Cruz, A., Correia, A., Paredes, H., Fonseca, B., Morgado, L., & Martins, P. (2012). Towards an overarching
classification model of CSCW and groupware: A socio-technical perspective. In Proceedings of the
18th international conference on collaboration and technology (pp. 41–56). Berlin: Springer.
Diodato, V. P. (1994). Dictionary of bibliometrics. New York: The Haworth Press.
Ellis, C. A., Gibbs, S. J., & Rein, G. (1991). Groupware: Some issues and experiences. Communications of
the ACM, 34(1), 39–58.
Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1),
107–115.
Erdt, M., Nagarajan, A., Sin, S. C. J., & Theng, Y. L. (2016). Altmetrics: An analysis of the state-of-the-art
in measuring research impact on social media. Scientometrics, 109(2), 1117–1166.
Ferraris, C., & Martel, C. (2000). Regulation in groupware: The example of a collaborative drawing tool for
young children. In Proceedings of the sixth IEEE international workshop on groupware (pp. 119–127).
Fitzpatrick, G., & Ellingsen, G. (2013). A review of 25 years of CSCW research in healthcare: Contribu-
tions, challenges and future agendas. Computer Supported Cooperative Work, 22(4–6), 609–665.
Garfield, E. (1972). Citation analysis as a tool in journal evaluation. Science, 178(4060), 471–479.
Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359–375.
Glänzel, W. (2009). History of bibliometrics and its present-day tasks in research evaluation. Leuven:
Katholieke Universiteit Leuven.
Glänzel, W., & Schoepflin, U. (1995). A bibliometric study on ageing and reception processes of scientific
literature. Journal of Information Science, 21(1), 37–53.
Glänzel, W., & Schubert, A. (2004). Analysing scientific networks through co-authorship. Handbook of
Quantitative Science and Technology Research, 11, 257–279.
Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3–8.
Greenberg, S. (1991). An annotated bibliography of Computer Supported Cooperative Work. ACM SIGCHI
Bulletin, 23(3), 29–62.
Greif, I. (1988). Computer-supported cooperative work: A book of readings. San Mateo: Morgan Kaufmann.
Grudin, J. (1994). Computer-Supported Cooperative Work: History and focus. IEEE Computer, 27(5),
19–26.
Grudin, J. (2012). Punctuated equilibrium and technology change. Interactions, 19(5), 62–66.
Grudin, J., & Poltrock, S. E. (1997). Computer-Supported Cooperative Work and groupware. Advances in
Computers, 45, 269–320.
Grudin, J., & Poltrock, S. E. (2012). Taxonomy and theory in computer supported cooperative work.
Handbook of organizational psychology (pp. 1323–1348). Oxford: Oxford University Press.
Gupta, A. (2015). Five years of IndiaHCI: A scientometric analysis. In Proceedings of the 7th international
conference on HCI (IndiaHCI) (pp. 56–61).
Haustein, S., Costas, R., & Larivière, V. (2015). Characterizing social media metrics of scholarly papers:
The effect of document properties and collaboration patterns. PLoS ONE, 10(3), e0120495.
Heffner, A. (1981). Funded research, multiple authorship, and subauthorship collaboration in four disciplines. Scientometrics, 3(1), 5–12.
Heilig, L., & Voß, S. (2014). A scientometric analysis of cloud computing literature. IEEE Transactions on
Cloud Computing, 2(3), 266–278.
Henry, N., Goodell, H., Elmqvist, N., & Fekete, J. D. (2007). 20 Years of four HCI conferences: A visual
exploration. International Journal of Human–Computer Interaction, 23(3), 239–285.
Hertzel, D. H. (1987). Bibliometrics, history of the development of ideas in. Encyclopedia of Library and
Information Science, 42(7), 144–211.
Hess, D. J. (1997). Science studies: An advanced introduction. New York: New York University Press.
Holsapple, C. W., & Luo, W. (2003). A citation analysis of influences on collaborative computing research.
Computer Supported Cooperative Work, 12(3), 351–366.
Horn, D. B., Finholt, T. A., Birnholtz, J. P., Motwani, D., & Jayaraman, S. (2004). Six degrees of Jonathan
Grudin: A social network analysis of the evolution and impact of CSCW research. In Proceedings of
the ACM conference on computer supported cooperative work (pp. 582–591).
Hu, Z., & Wu, Y. (2014). Regularity in the time-dependent distribution of the percentage of never-cited
papers: An empirical pilot study based on the six journals. Journal of Informetrics, 8(1), 136–146.
Hughes, J., King, V., Rodden, T., & Andersen, H. (1994). Moving out from the control room: Ethnography
in system design. In Proceedings of the ACM conference on computer supported cooperative Work (pp.
429–439).
Hughes, J., Randall, D., & Shapiro, D. (1991). CSCW: Discipline or paradigm? In Proceedings of the second European conference on computer-supported cooperative work (pp. 24–27).
Iglič, H., Doreian, P., Kronegger, L., & Ferligoj, A. (2017). With whom do researchers collaborate and why?
Scientometrics, 112(1), 153–174.
Jacovi, M., Soroka, V., Gilboa-Freedman, G., Ur, S., Shahar, E., & Marmasse, N. (2006). The chasms of
CSCW: A citation graph analysis of the CSCW conference. In Proceedings of the 20th anniversary
ACM conference on computer supported cooperative work (pp. 289–298).
Jacsó, P. (2008). Google Scholar revisited. Online Information Review, 32(1), 102–114.
Jacsó, P. (2010). Comparison of journal impact rankings in the SCImago Journal & Country Rank and the
Journal Citation Reports databases. Online Information Review, 34(4), 642–657.
Jacsó, P. (2012). Google Scholar metrics for publications: The software and content features of a new open
access bibliometric service. Online Information Review, 36(4), 604–619.
Jain, A. K., & Dubes, R. C. (1988). Algorithms for clustering data. Upper Saddle River: Prentice-Hall Inc.
Jirotka, M., Lee, C. P., & Olson, G. M. (2013). Supporting scientific collaboration: Methods, tools and
concepts. Computer Supported Cooperative Work, 22(4–6), 667–715.
Johnson, D. P. (2008). Contemporary sociological theory: An integrated multi-level approach. Berlin:
Springer.
Katz, J. S., & Martin, B. R. (1997). What is research collaboration? Research Policy, 26(1), 1–18.
Kaye, J. J. (2009). Some statistical analyses of CHI. In CHI extended abstracts on human factors in
computing systems (pp. 2585–2594). New York, NY: ACM.
Keegan, B., Horn, D., Finholt, T. A., & Kaye, J. (2013). Structure and dynamics of coauthorship, citation,
and impact within CSCW. arXiv:1307.7172.
Keele, S. (2007). Guidelines for performing systematic literature reviews in software engineering. EBSE
Technical Report, Ver. 2.3.
Kienle, A., & Wessner, M. (2006). The CSCL community in its first decade: Development, continuity,
connectivity. International Journal of Computer-Supported Collaborative Learning, 1(1), 9–33.
Kittur, A., Nickerson, J. V., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., Lease, M., & Horton, J.
(2013). The future of crowd work. In Proceedings of the ACM conference on computer supported
cooperative work (pp. 1301–1318).
Kling, R. (1991). Cooperation, coordination and control in computer-supported work. Communications of
the ACM, 34(12), 83–88.
Krasner, H., & Greif, I. (1986). Proceedings of the first conference on computer-supported cooperative work (CSCW ’86), 3–5 December 1986, Austin, Texas.
Krippendorff, K. (1980). Content analysis: An introduction to its methodology. London: Sage Publications.
Kumar, S. (2014). Author productivity in the field Human Computer Interaction (HCI) research. Annals of
Library and Information Studies, 61(4), 273–285.
Kuutti, K. (1991). The concept of activity as a basic unit of analysis for CSCW research. In Proceedings of
the second European conference on computer-supported cooperative work (pp. 249–264).
Lee, S., & Bozeman, B. (2005). The impact of research collaboration on scientific productivity. Social
Studies of Science, 35(5), 673–702.
Lee, C. P., Dourish, P., & Mark, G. (2006). The human infrastructure of cyberinfrastructure. In Proceedings
of the 20th anniversary ACM conference on computer supported cooperative work (pp. 483–492).
Lee, H. E., Park, J. H., & Song, Y. (2014). Research collaboration networks of prolific institutions in the
HCI field in Korea: An analysis of the HCI Korea conference proceedings. In Proceedings of HCI
Korea (pp. 434–441).
Li, X., Thelwall, M., & Giustini, D. (2011). Validating online reference managers for scholarly impact
measurement. Scientometrics, 91(2), 461–471.
Liu, Y., Goncalves, J., Ferreira, D., Xiao, B., Hosio, S., & Kostakos, V. (2014). CHI 1994–2013: Mapping
two decades of intellectual progress through co-word analysis. In Proceedings of the 32nd annual ACM
conference on human factors in computing systems (pp. 3553–3562).
Malone, T. W., & Crowston, K. (1994). The interdisciplinary study of coordination. ACM Computing
Surveys (CSUR), 26(1), 87–119.
Manten, A. A. (1970). Statistical analysis of a scientific discipline: Palynology. Earth-Science Reviews, 6(3),
181–218.
Mao, J., Cao, Y., Lu, K., & Li, G. (2017a). Topic scientific community in science: A combined perspective
of scientific collaboration and topics. Scientometrics, 112(2), 851–875.
Mao, K., Capra, L., Harman, M., & Jia, Y. (2017b). A survey of the use of crowdsourcing in software
engineering. Journal of Systems and Software, 126, 57–84.
McGrath, J. E. (1984). Groups: Interaction and performance. Englewood Cliffs, NJ: Prentice-Hall.
Meho, L. I., & Rogers, Y. (2008). Citation counting, citation ranking, and h-index of Human–Computer
Interaction researchers: A comparison of Scopus and Web of Science. Journal of the American Society
for Information Science and Technology, 59(11), 1711–1726.
Meho, L. I., & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web
of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science
and Technology, 58(13), 2105–2125.
Melin, G., & Persson, O. (1996). Studying research collaboration using co-authorships. Scientometrics,
36(3), 363–377.
Mentzas, G. N. (1993). Coordination of joint tasks in organizational processes. Journal of Information
Technology, 8(3), 139.
Mikki, S. (2009). Google Scholar compared to Web of Science. A literature review. Nordic Journal of
Information Literacy in Higher Education, 1(1), 41–51.
Mingers, J., & Leydesdorff, L. (2015). A review of theory and practice in scientometrics. European Journal
of Operational Research, 246(1), 1–19.
Mittleman, D., Briggs, R., Murphy, J., & Davis, A. (2008). Toward a taxonomy of groupware technologies.
In Proceedings of the 14th international workshop on groupware: Design, implementation, and use
(pp. 305–317).
Mubin, O., Al Mahmud, A., & Ahmad, M. (2017). HCI down under: Reflecting on a decade of the OzCHI
conference. Scientometrics, 112(1), 367–382.
Mulrow, C. D. (1994). Rationale for systematic reviews. British Medical Journal, 309(6954), 597.
Nalimov, V. V., & Mulchenko, B. M. (1969). Scientometrics. Studies of science as a process of information.
Moscow: Science.
Narin, F., Stevens, K., & Whitlow, E. (1991). Scientific co-operation in Europe and the citation of multinationally authored papers. Scientometrics, 21(3), 313–323.
Neuhaus, C., Neuhaus, E., Asher, A., & Wrede, C. (2006). The depth and breadth of Google Scholar: An
empirical study. Portal: Libraries and the Academy, 6(2), 127–141.
Nichols, D. M., & Cunningham, S. J. (2015). A scientometric analysis of 15 years of CHINZ conferences. In
Proceedings of the 15th New Zealand conference on human–computer interaction (pp. 73–80).
Oulasvirta, A. (2006). A bibliometric exercise for SIGCHI conference on human factors in computing systems. Retrieved November 2016, from http://www.hiit.fi/node/290.
Padilla, S., Methven, T. S., & Chantler, M. J. (2014). Is British HCI important? A topic-based comparison
with CHI. In Proceedings of the 28th international BCS human computer interaction conference on
HCI (pp. 365–370).
Panciera, K., Halfaker, A., & Terveen, L. (2009). Wikipedians are born, not made: A study of power editors
on Wikipedia. In Proceedings of the ACM international conference on supporting group work (pp.
51–60).
Peters, H., & Van Raan, A. (1991). Structuring scientific activities by co-author analysis: An exercise on a
university faculty level. Scientometrics, 20(1), 235–255.
Pinelle, D., & Gutwin, C. (2000). A review of groupware evaluations. In Proceedings of the IEEE 9th
international workshops on enabling technologies: Infrastructure for collaborative enterprises (pp.
86–91).
Piwowar, H. (2013). Altmetrics: Value all research products. Nature, 493(7431), 159.
Pope, C., Ziebland, S., & Mays, N. (2000). Qualitative research in health care: Analysing qualitative data.
British Medical Journal, 320(7227), 114.
Price, D. S. (1963). Little science, big science. New York City: Columbia University Press.
Price, D. S. (1976). A general theory of bibliometric and other cumulative advantage processes. Journal of
the American Society for Information Science, 27(5), 292–306.
Priem, J., & Hemminger, B. H. (2010). Scientometrics 2.0: New metrics of scholarly impact on the social
Web. First Monday, 15(7).
Pritchard, A. (1969). Statistical bibliography or bibliometrics. Journal of Documentation, 25, 348.
Pumareja, D., & Sikkel, K. (2002). An evolutionary approach to groupware implementation: The context of
requirements engineering in the socio-technical frame. CTIT Technical Reports Series, 02–30(30),
1–27.
Qi, M., Zeng, A., Li, M., Fan, Y., & Di, Z. (2017). Standing on the shoulders of giants: The effect of
outstanding scientists on young collaborators’ careers. Scientometrics, 111(3), 1839–1850.
Rolland, B., Paine, D., & Lee, C. P. (2014). Work practices in coordinating center enabled networks
(CCENs). In Proceedings of the 18th ACM international conference on supporting group work (pp.
194–203).
Sandelowski, M. (1995). Sample size in qualitative research. Research in Nursing & Health, 18(2),
179–183.
Schmidt, K. (2011). The concept of ‘work’ in CSCW. Computer Supported Cooperative Work, 20(4–5),
341–401.
Schmidt, K., & Bannon, L. (1992). Taking CSCW seriously. Computer Supported Cooperative Work,
1(1–2), 7–40.
Schmidt, K., & Bannon, L. (2013). Constructing CSCW: The first quarter century. Computer Supported
Cooperative Work, 22(4–6), 345–372.
Stapić, Z., De-Marcos, L., Strahonja, V., García-Cabot, A., & López, E. G. (2016). Scrutinizing systematic literature review process in software engineering. TEM Journal: Technology, Education, Management, Informatics, 5(1), 104–116.
Suchman, L. (1989). Notes on computer support for cooperative work. Working paper WP-12, University of
Jyväskylä, Department of Computer Science.
Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2017). Scholarly use of social media and altmetrics: A review of the literature. Journal of the Association for Information Science and Technology, 68(9), 2037–2062.
Tague-Sutcliffe, J. (1992). An introduction to informetrics. Information Processing and Management, 28(1),
1–3.
Tang, K. Y., Tsai, C. C., & Lin, T. C. (2014). Contemporary intellectual structure of CSCL research
(2006–2013): A co-citation network analysis with an education focus. International Journal of
Computer-Supported Collaborative Learning, 9(3), 335–363.
Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten
other social web services. PLoS ONE, 8(5), e64841.
Thelwall, M., & Kousha, K. (2017). ResearchGate versus Google Scholar: Which finds more early citations? Scientometrics, 112(2), 1125–1131.
Thelwall, M., & Wilson, P. (2016). Mendeley readership altmetrics for medical articles: An analysis of 45
fields. Journal of the Association for Information Science and Technology, 67(8), 1962–1972.
Van den Besselaar, P., & Heimeriks, G. (2006). Mapping research topics using word-reference co-occurrences: A method and an exploratory case study. Scientometrics, 68(3), 377–393.
Van Raan, A. (1997). Scientometrics: State-of-the-art. Scientometrics, 38(1), 205–218.
Wade, N. (1975). Citation analysis: A new tool for science administrators. Science, 188(4187), 429–432.
Wagner, C. S., Whetsell, T., & Leydesdorff, L. (2016). Growth of international cooperation in science:
Revisiting six case studies. arXiv:1612.07208.
Wainer, J., & Barsottini, C. (2007). Empirical research in CSCW – a review of the ACM/CSCW conferences
from 1998 to 2004. Journal of the Brazilian Computer Society, 13(3), 27–35.
Wallace, J. R., Oji, S., & Anslow, C. (2017). Technologies, methods, and values: Changes in empirical
research at CSCW 1990–2015. UWSpace. http://hdl.handle.net/10012/12396.
Wang, W., Yu, S., Bekele, T. M., Kong, X., & Xia, F. (2017). Scientific collaboration patterns vary with
scholars’ academic ages. Scientometrics, 112(1), 329–343.
Wania, C. E., Atwood, M. E., & McCain, K. W. (2006). How do design and evaluation interrelate in HCI
research? In Proceedings of the 6th conference on designing interactive systems (pp. 90–98).
Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics, 101(2), 1491–1513.
Ziman, J., & Schmitt, R. W. (1995). Prometheus bound: Science in a dynamic steady state. American
Journal of Physics, 63(5), 476–477.
Zuckerman, H. (1987). Citation analysis and the complex problem of intellectual influence. Scientometrics,
12(5–6), 329–338.