ISERN 2013

Keynotes




Dr. Ben Shneiderman: Thursday, October 10th, 8:45am to 10:00am
Dr. Natalia Juristo: Friday, October 11th, 8:45am to 10:00am

HIGH IMPACT RESEARCH: BLENDING BASIC AND APPLIED METHODS
Dr. Ben Shneiderman
University of Maryland, USA
ben [at] cs [dot] umd [dot] edu


Abstract. Vannevar Bush's 1945 report Science: The Endless Frontier drew a sharp distinction between basic (or pure) and applied research, urging strong government support for academic basic research and warning that "applied research invariably drives out pure". He also suggested the "linear model", in which basic research leads to applied research, and then to commercial development. These biased views are still widely held, so a fresh analysis is needed to repair the damage and to provide a guiding framework for scientific researchers across many disciplines. The High-Impact Research framework integrates current concepts that have been voiced by many people:

  1. Science research projects that address basic and applied questions, seek theoretical and practical outcomes, and are inspired by curiosity-driven as well as mission-driven goals are likely to have the greatest payoffs.
  2. These increased expectations may require teamwork from multidisciplinary participants and from those who are skilled with multiple research methods: randomized controlled trials, ethnographic case studies, automatic logging, retrospective analysis, etc. Teamwork can be difficult, but has a stronger possibility of producing breakthrough results.
  3. Since complex problems resist reductionist approaches and small controlled laboratory experiments, high impact can be achieved by using new research methods that depend on interventions evaluated in large-scale systems. Such repeated interventions constitute case studies that can provide evidence to support or falsify hypotheses. Guidelines for conducting and reporting rigorous case studies will accelerate progress.

Bio. Dr. Ben Shneiderman is a Professor in the Department of Computer Science and Founding Director (1983-2000) of the Human-Computer Interaction Laboratory at the University of Maryland.
He is a Fellow of the AAAS, ACM, and IEEE, and a member of the National Academy of Engineering, in recognition of his pioneering contributions to human-computer interaction and information visualization. His contributions include the direct manipulation concept, the clickable highlighted web link, touchscreen keyboards, dynamic query sliders (commercialized in Spotfire), the development of treemaps, innovative network visualization strategies for NodeXL, and temporal event sequence analysis for electronic health records. Ben is the co-author, with Catherine Plaisant, of Designing the User Interface: Strategies for Effective Human-Computer Interaction (5th ed., 2010) http://www.awl.com/DTUI/. With Stu Card and Jock Mackinlay, he co-authored Readings in Information Visualization: Using Vision to Think (1999). His book Leonardo's Laptop appeared in October 2002 (MIT Press) and won the IEEE book award for Distinguished Literary Contribution. His latest book, with Derek Hansen and Marc Smith, is Analyzing Social Media Networks with NodeXL (www.codeplex.com/nodexl, 2010).


Dr. Natalia Juristo
Technical University of Madrid (UPM), Spain

Abstract. To consolidate a body of knowledge built upon evidence, experimental results have to be extensively verified. Experiments need to be replicated at other times and under other conditions before they can yield an established piece of knowledge, and several replications need to be run to strengthen the evidence. Most SE experiments have not been replicated. If an experiment is not replicated, there is no way to tell whether its results were produced by chance (the observed event occurred accidentally), are artifactual (the event occurred because of the experimental configuration but does not exist in reality), or conform to a pattern that exists in reality. The immaturity of experimental SE knowledge has been an obstacle to replication. Context differences usually oblige SE experimenters to adapt experiments for replication, and since the key experimental conditions are as yet unknown, slight changes in replications have led to differences in results that prevent verification. There are still many open questions about how to proceed with replications of SE experiments. Should replicators reuse the baseline experiment materials? How much liaison, if any, should there be between the original and the replicating experimenters? Which elements of the experimental configuration can be changed for the experiment to be considered a replication rather than a new experiment? The aim of replication is to verify results, but different types of replication serve specific verification purposes and afford different degrees of change. Each replication type helps to discover particular experimental conditions that might influence the results. We need to learn which types of replication are feasible in SE, as well as the changes acceptable for each type and the level of verification each provides.

Bio. Dr. Natalia Juristo has been a full professor of software engineering at the Computing School of the Technical University of Madrid (UPM), Spain, since 1997, and recently received a FiDiPro (Finland Distinguished Professor) appointment at the University of Oulu. She was Director of the UPM MSc in Software Engineering from 1992 to 2002 and coordinator of the Erasmus Mundus European Master on SE (with the participation of the University of Bolzano, the University of Kaiserslautern and the University of Blekinge) from 2007 to 2012. Natalia has served on several program committees, including ICSE, RE, REFSQ, ESEM and ISESE. She has been Program Chair of EASE 2013, ISESE 2004 and SEKE 1997, and General Chair of ESEM 2007, SNPD 2002 and SEKE 2001. She has been a member of several editorial boards, including IEEE Software and the journal Empirical Software Engineering. Dr. Juristo has been guest editor of special issues in several journals, including Empirical Software Engineering, IEEE Software, the Journal of Systems and Software, Data and Knowledge Engineering, and the International Journal of Software Engineering and Knowledge Engineering. Back in 1988 Natalia was a fellow of the European Organization for Nuclear Research (CERN) in Switzerland, and during 1989 she held a young graduate contract with the European Space Agency (ESA) in Italy. During 1992 she was a resident affiliate of the Software Engineering Institute at Carnegie Mellon University (USA). Natalia holds a B.S. and a Ph.D. in Computing from UPM.