TY - GEN
T1 - A Practical Tutorial Discussing the Evaluation of IT Artifacts Using Controlled Experiments Within the Design Science Framework
AU - Leroy, Gondy
N1 - Instructor Biography:
Gondy Leroy, PhD, is Professor of Management Information Systems (MIS) at the University of Arizona. Her research focuses on the design, development, and evaluation of artifacts that support, facilitate, and improve information exchange between people. She has worked on apps to facilitate communication with children with autism, search engines for biomedical information, and information extraction and interview systems for crime witness reports; recently she has focused on automating text simplification in healthcare to increase patient comprehension of information. She has won grants from NIH, AHRQ, NSF, Microsoft Research, and several foundations. Her educational background is in experimental psychology, with a combined BS and MS (1996) from the Catholic University of Leuven, and an MIS (1999) and PhD (2003) from the University of Arizona. She authored the book “Designing User Studies in Informatics” (Springer, 2011). Finally, she is an active contributor to increasing diversity and inclusion in computing and founded and leads the “Tomorrow’s Leaders Equipped for Diversity” program at the University of Arizona’s Eller College of Management.
Publisher Copyright:
© 2020 IEEE.
PY - 2020/11
Y1 - 2020/11
N2 - This tutorial teaches how to conduct evaluations that fit within the design science paradigm, i.e., evaluations of algorithms and entire systems. Design science is of increasing importance in IS, with many of the field's top journals recognizing it as an important research approach. However, many business schools and i-schools focus on behavioral or econometric methods when teaching evaluation. This tutorial provides the complement: ANOVA and t-tests for evaluating artifacts under different conditions.
AB - This tutorial teaches how to conduct evaluations that fit within the design science paradigm, i.e., evaluations of algorithms and entire systems. Design science is of increasing importance in IS, with many of the field's top journals recognizing it as an important research approach. However, many business schools and i-schools focus on behavioral or econometric methods when teaching evaluation. This tutorial provides the complement: ANOVA and t-tests for evaluating artifacts under different conditions.
KW - artifact
KW - design science
KW - evaluation
KW - experiment design
UR - http://www.scopus.com/inward/record.url?scp=85103155519&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85103155519&partnerID=8YFLogxK
U2 - 10.1109/ICHI48887.2020.9374351
DO - 10.1109/ICHI48887.2020.9374351
M3 - Conference contribution
AN - SCOPUS:85103155519
T3 - 2020 IEEE International Conference on Healthcare Informatics, ICHI 2020
BT - 2020 IEEE International Conference on Healthcare Informatics, ICHI 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 8th IEEE International Conference on Healthcare Informatics, ICHI 2020
Y2 - 30 November 2020 through 3 December 2020
ER -