
Serious Games Analytics : Methodologies for Performance Measurement, Assessment, and Improvement.

By: Loh, Christian Sebastian.
Contributor(s): Sheng, Yanyan | Ifenthaler, Dirk.
Material type: Text
Series: eBooks on Demand. Advances in Game-Based Learning Ser.
Publisher: Cham : Springer, 2015
Copyright date: ©2015
Description: 1 online resource (497 pages)
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783319058344
Subject(s): Education
Genre/Form: Electronic books
Additional physical formats: Print version: Serious Games Analytics : Methodologies for Performance Measurement, Assessment, and Improvement
DDC classification: 006.312
LOC classification: L1-991
Online resources: Click here to view this ebook.
Contents:
Intro -- Preface -- Contents -- Contributors -- About the Editors -- About the Authors -- Reviewers -- Part I: Foundations of Serious Games Analytics -- Chapter 1: Serious Games Analytics: Theoretical Framework -- 1 From Edu-Games to Serious Games -- 1.1 Early-Days Digital Games for Learning -- 1.2 The Serious Games Industry -- 2 Serious Games: Not for Entertainment -- 2.1 Message Broadcasters Are Not Serious Games -- 3 Gamification, Game-Based Learning, and Serious Games -- 3.1 Gamification Is Not Games -- 3.2 Problems with Game-Based Learning: Media Comparison -- 3.2.1 Media Comparison -- 3.2.2 Pretest-Posttest Validity -- 3.2.3 Talk Aloud and Self-Reports -- 4 Serious Games as Tools -- 4.1 Games for Skills and Human Performance Improvement -- 4.2 Gameplay Data -- 4.3 Datafication -- 4.4 In Situ vs. Ex Situ Data Collection -- 4.5 Actionable Insight: Using Analytics to Improve Skills and Human Performance -- 5 Types of Analytics -- 5.1 Learning Analytics -- 5.1.1 Metrics for Learning Analytics -- 5.2 Game Analytics -- 5.3 Does Game Analytics + Learning Analytics = Serious Games Analytics? -- 5.4 Why Serious Games Analytics? -- 5.5 Analytics Differ by Origins and Purposes -- Conclusion -- References -- Chapter 2: A Meta-Analysis of Data Collection in Serious Games Research -- 1 Introduction -- 2 Study Method -- 2.1 Data Characterization -- 2.2 Identify Data Sources (Systematic Review) -- 2.3 Data Collection and Analysis -- 3 Systematic Review Papers -- 4 Results -- 5 Discussion -- 5.1 Issues Highlighted Within Our Study Outcomes -- 5.2 What Data Is Being Collected? -- 5.3 When Data Is Being Collected? -- 5.4 Where Data Is Being Collected? -- 5.5 Who Is Involved in Data Collection? -- 5.6 Why Data Is Being Collected? -- 6 Conclusions -- References -- Part II: Measurement of Data in Serious Games Analytics.
Chapter 3: Guidelines for the Design and Implementation of Game Telemetry for Serious Games Analytics -- 1 Introduction -- 2 Game Telemetry and Its Uses -- 2.1 Event Data -- 2.2 Uses of Game Telemetry -- 3 Issues in the Use of Game Telemetry for Measurement Purposes -- 4 Game Telemetry Design Guidelines -- 4.1 Guideline 1: Target Behaviors That Reflect the Use of Cognitive Demands -- 4.2 Guideline 2: Record Data at the Finest Usable Grain Size -- 4.3 Guideline 3: Represent Data to Require Minimal Preprocessing -- 4.4 Guideline 4: Record Descriptions of Behavior and Not Inferences with as Much Contextual Information as Feasible -- 4.4.1 Descriptive -- 4.4.2 Unambiguous -- 4.4.3 Contextualized -- 5 Case Study: Deriving Measures from Game Telemetry -- 5.1 Case Study Game: Save Patch -- 6 Evidence of Save Patch as a Learning Game -- 6.1 Telemetry Design in Save Patch -- 6.2 Measuring Overall Game Performance -- 6.3 Measuring In-Game Performance -- 6.4 Measuring In-Game Strategies -- 7 Discussion -- References -- Chapter 4: The Dynamical Analysis of Log Data Within Educational Games -- 1 Introduction -- 2 The Utility of Log Data Within Game-Based Environments -- 3 Applying Dynamical Analyses to Log Data -- 4 iSTART-2 -- 4.1 iSTART-2 Log Data -- 4.2 Dynamical Methodologies and Log Data Within iSTART-2 -- 4.2.1 Random Walks -- 4.2.2 Entropy -- 4.2.3 Hurst -- 5 Conclusion -- References -- Chapter 5: Measuring Expert Performance for Serious Games Analytics: From Data to Insights -- 1 Introduction -- 1.1 Design-Centric vs. Performance-Centric Game Making -- 2 Working with Users' Data -- 2.1 Ex Situ Data and Black Box -- 2.2 In Situ Data and White Box -- 2.2.1 Behavioral Research Considerations -- 2.2.2 Telemetry and Information Trails -- 2.3 The Information Trails Assessment Framework -- 2.4 Event Listeners -- 2.5 Event Tracers -- 2.6 Data Mining Processes.
2.7 Information Visualization -- 3 Collecting User-Generated Data -- 3.1 Big Data vs. Good Data -- 3.2 Repetition and Behaviors -- 3.3 Providing (More Than) Enough Game Actions -- 3.4 Game Design and Players' Behaviors -- 3.5 Game Metrics -- 3.6 Validity of Gameplay Time in Serious Games Research -- 3.7 Time of Completion -- 3.7.1 Caution for Gamification -- 3.8 Creating New Metrics -- 3.9 Three Different Analytics for Serious Games: Gaming, Testing, and Training -- 4 User Performance Measurement for Serious Games -- 4.1 Decision Analyses by Bayesian Network -- 4.1.1 Bayesian Networks Are Computationally Prohibitive -- 4.1.2 Limitations of Bayesian Network -- 4.1.3 Inability to Handle Spatial-Temporal Gameplay Data -- 5 Performance Measurement and Player Behavioral Profiling -- 5.1 Machine/Statistical Learning -- 5.1.1 Clustering Techniques -- 5.2 Cluster Analysis -- 5.3 Linear Discriminant Analysis -- 5.4 Item Response Theory -- 6 Conclusions -- 6.1 From Serious Games Analytics to Insights -- 6.2 Expertise Index as Serious Games Analytics -- 6.2.1 Competency and Observable Action Sequences -- 7 Conclusions -- References -- Chapter 6: Cluster Evaluation, Description, and Interpretation for Serious Games -- 1 Introduction -- 2 Theoretical Background -- 3 Minecraft Data -- 4 Analysis -- 4.1 Data Transformation -- 4.2 Feature Selection -- 4.3 Clustering -- 4.4 Cluster Evaluation -- 4.5 Cluster Description -- 4.6 Cluster Interpretation -- 4.7 Additional Partitions -- 5 Applications -- 6 Conclusions -- References -- Part III: Visualizations of Data for Serious Games Analytics -- Chapter 7: Comparative Visualization of Player Behavior for Serious Game Analytics -- 1 Introduction -- 2 Comparative Visualization -- 2.1 Juxtaposition -- 2.2 Superposition -- 2.3 Explicit Encoding -- 3 Comparative Visualization in Serious Game Analytics -- 4 Case Studies.
4.1 Case Study: Gender Differences -- 4.2 Case Study: Age Differences -- 5 Conclusions -- References -- Chapter 8: Examining Through Visualization What Tools Learners Access as They Play a Serious Game for Middle School Science -- 1 Introduction -- 2 Relevant Literature -- 2.1 Definition and Examples -- 2.2 Research Trends in Serious Games -- 2.3 Issues in SEGA Evaluation -- 2.4 Background of Research -- 3 Research Questions and Research Context -- 3.1 Research Questions -- 3.2 Description of the Serious Game Environment -- 3.3 Cognitive Tools and Their Corresponding Conceptual Categories -- 4 Method -- 4.1 Participants -- 4.2 Data Sources -- 4.2.1 Log Files -- 4.2.2 Solution Scores -- 4.2.3 Goal Orientation -- 4.3 Data Processing and Analysis -- 4.3.1 Data Cleaning and Processing -- 4.3.2 Analysis -- 5 Findings -- 5.1 How Do Play-Learners Access Different Tools Built into the Game? -- 5.2 How Do Play-Learners with Different Goal Orientations Access the Tools? -- 5.2.1 Mastery Goal Orientation (Mastery GO) -- 5.2.2 Performance-Approach Goal Orientation (Performance GO) -- 5.2.3 Performance-Avoidance Goal Orientation (Performance-Avoid GO) -- 5.3 How Do Play-Learners with Different Performance Scores Access the Tools? -- 6 Discussion and Implications -- 6.1 General Patterns of Tool Use -- 6.2 Productive Tool Use by High-Performance and Mastery Goal Orientation Groups -- 6.3 Visualization as a Promising Technique for Serious Games Analytics -- 6.4 Limitations and Future Directions -- 7 Conclusion -- References -- Part IV: Serious Games Analytics for Medical Learning -- Chapter 9: Using Visual Analytics to Inform Rheumatoid Arthritis Patient Choices -- 1 Introduction -- 2 Rheumatoid Arthritis Care -- 2.1 Patient-Physician Communication -- 2.2 Decision-Making for RA Patients -- 2.3 Patient Risk Perception -- 3 The Case for Game-Based Decision Aids.
3.1 Evaluating DAs -- 4 The Case for a Data-Driven Game -- 4.1 Use Cases -- 5 Technical Challenges and Analytics -- 5.1 Human-Computer Interaction -- 5.2 Arthritic Hand Models -- 5.3 Automatic Deformation Discovery and Prediction -- 6 Conclusions and Future Work -- References -- Chapter 10: The Role of Serious Games in Robot Exoskeleton-Assisted Rehabilitation of Stroke Patients -- 1 Background -- 1.1 Rehabilitation -- 1.2 Psychological State -- 1.3 Measurement of Psychological State -- 1.4 Machine Learning -- 1.5 Implementation -- 2 Experimental Protocol -- 2.1 Task -- 2.2 Measurement -- 2.3 Feature Selection -- 3 Results -- 3.1 Changes During the Training Session -- 4 Discussion -- 5 Conclusion -- References -- Chapter 11: Evaluation-Based Design Principles -- 1 Introduction: Why Use Kinect for Medical Procedure Evaluation? -- 2 Technical and Computational Challenges of the Prototype -- 2.1 Creating Master Models for Medical Procedures -- 2.2 Computational Challenges -- 2.2.1 Tracking Specific Users -- 2.2.2 Identifying Objects Within the Scene -- 2.3 Challenges for the Evaluation -- 3 Generalizing the Ideas -- 3.1 Process Versus Outcome -- 3.2 Series Versus Parallel -- 3.3 Enumeration Versus Collection -- 3.4 Variation and Deviation -- 3.5 Deterministic Versus Stochastic -- 3.6 Summarizing the Lessons Learned -- 4 Alignment with Learning Theory -- 4.1 Kirkpatrick Four Level Evaluation Model -- 4.2 Applying a Nickols Design View to AIMS -- 4.3 Outlook on Next Steps -- 5 Conclusions -- References -- Part V: Serious Games Analytics for Learning and Education -- Chapter 12: Analytics-Driven Design: Impact and Implications of Team Member Psychological Perspectives on a Serious Games (SGs) Design Framework -- 1 Introduction -- 1.1 Complex Assessment and SGs -- 2 ECD Theory Overview -- 2.1 Previous ECD-Driven Serious Games Research.
3 Cycles: A Worked Example.
Summary: This volume brings together research on how gameplay data in serious games may be turned into valuable analytics or actionable intelligence for performance measurement, assessment, and improvement. Chapter authors use empirical research methodologies, including existing, experimental, and emerging conceptual frameworks, from various fields such as computer science, software engineering, educational data mining, statistics, and information visualization. Serious games is an emerging field in which games are created using sound learning theories and instructional design principles to maximize learning and training success. But how would stakeholders know what play-learners have done in the game environment, and whether their actions and performance bring about learning? Could they be playing the game for fun, really learning with evidence of performance improvement, or simply gaming the system, i.e., finding loopholes to fake that they are making progress? This volume endeavors to answer these questions.
Item type: Electronic Book
Current location: UT Tyler Online
Call number: L1-991
URL: https://ebookcentral.proquest.com/lib/uttyler/detail.action?docID=2096094
Status: Available
Barcode: EBC2096094


Description based on publisher supplied metadata and other sources.

Author notes provided by Syndetics

Christian Sebastian Loh's research interests focus on performance measurement, assessment, and improvement with serious games and virtual environments, and on the analytics for them. He was the 2008/09 President of the Division of Multimedia Production of the AECT (Association for Educational Communications and Technology) and a recipient of the 2009 Defense University Research Instrument Program grant awarded by the Army Research Office (ARO). He has designed and developed serious games for research, Information Trails for telemetric performance measurement, and the Performance Tracing Report Assistant (PeTRA) for performance improvement via gameplay data visualization. He currently serves on the editorial board of Technology, Knowledge and Learning (TKL) and as associate editor for the International Journal of Gaming and Computer-Mediated Simulations (IJGCMS) and the International Journal of Game-Based Learning (IJGBL).

Dr. Yanyan Sheng's research interests focus on modeling dichotomous responses in educational and psychological measurement using advanced modern statistics, and specifically on developing and applying complex yet efficient Bayesian hierarchical item response models. She has developed complex Bayesian multidimensional models with various latent dimensional structures and has written and published MATLAB programs for these models. She is also interested in applying the biased coin up-and-down design to adaptive testing.

Dr. Ifenthaler's research interests focus on learning analytics, cognitive structures, complex problem solving, game-based and mobile learning, as well as computer-based assessment. He developed computer-based methodologies for the assessment and analysis of graphical and natural language representations (SMD Technology, HIMATT, AKOVIA, TASA) as well as games for teacher education (DIVOSA, SeSIM). Dr. Ifenthaler's research outcomes span numerous co-authored books, book chapters, journal articles, and international conference papers. He was a 2012 Fulbright Scholar-in-Residence at the Jeannine Rainbolt College of Education, University of Oklahoma, USA, and Interim Department Chair and Professor at the University of Mannheim, Germany. He is the 2013/2014 President of the AECT Division of Design and Development, 2013/2014 Chair of the AERA Special Interest Group Technology, Instruction, Cognition and Learning, and Program Chair for the International Conference on Cognition and Exploratory Learning in the Digital Age. Dr. Ifenthaler received the 2012 Outstanding Journal Article Award from AECT, the 2009 Outstanding Reviewer Award for Educational Technology Research and Development, and the 2006 Outstanding Dissertation Award from the University of Freiburg, Germany. He is the Editor-in-Chief of Technology, Knowledge and Learning.
