2.3 Assessing Creative Visual Design
As Gero and Maher discuss, in creative design, differences in the representation of ideas are the rule rather than the exception [33]. This makes assessing creative design challenging. A common pedagogical tool is to organize “design critiques”, in which instructors, peers, and an invited jury provide students with feedback on their work [22, 57]. Feedback is based on a broad range of criteria spanning dimensions such as product, process, content knowledge, and communication [25]. Criteria for product visual design focus on characteristics such as color, form, composition, and layout [25]. However, instructor-led assessment cannot keep up with the growing demands of design education [47].
Approaches developed by creativity and crowdsourcing researchers align with visual design assessment in courses. Kerne et al. developed creativity metrics for assessing information-based ideation tasks and activities [44]. Their ‘Visual Presentation’ metric includes criteria such as whitespace, alignment, and the organization of ideas in lines, grids, or other shapes. Human raters applied these metrics to assess free-form creative assemblages of ideas. Xu et al. developed guidelines for crowd assessment of visual design work, including criteria of proximity, alignment, repetition, and contrast [82]. For multiscale design, Lupfer et al. measured the number of scales used by counting how many times one needs to zoom in to make inner elements legible [52]. Despite the potential of these approaches to assist design courses, they face the same limitation as instructor assessment: human support may not always be available.
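To make the scale-count measure concrete, the sketch below estimates how many zoom-ins each text element needs before it becomes legible and reports the deepest level required. The flat element schema, the 9 px legibility threshold, and the 2x zoom factor are our assumptions for illustration, not Lupfer et al.’s published procedure.

```python
import math

# Assumed parameters for illustration: minimum legible on-screen text height
# and the magnification applied by a single zoom-in.
LEGIBLE_TEXT_PX = 9
ZOOM_STEP = 2.0

def zooms_needed(text_height_px):
    """Zoom-ins required before text rendered at this height becomes legible."""
    if text_height_px >= LEGIBLE_TEXT_PX:
        return 0
    return math.ceil(math.log(LEGIBLE_TEXT_PX / text_height_px, ZOOM_STEP))

def scales_used(elements):
    """Scale count for a design: the deepest zoom level any element requires."""
    return max((zooms_needed(e["text_height"]) for e in elements), default=0)

# Headings are legible at a glance; the smallest text needs three zoom-ins.
print(scales_used([{"text_height": 24}, {"text_height": 6}, {"text_height": 2}]))  # 3
```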
Computational approaches have the advantage of processing data at speed and providing on-demand assessment [38]. Reinecke et al. assessed website aesthetics by developing a regression model based on attributes such as color, symmetry, and the number of images and text groups [64]. Oulasvirta et al.’s Aalto Interface Metrics web service provides assessments of graphical user interface designs, helping designers identify and address shortcomings [59]. For multiscale design, Jain et al. developed a computational model based on spatial clustering, which identifies the scales and clusters present in design work [40]. However, prior computational approaches did not investigate experiences in design education course contexts.
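As a loose illustration of the kind of spatial clustering such a model relies on, the following sketch groups design elements by the proximity of their bounding-box centroids using scikit-learn’s DBSCAN. The centroid features and the eps/min_samples values are assumptions for illustration, not Jain et al.’s published model.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def spatial_clusters(boxes, eps=120.0, min_samples=2):
    """Group design elements, given as (x, y, w, h) boxes, into spatial clusters."""
    centroids = np.array([(x + w / 2.0, y + h / 2.0) for x, y, w, h in boxes])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(centroids)
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)  # -1 marks noise
    return n_clusters, labels

# Two tight groups of elements placed far apart on the canvas.
boxes = [(0, 0, 50, 50), (60, 0, 50, 50), (1000, 800, 50, 50), (1060, 800, 50, 50)]
print(spatial_clusters(boxes)[0])  # 2
```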
Our investigation presents data about instructor experiences with AI-based analytics, invoking Jain et al.’s model [40] to compute multiscale design analytics. We next consider prior work on the derivation and presentation of learning analytics, which informs the approach we take for presenting multiscale design analytics in our research artifact.
2.4 Learning Analytics and Dashboards
Learning analytics and dashboard technologies have been found to support instructors in identifying student problems and intervening, which improves student retention and success [6]. In lecture-based course contexts, analytics—e.g., the number of times a student accessed a resource, time spent, and the length of textual annotations—have assisted instructors in assessing student understanding [24]. Likewise, dashboards have proven effective in lecture-based contexts, providing a quick understanding of student progress through representations such as tables and graphs [27, 78].
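A minimal sketch of such lecture-course analytics follows, aggregating access counts, time spent, and annotation length per student from a hypothetical event log; the event schema is our assumption for illustration.

```python
from collections import defaultdict

def resource_analytics(events):
    """Aggregate per-student access counts, time spent, and annotation length."""
    stats = defaultdict(lambda: {"accesses": 0, "seconds": 0.0, "annotation_chars": 0})
    for e in events:
        s = stats[e["student"]]
        if e["type"] == "open_resource":
            s["accesses"] += 1
            s["seconds"] += e.get("duration_s", 0.0)
        elif e["type"] == "annotate":
            s["annotation_chars"] += len(e.get("text", ""))
    return dict(stats)

events = [
    {"student": "s1", "type": "open_resource", "duration_s": 310.0},
    {"student": "s1", "type": "annotate", "text": "Key idea: spaced practice."},
    {"student": "s2", "type": "open_resource", "duration_s": 45.0},
]
print(resource_analytics(events)["s1"])
```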
Design course contexts involve project-based learning [28]. As Blikstein discusses, in project-based contexts, there is a need to measure more open-ended and complex characteristics, which can provide instructors with insights into students’ creative processes [13]. In design course contexts, specifically, Britain et al.’s study surfaced this need, as they presented Fluency analytics—i.e., the number of elements, words, and images—to design instructors [16]. While the instructors found Fluency useful for gaining insight into students’ efforts across various dimensions, they desired more sophisticated analytics. In our investigation, we present AI-based analytics to instructors—the number of scales and clusters—providing them with insights into students’ multiscale design organization.
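Fluency-style counts are straightforward to compute; the sketch below assumes a flat list of design elements with a type field and optional text, which is an illustrative schema rather than Britain et al.’s implementation.

```python
def fluency(elements):
    """Count the elements, words, and images present in a piece of design work."""
    return {
        "elements": len(elements),
        "words": sum(len(e.get("text", "").split()) for e in elements),
        "images": sum(1 for e in elements if e.get("type") == "image"),
    }

work = [
    {"type": "text", "text": "Mood board: warm palette, coastal textures"},
    {"type": "image"},
    {"type": "image"},
]
print(fluency(work))  # {'elements': 3, 'words': 6, 'images': 2}
```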
Prior analytics dashboards focus on presenting facts. With advances in AI and its applicability across diverse domains, the community has begun researching AI-based analytics, which include inferences [36, 48]. Presenting AI results in a comprehensible manner is vital, so that users can trust the system and rely on its assistance [67]. Among prior work, Oulasvirta et al. provide visualizations of assessed visual design characteristics—e.g., clutter, colorfulness, and white space—within website design [59]. However, they did not assess or present multiscale design characteristics. The present research focuses on conveying the meaning of scales and clusters analytics computed with AI recognition. To do this, we integrate the dashboard presentation of analytics with the actual design work they measure.
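One way to realize such integration is to attach, to each computed analytic, the identifiers and bounding region of the design elements it refers to, so a dashboard can highlight them in the work itself. The sketch below is a hedged illustration of that linking idea, not the paper’s implementation; the element schema and record fields are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ClusterAnalytic:
    cluster_id: int
    element_ids: list   # design elements this analytic refers to
    bounds: tuple       # (x0, y0, x1, y1) region a dashboard can highlight in situ

def link_clusters_to_elements(elements, labels):
    """Map each cluster label to its member elements and their union bounding box."""
    records = {}
    for element, label in zip(elements, labels):
        if label == -1:  # skip unclustered (noise) elements
            continue
        x, y, w, h = element["box"]
        rec = records.setdefault(
            label, ClusterAnalytic(label, [], (x, y, x + w, y + h)))
        rec.element_ids.append(element["id"])
        x0, y0, x1, y1 = rec.bounds
        rec.bounds = (min(x0, x), min(y0, y), max(x1, x + w), max(y1, y + h))
    return list(records.values())

elements = [{"id": "e1", "box": (0, 0, 50, 50)}, {"id": "e2", "box": (60, 0, 50, 50)}]
print(link_clusters_to_elements(elements, [0, 0]))
```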
Authors:
(1) Ajit Jain, Texas A&M University, USA; Current affiliation: Audigent;
(2) Andruid Kerne, Texas A&M University, USA; Current affiliation: University of Illinois Chicago;
(3) Nic Lupfer, Texas A&M University, USA; Current affiliation: Mapware;
(4) Gabriel Britain, Texas A&M University, USA; Current affiliation: Microsoft;
(5) Aaron Perrine, Texas A&M University, USA;
(6) Yoonsuck Choe, Texas A&M University, USA;
(7) John Keyser, Texas A&M University, USA;
(8) Ruihong Huang, Texas A&M University, USA;
(9) Jinsil Seo, Texas A&M University, USA;
(10) Annie Sungkajun, Illinois State University, USA;
(11) Robert Lightfoot, Texas A&M University, USA;
(12) Timothy McGuire, Texas A&M University, USA.