Christiane Gresse von Wangenheim, Jean C. R. Hauck, Fernando S. Pacheco, Matheus F. Bertonceli Bueno.
Abstract
Teaching Machine Learning in school helps students to be better prepared for a society rapidly changing due to the impact of Artificial Intelligence. This requires age-appropriate tools that allow students to develop a comprehensive understanding of Machine Learning in order to become creators of smart solutions. Following the trend of visual languages for introducing algorithms and programming in K-12, we present a ten-year systematic mapping of emerging visual tools that support the teaching of Machine Learning at this educational stage and analyze the tools concerning their educational characteristics, their support for the development of ML models as well as their deployment, and how the tools have been developed and evaluated. As a result, we encountered 16 tools targeting students, mostly as part of short-duration extracurricular activities. The tools mainly support the interactive development of ML models for image recognition tasks using supervised learning, covering basic steps of the ML process. Being integrated into popular block-based programming languages (primarily Scratch and App Inventor), they also support the deployment of the created ML models as part of games or mobile applications. Findings indicate that the tools can effectively leverage students' understanding of Machine Learning; however, further studies regarding the design of the tools concerning educational aspects are required to better guide their effective adoption in schools and their enhancement to support the learning process more comprehensively.
Keywords: Computing education; K-12; Machine learning; Visual tool
Year: 2021 PMID: 33967587 PMCID: PMC8087535 DOI: 10.1007/s10639-021-10570-8
Source DB: PubMed Journal: Educ Inf Technol (Dordr) ISSN: 1360-2357
Human-centric ML process
| Phase | Description |
|---|---|
| Requirements analysis | During this stage, the main objective of the model and its target features are specified. This also includes the characterization of the inputs and expected outputs, specifying the problem. It may also involve design thinking approaches to align the objectives with existing needs and problems |
| Data management | During data collection, available datasets are identified and/or data is collected. This may include the selection of available datasets (e.g., ImageNet), as well as specialized ones for transfer learning. The data is prepared by validating, cleaning, and preprocessing the data. Data sets may be labeled for supervised learning. The data set is typically split into a training set to train the model, a validation set to select the best candidate from all models, and a test set to perform an unbiased performance evaluation of the chosen model on unseen data |
| Feature engineering | Using domain knowledge of the data, features are created, including feature transformation, feature generation, and the selection of features from large pools of candidates, among other techniques |
| Model learning | A model is built or, more typically, chosen from well-known models that have proven effective in comparable problems or domains, by feeding the features/data to the learning algorithm. Defining network architectures involves setting fine-grained details such as activation functions and the types of layers, as well as the overall architecture of the network. Defining training routines involves setting the learning rate schedules, the learning rules, the loss function, regularization techniques, and hyperparameter optimization to improve performance |
| Model evaluation | The quality of the model is evaluated on test data, providing a better approximation of how the model will perform in the real world, e.g., by analyzing the correspondence between the results of the model and human opinion. The evaluation of ML models is not trivial, and many methods can be applied, with various metrics such as accuracy, precision, recall, F1, and mean absolute error, whose appropriateness depends on the specific task |
| Model deployment and monitoring | During the production/deployment phase, the model is deployed into a production environment to create a usable system and applied to new incoming events in real time |
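As a concrete illustration of these phases, the sketch below walks from data splitting through model learning, validation-based selection, evaluation, and deployment in Python. The use of scikit-learn and a synthetic dataset is an assumption for illustration only; no surveyed tool is implied.

```python
# Minimal sketch of the human-centric ML process phases above
# (scikit-learn and the synthetic dataset are illustrative assumptions).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Requirements analysis: here, a toy binary classification problem.
# Data management: obtain labeled data and split it into training,
# validation, and test sets (60/20/20).
X, y = make_classification(n_samples=300, n_features=4, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Model learning: train several candidate models on the training set.
candidates = [KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train) for k in (1, 3, 5)]

# Model selection: keep the candidate that does best on the validation set.
best = max(candidates, key=lambda m: accuracy_score(y_val, m.predict(X_val)))

# Model evaluation: unbiased performance estimate on unseen test data.
print("test accuracy:", accuracy_score(y_test, best.predict(X_test)))

# Model deployment: apply the chosen model to new incoming data.
print("prediction:", best.predict(X_test[:1]))
```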
Search string for each source
| Source | Search string |
|---|---|
| ACM Digital Library | [[Abstract: "visual programming"] OR [Abstract: "block-based programming"] OR [Abstract: "gui tool"] OR [Abstract: toolkit]] AND [[All: "machine learning"] OR [All: "neural network"]] AND [Publication Date: (01/01/2010 TO 12/31/2020)] |
| ERIC | ((abstract:“visual programming” OR abstract:“block-based programming” OR abstract:“gui tool” OR abstract:“toolkit”) AND (abstract:"machine learning" OR abstract:"neural network")) pubyear: since 2010 |
| IEEE Xplore Digital Library | (("Abstract":“visual programming” OR "Abstract":“block-based programming” OR "Abstract":“gui tool” OR "Abstract":“toolkit”) AND ("Abstract":"machine learning" OR "Abstract":"neural network")) Filters Applied: 2010—2020 |
| Science Direct | Year: 2010–2020 Title, abstract, keywords: (("visual programming" OR "block-based programming" OR "gui tool" OR toolkit) AND ("machine learning" OR "neural network")) |
| Scopus | TITLE-ABS-KEY ((( "visual programming" OR "block-based programming" OR "gui tool" OR toolkit) AND ( "machine learning" OR "neural network"))) AND PUBYEAR > 2009 AND PUBYEAR < 2021 AND ( LIMIT-TO ( SUBJAREA, "COMP")) |
| Web of Science | (AB = (("visual programming" OR "block-based programming" OR "gui tool" OR toolkit) AND ("machine learning" OR "neural network"))) AND LANGUAGE: (English) Timespan: 2010–2020. Indexes: SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH, ESCI |
| Wiley | "visual programming" OR "block-based programming" OR "gui tool" OR "toolkit" in Abstract and "machine learning" OR "neural network" in Abstract (Filter 2010–2020) |
| Google | "block-based" "machine learning" |
Number of artifacts identified per stage of selection
| Source | No. of search results | No. of analyzed artifacts | No. of potentially relevant artifacts | No. of relevant artifacts |
|---|---|---|---|---|
| ACM | 263 | 263 | 12 | 3 |
| ERIC | 160 | 160 | 2 | 0 |
| IEEE | 310 | 300 | 9 | 2 |
| Science Direct | 76 | 76 | 3 | 0 |
| SCOPUS | 703 | 300 | 8 | 4 |
| Web of Science | 434 | 300 | 8 | 3 |
| Wiley | 28 | 28 | 0 | 0 |
| Google | 484,000 | 300 | 14 | 5 |
| Backward snowballing | – | – | 15 | 6 |
| Forward snowballing | – | – | 7 | 3 |
| Total | | | | 24 (without duplicates) |
Fig. 1 Quantity of relevant artifacts per type
Visual tools for teaching ML in K-12
| Name | Brief description | ML platform | Deployment platform⁹ | Reference(s) |
|---|---|---|---|---|
| AlpacaML | An iOS application that supports users in building, testing, evaluating, and using ML models of gestures based on data from wearable sensors | x | Scratch | (Zimmermann-Niefield et al.) |
| BlockWiSARD | A visual programming environment that uses the WiSARD weightless artificial neural network (WANN) to enable people to develop systems with some learning capability | x | BlockWiSARD | (Queiroz et al.) |
| Cognimates | An AI education platform for programming and customizing the development of AI models embodied in devices such as Amazon's smart speaker Alexa, the Cozmo robot, etc. | x | Scratch | (Druga) |
| DeepScratch | A programming language extension to Scratch that provides elements to facilitate building and learning about deep learning models, either by training a neural network on built-in datasets or by using pre-trained deep learning models | x | Scratch | (Alturayeif et al.) |
| eCraft2learn | Additional blocks for the visual programming language Snap! that provide an easy-to-use interface to both AI cloud services and deep learning functionality | x | Snap! | (Kahn & Winters) |
| Educational Approach to ML with Mobile Applications | A set of App Inventor extensions spanning several ML subfields, among which the Teachable Machine extension allows the development of an ML model | x | App Inventor | (Zhu) |
| Google Teachable Machine (TM) | A web-based interface that allows people to train their own ML classification models, without coding, using their webcam, images, or sound | x | – | (Carney) |
| LearningML | A platform for teaching and learning supervised ML in K-12 | x | Scratch | (Rodríguez-García et al.) |
| mblock | Block- and code-based programming software whose Teachable Machine extension allows the creation of an ML model | x | mblock | – |
| Milo | A web-based visual programming environment for data science education | x | – | (Rao et al.) |
| ML4K | A tool that introduces ML by providing hands-on experiences for training ML systems and building things with them | x | Scratch, App Inventor, Python | (Lane) |
| Orange | A data visualization, ML, and data mining toolkit featuring a visual programming front-end for exploratory data analysis and interactive data visualization | x | – | (Demšar) |
| Personal Image Classifier (PIC) | A web system where users can train, test, and analyze personalized image classification models, with an extension for MIT App Inventor that allows using the models in apps | x | App Inventor | (Tang et al.) |
| RapidMiner | A comprehensive data science platform with visual workflow design and full automation of ML solutions | x | – | (Sakulkueakulsuk et al.) |
| ScratchNodesML | A system enabling children to create personalized gesture recognizers and share them | x | Scratch | (Agassi et al.) |
| SnAIp | A framework that enables constructionist learning of reinforcement learning with Snap! | x | Snap! | (Jatzlau et al.) |
⁹ We consider the availability of a deployment platform only when the tool allows the model deployment in platforms that can be directly accessed by the students.
Fig. 2 Supported environments for the deployment of the created ML models
General characteristics of the tools
| Name | Website of running version of the tool | Platform | Usage license | User registration/API key | Language(s) |
|---|---|---|---|---|---|
| AlpacaML | – | online/app | – | Key to connect with Scratch | English |
| BlockWiSARD | – | desktop | Free | – | English, Portuguese |
| Cognimates | | online | Free | required | English |
| DeepScratch | – | online | NI | – | English |
| eCraft2learn | | online | Free | required | English |
| Educational Approach to ML with Mobile Applications | | online | Free | – | English |
| Google TM | | online | Free | – | English, German, Spanish, Japanese |
| LearningML | | online | Free | only for cloud storage and sharing | English, Spanish, Catalan, Galician |
| mblock | | online/desktop/app | Free | only for cloud storage/sharing | 26 languages, including English |
| Milo | | online | Free | – | English |
| ML4K | | online | Free | required for most projects | More than 15 languages, including English |
| Orange | | desktop | Free | – | English |
| PIC | | online | Free | – | English |
| RapidMiner | | online/desktop | Paid | required | English |
| ScratchNodesML | – | online | – | – | English |
| SnAIp | | online | Free | – | English, German |
Fig. 3 Frequency of educational stages targeted by the tools
Educational characteristics of the tools
| Name | Target audience | Educational unit type and duration | Educational strategy | Educational resources |
|---|---|---|---|---|
| AlpacaML | Middle- and high-school students without ML experience | 3-h workshop | The tool is demonstrated and students learn to build a predefined model by following an interactive tutorial. Then students build models of their own physical activity by collecting and labeling data and evaluating the model | – |
| BlockWiSARD | People in general, including children | – | – | – |
| Cognimates | Children (7 to 12 years) | 1.5–2 h workshops | Children are asked to draw and imagine the future of AI agents. They are then introduced to different AI agents (the Alexa home assistant, the Jibo and Cozmo robots) by playing with them, and finally get to program the agents with their dedicated coding applications | Lesson plans, example projects, tutorials, teacher guides |
| DeepScratch | Kids and high-school students | – | – | – |
| eCraft2learn | Non-experts, K-12 students | – | Students first discuss examples of AI applications and are introduced to Snap! and how to use its AI blocks. Then they experiment with speech synthesis and create programs using image recognition | Learning process, tutorials, and exercises, example programs, videos |
| Educational Approach to ML with Mobile Applications | High-school students | 6-week course | A simple curriculum using the extensions, following a series of tutorials. Each class is split into two parts: a lecture (up to the first half of the class) and mobile application development | Lesson outlines, interactive and video tutorials, slides |
| Google TM | Novices without ML or coding expertise (children and adults) | 45-min to 4-h class as part of an AI course | Several educational units, ranging from online tutorials instructing step by step how to build an ML model to their integration into AI courses such as (Payne) | Tutorials, example programs, slides, videos |
| LearningML | Children or people interested in AI | – | – | Video tutorials, user manual (in Spanish only) |
| mblock | Children (10–13 years) | 2 h class | After an introduction of basic concepts, students are guided to build a recycling recognition system | Lesson plans, videos, tutorials |
| Milo | High-school to undergraduate students in non-computer science fields without programming experience | – | – | – |
| ML4K | Young people (6–19+) (beginner to advanced level) | 1–4 h workshops | Students follow a step-by-step tutorial | Tutorials, teacher guides |
| Orange | K-12, university students, professionals | 1-h to 1-week workshops | Students follow a step-by-step tutorial to solve a particular problem | Lecture notes, video tutorials, example workflows |
| PIC | High-school students | Workshop with two 50 min classes | The first classes start with a short introduction to basic ML concepts, then students use the PIC to build an ML model. In the second class, students use the extension with the trained models to create intelligent mobile applications | Lesson plan, teacher guide, slides, tutorials |
| RapidMiner | Professionals, middle-school students | 3-day workshop | The first phase introduces ML, and students develop an ML model that predicts the sweetness of mangoes using only their outer physical properties. In the second phase, students have to classify mangoes into different categories. In the third phase, the students aim at using the predictions from the ML model | – |
| ScratchNodesML | Children | – | – | – |
| SnAIp | High-school students | – | Introduction to reinforcement learning, practical activity following a pre-defined tutorial including the deployment, evaluation, and optimization of the model | Tutorials, solution examples, cards, teacher instructions (in German only) |
Fig. 4 Examples of ML development support
Fig. 6 Frequency of tool types and steps of the ML process supported
General characteristics concerning the ML platform
| Name | Type of tool | Supported ML tasks | Supported parts of the ML process |
|---|---|---|---|
| AlpacaML | Workflow | Motion recognition | Data management, Model learning, Model evaluation, Model deployment |
| BlockWiSARD | Block-based | Image recognition | Data management, Model learning, Model deployment |
| Cognimates | Workflow | Image and text recognition | Data management, Model learning, Model evaluation, Model deployment |
| DeepScratch | Block-based | Image and numerical recognition | Model learning, Model evaluation, Model deployment |
| eCraft2learn | Block-based | Image and speech recognition, speech synthesis, object detection, and segmentation | Data management, Model learning, Model evaluation, Model deployment |
| Educational Approach to ML with Mobile Applications | Workflow | Image and text recognition, object detection | Data management, Model learning, Model deployment |
| Google TM | Workflow | Image, sound, and pose recognition | Data management, Model learning, Model evaluation |
| LearningML | Workflow | Image, numerical, sound, and text recognition | Data management, Model learning, Model evaluation, Model deployment |
| mblock | Workflow | Image recognition | Data management, Model learning, Model evaluation, Model deployment |
| Milo | Block-based | Data clustering | Model learning |
| ML4K | Workflow | Image, text, number, sounds, and face recognition | Data management, Model learning, Model evaluation, Model deployment |
| Orange | Dataflow | Image and text recognition | Data management, Feature engineering, Model learning, Model evaluation |
| PIC | Workflow | Image recognition | Data management, Model learning, Model evaluation, Model deployment |
| RapidMiner | Dataflow | Data classification and prediction | Data management, Feature engineering, Model learning, Model evaluation |
| ScratchNodesML | Block-based | Gesture recognition | Data management, Model evaluation, Model deployment |
| SnAIp | Block-based | Game agent | Model learning, Model deployment |
Fig. 5 Frequency of supported ML tasks
Fig. 7 Frequency of supported data types and input options
Characteristics concerning data
| Name | Types of data | Input options | Availability of datasets ready to use |
|---|---|---|---|
| AlpacaML | Motion | Wearable sensor | – |
| BlockWiSARD | Image | File upload, webcam | – |
| Cognimates | Image, Text | File upload | – |
| DeepScratch | Image, Text | Webcam | Iris and MNIST |
| eCraft2learn | Image, pose, sound | File upload, webcam, microphone | – |
| Educational Approach to ML with Mobile Applications | Image, video stream | Webcam | – |
| Google TM | Image, pose, sound | File upload, webcam, microphone | Initial Teachable Machines datasets (e.g., cat-dog dataset) |
| LearningML | Image, sound, text | File upload, keyboard, webcam, microphone | – |
| mblock | Image | Webcam | – |
| Milo | Numbers, text | File upload | Few popular datasets used in introductory ML courses, like the Iris dataset |
| ML4K | Image, sound, text | Upload from a browser, webcam, microphone | – |
| Orange | Image, text, network graph, gene expression, time series, mosaic spectral image, SQL table | File upload | 65 popular datasets from UCI ML Repository and other sources |
| PIC | Image | File upload, webcam | – |
| RapidMiner | Numbers, Text | Instant connection to diverse data sources | – |
| ScratchNodesML | Motion | Via Bluetooth from a physical hardware device | – |
| SnAIp | Game actions | Snap! | – |
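Several tools ship with ready-to-use datasets, e.g., Iris in DeepScratch, Milo, and Orange. As a hedged illustration of what such a built-in dataset contains, the sketch below loads Iris via scikit-learn, a stand-in for the tools' own loaders:

```python
# Loading the Iris dataset named in the table above; scikit-learn is an
# illustrative stand-in for the surveyed tools' built-in dataset loaders.
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
print(X.shape)            # (150, 4): 150 labeled samples, 4 numeric features
print(set(y.tolist()))    # {0, 1, 2}: three flower species for supervised learning
```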
Characteristics concerning ML model and learning
| Name | ML algorithms/backend | Model parameters | Types of learning | Training parameters |
|---|---|---|---|---|
| AlpacaML | DTW algorithm | – | Supervised | – |
| BlockWiSARD | WiSARD | – | Supervised | – |
| Cognimates | IBM Watson SDK and API for custom classifiers, uClassify/Clarifai | – | Supervised, reinforcement | – |
| DeepScratch | Dense, RNN, and CNN models or pre-trained models offered by TensorFlow.js | Quantity of layers | Supervised | Epochs, batch size |
| eCraft2learn | Pretrained cloud models, built-in browser support, IBM Watson | Model creation defining layers/neurons, optimization method, loss function | Supervised | Training iterations, learning rate, validation split, data shuffle |
| Educational Approach to ML with Mobile Applications | TensorFlow.js | – | Supervised | – |
| Google TM | TensorFlow.js | – | Supervised | Epochs, batch size, learning rate |
| LearningML | TensorFlow.js | – | Supervised | – |
| mblock | NI | – | Supervised | – |
| Milo | TensorFlow.js | Number of features, type of layer connections, number of nodes, activation function, optimizer function | Supervised and unsupervised | Learning rate, loss function, training metrics, iterations |
| ML4K | IBM Watson | – | Supervised and reinforcement learning | – |
| Orange | Diverse ML algorithms, including naive Bayesian classifier, k-nearest neighbors, induction of rules and trees, support vector machines, neural networks, … | Diverse parameters depending on the respective ML algorithm | Supervised and unsupervised | Learning rate, epochs, batch size, among others |
| PIC | TensorFlow.js | Model (MobileNet or SqueezeNet), model type (convolution, flatten), and quantity of layers | Supervised | Learning rate, epochs, training data fraction, optimizer |
| RapidMiner | Hundreds of ML algorithms | Activation function, number and size of hidden layers, gradient descent method | Supervised, unsupervised | Epochs, number of training data rows to be processed per iteration, learning rate, learning rate decay, momentum, loss function, … |
| ScratchNodesML | 1NN-DTW algorithm | – | Supervised | – |
| SnAIp | Q-Learning algorithm | – | Reinforcement learning | Learning rate, discount factor, exploration rate, available actions |
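The model and training parameters in the table (layers, activation and loss functions, optimizer, learning rate, epochs, batch size, validation split, shuffling) map directly onto a backend such as TensorFlow. The Keras sketch below is an assumed illustration of those knobs, not code from any surveyed tool; the dataset and architecture are placeholders.

```python
# Illustrative mapping of the table's model/training parameters onto Keras;
# dataset and architecture are assumptions for the sketch.
import tensorflow as tf
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# Model parameters: layer types/sizes and activation functions.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Optimization method, learning rate, and loss function.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Training parameters: epochs, batch size, validation split, data shuffling.
model.fit(X, y, epochs=20, batch_size=32, validation_split=0.2, shuffle=True)
```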
Characteristics concerning evaluation
| Name | Evaluation metrics | Dataset splitting/testing |
|---|---|---|
| AlpacaML | Testing with new actions, indicating their label | After model training, students can record new actions for testing |
| BlockWiSARD | – | – |
| Cognimates | – | After model training, students can add new images for testing |
| DeepScratch | Training loss and accuracy, testing accuracy, object correctness/confidence level | As the model trains, the user can observe how the accuracy and loss values are optimized after each epoch. Once training is done, the accuracy on the testing data is made available to the user |
| eCraft2learn | Training loss, accuracy, duration | Manual splitting by user |
| Educational Approach to ML with Mobile Applications | Confidence level shown as black boxes under a class | – |
| Google TM | Training accuracy and loss function, accuracy per class category | Fixed split separating 15% as test data |
| LearningML | Testing with new images, indicating their label together with the degree of confidence. In advanced mode, a set of gauge bars is added to the label, indicating the numerical probability expressed as a percentage, as well as a confusion matrix | – |
| mblock | Testing indicating the label with the highest confidence level | After model training, students can test new images |
| Milo | Accuracy plot, Loss plot | – |
| ML4K | – | – |
| Orange | Classifier metrics: area under ROC, accuracy, F1, precision, recall, specificity, LogLoss, confusion matrix. Regression metrics: MSE, RMSE, MAE, R2, CVRMSE. Graphical metrics: ROC analysis, lift curve, calibration plot | Manual splitting by proportion or by number of instances, as well as cross-validation partitions |
| PIC | Prediction results per image, correctness table, confidence graph | After model training, students can add new images for testing in the same way as for the training step |
| RapidMiner | Precision and Recall, Root Mean Square Error, Average Absolute Error, Average Relative Error, Squared Correlation | – |
| ScratchNodesML | – | – |
| SnAIp | – | – |
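The metrics the tools report can be reproduced in a few lines; the sketch below uses scikit-learn as an illustrative stand-in for the tools' built-in evaluation views, with hypothetical labels:

```python
# Computing the evaluation metrics listed above (accuracy, precision, recall,
# F1, confusion matrix) with scikit-learn; labels are hypothetical.
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             precision_score, recall_score)

y_true = [0, 0, 1, 1, 1, 2, 2, 2]  # hypothetical ground-truth labels
y_pred = [0, 1, 1, 1, 2, 2, 2, 2]  # hypothetical model predictions

print("accuracy: ", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro"))
print("recall:   ", recall_score(y_true, y_pred, average="macro"))
print("F1:       ", f1_score(y_true, y_pred, average="macro"))
print(confusion_matrix(y_true, y_pred))  # rows: true class, columns: predicted
```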
Fig. 8 Frequency of evaluation metrics
Fig. 9 Examples of support for the evaluation of the ML models
Fig. 10 Example of block-based deployment support (PIC and Cognimates)
Characteristics concerning the deployment platform
| Name | Export only | Deployment platform/extension | Provided blocks |
|---|---|---|---|
| AlpacaML | – | Scratch extension | – |
| BlockWiSARD | – | BlockWiSARD | 5 blocks for model development |
| Cognimates | – | Scratch extension | 12 vision training blocks, 9 text training blocks, 4 sentiment blocks, 11 input/transformation blocks |
| DeepScratch | – | Scratch extension | 119 blocks |
| eCraft2learn | – | Snap! | Diverse blocks for several ML tasks |
| Educational Approach to ML with Mobile Applications | – | App Inventor extension | Diverse block extensions |
| Google TM | TensorFlow.js | – | – |
| LearningML | – | Scratch extension | 3 text recognition blocks, 6 image recognition blocks, 5 sound recognition, 3 number recognition blocks, 4 general ML blocks |
| mblock | – | mblock extension | 3 image recognition blocks |
| Milo | Python code | – | – |
| ML4K | – | Mostly Scratch, some for App Inventor and Python | – |
| Orange | Models in Python pickle format | – | – |
| PIC | – | App Inventor extension PersonalImageClassifier.aix | The PIC extension has three properties in the designer and 11 blocks in the blocks editor |
| RapidMiner | Python code | RapidMiner proprietary blocks | Users can define new blocks with configurable functions |
| ScratchNodesML | – | Scratch extension | – |
| SnAIp | – | Snap! extension | Diverse block extensions |
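For tools classified as "export only", deployment means serializing the trained model and loading it elsewhere; Orange, for instance, exports models in Python pickle format. The generic sketch below uses a scikit-learn model as a stand-in for a tool-trained model; the file name is an assumption.

```python
# "Export only" deployment sketch: serialize a trained model with pickle
# (the format Orange exports) and load it elsewhere for predictions.
# The scikit-learn model and file name are assumptions for illustration.
import pickle
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
model = DecisionTreeClassifier().fit(X, y)  # stand-in for a tool-trained model

with open("model.pkl", "wb") as f:          # export the trained model
    pickle.dump(model, f)

with open("model.pkl", "rb") as f:          # "deployment": load it elsewhere
    deployed = pickle.load(f)
print(deployed.predict([[5.1, 3.5, 1.4, 0.2]]))  # classify a new sample
```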
Information regarding the development of the tools
| Name | Scientific methodology | Code available | Code license |
|---|---|---|---|
| AlpacaML | – | – | – |
| BlockWiSARD | Based on constructivism, constructionism, and knowledge building, as well as on intelligent agents, the learning process, and the perception of intelligence | – | – |
| Cognimates | Participative design with codelab sprites | Yes | Open-source |
| DeepScratch | The incremental agile model was used to develop the DeepScratch extension | Yes | – |
| eCraft2learn | – | Yes | BSD |
| Educational Approach to ML with Mobile Applications | – | – | – |
| Google TM | – | Yes | Apache License 2.0 |
| LearningML | – | – | GNU General Public License |
| mblock | – | – | – |
| Milo | – | Yes | Apache License 2.0 |
| ML4K | – | Yes | Apache License 2.0 |
| Orange | – | Yes | GNU GPL 3.0 |
| PIC | – | Yes | Apache License 2.0 |
| RapidMiner | – | – | Proprietary |
| ScratchNodesML | – | – | – |
| SnAIp | – | – | – |
Information regarding the evaluation of the tools
| Name | Evaluated quality factors | Research design | Sample size | Application context | Findings |
|---|---|---|---|---|---|
| AlpacaML | How young people could use AlpacaML to build, test, and refine models of athletic skills that they wish to improve | Case study | 6 | Students aged 8–14 years who had experience with Scratch | Leverages the students' domain knowledge to collect data, build models, and test and evaluate models; allows them to conduct rapid iterations to test hypotheses about model performance and reformulate their models; allows students to develop theories about how the model works and about the characteristics of a good model |
| BlockWiSARD | – | – | – | – | – |
| Cognimates | Impact of tool design and of factors not directly related to the tool (e.g., how children imagine AI in the future, how their cultural and socio-economic context influences their perception of smart technologies, what role parents and technology access play) | Series of case studies | 102 | Children (7–12 years old) from schools in 4 different countries | The children developed a rich grasp of AI concepts through play and coding with the platform. They also became more skeptical of the agents' smarts and truthfulness, even if they continued to perceive them as friendly and exciting |
| DeepScratch | Usability in terms of effectiveness, efficiency, and satisfaction | Case study | 5 | Children | All users completed the two tasks successfully. The usability test indicates that DeepScratch is efficient in supporting users in achieving their goals and tasks in minimal time. In terms of satisfaction, all participants reported that the applications were very easy to create and supplemented their understanding of deep learning models. However, three students raised the need to translate the blocks into Arabic |
| eCraft2learn | Student understanding of AI and of agent environment, perception, and action, as well as student attention, engagement, and enjoyment, and the usability of the Snap! environment | Case study | 40 | Senior high school and vocational students | 77.5% of the students indicated that they understand AI. All students but one enjoyed the learning process. More than 40% of the students stated that it was easy. 82.5% of the students were interested and motivated to make the AI program using Snap! |
| Educational Approach to ML with Mobile Applications | ML learning and interest | Pre-/post-test case study | 10 | High-school students | The extensions proved beneficial for grasping the concepts introduced in the first part of the class |
| Google TM | ML learning | Informal | NI | Middle-school students | Results indicate that Google TM has been useful to introduce ML concepts, allowing an understanding of concepts like bias and fairness. Google TM also seems to facilitate active learning of AI concepts by letting students interact with those concepts by making models themselves |
| LearningML | Learning of AI knowledge | Pre/post-test case study | 14 | Adult students with prior programming experience but no previous training on AI | The results of the intervention seem very promising, especially taking into account that this effect was experienced after just 2 h of treatment, which highlights the impact that LearningML had on the learners and draws attention to its potential as a tool to foster AI and ML knowledge |
| mblock | – | – | – | – | – |
| Milo | Usefulness and ease of use of the tool, along with the perceived level of understanding of ML concepts | Pre/post-test case study | 20 | Undergraduate students from a first introductory course in ML | 90% of participants reported that visualizations were very easy to create using Milo and supplemented their understanding of the concepts. 70% of students felt the tool would be very useful for novice learners |
| ML4K | – | – | – | – | – |
| Orange | – | – | – | – | – |
| Personal Image Classifier | Usability of the PIC tools and their effectiveness in introducing novices to ML | Pre-/post-test case study | 23 | Workshops with high-school students | The students enjoyed using the PIC interface and the Expressions Match app, indicating that the tools were intuitive and fun to use. Students were able to use the analysis tools to develop reasoning about how their models were behaving. The way the tool provides visual representations of data enabled guided discussions about dataset imbalance and how it can lead to what appears to be a biased model |
| RapidMiner | Learning of ML concepts, fun/engagement, awareness/attitude towards ML | Pre-/post-test case study | 84 | Middle school students | Based on the results, ML can be used as a powerful tool to successfully conduct interdisciplinary education at the middle school level. Students combined knowledge, observations, and teamwork efforts to achieve the goal of using the ML model for prediction. Students had more fun, engagement, and hands-on interactivity in the workshop compared to their regular classroom, even though the topic of AI is much more complex and challenging |
| ScratchNodesML | – | – | – | – | – |
| SnAIp | – | – | – | – | – |