I've taught certificate and diploma level network administration courses for several years in a TAFE college, the Australian equivalent of the North American community college. A common challenge in this role is the collection of consistent, accurate and compliant evidence to assess complex network configuration tasks.
One tried and true method is the "Screenshot into Word": students collect a sequence of screenshots as they perform each stage of a task, then assemble those screenshots along with captions and other relevant material (tables, diagrams) into a word processor document. The result is a kind of HOWTO document, usually an accurate enough record both of what occurred and of the student's understanding of each step.
Problems with this method:
- Assumes an outdated approach to system and network administration where administrators sit in front of computer displays all day hand-configuring individual devices, whether on the command line or in a GUI.
- Students spend a lot of time processing images and fiddling with word-processor document formatting, neither of which has anything to do with the task at hand. Not optimal unless the unit of study is actually about producing documentation or desktop publishing.
- Assessment becomes as much an English test as anything else. Different levels of English literacy produce different results, on the surface at least. Assessors can get bogged down trying to extract actual skills and knowledge from the prose provided.
- Difficult to repeat, reproduce or modify the task. If this needs to be done for any reason, the entire manual process must be repeated.
- Difficult to automate assessment when assessment data is locked up inside a word-processor document, sometimes with unpredictable formatting.
A promising avenue to avoid these problems and enable more authentic, equitable and consistent assessment is to require that lab procedures be documented and submitted in the form of scripts and configuration files. In other words, a "Show Me the Code" approach to assessment.
Scripted automation is fast becoming the norm in the IT industry, and there is now a plethora of tools commonly used to do this. These include scripting languages such as Python, Bash and PowerShell, and configuration engines such as Ansible, Terraform, CloudFormation, DSC and others. The more we can get our students automating network administration tasks in code, the better we are preparing them for a future career in IT.
By capturing procedures in code rather than word-processor documents, and submitting that code for assessment, students can succinctly and accurately detail every step performed in a complex configuration sequence. The code may require additional internal comments (also an authentic industry practice), but rather than commenting on the content of a screenshot students are now commenting on the how and why of a particular command or function in a script.
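As a sketch of what such a submission might look like, here is a small Python script for a hypothetical lab step (the hostnames and addresses are invented for illustration). Note that the comments explain the how and why of each step, not the contents of a screenshot:

```python
# Hypothetical lab step: define static host entries for a small lab subnet.
# Keeping the data in one dictionary makes the topology obvious at a glance.

LAB_HOSTS = {
    "gateway": "192.168.10.1",      # default route for the lab subnet
    "fileserver": "192.168.10.10",  # SMB shares for student home directories
    "printer": "192.168.10.20",     # static so clients always resolve it
}

def hosts_file_entries(hosts):
    """Render name/address pairs as /etc/hosts-style lines, sorted by name."""
    return [f"{addr}\t{name}" for name, addr in sorted(hosts.items())]

if __name__ == "__main__":
    for line in hosts_file_entries(LAB_HOSTS):
        print(line)
```

A marker reading this sees the whole configuration intent in a dozen lines, with no image processing or document formatting in sight.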
Scripts and configuration files are much more amenable to automated assessment. Automated tests can be run against student-built infrastructure to determine if it has been configured correctly.
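A minimal sketch of such an automated check, assuming the assessor wants to verify that a submitted resolver configuration names the expected DNS server (the file format and expected address are invented for illustration):

```python
import re

def check_dns_configured(config_text, expected_server="192.168.10.1"):
    """Return True if any line of the config names the expected DNS server."""
    pattern = rf"^nameserver\s+{re.escape(expected_server)}\s*$"
    return any(re.match(pattern, line) for line in config_text.splitlines())
```

A battery of checks like this can be run against every submission in seconds, giving each student the same objective feedback.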
This is not to say that students should not initially learn manual processes. They should, especially when new to a task. This way, they also learn how time-consuming and error prone manual configuration can be, and why automation is a good thing.
Having students first perform a task manually, then convert that process into an automated script not only reflects authentic industry best-practice, but probably also goes a long way to satisfying the assessment requirement for sufficiency (completing a task more than once in different contexts).
Another common performance criterion in many TAFE college IT units is the "test-debug-modify" cycle. Having procedures embodied in a script is a huge benefit in this regard. Scripts can be edited incrementally to debug and refine specific procedures, without students having to remember and repeat an entire sequence of steps (error prone and time wasting).
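To sketch why this matters, consider a script where the whole lab topology is driven by a single variable (the subnet and hostnames below are invented for illustration). Modifying one line and re-running replaces repeating an entire manual sequence:

```python
# The whole lab addressing plan hangs off one variable. To retarget the
# lab, a student edits SUBNET and re-runs, then re-tests, rather than
# re-typing every address by hand.
SUBNET = "192.168.20"

def lab_addresses(subnet, hosts=("gateway", "dns", "web")):
    """Assign sequential host addresses in the subnet, starting at .1."""
    return {name: f"{subnet}.{i}" for i, name in enumerate(hosts, start=1)}

if __name__ == "__main__":
    for name, addr in lab_addresses(SUBNET).items():
        print(f"{name}: {addr}")
```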
Assessing scripts and configuration files for content and structure greatly reduces any English literacy effects. Code is code is code, regardless of the language of the coder. Students are on a more level playing field when the primary focus is on what is in the code.
A focus on code as documentation and assessment can be integrated into a broader Investigate Demonstrate Automate learning cycle based on projects which solve real world problems:
Investigate: Research the task, the technology and the options.
Demonstrate: Build a working solution.
Automate: Implement the solution in code.
- The Investigate phase is about gaining content knowledge and the vocabulary to express it.
- It broadly corresponds to the Knowledge Evidence component of the assessment requirements of each unit of competency.
- Activities in this phase include reading websites and books, viewing videos, completing online tutorials, and class discussions.
- Assessment in this phase includes content-related questions (verbal and written), quizzes and formal tests. Success in the two subsequent phases depends on learning in this phase.
- The Demonstrate phase is about applying knowledge to solving a specific task (which may involve many smaller tasks).
- It broadly corresponds to the Performance Evidence component of the assessment requirements of each unit of competency.
- Activities in this phase are practical and hands-on, whether in a physical or virtual environment. Students apply knowledge, experiment, test, modify, assemble and configure practical solutions to specific problems in specific contexts.
- Assessment in this phase includes observation check-lists, and evidence of specific tasks being successfully completed such as screenshots.
- The Automate phase is about re-implementing practical solutions in code.
- This phase emphasises reliability, repeatability and sufficiency (performing a task more than once in different contexts). It is also about efficiently managing the "test-debug-modify" cycle that is explicitly required in many units.
- Activities in this phase include writing, testing and commenting scripts (short pieces of code designed to simplify or automate a system or network administration task) using a common industry-standard scripting language. Well documented and commented scripts elegantly and succinctly combine both knowledge evidence and performance evidence.
- Assessment in this phase is based on the scripts or code presented.
The Demonstrate and Automate phases can sometimes overlap or be delivered concurrently. This way, practical tasks can be "codified" in scripts as each step is completed, allowing direct comparison between manual and automated procedures. Either way, students learn that tasks performed laboriously and in an error-prone manner in a GUI or on a CLI can usually be re-implemented with a few crisp lines of code in a script.
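A classic illustration of that comparison is bulk account creation. Created one at a time in a GUI, each account takes a minute of clicking; as a script, the whole batch is a short loop. The sketch below only prints the commands rather than running them, and the usernames are invented for illustration:

```python
# Sketch: re-implementing a repetitive GUI task (creating lab user
# accounts) as a loop. Printing the commands keeps the example safe
# to run anywhere; a real script would execute them instead.
STUDENTS = ["alice", "bob", "carol"]

def useradd_commands(users, home_base="/home"):
    """Build a useradd command line for each user, with a home directory."""
    return [f"useradd -m -d {home_base}/{u} {u}" for u in users]

if __name__ == "__main__":
    print("\n".join(useradd_commands(STUDENTS)))
```

Adding a fourth student is a one-word change, which is exactly the lesson the Automate phase is trying to teach.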
This approach is not particularly ground-breaking or revolutionary in an IT industry context. It mostly reflects the reality of how problems are solved in the modern IT industry. What it does bring to more traditional TAFE college training and assessment practices is the notion of code as documentation and assessment.