Make programmatic assessment a breeze!

Learn more about programmatic assessment, a form of holistic assessment, in combination with our e-portfolio Portflow and a Virtual Learning Environment (VLE) or Learning Management System (LMS).

Expertise page: programmatic assessment


At Drieam, we believe that taking responsibility for one’s own learning is an important condition for preparing for the rapidly changing future of which lifelong learning is an integral part. The best way to learn to take responsibility is by taking ownership of one’s own learning as much as possible. Our tool, a student-driven development portfolio called Portflow, is designed to increase ownership and foster agency among students by giving them insight into their own development.

Within higher education, there is increasing interest in a more holistic approach to assessment that closely matches Portflow’s vision: programmatic assessment. Programmatic assessment focuses on the student’s learning process. Tests are part of the learning process and are used to guide, stimulate, and provide insight into the student’s own progress (Peeters, 2019). On this page, you will learn more about Portflow and how it can support programmatic assessment, a form of holistic assessment, when combined with a Virtual Learning Environment (VLE).

Programmatic assessment in Portflow and a VLE

Programmatic assessment does not follow a singular approach, as it is a concept rather than a prescriptive method. Thus, there is no step-by-step guide for implementing programmatic assessment (Baartman et al., 2020). Similarly, there is no one “right” way to use Portflow in the context of programmatic assessment. Multiple approaches exist for integrating programmatic assessment with tools like a digital portfolio and a VLE. To understand how Portflow in combination with a VLE can support programmatic assessment, it’s essential to first discuss the principles of programmatic assessment and the components of Portflow.

The principles of programmatic assessment

The principles of programmatic assessment (Baartman et al., 2020) are as follows:

1. Insight into student development occurs through a mix of different data points;

2. Each data point is feedback-oriented and carries no pass/fail decision;

3. Learning outcomes are the backbone of the assessment program;

4. There is a constant dialogue about using feedback for self-direction;

5. The number of data points and the severity of the decision are proportionally related;

6. The severity of a decision guides the amount of assessor expertise required.

The components of Portflow

In Portflow, students (under their own direction) can visualize their developmental progress. They are accountable for collecting evidence, soliciting feedback from teachers, fellow students, and experts, and recording progress evaluations. Teachers and programs can offer students structured templates for different parts of the portfolio. In a template, the program can define sections, collections, and goals. This allows the program to provide scaffolding, such as setting specific goals in the template for the first and second years while allowing more independence in the third year. The evidence (data points) is added by the student, who can also extend the structure of the template. The components (sections, collections, goals) in Portflow are adaptable for each course, depending on the educational program:
  • Section: a set of collections (year, semester, project, minor, subject)
  • Collection: where learning takes place (course, module, project, internship)
  • Goal: what the student works toward (competency, learning objective, skill, learning outcome)
  • Evidence: artefacts, reports, products, reflections (data points)
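The hierarchy above can be sketched as a simple data model. This is a hypothetical illustration of how the components relate, not Portflow’s actual implementation; all class and field names are our own:

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    """A single data point: an artefact, report, product, or reflection."""
    title: str
    feedback: list[str] = field(default_factory=list)

@dataclass
class Goal:
    """What the student works toward: a competency, objective, or skill."""
    name: str
    evidence: list[Evidence] = field(default_factory=list)

@dataclass
class Collection:
    """Where learning takes place: a course, module, project, or internship."""
    name: str
    goals: list[Goal] = field(default_factory=list)

@dataclass
class Section:
    """A set of collections: a year, semester, project, minor, or subject."""
    name: str
    collections: list[Collection] = field(default_factory=list)

# A minimal portfolio: one section, one collection, one goal, one piece of evidence.
portfolio = Section(
    name="Year 1",
    collections=[Collection(
        name="Internship",
        goals=[Goal(name="Communication",
                    evidence=[Evidence(title="Client report")])],
    )],
)
```

For simplicity, this sketch nests evidence directly under a goal; in practice, a piece of evidence lives in a collection and is *linked* to one or more goals.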

The integration of programmatic assessment in Portflow and the VLE

Below we illustrate how the various principles of programmatic assessment are supported by Portflow and the VLE.

In programmatic assessment, student development is tracked through data points. Completed assignments, professional products, presentations, and feedback from fellow students, teachers, or clients all serve as information about the learning process. Students incorporate feedback at each data point; in doing so, they learn to reflect on their learning process and to formulate what next steps are needed (Baartman et al., 2020). In Portflow, data points are added as evidence. Portflow supports a rich mix of data points, which not only stimulates students’ creativity but also encourages them to take responsibility for their own learning. Students can decide which piece of evidence best demonstrates a particular learning outcome. The pieces of evidence are linked to specific goals. Data points can be imported from a VLE, including associated feedback, or placed directly into Portflow. A student can request feedback in Portflow from all those involved in the learning process (e.g., teachers, fellow students, and external parties).

In programmatic assessment, no pass/fail decision is made based on a single data point. Thus, no credits are associated with a single data point (Baartman et al., 2020). This stimulates the learning process by allowing students to make mistakes and make the most of feedback (Platform Leren van Toetsen, 2024). Portflow supports this by enabling students to receive detailed feedback at the evidence level and engage in a dialogue about it. Furthermore, new versions of evidence can be created, with the complete feedback dialogue for each version remaining visible. This encourages students to incorporate feedback into improved versions and learn from the feedback they receive. The student can then link the given feedback to a specific goal. In the goal chart, the feedback is clearly displayed and easily accessible.

In programmatic assessment, a framework of competencies, skills, professional tasks, or learning outcomes forms the backbone of both the curriculum and the testing program. This clarifies for both students and teachers how all assignments, feedback, etc. contribute to the student’s learning process and learning outcomes (Baartman et al., 2020).

Portflow allows students to create such a framework: students create sections, collections, and goals, and link evidence to the goals. To help students do this, a program can use templates, which give the student a structure in advance. The template, based on competencies, skills, professional tasks, or learning outcomes (a goal in Portflow), outlines the information to be included in the portfolio. In a goal description, a program can, for example, indicate what kind of evidence it expects for that goal.

Feedback on a particular goal can be carried into subsequent pieces of evidence, so the focus is not only on the piece of evidence itself but also on broader competency development. Students can then apply this feedback to later assignments related to that goal (the competency, skill, or learning outcome), since those assignments are part of the same overarching structure in which the competency recurs.

A prerequisite for students to learn from feedback is an ongoing dialogue with them (Baartman et al., 2020). The goal chart can serve as a starting point for progress evaluation: it makes all of the student’s development, including quantitative and qualitative feedback, visible. This insight is reinforced by the growth chart and timeline. Because feedback can be given in different places (on evidence, collections, and goals) and all of this feedback comes together in the goal chart, a complete feedback loop is created.

It is important to guide students in developing self-direction. In Portflow, the progress evaluation feature supports coaching conversations with students. Evaluations provide both quantitative and qualitative feedback. They can be used for self-evaluation and can be completed by a teacher, fellow student, or external party. Together with the goal chart, they provide a solid foundation for coaching conversations, such as during the intermediate-stakes phase of programmatic assessment, where the complete portfolio and the data points collected so far are evaluated.

In a progress evaluation, feedback (qualitative and/or quantitative) is given on a goal rather than on one specific data point. This allows the student and teacher to see how a student is progressing on a particular goal. These progress evaluations ultimately contribute to forming the final decision (high-stakes). The focus then shifts from a specific assignment to an overarching learning outcome, competency, or skill (a goal in Portflow).

Programmatic assessment distinguishes between low-stakes data points, intermediate-stakes evaluations, and high-stakes decisions. Low-stakes moments can take place in Portflow at the evidence level, where the student receives feedback on a specific piece of evidence. Here, the stakes are low, allowing the student to make mistakes and learn from them.

During the intermediate-stakes phase, progress evaluation can be deployed in Portflow. In a progress evaluation, the stakes are somewhat higher: it looks at what is going well, what can be improved, and which direction the development is taking (Baartman et al., 2020).

At the high-stakes moment, the final decision takes place: a pass/fail decision is made. Such a high-stakes moment can take place in the VLE or an external assessment tool. Prior to it, a student takes a snapshot of the entire portfolio or a part of it. A snapshot freezes the portfolio at a specific point in time. This frozen version can then be submitted, after which the final decision is made on the portfolio. In the meantime, the student can continue to work on evidence in the unfrozen “live” portfolio.
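The snapshot mechanism can be illustrated with a simple copy-on-freeze sketch. This is a hypothetical illustration of the concept only, not Portflow’s implementation; the `snapshot` function and the toy portfolio structure are our own:

```python
import copy

# A toy "live" portfolio: goals mapped to lists of evidence titles.
live_portfolio = {"Communication": ["Client report v1"]}

def snapshot(portfolio):
    """Freeze the portfolio at this moment by deep-copying it.

    The frozen copy can be submitted for the high-stakes decision,
    while the student keeps working in the live portfolio.
    """
    return copy.deepcopy(portfolio)

frozen = snapshot(live_portfolio)

# The student continues working in the live portfolio...
live_portfolio["Communication"].append("Client report v2")

# ...while the submitted snapshot remains unchanged.
print(frozen["Communication"])          # ['Client report v1']
print(live_portfolio["Communication"])  # ['Client report v1', 'Client report v2']
```

The key design point is that the frozen copy shares nothing with the live data, so later edits cannot alter what the reviewers see.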

A final decision is best made by an independent group of evaluators (Baartman et al., 2020). Because a frozen version of the portfolio can be created and submitted separately, the student does not need to give the reviewers access to the “live” portfolio. The decision can thus take place entirely independently. 

A concrete example

Now that we have seen how Portflow can support programmatic assessment, we can illustrate this with an example. As indicated earlier, programmatic assessment is not a strict set of steps but a concept, so there are many possible approaches. This is just one of them:

Weeks 1-3: Low-stakes

  • The course works with templates and gives students a structure in advance;
  • The student collects data points both in the VLE and in Portflow itself;
  • The student imports the data points and feedback from the VLE into Portflow and processes any additional feedback;
  • The student links the data points to the relevant goals (learning outcomes, competencies, etc.).

Week 3/4: Intermediate-stakes 1

  • The student works with self-evaluations and additionally requests progress evaluations for the intermediate-stakes from their coach or an expert.

Weeks 4-6: Low-stakes

  • Based on these progress evaluations, the student continues to work on new data points and collects new feedback.

Week 6/7: Intermediate-stakes 2

  • A second self-assessment and progress evaluation follows.

Weeks 7-9: Low-stakes

  • The student collects further data points and feedback following the intermediate-stakes moment.

Week 10: High-stakes

  • A decision moment, the high-stakes, follows, based on the progress evaluations and the goal charts with collected feedback. The decision moment takes place in the VLE, where a snapshot (a frozen version of the portfolio) is reviewed by the program’s designated reviewers.

Are you ready to get started with programmatic assessment in Portflow and a VLE?

By linking the principles of programmatic assessment to the functionalities in Portflow, we have shown how Portflow, in combination with a VLE, supports this form of holistic assessment. Curious about how Portflow works in practice? Book a demo and discover what Portflow can do for your education.