Optimizing Student Assessment:

Bridging the Gap Between Traditional and Online Teaching

EdTech

Web app

Complex system

B2B

User Research & Pain Points:

Uncovering online teaching challenges

While online education has made learning more accessible than ever, it still presents unique challenges, especially in fields that rely on hands-on practice and subjective evaluation. Current platforms focus heavily on course content delivery but lack effective tools for assessing students' progress beyond standard quizzes and assignments.

I conducted AI user interviews with five teachers from various fields, specifically focusing on subjects that are not naturally suited for remote learning. All of the teachers expressed difficulty in understanding where their online students stood in terms of the material - how well they were able to apply what they learned and what their process looked like, rather than just seeing the submitted final result.


"During in-person workshops, I can observe and correct mistakes immediately"

"Students' progress is limited by ability to provide feedback"

"Quizzes and assignments only scratch the surface"

Many teaching platforms rely on final submissions, leaving educators without insight into students’ learning process or creative decisions along the way. This is particularly problematic in subjects where technical execution and creative choices matter just as much as the end result.

Competitive Analysis

Exploring existing online teaching assessments

Comparing four major competitors, I found that most now utilize AI to streamline course setup, with some also leveraging it for basic quiz creation.

[Comparison matrix of assessment types across LearnWorlds, Teachable, Thinkific, and Kajabi. Question types compared: multiple choice, multiple selection, file upload assignment, open-ended, short answer quiz, true/false, ordering questions, matching questions, fill in the blank, and student reflection. Legend: available / generated by AI / generated and graded by AI.]

But how far can true/false questions and the like get you when you're trying to learn how to cook? Or how to paint?

For the sake of this case study, I chose a field that isn't completely sensory but still presents a challenge: photography - where the final result can be analyzed digitally, but the creative and technical process behind it remains unseen.


Ideation & Use Case:

Integrating AI-powered feedback with instructor refinement

Recognizing the gap in student assessment, I aimed to develop an approach that balances innovation with practicality—ensuring more insightful evaluations without increasing workload for instructors.

I researched what AI assessment methods could work for the photography discipline. After sorting through AI suggestions, I realized that many could be incorporated to analyze students' submitted photos, while some could even serve as new standalone test formats. I brought all of these possibilities together under an elaborate evaluation framework called Smart Assessment.

For example, a photography instructor using an existing online platform might struggle to evaluate a student's thought process and technical execution beyond the final image submission. With Smart Assessment, the evaluation of submitted photos would be split into two steps:


Step 1

AI-Assisted Review

The platform automatically analyzes elements such as composition, exposure, and adherence to specific technical guidelines. This process can be iterative—students receive immediate technical feedback and have the opportunity to refine their submission before presenting it to the instructor.
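The iterative technical check described above can be sketched in a few lines. This is a minimal, hypothetical illustration of one such check (exposure), not the platform's actual implementation; the function name, thresholds, and feedback strings are all assumptions made for the example.

```python
# Illustrative sketch of a Step 1 technical check: flag a submission as
# under- or overexposed from its grayscale pixel values (0-255).
# Thresholds and wording are hypothetical, chosen only for this example.

def exposure_feedback(pixels, low=60, high=195):
    """Return immediate feedback a student could act on before resubmitting."""
    mean = sum(pixels) / len(pixels)
    if mean < low:
        return "underexposed: try a longer exposure or a wider aperture"
    if mean > high:
        return "overexposed: lower the ISO or stop down"
    return "exposure within range"

dark_photo = [30] * 100  # simulated pixel data for a very dark frame
print(exposure_feedback(dark_photo))
```

In the actual flow, feedback like this would be returned instantly after each attempt, letting the student refine the shot before the instructor ever sees it.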

Step 2

Instructor Feedback

The instructor then evaluates the final submission, providing feedback on creative choices using annotation tools for guidance. Key areas can be highlighted to help students improve. Previous attempts remain visible to assess students' learning progress.

This method ensures a balanced, efficient assessment system that enhances online learning without burdening educators.

Sketches & High Fidelity Screens:

Design iterations towards high-fidelity

Before diving into high-fidelity designs, I collected screenshots of UI inspiration for each of my screens, then sketched multiple low-fidelity wireframes to quickly explore layout ideas. This helped me iterate efficiently and validate key interactions before committing to digital wireframes.


Course > Settings

A form that enables educators to effortlessly customize key aspects of their courses. Users can define basic course details, choose the preferred content delivery format, and design a personalized course certificate to enhance the learning experience.


Course > Curriculum

An intuitive drag-and-drop canvas interface that lets educators design and structure their courses with ease. This dynamic tool allows users to arrange content blocks, organize lessons, and customize the learning experience. To enhance workspace efficiency, the side navigation is minimized.


Smart Assessment > Questions (empty state)

A space where educators can add questions to build a multi-step assessment. They are provided with question types specifically designed for photography, alongside standard options to incorporate into the Smart Assessment process.


Smart Assessment > Questions

In this use case, the assessment consists of two questions. The first question is an "Interactive image analysis", and the second, currently displayed, is a "File upload task".

The file upload task can be graded entirely by AI, by the instructor, or through a combination of both. In this setup, students are given up to three automatically AI-graded attempts before submitting their final version for a deeper instructor evaluation.
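The hybrid grading setup above amounts to a simple routing rule: the first few attempts go to the AI, and the final one goes to the instructor. A toy sketch, with a hypothetical function name and constant that are not part of any real platform API:

```python
# Hypothetical routing logic for the hybrid grading setup: up to three
# AI-graded attempts, then the final version goes to the instructor.

MAX_AI_ATTEMPTS = 3

def route_submission(attempt_number):
    """Decide who reviews this attempt; names are illustrative only."""
    if attempt_number <= MAX_AI_ATTEMPTS:
        return "ai_review"        # instant technical feedback, student may retry
    return "instructor_review"    # deeper creative evaluation of the final version

print([route_submission(n) for n in range(1, 5)])
# ['ai_review', 'ai_review', 'ai_review', 'instructor_review']
```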


Smart Assessment > Submissions

A dashboard showcasing analytics on student submissions across the entire Smart Assessment, along with detailed insights for each individual question.

On screen are final file upload task submissions, awaiting instructor evaluation. AI-driven tagging highlights potential issues, enabling educators to provide precise feedback and streamline the assessment process.


Smart Assessment > Review

On this screen, instructors can annotate and draw directly on students' uploaded images, adding personalized feedback. They also have access to students’ previous AI-checked attempts, allowing for a deeper understanding of the learning process and progress. This interactive approach helps instructors provide more precise guidance and support tailored to each student's development.


Main Dashboard

The home screen provides an organized overview of the user's products, including both published and drafts. Below, real-time notifications keep instructors informed about student activity, highlighting assessments awaiting review.


KPIs:

How I Would Measure the Effectiveness of Smart Assessments

To assess the effectiveness of the AI assessment framework, I would track several key performance indicators.

Task completion rate

Tracks how successfully teachers create intricate assessments, and how students navigate and complete assignments, indicating whether the AI-driven feedback supports their learning process.

Time on task

Evaluates how much time instructors spend grading and reviewing submissions, ensuring that AI automation reduces workload without compromising quality.

Customer satisfaction score

Measures the overall experience of both students and instructors, reflecting ease of use and perceived value.

Cognitive load time

Measures how long it takes users to understand and efficiently use the interface, helping determine whether the system is intuitive and accessible.
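The first two KPIs reduce to simple arithmetic over usage logs. A toy calculation with entirely made-up sample data and field names, just to make the metrics concrete:

```python
# Toy calculation of task completion rate and time on task from
# hypothetical session logs; the data and field names are illustrative.

sessions = [
    {"completed": True,  "grading_minutes": 4},
    {"completed": True,  "grading_minutes": 6},
    {"completed": False, "grading_minutes": 0},
    {"completed": True,  "grading_minutes": 5},
]

completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
avg_time_on_task = (
    sum(s["grading_minutes"] for s in sessions if s["completed"])
    / sum(s["completed"] for s in sessions)
)

print(f"task completion rate: {completion_rate:.0%}")   # 75%
print(f"avg grading time: {avg_time_on_task:.1f} min")  # 5.0 min
```

In practice these figures would come from platform analytics rather than hand-built lists, and would be tracked before and after introducing Smart Assessment to measure the change.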

Next Steps:

My takeaways & learnings

This case study focused on the photography discipline and its unique challenges in online teaching. With further development, this approach to remote student assessment could extend to additional fields that require innovative solutions.

Some of the assessment tools I introduced may be applicable to other disciplines, and existing AI capabilities could be further explored. As AI continues to evolve, more industries may benefit from similar advancements in assessment and learning.

Thanks for scrolling 🙌

View next project >




Shahar Birka | Product Designer


Linkedin
