UNESCO International Institute for Educational Planning x Latitudes Tech for Good Hackathon

EdVis

A data visualization tool that helps education planners more easily identify regions where learners are performing poorly, intervene there, and build sustainable, equitable education systems.

Project Overview

In January 2021, I participated in the #HackingEdPlanning Hackathon hosted by the UNESCO International Institute for Educational Planning (IIEP) and Latitudes. Working alongside data scientists, designers, and web developers, we created the data visualization tool EdVis:

ROLE
Product Designer
PROJECT TYPE
Hackathon, Web App
TIMELINE
48 Hours
TOOLS
Pen & Paper, Figma
DESIGNERS
  • Kayley Cheung
  • Rejeanne De Jong
  • Aletheia Délivré
  • Nicolas Fan
  • Henry Mai
  • Oyin Runsewe
DATA SCIENTISTS
  • Jenna Fu
  • Bowen Zhang
WEB DEVELOPERS
  • Haley Glavina
  • Sam Tang
  • Onimisi Ukanah

Discover & Define

Understanding the problem

Problem Space

According to the latest data from the UNESCO Institute for Statistics, 617 million children worldwide are not meeting minimum proficiency levels in reading and mathematics, despite the fact that two-thirds of these children are in school and the availability of learning assessment data has been steadily increasing over time.

IIEP and its research team are working to better understand and promote the use of learning assessment data in education planning and policy implementation, with the goal of ensuring inclusive, equitable, quality education and promoting lifelong learning opportunities for all.

Understanding User Needs

While we didn’t have direct access to the education ministers and policy planners who use this data, we did speak with two IIEP contacts who have been working on this research and policy analysis, to better understand the challenges and pain points education planners face. From those conversations, we learned some key things about the users and the current state of education policy planning:

Lack of Tools

There are few tools available that effectively visualize learning assessment data for policymakers.

Poor Data Management

Assessment results are often stored across multiple databases and are not accessible to users at all levels of education planning.

Need for Relevant Analysis

Current data analysis does not show relevant correlations between learning assessment outcomes and external factors (e.g. health, nutrition).

Project Goals

With the knowledge of the users’ current challenges in mind, we established a set of overarching project goals:

  • Create a data model that determines statistically significant correlations between learning assessment outcomes and external factors affecting students, families, and teachers (a rough sketch of this kind of analysis follows this list).
  • Design a user-friendly interface that presents the findings in a way that is easy to understand.
  • Make a product that is scalable and accessible to policy makers at all levels involved in educational planning.
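To make the first goal concrete, here is a minimal sketch of the kind of correlation-and-significance test the data model was meant to surface. The column names, sample values, and 0.05 threshold are illustrative assumptions on my part, not the actual IIEP assessment data or our team's final model.

```python
# A minimal sketch, assuming a small regional dataset with hypothetical columns;
# the real assessment data came from IIEP and its partner databases.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "reading_score":       [48, 55, 61, 40, 67, 52],
    "math_score":          [45, 58, 63, 38, 70, 50],
    "nutrition_index":     [0.4, 0.6, 0.7, 0.3, 0.8, 0.5],
    "pupil_teacher_ratio": [52, 40, 35, 60, 30, 45],
})

# Average the assessment scores into a single learning-outcome measure
outcome = df[["reading_score", "math_score"]].mean(axis=1)

# Test each external factor for a statistically significant correlation
for factor in ["nutrition_index", "pupil_teacher_ratio"]:
    r, p = pearsonr(df[factor], outcome)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{factor}: r = {r:+.2f}, p = {p:.3f} ({verdict})")
```

In the finished product, the same pattern would run over the full assessment databases and only surface the factors that pass the significance test for a given region.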

Design & Development Process

Collaborating across disciplines to build the best solution

Ideation

After our project goals were established, we met with our data science and web development teams to determine the project's scope and what we all felt was feasible to build and present as a proof of concept within the 48-hour window. With those constraints in place, the designers began sketching what a data visualization tool addressing this problem could look like.

Sketches


Aligning on a Solution

While our sketches showed we were on a similar wavelength about how the product should function, we still needed to align on a single solution to present. In the process of drilling down to an MVP, we made several adjustments and pivots:

  • Simplifying the map view from multiple colours and measurements to a single aggregated metric of overall learning-outcome performance (see the sketch after this list). This avoided overcomplicating the interface for the user, which was one of the challenges they had faced in the past.
  • Making the information outcome-focused. As we worked to refine our sketches and ideas, we realized that it wasn’t enough to simply present the data – it had to point to something actionable that would have a positive impact on both the users and the students and teachers their policies affect.
  • Establishing a single view. While we had originally planned to show the scalability of the product with multiple views of the map at different levels, we elected to focus on one as a proof of concept to really show the power of the tool with more in-depth analysis.
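As a rough illustration of that first pivot, the sketch below rolls several assessment measures on different scales up into one overall learning-outcome score per region. The measures, regions, and the min-max normalization are assumptions chosen for the example, not the aggregation our data science team ultimately used.

```python
# A minimal sketch, assuming three assessment measures on different scales
import pandas as pd

regions = pd.DataFrame({
    "region":         ["North", "South", "East", "West"],
    "reading_score":  [48, 61, 40, 67],
    "math_score":     [45, 63, 38, 70],
    "completion_pct": [72, 85, 60, 90],
})

measures = ["reading_score", "math_score", "completion_pct"]

# Min-max normalize each measure to a 0-1 range so they can be compared,
# then average them into a single overall outcome score per region
scaled = (regions[measures] - regions[measures].min()) / (
    regions[measures].max() - regions[measures].min()
)
regions["overall_outcome"] = scaled.mean(axis=1)

print(regions[["region", "overall_outcome"]].sort_values("overall_outcome"))
```

A single score like this is what lets the map stay one colour scale deep instead of asking the user to juggle several measurements at once.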

Wireframes & Flow

We regrouped with our data science team to determine what information they had found to be relevant to the problem, and what our dashboard would ultimately need to show to be most beneficial to users. Once we had a grasp on that, we created a set of mid-fidelity wireframes and worked with the developers to establish how the interface should behave, giving them enough time to build the product's basic structure before we established a visual identity and UI components.

A national view of the country, divided into specific regions. The map highlights the regions where learning outcomes are poorest, based on established thresholds, allowing education planners to easily visualize where policy intervention is necessary.

With a specific region highlighted, the user can see an overview of the statistically significant factors impacting learning assessment performance in that region. There are also filters for student gender and standard of living index, as access to quality education for girls in particular is of great concern.

A more detailed view of province-specific data, featuring more in-depth data analysis and filtering options that allow policy makers to target specific population groups where intervention is most urgently needed.
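Behind the map highlighting and the gender and standard-of-living filters described above sits a simple piece of threshold logic. The sketch below is a simplified take on it; the column names, sample values, and the 0.4 cutoff are assumptions for illustration rather than the shipped implementation.

```python
# A simplified sketch of the threshold logic behind the map highlighting;
# all values, column names, and the cutoff are illustrative assumptions.
import pandas as pd

assessments = pd.DataFrame({
    "region":          ["North", "North", "South", "South", "East", "East"],
    "gender":          ["F", "M", "F", "M", "F", "M"],
    "living_standard": ["low", "high", "low", "high", "low", "high"],
    "overall_outcome": [0.35, 0.48, 0.62, 0.71, 0.28, 0.41],
})

THRESHOLD = 0.4  # regions whose mean outcome falls below this get highlighted

def regions_needing_intervention(df, gender=None, living_standard=None):
    """Return the regions to highlight, optionally filtered to a subgroup."""
    if gender is not None:
        df = df[df["gender"] == gender]
    if living_standard is not None:
        df = df[df["living_standard"] == living_standard]
    by_region = df.groupby("region")["overall_outcome"].mean()
    return sorted(by_region[by_region < THRESHOLD].index)

print(regions_needing_intervention(assessments))              # ['East']
print(regions_needing_intervention(assessments, gender="F"))  # ['East', 'North']
```

Applying the filters before aggregating is what lets the map surface regions where, for example, girls specifically are being left behind even if the region-wide average looks acceptable.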

High-Fidelity Mockups

With the structure and scope of the product established, we began refining our UI elements and injecting colour, with the goal of making the data easy to interpret and the interface intuitive to use. At this point we also established what state changes would look like, and which ones we wanted to demonstrate in our final pitch of the product.

High-fidelity wireflow, showing different task flows and state changes

Interactive Prototype

Design Impact & Next Steps

Design Impact

  • Accessible Data: EdVis has the power to make learning assessment data accessible and easy to understand for stakeholders and actors at multiple levels.
  • Targeted Policy Intervention: This tool can help education planners understand the specific factors impacting learning outcomes across different regions and populations, allowing them to create targeted policies and move toward sustainable, equitable education globally.

EdVis’s potential design impact led to its selection as the winner of the hackathon’s juried Impact prize. The prize was awarded to the project judged to have the greatest positive impact on IIEP-UNESCO and its member states, as adjudicated by a panel of six judges that included IIEP’s Deputy Director.

Next Steps

Since we set out only to present a proof of concept at the hackathon, we also established some next steps. Given the interest this project earned from IIEP-UNESCO, these feel particularly important when considering how EdVis could have a real-world impact on less developed education systems in the long term.

  • Scale the product to include views for school districts, specific schools, and eventually more countries
  • Observe year-over-year trends in learning assessment data so the model can better predict which policy interventions will have the greatest benefit
  • Integrate policy recommendations and reporting into the interface
  • Build out a data collection portal to facilitate information sharing at all administrative levels

Thanks for reading!

If you'd like to learn more about my project and design process, feel free to get in touch:

E-mail Me