What if you could learn in an environment that knows the Whats, Whens, Wheres, and Hows needed for you to reach an optimal learning outcome? The LXMatters website is where we, a group of enthusiasts within the EdTech space spearheaded by Cerpus, a learning technology company in Northern Norway, document and discuss our research into applying a methodological approach to digital learning that results in truly adaptive learning experiences. We also encourage and invite others to contribute their thoughts, knowledge, insights, and experience with everything related to the learning experience. Your contribution to the LXMatters initiative is genuinely appreciated.
Wikipedia states the following about the more general topic of User Experience:
“User eXperience (UX) refers to a person’s emotions and attitudes about using a particular product, system or service. User experience includes the practical, experiential, affective, meaningful and valuable aspects of human–computer interaction and product ownership.
User experience may be considered subjective in nature to the degree that it is about individual perception and thought with respect to the system. User experience is dynamic as it is constantly modified over time due to changing usage circumstances and changes to individual systems as well as the wider usage context in which they can be found.”
We’re not here to give another definition of UX. Rather, we draw on its basic cycle of testing, modifying, then testing and modifying again, to inform the production and functionality of digital learning environments and to give users the best Learning eXperience possible.
The first requirement when creating an optimal User eXperience is to meet the exact needs of the user. The same requirement is true for Learning eXperience (LX); in order for learning to be effective, the learning experience must meet the needs of the individual learner.
An article at The Glossary of Education Reform states:
“Learning experience as a term is in growing use by educators and reflects larger pedagogical shifts that have occurred in the design and delivery of education to students, and it most likely represents an attempt to update conceptions of how, when, and where learning does and can take place.
Learning experience may also be used to underscore or reinforce the goal of an educational interaction—learning—rather than its location (school, classroom) or format (course, program), for example.”
This definition underlines the importance of learning, and more specifically the importance of interactive learning, rather than instruction. Based on this, the learning experience becomes the focal point, which is continually adapted and iterated, using UX theories, to create the best learning experience possible for the learner. The principle of focusing on learner interaction as the core of learning experience, and of learning itself, isn’t new, as it has historical roots in areas such as Experiential Learning and Constructivism.
Wikipedia states the following for experiential learning:
“Experiential learning is the process of learning through experience, and is more specifically defined as ‘learning through reflection on doing’. Experiential learning is distinct from rote or didactic learning, in which the learner plays a comparatively passive role… experiential learning considers the individual learning process. As such, compared to experiential education, experiential learning is concerned with more concrete issues related to the learner and the learning context.”
Experiential learning focuses on the individual’s learning process. Because active learning involves the learner directly, the learner makes discoveries and experiments with knowledge firsthand, instead of hearing or reading about the experiences of others.
David Kolb, an educational theorist and proponent of experiential learning, developed the “Experiential Learning Model” (ELM), made up of four elements:
- Concrete experience
- Observation of and reflection on that experience
- Formation of abstract concepts upon the reflection
- Testing the new concepts (and then repeating the whole process)
Steve Wheeler notes that Kolb’s model aligns with Jean Piaget’s constructivism, in which accommodation and assimilation are part of a process that leads to the internalization of knowledge by learners. Piaget’s “knowledge schemata” were thought to be:
- Critically important building blocks of conceptual development
- Constantly in the process of being modified or changed
- Modified by on-going experiences
- A generalized idea, usually based on experience or prior knowledge
The idea Piaget puts across is that new learning happens when the learner revises, elaborates, adapts, and balances what they already know against the new learning experience they are part of. In a neurological sense, the brain/mind is seen to constantly build and rebuild itself as it takes in and adapts new information, enhancing understanding.
The processes and mechanisms Piaget and Kolb put forward are still valid and relevant, even if the context within which their theories were born has changed drastically. Both suggest that learning is an active, iterative, adaptive, and dialectic process that happens within set contexts. A study analyzing whether interactive learning increases student performance documented gains in examination performance that would raise average grades by half a letter, and found that failure rates under traditional lecturing were 55% higher than those observed under interactive learning.
Failure rates under traditional lecturing are 55% higher than under active, interactive learning.
Kolb’s and Piaget’s theories were based on a learning process focused on the student as an individual (active) learner, which is the core rationale in our understanding of adaptive learning and learning experience. Had they formed their theories within the context of today’s technologically advanced and social learning environments, they would probably have taken into account the great promise shown by digital learning and by collaborative learning on the social web. LXMatters is a collaborative research platform focused on evolving new models and methods for how future learning experiences might take place.
A Scientific and Methodological Approach To Learning Experience
The dialectic nature of learning within the constructivist approach suggests that one can profit from an explicit methodological approach to digital learning. With current technologies we can create intelligent software that learns and adapts to the changing nature of the learner’s needs, such as learning styles and individual learning paths. What needs to be constant is the underlying methodology, ensuring common functionality as well as individual adaptivity, and, last but not least, consistency as the common denominator for all learners and their learning experience.
UX Processes As A Blueprint For LX Processes
Looking at an outline of UX processes from UX Mastery, it is easy to see how this process could be a blueprint for the LX process.
The UX process illustrated below aligns with what we envision an LX process to be, on the assumption that learning experience, especially in a digital environment, rests on the same attributes and needs of the learner that the UX paradigm ascribes to end users. The emphasis on continuous, iterative, and dialectic processes that follow the changing character of the user’s needs would be at the center of an LX method as well.
1. Strategy
The strategy, in UX terms, articulates the guiding principles and long-term visions of organisations and businesses. In a learning experience methodology it would play the same role: it would guide and shape any learning experience design toward the ultimate goal of delivering a tangible and valid learning outcome. The methodology would outline how we build systems for digital learning experiences, as well as guide the iterative learning process connected to the interactions of the learner.
2. Research
The research phase, in UX terms, is often referred to as the discovery phase, where the results of research into the target users’ needs are seen as the key to an informed user experience. The research phase is just as crucial in learning experience if we are to provide the learner with an adapted experience. In LX terms, specifically applied to software, this phase would take the form of continuous collection of the learner’s attributes, including learning style, skills, contextual needs, and progress, stored in a Learning Record Store (LRS) as xAPI statements.
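To make the xAPI mechanics concrete, here is a minimal sketch in Python of how such a statement could be assembled. The verb URI follows the ADL xAPI convention, but every concrete value here (learner, activity, score) is invented for illustration; a real statement may also carry context, attachments, and authority fields.

```python
from datetime import datetime, timezone

def make_statement(learner_email, verb_id, verb_name,
                   activity_id, activity_name, score=None):
    """Build a minimal xAPI-style statement as a plain dict.

    Only the required actor/verb/object triple plus an optional
    scaled score result is shown.
    """
    statement = {
        "actor": {"mbox": f"mailto:{learner_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if score is not None:
        statement["result"] = {"score": {"scaled": score}}
    return statement

# Illustrative values only -- learner and activity are invented.
stmt = make_statement(
    "ada@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "http://example.com/course/fractions/quiz-1", "Fractions Quiz 1",
    score=0.85,
)
```

A statement like this would then be posted to the LRS, which aggregates them into the learner’s record.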
This measurement and collection of data about the learner establishes the basis for Learning Analysis in the following analytic phase of the process.
3. Analysis
The aim of this phase is to draw insights from the collected data, making inferences that can guide the subsequent iterations of the process. In LX terms, the data would be analyzed to gain insight into the learner and to suggest how the overall learning experience could be improved.
With data about learners collected and stored in an LRS in the research phase, one is ready to process the data in order to produce outcomes that will drive suggestions for iterations. Several different analytical methods may be used, depending on what fits best for the type of data that has been collected. The following are common analytical methods that may be used:
- Knowledge analysis, aiming to capture the degree of the knowledge of the learner within a given knowledge domain
- Content analysis of resources created by learners, such as essays and the like
- Discourse analysis, aiming to capture data from the learner’s interactions as well as properties of the learner’s language used during interactions
- Social Learning Analytics, aimed at exploring the role of social interaction in learning
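As an illustration of the first of these, knowledge analysis in its simplest form reduces to estimating mastery per knowledge domain from scored statements. A minimal Python sketch, assuming a simplified statement shape (a domain tag plus a scaled score) that is ours, not prescribed by any specification:

```python
from collections import defaultdict

def mastery_by_domain(statements):
    """Average the scaled scores of a learner's statements per domain.

    Each statement is assumed to carry a 'domain' tag and a score
    in [0, 1]; the result maps each domain to a naive mastery
    estimate (the mean score).
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for s in statements:
        totals[s["domain"]] += s["score"]
        counts[s["domain"]] += 1
    return {d: totals[d] / counts[d] for d in totals}

# Invented history for illustration.
history = [
    {"domain": "fractions", "score": 0.5},
    {"domain": "fractions", "score": 0.75},
    {"domain": "geometry", "score": 0.25},
]
print(mastery_by_domain(history))  # {'fractions': 0.625, 'geometry': 0.25}
```

Even an estimate this naive is enough to rank domains by weakness and flag where the learner needs support.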
The statistical outcomes of these analytical methods already contain enough valuable data to inform the next step in the process. But to predict and suggest an adaptive learning experience as accurately as possible for the learner, we need to apply machine learning techniques to the data.
Machine learning, a subset of artificial intelligence, is a way of programming computers to identify patterns in data as input to algorithms that can make data-driven predictions or decisions. As we interact with computers, we are continuously teaching them what we are like. The more a user interacts, the smarter the predictions become.
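As a toy illustration of such data-driven prediction, the sketch below trains a tiny logistic-regression model in plain Python to predict whether a learner will succeed on the next item. The features (prior mastery, recent success rate) and the training data are invented for illustration; a production system would use a proper ML library.

```python
import math

def train_logistic(samples, labels, lr=0.5, epochs=2000):
    """Fit a logistic-regression model by plain gradient descent.

    samples: feature vectors, e.g. [prior mastery, recent success rate];
    labels: 1.0 if the learner succeeded on the next item, else 0.0.
    Returns the learned weights, with a trailing bias term.
    """
    n = len(samples[0])
    w = [0.0] * (n + 1)  # feature weights plus bias
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # zip truncates to the n feature weights; add bias separately
            z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - y
            for i in range(n):
                w[i] -= lr * err * x[i]
            w[-1] -= lr * err
    return w

def predict(w, x):
    """Probability that the learner succeeds, given features x."""
    z = sum(wi * xi for wi, xi in zip(w[:-1], x)) + w[-1]
    return 1.0 / (1.0 + math.exp(-z))

# Invented data: [prior mastery, recent success rate] -> succeeded next time?
X = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.3, 0.2]]
y = [1.0, 1.0, 0.0, 0.0]
w = train_logistic(X, y)
high = predict(w, [0.85, 0.9])  # strong learner: high predicted success
```

The same loop captures the point in the text: each new interaction updates the model, so the more a learner interacts, the sharper the predictions become.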
This step would entail modeling artificial neural networks that represent data such as subject matters and the learner’s interactions with subject matter content, predicting probable outcomes and learning paths. The resulting outcomes and predictions would then suggest iterations in the next phases of the process.
4. Design and Production
In the case of a digital learning application, the previous steps in the UX process guide the initial design, wireframing, and prototype production of the application, as well as the subsequent iterative processes that continuously improve the product in terms of UI and UX, creating an optimal user experience.
These two phases merge in LX, as the data from research and analysis are processed to automatically propose the initial information architecture, logic and design of the user interfaces in the product.
5. Reiteration
The reiteration phase of the UX process focuses on continuously evaluating and re-evaluating a project or product through rounds of revisions, in order to improve the overall user experience. Reiteration is also at the heart of LX. By basing the learning experience process on algorithms derived from machine learning, this step is inherently iterative and adaptive, driven by parameters like the constantly changing state of knowledge and the contextual needs of the learner.
A Tentative LX Method
The method draft outlined below is an inherently gradual and iterative process with no apparent end or conclusion, guided by a consistent set of rules. This reflects the lifelong nature of learning, and the need for learning experiences to be methodologically founded both in philosophy and technology for adaptivity and consistency.
Context is always king, and as such, an initial measurement of the learner’s learning environment is necessary. Attributes measured might include: gender, age, culture, language, former education, goals, learning style, or any data previously collected from learning experiences stored in an LRS or from services like Mozilla Backpack.
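Such an initial context measurement could be captured as a simple record; a Python sketch, where all field names are illustrative rather than a fixed schema:

```python
from dataclasses import dataclass, field, asdict
from typing import List, Optional

@dataclass
class LearnerContext:
    """Initial measurement of a learner's environment and background.

    Fields are illustrative; a real profile would follow whatever
    schema the surrounding LRS and services agree on.
    """
    language: str
    age: Optional[int] = None
    prior_education: Optional[str] = None
    goals: List[str] = field(default_factory=list)
    learning_style: Optional[str] = None  # e.g. from an earlier assessment

# Invented example learner.
ctx = LearnerContext(language="nb-NO", age=17,
                     goals=["pass the national math exam"])
record = asdict(ctx)  # plain dict, ready to store alongside xAPI statements
```

Data imported from a prior LRS or a badge service would simply pre-populate such a record instead of starting it empty.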
1. Learner Interaction
The learning material, along with learning paths and other similar data, has been rendered to provide an adaptive learning experience. The learner interacts with the adapted learning materials, and these interactions serve as the baseline for data analysis.
2. Discovery and Learning
The rendered learning experience exposes the learner to content and other data that the method has tailored and adapted to align with the learner’s current dialectic process. That dialectic process is at the core of this phase, drawing on the constructivist theories the method is built upon. The learning experience supports the learner by creating an environment that challenges and inspires creativity and immersion, in turn allowing for independent thinking and new ways of learning.
3. Data Collection
Data about the learner’s context and interactions are collected, marshalled into xAPI statements, and stored in an LRS. These statements contain data about “Who,” “What,” “When,” “Where,” and “How,” and form the basis for the next step, where we start the statistical analysis of our data.
4. Statistical Analysis
In this phase, statistical analysis of the data takes place. At this point, highly advanced analytical methods aren’t necessary, as we are looking to abstract simple inferences that lay the groundwork for further intelligent adaptive learning. The abstracted data functions as a self-sufficient backbone that will be enriched by machine learning outcomes once those have reached a mature enough state.
For the next steps in the process, data is converted into the appropriate data structures and delivered to our machine learning services for further processing.
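A typical conversion flattens a learner’s statement history into a fixed-length feature vector that machine learning services can consume. A sketch, assuming a simplified statement shape (a domain tag plus a scaled score) and an illustrative choice of features:

```python
def to_feature_vector(statements, domains):
    """Flatten a learner's statement history into a fixed-length vector.

    One slot per knowledge domain holding the learner's mean score
    there (0.0 when unseen), plus a final slot for total activity
    count -- the kind of fixed-shape input ML services expect.
    Domain list and feature choice are illustrative.
    """
    vector = []
    for d in domains:
        scores = [s["score"] for s in statements if s["domain"] == d]
        vector.append(sum(scores) / len(scores) if scores else 0.0)
    vector.append(float(len(statements)))
    return vector

# Invented history for illustration.
history = [
    {"domain": "fractions", "score": 0.75},
    {"domain": "fractions", "score": 0.25},
    {"domain": "geometry", "score": 1.0},
]
vec = to_feature_vector(history, ["fractions", "geometry", "algebra"])
# vec == [0.5, 1.0, 0.0, 3.0]
```

Fixing the vector shape up front is what lets the downstream models compare learners and track one learner over time.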
5. Deep Learning
The statistical data collected during the previous phase is fed into machine learning algorithms, where it is processed and becomes part of what the services already know for a given learner. Artificial neural networks and other data models representing a learner’s learning experience and criteria are updated with new weights and probabilities as the software continues to learn about the learner from the incoming data. As the networks mature in knowledge they are able to create increasingly accurate predictions by inference, which are returned and passed on to the next step in the process.
6. Algorithmic Composition of Adapted Learning Paths and Materials
With xAPI statements and plain statistical data, combined with predictions from the machine learning algorithms, one has a sufficient basis for the algorithmic composition of adapted learning content, as well as recommendations for alternative learning paths through that content for a given learner. This composition yields data structures and metadata that guide the next step of the process.
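One naive composition heuristic, purely for illustration and not a prescribed algorithm: rank candidate resources by how close their estimated difficulty sits to just above the learner’s current mastery, so the next steps are challenging but attainable:

```python
def compose_path(mastery, resources, target_stretch=0.1, length=3):
    """Pick the next few resources for a learner.

    Each resource carries a difficulty in [0, 1]; we prefer items
    slightly above current mastery (mastery + target_stretch) and
    order the chosen path from easiest item upward.
    """
    goal = mastery + target_stretch
    ranked = sorted(resources, key=lambda r: abs(r["difficulty"] - goal))
    path = sorted(ranked[:length], key=lambda r: r["difficulty"])
    return [r["id"] for r in path]

# Invented resource pool for illustration.
resources = [
    {"id": "intro-video", "difficulty": 0.2},
    {"id": "worked-example", "difficulty": 0.5},
    {"id": "practice-set", "difficulty": 0.6},
    {"id": "challenge-task", "difficulty": 0.9},
]
path = compose_path(mastery=0.5, resources=resources)
# path == ['worked-example', 'practice-set', 'challenge-task']
```

In the full method, the mastery estimate and difficulty values would come from the statistical and machine learning phases rather than being hand-assigned.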
7. Adaptive Learning
As the data resulting from the machine learning analytics take time to reach a relatively accurate state, the rendering of learning experiences should still strive for the utmost adaptivity for the learner. The minimal measure of adaptivity would be to render according to the contextual needs of the learner, combined with the more naive statistical analysis of xAPI statements.
Wash, Rinse, and Repeat
Since learning experiences are by nature adaptive, reiteration and redesign are at the core of the process, which aligns with the UX process for creating optimal User Experiences.
LXMatters is a research effort into how we can apply a scientific, methodological approach married with insights from User Experience to learning and the learning experience as a whole.
By combining the UX practice of continuously evolving ideas through testing and revision with notions of how individual learning happens, we believe we can develop a methodology for creating better learning experiences.
The LXMatters blog will document the research and processes carried out on the road to a robust, tried, and proven methodology applicable to any digital learning system that has adaptive learning experiences, and ultimately an optimal learning outcome, as its goal.
Glossary
- Constructivism
- A theory of knowledge that argues that humans generate knowledge and meaning from an interaction between their experiences and their ideas.
- Dialectic
- Also known as the dialectical method, it is a discourse between two or more people holding different points of view about a subject but wishing to establish the truth through reasoned arguments.
- Adaptive Learning
- An educational method which uses computers as interactive teaching devices, and to orchestrate the allocation of human and mediated resources according to the unique needs of each learner. Computers adapt the presentation of educational material according to students’ learning needs, as indicated by their responses to questions, tasks and experiences.
- Experience API (xAPI)
- Also known as the Tin Can API, it is an e-learning software specification that allows learning content and learning systems to speak to each other in a manner that records and tracks all types of learning experiences.
- Learning Record Store (LRS)
- Is a data store system that serves as a repository for learning records necessary for using the xAPI.
- Machine Learning
- Gives computers the ability to learn without being explicitly programmed. Machine learning explores the study and construction of algorithms that can learn from and make predictions on data.
- Deep Learning
- Is a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers, with complex structures or otherwise.
- Predictive Analytics
- Encompasses a variety of statistical techniques, from data mining and predictive modeling to machine learning, that analyze current and historical data to make predictions about future or otherwise unknown events.
- Artificial Neural Networks
- In machine learning and cognitive science, artificial neural networks (ANNs) are a family of models inspired by biological neural networks (the central nervous systems of animals, in particular the brain) and are used to estimate or approximate functions that can depend on a large number of inputs and are generally unknown.
- Algorithm
- In mathematics and computer science, an algorithm is a self-contained step-by-step set of operations to be performed. Algorithms exist that perform calculation, data processing, and automated reasoning.
Join The Conversation
Leave us a comment with your thoughts related to how LX thinking can be applied to create adaptive learning. Finally, if you want more stories like these delivered to you, sign up for the LXMatters newsletter.