A UX Design Process with ADDIE
First published by Pat Godfrey: July 7 2017
ADDIE (Analysis, Design, Development, Implementation, Evaluation) is a dynamic, cyclic, and evaluative design cycle based on how people learn and think. It focusses and strengthens our "create, test, and learn" cycles.
It is a meta-process. It is not a dogma. It aids design. It does not rule or prejudice it.
The ADDIE cycle considers our users' performance and tasks, fitting user experience (UX) design well. It benefits digital marketing too!
If you think this content is useful to your friends, colleagues, or connections, then please consider flagging it to them.
I love to Experience Learning Too. Your feedback is welcome.
What is ADDIE?
It's just another design cycle. It just happens to make the most sense to me.
ADDIE (Analysis, Design, Development, Implementation, Evaluation) is similar to Herbert Simon's 1969 decision-making model.
You may be more familiar with similar cycles?
I find ADDIE a comfortable and flexible methodology applicable across a range of situations. It is easy to share and it actively enhances the design process. It applies to the Big Picture and the finest of details.
ADDIE encourages formative evaluation by promoting a reflective practice suited to iteration, continual improvement, and research, rather than a linear model of progression. It also fits Agile Scrum perfectly, and (UK) Government Digital Service (GDS) methodologies well.
A process tool
As ADDIE (Analysis, Design, Development, Implementation, Evaluation) needs some Research to analyse, perhaps we should refer to (R)ADDIE?
The (R)ADDIE cycle eddies around, inter-relating with or disrupting other stages in the design and development process.
Analysis should be one of our greatest efforts. Scrimp here, and quality will suffer.
The more research completed around your enterprise, customers, and end users, the more insights you gather into their wants and needs.
Researchers may grab their clipboards and white coats and entrench themselves in laboratories. In fact, the useful (big) data is discovered by talking to our consumers and understanding their motivational and environmental opportunities and constraints.
We need a research plan beyond A-B (see Footnote 1) and 'Multivariate' techniques. What are we designing in the first place; why, and for whom?
Analysis of your research finds common avenues on which to found levels of communication, cooperation, collaboration, and corroboration. You may prove or dispel preconceptions, prejudices, and 'borrowed beliefs' as you move toward a user-centric design.
Note: user research can appear an expensive resource. There are different levels of research to suit every pocket. For example, you can 'user test' paper prototypes among team members role-playing personas. Just avoid being too frugal - you want your enterprise to work for real users, yes?
Recall that Usability Labs are often a compromise solution. You cannot observe users in their natural environment unless they are in their natural environment?
Footnote 1: A-B tests: bad v. worse? "Our respondents preferred B..." That really is bad.
Analysis is the most important aspect of any enterprise project, and yet it is often the least well resourced.
UX designers can make do and improvise intelligently, but projects benefit in proportion to the resources given to analysis.
What is our enterprise's Mission; its goals; its constraints, strengths, and features? How do these map to the product's and its users' environment, tasks, and needs? These are the first questions at the heart of our analysis. They are the background on which all design may flourish, or fail.
We need scope. What is the scope and how will we resource it? Is the scope finite, or does it take a more agile approach: evolution and not revolution?
What resources are available, and are there contingencies if more are needed?
And so on...
Not everyone in our audience is destined to become our customer or our user... unless we court them by demonstrating that there is something in it for them, by making connections (emotional, practical, and social), and by working on generating interest and converting it into business.
Who is in our audience, and what's in it for them?
We need to get off our backsides and find our users and talk to our customers and end users; not with check-boxes on a clipboard or eye-tracking lasers - but talk to them. They are not Pavlovian machines. We must understand what drives them, what will block their progress, or what worries them. What encourages and engages them? What is our enterprise asking our users to do, and what do our users expect of our enterprise?
It is important to "come off the screen". When our users access our digital products on a screen, there's a whole universe traversing their lives and their experience at the same time. We must know how that will affect the Human Computer Interaction and design for it where we can.
Even the advantages and costs of quality qualitative and quantitative research can be negated by scalping statistical bell curves into flat generalisations. How will we deal with the data?
I am not a fan of the "Three Amigos" approach to Personas: a young user, a less young user, and an older user invariably useless with a browser; or whatever imaginary demographic meets our assumed hypothesis.
Where are our users accessing our digital content differently: across devices, in improbable environments, or by overcoming differences in their cognitive or motor skills, browsing technologies, or other difficulties - not just visual impairments? All of them. No, your enterprise's success rests on the Three Amigos...?
Personas need to be real. To build a persona we must talk to an intelligently selected sample of our end-users.
In reality, meeting our users may not be practical. We can assume details about our users, but the result is assumed Personas. These will always miss vital aspects of who, how, what, when, and where, as well as why our users approach our platforms as they do.
Inclusivity Built In
Statistics suggest 80% of Internet users have a difference in their perception of, or use of, our digital products. It is more important than ever to design inclusively for access and usability. This does not mean designing only for 'alternative browser technologies', but for alternative browsing strategies and methods. There's a wealth of difference.
W3C's Web Accessibility Perspectives is an excellent reference from which to challenge our perceptions of what designing for accessibility and usability is. Through videos, W3C illustrates why the underlying content and HTML architecture must be semantic, well written, and visually easy to use. Once we have the basics right, then we can pretty it up.
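As a minimal sketch of getting those basics right - the ids, names, and wording here are invented for the example, not from W3C's videos - a feature composed of native, semantic elements is identifiable and operable by assistive technologies before any styling is applied:

```typescript
// Sketch: a search feature built from semantic HTML elements rather than
// styled <div>s, so assistive technologies can announce its structure.
function buildSearchForm(): HTMLFormElement {
  const form = document.createElement("form");

  const label = document.createElement("label"); // a real, programmatic label
  label.htmlFor = "site-search";
  label.textContent = "Search this site";

  const input = document.createElement("input");
  input.type = "search";
  input.id = "site-search";
  input.name = "q";

  const button = document.createElement("button"); // a real button: keyboard-operable for free
  button.type = "submit";
  button.textContent = "Search";

  form.append(label, input, button);
  return form;
}

// Usage: place the form inside an existing landmark, e.g. the page header.
document.querySelector("header")?.append(buildSearchForm());
```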
Perhaps too many visual artists get to be UX Designers, and not enough UX Designers do? Perhaps because managers are visual users and judge UX on the visual experience alone?
Mobile First "mythology" is an engineering methodology, not necessarily a design one. We can design for larger viewports first and scale down the UI for mobile presentation and still be "Mobile First" as long as the implementation is, "Mobile First".
Caution: mobile device presentations may no longer be predicted by the screen size. The viewport is often under the user's or software's control. For example, two apps may run side by side on the same mobile device screen. "Mobile First" must now, more than ever, include Fluid Responsive presentation strategies. Read more about the Fluid Responsive Design philosophy and Mobile First methodology.
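As a small illustration of that caution, a layout can key off the live viewport rather than any notion of the device. This is a sketch only; the 40em breakpoint and the class names are assumptions:

```typescript
// Respond to the live viewport, not to a guess at the device.
const narrowViewport = window.matchMedia("(max-width: 40em)");

function applyLayout(isNarrow: boolean): void {
  document.body.classList.toggle("layout-stacked", isNarrow);
  document.body.classList.toggle("layout-columns", !isNarrow);
}

applyLayout(narrowViewport.matches); // set the initial state

// Split-screen apps, window resizing, and zooming can all move the viewport
// across the breakpoint without the "device" ever changing.
narrowViewport.addEventListener("change", (event) => applyLayout(event.matches));
```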
Leaving "user testing" until after the design is completed, or only A-B testing trite solutions is not ideal, but next to useless without a clear understanding against what we are testing. Designs should be tested throughout the process from the first workshops to the last prototype. (Evolution: not revolution).
Cheapness will, if allowed, disable innovation and allow nothing but run-of-the-mill experiences. If we want our products to do what we want, and to profit from them, then we really must face up to, and properly resource, what User Analysis is: research!
There is no one-size-fits-all process to build Personas or digital products and platforms. As for all aspects of design, it depends. But there are frameworks with which we can underpin our research and design processes within budget and time constraints.
There are safe assumptions we can make based on our own and other enterprises' experience. Enterprises may already have sufficient knowledge of their users to make very informed assumptions.
At the least, we can prod and poke a Subject Matter Expert (SME) for information and insight. There are great resources available via Internet Searches too, which can yield background information on common user behaviours.
The ideal situation is to conduct primary user research, draw from it our users' expectations, and learn how to meet or, better, exceed them. Any research is better than only making assumptions: even assumptions need research.
Our assumptions grow more intelligent. All is not lost.
Design relies on good intelligence of our enterprise and our users. It may be that further analysis and evaluation of our users' and enterprise's needs are required to answer questions raised by the team and our stakeholders.
The design process will generally include initial sketches of the solution used to garner buy-in from stakeholders, and which are then developed into wireframes and prototypes that can be tested against the Analysis.
In an agile scrum environment the question is, "When do we design? In Sprint, or pre-Grooming?" There are advantages to both and both can work equally well when adding new features to an existing product. However, with new products and designs, entering a Sprint with an untested solution may cause additional team effort through iterations. Contingencies of time may compromise the end solution.
The artefacts that a design team must deliver are the "deliverables". These can include sketches, wireframes, prototypes, style guides, and all manner of supporting materials that will aid the development team - and perhaps the writers - in their tasks.
The skill is to know what needs delivering and not to deliver anything unnecessary to the task. Opinion may divide a team on what these should be. It is the designer's role to satisfy the team's needs, and to understand where and why these may at times be excessive. It's why designers need to work well with teams, and also to have the strength to help lead on, and communicate, what their workflow should be.
It is an arrogant team indeed that does not offer user support. User support commences with the on-screen feedback we give our users: from interaction with our materials to feedback on form completion.
Errors should be designed out where possible; where that is not possible, errors should be managed for our user. If the on-screen environment is restricted, this may need an external help system. That in turn may be required to open contextually on the support our user needs, depending on where in the product they find difficulty.
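A hedged sketch of that principle - the field ids, message text, and help URL are invented for the example - might constrain the input first, explain any remaining failure inline, and offer help that opens on the relevant topic:

```typescript
// "Design errors out, then manage the rest": the input's type and
// constraints prevent some errors; what remains is explained inline,
// with a contextual help link for the topic this field belongs to.
function attachEmailSupport(field: HTMLInputElement, message: HTMLElement): void {
  field.addEventListener("blur", () => {
    if (field.validity.valid) {
      message.textContent = ""; // no error to manage
      return;
    }
    // Manage the error for the user: say what is wrong and where help lives.
    message.innerHTML =
      'Please enter an email address such as name@example.com. ' +
      '<a href="/help/your-details#email">Help with your details</a>';
  });
}

const email = document.querySelector<HTMLInputElement>("#email");
const emailError = document.querySelector<HTMLElement>("#email-error");
if (email && emailError) attachEmailSupport(email, emailError);
```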
UX Designers must have a part in this process before conflicting brands and values are implemented by other specialities that may be less well rehearsed in the digital paradigm.
Visual and Code Design
While the cognitive and interactive strategies and designs are being finalised, it is a good time to tie in with the visual designers and the UI developers. It is always best practice to ensure that the overall User Experience design fits within the enterprise's vision and brand, and also its technological capacities. We want to improve the Universal Experience, after all?
Design for Smaller Devices
You cannot 'prove' your design for smaller viewports and devices unless you test them on a selection of devices, operating systems, and browsers. Similarly, when sharing your design concepts as wireframes and prototypes, enable your colleagues to play with them at relevant scales. If devices are in short supply, consider scale paper prints.
Remember that your design team are not your users. Test designs with real users in their real environs with their real devices. You'll be glad you did.
Designers cannot sit back during the development of the solution. It is their role to quality assure that the design is implemented correctly and that contingencies of resources and effort are minimised by solid preparation.
A designer must understand the opportunities of technology to fully exploit it. Designers must be sensitive to the engineers' skill sets and abilities, or otherwise be prepared to mentor them.
Designers need to be nimble: to react to changing scope and needs rationally. They must return the best compromise as necessary - or call a halt on a Story that needs more attention than its Scrum schedule can afford.
Designers need to be able to say, "No". Equally, designers need to say, "Can do!"
Designers may take a back seat during a product's implementation ("Release") phase, but they will be far from redundant. It is during this short respite that design thinking should be applied to coming Stories and tasks.
It is also key to our engineers that, if there is an unforeseen problem, the designer can quickly confirm or adapt designs and evaluate any changes needed now or in the future.
Our designer should also take part in User Acceptance Testing (UAT): monitoring the engineering output for quality and accuracy and also picking up any nuances that may indicate a need to update a design or flow.
The ultimate or summative test is when our users use the released features. It is important to gather feedback as quickly as possible. Your enterprise's support department is an ideal resource. And we don't only evaluate the process at its close. As a team we should formatively evaluate our design throughout each stage of the design to release process.
Back to Personas: you cannot capture every user in a persona. Now that we have access to all of our user population, there will be surprises. Designers must deal with them.
How ADDIE fits UX
The (R)ADDIE design cycle’s emphasis and priority on Research and importantly on Analysis supports our ambition to deliver delightful experiences.
The ADDIE process's cyclic nature steers a design beyond what our users expect. It doesn't only fix a "flag in the sand" to aim for, but constantly re-targets exactly where that flag should be. Our design can flex and evolve intelligently toward offering our users:
- Ease of effort.
- Context to their tasks.
- Results or feedback.
Research alone cannot improve design. The greater our analysis, the better our design for Ease, Context, and Results.
Note: the Ease, Context, and Results mantra cross-maps reasonably well to Dana Chisnell's "The Three Levels of Happy Design" (Pleasure, Flow, Meaning), which Jared Spool terms "3 Approaches to Delight".
Considering the user journey
At each stage of the ADDIE process, I ground the work in the basic user journey and the enterprise's values, aims, and objectives. As its name suggests, this is a basic and easy-to-apply tool: not a deliverable such as a Journey Map or Customer Experience Map.
I developed the Basic User Journey from a number of tools and simplified it to relate our users' interaction with data. It's what we do online. We communicate with our users using data.
I apply my Basic User Journey as an interim Empathy Map, User Journey, and design tool at the macro and micro ends of our workflow. Being cyclic, it eddies down into the finest details – even to individual data inputs. It answers the "what are we doing here?"
The more research we carry out, the more illuminating this tool is. And it works on pure logic, experience, and intuition, too.
| Stage | Description |
| --- | --- |
| Wants | Wants give context to the user journey and enterprise aims. What our user and our enterprise want may not match what is needed. |
| Needs | Needs may indicate requirements, or preconditions that must be met to complete the journey. |
| Tasks | Tasks are discrete objectives formed from needs, against which success may be measured on completion of the journey. |
| Input | What our user and enterprise must actively do or provide to achieve the tasks set during the journey. |
| Output | The result of performing tasks, such as knowledge acquisition, product orders being processed, or feedback on journey progress. |
| Review | An overview of the status of a task or journey and of any further activity required or set in motion; perhaps an event history or timeline with which to track progress. |
| Recycle | There may be two or more phases: first, our user and enterprise may access and repeat the journey or tasks as necessary; second, we review the success of each task or journey to identify improvements. |
Note: see an application of the Basic User Journey that enabled an 'eleventh hour' rapid understanding of, and design update to, a failing UI.
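For teams that prefer artefacts they can review in code, the journey also reduces naturally to data. A sketch, not from the article, with an invented ticket-purchase example:

```typescript
// One way to hold the Basic User Journey as data, so a feature can be
// reviewed against all seven stages. The example notes are invented.
type JourneyStage =
  | "Wants" | "Needs" | "Tasks" | "Input" | "Output" | "Review" | "Recycle";

interface JourneyNote {
  stage: JourneyStage;
  user: string;       // what the stage means for our user
  enterprise: string; // what it means for our enterprise
}

const ticketJourney: JourneyNote[] = [
  { stage: "Wants",   user: "Buy a ticket quickly",        enterprise: "Sell tickets at low support cost" },
  { stage: "Needs",   user: "A payment method to hand",    enterprise: "Verified payment and contact details" },
  { stage: "Tasks",   user: "Choose seats, pay, confirm",  enterprise: "Process the order, issue the ticket" },
  { stage: "Input",   user: "Selections and card details", enterprise: "Availability and pricing data" },
  { stage: "Output",  user: "A ticket and a receipt",      enterprise: "A completed, auditable sale" },
  { stage: "Review",  user: "Order status and history",    enterprise: "Funnel and completion metrics" },
  { stage: "Recycle", user: "Repeat or amend the order",   enterprise: "Lessons fed into the next iteration" },
];
```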
Evaluation, conflict, and change
Conflict should inform our evaluative design process although it may also harm it.
Conflict in teams shouldn't be 'bloody'. Without care, conflict can damage relationships, erode trust, and ultimately cost the product and enterprise. Yet, without conflict and honest exchanges of opinions and data in an open environment, the design may conclude in consensus built on compromises, fear, and even misguided respect.
Designers particularly must be free to conflict with one another and with the team. Such conflict needs managing, of course, and steering clear of personal and discriminatory attacks. Sure, there may be emotional wounds, and we must learn from them and return reinvigorated by lunchtime or, at the latest, by breakfast.
A 100% UX conflict
In a 100% UX team, conflict will be valued and respected. Research will be commissioned and evidence sought to seek the best outcomes. Learning will be shared in the spirit intended.
An effective team will grow stronger and more adaptable over time. Teams run by ids may suffer permanent damage.
Designers will almost always conflict between themselves over something or other. At times the conflict will be so trivial you want to bash their heads together (only in a metaphorical way, of course!). You may also see cataclysmic exchanges, including salvoes of id. As long as it is not only the loudest voice or largest ego that wins, there is no personality or "pecking order" clash, and the outcome is positive for the enterprise, then there is no reason it cannot be encouraged.
Designers know that conflict is healthy. And they should know when it is not.
Team managers need to be aware that conflict generally has a cause—and right or wrong, that cause may feed vital insight into a design. Just don't leave two or more designers on their own for too long.
They're [designers], you dolt. Apart from you, they're the most stupid creatures on this planet. They don't plot, they don't scheme, and they are NOT ORGANISED!
Melisha Tweedy, Product Manager.
On free speech
...Restricting speech leads to restricting ideas and therefore restricted innovation—the most successful societies have generally been the most open ones. Usually mainstream ideas are right and heterodox ideas are wrong, but the true and unpopular ideas are what drive the world forward...
...You can't tell which seemingly wacky ideas are going to turn out to be right, and nearly all ideas that turn out to be great breakthroughs start out sounding like terrible ideas... When we move from strenuous debate about ideas to casting the people behind the ideas as heretics, we gradually stop debate on all controversial ideas.
There's no one-size-fits-all design process. (R)ADDIE is simply one of my tools of choice.
Each designer and enterprise will follow what works for them. It is only essential that, whatever the analysis is based on (research, intuition, or both), it is thorough and not overly compromised by low resources, poor appetite, voluminous voices, or inflated id.
When encouraged, open and transparent conflict can have a positive impact on design. Conflict resolution must be evidence-based and not a compromise that bleeds into our product.
~~Chickens~~ Designers may need careful observation? And feeding. Biscuits.
Reference this article
Godfrey, P. (2017, July 7). A UX Design Process with ADDIE. Retrieved Month Day, Year, from URL