What if the problem one is trying to solve is not completely understood?
Let me rephrase: what if the solution you’re proposing isn’t solving the actual problem you were tasked with? Or, as often happens: what if the solution a client asks you to produce isn’t what would actually solve their problem?
You’ll know, and not just by the problem persisting post-launch despite the best efforts (and wasted time); it’ll show sooner than that. If a problem is unclear you shouldn’t even start designing: that’s designing without a clear purpose.
So how do we identify the true problem?
Let’s look for some trouble.
Problem search (Looking for trouble)
There are easily a thousand places to look for trouble.
If, say, sales are bad for company X, we can’t assume that the new website they ask for will fix sales. Neither can we assume the cause is:
- inferior marketing and advertising,
- poor packaging,
- a stronger competitor’s product, marketing, packaging, price, or what have you,
- or simply another product being there first,
- and so on…
All of the above? Maybe. Probably not. But we shouldn’t get ahead of ourselves. Let’s first ask some people where we stand, in order to find, and then identify, the trouble.
Market research vs User research
People's idea of themselves and their needs can be deceiving. We’ve not only seen it happen; people have written books on the subject. How can we measure things correctly? How can we even start looking at a problem without knowing where we actually stand, and how we are perceived en masse?
For example: when I worked at arguably one of the world's largest ad agencies (which, probably not entirely by coincidence, happened to function as an army information agency during WW2), there was an immense amount of market data readily available and maintained. It made everyone's life a little bit easier.
At other agencies, without those incredible resources, research was conducted when necessary and when budget allowed.
Either way, whatever the tools and methods used (specifics are readily available; I won’t bore you with the details), the practice usually boils down to wide-segment surveillance, and tends to be tied to where you want to position a product or service.
So let’s set that macro level aside and touch the micro, where we get into individuals: user research. If you’re after alternative, possibly improved, ways of doing things, I’ve found that many ideas answering existing problems are born from hearing how individuals perceive and use whatever you’re researching. The macro approach of market research tends to circle conventional usage and perceptions, which is great for its intent but not necessarily the most suitable if the aim is a unique solution to a problem.
User testing (be it questionnaires, interviews, or all kinds of quirky tests) needs to be objective and utterly unframed not only when conducted, but equally so when designed.
An example of what not to do: while working abroad, albeit not at the previously mentioned ad agency, I found myself the senior creative in a studio where the local staff were sold on the idea that design in Japan could never be done well by a foreigner. Me.
Despite my protests (not to mention over a decade of working in Japan by then, backed by the previously mentioned wealth of research and a Japanese degree to boot), the staff insisted that, while I had at least sold them on conducting user testing, they would sneak questions into an early design survey asking whether the designs were “looking like they were produced by a [foreign person]”.
These kinds of questions tend to lead, and they’re a waste of focus in surveys. They might seem like just another way to get the most out of the people whose attention you already have, but I have seen enough examples of dilution, live, to know not to mix mindsets. The “too many chefs” issue distilled into an individual.
The survey did say that no one thought my design looked foreign (or, as the survey producer somewhat leadingly put it, “produced by non-Japanese”). That might count as a counterpoint to my own argument were it not for the other results and the free commentary after the survey. Once the idea of something specific to look for was introduced, the answers got vague and blurred. The survey got muddy, useless actually, and the studio head had to ask the producer to compile another survey and design test. One that was meaningful.
Granted, Japanese design tends to differ from much else, so the concerns are understandable. Which is the point of design as I’ve argued: it is defined by purpose. Should I design something in Japan, it will be purposed for the best reception in Japan. Should I design elsewhere, the design should reflect and resonate with that culture.
I rue, rue I say, the common notion that a designer from country X couldn’t design for users in country Y.
I thought I’d make a point by including designs in my US portfolio that look off there. There’s no possible way those solutions would work well here. If we’re on the same wavelength, a decent designer (at least through my perhaps limited view) would automatically understand this. Some won’t (and how could you ever blame someone for not having worked where I have worked), but the ones who do are the kind of designers I feel at ease with. That’s what I’m fishing for in a team.
Needs, trends, growth and the like I’ll leave out for now: partly because they’re too broad for this process sheet (and preferably handled by a dedicated marketing department), but also because the market-vs.-user approach I find useful design-wise can be widely applied to them all the same.
Moving on to another type of research…
Competitor research
So, know your market, right? Decent market research should include (or at least to some extent cover) the competition, but the two should be differentiated.
Conveniently, “the competition” sometimes ends up on a competitor analysis or SWOT chart as little more than a charted reference point for where one wants to position oneself relative to others. A bit of a lazy, disingenuous profile that doesn't help with much.
An actual quote:
I mostly work with what I'm given, as I'm not a market research specialist, but I try to avoid material of the nature quoted above whenever possible.
Useful practices are plentiful, but I can't list them all here (it would further dilute whatever coherency there was in this document). Sufficient for my work are honest analyses, even shallow ones, rather than deep dives. It's not rocket surgery, fgs.
This is part of what some call "media scanning". Interestingly enough, it may reveal the same sort of flawed results as the previously mentioned useless fluff, albeit from the viewpoint of the competing company. The difference, however, is that it gives a glimpse into the competitor's strategy for the future: whatever narratives are being pushed at the moment will cement later on.
Conclusive research
What am I ultimately trying to accomplish?
You might think this comes before market research, and you’ll be right at times (especially if you skipped the user research part and went straight here). Asking yourself what’s really up is one of the more important things to do.
However, doing that before research might skew the results if your disposition makes its way into questionnaires, interviews or tests. And, from experience, that happens a lot in large teams, where communication turns focus into a blurry general direction pointer as opposed to a sharp arrow.
- Small project with few people? Ask what’s really up early, if you can handle it.
- Large project with a relatively large group of people? Ask this early only if everyone can handle it.
What are we really trying to accomplish when company XYZ says they want to create the new Facebook, increase sales, or win back some credibility after spilling oil in the Atlantic? Taking the fictional oil spill as an example, and setting morals aside for a second, we may assume damage control is what’s sought, but a flat-out denial could have worked without tarnishing the image in the first place.
Problem analysis
Diving deeper into problem identification, we might want to analyze our findings.
As before, at the micro rather than the macro level. This can be done before or after market and user research, depending on what kind of answer one is looking for.
Protocol analysis
A protocol analysis is a great way to dissect problems. In essence, you break down every single step of a process, any process, and write the steps up.
Unlike, say, a flow chart, a protocol analysis is a transcript of every single step actually taken. It’s what you should be doing before a flow chart, as flow charts tend to suggest what should be happening (and often visually too). As such it’s a bit more encompassing. And the more steps you map, the easier it is to find hiccups and bottlenecks.
It’s not:
Step one: a user clicks the app and presses menu to access their options
...when analyzing protocols.
That’s, at best, flow-chart territory.
It’s...
1. Reach for device [location variable]
2. Bring device into personal visual field
3. Initiate device wake by pressing the phone’s on/off button
   3.1. If the device is secured by a passw …
   3.2. ....
…And so forth.
What’s the good in all of this (to some, seemingly overreaching) approach, one might ask? It’s simple: finding and exploring alternatives. We tend to overlook the steps we are accustomed to, and in doing so we may overlook both steps to improve and chances of other solutions entirely.
In that example, reread it: maybe instead of starting with a reach for the device, a voice command could bring up what you want in a jiffy?
Also, imagine the protocol analysis for starting a car ten years ago and compare it to today’s.
Whether what’s found is the true problem or not remains to be seen (we’re getting into that later), but such findings more often than not become ammunition for your solution(s).
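A transcript like the one above can just as well live as data, which makes it easy to grow and to question step by step. A minimal sketch (the step names here are invented for illustration, not from any real study) that prints a numbered transcript from nested steps:

```python
# A protocol transcript as nested data. Every micro-step gets its own
# numbered line, so each one can be questioned individually.
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str
    substeps: list = field(default_factory=list)

# Hypothetical protocol for opening an app on a phone.
protocol = [
    Step("Reach for device"),
    Step("Bring device into personal visual field"),
    Step("Press on/off button to wake device", [
        Step("If secured, enter passcode"),
        Step("If biometric, wait for face/fingerprint scan"),
    ]),
    Step("Locate app icon on home screen"),
    Step("Tap app icon"),
]

def transcript(steps, prefix=""):
    """Return the numbered transcript, one entry per micro-step."""
    lines = []
    for i, step in enumerate(steps, 1):
        number = f"{prefix}{i}"
        lines.append(f"{number}. {step.action}")
        lines.extend(transcript(step.substeps, prefix=f"{number}."))
    return lines

for line in transcript(protocol):
    print(line)
```

Because every micro-step is its own entry, each printed line becomes a candidate for the question "could this step be removed, merged, or replaced?", which is exactly where the voice-command alternative above comes from.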
Creative Tech analysis
- Could the problem be that existing solutions are sound, just not working efficiently?
- A smooth, intuitive, speedy interface might be stymied by slow servers.
- Fast servers might be stymied by poor code.
- Good code might be stymied by unoptimized graphics.
- ...and so on.
All well worth checking into, should the situation allow. A tech analysis might seem (a) obvious to some, and (b) like a small detour to others, considering many asks are simply to “increase XYZ because ABC is taking shares”. The truth is that sometimes simple things like responsiveness and smoothness determine more than marketing does. Off the top of my head: Viber vs. Line in Japan.
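If you want numbers rather than hunches, each suspect layer can be timed in isolation. A minimal sketch, where the `render()` function is a hypothetical stand-in for whatever layer is under suspicion (a server call, a code path, a graphics load):

```python
# Time a single layer over many runs and report median and worst case,
# so each layer in the chain can be measured and compared in isolation.
import time
import statistics

def probe(fn, runs=50):
    """Run fn repeatedly; return (median_ms, worst_ms)."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples), max(samples)

# Hypothetical stand-in for an unoptimized code path.
def render():
    sum(i * i for i in range(10_000))

median_ms, worst_ms = probe(render)
print(f"median {median_ms:.2f} ms, worst {worst_ms:.2f} ms")
```

Probing interface, server, code and graphics separately, rather than the whole chain at once, is what tells you which layer is doing the stymieing.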
Content analysis
Analysing the existing content might be high on the list (if not right there at the top), content is king and all that. But process-wise, analysing content might distract the analysts, since it forces focus onto normalizing the existing material. In other words, it colors all the previously mentioned analyses.
All in all, a lot of angles to consider. The payoff is considerable: