Categories
Design Strategy Talks & Workshops

Facilitating Investment Discussions in Strategy

In a previous post, I talked about the need for set of tools for quantifying and qualifying strategy that both empowers intuition and creativity, while also helping teams find objective ways to value design solutions to justify the experience investments in that bring us ever closer to our vision and goals.

In this post, I’ll focus on how to help teams facilitate investment discussions by finding ways to remove (or at least reduce) subjectivity when we compare, contrast, or debate the value of ideas, approaches, and solutions to justify investing in them.

TL;DR

  • We need objective ways to value design solutions to justify experience investments, and to look at the different points in strategic planning and execution to identify the discussions that strategists should facilitate (investment discussions, pivots, risk mitigation), while tracking and tracing the implementation of strategy to ensure we are bringing value to our customers and our business.
  • Designers must understand what objectives and unique positions business stakeholders want their products to assume in the industry, and the choices they are making in order to achieve those objectives and positions, before they can help facilitate investment discussions.
  • I’ve seen too many teams where a lot of decisions seem to be driven by the question “What can we implement with the least effort?” or “What are we able to implement?”, not by the question “What brings value to the user?”.
  • That said, I’ve also seen many designers make the mistake of focusing only on the needs of the user, at the expense of business needs and technological constraints.
  • Strategists need to help the team identify all three types of hypotheses underlying a business idea: desirability, feasibility, and viability.
  • The kinds of products we are trying to bring to the world today are complex, which makes discussions around desirability, feasibility, and viability very difficult; the key to dealing with such complexity is to focus on having good conversations about assumptions.
  • Once we acknowledge we are dealing with assumptions, we should frame discussions around work that needs to be done through building, measuring and learning.
  • When we’ve considered all our hypotheses, it’s essential to set priorities and remove distractions so that people can get on with providing service to customers, thus increasing profits and the value of the business.
  • Ask the business stakeholders and the team: what is our prioritisation policy, and how is it visualised? How does each item of work that has been prioritised help get us closer to our vision and our goals?
  • In my experience, I’ve found it helpful to come up with visualisations that help remove subjectivity while facilitating investment discussions.
  • We should be aware that facilitating investment discussions at different phases of the development process means different things: reducing ambiguity through better problem framing, making good decisions by creating great choices, and learning as fast and as cheaply as possible whether we should pivot, persevere, or stop.

Quantifying and Qualifying Strategy

“What do people need?” is a critical question to ask when you build a product. Wasting your life’s savings and your investor’s money, risking your reputation, making false promises to employees and potential partners, and trashing months of work you can never get back is a shame. It’s also a shame to find out you were completely delusional when you thought that everyone needed the product you were working on (Sharon, T., Validating Product Ideas, 2016).

In a previous article, I mentioned that we need objective ways to value design solutions to justify the experience investments, and to look at the different points in strategic planning and execution and identify the discussions that strategists should facilitate around what customers and users perceive as value, while tracking and tracing the implementation of strategy to ensure we are bringing value to both customers and the business.

Design is the activity of turning vague ideas, market insights, and evidence into concrete value propositions and solid business models. Good design involves the use of strong business model patterns to maximize returns and compete beyond product, price and technology.

Bland, D. J., & Osterwalder, A., Testing business ideas, (2020)

From that perspective, we need to find ways to:

  • Explore (and preferably test) ideas early
  • Facilitate investment discussions by objectively describing business and user value, establishing priorities
  • Assess the risk of pursuing ideas, while capturing signals that indicate if/when to pivot if an idea “doesn’t work”
  • Capture and track progress of strategy implementation
A holistic set of tools and frameworks for quantifying and qualifying strategy should help teams with: Pivot & Risk Mitigation (assessing risk, capturing signals, knowing when to pivot); Visibility and Traceability (capturing and tracking progress); Facilitating Investment Discussions (business/user value, priorities, effort, etc.); and Validating/Testing Ideas (finding objective ways to explore, and preferably test, ideas early).
Instead of a single metric to measure ROI, let’s look at the different discussions that need to be facilitated while quantifying and qualifying strategy, namely: Pivot and Risk Mitigation, Facilitating Investment Discussions, Validating / Testing Business Ideas, Visibility and Traceability.

In that previous article I went deep into quantification and metrics, so I suggest taking a look at it if you’re interested in measuring experiences.

Facilitating Investment Discussions

It is crucial that designers engage with their business stakeholders to understand what objectives and unique positions they want their products to assume in the industry, and the choices they are making in order to achieve those objectives and positions.

Six Strategic Questions, adapted from “Strategy Blueprint” in Mapping Experiences: A Guide to Creating Value through Journeys, Blueprints, and Diagrams (Kalbach, 2020).

Even if you clearly articulate the answers to the six strategic questions (what are our aspirations, what are our challenges, where will we focus, what are our guiding principles, what types of activities will we pursue), strategies can still fail, spectacularly, if you fail to establish management systems that support those choices. Without the supporting systems, structures, and measures for quantifying and qualifying outcomes, strategy remains a wish list, a set of goals that may or may not ever be achieved (“Manage What Matters” in Playing to Win: How Strategy Really Works, Lafley, A. G., & Martin, R. L., 2013).

Facilitating Investment Discussions around Value

As I mentioned in a previous post, designers must become skilled facilitators who respond, prod, encourage, guide, coach, and teach as they guide individuals and groups to make decisions that are critical in the business world through effective processes. There are few decisions harder than deciding how to prioritise.

I’ve seen too many teams where a lot of decisions seem to be driven by the question “What can we implement with the least effort?” or “What are we able to implement?”, not by the question “What brings value to the user?”.

From a user-centered perspective, the most crucial pivot that needs to happen in the conversation between designers and business stakeholders is the framing of value:

  • Business value
  • User value
  • Value to designers (sense of self-realisation? Did I impact someone’s life in a positive way?)

The mistake I’ve seen many designers make is to look at the prioritisation discussion as a zero-sum game: our user-centered design toolset may have focused too much on the needs of the user, at the expense of business needs and technological constraints.

That said, there is a case to be made that designers should worry about strategy because it helps shape the decisions that not only create value for users, but also value for employees.

Companies that achieve enduring financial success create substantial value for their customers, their employees, and their suppliers.

Oberholzer-Gee, F., Better, Simpler Strategy (2021)

Therefore, a strategic initiative is worthwhile only if it does one of the following (Oberholzer-Gee, F., Better, Simpler Strategy, 2021):

  • Creates value for customers by raising their willingness to pay (WTP): If companies find ways to innovate or to improve existing products, people will be willing to pay more. In many product categories, Apple gets to charge a price premium because the company raises the customers’ WTP by designing beautiful products that are easy to use, for example. WTP is the most a customer would ever be willing to pay. Think of it as the customer’s walk-away point: Charge one cent more than someone’s WTP, and that person is better off not buying. Too often, managers focus on top-line growth rather than on increasing willingness to pay. A growth-focused manager asks, “What will help me sell more?” A person concerned with WTP wants to make her customers clap and cheer. A sales-centric manager analyzes purchase decisions and hopes to sway customers, whereas a value-focused manager searches for ways to increase WTP at every stage of the customer’s journey, earning the customer’s trust and loyalty. A value-focused company convinces its customers in every interaction that it has their best interests at heart.
  • Creates value for employees by making work more appealing: When companies make work more interesting, motivating, and flexible, they are able to attract talent even if they do not offer industry-leading compensation. Paying employees more is often the right thing to do, of course. But keep in mind that more-generous compensation does not create value in and of itself; it simply shifts resources from the business to the workforce. By contrast, offering better jobs not only creates value, it also lowers the minimum compensation that you have to offer to attract talent to your business, or what we call an employee’s willingness-to-sell (WTS) wage. Offer a prospective employee even a little less than her WTS, and she will reject your job offer; she is better off staying with her current firm. As is the case with prices and WTP, value-focused organizations never confuse compensation and WTS. Value-focused businesses think holistically about the needs of their employees (or the factors that drive WTS).
  • Creates value for suppliers by reducing their operating cost: Like employees, suppliers expect a minimum level of compensation for their product. A company creates value for its suppliers by helping them raise their productivity. As suppliers’ costs go down, the lowest price they would be willing to accept for their goods—what we call their willingness-to-sell (WTS) price—falls. When Nike, for example, created a training center in Sri Lanka to teach its Asian suppliers lean manufacturing, the improved production techniques helped suppliers reap better profits, which they then shared with Nike.
Oberholzer's Value Stick
The Value Stick is an interesting tool that provides insight into where the value is in a product or service. It relates directly to Michael Porter’s Five Forces, reflecting how strong those forces are through four levers: Willingness to Pay (WTP), Price, Cost, and Willingness to Sell (WTS). The difference between Willingness to Pay (WTP) and Willingness to Sell (WTS), the length of the stick, is the value that a firm creates (Oberholzer-Gee, F., Better, Simpler Strategy, 2021).

This idea is captured in a simple graph, called a value stick. WTP sits at the top and WTS at the bottom. When companies find ways to increase customer delight and increase employee satisfaction and supplier surplus (the difference between the price of goods and the lowest amount the supplier would be willing to accept for them), they expand the total amount of value created and position themselves for extraordinary financial performance. 
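The decomposition above can be sketched in a few lines of code. The numbers are purely hypothetical, and the three slices (customer delight, firm margin, supplier surplus) follow the definitions in the text:

```python
# A minimal sketch of Oberholzer-Gee's value stick. The four levers are
# WTP, price, cost, and WTS; all numbers below are illustrative only.

def value_stick(wtp: float, price: float, cost: float, wts: float) -> dict:
    """Decompose the total value a firm creates into its three slices."""
    assert wtp >= price >= cost >= wts, "levers must satisfy WTP >= price >= cost >= WTS"
    return {
        "customer_delight": wtp - price,   # value captured by customers
        "firm_margin": price - cost,       # value captured by the firm
        "supplier_surplus": cost - wts,    # value captured by suppliers
        "total_value_created": wtp - wts,  # the length of the stick
    }

# Hypothetical numbers for illustration
slices = value_stick(wtp=100, price=70, cost=40, wts=25)
print(slices)
# The total value created (WTP - WTS) equals the sum of the three slices,
# which is why raising WTP or lowering WTS expands value for everyone.
```

Raising WTP lengthens the stick from the top; lowering WTS lengthens it from the bottom, which is exactly how the three value-creation moves in the list above expand the pie.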

Organizations that exemplify value-based strategy demonstrate some key behaviours (Oberholzer-Gee, F., “Eliminate Strategic Overload” in Harvard Business Review, 2021):

  • They focus on value, not profit. Perhaps surprisingly, value-focused managers are not overly concerned with the immediate financial consequences of their decisions. They are confident that superior value creation will result in improved financial performance over time.
  • They attract the employees and customers whom they serve best. As companies find ways to move WTP or WTS, they make themselves more appealing to customers and employees who particularly like how they add value.
  • They create value for customers, employees, or suppliers (or some combination) simultaneously. Traditional thinking, informed by our early understanding of success in manufacturing, holds that costs for companies will rise if they boost consumers’ willingness to pay—that is, it takes more-costly inputs to create a better product. But value-focused organizations find ways to defy that logic.

While in the past designers would concentrate on enhancing desirability, the emerging strategic role of designers means they have to balance desirability, feasibility and viability simultaneously. Designers need to expand their profiles and master a whole new set of strategic practices.

“Strategic Designers: Capital T-shaped professionals” in Strategic Design (Calabretta et al., 2016)

For such a conversation pivot to focus on value to happen, designers will need to get better at influencing the strategy of their design projects. However, some designers lack the vocabulary, tools, and frameworks to influence it in ways that drive the user experience vision forward, such as advocating for decisions that increase our customers’ Willingness to Pay (WTP) by, for example, increasing customer delight.

Learn more about how unprepared designers are when they are unable to understand and influence strategy in Becoming a Design Strategist (Photo by Pixabay on Pexels.com)

To understand the risk and uncertainty of your idea you need to ask: “What are all the things that need to be true for this idea to work?” This will allow you to identify all three types of hypotheses underlying a business idea: desirability, feasibility, and viability (Bland, D. J., & Osterwalder, A., Testing business ideas, 2020):

  • Desirability (do they want this?) relates to the risk that the market a business is targeting is too small; that too few customers want the value proposition; or that the company can’t reach, acquire, and retain targeted customers.
  • Feasibility (Can we do this?) relates to the risk that a business can’t manage, scale, or get access to key resources (technology, IP, brand, etc.). This isn’t just technical feasibility; we also need to look at the overall regulatory, policy, and governance constraints that would prevent you from making your solution a success.
  • Viability (Should we do this?) relates to the risk that a business cannot generate more revenue than costs (revenue stream and cost stream). While customers may want your solution (desirable) and you can build it (feasible), perhaps there’s not enough of a market for it or people won’t pay enough for it. 
A Venn diagram representing the intersection between Desirability, Viability and Feasibility.
 The Sweet Spot of Innovation in Brown, T., & Katz, B., Change By Design (2009)

Design strategists should help teams find objective ways to value design ideas, approaches, and solutions to justify the investment in them across desirability, feasibility, and viability.

Quantifying and Qualifying Desirability

It’s been my experience that a lot of product decisions seem to be driven by the question “What can we implement with the least effort?” or “What are we able to implement?”, not by the question “What brings value to the user?”

In the previous article, I’ve mentioned that — when customers evaluate a product or service — they weigh its perceived value against the asking price. Marketers have generally focused much of their time and energy on managing the price side of that equation, since raising prices can immediately boost profits. But that’s the easy part: Pricing usually consists of managing a relatively small set of numbers, and pricing analytics and tactics are highly evolved. What consumers truly value, however, can be difficult to pin down and psychologically complicated (Almquist, E., Senior, J., & Bloch, N., The Elements of Value, 2016).

The Elements of Value Pyramid: Functional at the lowest level, then Emotional, then Life Changing, and Social Impact at the top.
“30 Elements of Value” in The Elements of Value (Almquist, E., Senior, J., & Bloch, N., 2016)

How can leadership teams actively manage value or devise ways to deliver more of it, whether functional (saving time, reducing cost) or emotional (reducing anxiety, providing entertainment)? Discrete choice analysis—which simulates demand for different combinations of product features, pricing, and other components—and similar research techniques are powerful and useful tools, but they are designed to test consumer reactions to preconceived concepts of value—the concepts that managers are accustomed to judging (Almquist, E., Senior, J., & Bloch, N., The Elements of Value, 2016).

So how do you facilitate discussions that help teams clearly see value from different angles? I’ve found that alignment diagrams are really good for getting teams to have qualifying discussions around value. Here are some examples below:

Visualising Desirability through Alignment Diagrams

Alignment diagrams refer to any map, diagram, or visualization that reveals both sides (business and users) of value creation in a single overview. They are a category of diagram that illustrates the interaction between people and organizations (Kalbach, J., “Visualizing Value: Aligning Outside-in” in Mapping Experiences, 2021).

Customer Journey Maps are visual thinking artifacts that help you get insight into, track, and discuss how a customer experiences a problem you are trying to solve. How does this problem or opportunity show up in their lives? How do they experience it? How do they interact with you? (Lewrick, M., Link, P., & Leifer, L., The design thinking playbook. 2018).

Experience Maps look at a broader context of human behavior. They reverse the relationship and show how the organization fits into a person’s life (Kalbach, J., ”Visualizing Value: Aligning Outside-in” in Mapping Experiences, 2021).

User story mapping is a visual exercise that helps product managers and their development teams define the work that will create the most delightful user experience. User Story Mapping allows teams to create a dynamic outline of a set of representative user’s interactions with the product, evaluate which steps have the most benefit for the user, and prioritise what should be built next (Patton, J.,  User Story Mapping: Discover the whole story, build the right product, 2014).

Opportunity Solution Trees are a simple way of visually representing the paths you might take to reach a desired outcome (Torres, T., Continuous Discovery Habits: Discover Products that Create Customer Value and Business Value, 2021)

Service Blueprints are visual thinking artifacts that help to capture the big picture and interconnections, and are a way to plan out projects and relate service design decisions back to the original research insights. The blueprint is different from the service ecology in that it includes specific detail about the elements, experiences, and delivery within the service itself (Polaine, A., Løvlie, L., & Reason, B., Service design: From insight to implementation, 2013).

Strategy Canvases help you compare how well competitors meet customer buying criteria or desired outcomes. To create your own strategy canvas, list the 10-12 most important functional desired outcomes (or buying criteria) on the x-axis. On the y-axis, list the 3-5 most common competitors (direct, indirect, alternative solutions, and multi-tool solutions) for the job (Garbugli, É., Solving Product, 2020).

Learn more about how to use Visual Thinking methods to visualise Value and Desirability in Strategy, Facilitation and Visual Thinking (Photo by Christina Morillo on Pexels.com)

While alignment diagrams are good for facilitating discussions around qualifying value by bringing both business and user perspectives together, there is still a need for quantifying value objectively. Let’s look at why.

Quantifying Desirability, Value, and Satisfaction

When product managers, designers, and strategists are crafting their strategy or working on a discovery phase, the kind of user and customer insights they are looking for is really hard to acquire through quantitative metrics, either because we cannot derive insights from the product’s existing analytics, or because we are creating something new (so there are no numbers to refer to). Most of such insights (especially desirability and satisfaction) come from preference data.

Preference data consists of the more subjective data that measures a participant’s feelings or opinions of the product.

Rubin, J., & Chisnell, D., Handbook of usability testing: How to plan, design, and conduct effective tests (2011)

Just because preference data is more subjective doesn’t mean it is less quantifiable: although design and several usability activities are certainly qualitative, the image of good and bad designs can easily be quantified through metrics like perceived satisfaction, recommendations, etc. (Sauro, J., & Lewis, J. R., Quantifying the User Experience: Practical Statistics for User Research, 2016).

Preference data is typically collected via written, oral, or even online questionnaires, or through the debriefing session of a test. A rating scale that measures how a participant feels about the product is an example of a preference measure (Rubin, J., & Chisnell, D., Handbook of Usability Testing, 2011).

Learn about ways to objectively measure the value of design in The Need for Quantifying and Qualifying Strategy (Photo by Pixabay on Pexels.com)

You can find examples of preference data that design strategists can collect to inform strategic decisions in my previous post, so I’ll just mention the ones that I find get the most traction with business stakeholders.

Jobs To Be Done (JTBD) and Outcome-Driven Innovation

Outcome-Driven Innovation (ODI) is a strategy and innovation process built around the theory that people buy products and services to get jobs done. It links a company’s value creation activities to quantifying and qualifying customer-defined metrics. Ulwick found that previous innovation practices were ineffective because they were incomplete, overlapping, or unnecessary.

Outcome-Driven Innovation® (ODI) is a strategy and innovation process that enables a company to create and market winning product and service offerings with a success rate that is 5-times the industry average

Ulwick, A.,  What customers want: Using outcome-driven innovation to create breakthrough products and services (2005)

Clayton Christensen credits Ulwick and Richard Pedi of Gage Foods with the way of thinking about market structure used in the chapter “What Products Will Customers Want to Buy?” in his Innovator’s Solution and called “jobs to be done” (JTBD) or “outcomes that customers are seeking”.

Opportunity Scores (UX matrix)

Ulwick’s “opportunity algorithm” measures and ranks innovation opportunities. Standard gap analysis looks at the simple difference between importance and satisfaction metrics; Ulwick’s formula gives twice as much weight to importance as to satisfaction, where importance and satisfaction are the proportions of high survey responses.
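As a sketch, and assuming the common formulation in which the importance-satisfaction gap is clamped at zero (so overserved outcomes are not rewarded), the scores for a few hypothetical outcomes could be computed like this:

```python
# A sketch of Ulwick's opportunity algorithm. Importance and satisfaction
# are assumed to be the proportion of high survey responses, expressed on
# a 0-10 scale; the outcomes and numbers below are hypothetical.

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Opportunity = Importance + max(Importance - Satisfaction, 0).

    When satisfaction lags importance, this equals 2 * importance -
    satisfaction, i.e. importance is weighted twice as heavily."""
    return importance + max(importance - satisfaction, 0.0)

# Hypothetical survey results: outcome -> (importance, satisfaction)
outcomes = {
    "minimise time to set up": (8.2, 3.1),   # underserved: big gap
    "avoid manual data entry": (7.5, 7.0),   # adequately served
    "export to legacy format": (4.0, 9.0),   # overserved: satisfaction > importance
}

# Rank outcomes by opportunity, highest first
for name, (imp, sat) in sorted(outcomes.items(),
                               key=lambda kv: -opportunity_score(*kv[1])):
    print(f"{name}: {opportunity_score(imp, sat):.1f}")
```

The ranking surfaces the underserved outcome first, which is exactly the targeting signal the next section discusses.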

You’re probably asking yourself, “Where do these values come from?” That’s where user research comes in handy: once you’ve got the list of use cases, you go back to your users and probe how important each use case is, and how satisfied they are with the product with regard to each use case.

Once you’ve obtained the opportunity scores for each use case, what comes next? The scores reveal two complementary pieces of information: where the market is underserved and where it is overserved. We can use this information to make some important targeting and resource-related decisions.

Opportunity Scores graph: plotting the jobs to be done to map where the market is underserved and where it is overserved (Ulwick, A., What Customers Want: Using Outcome-Driven Innovation to Create Breakthrough Products and Services, 2005)
The Importance versus Satisfaction Framework

Similar to Outcome-Driven Innovation, this framework proposes that you quantify and qualify the customer need that any particular feature of the product is going to address (Olsen, D., The Lean Product Playbook, 2015):

  • How important is that?
  • Then how satisfied are people with the current alternatives that are out there?
Importance versus Satisfaction quadrants in The Lean Product Playbook (Olsen, D., 2015): you want to build things that address highly important needs with low satisfaction.

What I like about Olsen’s approach to assessing opportunities is that he created a couple of variations of opportunity scores:

  • Customer Value Delivered = Importance x Satisfaction
  • Opportunity to Add Value = Importance x (1 – Satisfaction)
  • Opportunity = Importance – Current Value Delivered
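Assuming importance and satisfaction are normalised to the 0-1 range (an assumption on my part, for illustration), the three scores above can be written directly, which also makes it easy to see that the last two are algebraically the same quantity:

```python
# A sketch of Olsen's importance-versus-satisfaction scores, assuming both
# inputs are normalised to the 0-1 range (hypothetical scaling).

def value_delivered(importance: float, satisfaction: float) -> float:
    """Customer Value Delivered = Importance x Satisfaction."""
    return importance * satisfaction

def opportunity_to_add_value(importance: float, satisfaction: float) -> float:
    """Opportunity to Add Value = Importance x (1 - Satisfaction)."""
    return importance * (1.0 - satisfaction)

def opportunity(importance: float, satisfaction: float) -> float:
    """Opportunity = Importance - Current Value Delivered.

    Expanding: I - I*S = I * (1 - S), i.e. the same quantity as
    opportunity_to_add_value, just derived from a different angle."""
    return importance - value_delivered(importance, satisfaction)

# A highly important (0.9), poorly satisfied (0.2) need is a big opportunity
print(opportunity_to_add_value(0.9, 0.2))
```

This is why the framework pushes you toward the “high importance, low satisfaction” quadrant: both opportunity formulas peak exactly there.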
Underserved versus Overserved Markets

Jobs to be Done helps identify markets where customer needs are underserved (and therefore ripe for disruption), but it also helps us find out where the market is overserved. Jobs and outcomes that are unimportant or already satisfied represent little opportunity for improvement and consequently should not receive any resource allocation. In most markets, it is not uncommon to find a number of outcomes that are overserved, and companies that are nevertheless continuing to allocate them development resources. We say that an outcome is overserved when its satisfaction rating is higher than its importance rating. When a company discovers these overserved outcomes, it should consider the following three avenues for possible action (Ulwick, A. W., What Customers Want, 2005):

  1. If the company is currently focusing on these overserved outcomes, those efforts should be halted. Making additional improvements in areas that are already overserved is simply a waste of resources and is likely to add cost without adding additional value.
  2. If cost reduction is an important consideration in the market, then costs can be reduced by taking out costly functionality in areas that are overserved. For example, if a five-dollar feature can be redesigned so that it satisfies an outcome 80 percent as well as it does currently but for half the cost, then the company may want to make this trade-off.
  3. If many overserved outcomes are discovered in a market, then the company should consider the possibility of engaging in disruptive innovation. This would mean taking out cost along multiple dimensions and creating a lower-cost business model that existing competitors would be unable to match. The concept of a low-end disruptive innovation, as described in The Innovator’s Solution, is only possible when the customer population, or a segment of that population, is overserved.

In the outcome-driven paradigm, companies do not brainstorm hundreds of ideas and then struggle to figure out which — if any — have value. Instead they figure out which of the 50 to 150 outcomes for a given job are important and unsatisfied and then systematically devise a few ideas that will better satisfy those underserved outcomes.

Ulwick, A. W., What customers want (2005)

Since they know which outcomes are underserved, they know where to make improvements, and, more importantly, they know doing so will result in products that customers want. This flips the innovation process on its head (Ulwick, A. W., What customers want, 2005).

In the outcome-driven paradigm the focus is not on the customer, it is on the Jobs to be Done (JTBD): the job is the unit of analysis. When companies focus on helping the customer get a job done faster, more conveniently, and less expensively than before, they are more likely to create products and services that the customer wants.
Learn how Jobs to be Done (JTBD) work as a great “exchange” currency to facilitate strategy discussions around value between designers, business stakeholders and technology people (Photo by Blue Bird on Pexels.com)
Kano Model

The Kano Model, developed by Dr. Noriaki Kano, is a way of classifying customer expectations into three categories: expected needs, normal needs, exciting needs. This hierarchy can be used to help with our prioritization efforts by clearly identifying the value of solutions to the needs in each category (“Kano Model” in Product Roadmaps Relaunched, Lombardo, C. T., McCarthy, B., Ryan, E., & Connors, M., 2017):

  • The customer’s expected needs are roughly equivalent to the critical path: if those needs are not met, they become dissatisfiers.
  • If you meet the expected needs, customers will start articulating normal needs, or satisfiers — things they don’t normally need in the product but will satisfy them.
  • When normal needs are largely met, then exciting needs (delighters or wows) go beyond the customers’ expectations.
Kano Model Analysis in Product Design: investment on the x-axis, satisfaction on the y-axis.

The Kano methodology was initially adopted by operations researchers, who added statistical rigor to the question pair results analysis. Product managers have leveraged aspects of the Kano approach in Quality Function Deployment (QFD). More recently, this methodology has been used by Agile teams and in market research (Moorman, J., “Leveraging the Kano Model for Optimal Results” in UX Magazine, 2012).

Learn more about quantifying and qualifying user needs and delight using the Kano Method in Measuring User Delight Using the Kano Methodology (Moorman, J., 2012)
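As a rough sketch of how the three categories above could be assigned programmatically (the two scoring axes and the 0.5 threshold are my own hypothetical simplification, not part of Kano’s question-pair methodology):

```python
# A minimal sketch of Kano-style classification. It assumes features have
# already been scored, e.g. from question-pair surveys, on two hypothetical
# 0-1 axes: dissatisfaction when the need is unmet, satisfaction when met.

def kano_category(dissatisfaction_if_absent: float,
                  satisfaction_if_present: float,
                  threshold: float = 0.5) -> str:
    """Rough mapping onto the three categories described in the text."""
    if dissatisfaction_if_absent >= threshold and satisfaction_if_present < threshold:
        # Meeting the need is taken for granted; missing it dissatisfies
        return "expected (dissatisfier if missing)"
    if dissatisfaction_if_absent >= threshold and satisfaction_if_present >= threshold:
        # More is better: satisfaction scales with how well the need is met
        return "normal (satisfier)"
    if satisfaction_if_present >= threshold:
        # Not missed when absent, but delights when present
        return "exciting (delighter)"
    return "indifferent"

print(kano_category(0.9, 0.2))  # an expected need
print(kano_category(0.1, 0.9))  # an exciting need
```

A real Kano study derives these categories from paired functional/dysfunctional survey questions; the point of the sketch is only that the hierarchy can feed prioritisation as data, not opinion.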

Quantifying and Qualifying Feasibility

Maybe I’m an idealist, but I believe everything is feasible, given enough time and resources. The task of strategists then becomes to understand the expectations of stakeholders, facilitate the discussions necessary to identify the gap between the vision and the current state, and then work out what needs to be true to get to that vision.

Learn more about creating product vision in The Importance of Vision (Photo by Pixabay on Pexels.com)

That said, the gap between the current state and the vision can only be closed by the people who are actually going to do the work, which is why I think a lot of projects fail: decisions (e.g. roadmaps, release plans, investment priorities) are made without involving the people who are actually going to do the work.

We need to ensure feasibility before we decide, not after. Not only does this end up saving a lot of wasted time, but it turns out that getting the engineers’ perspective earlier also tends to improve the solution itself, and it’s critical to shared learning (Cagan, M., Inspired: How to create tech products customers love, 2017).

If the first time your developers see an idea is at the sprint planning, you have failed.

Cagan, M., Inspired: How to create tech products customers love (2017)

My dad was a civil engineer with the Brazilian Air Force Corps of Engineers, responsible for building major infrastructure projects in the Amazon Basin. Even though engineering and architecture are disciplines thousands of years old, he used to say there was a degree of “art” in coming up with project costs and timelines. That’s why he worked (based on his previous experience) with estimates.

Different organizations, industries, and sectors employ different models or formulae to estimate time. At first sight they always seem mathematical, but in most cases their effectiveness is psychological: either overcoming an aversion to estimating, or encouraging more careful thought in those who tend to rush in. Perhaps the most widely known is the PERT formula (Program Evaluation and Review Technique). To use PERT you need three estimates of the time it could take to complete a task or activity (Baron, E., The Book of Management: the ten essential skills for achieving high performance, 2010):

  • The most likely time required (Tm)
  • The most optimistic time assessment (To)
  • The most pessimistic time assessment (Tp)

Then use the following formula to estimate the most probable duration for that activity (Te):

Te = (To + 4Tm + Tp)/6
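Encoded as a quick sketch (the task durations below are invented for illustration; the spread estimate is the conventional PERT companion formula, not part of the text above):

```python
def pert_estimate(to: float, tm: float, tp: float) -> float:
    """Most probable duration: Te = (To + 4*Tm + Tp) / 6."""
    return (to + 4 * tm + tp) / 6

def pert_std_dev(to: float, tp: float) -> float:
    """Conventional PERT spread estimate: (Tp - To) / 6."""
    return (tp - to) / 6

# Optimistic 4 days, most likely 6 days, pessimistic 14 days:
print(pert_estimate(4, 6, 14))  # 7.0
```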

No matter what the type, size, or budget of a project is, estimating can be a daunting task. Every project request comes with a set of unknowns, or a gray area that makes a team or individual nervous about expectations concerning cost, timelines, and level of effort. Because the gray area changes from project to project, there is no simple way of saying, “It always takes us this long to do this thing” without qualifying it with some additional factors (“with these people, on this project, in this place, at this time, etc.”). (Harned, B, Project Management for Humans, 2017).

That said, even the most experienced engineer (or designer, or architect, etc.) needs to deal with uncertainty by acknowledging that even our best estimates may be based only on assumptions.

With regards to estimations and assumptions, here are a few wise words from my mentors Karsten Hess, Jon Innes, and Richard Howard:

  • Estimations are more useful for representing agreement than reality: put the numbers in, compare and learn from other estimators, assess what we can agree on, commit, and move on.
  • One can only estimate work one has actually done: if you’ve never done this kind of work, acknowledge it, produce your best guesses, monitor the progress of the work, and compare your estimates with reality once implementation happens (that’s when the traceability and visibility aspect of the Quantifying and Qualifying Strategy framework comes in). You will get better with practice.
  • We should never commit to estimates without consulting the people responsible for or affected by the work: when assessing things like “T-shirt sizes”, “business value”, or “effort”, have your first pass, but keep in mind that you’ll need to confirm and agree on the estimates with whoever is responsible for or affected by the work before communicating your decisions.
Learn more about requirements and estimations in Strategy, Planning and Project Management (Photo by Startup Stock Photos on Pexels.com)

Many companies try to deal with complexity with analytical firepower and sophisticated mathematics. That is unfortunate, since the most essential elements of creating a hypothesis can typically be communicated through simple pencil-and-paper sketches (Govindarajan, V., & Trimble, C., The other side of innovation: Solving the execution challenge, 2010.)

The key to dealing with complexity is to focus on having good conversations about assumptions.

Break Down the Hypothesis in The other side of innovation: Solving the execution challenge, Govindarajan, V., & Trimble, C., (2010)
Help teams with facilitating investment discussions with Assumptions Mapping in Bland, D. J., & Osterwalder, A., Testing business ideas (2020)

This is why a lot of teams are taking cues from the Lean playbook and framing discussions around the work that needs to be done through building, measuring, and learning. As you sit down with your teams to plan out your next initiative, ask them these questions (Gothelf, J., & Seiden, J., Sense and respond, 2017):

  • What is the most important thing (or things) we need to learn first?
  • What is the fastest, most efficient way to learn that?
Learn more about how asking questions can help teams with facilitating investment discussions that ensure teams are making good decisions in Strategy, Facilitation, and the Art of Asking Questions (Photo by Burak K on Pexels.com)

If you only have one hypothesis to test it’s clear where to spend the time you have to do discovery work. If you have many hypotheses, how do you decide where your precious discovery hours should be spent? Which hypotheses should be tested? Which ones should be de-prioritised or just thrown away? To help answer this question, Jeff Gothelf put together the Hypothesis Prioritisation Canvas (Gothelf, J., The hypothesis prioritization canvas, 2019):

The hypothesis prioritization canvas helps facilitate an objective conversation with your team and stakeholders to determine which hypotheses will get your attention and which won’t (Gothelf, J., 2019)
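The quadrant logic behind the canvas can be sketched as follows. This is my hedged reading of it: each hypothesis is scored on risk and perceived value, and its quadrant suggests an action. The 0-to-1 scale, the threshold, and the action labels are illustrative assumptions, not part of Gothelf’s canvas itself.

```python
# Illustrative quadrant triage for hypotheses, loosely in the spirit of a
# hypothesis prioritisation canvas. Scales, threshold, and labels are
# assumptions made for this sketch.

def triage(risk: float, value: float, threshold: float = 0.5) -> str:
    """Suggest what to do with a hypothesis scored 0..1 on risk and value."""
    if risk >= threshold and value >= threshold:
        return "test"          # risky but valuable: spend discovery hours here
    if risk < threshold and value >= threshold:
        return "build"         # valuable and well understood: just do it
    if risk >= threshold:
        return "discard"       # risky and low value: not worth the effort
    return "deprioritise"      # low risk, low value: revisit later

# Invented example hypotheses:
hypotheses = {
    "users will pay for offline mode": (0.8, 0.9),
    "a new logo increases trust":      (0.7, 0.2),
}
for name, (risk, value) in hypotheses.items():
    print(f"{name}: {triage(risk, value)}")
```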

Feasibility Hypothesis

The Business Model Canvas contains infrastructure risk in the key partners, key activities, and key resources components. Identify the feasibility hypotheses you are making (Bland, D. J., & Osterwalder, A., Testing Business Ideas: A Field Guide for Rapid Experimentation, 2019):

Key Activities

We believe that we:

  • can perform all activities (at scale) and at the right quality level that is required to build our business model.

Key Resources

We believe that we:

  • can secure and manage all technologies and resources (at scale) that are required to build our business model, including intellectual property and human, financial, and other resources.

Key Partners

We believe that we:

  • can create the partnerships required to build our business.

Quantifying and Qualifying Viability

As with feasibility, we need to validate the business viability of our ideas during discovery, not after (Cagan, M., Inspired: How to create tech products customers love, 2017).

It’s absolutely critical to ensure that the solution we build will meet the needs of our business — before we take the time and expense to build out the product.

Cagan, M., Inspired: How to create tech products customers love (2017)

Viability Hypothesis

The Business Model Canvas contains financial risk in the revenue streams and cost structure components. Identify the viability hypotheses you are making (Bland, D. J., & Osterwalder, A., Testing Business Ideas: A Field Guide for Rapid Experimentation, 2019):

Revenue Streams

We believe that we:

  • can get customers to pay a specific price for our value propositions.

  • can generate sufficient revenues.

Cost Structure

We believe that we:

  • can manage costs from our infrastructure and keep them under control.

Profit

We believe that we:

  • can generate more revenues than costs in order to make a profit.

Facilitating Investment Discussions through clear Priorities

It’s essential to set priorities and remove distractions so that people can get on with providing service to customers, thus increasing profits and the value of the business (Kourdi, J., Business Strategy: A guide to effective decision-making, 2015).

As a design manager, while defining and shaping the product design vision to ensure cohesive product narratives through sound strategy and design principles, I’ve always found that the way priorities are defined can create a disconnect from the vision, especially when tough choices around scope need to be made. It’s important that we facilitate discussions around priorities, so that the hard choices that need to be made take into account not just feasibility, but also viability and desirability.

The challenge, though, is to get the team to clearly connect goals to priorities.

What slows progress and wastes the most time on projects is confusion about what the goals are or which things should come before which other things. Many miscommunications and missteps happen because person A assumed one priority (make it faster), and person B assumed another (make it more stable). This is true for programmers, testers, marketers, and entire teams of people. If these conflicts can be avoided, more time can be spent actually progressing toward the project goals (Berkun, S., Making things happen: Mastering project management, 2008).

Product Definition and Requirements Prioritization
Visualising the impact on the user experience of any given use case, based on its opportunity score, helps the product decision-making process by providing a better sense of priorities.

Priorities Make Things Happen

Berkun, S., Making things happen: Mastering project management (2008)

If you have priorities in place, you can always ask questions in any discussion that reframe the argument around a more useful primary consideration. This refreshes everyone’s sense of what success is, visibly dividing the universe into two piles: things that are important and things that are nice, but not important. Here are some sample questions (Berkun, S., Making things happen: Mastering project management, 2008):

  • What problem are we trying to solve?
  • If there are multiple problems, which one is most important?
  • How does this problem relate to or impact our goals?
  • What is the simplest way to fix this that will allow us to meet our goals?
Learn more about how to help teams with facilitating investment discussions with Alignment Diagrams in Strategy and Prioritisation (Photo by Breakingpic on Pexels.com)

Clarity of Priorities through Visualisations

There are a few things you should ask yourself and/or the team when you keep revisiting and renegotiating the scope of work (DeGrandis, D., Making work visible: Exposing time theft to optimize workflow, 2017):

  • What is your prioritisation policy, and how is it visualised? How does each and every item of work that has been prioritised help get us closer to our vision and our goals?
  • How will you signal when work has been prioritised and is ready to be worked on? In other words — where is your line of commitment? How do people know which work to pull?
  • How will we visually distinguish between higher-priority and lower-priority work?

From that perspective, I find it important to come up with visualisations that help remove subjectivity from investment discussions. Let’s talk about a few examples.

Opportunity-Solution Tree

Many teams generate a lot of ideas when they go through a journey-mapping or experience-mapping exercise. There are so many opportunities for improving things for the customer that they quickly become overwhelmed by a mass of problems, solutions, needs, and ideas without much structure or priority (“Opportunity-Solution Tree” in Product Roadmaps Relaunched, Lombardo, C. T., McCarthy, B., Ryan, E., & Connors, M., 2017).

Opportunity solution trees are a simple way of visually representing the paths you might take to reach a desired outcome (Torres, T., Continuous Discovery Habits, 2021):

  • The root of the tree is your desired outcome: the business need that reflects how your team can create business value.
  • Below the desired outcome is the opportunity space: the customer needs, pain points, and desires that, if addressed, will drive the desired outcome.
  • Below the opportunity space is the solution space. This is where we’ll visually depict the solutions we are exploring.
  • Below the solution space are assumption tests. This is how we’ll evaluate which solutions will help us best create customer value in a way that drives business value.
“Opportunity Solution Tree” in Continuous Discovery Habits: Discover Products that Create Customer Value and Business Value (Torres, T., 2021)
Opportunity-Solution Trees (OST) are a simple way of visually representing the paths you might take to reach a desired outcome (Torres, T., Continuous Discovery Habits, 2021)

Opportunity solution trees have a number of benefits. They help product trios (Torres, T., Continuous Discovery Habits: Discover Products that Create Customer Value and Business Value, 2021):

  • Resolve the tension between business needs and customer needs
  • Build and maintain a shared understanding of how they might reach their desired outcome
  • Adopt a continuous mindset
  • Unlock better decision-making
  • Unlock faster learning cycles
  • Build confidence in knowing what to do next
  • Unlock simpler stakeholder management
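The four-level structure described above can be sketched as a simple tree data structure. This is a minimal illustration, not Torres’s notation; the labels and the example outcome, opportunity, solution, and assumption test are invented.

```python
from dataclasses import dataclass, field

# Minimal sketch of an opportunity solution tree: outcome at the root,
# then opportunities, solutions, and assumption tests below it.
# All labels are invented for illustration.

@dataclass
class Node:
    label: str
    kind: str  # "outcome", "opportunity", "solution", or "assumption test"
    children: list = field(default_factory=list)

tree = Node("Increase repeat purchases", "outcome", [
    Node("Customers forget to reorder", "opportunity", [
        Node("Send reorder reminder emails", "solution", [
            Node("Customers open reminder emails", "assumption test"),
        ]),
    ]),
])

def show(node: Node, depth: int = 0) -> None:
    """Print the tree with indentation reflecting each level."""
    print("  " * depth + f"[{node.kind}] {node.label}")
    for child in node.children:
        show(child, depth + 1)

show(tree)
```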

Impact Mapping

Like highway maps that show towns and cities and the roads connecting them, impact maps lay out what we will build and how it connects to the ways we will assist the people who will use the solution. An impact map is a visualisation of scope and underlying assumptions, created collaboratively by senior technical and business people. It’s a mind map grown during a discussion facilitated by answering four questions about the problem the team is confronting: WHY, WHO, HOW, and WHAT (Adzic, G., Impact Mapping, 2012).

“Goals, Actors, Impact, and Deliverables” in Impact Mapping: Making a big impact with software products and projects (Adzic, G., 2012).
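The WHY → WHO → HOW → WHAT hierarchy can be captured as a nested structure, which makes it easy to trace every deliverable back to the goal. The goal, actors, impacts, and deliverables below are invented for illustration.

```python
# Sketch of an impact map as a nested structure: WHY (goal) → WHO
# (actors) → HOW (behaviour changes) → WHAT (deliverables).
# All content is invented for illustration.

impact_map = {
    "why": "Grow monthly active users by 20%",          # the goal
    "who": {                                            # the actors
        "new visitors": {
            "sign up without abandoning the form": [    # HOW: impact
                "simplify the registration form",       # WHAT: deliverables
                "allow sign-in with existing accounts",
            ],
        },
        "returning users": {
            "invite colleagues to the product": [
                "add a shareable invite link",
            ],
        },
    },
}

def deliverables(imap: dict) -> list:
    """Flatten the map to its deliverables, each traceable back to the goal."""
    return [what
            for impacts in imap["who"].values()
            for whats in impacts.values()
            for what in whats]

print(deliverables(impact_map))
```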

Prioritisation Grids

We answer these questions by figuring out what the tradeoffs are between the product’s importance and its feasibility/viability (Natoli, J., Think first, 2015).

Prioritisation Grid is probably the simplest visualisation tool to help teams with facilitating investment discussions
IBM Enterprise Design Thinking, “Decide your next move by focusing on the intersection of importance and feasibility” in Prioritisation Grid

Furthermore, we can adapt the axes of these prioritisation grids to suit the discussion at hand (value to the business and time to market, number of customers impacted and speed of adoption, importance and urgency, etc.), as long as all the stakeholders involved agree on which criteria are most useful to the decision being discussed, and there is enough expertise and data available to the team doing the prioritisation exercise.

Use Cases Lists: Pugh Matrix

The UXI Matrix is a simple, flexible tool that extends the concept of the product backlog to include UX factors not normally tracked by agile teams. To create a UX Integration (UXI) Matrix, you add several UX-related data points to your user stories (Innes, J., Integrating UX into the product backlog, 2012).

Pugh Matrix helps us visualise the complete backlog and helps teams with facilitating investment discussions while quantifying and qualifying outcomes.
Help teams with facilitating investment discussions with Decision Matrices or “Pugh Matrix” in Integrating UX into the product backlog (Innes, J., 2012)

The UXI Matrix helps teams integrate UX best practices and user-centered design by inserting UX at every level of the agile process:

  • Groom the backlog: During release and sprint planning you can sort, group, and filter user stories in Excel.
  • Reduce design overhead: if a story shares several personas with another story in a multi-user system, then that story may be a duplicate. Grouping by themes can also help here.
  • Facilitate Collaboration: You can share it with remote team members. Listing assigned staff provides visibility into who’s doing what (see the columns under the heading Staffing). Then team members can figure out who’s working on related stories and check on what’s complete, especially if you create a hyperlink to the design or research materials right there in the matrix.
  • Track user involvement and other UX metrics: It makes it easier to convince the team to revisit previous designs when metrics show users cannot use a proposed design, or are unsatisfied with the current product or service. Furthermore, it can be useful to track satisfaction by user story (or story specific stats from multivariate testing) in a column right next to the story.
A Use Case List (also known as a Pugh matrix) is a great tool to help teams with facilitating investment discussions by bringing visibility to the size and the status of the backlog.
Click on the image to see an example of how to help teams with facilitating investment discussions with Use Case List: PUGH MATRIX

I’ve created Use Case Lists (or Pugh matrices: decision matrices that help evaluate and prioritise a list of options) while working with the Product Management and Software Architecture teams on both the AutoCAD Map 3D and AutoCAD Utility Design projects. We would first establish a list of weighted criteria, and then evaluate each use case against those criteria, taking into account input from the different stakeholders of the team (user experience, business value, etc.).
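The mechanics of such a weighted decision matrix come down to a weighted sum per use case. The criteria, weights, and scores below are invented for illustration; note that effort is inverted so that a higher score always means better.

```python
# Sketch of a weighted decision matrix: score each use case against
# weighted criteria and rank by the weighted sum. All numbers invented.

criteria = {"user value": 0.5, "business value": 0.3, "effort (inverted)": 0.2}

use_cases = {
    "bulk import":   {"user value": 4, "business value": 5, "effort (inverted)": 2},
    "offline mode":  {"user value": 5, "business value": 3, "effort (inverted)": 1},
    "export to PDF": {"user value": 3, "business value": 2, "effort (inverted)": 5},
}

def weighted_score(scores: dict) -> float:
    """Sum of criterion score times criterion weight."""
    return sum(weight * scores[name] for name, weight in criteria.items())

ranked = sorted(use_cases, key=lambda uc: weighted_score(use_cases[uc]), reverse=True)
for uc in ranked:
    print(uc, round(weighted_score(use_cases[uc]), 2))
```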

Using the Outcome-Driven Innovation framework above, you can prioritise the use cases based on their opportunity scores.
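For reference, Ulwick’s opportunity score is commonly stated as importance plus the shortfall in satisfaction, with both inputs rated on a 0-to-10 scale; the example ratings below are invented.

```python
# Opportunity score as commonly stated for Outcome-Driven Innovation:
# Opportunity = Importance + max(Importance - Satisfaction, 0).

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Higher scores mean important, underserved outcomes."""
    return importance + max(importance - satisfaction, 0)

print(opportunity_score(8, 3))  # 13: important and underserved
print(opportunity_score(8, 9))  # 8: important but already well served
```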

The Right Time for Facilitating Investment Discussions

You might be asking yourself, “This is all great, but when should I be doing what?” Without knowing what kind of team setup you have and what kinds of processes you run in your organization, the best I can do is map all of the techniques above to the Double Diamond framework.

The Double Diamond Framework

Design Council’s Double Diamond clearly conveys a design process to designers and non-designers alike. The two diamonds represent a process of exploring an issue more widely or deeply (divergent thinking) and then taking focused action (convergent thinking).  

  • Discover. The first diamond helps people understand, rather than simply assume, what the problem is. It involves speaking to and spending time with people who are affected by the issues.
  • Define. The insights gathered from the discovery phase can help you to define the challenge in a different way.
  • Develop. The second diamond encourages people to give different answers to the clearly defined problem, seeking inspiration from elsewhere and co-designing with a range of different people.
  • Deliver. Delivery involves testing out different solutions at small-scale, rejecting those that will not work and improving the ones that will.
Design Council’s framework for innovation also includes the key principles and design methods that designers and non-designers need to take, and the ideal working culture needed, to achieve significant and long-lasting positive change.
A clear, comprehensive and visual description of the design process in What is the framework for innovation? (Design Council, 2015)

Map of Facilitating Investment Discussions Activities and Methods

Process awareness characterises the degree to which participants are informed about the process procedures, rules, requirements, workflow, and other details. The higher the process awareness, the more profoundly the participants engage in the process, and the better the results they deliver.

In my experience, the biggest disconnect between the work designers need to do and the mindset of every other team member is usually about how quickly we tend, when not facilitated, to jump to solutions instead of contemplating and exploring the problem space a little longer.

Knowing when the team should be diverging, when they should be exploring, and when they should be closing will help ensure they get the best out of the power of their collective brainstorming and multiple perspectives, and will keep the team engaged.

Map of Quantifying and Qualifying Activities that can help with Facilitating Investment Discussions in the Double Diamond (Discover, Define, Develop and Deliver)

My colleagues Edmund Azigi and Patrick Ashamalla have created a great set of questions and a cheat sheet that maps which questions are most appropriate for the different phases of the product development lifecycle. The following set of activities is inspired by their cheat sheet.

Facilitating Investment Discussions during “Discover”

This phase has the highest level of ambiguity, so creating shared understanding is critical. While a degree of back and forth is expected, and it might be too early for facilitating investment discussions, you can still get to clarity faster by having a strong shared vision, good problem framing, and clear priorities defined upfront through outcomes.

Here are my recommendations for suggested quantifying and qualifying activities and methods:

Better problem framing is probably the very first step in helping teams with facilitating investment discussions.
Learn more about problem framing techniques that can help teams with facilitating investment discussions by creating clarity of what problems they are trying to solve in Problem Framing for Strategic Design (Photo by Ann H on Pexels.com)

Facilitating Investment Discussions during “Define”

In this phase we should see the level of ambiguity diminishing, and facilitating investment discussions has the highest payoff in mitigating back-and-forth. Helping the team make good decisions by creating great choices is critical. Here are my recommendations for suggested quantifying and qualifying activities and methods:

Facilitating investment discussions is probably the best way to help product teams make good decisions.
Learn more about how to help teams with facilitating investment discussions by making good decisions in Facilitating Good Decisions (Photo by Pixabay on Pexels.com)

Facilitating Investment Discussions during “Develop”

In this phase we are getting to a point where the cost of changing your mind increases rapidly as time passes. So the team should focus on learning as cheaply as possible (by capturing signals from the market), and discussions around investment should answer the question of whether we should pivot, persevere, or stop.

Here are my recommendations for suggested quantifying and qualifying activities and methods:

Facilitating Investment Discussions during “Deliver”

In this phase it is too late to facilitate investment discussions. The best you can do is collect data from real customer usage for visibility and traceability, and make hard choices about whether to pivot, persevere, or stop in the next iteration of the product.

Here are my recommendations for suggested quantifying and qualifying activities and methods:

Facilitating Quantifying and Qualifying Discussions

I’m of the opinion that designers, instead of complaining that everyone else is jumping too quickly into solutions, should facilitate the discussions and help others raise their awareness of the creative and problem-solving process.

I’ll argue for the need for facilitation in the sense that, if designers want to influence the decisions that shape strategy, they must step up to the plate and become skilled facilitators who respond, prod, encourage, guide, coach, and teach as they guide individuals and groups through effective processes to make the decisions that are critical in the business world.

That said, my opinion is that facilitation here does not mean only “facilitating workshops”, but facilitating the decisions, regardless of what kinds of activities are required.

Designers and strategists can quickly learn the skills to help teams with facilitating investment discussions.
Learn more about how to help teams with facilitating investment discussions by becoming a skilled facilitator (Photo by fauxels on Pexels.com)

Adzic, G. (2012). Impact Mapping: Making a big impact with software products and projects (M. Bisset, Ed.). Woking, England: Provoking Thoughts.

Almquist, E., Senior, J., & Bloch, N. (2016). The Elements of Value: Measuring—and delivering— what consumers really want. Harvard Business Review, (September 2016), 46–53.

Baron, E. (2010). The Book of Management: the ten essential skills for achieving high performance. London, UK: Dorling Kindersley.

Berkun, S. (2008). Making things happen: Mastering project management. Sebastopol, CA: O’Reilly Media.

Bland, D. J., & Osterwalder, A. (2020). Testing business ideas: A field guide for rapid experimentation. Standards Information Network.

Brown, T., & Katz, B. (2009). Change by design: how design thinking transforms organizations and inspires innovation. [New York]: Harper Business

Cagan, M. (2017). Inspired: How to create tech products customers love (2nd ed.). Nashville, TN: John Wiley & Sons.

DeGrandis, D. (2017). Making work visible: Exposing time theft to optimize workflow. Portland, OR: IT Revolution Press.

Design Council. (2015, March 17). What is the framework for innovation? Design Council’s evolved Double Diamond. Retrieved August 5, 2021, from designcouncil.org.uk website: https://www.designcouncil.org.uk/news-opinion/what-framework-innovation-design-councils-evolved-double-diamond

Garbugli, É. (2020). Solving Product: Reveal Gaps, Ignite Growth, and Accelerate Any Tech Product with Customer Research. Wroclaw, Poland: Amazon.

Gothelf, J. (2019, November 8). The hypothesis prioritization canvas. Retrieved April 25, 2021, from Jeffgothelf.com website: https://jeffgothelf.com/blog/the-hypothesis-prioritization-canvas/

Gothelf, J., & Seiden, J. (2017). Sense and respond: How successful organizations listen to customers and create new products continuously. Boston, MA: Harvard Business Review Press.

Govindarajan, V., & Trimble, C. (2010). The other side of innovation: Solving the execution challenge. Boston, MA: Harvard Business Review Press.

Hanington, B., & Martin, B. (2012). Universal methods of design: 100 Ways to research complex problems, develop innovative ideas, and design effective solutions. Beverly, MA: Rockport.

Harned, B. (2017). Project Management for Humans: Helping People Get Things Done (1st edition). Brooklyn, New York USA: Rosenfeld Media.

Innes, J. (2012, February 3). Integrating UX into the product backlog. Retrieved July 28, 2021, from Boxesandarrows.com website: https://boxesandarrows.com/integrating-ux-into-the-product-backlog/

Kalbach, J. (2020). Mapping Experiences: A Guide to Creating Value through Journeys, Blueprints, and Diagrams (2nd ed.). Sebastopol, CA: O’Reilly Media.

Kourdi, J. (2015). Business Strategy: A guide to effective decision-making. New York, NY: PublicAffairs

Kortum, P., & Acemyan, C. Z. (2013). How Low Can You Go? Is the System Usability Scale Range Restricted? Journal of Usability Studies, 9(1), 14–24. https://uxpajournal.org/wp-content/uploads/sites/7/pdf/JUS_Kortum_November_2013.pdf

Lafley, A. G., & Martin, R. L. (2013). Playing to Win: How Strategy Really Works. Boston, MA: Harvard Business Review Press.

Lewis, J. R., Utesch, B. S., & Maher, D. E. (2015). Measuring perceived usability: The SUS, UMUX-LITE, and AltUsability. International Journal of Human-Computer Interaction, 31(8), 496–505.

Lewrick, M., Link, P., & Leifer, L. (2018). The design thinking playbook: Mindful digital transformation of teams, products, services, businesses and ecosystems. Nashville, TN: John Wiley & Sons

Lombardo, C. T., McCarthy, B., Ryan, E., & Connors, M. (2017). Product Roadmaps Relaunched. Sebastopol, CA: O’Reilly Media.

Lund, A. M. (2001). Measuring usability with the USE questionnaire. Usability Interface, 8(2), 3-6 (www.stcsig.org/usability/newsletter/index.html).

Moorman, J. (2012). Leveraging the Kano Model for Optimal Results. UX Magazine. Retrieved February 11, 2021, from https://uxmag.com/articles/leveraging-the-kano-model-for-optimal-results

Oberholzer-Gee, F. (2021). Better, simpler strategy: A value-based guide to exceptional performance. Boston, MA: Harvard Business Review Press.

Oberholzer-Gee, F. (2021). Eliminate Strategic Overload. Harvard Business Review, (May-June 2021), 11.

Olsen, D. (2015). The lean product playbook: How to innovate with minimum viable products and rapid customer feedback (1st ed.). Nashville, TN: John Wiley & Sons.

Patton, J. (2014). User Story Mapping: Discover the whole story, build the right product (1st ed.). Sebastopol, CA: O’Reilly Media.

Pichler, R. (2016). Strategize: Product strategy and product roadmap practices for the digital age. Pichler Consulting.

Polaine, A., Løvlie, L., & Reason, B. (2013). Service design: From insight to implementation. Rosenfeld Media.

Rubin, J., & Chisnell, D. (2011). Handbook of usability testing: How to plan, design, and conduct effective tests (2nd ed.). Chichester, England: John Wiley & Sons.

Sauro, J., & Lewis, J. R. (2016). Quantifying the user experience: Practical statistics for user research (2nd Edition). Oxford, England: Morgan Kaufmann.

Sharon, T. (2016). Validating Product Ideas (1st Edition). Brooklyn, New York: Rosenfeld Media.

Torres, T. (2021). Continuous Discovery Habits: Discover Products that Create Customer Value and Business Value. Product Talk LLC.

Tullis, T., & Albert, W. (2013). Measuring the user experience: Collecting, analyzing, and presenting usability metrics (2nd edition). Morgan Kaufmann.

Ulwick, A. (2005). What customers want: Using outcome-driven innovation to create breakthrough products and services. Montigny-le-Bretonneux, France: McGraw-Hill.

Van Der Pijl, P., Lokitz, J., & Solomon, L. K. (2016). Design a better business: New tools, skills, and mindset for strategy and innovation. Nashville, TN: John Wiley & Sons.

By Itamar Medeiros

Originally from Brazil, Itamar Medeiros currently lives in Germany, where he works as Director of Design Strategy at SAP.

Working in the Information Technology industry since 1998, Itamar has helped truly global companies in several countries (Argentina, Brazil, China, Czech Republic, Germany, India, Mexico, The Netherlands, Poland, The United Arab Emirates, United States, Hong Kong) create great user experience through advocating Design and Innovation principles.

During his 7 years in China, he promoted the User Experience Design discipline as User Experience Manager at Autodesk and Local Coordinator of the Interaction Design Association (IxDA) in Shanghai.
