Three Lessons from Six Years of Experience in Design Thinking, with a Touch of AI
Insights on the practical application of Design Thinking principles from Raluca Piteiu and Claudia Mascas, Product and Design Consultants at Accesa.
Over the past six years, we have applied Design Thinking across various contexts—from greenfield projects starting from scratch to modernising existing solutions and fostering innovation. We have faced challenges and sceptical looks, but also moments where colleagues and clients said, “Aha, now I understand why you did that.”
Perhaps the most important thing we’ve demonstrated by applying these principles and tools is the immense value of testing something imperfect with real users early to quickly uncover what doesn’t work—the agile principle of failing fast. With the democratisation of AI tools, this has become significantly faster and simpler, enabling quicker idea validation.
This article does not aim to define the Design Thinking process but rather to share three key lessons learned from applying it while continuously striving to persuade stakeholders, colleagues, and clients.
Design Thinking: Theory vs. Practice
Design Thinking is often depicted as a five-step process leading to user-centered solutions. Simple, right? However, this perfect image can lead to anxiety, especially when comparing our complex realities with the ideal examples presented in courses or articles.
Over time, we observed colleagues overly focused on following the process by the book—spending too much time crafting perfect personas or finding the ideal template—without considering whether these steps were necessary for their specific situation. In reality, the process is messy and full of challenges. To overcome such blocks, it’s essential to accept that Design Thinking is a tool to achieve a goal, not a goal in itself. The objectives tied to this method, such as user-centricity and innovation, are undoubtedly valuable but vague. What we truly aim for is to minimise risks when developing a new product or feature.
Investments are wasted if the product doesn’t deliver value to users, fails to be adopted, frustrates users, or cannot be implemented with available resources or justified from a business perspective. That’s why we must select the methods that deliver maximum results with minimal effort, adhering to the agile principle of failing fast.
How Do We Do This?
By collecting data from users about proposed solutions to see if they truly address problems and meet user needs. Validation must consider not just user-centricity but also technical feasibility and business objectives.
“But I Don’t Have Access to Users”
Not everyone works in an ideal environment where user access is easy. In regulated, sales-driven, or outsourcing settings, reaching actual users can be challenging or impossible.
The solution lies in creativity and a guerrilla approach. Here are a few tactics we’ve used:
Analyse Job Listings: When direct access to users was unavailable, we reviewed job ads for similar roles to understand their skills, responsibilities, and challenges.
Consume Related Content: We searched for interviews or articles featuring people in roles similar to our users, revealing authentic perspectives and habits.
Test It Yourself: Adopt the “eat your own dog food” approach: use the product as the end user would, in realistic scenarios and with real data. This provides firsthand insight into shortcomings and areas for improvement.
Leverage Networks: Use LinkedIn for surveys, or client newsletters to recruit interviewees and collect feedback.
Friends and Family Testing: Ask friends to help with testing, but only if they fit the target geography and segment; otherwise, their feedback may misguide the solution.
User-Centricity through Secondary Research
“Any research is better than no research” goes the saying, but this isn’t entirely true. To avoid merely confirming biases, set clear research objectives:
What is the goal?
What assumptions need validation?
What questions need answering?
Who is the target audience?
Answering these will guide actions, ensuring even minimal research is meaningful. Whether it’s usability tests, market trend analysis, or competitor reviews, every step moves you closer to understanding your users.
Example: While developing a workspace booking solution, we analysed competitor reviews on platforms like the App Store. In just a couple of hours, we identified pain points, missing features, and the roles that frequently used the app.
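This kind of review mining can even be partially scripted. As a minimal sketch (the review snippets and pain-point themes below are invented for illustration, not taken from our project), a few lines of Python can count how often each theme appears across competitor reviews:

```python
from collections import Counter

# Illustrative review snippets -- in practice these would be exported
# from a competitor's App Store or Google Play listing.
reviews = [
    "Can't cancel a desk booking once it's made, very frustrating",
    "Love the map view, but notifications never arrive",
    "As a team lead I need to book desks for my whole team",
    "Booking fails whenever I'm on office wifi",
    "No way to cancel or edit a reservation from the app",
]

# Hypothetical pain-point themes and the keywords that signal them.
themes = {
    "cancellation": ["cancel", "edit a reservation"],
    "notifications": ["notification"],
    "team booking": ["team"],
    "reliability": ["fails", "crash"],
}

counts = Counter()
for review in reviews:
    text = review.lower()
    for theme, keywords in themes.items():
        if any(kw in text for kw in keywords):
            counts[theme] += 1

# Most frequently mentioned themes first -- a rough priority signal.
for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```

The output is only a rough signal; you would still read the underlying reviews before drawing conclusions, but the counts help decide where to look first.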
Start Small
If you don’t have full management support for applying Design Thinking, prove its value on a small scale. The following examples show how we did this.
Identifying Opportunities for Impact Through Small Functionalities
Let’s explore an example from developing a workplace desk reservation solution and how, through a few small steps, we streamlined collaboration with internal stakeholders. We focused on a minor but impactful functionality that had received negative feedback in a user survey. By conducting quick, low-effort research—engaging directly with users as they arrived at the office—we addressed the issue with a simple prototype.
Testing this remotely using tools like Useberry enabled us to collect direct feedback and measure the impact within just four days. We synthesized the findings into a report, helping to convince stakeholders and prioritise solutions effectively.
Applying the Method in a Learning Context Using a Real Case
Another way we applied Design Thinking principles was by running a short experiment as part of a learning session. The results demonstrated the method's value and helped us gain stakeholder trust. This low-risk activity was not part of the main project but an internal workshop run for practice and learning.
With limited time, we selected a specific product functionality, built a simple prototype in just five hours, and paired it with a video outlining the solution. By emailing potential users, we quickly received valuable feedback to guide further development. This approach turned a learning session into an opportunity to solve a real product issue.

Once you achieve results, remember to quantify them. Quantification is key to building a strong case for the method. Ask yourself: how much time was saved by addressing this problem during the prototyping phase versus fully implementing the functionality?
You’re Not Alone – Find Allies and Ambassadors
Working alone can be exhausting, especially in complex environments. Seek colleagues with diverse skills (UX, product, tech) to form partnerships. Collaborative efforts bring varied perspectives, enabling mutual feedback and building on each other's ideas. With allies in your team, you can collectively demonstrate the method's value.
Do your teammates truly understand your users? Start with a simple persona and journey map exercise to ensure everyone aligns with the users’ needs. This foundation enables the team to brainstorm solutions effectively during future sessions. Leverage recurring team meetings for such exercises, as they are easier to schedule. Choose activities based on a specific goal or challenge you’ve identified.
Understand Your Audience and Speak Their Language
If you’ve made it this far, you agree that understanding users is critical for creating solutions that address real needs. To find ambassadors and allies, you’ll need to apply the same principle to your stakeholders. Once they grasp the value, they can unlock opportunities and amplify the method's successes within the organisation. Success hinges on understanding what matters to them. For business stakeholders, arguments around user-centricity and validation might not resonate as much as concrete figures. Address these key questions:
How much money will we save by applying this method? (If financial data isn’t available, use effort or time instead)
What results have been achieved?
How confident are we in these results?
Prepare data to demonstrate how user-centred, quick methods yield concrete business benefits. Learn the objectives of your company, project, and product, and focus on areas of high uncertainty or impact. Stakeholders are less likely to be impressed by validating a login flow already established as an industry best practice.
Instead of saying: “We tested with X users and concluded that we need a different solution for this problem.”
Try saying: “We saved X € by avoiding the development of a suboptimal solution. We invested Y time in this exercise, resulting in a Z benefit.”
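A back-of-the-envelope calculation is often all it takes to fill in those X, Y, and Z values. Every figure in the sketch below is a hypothetical placeholder (day rate, effort estimates, team size); substitute your own project's numbers:

```python
# Back-of-the-envelope case for early validation. All figures are
# hypothetical placeholders -- substitute your own team's numbers.

DAY_RATE_EUR = 500          # blended daily rate per person
BUILD_DAYS = 20             # estimated effort to fully implement the feature
RESEARCH_DAYS = 4           # prototype + remote test, as in the example above
RESEARCHERS = 2             # people involved in the validation exercise

cost_of_building_wrong_thing = BUILD_DAYS * DAY_RATE_EUR
cost_of_validation = RESEARCH_DAYS * RESEARCHERS * DAY_RATE_EUR
net_saving = cost_of_building_wrong_thing - cost_of_validation

print(f"Avoided build cost: {cost_of_building_wrong_thing} EUR")
print(f"Validation cost:    {cost_of_validation} EUR")
print(f"Net saving:         {net_saving} EUR")
```

Even when the inputs are rough estimates, stating them explicitly makes the argument auditable: a stakeholder can challenge your day rate or effort estimate, which is a far more productive conversation than debating whether research is worthwhile in the abstract.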
GenAI as a Process Accelerator
Generative AI accelerates data collection and analysis, offering a starting point without requiring significant time investment. Results can be obtained quickly, but critical review and refinement using your expertise remain essential. Data security and ethical considerations should also guide the use of AI tools.
Here are some examples from our recent experience where AI significantly sped up processes:
Prompting: A Critical Skill
To achieve relevant results with AI, crafting effective prompts is essential. While there are numerous templates online, consider including the following elements:
Specify the role AI should play (e.g., Product Manager, UX Researcher).
Clearly state the task.
For multi-step instructions, ask AI to outline its approach before delivering results.
Provide examples.
Give clear formatting instructions for the output.
Ask AI to clarify any questions it has before proceeding.
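The checklist above can be packaged as a reusable template so that no element is forgotten. The sketch below assumes nothing about any particular AI provider; the field values are illustrative, and the resulting string can be pasted into any chat tool:

```python
# Reusable prompt template covering the elements above: role, task,
# plan-before-answer, an example, output format, and clarifying questions.
# The filled-in values are illustrative placeholders.

PROMPT_TEMPLATE = """\
You are a {role}.

Task: {task}

Before answering, briefly outline the steps you will take, then carry them out.

Example of the kind of output I expect:
{example}

Format your final answer as {output_format}.

If anything about the task is unclear, ask your clarifying questions first
instead of guessing.
"""

prompt = PROMPT_TEMPLATE.format(
    role="UX Researcher",
    task=("Cluster the attached workspace-booking survey responses "
          "into recurring pain points"),
    example="- Pain point: unable to cancel a booking (12 mentions)",
    output_format="a bulleted list sorted by number of mentions",
)

print(prompt)
```

Keeping the template in one place also makes it easy for the whole team to iterate on prompt wording together, rather than each person rediscovering what works.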
What Not to Use GenAI For
AI is best for automating repetitive or time-consuming tasks, particularly in research and ideation phases, but it should not replace human decision-making or interpretation.
Conclusion
Applying Design Thinking with a "just do it" mindset encourages action and tangible results without waiting for perfection. The goal remains to create user-focused, relevant solutions while minimising risks and aligning with business needs.
AI tools provide shortcuts for prototyping, testing, feedback collection, and data analysis at an unprecedented scale, saving both time and resources. However, critical thinking and ethical considerations are essential to harness AI's potential effectively.