What Didn't Work
(And What I Learned)
The most valuable design lessons come from failure. These are the experiments that failed, the hypotheses that were disproven, and the design decisions I would change if I could go back.
Why Document Failures?
In financial design, the cost of failure is measured in real money lost, user trust eroded, and regulatory violations incurred. I document these failures not as confessions, but as evidence of iterative learning — the only reliable path to good design.
V1: Maximum Transparency
Hypothesis: More data = more trust. If users see all performance metrics, they'll feel confident making decisions.
What I Built:
- 12+ metrics visible per provider card
- Full monthly returns table on browse page
- Sharpe ratio, drawdown, win rate all exposed
- Assumption: "Power users want density"
What Changed:
- Visual risk gauge (color-coded: green/amber/red)
- Strategy style labels ("Conservative", "Aggressive Scalper")
- "Show Details" progressive disclosure
- Metrics hierarchy: Right things at right time
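The V2 risk gauge described above can be sketched as a simple mapping from a normalized risk score to the color bands. This is a minimal illustration, assuming a 0–100 score; the function name and thresholds are hypothetical, not the production values.

```python
def risk_band(score: float) -> str:
    """Map a normalized 0-100 risk score to a gauge color.

    Thresholds are illustrative only, not the values used in production.
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score < 35:
        return "green"   # Conservative
    if score < 70:
        return "amber"   # Moderate
    return "red"         # Aggressive
```

Collapsing twelve metrics into one band like this is what enables the "quick decision" path; the full metrics remain available behind "Show Details".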
💡 Key Learning
Trust comes from showing the right things at the right hierarchy, not everything at once.
V2's layered transparency approach routed 73% of users down the "quick decision" path (visual risk gauge only)
and 27% down the "deep analysis" path (expanded metrics). This matched actual user behavior patterns better than
forcing everyone through the same information-dense interface.
V1: Gamifying Risk
Hypothesis: Badges and leaderboards will increase engagement without encouraging reckless behavior.
What I Built:
- "Top Trader" leaderboard (most profitable this week)
- Achievement badges for high-risk strategies
- Streak counters ("5 winning days in a row!")
- Assumption: "Social proof drives trust"
What Changed:
- Leaderboards now penalize high drawdown
- Badges for "consistent performance", not "highest return"
- Mandatory risk disclosure before copying
- ASIC compliance: No inducements to trade
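One way to implement the drawdown penalty above is a Calmar-style score: return divided by worst drawdown, so a steady strategy outranks a volatile one with a higher raw return. A minimal sketch; the function names, field names, and figures are hypothetical, not the platform's actual formula.

```python
def risk_adjusted_score(total_return: float, max_drawdown: float) -> float:
    """Calmar-style score: return divided by maximum drawdown.

    Penalizes high-drawdown strategies so the leaderboard rewards
    consistency rather than raw profit. Inputs are fractions (0.12 = 12%).
    """
    if max_drawdown <= 0:
        raise ValueError("max_drawdown must be positive")
    return total_return / max_drawdown

def rank_providers(providers: list[dict]) -> list[dict]:
    """Sort strategy providers by risk-adjusted score, best first."""
    return sorted(
        providers,
        key=lambda p: risk_adjusted_score(p["return"], p["drawdown"]),
        reverse=True,
    )

# A 15% return with a 5% drawdown (score 3.0) outranks a 30% return
# with a 25% drawdown (score 1.2).
ranked = rank_providers([
    {"name": "aggressive", "return": 0.30, "drawdown": 0.25},
    {"name": "steady", "return": 0.15, "drawdown": 0.05},
])
```

Under this metric, the "win a badge by chasing risk" incentive disappears: inflating returns by taking on drawdown lowers the score.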
💡 Key Learning
Gamification in financial products requires ruthless ethical discipline.
What works in fitness apps (streaks, badges, leaderboards) can be dangerous in trading platforms.
The V1 design inadvertently encouraged users to chase high-risk strategies to "win" badges.
Legal flagged this as potentially violating ASIC's inducement prohibitions. V2 shifted to
"risk-adjusted performance" metrics and explicit warnings.
V1: Institutional Design for Retail Users
Hypothesis: If we design like Bloomberg Terminal, users will feel like professionals and trust the platform more.
What I Built:
- 8 concurrent widgets on Finlogix dashboard (inspired by Bloomberg Terminal)
- Dense typography (12px font size for data tables)
- Advanced technical indicators exposed by default
- Assumption: "More data = more professional = more trust"
What Changed:
- Start with 2-3 essential widgets (Account Summary, Watchlist, Quick Trade)
- Progressive disclosure: "Add More Tools" panel for advanced features
- User segmentation: Novice vs Intermediate vs Expert modes
- 14px minimum font size for accessibility (WCAG 2.1 AA)
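The segmentation and progressive-disclosure changes above can be sketched as a default widget set per user mode, with extras added only through the "Add More Tools" panel. The segment names echo the bullets; the widget identifiers and helper are hypothetical.

```python
# Default widget sets per user segment (identifiers are illustrative).
DEFAULT_WIDGETS = {
    "novice": ["account_summary", "watchlist", "quick_trade"],
    "intermediate": ["account_summary", "watchlist", "quick_trade",
                     "open_positions", "news_feed"],
    "expert": ["account_summary", "watchlist", "quick_trade",
               "open_positions", "news_feed", "depth_of_market",
               "technical_charts", "economic_calendar"],
}

def dashboard_widgets(segment, added=()):
    """Return the dashboard layout for a segment.

    Starts from the segment's defaults; advanced widgets appear only
    when the user explicitly adds them (progressive disclosure).
    """
    base = list(DEFAULT_WIDGETS.get(segment, DEFAULT_WIDGETS["novice"]))
    extras = [w for w in added if w not in base]
    return base + extras
```

A novice sees only the 2-3 essentials by default, yet can "graduate" to expert tooling one widget at a time, which mirrors the learning curve the research uncovered.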
💡 Key Learning
Institutional design patterns don't translate directly to retail — context matters.
Bloomberg Terminal users are professional traders who spend 8+ hours/day in the platform and receive formal training.
Finlogix users are part-time retail traders managing personal portfolios. User research showed novice traders
exhibited 3X higher error rates with V1's dense layout (accidentally clicking wrong instruments, misreading P&L).
V2's progressive disclosure approach let users "graduate" to complexity as they gained expertise — matching the
actual learning curve instead of assuming all users are experts on day 1.
This lesson directly applies to Private Banking: UHNW clients expect institutional-grade
tools, but wrapped in luxury UX that doesn't require 40 hours of training. The balance is: power when needed,
simplicity by default.
V1: Design First, Legal Later
Hypothesis: Design the best UX, then adjust for compliance — Legal can work around good design.
What Happened:
- Designed ACYVerse "Recommended Providers" feature based on performance rankings
- Showed 4 weeks of design iterations to stakeholders
- Legal review flagged potential ASIC violation (RG146 inducement prohibition)
- Result: 4 weeks of design work discarded and a real hit to team morale
New Process:
- Legal stakeholder invited to Day 1 kickoff — discuss regulatory constraints upfront
- Low-fidelity wireframes reviewed with Legal before high-fidelity investment
- Pivoted to "Risk-Adjusted Performance" ranking with disclaimers (Legal-approved)
- No design-related violations over 2+ years under ASIC/SEC/FINRA scrutiny
💡 Key Learning
In regulated environments, Legal is not a "blocker" — they're a design constraint that should inform architecture from day 1.
The V1 failure taught me that designers in financial services can't treat compliance as an afterthought.
Legal teams think in risk mitigation (what could go wrong?), while designers think in opportunity (what could delight users?).
These perspectives conflict unless aligned early. V2's process — inviting Legal to wireframe reviews, building dedicated
QA environments for compliance testing — turned Legal from a "veto gate" into a collaborative partner. This approach
resulted in no design-related compliance violations across 40+ jurisdictions for 2+ years.
This mindset is critical for Private Banking, where SEC/FINRA regulations are even stricter than
ASIC. A single violation can cost millions in fines and destroy client trust.
Want to discuss these learnings?
These failures taught me more than most successes. If you're building financial products and want to discuss design trade-offs, regulatory constraints, or user research approaches, I'd be happy to share more detailed insights.