Trading 212 reviews split sharply by use case
926 Trading 212 App Store reviews split sharply by use case. Beginners are happy, active traders aren't, and 13 customers want one specific UI revert.
We pulled 926 of Trading 212's App Store reviews into a dashboard at app.sunbeam.cx/public/yc4qalrg. The headline number is innocuous. Overall NPS sits around +19. But the dataset has a shape worth looking at.
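An NPS-style score like that +19 can be derived from star ratings alone. A minimal sketch, assuming the common mapping of 5 stars to promoter, 4 to passive, and 1-3 to detractor (the dashboard's actual scoring rule is not documented here, and the sample distribution below is illustrative, not Trading 212's real one):

```python
def nps(ratings):
    """Net Promoter-style score from 1-5 star ratings.

    Assumes: 5 stars = promoter, 4 = passive, 1-3 = detractor.
    Returns promoters minus detractors as a percentage of all ratings.
    """
    promoters = sum(1 for r in ratings if r == 5)
    detractors = sum(1 for r in ratings if r <= 3)
    return round(100 * (promoters - detractors) / len(ratings))

# Illustrative 926-review distribution that lands near +19.
sample = [5] * 490 + [4] * 122 + [3] * 100 + [2] * 100 + [1] * 114
print(nps(sample))  # → 19
```

The point of the sketch is that the single number compresses away exactly the cohort structure the rest of this post is about.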
Beginners and casual investors are happy. Active traders are not. The same app gets two very different reviews depending on who's writing.
Reading reviews of a product with multiple user types
Most apps serve more than one job. Trading 212 covers everything from a first-time ISA opener to a leveraged CFD trader. Reading the reviews together blurs the two stories. Pull them apart and the picture is much sharper, and much more actionable.
What the beginners say
Among reviews that talk about the basics, the experience holds up well.
Learning Tools (practice mode, in-app guidance, beginner education) draws 65 reviews, 58 of them positive. Cash ISAs likewise get a strongly positive cluster. The single most frequent topic in the entire dataset is "overall ease of use and user-friendliness," with 416 reviews. Most of those are positive.
There is a recurring theme of beginner appreciation:
I've been using this for quite a while now, and it's been a good experience, highly recommended for anyone who's interested in investing, try it you'll not regret it.
If you're shipping a beginner-investor product, this looks great.
What the active traders say
The picture flips when you look at the surfaces active traders use.
Order Controls (the surfaces traders use to set stop losses, manage positions on contract expiries, and place orders during volatility) has 27 reviews in the dataset and a sentiment score of -85. Twenty-five of those reviews are negative, zero positive. CFD Trading sits at -72. The Positions view, which active traders use constantly, is at -64.
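One common way to compute a net sentiment score like these is positive minus negative as a share of all reviews on the topic. This is a guess at the formulation, not the dashboard's documented formula (for Order Controls' counts it yields -93 rather than -85, so the real scoring evidently weights mixed or neutral reviews differently):

```python
def net_sentiment(positive, negative, total):
    """Net sentiment on a -100..+100 scale: (pos - neg) / total.

    An assumed formulation; mixed/neutral reviews count in the
    denominator but neither numerator term.
    """
    return round(100 * (positive - negative) / total)

# Order Controls under this formula: 0 positive, 25 negative, 27 total.
print(net_sentiment(0, 25, 27))  # → -93
```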
The complaints converge on two threads. First, recent UI changes have added friction to trade execution, with extra confirmation windows, slower navigation, screens going grey while closing positions. Second, the new position display shows current market prices instead of entry/cost prices, making it harder to see real gains and losses at a glance.
What 13 customers asked for
In the dataset, there's a cluster of reviews where customers ask for one specific thing: revert the position display to show entry/cost price, not current market price. The suggestion-extraction picks up 13 reviews making this ask. The parent topic, "inability to see entry or purchase price on position summary," has 9 reviews on its own; the suggestion pulls in adjacent mentions where the same ask appears.
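Pulling a cluster like this out of raw review text can be approximated with a crude keyword pass. A real suggestion-extraction pipeline almost certainly uses a classifier or LLM rather than a regex; this filter is only an illustration of the idea, and the pattern below is an assumption:

```python
import re

# Matches "entry price", "cost price", "purchase price", or "revert".
# A deliberately naive stand-in for real suggestion extraction.
REVERT_ASK = re.compile(r"(entry|cost|purchase)\s+price|revert", re.IGNORECASE)

def matches_revert_ask(review_text):
    """True if the review plausibly asks for the position-display revert."""
    return bool(REVERT_ASK.search(review_text))

print(matches_revert_ask("Please revert back to the old way"))  # → True
print(matches_revert_ask("Great app, love it"))                 # → False
```

A keyword pass like this over-matches (any "revert" counts), which is why the extracted 13 and the parent topic's 9 differ: adjacent mentions get pulled in.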
Some examples:
What kind of design decision led to updating the position page so that it now displays all your positions with only the current price, but no indication of your entry price unless you click into the specific merchandise?
Please revert back to the old way of showing CFD positions. It now has the live price of the asset on each position instead of the entry/cost price like it used to.
They removed option to aggregate positions and now you can't even collapse an instrument's open positions. So you could end up with a long list of 10-20-50+ open positions that you have to scroll, set TP/close individually.
These are not generic complaints. They are specific requests for a specific UI revert. A team triaging this dataset has a clear, low-effort lane: ship a setting that toggles between entry-cost and current-market view, and a way to collapse positions by instrument. Two changes that would address a substantial cluster of dissatisfaction.
What this means for product teams reading their own reviews
If your product serves multiple cohorts, the aggregate review score is misleading. A +19 NPS that hides a +47 from beginners and a -85 from a power-user surface is not really +19 to anyone. It is two different stories being averaged together.
The action is in the segmentation. Splitting reviews by which surface they talk about turns "our reviews are mixed" into "this surface is fine, this surface is broken, here is what customers asked for on each." Different surfaces, different owners, different fixes.
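The mechanics of that split are simple once reviews are tagged by surface. A minimal sketch, using a naive keyword map for tagging (a production pipeline would classify with a model; the surface names and keywords here are illustrative assumptions):

```python
from collections import defaultdict

# Hypothetical surface → keyword map; real tagging would use a classifier.
SURFACES = {
    "order controls": ["stop loss", "close position", "order"],
    "learning tools": ["practice", "tutorial", "beginner"],
}

def segment(reviews):
    """Group star ratings by the surface each review mentions.

    reviews: iterable of (text, rating) pairs.
    Returns {surface: [ratings]}; a review can land in several buckets.
    """
    buckets = defaultdict(list)
    for text, rating in reviews:
        low = text.lower()
        for surface, keywords in SURFACES.items():
            if any(k in low for k in keywords):
                buckets[surface].append(rating)
    return buckets

reviews = [
    ("The stop loss screen is broken", 1),
    ("Great tutorial for beginners", 5),
]
print(dict(segment(reviews)))
```

Scoring each bucket separately (with whatever NPS or sentiment formula you use) is what turns one averaged number into per-surface stories with per-surface owners.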
Most companies don't do this because reading 926 reviews and sorting them by surface is roughly a day of work. The dashboard at app.sunbeam.cx/public/yc4qalrg was built in a couple of hours from public reviews. We didn't need access to Trading 212's data.
Run this on your own app
If your product has multiple distinct user segments, the same analysis takes minutes to set up against your own App Store reviews. Try it at sunbeam.cx/try.