What we learned from watching Planda Portal in the wild
Hi, Alice here, Head of Product at Planda Portal.
We’ve now been live for almost three weeks. I wanted to share what we’re seeing.
For context: we’re a self-funded proptech team. We write code and learn fast. This is our most exciting release to date, and we’ve worked on it for over a year!
Since launch, we’ve focused on three things:
Learning how people actually use the product: every click, every drop-off, every ignored button.
Talking to our customers through Intercom. No AI this time! The “Alice” replying is me (with help from Brandon at times).
Speaking directly to local authorities. This is a new system. We need councils to know who we are, and we need their feedback to make it better.
The result: we’re no longer guessing. We’re observing real behaviour inside the planning submission process.
And because Planda Portal sits inside that process, behaviour matters. It affects applications, LPAs, agents and outcomes.
Here’s what we’ve learned so far.
Application types cause friction
Many users are unsure which application type to select. For now, we’re helping manually over chat, and we’re using those conversations to design a guided selection feature. Until that’s live, please keep asking us!
Location plans are often invalid
Architects’ plans are frequently uploaded, but they don’t always meet LPA requirements. We allow custom uploads, which replace our default plan, and that creates risk. We’ll introduce automated checks for user-uploaded location plans, similar to the other document validations we already run. This is planned for toward the end of March.
Historic buildings need earlier signals
Some properties require additional documents. We’re analysing council-level patterns and will flag likely requirements upfront as part of our local validation breakdown.
Forms don’t always match proposals
Invalid forms often stem from inconsistent answers. We’ll run controlled experiments to catch these earlier, without over-correcting or creating friction. Once the experiments conclude, I’ll share the results.
Not all suggested corrections are being acted on
This is the hardest one. We don’t want to force compliance, but we also don’t want preventable invalids reaching LPAs. We’re exploring a tiered validation model and will A/B test before releasing anything permanent.
CIL (Community Infrastructure Levy) requirements vary
Some councils require it. Some don’t. We’re grouping councils and introducing a local CIL validation check within the next month.
We’re sharing this because planning doesn’t improve by accident. It improves through visibility, feedback and iteration.
What’s next
Deeper conversations with councils to align on validation requirements and explain our AI checks clearly.
Expanding supported application types.
Refining the validation layers above.
Preparing for the emerging data standard under MHCLG’s initiative.
Join us next
We're hosting a webinar on Tuesday, 17 March at 12pm, where I'll be walking through everything above in more detail and opening up for questions. Whether you're an LPA, an agent, or just curious about where we're heading, we'd love to have you.
Register here
And thank you, whether you're using the platform, challenging it, or simply reading this!