Whistleblower Frances Haugen has told MPs that Facebook is "unquestionably making hate worse" as they consider what new rules to impose on big social networks, the BBC reports.
Haugen was speaking to the Online Safety Bill committee in London.
She said Facebook safety teams were under-resourced, and "Facebook has been unwilling to accept even little slivers of profit being sacrificed for safety".
And she warned that Instagram was "more dangerous than other forms of social media".
While other social networks were about performance, play, or an exchange of ideas, "Instagram is about social comparison and about bodies... about people's lifestyles, and that's what ends up being worse for kids", she told a joint committee of MPs and Lords.
She said Facebook's own research described one problem as "an addict's narrative" - where children are unhappy, can't control their use of the app, but feel like they cannot stop using it.
"I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old, and I sincerely doubt that it is possible to make it safe for a 10-year-old," she said.
The committee is fine-tuning a proposed law that will place new duties on large social networks and subject them to checks by the media regulator Ofcom.
Asked if the law was "keeping Mark Zuckerberg awake at night", Haugen said she was "incredibly proud of the UK for taking such a world-leading stance".
"The UK has a tradition of leading policy in ways that are followed around the world.
"I can't imagine Mark isn't paying attention to what you're doing."
Her evidence came as several news outlets published fresh stories based on the thousands of leaked documents Haugen took with her when she left Facebook.
Facebook has characterised previous reporting as misleading, and at one point referred to the leaked documents as "stolen".
"Contrary to what was discussed at the hearing, we've always had the commercial incentive to remove harmful content from our sites," a spokesperson said, after Haugen finished giving evidence.
"People don't want to see it when they use our apps, and advertisers don't want their ads next to it. That's why we've invested $13bn (£9.4bn) and hired 40,000 people to do one job: keep people safe on our apps. "
The company said that over the last three quarters it had halved the amount of hate speech seen on Facebook, which it said now accounted for only 0.05% of all content viewed.
"While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren't making these decisions on our own," the spokesperson said.
"The UK is one of the countries leading the way and we're pleased the Online Safety Bill is moving forward."