Images of Zora's abuse were originally only available on the so-called dark web, but she now has to live with the reality that links are being openly promoted on X.
Social media companies are trying to rid their platforms of illegal material, but the scale of the problem is enormous.
Last year the US National Center for Missing and Exploited Children (NCMEC) received more than 20 million mandatory reports from tech companies about incidents of child sexual abuse material (CSAM) - illegal images and videos - on their platforms.
NCMEC attempts to identify victims and perpetrators, then passes its findings to law enforcement.
We approached "hacktivist" group Anonymous, whose members are trying to combat the trade in child abuse images on X. One of them told us the situation was as bad as ever.
They tipped us off about a single account on X. It used a photo of the head and shoulders of a real child as its avatar. There was nothing obscene about it.
But the words and emojis in the account's bio made it clear the owner was selling child sexual abuse material, and there was a link to an account on the messaging app Telegram.
