"If they don't start to seriously change the way they operate their services, then I think those demands for things like bans for children on social media are going to get more and more vigorous," she said.
"I'm asking the industry now to get moving, and if they don't they will be hearing from us with enforcement action from March."
Under Ofcom's codes, platforms will need to identify if, where and how illegal content might appear on their services, and set out how they will stop it reaching users.
According to the OSA, this includes content relating to child sexual abuse material (CSAM), controlling or coercive behaviour, extreme sexual violence, promoting or facilitating suicide and self-harm.
But critics say the Act fails to tackle a wide range of harms for children.
The Molly Rose Foundation - set up in memory of teenager Molly Russell, who took her own life in 2017 after being exposed to self-harm images on social media - said the OSA has "deep structural issues".
Andy Burrows, its chief executive, said the organisation was "astonished and disappointed" by a lack of specific, targeted measures for platforms on dealing with suicide and self-harm material in Ofcom's guidance.
"Robust regulation remains the best way to tackle illegal content, but it simply isn't acceptable for the regulator to take a gradualist approach to immediate threats to life," he said.
Children's charity the NSPCC has also voiced concerns.
"We are deeply concerned that some of the largest services will not be required to take down the most egregious forms of illegal content, including child sexual abuse material," said acting chief Maria Neophytou.
"Today's proposals will at best lock in the inertia to act, and at worst create a loophole which means services can evade tackling abuse in private messaging without fear of enforcement."
The OSA became law in October 2023, following years of wrangling by politicians over its detail and scope, and campaigning by people concerned over the impact of social media on young people.
Ofcom began consulting on its illegal content codes that November, and says it has now "strengthened" its guidance for tech firms in several areas.