Cooper said AI was "industrialising the scale" of sexual abuse against children, and warned that government measures "may have to go further."
Other measures set to be introduced include making it an offence to run websites where paedophiles can share child sexual abuse content or provide advice on how to groom children. That offence would be punishable by up to 10 years in prison.
And the Border Force will be given powers to instruct individuals they suspect of posing a sexual risk to children to unlock their digital devices for inspection when they attempt to enter the UK, as child sexual abuse material (CSAM) is often filmed abroad. Depending on the severity of the images, this will be punishable by up to three years in prison.
Artificially generated CSAM involves images that are either partly or completely computer generated. Software can "nudify" real images and replace the face of one child with another, creating a realistic image.
In some cases, the real-life voices of children are also used, meaning innocent survivors of abuse are being re-victimised.
Fake images are also being used to blackmail children and force victims into further abuse.
The National Crime Agency (NCA) said there are 800 arrests each month relating to threats posed to children online. It said 840,000 adults nationwide pose a threat to children - both online and offline - equivalent to 1.6% of the adult population.
Cooper said: "You have perpetrators who are using AI to help them better groom or blackmail teenagers and children, distorting images and using those to draw young people into further abuse, just the most horrific things taking place and also becoming more sadistic."
She continued: "This is an area where the technology doesn't stand still and our response cannot stand still to keep children safe."
Some experts, however, believe the government could have gone further.
Prof Clare McGlynn, an expert in the legal regulation of pornography, sexual violence and online abuse, said the changes were "welcome" but that there were "significant gaps".
The government should ban "nudify" apps and tackle the "normalisation of sexual activity with young-looking girls on the mainstream porn sites", she said, describing these videos as "simulated child sexual abuse videos".
These videos "involve adult actors but they look very young and are shown in children's bedrooms, with toys, pigtails, braces and other markers of childhood," she said. "This material can be found with the most obvious search terms and legitimises and normalises child sexual abuse. Unlike in many other countries, this material remains lawful in the UK."

