Governments have to find new ways to regulate online platforms, says a former CIA officer and Facebook staffer who believes the platforms' drive to keep us on their sites is fostering extremism.

Former CIA officer Yaël Eisenstat, who spent 18 years in various roles with the US government before joining Facebook to head the social media platform’s elections integrity operations, was speaking at Data61’s recent D61 Live conference in Sydney.

In her presentation she described how online services, particularly YouTube and Facebook, depend on serving users steadily more extreme content to keep them engaged with their platforms.

"Their number one metric is user engagement, they have to keep you on their platform."

"The business model of trying to ensure you are addicted to their technology so that they can target you with ads is causing real issues.

"In order to keep you on their screens, they have to serve up more extreme content."

"The algorithm was trained with the intent of keeping your eyes on the screen and the algorithm has figured out that if I just show you something slightly more extreme than the last thing you saw, you will click and you will watch."

Former CIA staffer and Facebook executive Yaël Eisenstat says we need to find new ways to regulate online services

Her views come as Facebook and Google struggle with an increasing global push to regulate their activities. In Australia, the Federal government is preparing its response to the ACCC’s recent Digital Platforms Inquiry, which has proposed wide-ranging reforms to online regulation.

Eisenstat said her concerns about online services started to form four years ago.

"As 2015 continued, and as elections in the United States were heating up, I started to truly become concerned about a different kind of threat," she said.

"The fear that started keeping me up at night, and the only time I've ever truly believed could unravel this imperfect democratic dream of ours, was actually coming from within.

"It was the hatred, and lack of empathy for, our fellow citizens. In 2015 I was watching civil discourse break down in the United States."

The breakdown in civility was due to inflammatory misinformation being posted and shared, Eisenstat observed.

"To me, the bigger question wasn't 'did Russia hack our elections?', it wasn't 'did Facebook hack our democracy?'. To me, the questions I was really concerned with were 'why are we so easily manipulated?', 'why are we so easily persuaded?' and 'why are we so easily drawn into these filter bubbles or down these online rabbit holes?'

"What is actually happening here?"

Eisenstat said the answers to these questions lie in the services’ advertising-funded business model.

"By now we all know that technology, particularly the proliferation of this advertising business model-based social media industry, has exacerbated this toxic polarisation to a degree that I fear may not be reversible.

"This need to push people further into extreme paths in order to keep our eyes on their screens in order to serve us targeted ads is absolutely central to their business model.”

"For example, teenage girls go onto YouTube. They are looking up some video, maybe swimsuits for the next season, and then they're being shown anorexia video."

"In order to keep us engaged, these recommendation engines are bringing us down more extreme paths and they are wrecking our ability to even see that we are being manipulated."

Fixing this will require government intervention, says Eisenstat, but the rules for online services need to be different to what applies to traditional broadcasters and publishers.

"I don't want them regulated like a publisher, because they are not publishing their own content.

"What they are doing is more dangerous -- they are curating our content, they are deciding what we do and do not see, they are deciding which rabbit holes to take us down and which filter bubbles to put us in.

"If they are curating our content, they need to be classified as such. Call them a 'digital curator' if you like and regulate them that way as you cannot tell me they bear no responsibility for the content they put out there if they are curating their content.

"This is one regulation they will fight tooth and nail to not have this changed, because if they actually beared the responsibility for the content on their platforms, they claim it will destroy their business.

"I claim that maybe it will just destroy your business model and you should figure out a different way to operate Facebook."

"The internet we all know was created to break down borders and help connect us, but unfortunately as we have seen over the last however many years, it seems to have managed to divide us as much, if not more, than unite us."

"It may not be any one company's responsibility to decide what is true or false, to decide whose voices to silence, or to show us content from a variety of perspectives. But when they are purposely creating algorithms with the goal of keeping us engaged in order to monetise a so-called free service, they are contributing to an increased polarisation that has undeniable real-world consequences."

Ultimately, despite her own experiences, she remains upbeat about the potential for online platforms to do good, saying, "I'm cautiously optimistic the tech industry can play a positive role in steering this ship.

"At the very least, we must ensure technology doesn't make it worse."