Australian Organisations Need to Build Trust With Consumers Over Data & AI

Global data from Cisco highlights a concerning trend: a widening "trust gap" between what customers expect Australian organizations to do with their data and privacy and what is actually happening.
The data shows that 90% of people want to see organizations transform the way they manage data and risk. With regulation lagging, Australian organizations will need to move faster than the regulatory environment to deliver this.
According to a recent major global Cisco study, more than 90% of people believe generative AI requires new techniques to manage data and risk (Figure A). Meanwhile, 69% are concerned with the potential for legal and IP rights to be compromised, and 68% are concerned with the risk of disclosure to the public or competitors.
Essentially, while customers appreciate the value AI can bring to them in terms of personalization and service levels, they're also uncomfortable with the implications for their privacy if their data is used as part of AI models.
Around 8% of participants in the Cisco survey were from Australia, though the study doesn’t break down the above concerns by territory.
Other research shows that Australians are particularly sensitive to the way organizations use their data. According to research by Quantum Market Research and Porter Novelli, 74% of Australians are concerned about cybercrime. In addition, 45% are concerned about their financial information being taken, and 28% are concerned about their ID documents, such as passports and driver’s licences (Figure B).
However, Australians are also twice as likely as the global average to violate data security policies at work.
As Gartner VP Analyst Nader Henein said, organizations should be deeply concerned with this breach of customer trust because customers will be quite happy to take their wallets and walk.
“The fact is that consumers today are more than happy to cross the road over to the competition and, in some instances, pay a premium for the same service, if that is where they believe their data and their family’s data is best cared for,” Henein said.
Part of the problem is that, in Australia, doing the right thing around data privacy and AI is largely voluntary.

“From a regulatory perspective, most Australian companies are focused on breach disclosure and reporting, given all the high-profile incidents over the past two years. But when it comes to core privacy aspects, there is little requirement on companies in Australia. Main privacy pillars such as transparency, consumer privacy rights and explicit consent are simply missing,” Henein said.
It is only those Australian organizations that have done business abroad and run into outside regulation that have needed to improve — Henein pointed to the GDPR and New Zealand’s privacy laws as examples. Other organizations will need to make building trust with their customers an internal priority.
While data use in AI might be largely unregulated and voluntary in Australia, there are steps the IT team can, and should, champion across the organization.
Currently, data privacy, including data collected and used in AI models, is governed by regulations created before AI models were even in use. Therefore, the only standards Australian enterprises apply are self-determined.
However, as Gartner’s Henein said, there is a lot of consensus about the right way forward for the management of data when used in these new and transformative ways.

“Back in February 2023, the Privacy Act Review Report was published with a lot of good recommendations intended to modernize data protection in Australia,” Henein said. “Seven months later in September 2023, the Federal Government responded. Of the 116 proposals in the original report the government responded favorably to 106.”
For now, some executives and boards may baulk at the idea of self-imposed regulation, but an organization that can demonstrate it is taking these steps will build a stronger reputation among customers and be seen as taking their concerns around data use seriously.
Meanwhile, some within the organization might be concerned that imposing self-regulation might impede innovation. As Henein said in response to that: “would you have delayed the introduction of seat belts, crumple zones and air bags for fear of having these aspects slow down developments in the automotive industry?”
It is now time for IT professionals to take charge and start bridging that trust gap.
Stay up to date on the latest in technology with Daily Tech Insider Australian Edition. We bring you news on industry-leading companies, products, and people, as well as highlighted articles, downloads, and top resources. You’ll receive primers on hot tech topics that are most relevant to AU markets that will help you stay ahead of the game. Delivered Thursdays