However, despite the investment, there is some evidence that security teams remain worried their organizations aren't prepared to protect the massive amounts of data they collect and store for emerging technologies, such as machine learning systems. According to the survey, 96% of organizations believe they have processes in place to meet the responsible and ethical standards that customers expect for AI solutions and services. Yet nearly all of them (92%) also said their organizations need to do more to reassure customers about their data.
The report also found that more than 70% of respondents said they were getting “significant” or “very significant” benefits from their privacy investments, including building trust with customers, reducing sales delays, and mitigating losses from data breaches.
Claude Mandy, chief evangelist, data security at Symmetry Systems, told SC Media that customer demands around privacy – particularly around issues like transparency of data use in emerging technologies such as AI – do influence organizational approaches to privacy. Unfortunately, those demands also tend to take longer to be thoughtfully integrated into current practices.
“Security professionals struggle to secure this data already and are continually challenged in understanding where personal data is being stored, let alone used,” said Mandy. “The increased consumerization of AI, including the advent of tools like OpenAI will make it even harder to control the flow of data into AI tools, representing an even bigger hurdle to prove organizations are using customer data ethically and with consideration of their privacy rights and needs.”
David Maynor, director of the Cyber Threat Intelligence Team (CTIG), added that it's encouraging when another source confirms that sound investment in privacy delivers a return.
“However, I do have trouble resolving these reports with the number of people I meet who will give a random app complete access to their data to watch free movies and TV,” Maynor said.