Dive Brief:
- In a tightening AI regulatory landscape, executives are grappling with the impact of new rules on implementation plans and existing practices, a KPMG survey published Thursday found.
- More than half of executives expect compliance with data privacy and security requirements to increase costs for their organizations. Nearly two-thirds of leaders project the requirements will tighten as regulators shape policies.
- To prepare for stricter regulation, 3 in 5 businesses are currently reviewing and updating data practices. Around half of organizations are implementing technical measures to improve the transparency and fairness of AI applications.
Dive Insight:
Regulators are closely watching the AI landscape, from enterprise deployments to vendor partnerships and company practices.
In the U.K., regulators are assessing how several AI partnerships between technology firms affect market competition. The EU AI Act has also upped the ante for makers and deployers of AI models.
At the U.S. federal level, AI is regulated under existing laws. Compliance could bring additional costs for enterprises, whether for securing legal counsel, updating practices and technology, or paying outside partners to assist.
Despite the challenge, most enterprises aren’t backing off even as the goalposts shift. Leaders are allocating more money and resources to reaching AI goals, and technology leaders are working to set businesses up for success by carefully weighing the value and risks of tools and applications.
Enterprises often turn to third-party vendors when integrating generative AI to avoid upfront development costs and the need for in-house technical expertise. However, many are wary of the regulatory non-compliance and data privacy and security risks they assume when partnering with third parties.