How to beef up security when using AI tools

BVWire Issue #266-2
November 13, 2024

valuation profession news
business valuation profession, conference, practice management, information technology, artificial intelligence

One of the big concerns about using ChatGPT or other AI tools is the confidentiality of client information when uploading files or documents. At the recent ASA Conference in Portland, Ore., panelists noted that some tools allow users to opt out of having their information shared or made public. But is there any way to verify this? Not really, the panel acknowledged.

What to do: At the AICPA Conference in Dallas, attendees were advised that, regardless of what the tool or app says about data security and sharing, users should get a security review from a “trusted technologist.” An attendee asked: Where do you find one? At large firms, the IT department will have someone on staff or will know who to call. For everyone else, a Google search on “IT security” or “[Microsoft] Azure cloud professional” should turn up some results.

The ASA session was The AI Revolution: Why It Matters to Appraisers & Application Strategies (Greg Endicott, Kevin Couillard, and Andrew Couillard). The AICPA session was Let’s Have a Chat: Applying ChatGPT and Other Large Language Models to the Practice of Forensic Accounting (Daniel Street and Joseph Wilck).

Please let us know if you have any comments about this article or enhancements you would like to see.