Unexpected results from survey on AI usage at BV firms

BVWireIssue #257-4
February 28, 2024

practice management and growth
business valuation marketing, valuation practice management, benchmark, information technology, artificial intelligence

“More practitioners are using AI than I would have guessed,” says Rod Burkert (Burkert Valuation Advisors), a practice management advisor, about the latest in a series of “Two-Minute Practice Builder” surveys he developed for BVWire.

About half (48%) of BV firms polled say they are using artificial intelligence in their practices, despite a lack of ethics guidance from the valuation professional organizations (which are working on it), Burkert notes. The usage is mostly for researching economic or industry conditions (see below). “This (I think) is a very low-risk application and a great way to get familiar with AI,” he says.

 

How are you using AI in your practice?

Researching economic or industry conditions: 61%
Using as an “editor” for valuation report narrative: 43%
Assisting in marketing efforts: 39%
Assisting with analysis of comps for the market approach: 26%
Analyzing historical financials of a subject company: 17%
Examining financial records for forensic purposes: 4%
Other (please specify): 30%

The “other” category includes legal research (tax code, court cases), coding, and compiling public stock data (but without much success yet). One firm is thinking about building and deploying an internal ChatGPT application to analyze past reports and Excel models to help draft valuation reports.

The clear tool of choice is ChatGPT, used by 70% of respondents, “which surprises me since Copilot is so integrated into the Microsoft suite of software that practitioners use,” he remarks. Other tools in use include Copilot, Bard (now Gemini), and Grammarly.

Many wary: “Most of the objections against using AI come down to not understanding it and data confidentiality (and fear),” Burkert observes. Of the 52% of respondents who are not using AI, the main reason is that they “don’t trust it.” This is not surprising, as stories abound about AI “hallucinating” and doling out incorrect or make-believe information. Others are taking a wait-and-see stance, letting others be the guinea pigs before they take the plunge. There are also concerns about the confidentiality of client information.

We will have a new survey on another topic in the next issue. We thank all of you who participated!

Please let us know if you have any comments about this article or enhancements you would like to see.