Practice support staff less welcoming of AI tools

Though AI has taken the financial advice industry by storm in recent years, practice support staff appear considerably more hesitant than advisers when it comes to AI tools.

Research by Investment Trends has revealed that practice support staff are much less keen on the introduction of AI tools into the advice process, with 31 per cent reporting no intention of using AI, compared with 20 per cent of advisers.

A further 36 per cent of support staff and 43 per cent of advisers said they had an interest in AI tools but needed more education or support.

According to the findings, advisers and support staff are primarily using AI tools for reporting and editing, though both groups are still calling for greater capabilities in these areas, which the report said presented an opportunity for platforms to deliver more for their users.

These were followed by research and modelling, customer service, and data analytics, with both advisers and support staff calling for greater AI capabilities across the board.

However, nearly six in 10 support staff said they were unsure or had no view on the implementation of AI tools, highlighting a possible gap in education or understanding around the technology.

Speaking on an Investment Trends webinar last week, the firm’s finance and research director, Paul McGivern, suggested that platforms have an important role to play in getting support staff on board.

“You can see that the advisers have got a preference to do more. The support staff need a little bit more help and education, and we would say that the platforms are very good at bringing the practice support staff along for the ride and providing great training and tools in relation to that,” McGivern said.

Privacy, cyber security and AI

Speaking at the FAAA Roadshow event in Sydney last week, Samantha Hill, a financial services specialist lawyer with Holley Nethercote, said those who choose to introduce AI tools into their practice must always keep regulatory obligations around privacy and cyber security in mind.

“Things like the Privacy Act, and Parliament has been busy because that act was also updated recently. Privacy Principle 11, which talks about your obligation to take reasonable steps to protect personal information, now explicitly states that you need to have measures in place, organisational and technological measures, to help you take those reasonable steps,” Hill said.

“So, that would be something that you need to have in the back of your mind when you’re actually keeping a record of a client meeting or a summary of a client meeting.

“Also, things like just the overarching obligation under the financial services licensing regime to provide financial services efficiently, honestly and fairly would be something that you need to keep in mind, as well as just general cyber security.”

Hill also argued that advisers need to double-check the work of AI tools, including transcripts and generated meeting summaries, if they are going to rely on them when producing financial advice.

“You could argue that, at any point in the past, when we’ve made file notes of conversations, we’ve probably made mistakes as well, but the problem with AI is that we sometimes tend to treat it a bit more as gospel,” she said.

“Just be careful when you’re creating a transcript of a conversation at your end, or using AI to do that, that you check that the transcript is actually correct before you sort of set it in stone and correct it if need be.

“You also need to know what your client is recording at their end. If they’re creating a transcript of your conversation, there’s a risk that if their transcript has errors, that could be problematic if you had a dispute down the track and they wanted to take action against you, for example.”

Hill added: “The other thing to note about the privacy regime is that from mid-2026 you’re going to have to include information in your privacy policy about how you use automated decision making processes, so that’s something to keep on your radar for 2026 as well.”