Can Technology Tell You When a Client Is Worried?

Since hanging out a shingle of his own, fee-only financial planner Michael Solari has used virtual conferencing to connect with clients and prospects across the country. Yet he’s not convinced the technology is better than a face-to-face meeting, where he can keep a closer watch for the visual cues—a confused look, a downward glance at certain moments of a conversation—that clue him in to how clients feel about whatever is being discussed.

“There’s nothing like being in front of the person,” says the Nashua, New Hampshire-based advisor. “I don’t know if technology will ever completely replace that.”

Technology, however, is trying. From verbal to visual apps, tech tools are offering a virtual window into the way we read the feelings and concerns of others. By sensing emotion through a facial expression or a tone of voice, these tools promise to help us understand what’s going on behind someone’s words and gestures.

Imagine being able to tell whether a client is truly comfortable with a risky investment just from the way they are speaking or the expression on their face. Often referred to as affective computing, this kind of digital interaction is beginning to gain traction. Take US+, a plug-in for Google Hangouts developed by artists Lauren McCarthy and Kyle McDonald. It works across virtual meetings within the Google environment, using linguistic inquiry and word counts to “read” a discussion and track a user’s qualities, like positivity and aggression. It’s even slightly amusing, shutting off a microphone if the program senses one person has talked much more than the other. More art project than technology directly applicable to an advisory practice, US+ gives some hint of what the future may hold for virtual meetings.
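To make that word-counting idea concrete, here is a minimal, hypothetical Python sketch—not US+ itself—that tallies the words each participant speaks, keeps a crude positivity score from small invented word lists, and flags when one speaker has dominated the conversation. The class name, word lists and dominance threshold are assumptions made purely for illustration.

    from collections import Counter

    # Toy lexicons for illustration; real linguistic-inquiry tools use far larger dictionaries.
    POSITIVE_WORDS = {"great", "glad", "comfortable", "agree", "good", "thanks"}
    NEGATIVE_WORDS = {"worried", "risky", "concerned", "confused", "loss"}

    class ConversationMonitor:
        """Tracks per-speaker word counts and a rough positivity score."""

        def __init__(self, dominance_ratio=3.0):
            self.word_counts = Counter()    # total words spoken per speaker
            self.scores = Counter()         # running positivity score per speaker
            self.dominance_ratio = dominance_ratio

        def add_utterance(self, speaker, text):
            words = text.lower().split()
            self.word_counts[speaker] += len(words)
            self.scores[speaker] += sum(w in POSITIVE_WORDS for w in words)
            self.scores[speaker] -= sum(w in NEGATIVE_WORDS for w in words)

        def dominant_speaker(self):
            """Return the speaker who has talked far more than the quietest one, if any."""
            if len(self.word_counts) < 2:
                return None
            loud, most = max(self.word_counts.items(), key=lambda kv: kv[1])
            _, least = min(self.word_counts.items(), key=lambda kv: kv[1])
            if least == 0 or most / least >= self.dominance_ratio:
                return loud
            return None

    monitor = ConversationMonitor()
    monitor.add_utterance("advisor", "I think this allocation is a good fit")
    monitor.add_utterance("client", "Honestly I am worried it feels risky")
    print(dict(monitor.scores))        # advisor leans positive, client negative
    print(monitor.dominant_speaker())  # None unless one side talks roughly 3x more

A real tool would layer far richer dictionaries, plus acoustic or visual signals, on top of this, but the word-count core of such a monitor can be roughly this simple.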

An app for Google Glass has also garnered attention through a facial analysis program called SHORE; it can detect basic emotions, from happy to angry, in the person the wearer is talking with. Companies including Emotient and Noldus also use facial emotional analysis to “read” a person’s expression.

Beyond Verbal’s Moodies interprets a speaker’s voice modulations to identify their emotions. The app works across dozens of languages, either live or in recorded playback. These emotional voice sensors are increasingly popular; Sharecare, a health and wellness site, uses an app that monitors a user’s everyday phone conversations and gives a reading of their emotional state and, on a scale from green to red, their level of stress.

Advisors think of themselves as therapists. They need to read beyond the list of numbers on a financial document and sense any worry, fear or hope a client may have about their financial history—and future. Tools that offer clarity into a client’s mindset produce not only better planning, but potentially fewer phone calls. When bumps appear, advisors are likely to be better aligned with what investors want, whether or not they’ve articulated their thoughts.

“We may not be reading everyone the right way,” says Solari. “There are some touchy issues that financial planners get involved with. If technology can make me a better planner, I would use that.”
