Automation, AI and community management: the soul in our new machines
Should the custodians of peer-to-peer relationships around brands be afraid of automation and AI? Vanessa Paech on machines and community management.
This article originally appeared in The Serve Issue, our October/November 2017 print edition of Marketing magazine.
Community managers are building the human infrastructure of the 21st century. AI and automation can support them.
The machine layer offers community strategists power – think Tony Stark’s suit in Iron Man. Used as a tool, it can liberate our time for work that demands a human touch – essential to build that human infrastructure of interconnected ecosystems.
But the Iron Man scenario only works if community professionals help drive the creation of AI – behaviourists alongside technologists to shape nuanced, contextually astute intelligences that consider where the human begins and the machine ends.
Real strides have been made here, even though most tools are geared for comment streams, not many-to-many discussions. In 2012, HuffPost gave its human moderators a textual analysis AI called Julia to help them manage 10 million comments per month. Argentinian social network Taringa! uses a hybrid model of linguistic and image analysis plus human eyeballing to manage 1.5 million monthly posts.
The Washington Post launched ModBot – its AI loaded with the lessons of WaPo moderators across a decade – for comment moderation. With filters and flags, ModBot processes what it can, escalating content up the chain if it needs human intervention. Instagram and Google are experimenting with AI-supported moderation for toxic content. Community managers have tried Google’s AI, Perspective, via its public API and found it struggles with sarcasm, subtlety and cultural translations.
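The filter-and-escalate pattern these tools share can be sketched in a few lines: the machine acts alone on clear-cut cases and routes everything ambiguous – the sarcasm and subtlety that trips up Perspective – to a human moderator. This is a minimal illustration of the pattern, not ModBot’s actual implementation; the function names and thresholds are hypothetical.

```python
APPROVE_THRESHOLD = 0.9  # model very confident the comment is benign
REMOVE_THRESHOLD = 0.9   # model very confident the comment is abusive

def route_comment(comment: str, toxicity_score: float) -> str:
    """Decide the action for a comment, given a model's toxicity score (0-1)."""
    if toxicity_score >= REMOVE_THRESHOLD:
        return "remove"    # clearly abusive: the machine acts alone
    if toxicity_score <= 1 - APPROVE_THRESHOLD:
        return "approve"   # clearly benign: the machine acts alone
    return "escalate"      # ambiguous: a human moderator decides

# e.g. route_comment("nice one, genius", 0.55) lands in the escalation queue
```

Tuning those two thresholds is where the community manager’s judgement lives: tighten them and the machine handles less but errs less; loosen them and the human queue shrinks at the cost of more machine mistakes.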
UK start-up Spirit AI is giving us a glimpse of the future with its embodied software called Ally. The tech supports community managers of virtual worlds, monitoring interactions to detect harassment such as stalking. Ally drops a private note to the possible victim to see if they’re OK. The user can ask for help, or tell Ally not to worry, and the system learns. It’s impressive, but there’s no singular response to harassment or threatening behaviour.
We can and will break our machines
Humans cannot resist the urge to manipulate systems, and AI is a system. Users are already entering into adversarial relationships with algorithms as they game them to suit their needs. Serial pests are adept at this arms race, and AI is merely a new dimension to their sport.
Machines are ideal for handling ‘low-fi’ moderation tasks like vacuuming up spam and swatting trolls. But moderation is also ‘hi-fi’ – balancing voices and social dynamics, highlighting ideal behaviour, defusing tensions between members, or discreetly putting someone in touch with crisis services.
Smart social structures
Early online communities were bounded digital destinations. Social networks connected them. Now, communities are highly distributed across multiple environments. Most have some version of an ecosystem anchored in one or two key places. Community managers oversee and steer the ecosystems, and build out and massage them as participant needs change.
AI can help us create new smart zones for engagement using conversational analysis. If a hot button issue is emerging, we can program our AI to create a sub-forum
to house discussion around that topic. Community architecture can become more fluid with the support of machine learning and a community manager to feed its smarts.
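A crude version of that conversational analysis is a spike detector: compare how often a keyword appears in recent posts against its rolling baseline, and flag it for the community manager (or an automated sub-forum builder) when it surges. This is a hypothetical sketch – a real system would use topic modelling or embeddings rather than raw word counts.

```python
from collections import Counter

def detect_hot_topics(recent_posts, baseline_counts,
                      spike_factor=3.0, min_mentions=20):
    """Flag keywords whose mention rate in recent posts has spiked
    against a rolling baseline of per-period counts."""
    recent = Counter(word for post in recent_posts
                     for word in post.lower().split())
    hot = []
    for word, count in recent.items():
        baseline = baseline_counts.get(word, 1)  # avoid dividing by zero
        if count >= min_mentions and count / baseline >= spike_factor:
            hot.append(word)
    return hot
```

The community manager still makes the call on whether a flagged topic deserves its own space – the machine only surfaces the signal.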
Bots connecting the dots
Bots can connect us to useful information when we first join a community. Many people prefer a tip from a bot to self-service. Xero’s bot for Messenger, Hey Xero, does a great job with asset discovery and troubleshooting, answering questions and connecting wider members of Xero’s B2B customer community to local experts.
This can be extended into broader community management contexts. An intelligent agent could help a manager of a large community by welcoming newcomers and recommending certain conversations or content. They can stimulate activity by sending a private message or making a public post suggesting that member A connect with member B.
And they can help the community manager identify influencers, experts or members that fit certain criteria.
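At its simplest, that welcome-and-match behaviour is a ranking problem: score active threads and resident members by how much they overlap with a newcomer’s stated interests, and surface the best matches in a private message. The data model below (interest tags, expertise tags) is an assumption for illustration, not any particular platform’s schema.

```python
def welcome_recommendations(newcomer_interests, threads, members, top_n=3):
    """Rank threads and members by tag overlap with a newcomer's interests -
    the kind of nudge a welcome bot could send as a private message."""
    def overlap(tags):
        return len(set(tags) & set(newcomer_interests))

    ranked_threads = sorted(threads, key=lambda t: overlap(t["tags"]),
                            reverse=True)
    ranked_members = sorted(members, key=lambda m: overlap(m["expertise"]),
                            reverse=True)
    return ranked_threads[:top_n], ranked_members[:top_n]
```

The same overlap score, run the other way, lets the manager query for members who fit certain criteria – the influencer and expert identification mentioned above.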
Making more time and space for trust
Trust is currency for the community manager; our communities don’t work without it. We earn it, nurture and maintain it through human-centred interactions and social proof. Introducing machines at the wrong moments, or in the wrong way, erodes trust.
Customer service agents in too many businesses are pressured to behave like machines as it is – incentivised on speed and cost reductions, not satisfied customers or care invested. We’ve dehumanised this work to such an extent that a bot that can spend as long as we need solving our problems actually seems refreshing.
Let’s use the freedom this offers responsibly, and reinvest our humanity where it counts: solving the sophisticated and strategic problems for our customers and enabling our communities.
We can’t lose the common ground
The value of hyper-personalisation is clear: improve the signal-to-noise ratio and connect consumers with content that activates and fulfils needs efficiently. But fully personalising a community experience misses the point. Community is valuable because it’s a shared experience. Even conflict – like debate – can engage and bind members. Community is a highly effective silo-killer. Individualise an experience excessively and those silos shoot back up.
Peer support communities in particular rely on immersion in a collective. Mental health organisations Sane, Cancer Council NSW, Beyond Blue and Reach Out each run outstanding forum communities where a common connection fuels constructive engagement. Although you may be able to personalise certain content within
the community for a member, AI can’t replace the critical inclusivity and empathy of talking with someone going through a similar situation.
Building our hive
AI will change the way community professionals manage their knowledge base. Machine learning can wrangle the repository of resources we’ve assembled over decades, and connect us with the right resource for the job at hand. We tell the machine our problem and it cobbles together a batch of case studies, research, tools and individuals we can tap to solve it.
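Stripped to its essentials, that ‘tell the machine our problem’ step is a retrieval query over the repository. The sketch below uses naive keyword matching purely for illustration – the resource fields and function are hypothetical, and a production system would use semantic search over embeddings.

```python
def retrieve_resources(query, repository, top_n=5):
    """Naive keyword match over a knowledge base of case studies,
    research and tools; returns the best-scoring resources."""
    query_terms = set(query.lower().split())

    def score(resource):
        text = (resource["title"] + " " + resource["summary"]).lower().split()
        return len(query_terms & set(text))

    ranked = sorted(repository, key=score, reverse=True)
    return [r for r in ranked if score(r) > 0][:top_n]
```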
Founder of virtual assistant x.ai, Dennis Mortensen, talks about a Bring Your Own Agent (BYOA) future-state in less than a decade, where workers will create or order intelligent agents to accomplish highly specific (vertical) tasks that help them in their jobs. The community manager of 2027 could carry with them a team of agents designed to complete single tasks in their community management portfolio.
Social marketers are vulnerable
Social media managers who distribute content systematically and drive engagement against it will be superseded by automated equivalents. Community management’s point of distinction is its humanity. Instead of being replaced, community experts will upgrade. We’ll work to help businesses set up bots and intelligent interactions.
We’ll plot behavioural frameworks for machine learning. We’ll spill into HR, marketing, IT, innovation – anywhere there’s a need to understand and optimise social intelligence. Leveraging AI for communities demands we extend our capabilities as social systems engineers. If we get it right, we can see to it that AI augments our best natures.
Vanessa Paech is the co-founder of the SwarmConf community management conference.