Microsoft Word just got woke with its new social justice plugin
Coming soon to a word processing app you probably already subscribe to is Microsoft’s new Ideas plugin. This leap forward in the predictive text trend will endeavor to help you be less offensive. Worried you might be a little bit racist? A little gender confused? Not sure about the difference between disabled persons and persons who are disabled? Never fear, Microsoft will fix your language for you.
Using machine learning and AI, Microsoft’s Ideas in Word will help writers be their least offensive, most milquetoast selves. Just as spell check and grammar check do, Ideas will make suggestions as to how to make your text more inclusive. On the surface, this seems like a terrible idea, and when we dig further into the impulse behind the program, and its functionality, it gets even worse. AI and machine learning are set to run in the background of pretty much every application, learning from our behaviours not only how we’d like to format our PowerPoint presentations but, across platforms, how best to construct language so that we say what others want us to say as opposed to what we really mean.
There is an essential component of honest communication, namely that a person express themselves using their own words. When children are learning to talk and to articulate themselves, they are told to “use your words.” Microsoft will give writers the option of using someone else’s words, some amalgamation of users’ words across the platform, and the result will be that the ideas exhibited will not be the writer’s own.
“We want to augment your skills,” said Rewari, senior product marketing manager for Microsoft 365, on Microsoft’s blog. “We want to help you communicate more efficiently, effectively and inclusively.” What does this mean? Like Peggy McIntosh, let’s unpack.
Communicating efficiently means, primarily, using fewer words to get an idea across. There is a school of thought in writing that best practice is to use the fewest words possible to convey a given idea. Many concepts, however, cannot be boiled down to a simple sentence. Microsoft’s annoying grammar check already pushes in this direction, and the new tool will emphasize brevity over the formulation of complex ideas even more.
What does it mean to communicate effectively? It means to get the idea across to the reader, ideally. But what Microsoft does not know is who the writer’s intended audience is, and what that audience is expected to know, or expects to be able to comprehend. Writing a test prep guide on biology for third graders is a vastly different enterprise from writing one for high school seniors, or at least one would hope.
The final adjective tossed into the word soup of what Microsoft hopes to achieve with its Ideas tool is inclusivity: that grand, confusing, overused word that tends to mean whatever anyone wants it to mean at the moment it is spoken. Inclusivity in a language with over a million words, as English has (even allowing the quibbles about multiple forms of the same word and words with multiple meanings), is less important than accuracy. In fact, accuracy may be the reason English has so many words in the first place. When there is no word that sufficiently says the nuanced thing we wish to, we drag one in from another language and adopt it as our own.
Is the idea of inclusive language to use the words that are most acceptable to the most people, or to use the words that those who created the AI and machine learning algorithms think should be most acceptable to the most people? Not only are these two completely different things, but both are anathema to good writing and to honest writing.
The more we shout our differences into the void of social media and the world at large, demanding recognition of our own unique circumstances, genomes, cultural backgrounds, sexualities, and whatever else, the more tools like this simultaneously sand down precisely those particularities. While AI and machine learning engineers may think this is a glorious leap forward in getting all of us on the same page with regard to how we write and what we say, everyone else should be pissed. This kind of tool is theft of a writer’s brain power, ingenuity, inventiveness, and creativity.
As AI and machine learning move off the desktop and into our bodies, as with Elon Musk’s Neuralink, we will find these predictive algorithms and corrective texts helping us think more efficiently, effectively, and inclusively. It won’t be long before autocorrect corrects our thoughts to be either the most acceptable to the most people, or to be those thoughts the algorithms, complete with their own undetectable biases, are designed to make us think. Either way, giving away your speech, giving away your words, is giving away your ideas and your independence. Use your words, not anyone else’s, and least of all Microsoft’s. And perhaps it’s time to use an open-source word processor.