Minecraft Chat Reporting is DUMB

By spitemim, 2022-07-05

Before you read this, take a look at Minecraft’s official documentation on player reporting, as well as their officially published “rules”. All links are archived using archive.today and accurate as of 2022-07-05.

Player Reporting

Community Guidelines

Player Reporting (and why it’s dumb)

Player reporting is a new feature added in Minecraft 1.19, which allows users to report player chat messages that violate community guidelines. It’s made possible by cryptographically signed chat, which allows chat messages to be verified as coming from a specific Microsoft account.
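If you’re unfamiliar with how message signing works in general, here’s a generic sketch in Java using the standard java.security API. This is NOT Mojang’s actual implementation, key format, or protocol; it just illustrates the concept of signing a message with a private key and verifying it against the matching public key:

    import java.nio.charset.StandardCharsets;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.Signature;

    public class SignedChatSketch {
        public static void main(String[] args) throws Exception {
            // In the real system the key pair is tied to your Microsoft account;
            // here we just generate a throwaway RSA pair for illustration.
            KeyPairGenerator keyGen = KeyPairGenerator.getInstance("RSA");
            keyGen.initialize(2048);
            KeyPair keys = keyGen.generateKeyPair();

            byte[] message = "hello world".getBytes(StandardCharsets.UTF_8);

            // The sender signs the exact bytes of the message with their private key...
            Signature signer = Signature.getInstance("SHA256withRSA");
            signer.initSign(keys.getPrivate());
            signer.update(message);
            byte[] signature = signer.sign();

            // ...and anyone holding the matching public key (say, a report reviewer)
            // can check that this exact text was produced by that key's owner.
            Signature verifier = Signature.getInstance("SHA256withRSA");
            verifier.initVerify(keys.getPublic());
            verifier.update(message);
            System.out.println("signature valid: " + verifier.verify(signature));
        }
    }

The upshot: only the account holder’s private key can produce a valid signature over that exact text, so a reviewer who trusts the public key can be confident a reported message wasn’t forged or edited.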

Here are some examples of types of messages that could get you banned, from the Player Reporting FAQ and Community Standards:

Of course, every report will be reviewed by a highly trained Microsoft employee, and the user will get an explanation as to why they got banned. Except this never happens on any platform. I expect a high level of ambiguity as to exactly why a ban is issued, and there will probably be no indication of which message the user got banned for (e.g. “your account was suspended for hate speech,” with no other context given).

Not only is it unjust to punish someone with no explanation of WHY they’re being punished, it’s also a really horrible idea if your goal is to encourage misbehaving community members to adhere to community guidelines. If you can’t be sure exactly which slightly offensive chat message got you banned, how are you supposed to judge where the line is? Basically, you either have to refrain from sending messages entirely, or keep them as terse and inoffensive as possible, completely stripping personality and wit from your messages.

It’s also safe to assume that this report system will be used as a tool for witch hunts against server owners or generally unpopular people. Microsoft has stated that an increased volume of reports will not trigger automated moderation actions, nor will it count as “more evidence” that misbehavior took place and thereby increase the chance of action being taken. From the FAQ:

The volume of reports does not correspond to an automated action being taken on a report. The reported chat evidence is always the basis of any action. The volume of reports does not constitute as more evidence.

However, it only takes one (1) report for a moderator to ban someone, and the chance that at least one out of 100,000 false reports is mistakenly (or purposefully, by an opinionated reviewer) acted on is far greater than the chance of the same thing happening with just 1 or 2. So an increased volume of reports WILL result in a greater chance of being banned, simply by multiplying the opportunities for human error. And of course, this all assumes that Microsoft isn’t lying about hiring real people to handle reports, and won’t just resort to bot moderation when they receive too many. Let’s be real: Microsoft is part of big tech. Taking anything they say seriously or affording them even an ounce of trust is a fool’s errand.
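To put some rough numbers on that, here’s a back-of-the-envelope sketch (the per-report error rate is a made-up figure, purely for illustration):

    import static java.lang.Math.pow;

    public class ReviewErrorOdds {
        public static void main(String[] args) {
            // Made-up probability that a reviewer wrongly acts on any single false report.
            double errorRate = 0.001;
            for (int reports : new int[] { 2, 1_000, 100_000 }) {
                // Chance that AT LEAST ONE of the false reports results in a wrongful ban.
                double atLeastOne = 1.0 - pow(1.0 - errorRate, reports);
                System.out.printf("%,9d false reports -> %.2f%% chance of at least one wrongful ban%n",
                        reports, atLeastOne * 100.0);
            }
        }
    }

Even with a tiny per-report error rate, piling on reports pushes the odds of at least one wrongful action toward certainty: roughly 0.2% at 2 reports, about 63% at 1,000, and effectively 100% at 100,000 under this assumed rate.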

My biggest problem with this feature is that it applies to independent and private servers, not just Realms. Realms are Microsoft’s own thing and run on their own infrastructure, so in my eyes, Microsoft has the right to do whatever they want with them. But if someone runs a Minecraft server on their own hardware, enforcing their own rules and standards, why should Microsoft get a say in how that server is run? It shows that Microsoft has no respect for server admins’ right to run their servers the way they want. Luckily, the decentralized nature of Minecraft servers means there are some solutions to this problem, which I’ll get to later in this post.

Inconsistencies

I’m nitpicking a bit here, but the documentation is inconsistent in several places. The examples I found are probably not comprehensive, but the mere fact that Microsoft doesn’t have its own story straight on what behavior is allowed does NOT give me hope for the fairness of the future moderation team.

Threats

From “Addressing Player Reporting Tool”:

The type of behavior that will get someone suspended or banned is hate speech, bullying, harassment, sexual solicitation, or making true threats to others.

This seems to imply that they will only punish people who make “true threats”, whatever that means. As far as I can tell, it would be impossible to verify whether a threat made in Minecraft chat was carried out in real life or not.

But, the Community Standards has this to say about making threats:

Minecraft has a zero-tolerance policy towards hate speech, bullying, harassing, sexual solicitation, or threatening others.

...which seems to imply that ALL threats are to be punished equally, carried out or not. “Zero-tolerance policy” is a strong choice of words. I can see this having a huge negative effect on gameplay. Consider the following scenario:

player1 was pushed into lava by player2

<player1> i'm gonna kill you dude

(player2 proceeds to report player1)

player1 has been banned

Self-harm and suicide

From the “Player Report Categories” section of “Player Reporting in Minecraft: Java Edition”:

Woah! So talking about self-harm or suicide is punished?

From the FAQ:

Will you ban people for talking about suicide?

Which one is it, Microsoft? Will you ban players who talk about self-harm, or will you attempt to provide them with support resources? Your documentation on this topic is extremely inconsistent, so there’s no way to tell what the outcome will be. This further highlights the issue that Minecraft doesn’t seem to have their story straight on exactly which behaviors are allowed and disallowed.

“Intimate imagery”

Once again, from the “Player Report Categories” section of “Player Reporting in Minecraft: Java Edition”:

First of all, including the phrase “non-consensual” in the title of this category is completely redundant: the category’s description makes no mention of whether or not the intimate imagery is consensual, so it might as well just be called “intimate imagery.”

Second of all, merely talking about private/intimate images violates this rule. That means it would be against the rules to even describe a private image without exposing any players to the actual image in question. I suppose graphic descriptions of sexual content are something that should be regulated, but “talking about private and intimate images” is a very broad and vague rule, and I can see its ambiguity being used to unfairly report innocent players.

Possible Solutions

The decentralized nature of Minecraft servers and clients means there are many ways to get around this chat report system. Unfortunately, many of them involve disabling cryptographically signed chat, which means they rely on servers disabling “enforce-secure-profile” in the server options (it’s enabled by default) and on other players disabling “Only Show Secure Chat.” It’s a sad reality that sending chat messages on most public Minecraft servers is no longer an option if you want to keep your account safe. However, if you just wanna play with your friends, there are a few solutions.
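For reference, this is the relevant line in the vanilla server’s server.properties (the setting name comes straight from the server options mentioned above; the comments are my own summary of how I understand it):

    # server.properties
    # true (the default): clients must present a Mojang-signed chat key ("secure profile")
    # to join, and their chat messages are signed and therefore reportable.
    # false: unsigned clients are allowed to connect, which makes the workarounds below possible.
    enforce-secure-profile=false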

NoChatReports

NoChatReports is a mod that I’m planning to use the next time I want to set up a server to play with friends. You can read the website to find out exactly how it works when installed on the client and server, but here’s a basic explanation. When installed on the client, it stops your messages from being cryptographically signed, which in turn defeats chat reporting, since your messages can no longer be verified as having been sent by you. Many servers will refuse to let unsigned clients connect, so the mod will ask whether you agree to send signed chat on those servers. When installed on the server, it strips the signatures from chat messages. Players with “Only Show Secure Chat” enabled won’t be able to see these messages, but you can configure the mod to send all chat messages as system messages to get around this.

NoEncryption

NoEncryption is a Spigot plugin that disables chat reporting by stripping messages of cryptographic information, much like NoChatReports.
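I haven’t read NoEncryption’s source, so this is not its actual code, but the general server-side trick can be sketched as a tiny Bukkit/Spigot listener: cancel the signed player-chat broadcast and re-send the same text as a plain system message, which carries no signature a report could be built on (the plugin class name here is made up, and you’d still need the usual plugin.yml to load it):

    import org.bukkit.event.EventHandler;
    import org.bukkit.event.Listener;
    import org.bukkit.event.player.AsyncPlayerChatEvent;
    import org.bukkit.plugin.java.JavaPlugin;

    public final class UnsignedChatRelay extends JavaPlugin implements Listener {
        @Override
        public void onEnable() {
            getServer().getPluginManager().registerEvents(this, this);
        }

        @EventHandler
        public void onChat(AsyncPlayerChatEvent event) {
            // Drop the signed player-chat message entirely...
            event.setCancelled(true);
            String line = String.format(event.getFormat(),
                    event.getPlayer().getDisplayName(), event.getMessage());
            // ...then rebroadcast the same text as an unsigned system message,
            // scheduled on the main thread since this event fires asynchronously.
            getServer().getScheduler().runTask(this, () -> getServer().broadcastMessage(line));
        }
    }

Since the rebroadcast text is a system message rather than signed player chat, there’s no cryptographic evidence tying it to the sender’s account.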

Not sending messages in chat

This one is the simplest method of all, and the one I’m going to use on any server that refuses to accept unsigned chat messages. If you don’t send any chat messages, there’s nothing for other players to report. You can talk to your friends on an alternative platform with less dystopian policies.

Conclusion

Minecraft’s new player reporting feature is poorly documented and looks easy to exploit. It also demonstrates Microsoft’s lack of respect for server administrators and for the privacy of independently run servers. Fortunately, plugins and mods that disable cryptographically signed chat can neutralize this feature on servers and, in some cases, let clients send unsigned (and therefore unreportable) chat messages even on servers that don’t run these plugins themselves.

It may never be safe to send chat messages on a public Minecraft server again. But if you just want to play with your friends, or if your server’s admins agree that Microsoft has no right to moderate the chat on their server, then you can safely enjoy the block game... for now.