Passport

Literally yesterday (read the post) I was saying to myself, wow, Discord added a good feature! Maybe this will be the start of a trend?

Today. One day after that.

We get this blog post.

The post starts by talking about how bots will be able to incorporate themselves into Discord's interface and feel more like part of the platform — cool!

The post then says that you'll need to tell Discord who you are in real life by sending them scans of your passport if you want your bot to grow beyond 100 guilds.

Neat.

Well, that's not an entirely fair way of putting it. Let's see what actually happens.

Problems with this:

But the real question is, why is this necessary? Why are they doing this? What is this information going to be used for?

The blog post says that it's necessary and secure and proves you're legit, while giving absolutely no explanation of what it helps to protect or what this data will actually be used for. And no, it's not necessary, it's not secure, and it doesn't prove that anyone is legit.

I'm extremely worried by the lack of transparency here. Remember gateway intents? Remember the GitHub issue where bot developers could discuss with Discord staff how these changes would work? Remember the transparency, and the information, and the gradual rollout, and the requested changes, and the experimental gateway version, and the feedback? We have none of that.

We have a screenshot of a design that is currently entirely a mockup to lure people in, followed by verification presented as a cool thing that will help trust and security on the platform, followed by them holding your bot to ransom until you upload scans of your passport, finished with a warning that you have 6 months to comply. This came out of nowhere. No opportunities for feedback. No progression towards it. No discussion. Just suddenly, "by the way, please give us your identity," and people are swarming over it at the promise of a profile badge.

So I've listed things that your identity is unlikely to be used for. How about a list of things that it can be used for? Because remember, we still have absolutely no idea why this is necessary or what it will help improve. The best I can do is write theories on the internet.

The key to my theory is that the form Discord asks you to fill out about your bot includes many questions about keeping your users' data safe. It's similar to the GDPR controls you see on some websites. You're asked what you do with the data, how long you hold it for, how you keep it safe, how users can report security issues, and how you delete data if a user asks you to. Based on this form, and the fact that they want to know your real-world identity, and that they plan to minimise your impact on the platform if you do not comply, and in the absence of any official information, my prediction is that they plan to hold bot developers legally accountable if the developer mishandles user information.

It's probably correct, it's hopefully not correct, but it's the only thing I can think of that makes sense.

Finally, speculation about its purpose aside, this verification procedure is bad for the Discord ecosystem in several ways.

One comment I've seen a lot about verification is that 100 servers is too low. My question is, too low for what? We have no idea what this will be used for. What are you comparing the number 100 to? What information did you use to make the decision that it's too small? "100 is too low" is not a good argument at all.

Another argument: "this is a vetting process for big bots so that they don't harvest private conversations from thousands of servers". The verification process does not vet anything or ensure that data is kept safe at all. Discord cannot verify what your bot does with the information it receives.

To sum up why this is a bad thing:

Scream.

The memes of the day [oc] are all related to this post.

— Cadence
