As the European Parliament prepares for tomorrow's vote on the new Copyright Directive, a dismaying number of creators' groups are supporting it, telling their members that this will be good for them and their flagging financial fortunes. The Directive requires mass-scale filtering of the majority of public communications to check for copyright infringement (Article 13), and it requires paid permission to link to the news if your link text includes as little as two words from the headline (Article 11).

The real incomes of real creators are really important (disclosure: my primary income source comes from writing science fiction novels for Tor Books, a division of Macmillan). Improving the incomes of the creators who enliven our days, inform, shock, delight and frighten us is a genuine Good Thing.

And creators are not faring well: as the entertainment and tech industries have consolidated, our power to negotiate for a fair slice of the pie has been steadily eroded. While it's never been the case that the majority of people who wanted to be artists managed to make a career of it, the share of the money generated by artists' work that finds its way into artists' pockets is at a very low point.

Enter the Copyright Directive. Under Article 11, tech platforms are expected to send license fees to news companies, and the journalists whose work appears on news sites are presumed to get a share of these new profits.

But this will not happen on its own. A tax on linking means that smaller news sites, the places where writers are paid to analyze and criticize the news, will be frozen out of the market: they will face legal jeopardy if they link to the news they are discussing, and they will be unable to afford linking fees priced for multinational tech platforms. Publishers have little incentive to negotiate licenses with small players, particularly ones that wish to criticize the publisher's work. Meanwhile, experience has shown that in the absence of competitive or legal pressure, news proprietors are more apt to disburse profits to shareholders than to journalists. The most likely outcome of Article 11 is fewer places to sell our work, and a windfall for the corporations who have been slicing away at our pay for decades.

Even worse, though, is Article 13, the copyright filters. Creative people worry that their works are displayed and sold wholesale, without permission, and to the benefit of unscrupulous "content farmers" and other unsavory networked bottom-feeders.

To address this, Article 13 mandates that all online platforms create databases of copyrighted works and block people from posting things that match items in the database. Any site that lets the public post text (whether that's whole articles or works of fiction or short tweets and Facebook updates), still images, videos, audio, software code, etc, will have to create and run these filters.

Will the filters work? Experience says they won't. Defeating these filters is just not hard, because it's not hard to trick computers. The most sophisticated image filters in the world have been deployed by Chinese internet giants in concert with the Chinese government's censorship effort. As a recent analysis from the University of Toronto's world-leading Citizen Lab has shown, it's relatively straightforward to beat these filters, and they were built by the most skilled engineers on the planet, operating with an effectively unlimited budget and with the backing of a state that routinely practices indefinite detention and torture against people who defy its edicts.
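To make this concrete, here is a deliberately naive sketch of what a match-and-block filter boils down to. It uses an exact cryptographic hash as a stand-in for the fuzzier fingerprinting that real systems like Content ID rely on, so it illustrates the logic, not any actual platform's code, and every name and piece of data in it is hypothetical. Even this toy version shows the basic bind: a trivially altered copy sails past the filter, while a verbatim, lawful quotation is stopped cold.

```python
# Illustrative sketch only: an exact-hash "upload filter", standing in for
# the fuzzy fingerprint matching real systems use. All names and data here
# are hypothetical.
import hashlib

claimed_works: dict[str, str] = {}  # fingerprint -> claimant

def fingerprint(content: bytes) -> str:
    """Reduce a work to a fingerprint (here, just a SHA-256 hash)."""
    return hashlib.sha256(content).hexdigest()

def register_claim(content: bytes, claimant: str) -> None:
    """Anyone claiming a work adds its fingerprint to the blocklist."""
    claimed_works[fingerprint(content)] = claimant

def upload_is_blocked(content: bytes) -> bool:
    """Block any upload whose fingerprint matches a claimed work."""
    return fingerprint(content) in claimed_works

# A rightsholder (or anyone at all) registers a claim.
register_claim(b"Ode to Joy, as recorded by a major label", "Big Label")

# An infringer changes a single character and slips straight through.
print(upload_is_blocked(b"Ode to Joy, as recorded by a major 1abel"))  # False

# A critic quoting the work verbatim, lawfully, is blocked.
print(upload_is_blocked(b"Ode to Joy, as recorded by a major label"))  # True
```

Real filters use perceptual matching rather than exact hashes, but the Citizen Lab findings and the Content ID record described here suggest the same dynamic holds: the determined infringer adapts, and the legitimate user absorbs the false positives.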

The Chinese censorship system is attempting something far more modest than the EU's proposed copyright filters, checking for matches against a paltry few hundred thousand images with unimaginably deep pools of talent and money at its disposal, and it still can't make the filters work.

Neither can YouTube, as it turns out. For more than a decade, YouTube has operated an industry-leading copyright filter called "Content ID." Despite costing more than $60,000,000 to build, Content ID has not prevented copyright infringement—far from it! The failure of Content ID to catch infringing material has been the subject of frequent, sharp criticism from rightsholder groups, and indeed, has been cited as a major factor in the case for Article 13 and its copyright filters.

But that's not the only problem with Content ID: not only does it fail to catch copyright infringement, it also fails by falsely blocking works that don't infringe copyright. Content ID has falsely accused NASA of infringing when it uploaded its own Mars lander footage; it blocks classical pianists from posting their own performances (because Sony has claimed the whole catalogues of long-dead composers like Bach and Beethoven); it wipes out the whole audio track of scientific conference proceedings because the mics picked up the background music played in the hall during the lunch break; it even prevents you from posting silence because half a dozen companies have claimed to own nothing at all.

The thing is, Content ID is much less censorious than the Article 13 filters will be. Content ID only accepts mass submissions from a few "trusted rightsholders" (Article 13 filters will allow any or all of the internet's 3,000,000,000 users to claim copyright in works); it only checks videos (Article 13 filters will check text, audio, video, stills, code, etc); and it only affects a single service (Article 13 will affect all major services operating in Europe).

As creators, we often have cause to quote materials, both from the public domain and from copyrighted sources, under fair dealing. Our own works, then, are quite liable to trigger automated censorship from these filters, while actual infringers, who have plenty of time and motivation, dance around them as they have done with Content ID and as Chinese dissidents do with the country's social media filters.

If you happen to work for a giant media corporation, this may not be a problem. When I've had my books wrongfully taken down due to fraudulent copyright claims, I've been able to bring the might of Macmillan to bear to get them reinstated. But try calling Google or Facebook or Twitter in your individual capacity and asking them to task a human staffer with the thorny question of whether your photo really infringes copyright (say, because it captured an advertisement on the side of a bus featuring a copyrighted stock image that triggered a filter).

What's more, these filters aren't cheap. The $60 million price tag on Content ID is just for starters, for a system that only filters part of one media type. The table stakes for competing with the US tech giants in Europe are about to skyrocket to hundreds of millions of dollars, spent on filters that don't stop infringers but do interfere with our legitimate creativity.

Article 13 will disadvantage any creator who isn't sheltered under the wing of a large corporation, and it will reduce competition in the tech sector, which in turn worsens the kind of deal we can get if we try to go it alone. It gets us coming and going.

In the face of real problems, we have to work out real solutions, and we have to resist the temptation to adopt harmful responses that are pitched as solutions but only make the problem worse.

Seventeen years ago, some terrible people committed a terrorist atrocity on a scale never seen. In response to this genuine horror, the public and politicians demanded that Something Must Be Done. They should have been more specific.

In the seventeen years since the September 11th attacks, we've spent trillions on war and surveillance, eroded human rights and free speech, and still we fear terrorism. The security theater that followed 9/11 is a sterling example of the security syllogism: "Something must be done. There, I've done something."

The big entertainment and newspaper companies would be glad to have a few million directed their way from the coffers of the big tech companies, regardless of the consequences for the creators they claim to represent. But Articles 11 and 13 are a catastrophe for both competition and free expression, the two most important values for creators who want to speak freely and get paid for it.

Wanting it badly is not enough. If we allow ourselves to be stampeded into support for half-baked measures that line the pockets of big business and hope that the money will trickle down to us, we're digging ourselves even deeper into the hole. It's not too late to ask your MEPs to vote against this: visit Save Your Internet to contact them.