AI screenwriter
  • Content owners are wising up to their work being freely used by Big Tech to build new AI tools.
  • Bots like Common Crawl are scraping and storing billions of pages of content for AI training.
  • With less incentive to share online freely, the web could become a series of paywalled gardens. 

AI is undermining the web's grand bargain, and a decades-old handshake agreement is the only thing standing in its way.

A simple text file, robots.txt, was proposed in the mid-1990s as a way for websites to tell crawlers they don't want their data scraped and collected. It was widely adopted as one of the unofficial rules supporting the web.
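The convention is simple: a site lists rules in a robots.txt file, and well-behaved crawlers read them before fetching pages. As a sketch, the hypothetical rules below would block Common Crawl's bot (user agent "CCBot") while leaving the site open to everyone else; Python's standard library can parse such a file and check what a given crawler may fetch:

```python
from urllib import robotparser

# A hypothetical robots.txt: block Common Crawl's CCBot from the
# whole site, but allow all other crawlers everywhere.
ROBOTS_TXT = """\
User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

# Parse the rules and ask whether specific crawlers may fetch a page.
rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("CCBot", "https://example.com/article"))      # False: blocked
print(rp.can_fetch("Googlebot", "https://example.com/article"))  # True: allowed
```

The catch, of course, is that compliance is voluntary: nothing in the protocol forces a crawler to read the file, let alone obey it.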