Reddit to update web standard to block automated website scraping

To combat abuse by AI startups.

Reddit will update a web standard used by the platform to block automated data scraping from its website, following reports that AI startups were bypassing the rule to gather content for their systems. The move comes at a time when artificial intelligence firms have been accused of plagiarising ...
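The web standard referred to here is, by public accounts, the Robots Exclusion Protocol (robots.txt), which tells crawlers which parts of a site they may fetch. As a minimal sketch only, the Python snippet below shows how a compliant crawler is expected to consult a site's robots.txt before scraping a page, using the standard library's urllib.robotparser; the crawler name and target URL are illustrative assumptions, not Reddit's actual rules or any specific AI firm's bot.

    # Minimal sketch: a well-behaved crawler checking robots.txt
    # before fetching a page. User agent and URLs are hypothetical.
    from urllib import robotparser

    ROBOTS_URL = "https://www.reddit.com/robots.txt"
    USER_AGENT = "ExampleAICrawler"   # hypothetical crawler name
    TARGET_URL = "https://www.reddit.com/r/technology/"

    parser = robotparser.RobotFileParser()
    parser.set_url(ROBOTS_URL)
    parser.read()  # download and parse the site's robots.txt

    if parser.can_fetch(USER_AGENT, TARGET_URL):
        print("robots.txt permits fetching this URL for", USER_AGENT)
    else:
        print("robots.txt disallows fetching this URL for", USER_AGENT)

The point of the reported change is that robots.txt is a voluntary convention: it only constrains crawlers that perform a check like the one above, which is why Reddit is updating its rules after reports that some AI startups ignored them.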
