"Lobster" banned? Anthropic plans to charge extra for third-party tool calls


AI (artificial intelligence) giant Anthropic has dealt a heavy blow to developers who use OpenClaw ("lobster") together with its flagship model, Claude.

On April 3 local time, according to the technology website The Verge, Anthropic will soon block third-party integration tools such as OpenClaw from accessing Claude's subscription services. To continue using OpenClaw with Claude, users will need to enable a dedicated pay-as-you-go mode.

An email the company sent to users shows that, starting at 3:00 p.m. Eastern Time on April 4, users accessing Claude via OpenClaw will find that their existing subscription credits no longer apply.

This means Anthropic is cutting third-party wrappers, including OpenClaw, off from standard subscription bundles and forcing users onto a separate per-unit billing system. Under that model the cost per token is typically higher than under subscription-plan credits, so developers face costs that exceed their budgets and are difficult to predict.

For most users who pair OpenClaw with Claude, the change amounts to a de facto ban. Although Anthropic has not technically blocked OpenClaw's access, teams that have built complete workflows around OpenClaw and rely on it to call Claude through common interfaces will immediately feel both financial and operational pressure.

Public information shows that Anthropic was founded in 2021 by former OpenAI employees. Its products include the Claude series of large language models. Since its founding, the company has received investments from tech giants such as Amazon, Google, Nvidia, and Salesforce.

In response to the pricing change, many developers have voiced dissatisfaction on social media, complaining of platform instability and a crisis of trust. Some well-known AI developers noted that they chose Claude precisely because Anthropic seemed more willing than its competitors to cultivate a third-party ecosystem, and that this policy shift undermines that advantage.

Behind this lies increasingly fierce competition among AI companies. OpenClaw originally grew out of Claude, relied entirely on Claude for its intelligence, and even changed its name from "Clawdbot" to the now widely known "lobster" at Anthropic's request. OpenClaw's founder, Peter Steinberger, later joined OpenAI.

By raising the cost of using third-party tools, Anthropic is trying to steer users toward its own ecosystem. Earlier this year, Anthropic released a desktop agent app, Claude Cowork. Recently, Anthropic also launched its own “lobster,” announcing that users of its products Claude Code and Claude Cowork can let Claude control their computers—opening files, using a browser, and running development tools.

Not long ago, Anthropic also unexpectedly "open-sourced" 510,000 lines of source code from its coding assistant Claude Code: 4,756 source files, more than 40 tool modules, and multiple unreleased features were leaked. For a startup like Anthropic, which emphasizes "safety" and is actively pursuing an IPO, a source-code leak is a serious blow.


Responsible editor: Shi Xiuzhen SF183
