Unauthorized access hits Anthropic's Mythos AI model, sparking security concerns | News.az


News.az | 1 day ago

A group of unauthorized users gained access to Anthropic's artificial intelligence model Mythos, raising concerns about the security of the system, which is capable of supporting advanced cyber-related functions, News.Az reports, citing Bloomberg.

The report said a small group of users in a private online forum obtained access to Mythos on the same day Anthropic announced plans to make the model available to a limited number of companies for testing purposes.

Since gaining access, the group has reportedly been using the model regularly, though not for cybersecurity activities. Bloomberg, citing sources, said it was provided with screenshots and a live demonstration of the model's capabilities.

Anthropic has previously stated that Mythos is capable of identifying and exploiting vulnerabilities "in every major operating system and every major web browser when directed by a user to do so." Because of these capabilities, the company has restricted access to a select group of software providers for testing and security evaluation.

According to the report, the users allegedly gained entry through a combination of methods, including access via a third-party contractor and tools commonly used by cybersecurity researchers.

In response, an Anthropic spokesperson said the company is investigating the claims of unauthorized access involving a third-party vendor environment.

The company added that it has no evidence suggesting that the access reported by Bloomberg extended beyond the third-party vendor's environment or that any of Anthropic's internal systems have been affected.

Bloomberg also reported, citing sources, that the group involved appears to be interested in experimenting with new AI models rather than causing harm, and has not used Mythos for cybersecurity-related prompts.

Originally published by News.az
