Security
CVE-2023-29374

In LangChain through 0.0.131, the `LLMMathChain` chain allows prompt injection attacks that can execute arbitrary code via the Python `exec()` method.
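The vulnerability class is straightforward to illustrate. The sketch below is not LangChain's actual code; `vulnerable_math_chain` is a hypothetical stand-in for the pattern the advisory describes: model output interpolated into a string that is handed to `exec()`.

```python
# Minimal sketch of the vulnerable pattern (hypothetical, not LangChain's code):
# an LLM is asked to translate a math question into a Python expression,
# and the chain then executes that expression directly.

def vulnerable_math_chain(llm_output: str) -> str:
    # The chain trusts the model's output and executes it as Python.
    # A prompt-injected model can be coaxed into returning arbitrary code
    # such as "__import__('os').system('...')" instead of arithmetic.
    local_vars = {}
    exec(f"result = {llm_output}", {}, local_vars)  # UNSAFE: arbitrary code execution
    return str(local_vars["result"])

# Benign use: the model returns a plain arithmetic expression.
print(vulnerable_math_chain("2 * (3 + 4)"))  # prints 14

# Hostile use: a crafted prompt makes the model return code, not math.
# vulnerable_math_chain("__import__('os').system('id')")  # would run a shell command
```

Because the untrusted input here is the model's output rather than a user-typed string, conventional input sanitization on the prompt alone does not close the hole.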


Related news

GHSA-fprp-p869-w6q2: LangChain vulnerable to code injection

In LangChain through 0.0.131, the `LLMMathChain` chain allows prompt injection attacks that can execute arbitrary code via the Python `exec()` method.
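A common mitigation for this class of bug is to replace `exec()` with a restricted expression evaluator that only understands arithmetic (the upstream LangChain fix took a similar route, swapping direct execution for the `numexpr` expression evaluator). The stdlib-only sketch below shows the idea; `safe_eval` and its operator allow-list are illustrative, not LangChain's implementation.

```python
import ast
import operator

# A restricted evaluator: only arithmetic AST nodes are permitted, so
# injected code like __import__('os') is rejected instead of executed.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr: str) -> float:
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        # Anything else (calls, attributes, names) is refused outright.
        raise ValueError("disallowed expression")
    return walk(ast.parse(expr, mode="eval"))

print(safe_eval("2 * (3 + 4)"))  # prints 14
# safe_eval("__import__('os').system('id')")  # raises ValueError
```

The allow-list approach inverts the trust model: instead of trying to spot malicious model output, only a small, known-safe grammar is ever evaluated.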
