GHSA-gwqq-6vq7-5j86: langchain Code Injection vulnerability
An issue in Harrison Chase's langchain allows an attacker to execute arbitrary code via `PALChain.from_math_prompt(llm).run`, which passes model-generated code to Python's `exec`.
High severity • GitHub Reviewed • Published Aug 5, 2023 to the GitHub Advisory Database • Updated Aug 9, 2023
Related news
CVE-2023-36095 (langchain-ai/langchain)
An issue in Harrison Chase's langchain v0.0.194 allows an attacker to execute arbitrary code via `PALChain.from_math_prompt(llm).run`, which passes model-generated code to Python's `exec`.
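The injection pattern behind this advisory can be sketched without langchain itself. The code below is a minimal, hypothetical analogue (the `fake_llm` and `pal_style_run` names are illustrative, not langchain APIs): a PAL-style chain asks a model to write a Python program for a math question, then runs the program with `exec`. Because the user's "question" can steer the model into emitting arbitrary code, that code executes on the host.

```python
def fake_llm(prompt: str) -> str:
    # Stand-in for the llm argument: it echoes back whatever "program" was
    # smuggled into the question. A real model can be steered the same way
    # through prompt injection.
    return prompt.split("QUESTION:", 1)[1].strip()

def pal_style_run(llm, question: str):
    # Simplified analogue of PALChain.from_math_prompt(llm).run(question):
    # the model's output is executed verbatim, so exec() is the injection sink.
    program = llm(f"Write Python to solve this.\nQUESTION: {question}")
    scope: dict = {}
    exec(program, scope)  # attacker-influenced code runs here
    return scope.get("answer")

# A "question" that is really a program. Here it only reads the working
# directory to prove execution; a real payload could spawn a shell,
# exfiltrate files, etc.
payload = "import os\nanswer = os.getcwd()"
print(pal_style_run(fake_llm, payload))
```

Mitigations in later langchain releases moved this kind of chain behind an opt-in experimental package; regardless of version, model-generated code should never be passed to `exec` without sandboxing.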