This is the repository for the paper "An LLM-Assisted Easy-to-Trigger Poisoning Attack on Code Completion Models: Injecting Disguised Vulnerabilities against Strong Detection".
- Evasion Strategies: This folder contains the algorithms used to evade vulnerability analysis (a minimal illustrative sketch follows this list).
- CodeModel: This folder contains the attacks on code completion models.
- Test Cases: This folder contains 15 CWEs used to verify the effectiveness of our transformation strategies.
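
The sketch below is a simplified, hypothetical illustration of what "evading vulnerability analysis" means in this context; it is not code from the Evasion Strategies folder. It assumes a keyword-based static scanner and shows how a vulnerable Jinja2 call can be rewritten so the telltale `autoescape=False` token no longer appears literally in the source, while the runtime behavior is unchanged.

```python
# Hypothetical example, not the repo's actual transformation.
import jinja2

# Original form (easily flagged by a pattern-based scanner):
#   template = jinja2.Template(source, autoescape=False)

def render_unescaped(source, **context):
    # Build the keyword argument dynamically so the literal token
    # "autoescape=False" never appears in the source text.
    flag_name = "auto" + "escape"
    template = jinja2.Template(source, **{flag_name: False})
    return template.render(**context)

if __name__ == "__main__":
    # The payload still renders user input without escaping (XSS-prone).
    print(render_unescaped("<p>{{ name }}</p>", name="<script>alert(1)</script>"))
```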