Skip to content
#

ai-security-testing

Here are 4 public repositories matching this topic...

Language: All
Filter by language

PromptMe is an educational project that showcases security vulnerabilities in large language models (LLMs) and their web integrations. It includes 10 hands-on challenges inspired by the OWASP LLM Top 10, demonstrating how these vulnerabilities can be discovered and exploited in real-world scenarios.

  • Updated Jun 29, 2025
  • Python

Comprehensive LLM testing suite for safety, performance, bias, and compliance, equipped with methodologies and tools to enhance the reliability and ethical integrity of models like OpenAI's GPT series for real-world applications.

  • Updated Apr 15, 2024

Secure your code in seconds. VibeSafe is an AI-native DevSecOps CLI tool that detects vulnerabilities, secrets, insecure configs, and hallucinated dependencies before they ship.

  • Updated May 17, 2025
  • TypeScript

Improve this page

Add a description, image, and links to the ai-security-testing topic page so that developers can more easily learn about it.

Curate this topic

Add this topic to your repo

To associate your repository with the ai-security-testing topic, visit your repo's landing page and select "manage topics."

Learn more