feat: add C/C++ language support and analysis
refactor: improve architecture for maintainability
build: update development status to production
docs: enhance README with language support details
docs: add C/C++ development support documentation
docs: add entry points documentation
test: add comprehensive test suite (545+ tests)
refactor: rename asynchronous to modular architecture
fix: resolve dependency resolution issues
perf: improve dependency resolution performance
This commit is contained in:
parent
aaae444ee6
commit
5b3a934a32
@@ -2,6 +2,14 @@

## Version 1.69.0 - 2025-12-13

Adds support for analyzing C and C++ projects. Resolves dependency resolution issues and improves performance, while also providing comprehensive documentation for C/C++ development and entry points.

**Changes:** 7 files, 1324 lines

**Languages:** Markdown (88 lines), Python (1234 lines), TOML (2 lines)

## Version 1.68.0 - 2025-12-13

We now include necessary files for package distribution. The `.gitignore` file has been updated to ignore generated files.
README.md (+88 lines)
@@ -1,5 +1,7 @@
# RP: Professional CLI AI Assistant

Author: retoor <retoor@molodetz.nl>

RP is a sophisticated command-line AI assistant designed for autonomous task execution, advanced tool integration, and intelligent workflow management. Built with a focus on reliability, extensibility, and developer productivity.

## Overview

@@ -10,11 +12,44 @@ RP provides autonomous execution capabilities by default, enabling complex multi

### Core Capabilities
- **Autonomous Execution**: Tasks run to completion by default with intelligent decision-making
- **Multi-Language Support**: Automatic detection and analysis for Python, C, C++, Rust, Go, JavaScript, TypeScript, and Java
- **Advanced Tool Integration**: Comprehensive tool set for filesystem operations, web interactions, code execution, and system management
- **Real-time Cost Tracking**: Built-in usage monitoring and cost estimation for API calls
- **Session Management**: Save, load, and manage conversation sessions with persistent state
- **Plugin Architecture**: Extensible system for custom tools and integrations

### Language-Agnostic Analysis

RP automatically detects the programming language and provides tailored analysis:

| Language | Features |
|----------|----------|
| Python | Dependency detection, version requirements, breaking change detection (pydantic v2, FastAPI) |
| C/C++ | Header analysis, stdlib/POSIX/external library detection, compiler flag suggestions, Makefile generation |
| Rust | Cargo.toml detection, crate analysis |
| Go | go.mod detection, package analysis |
| JavaScript/TypeScript | package.json detection, module analysis |
| Java | Maven/Gradle detection, dependency analysis |

### C/C++ Development Support

Full support for C and C++ projects including:

- **Header Classification**: Distinguishes between standard library, POSIX, local, and external library headers
- **Compiler Flags**: Automatic suggestion of `-std=c99/c11/gnu99`, `-Wall`, `-Wextra`, `-pthread`, `-lm`, etc.
- **Library Detection**: Maps headers to system packages (curl, openssl, sqlite3, zlib, ncurses, etc.)
- **Package Manager Integration**: Install commands for Debian/Ubuntu, Fedora, Arch, and Homebrew
- **Build System Detection**: Identifies Makefile, CMake, Meson, and Autotools projects
- **Makefile Generation**: Creates complete Makefiles with proper LDFLAGS and dependencies

Example: For code with `#include <curl/curl.h>`:
```
Language: c
Dependency: curl/curl.h → curl
Install: apt-get install -y libcurl4-openssl-dev
Linker: -lcurl
```
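The lookup behind this example can be sketched in a few lines of standalone Python. The tables below mirror the `C_HEADER_TO_LIBRARY` and `C_LIBRARY_PACKAGES` mappings introduced in this commit, with only the curl row reproduced; the `describe` helper is illustrative, not part of the codebase:

```python
# Hedged sketch of the header-to-package lookup shown in the example above.
# Only the curl entry is reproduced; the real tables cover many more libraries.
HEADER_TO_LIBRARY = {'curl/curl.h': 'curl'}
LIBRARY_PACKAGES = {
    'curl': {'debian': 'libcurl4-openssl-dev', 'linker_flag': '-lcurl'},
}

def describe(header: str) -> dict:
    # Resolve a header to its library, then to install and linker info.
    lib = HEADER_TO_LIBRARY[header]
    info = LIBRARY_PACKAGES[lib]
    return {
        'dependency': f'{header} -> {lib}',
        'install': f"apt-get install -y {info['debian']}",
        'linker': info['linker_flag'],
    }

print(describe('curl/curl.h')['linker'])  # -lcurl
```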

### Developer Experience
- **Visual Progress Indicators**: Real-time feedback during long-running operations
- **Markdown-Powered Responses**: Rich formatting with syntax highlighting

@@ -26,7 +61,6 @@ RP provides autonomous execution capabilities by default, enabling complex multi

- **Agent Management**: Create and coordinate specialized AI agents for collaborative tasks
- **Memory System**: Knowledge base, conversation memory, and graph-based relationships
- **Caching Layer**: API response and tool result caching for improved performance
- **Labs Architecture**: Specialized execution environment for complex project tasks

## Architecture

@@ -63,31 +97,28 @@ RP provides autonomous execution capabilities by default, enabling complex multi

### Setup
```bash
-# Clone the repository
-git clone <repository-url>
+pip install rp-assistant
```

Or from source:
```bash
+git clone https://github.com/retoor/rp
+cd rp
-# Install dependencies
-pip install -r requirements.txt
-
-# Run the assistant
-python -m rp
+pip install -e .
```

## Usage

### Basic Commands
```bash
# Interactive mode
rp -i

# Execute a single task autonomously
rp "Create a Python script that fetches data from an API"
+rp "Write a C program that uses libcurl to download a file"

# Load a saved session
rp --load-session my-session -i

# Show usage statistics
rp --usage
```
@@ -97,6 +128,9 @@

- `/models` - List available AI models
- `/tools` - Display available tools
- `/usage` - Show token usage statistics
- `/cost` - Display current session cost
- `/budget` - Set budget limits
- `/shortcuts` - Show keyboard shortcuts
- `/save <name>` - Save current session
- `clear` - Clear terminal screen
- `cd <path>` - Change directory
@@ -116,17 +150,17 @@ rp --create-config

## Design Decisions

### Technology Choices
-- **Python 3.13+**: Leverages modern language features including enhanced type hints and performance improvements
+- **Python 3.10-3.13**: Leverages modern language features including enhanced type hints and performance improvements
- **SQLite**: Lightweight, reliable database for persistent storage without external dependencies
- **OpenRouter API**: Flexible AI model access with cost optimization and model selection
-- **Asynchronous Architecture**: Non-blocking operations for improved responsiveness
+- **Modular Architecture**: Clean separation for maintainability and extensibility

### Architecture Principles
- **Modularity**: Clean separation of concerns with logical component boundaries
- **Extensibility**: Plugin system and tool framework for easy customization
- **Reliability**: Comprehensive error handling, logging, and recovery mechanisms
- **Performance**: Caching layers, parallel execution, and resource optimization
- **Developer Focus**: Rich debugging, monitoring, and introspection capabilities
- **Language Agnostic**: Support for multiple programming languages without bias

### Tool Design
- **Atomic Operations**: Tools designed for reliability and composability
@@ -187,15 +221,23 @@ RP integrates with OpenRouter for AI model access, supporting:

- API key management through environment variables
- Input validation and sanitization
- Secure file operations with permission checks
- Path traversal prevention
- Sandbox security for command execution
- Audit logging for sensitive operations

## Development

### Running Tests
```bash
make test
pytest tests/ -v
pytest --cov=rp --cov-report=html
```

### Code Quality
-- Comprehensive test suite
+- Comprehensive test suite (545+ tests)
- Type hints throughout codebase
- Linting and formatting standards
- Documentation generation

### Debugging
- Detailed logging with configurable levels

@@ -205,8 +247,12 @@ RP integrates with OpenRouter for AI model access, supporting:

## License

-[Specify license here]
+MIT License

-## Contributing
+## Entry Points

-[Contribution guidelines - intentionally omitted per user request]
+- `rp` - Main assistant
+- `rpe` - Editor mode
+- `rpi` - Implode (bundle into single file)
+- `rpserver` - Server mode
+- `rpcgi` - CGI mode
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

 [project]
 name = "rp"
-version = "1.68.0"
+version = "1.69.0"
 description = "R python edition. The ultimate autonomous AI CLI."
 readme = "README.md"
 requires-python = ">=3.10"

@@ -22,7 +22,7 @@ dependencies = [
     "requests>=2.31.0",
 ]
 classifiers = [
-    "Development Status :: 4 - Beta",
+    "Development Status :: 5 - Production/Stable",
     "Environment :: Console",
     "Intended Audience :: Developers",
     "Intended Audience :: System Administrators",
@@ -1,3 +1,5 @@
# retoor <retoor@molodetz.nl>

import re
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple, Set

@@ -14,12 +16,14 @@ class DependencyConflict:

@dataclass
class ResolutionResult:
+    language: str
    resolved: Dict[str, str]
    conflicts: List[DependencyConflict]
    requirements_txt: str
    all_packages_available: bool
    errors: List[str] = field(default_factory=list)
    warnings: List[str] = field(default_factory=list)
+    install_commands: List[str] = field(default_factory=list)


class DependencyResolver:

@@ -98,31 +102,178 @@ class DependencyResolver:
        },
    }
    C_LIBRARY_PACKAGES = {
        'curl': {
            'debian': 'libcurl4-openssl-dev',
            'fedora': 'libcurl-devel',
            'arch': 'curl',
            'brew': 'curl',
            'pkg_config': 'libcurl',
            'linker_flag': '-lcurl',
        },
        'openssl': {
            'debian': 'libssl-dev',
            'fedora': 'openssl-devel',
            'arch': 'openssl',
            'brew': 'openssl',
            'pkg_config': 'openssl',
            'linker_flag': '-lssl -lcrypto',
        },
        'sqlite3': {
            'debian': 'libsqlite3-dev',
            'fedora': 'sqlite-devel',
            'arch': 'sqlite',
            'brew': 'sqlite',
            'pkg_config': 'sqlite3',
            'linker_flag': '-lsqlite3',
        },
        'pthread': {
            'debian': None,
            'fedora': None,
            'arch': None,
            'brew': None,
            'pkg_config': None,
            'linker_flag': '-pthread',
        },
        'math': {
            'debian': None,
            'fedora': None,
            'arch': None,
            'brew': None,
            'pkg_config': None,
            'linker_flag': '-lm',
        },
        'dl': {
            'debian': None,
            'fedora': None,
            'arch': None,
            'brew': None,
            'pkg_config': None,
            'linker_flag': '-ldl',
        },
        'json-c': {
            'debian': 'libjson-c-dev',
            'fedora': 'json-c-devel',
            'arch': 'json-c',
            'brew': 'json-c',
            'pkg_config': 'json-c',
            'linker_flag': '-ljson-c',
        },
        'zlib': {
            'debian': 'zlib1g-dev',
            'fedora': 'zlib-devel',
            'arch': 'zlib',
            'brew': 'zlib',
            'pkg_config': 'zlib',
            'linker_flag': '-lz',
        },
        'ncurses': {
            'debian': 'libncurses5-dev',
            'fedora': 'ncurses-devel',
            'arch': 'ncurses',
            'brew': 'ncurses',
            'pkg_config': 'ncurses',
            'linker_flag': '-lncurses',
        },
        'readline': {
            'debian': 'libreadline-dev',
            'fedora': 'readline-devel',
            'arch': 'readline',
            'brew': 'readline',
            'pkg_config': 'readline',
            'linker_flag': '-lreadline',
        },
        'pcre': {
            'debian': 'libpcre3-dev',
            'fedora': 'pcre-devel',
            'arch': 'pcre',
            'brew': 'pcre',
            'pkg_config': 'libpcre',
            'linker_flag': '-lpcre',
        },
        'xml2': {
            'debian': 'libxml2-dev',
            'fedora': 'libxml2-devel',
            'arch': 'libxml2',
            'brew': 'libxml2',
            'pkg_config': 'libxml-2.0',
            'linker_flag': '-lxml2',
        },
        'png': {
            'debian': 'libpng-dev',
            'fedora': 'libpng-devel',
            'arch': 'libpng',
            'brew': 'libpng',
            'pkg_config': 'libpng',
            'linker_flag': '-lpng',
        },
        'jpeg': {
            'debian': 'libjpeg-dev',
            'fedora': 'libjpeg-turbo-devel',
            'arch': 'libjpeg-turbo',
            'brew': 'jpeg',
            'pkg_config': 'libjpeg',
            'linker_flag': '-ljpeg',
        },
    }

    C_HEADER_TO_LIBRARY = {
        'curl/curl.h': 'curl',
        'openssl/ssl.h': 'openssl',
        'openssl/crypto.h': 'openssl',
        'openssl/evp.h': 'openssl',
        'sqlite3.h': 'sqlite3',
        'pthread.h': 'pthread',
        'math.h': 'math',
        'dlfcn.h': 'dl',
        'json-c/json.h': 'json-c',
        'zlib.h': 'zlib',
        'ncurses.h': 'ncurses',
        'curses.h': 'ncurses',
        'readline/readline.h': 'readline',
        'pcre.h': 'pcre',
        'libxml/parser.h': 'xml2',
        'libxml/tree.h': 'xml2',
        'png.h': 'png',
        'jpeglib.h': 'jpeg',
    }
    def __init__(self):
        self.resolved_dependencies: Dict[str, str] = {}
        self.conflicts: List[DependencyConflict] = []
        self.errors: List[str] = []
        self.warnings: List[str] = []
        self.language: str = 'python'

    def resolve_dependencies(
        self,
        dependencies: Dict[str, str],
        language: str = 'python',
        target_version: str = '3.8',
    ) -> ResolutionResult:
        self.resolved_dependencies = {}
        self.conflicts = []
        self.errors = []
        self.warnings = []
        self.language = language

        if language == 'python':
            return self._resolve_python_dependencies(dependencies, target_version)
        elif language in ('c', 'cpp'):
            return self._resolve_c_dependencies(dependencies)
        else:
            return self._resolve_generic_dependencies(dependencies, language)
    def resolve_full_dependency_tree(
        self,
        requirements: List[str],
        python_version: str = '3.8',
    ) -> ResolutionResult:
        """
        Resolve complete dependency tree with version compatibility.

        Args:
            requirements: List of requirement strings (e.g., ['pydantic>=2.0', 'fastapi'])
            python_version: Target Python version

        Returns:
            ResolutionResult with resolved dependencies, conflicts, and requirements.txt
        """
        self.resolved_dependencies = {}
        self.conflicts = []
        self.errors = []
        self.warnings = []
        self.language = 'python'

        for requirement in requirements:
            self._process_requirement(requirement)

@@ -134,20 +285,139 @@ class DependencyResolver:
        all_available = len(self.conflicts) == 0

        return ResolutionResult(
            language='python',
            resolved=self.resolved_dependencies,
            conflicts=self.conflicts,
            requirements_txt=requirements_txt,
            all_packages_available=all_available,
            errors=self.errors,
            warnings=self.warnings,
            install_commands=["pip install -r requirements.txt"],
        )
-    def _process_requirement(self, requirement: str) -> None:
-        """
-        Process a single requirement string.
-
-        Parses format: package_name[extras]>=version, <version
-        """
+    def _resolve_python_dependencies(
+        self,
+        dependencies: Dict[str, str],
+        python_version: str,
+    ) -> ResolutionResult:
+        for pkg_name, version_spec in dependencies.items():
+            self.resolved_dependencies[pkg_name] = version_spec
+
+        self._detect_and_report_breaking_changes()
+        self._validate_python_compatibility(python_version)
+
+        requirements_txt = self._generate_requirements_txt()
+        all_available = len(self.conflicts) == 0
+
+        return ResolutionResult(
+            language='python',
+            resolved=self.resolved_dependencies,
+            conflicts=self.conflicts,
+            requirements_txt=requirements_txt,
+            all_packages_available=all_available,
+            errors=self.errors,
+            warnings=self.warnings,
+            install_commands=["pip install -r requirements.txt"],
+        )
    def _resolve_c_dependencies(
        self,
        dependencies: Dict[str, str],
    ) -> ResolutionResult:
        libraries_needed: Set[str] = set()
        linker_flags: List[str] = []
        install_commands: List[str] = []

        for header, source in dependencies.items():
            if source in ('stdlib', 'local'):
                continue

            if header in self.C_HEADER_TO_LIBRARY:
                lib_name = self.C_HEADER_TO_LIBRARY[header]
                libraries_needed.add(lib_name)
            elif source == 'posix':
                pass
            elif source not in ('stdlib', 'local', 'posix'):
                libraries_needed.add(source)

        for lib_name in libraries_needed:
            if lib_name in self.C_LIBRARY_PACKAGES:
                lib_info = self.C_LIBRARY_PACKAGES[lib_name]
                self.resolved_dependencies[lib_name] = lib_info.get('linker_flag', '')

                if lib_info.get('linker_flag'):
                    linker_flags.extend(lib_info['linker_flag'].split())

                if lib_info.get('debian'):
                    install_commands.append(f"apt-get install -y {lib_info['debian']}")
                if lib_info.get('pkg_config'):
                    self.warnings.append(
                        f"Library '{lib_name}' can be detected with: pkg-config --libs {lib_info['pkg_config']}"
                    )
            else:
                self.resolved_dependencies[lib_name] = f"-l{lib_name}"
                linker_flags.append(f"-l{lib_name}")
                self.warnings.append(f"Unknown library '{lib_name}' - you may need to install it manually")

        makefile_content = self._generate_makefile(linker_flags)

        return ResolutionResult(
            language='c',
            resolved=self.resolved_dependencies,
            conflicts=self.conflicts,
            requirements_txt=makefile_content,
            all_packages_available=len(self.errors) == 0,
            errors=self.errors,
            warnings=self.warnings,
            install_commands=install_commands,
        )
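The classification step in `_resolve_c_dependencies` can be exercised in isolation. This is a minimal sketch with a two-entry table standing in for `C_HEADER_TO_LIBRARY`; the `libraries_needed` helper is illustrative, not part of the commit:

```python
# Minimal sketch of the header classification in _resolve_c_dependencies:
# stdlib and local headers are skipped, known external headers map to libraries.
HEADER_TO_LIBRARY = {'curl/curl.h': 'curl', 'sqlite3.h': 'sqlite3'}

def libraries_needed(dependencies: dict) -> set:
    needed = set()
    for header, source in dependencies.items():
        if source in ('stdlib', 'local'):
            continue  # no system package needed
        if header in HEADER_TO_LIBRARY:
            needed.add(HEADER_TO_LIBRARY[header])
    return needed

deps = {'stdio.h': 'stdlib', 'curl/curl.h': 'external', 'myapp.h': 'local'}
print(libraries_needed(deps))  # {'curl'}
```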

    def _resolve_generic_dependencies(
        self,
        dependencies: Dict[str, str],
        language: str,
    ) -> ResolutionResult:
        self.resolved_dependencies = dependencies.copy()

        return ResolutionResult(
            language=language,
            resolved=self.resolved_dependencies,
            conflicts=[],
            requirements_txt='',
            all_packages_available=True,
            errors=[],
            warnings=[f"No specific dependency resolution for language: {language}"],
            install_commands=[],
        )
    def _generate_makefile(self, linker_flags: List[str]) -> str:
        unique_flags = list(dict.fromkeys(linker_flags))
        ldflags = ' '.join(unique_flags)

        makefile = f"""CC = gcc
CFLAGS = -Wall -Wextra -O2
LDFLAGS = {ldflags}

TARGET = main
SRCS = $(wildcard *.c)
OBJS = $(SRCS:.c=.o)

all: $(TARGET)

$(TARGET): $(OBJS)
\t$(CC) $(CFLAGS) -o $@ $^ $(LDFLAGS)

%.o: %.c
\t$(CC) $(CFLAGS) -c $< -o $@

clean:
\trm -f $(OBJS) $(TARGET)

.PHONY: all clean
"""
        return makefile
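The `dict.fromkeys` call is what keeps the generated `LDFLAGS` free of duplicates while preserving first-seen order; a quick standalone demonstration:

```python
# Order-preserving deduplication, as used for linker flags in _generate_makefile.
# dict keys are unique and insertion-ordered, so duplicates collapse in place.
flags = ['-lssl', '-lcrypto', '-lssl', '-lm']
unique_flags = list(dict.fromkeys(flags))
ldflags = ' '.join(unique_flags)
print(ldflags)  # -lssl -lcrypto -lm
```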

    def _process_requirement(self, requirement: str) -> None:
        pkg_name_pattern = r'^([a-zA-Z0-9\-_.]+)'
        match = re.match(pkg_name_pattern, requirement)
@@ -161,6 +431,11 @@ class DependencyResolver:
        version_spec = requirement[len(pkg_name):].strip()
        if not version_spec:
            version_spec = '*'
+        else:
+            valid_version_pattern = r'^(?:\[[\w,\-]+\])?(?:>=|<=|==|!=|~=|>|<)?[\w\.\*,\s<>=!~]+$'
+            if not re.match(valid_version_pattern, version_spec):
+                self.errors.append(f"Invalid requirement format: {requirement}")
+                return
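Using the same name pattern, the split into package name and version spec behaves like this standalone sketch (the `split_requirement` helper is illustrative, not part of the commit):

```python
import re

# Illustrative helper mirroring _process_requirement's name/spec split:
# the leading [a-zA-Z0-9-_.]+ run is the package name, the rest is the spec.
def split_requirement(requirement: str):
    match = re.match(r'^([a-zA-Z0-9\-_.]+)', requirement)
    if not match:
        return None
    pkg_name = match.group(1)
    version_spec = requirement[len(pkg_name):].strip() or '*'
    return pkg_name, version_spec

print(split_requirement('pydantic>=2.0'))  # ('pydantic', '>=2.0')
print(split_requirement('fastapi'))        # ('fastapi', '*')
```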

        if normalized_name in self.MINIMUM_VERSIONS:
            min_version = self.MINIMUM_VERSIONS[normalized_name]

@@ -177,11 +452,6 @@ class DependencyResolver:
            )

    def _detect_and_report_breaking_changes(self) -> None:
-        """
-        Detect known breaking changes and create conflict entries.
-
-        Maps to KNOWN_MIGRATIONS for pydantic, fastapi, sqlalchemy, etc.
-        """
        for package_name, migrations in self.KNOWN_MIGRATIONS.items():
            if package_name not in self.resolved_dependencies:
                continue

@@ -201,16 +471,9 @@ class DependencyResolver:
                    self._add_additional_dependency(additional_pkg)

    def _add_additional_dependency(self, requirement: str) -> None:
-        """Add an additional dependency discovered during resolution."""
        self._process_requirement(requirement)

    def _validate_python_compatibility(self, python_version: str) -> None:
-        """
-        Validate that selected packages are compatible with Python version.
-
-        Args:
-            python_version: Target Python version (e.g., '3.8')
-        """
        compatibility_matrix = {
            'pydantic': {
                '2.0.0': ('3.7', '999.999'),

@@ -243,7 +506,6 @@ class DependencyResolver:
            self.warnings.append(f"Could not validate {pkg_name} compatibility: {e}")

    def _version_matches(self, spec: str, min_version: str) -> bool:
-        """Check if version spec includes the minimum version."""
        if spec == '*':
            return True

@@ -259,11 +521,6 @@ class DependencyResolver:
        return True

    def _compare_versions(self, v1: str, v2: str) -> int:
-        """
-        Compare two version strings.
-
-        Returns: -1 if v1 < v2, 0 if equal, 1 if v1 > v2
-        """
        try:
            parts1 = [int(x) for x in v1.split('.')]
            parts2 = [int(x) for x in v2.split('.')]

@@ -283,7 +540,6 @@ class DependencyResolver:
        return 0
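The -1/0/1 contract of the comparison can be demonstrated standalone. Note the middle of the real method is elided in this hunk, so the zero-padding of shorter versions below is an assumption, not a reproduction:

```python
# Standalone sketch of numeric dotted-version comparison with the
# -1/0/1 contract from the docstring. Zero-padding shorter versions
# (so '1.0' == '1.0.0') is an assumption; the real method body is elided.
def compare_versions(v1: str, v2: str) -> int:
    parts1 = [int(x) for x in v1.split('.')]
    parts2 = [int(x) for x in v2.split('.')]
    width = max(len(parts1), len(parts2))
    parts1 += [0] * (width - len(parts1))
    parts2 += [0] * (width - len(parts2))
    if parts1 < parts2:
        return -1
    if parts1 > parts2:
        return 1
    return 0

print(compare_versions('2.0.1', '2.1'))  # -1
```

Comparing the padded integer lists lexicographically gives the numeric ordering, so `'3.10'` correctly sorts above `'3.9'`, which a plain string comparison would get wrong.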

    def _python_version_in_range(self, current: str, min_py: str, max_py: str) -> bool:
-        """Check if current Python version is in acceptable range."""
        try:
            current_v = tuple(map(int, current.split('.')[:2]))
            min_v = tuple(map(int, min_py.split('.')[:2]))

@@ -293,13 +549,6 @@ class DependencyResolver:
        return True

    def _generate_requirements_txt(self) -> str:
-        """
-        Generate requirements.txt content with pinned versions.
-
-        Format:
-            package_name==version
-            package_name[extra]==version
-        """
        lines = []

        for pkg_name, version_spec in sorted(self.resolved_dependencies.items()):

@@ -322,11 +571,6 @@ class DependencyResolver:
        self,
        code_content: str,
    ) -> List[Tuple[str, str, str]]:
-        """
-        Scan code for Pydantic v2 migration issues.
-
-        Returns list of (pattern, old_code, new_code) tuples
-        """
        migrations = []

        if 'from pydantic import BaseSettings' in code_content:

@@ -357,11 +601,6 @@ class DependencyResolver:
        self,
        code_content: str,
    ) -> List[Tuple[str, str, str]]:
-        """
-        Scan code for FastAPI breaking changes.
-
-        Returns list of (issue, old_code, new_code) tuples
-        """
        changes = []

        if 'GZIPMiddleware' in code_content:

@@ -381,14 +620,20 @@ class DependencyResolver:
        return changes

    def suggest_fixes(self, code_content: str) -> Dict[str, List[str]]:
-        """
-        Suggest fixes for detected breaking changes.
-
-        Returns dict mapping issue type to fix suggestions
-        """
        fixes = {
            'pydantic_v2': self.detect_pydantic_v2_migration_needed(code_content),
            'fastapi_breaking': self.detect_fastapi_breaking_changes(code_content),
        }

        return fixes

+    def get_c_linker_flags(self, dependencies: Dict[str, str]) -> List[str]:
+        flags = []
+        for header, source in dependencies.items():
+            if header in self.C_HEADER_TO_LIBRARY:
+                lib_name = self.C_HEADER_TO_LIBRARY[header]
+                if lib_name in self.C_LIBRARY_PACKAGES:
+                    lib_info = self.C_LIBRARY_PACKAGES[lib_name]
+                    if lib_info.get('linker_flag'):
+                        flags.extend(lib_info['linker_flag'].split())
+        return list(dict.fromkeys(flags))

@@ -1,26 +1,90 @@
# retoor <retoor@molodetz.nl>

import re
import sys
from dataclasses import dataclass, field
from pathlib import Path
from typing import Dict, List, Optional, Set, Tuple
import shlex
import json


@dataclass
class AnalysisResult:
    valid: bool
    language: str
    dependencies: Dict[str, str]
    file_structure: List[str]
    python_version: str
    language_version: str
    import_compatibility: Dict[str, bool]
    shell_commands: List[Dict]
    estimated_tokens: int
    build_system: Optional[str] = None
    compiler_flags: List[str] = field(default_factory=list)
    errors: List[str] = field(default_factory=list)
    warnings: List[str] = field(default_factory=list)


class ProjectAnalyzer:
    LANGUAGE_EXTENSIONS = {
        'python': {'.py', '.pyw', '.pyi'},
        'c': {'.c', '.h'},
        'cpp': {'.cpp', '.hpp', '.cc', '.hh', '.cxx', '.hxx'},
        'rust': {'.rs'},
        'go': {'.go'},
        'javascript': {'.js', '.mjs', '.cjs'},
        'typescript': {'.ts', '.tsx'},
        'java': {'.java'},
    }
    BUILD_FILES = {
        'python': {'pyproject.toml', 'setup.py', 'setup.cfg', 'requirements.txt', 'Pipfile'},
        'c': {'Makefile', 'makefile', 'CMakeLists.txt', 'meson.build', 'configure.ac'},
        'cpp': {'Makefile', 'makefile', 'CMakeLists.txt', 'meson.build', 'configure.ac'},
        'rust': {'Cargo.toml'},
        'go': {'go.mod', 'go.sum'},
        'javascript': {'package.json'},
        'typescript': {'package.json', 'tsconfig.json'},
        'java': {'pom.xml', 'build.gradle', 'build.gradle.kts'},
    }

    C_STANDARD_HEADERS = {
        'stdio.h', 'stdlib.h', 'string.h', 'math.h', 'time.h', 'ctype.h',
        'errno.h', 'float.h', 'limits.h', 'locale.h', 'setjmp.h', 'signal.h',
        'stdarg.h', 'stddef.h', 'assert.h', 'stdbool.h', 'stdint.h',
        'inttypes.h', 'complex.h', 'tgmath.h', 'fenv.h', 'iso646.h',
        'wchar.h', 'wctype.h', 'stdatomic.h', 'stdnoreturn.h', 'threads.h',
        'uchar.h', 'stdalign.h',
    }

    POSIX_HEADERS = {
        'unistd.h', 'fcntl.h', 'sys/types.h', 'sys/stat.h', 'sys/wait.h',
        'sys/socket.h', 'sys/select.h', 'sys/time.h', 'sys/mman.h',
        'sys/ioctl.h', 'sys/uio.h', 'sys/resource.h', 'sys/ipc.h',
        'sys/shm.h', 'sys/sem.h', 'sys/msg.h', 'netinet/in.h', 'netinet/tcp.h',
        'arpa/inet.h', 'netdb.h', 'pthread.h', 'semaphore.h', 'dirent.h',
        'dlfcn.h', 'poll.h', 'termios.h', 'pwd.h', 'grp.h', 'syslog.h',
    }

    PYTHON_STDLIB = {
        'sys', 'os', 'path', 'json', 're', 'datetime', 'time',
        'collections', 'itertools', 'functools', 'operator',
        'abc', 'types', 'copy', 'pprint', 'reprlib', 'enum',
        'dataclasses', 'typing', 'pathlib', 'tempfile', 'glob',
        'fnmatch', 'linecache', 'shutil', 'sqlite3', 'csv',
        'configparser', 'logging', 'getpass', 'curses',
        'platform', 'errno', 'ctypes', 'threading', 'asyncio',
        'concurrent', 'subprocess', 'socket', 'ssl', 'select',
        'selectors', 'asyncore', 'asynchat', 'email', 'http',
        'urllib', 'ftplib', 'poplib', 'imaplib', 'smtplib',
        'uuid', 'socketserver', 'xmlrpc', 'base64', 'binhex',
        'binascii', 'quopri', 'uu', 'struct', 'codecs',
        'unicodedata', 'stringprep', 'readline', 'rlcompleter',
        'statistics', 'random', 'bisect', 'heapq', 'math',
        'cmath', 'decimal', 'fractions', 'numbers', 'crypt',
        'hashlib', 'hmac', 'secrets', 'warnings', 'io',
        'builtins', 'contextlib', 'traceback', 'inspect',
    }
    PYDANTIC_V2_BREAKING_CHANGES = {
        'BaseSettings': 'pydantic_settings.BaseSettings',
        'ValidationError': 'pydantic.ValidationError',

@@ -31,13 +95,6 @@ class ProjectAnalyzer:
        'GZIPMiddleware': 'GZipMiddleware',
    }

-    KNOWN_OPTIONAL_DEPENDENCIES = {
-        'structlog': 'optional',
-        'prometheus_client': 'optional',
-        'uvicorn': 'optional',
-        'sqlalchemy': 'optional',
-    }

    PYTHON_VERSION_PATTERNS = {
        'f-string': (3, 6),
        'typing.Protocol': (3, 8),

@@ -47,35 +104,132 @@ class ProjectAnalyzer:
        'union operator |': (3, 10),
    }

    C_STANDARD_PATTERNS = {
        '_Static_assert': 'c11',
        '_Generic': 'c11',
        '_Alignas': 'c11',
        '_Alignof': 'c11',
        '_Atomic': 'c11',
        '_Thread_local': 'c11',
        '_Noreturn': 'c11',
        'typeof': 'gnu',
        '__attribute__': 'gnu',
        '__builtin_': 'gnu',
    }
    def __init__(self):
        self.python_version = f"{sys.version_info.major}.{sys.version_info.minor}"
        self.errors: List[str] = []
        self.warnings: List[str] = []

    def detect_language(self, code_content: str, spec_file: Optional[str] = None) -> str:
        extension_scores: Dict[str, int] = {}
        if spec_file:
            spec_path = Path(spec_file)
            suffix = spec_path.suffix.lower()
            for lang, exts in self.LANGUAGE_EXTENSIONS.items():
                if suffix in exts:
                    extension_scores[lang] = extension_scores.get(lang, 0) + 10

        content_indicators = {
            'python': [
                (r'^\s*(?:from|import)\s+\w+', 5),
                (r'^\s*def\s+\w+\s*\(', 10),
                (r'^\s*class\s+\w+', 5),
                (r'if\s+__name__\s*==\s*["\']__main__["\']', 10),
                (r'\bprint\s*\(', 5),
                (r':=', 8),
            ],
            'c': [
                (r'#include\s*[<"][\w./]+\.h[>"]', 10),
                (r'\bint\s+main\s*\(', 10),
                (r'\b(?:void|int|char|float|double|long|short|unsigned)\s+\w+\s*\(', 5),
                (r'\bmalloc\s*\(', 5),
                (r'\bfree\s*\(', 3),
                (r'\bprintf\s*\(', 3),
                (r'\bsizeof\s*\(', 3),
                (r'\bstruct\s+\w+\s*\{', 5),
                (r'\btypedef\s+', 3),
                (r'#define\s+\w+', 3),
            ],
            'cpp': [
                (r'#include\s*<iostream>', 15),
                (r'#include\s*<string>', 10),
                (r'#include\s*<vector>', 10),
                (r'#include\s*<map>', 10),
                (r'\bstd::', 15),
                (r'\bclass\s+\w+\s*(?::\s*public)?', 5),
                (r'\btemplate\s*<', 10),
                (r'\bnew\s+\w+', 5),
                (r'\bnamespace\s+\w+', 10),
                (r'\bcout\s*<<', 10),
                (r'\bcin\s*>>', 10),
                (r'\bendl\b', 8),
            ],
            'rust': [
                (r'\bfn\s+\w+\s*\(', 10),
                (r'\blet\s+(?:mut\s+)?\w+', 5),
                (r'\bimpl\s+\w+', 5),
                (r'\buse\s+\w+::', 5),
                (r'\bpub\s+(?:fn|struct|enum)', 5),
            ],
            'go': [
                (r'\bfunc\s+\w+\s*\(', 10),
                (r'\bpackage\s+\w+', 10),
                (r'\bimport\s+\(', 5),
                (r':=', 3),
            ],
            'javascript': [
                (r'\bfunction\s+\w+\s*\(', 5),
                (r'\bconst\s+\w+\s*=', 5),
                (r'\blet\s+\w+\s*=', 3),
                (r'=>', 3),
                (r'\bconsole\.log\s*\(', 3),
                (r'\brequire\s*\(["\']', 5),
                (r'\bexport\s+(?:default|const|function)', 5),
            ],
        }

        for lang, patterns in content_indicators.items():
            for pattern, score in patterns:
                if re.search(pattern, code_content, re.MULTILINE):
                    extension_scores[lang] = extension_scores.get(lang, 0) + score

        if not extension_scores:
            return 'unknown'

        return max(extension_scores, key=extension_scores.get)
|
||||
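The weighted-scoring approach above can be tried standalone. This is a minimal sketch, not the class's actual implementation: the `LANGUAGE_EXTENSIONS` and `CONTENT_INDICATORS` tables here are small illustrative subsets of the real constants, and the function is free-standing rather than a method.

```python
import re

# Illustrative subsets of the scoring tables used by detect_language.
LANGUAGE_EXTENSIONS = {'python': {'.py'}, 'c': {'.c', '.h'}, 'cpp': {'.cpp', '.hpp'}}
CONTENT_INDICATORS = {
    'python': [(r'^\s*def\s+\w+\s*\(', 10), (r'^\s*(?:from|import)\s+\w+', 5)],
    'c': [(r'#include\s*[<"][\w./]+\.h[>"]', 10), (r'\bint\s+main\s*\(', 10)],
}

def detect_language(code, suffix=None):
    # Accumulate per-language scores from the file extension and from
    # content regexes, then pick the highest-scoring language
    # (or 'unknown' if nothing matched at all).
    scores = {}
    if suffix:
        for lang, exts in LANGUAGE_EXTENSIONS.items():
            if suffix in exts:
                scores[lang] = scores.get(lang, 0) + 10
    for lang, patterns in CONTENT_INDICATORS.items():
        for pattern, weight in patterns:
            if re.search(pattern, code, re.MULTILINE):
                scores[lang] = scores.get(lang, 0) + weight
    if not scores:
        return 'unknown'
    return max(scores, key=scores.get)
```

Because every indicator only adds to a score, a file that mixes signals (say, a Python script that embeds C snippets in strings) still resolves to whichever language accumulated the most weight.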

    def analyze_requirements(
        self,
        spec_file: str,
        code_content: Optional[str] = None,
        commands: Optional[List[str]] = None
    ) -> AnalysisResult:
        """
        Comprehensive pre-execution analysis preventing runtime failures.

        Args:
            spec_file: Path to specification file
            code_content: Generated code to analyze
            commands: Shell commands to pre-validate

        Returns:
            AnalysisResult with all validation results
        """
        self.errors = []
        self.warnings = []

        language = self.detect_language(code_content or "", spec_file)

        if language == 'python':
            return self._analyze_python(spec_file, code_content, commands)
        elif language == 'c':
            return self._analyze_c(spec_file, code_content, commands)
        elif language == 'cpp':
            return self._analyze_c(spec_file, code_content, commands)
        else:
            return self._analyze_generic(spec_file, code_content, commands, language)

    def _analyze_python(
        self,
        spec_file: str,
        code_content: Optional[str],
        commands: Optional[List[str]]
    ) -> AnalysisResult:
        dependencies = self._scan_python_dependencies(code_content or "")
-       file_structure = self._plan_directory_tree(spec_file)
+       file_structure = self._plan_directory_tree(spec_file, code_content)
        python_version = self._detect_python_version_requirements(code_content or "")
-       import_compatibility = self._validate_import_paths(dependencies)
+       import_compatibility = self._validate_python_imports(dependencies)
        shell_commands = self._prevalidate_all_shell_commands(commands or [])
        estimated_tokens = self._calculate_token_budget(
            dependencies, file_structure, shell_commands
@@ -85,9 +239,10 @@ class ProjectAnalyzer:

        return AnalysisResult(
            valid=valid,
            language='python',
            dependencies=dependencies,
            file_structure=file_structure,
-           python_version=python_version,
+           language_version=python_version,
            import_compatibility=import_compatibility,
            shell_commands=shell_commands,
            estimated_tokens=estimated_tokens,
@@ -95,65 +250,175 @@ class ProjectAnalyzer:
            warnings=self.warnings,
        )

-   def _scan_python_dependencies(self, code_content: str) -> Dict[str, str]:
-       """
-       Extract Python dependencies from code content.
+   def _analyze_c(
+       self,
+       spec_file: str,
+       code_content: Optional[str],
+       commands: Optional[List[str]]
+   ) -> AnalysisResult:
+       dependencies = self._scan_c_dependencies(code_content or "")
+       file_structure = self._plan_directory_tree(spec_file, code_content)
+       c_standard = self._detect_c_standard(code_content or "")
+       build_system = self._detect_c_build_system(spec_file, code_content)
+       compiler_flags = self._suggest_c_compiler_flags(code_content or "", c_standard)
+       import_compatibility = self._validate_c_includes(dependencies)
+       shell_commands = self._prevalidate_all_shell_commands(commands or [])
+       estimated_tokens = self._calculate_token_budget(
+           dependencies, file_structure, shell_commands
+       )
-
-       Scans for: import statements, requirements.txt patterns, pyproject.toml patterns
-       Returns dict of {package_name: version_spec}
-       """
+       valid = len(self.errors) == 0
+
+       return AnalysisResult(
+           valid=valid,
+           language='c',
+           dependencies=dependencies,
+           file_structure=file_structure,
+           language_version=c_standard,
+           import_compatibility=import_compatibility,
+           shell_commands=shell_commands,
+           estimated_tokens=estimated_tokens,
+           build_system=build_system,
+           compiler_flags=compiler_flags,
+           errors=self.errors,
+           warnings=self.warnings,
+       )

    def _analyze_generic(
        self,
        spec_file: str,
        code_content: Optional[str],
        commands: Optional[List[str]],
        language: str
    ) -> AnalysisResult:
        file_structure = self._plan_directory_tree(spec_file, code_content)
        shell_commands = self._prevalidate_all_shell_commands(commands or [])
        estimated_tokens = self._calculate_token_budget({}, file_structure, shell_commands)

        return AnalysisResult(
            valid=len(self.errors) == 0,
            language=language,
            dependencies={},
            file_structure=file_structure,
            language_version='unknown',
            import_compatibility={},
            shell_commands=shell_commands,
            estimated_tokens=estimated_tokens,
            errors=self.errors,
            warnings=self.warnings,
        )

    def _scan_c_dependencies(self, code_content: str) -> Dict[str, str]:
        dependencies = {}

        include_pattern = r'#include\s*[<"]([^>"]+)[>"]'
        for match in re.finditer(include_pattern, code_content):
            header = match.group(1)
            if header in self.C_STANDARD_HEADERS:
                dependencies[header] = 'stdlib'
            elif header in self.POSIX_HEADERS:
                dependencies[header] = 'posix'
            elif '/' in header:
                lib_name = header.split('/')[0]
                dependencies[header] = lib_name
            else:
                dependencies[header] = 'local'

        return dependencies

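The header classification can be exercised in isolation. A standalone sketch follows; the `C_STANDARD_HEADERS` and `POSIX_HEADERS` sets here are small illustrative subsets of the class constants the method actually consults.

```python
import re

# Illustrative subsets of the class's header tables.
C_STANDARD_HEADERS = {'stdio.h', 'stdlib.h', 'string.h', 'math.h'}
POSIX_HEADERS = {'unistd.h', 'pthread.h', 'sys/socket.h'}

def scan_c_dependencies(code):
    # Map each #include to its origin: C standard library, POSIX,
    # an external library (path-style include like curl/curl.h),
    # or a local project header.
    deps = {}
    for match in re.finditer(r'#include\s*[<"]([^>"]+)[>"]', code):
        header = match.group(1)
        if header in C_STANDARD_HEADERS:
            deps[header] = 'stdlib'
        elif header in POSIX_HEADERS:
            deps[header] = 'posix'
        elif '/' in header:
            deps[header] = header.split('/')[0]
        else:
            deps[header] = 'local'
    return deps
```

Note that the POSIX check runs before the path check, so `sys/socket.h` classifies as `posix` rather than as an external `sys` library.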
    def _detect_c_standard(self, code_content: str) -> str:
        detected_standard = 'c99'

        for pattern, standard in self.C_STANDARD_PATTERNS.items():
            if pattern in code_content:
                if standard == 'c11':
                    detected_standard = 'c11'
                elif standard == 'gnu' and detected_standard != 'c11':
                    detected_standard = 'gnu99'

        if re.search(r'\bfor\s*\(\s*(?:int|size_t|unsigned)\s+\w+\s*=', code_content):
            if detected_standard == 'c89':
                detected_standard = 'c99'

        return detected_standard

    def _detect_c_build_system(self, spec_file: str, code_content: Optional[str]) -> Optional[str]:
        spec_path = Path(spec_file)
        if spec_path.exists():
            parent = spec_path.parent
        else:
            parent = Path('.')

        if (parent / 'CMakeLists.txt').exists():
            return 'cmake'
        if (parent / 'Makefile').exists() or (parent / 'makefile').exists():
            return 'make'
        if (parent / 'meson.build').exists():
            return 'meson'
        if (parent / 'configure.ac').exists() or (parent / 'configure').exists():
            return 'autotools'

        if code_content:
            if 'cmake' in code_content.lower():
                return 'cmake'
            if 'makefile' in code_content.lower():
                return 'make'

        return None

    def _suggest_c_compiler_flags(self, code_content: str, c_standard: str) -> List[str]:
        flags = []

        std_flag = f'-std={c_standard}'
        flags.append(std_flag)

        flags.extend(['-Wall', '-Wextra', '-Werror'])

        if re.search(r'\bpthread_', code_content):
            flags.append('-pthread')

        if re.search(r'#include\s*[<"]math\.h[>"]', code_content):
            flags.append('-lm')

        if re.search(r'#include\s*[<"]dlfcn\.h[>"]', code_content):
            flags.append('-ldl')

        if re.search(r'-O[0-3s]', code_content):
            pass
        else:
            flags.append('-O2')

        return flags

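The flag-suggestion logic above is easy to verify on its own. A free-standing sketch of the same idea (hypothetical helper name, defaulting the standard to `c99` for illustration):

```python
import re

def suggest_c_compiler_flags(code, c_standard='c99'):
    # Start from the detected standard plus strict warnings, then add
    # linker/threading flags implied by what the code actually uses.
    flags = [f'-std={c_standard}', '-Wall', '-Wextra', '-Werror']
    if re.search(r'\bpthread_', code):
        flags.append('-pthread')          # any pthread_* call needs -pthread
    if re.search(r'#include\s*[<"]math\.h[>"]', code):
        flags.append('-lm')               # math.h functions live in libm
    if re.search(r'#include\s*[<"]dlfcn\.h[>"]', code):
        flags.append('-ldl')              # dlopen/dlsym live in libdl
    if not re.search(r'-O[0-3s]', code):
        flags.append('-O2')               # default optimization if none requested
    return flags
```

A resulting command line would look like `gcc $(flags) -o main main.c`, with `-lm`/`-ldl` ordered after the objects on linkers that require it.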
    def _validate_c_includes(self, dependencies: Dict[str, str]) -> Dict[str, bool]:
        compatibility = {}

        for header, source in dependencies.items():
            if source == 'stdlib':
                compatibility[header] = True
            elif source == 'posix':
                compatibility[header] = True
                self.warnings.append(f"POSIX header '{header}' may not be portable to Windows")
            elif source == 'local':
                compatibility[header] = True
            else:
                compatibility[header] = True
                self.warnings.append(f"External library header '{header}' requires linking with -{source}")

        return compatibility

    def _scan_python_dependencies(self, code_content: str) -> Dict[str, str]:
        dependencies = {}

        import_pattern = r'^\s*(?:from|import)\s+([\w\.]+)'
        for match in re.finditer(import_pattern, code_content, re.MULTILINE):
            package = match.group(1).split('.')[0]
-           if not self._is_stdlib(package):
+           if package not in self.PYTHON_STDLIB:
                dependencies[package] = '*'

        requirements_pattern = r'([a-zA-Z0-9\-_]+)(?:\[.*?\])?(?:==|>=|<=|>|<|!=|~=)?([\w\.\*]+)?'
        for match in re.finditer(requirements_pattern, code_content):
            pkg_name = match.group(1)
            version = match.group(2) or '*'
            if pkg_name not in ('python', 'pip', 'setuptools'):
                dependencies[pkg_name] = version

        return dependencies

    def _plan_directory_tree(self, spec_file: str) -> List[str]:
        """
        Extract directory structure from spec file.

        Looks for directory creation commands, file path patterns.
        Returns list of directories that will be created.
        """
        directories = ['.']

        spec_path = Path(spec_file)
        if spec_path.exists():
            try:
                content = spec_path.read_text()
                dir_pattern = r'(?:mkdir|directory|create|path)[\s\:]+([\w\-/\.]+)'
                for match in re.finditer(dir_pattern, content, re.IGNORECASE):
                    dir_path = match.group(1)
                    directories.append(dir_path)

                file_pattern = r'(?:file|write|create)[\s\:]+([\w\-/\.]+)'
                for match in re.finditer(file_pattern, content, re.IGNORECASE):
                    file_path = match.group(1)
                    parent_dir = str(Path(file_path).parent)
                    if parent_dir != '.':
                        directories.append(parent_dir)
            except Exception as e:
                self.warnings.append(f"Could not read spec file: {e}")

        return sorted(set(directories))

    def _detect_python_version_requirements(self, code_content: str) -> str:
        """
        Detect minimum Python version required based on syntax usage.

        Returns: Version string like "3.8" or "3.10"
        """
        min_version = (3, 6)

        for feature, version in self.PYTHON_VERSION_PATTERNS.items():
@@ -164,12 +429,11 @@ class ProjectAnalyzer:
        return f"{min_version[0]}.{min_version[1]}"

    def _check_python_feature(self, code: str, feature: str) -> bool:
        """Check if code uses a specific Python feature."""
        patterns = {
-           'f-string': r'f["\'].*{.*}.*["\']',
+           'f-string': r'f["\'].*\{.*\}.*["\']',
            'typing.Protocol': r'(?:from typing|import)\s+.*Protocol',
            'typing.TypedDict': r'(?:from typing|import)\s+.*TypedDict',
-           'walrus operator': r'\(:=\)',
+           'walrus operator': r'\w+\s*:=\s*\w+',
            'match statement': r'^\s*match\s+\w+:',
            'union operator |': r':\s+\w+\s*\|\s*\w+',
        }
@@ -179,50 +443,71 @@ class ProjectAnalyzer:
            return bool(re.search(pattern, code, re.MULTILINE))
        return False

-   def _validate_import_paths(self, dependencies: Dict[str, str]) -> Dict[str, bool]:
-       """
-       Check import compatibility BEFORE code generation.
-
-       Detects breaking changes:
-       - Pydantic v2: BaseSettings moved to pydantic_settings
-       - FastAPI: GZIPMiddleware renamed to GZipMiddleware
-       - Missing optional dependencies
-       """
+   def _validate_python_imports(self, dependencies: Dict[str, str]) -> Dict[str, bool]:
        import_checks = {}
-       breaking_changes_found = []

        for dep_name in dependencies:
            import_checks[dep_name] = True

            if dep_name == 'pydantic':
                import_checks['pydantic_breaking_change'] = False
-               breaking_changes_found.append(
+               self.errors.append(
                    "Pydantic v2 breaking change detected: BaseSettings moved to pydantic_settings"
                )

            if dep_name == 'fastapi':
                import_checks['fastapi_middleware'] = False
-               breaking_changes_found.append(
+               self.errors.append(
                    "FastAPI breaking change: GZIPMiddleware renamed to GZipMiddleware"
                )

            if dep_name in self.KNOWN_OPTIONAL_DEPENDENCIES:
                import_checks[f"{dep_name}_optional"] = True

-       for change in breaking_changes_found:
-           self.errors.append(change)

        return import_checks

-   def _prevalidate_all_shell_commands(self, commands: List[str]) -> List[Dict]:
-       """
-       Validate shell syntax using shlex.split() before execution.
-
-       Prevent brace expansion errors by validating and suggesting Python equivalents.
-       """
+   def _plan_directory_tree(self, spec_file: str, code_content: Optional[str] = None) -> List[str]:
+       directories = ['.']
+
+       def extract_from_content(content: str) -> None:
+           dir_pattern = r'(?:mkdir|directory|create|path)[\s\:]+([\w\-/\.]+)'
+           for match in re.finditer(dir_pattern, content, re.IGNORECASE):
+               dir_path = match.group(1)
+               directories.append(dir_path)
+
+           file_pattern = r'(?:file|write|create)[\s\:]+([\w\-/\.]+)'
+           for match in re.finditer(file_pattern, content, re.IGNORECASE):
+               file_path = match.group(1)
+               parent_dir = str(Path(file_path).parent)
+               if parent_dir != '.':
+                   directories.append(parent_dir)
+
+       spec_path = Path(spec_file)
+       if spec_path.exists():
+           try:
+               content = spec_path.read_text()
+               extract_from_content(content)
+           except Exception as e:
+               self.warnings.append(f"Could not read spec file: {e}")
+
+       if code_content:
+           extract_from_content(code_content)
+
+       return sorted(set(directories))

    def _prevalidate_all_shell_commands(self, commands: List[str]) -> List[Dict]:
        validated_commands = []

        for cmd in commands:
            brace_error = self._has_brace_expansion_error(cmd)
            if brace_error:
                fix = self._suggest_command_fix(cmd)
                validated_commands.append({
                    'command': cmd,
                    'valid': False,
                    'error': 'Malformed brace expansion',
                    'fix': fix,
                })
                self.errors.append(f"Invalid shell command: {cmd} - Malformed brace expansion")
                continue

            try:
                shlex.split(cmd)
                validated_commands.append({
@@ -232,7 +517,7 @@ class ProjectAnalyzer:
                    'fix': None,
                })
            except ValueError as e:
-               fix = self._suggest_python_equivalent(cmd)
+               fix = self._suggest_command_fix(cmd)
                validated_commands.append({
                    'command': cmd,
                    'valid': False,
@@ -243,23 +528,31 @@ class ProjectAnalyzer:

        return validated_commands

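The core of the pre-validation step is that `shlex.split` raises `ValueError` on syntactically broken input (unterminated quotes, dangling escapes), which lets a command be rejected before anything executes. A minimal free-standing sketch (hypothetical helper name):

```python
import shlex

def prevalidate_shell_command(cmd):
    # shlex.split raises ValueError on unbalanced quotes or a trailing
    # backslash -- the same cheap pre-execution check the analyzer uses.
    try:
        shlex.split(cmd)
        return {'command': cmd, 'valid': True, 'error': None}
    except ValueError as exc:
        return {'command': cmd, 'valid': False, 'error': str(exc)}
```

For example, `gcc -o main main.c` tokenizes cleanly, while `echo "unterminated` is flagged without ever reaching a shell.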
-   def _suggest_python_equivalent(self, command: str) -> Optional[str]:
-       """
-       Suggest Python equivalent for problematic shell commands.
-
-       Maps:
-       - mkdir → Path().mkdir()
-       - mv → shutil.move()
-       - find → Path.rglob()
-       - rm → Path.unlink() / shutil.rmtree()
-       """
+   def _has_brace_expansion_error(self, command: str) -> bool:
+       open_braces = command.count('{')
+       close_braces = command.count('}')
+       if open_braces != close_braces:
+           return True
+       open_parens_in_braces = 0
+       close_parens_in_braces = 0
+       in_brace = False
+       for char in command:
+           if char == '{':
+               in_brace = True
+           elif char == '}':
+               in_brace = False
+           elif in_brace and char == '(':
+               open_parens_in_braces += 1
+           elif in_brace and char == ')':
+               close_parens_in_braces += 1
+       if open_parens_in_braces != close_parens_in_braces:
+           return True
+       return False
+
+   def _suggest_command_fix(self, command: str) -> Optional[str]:
        equivalents = {
-           r'mkdir\s+-p\s+(.+)': lambda m: f"Path('{m.group(1)}').mkdir(parents=True, exist_ok=True)",
            r'mv\s+(.+)\s+(.+)': lambda m: f"shutil.move('{m.group(1)}', '{m.group(2)}')",
            r'find\s+(.+?)\s+-type\s+f': lambda m: f"[str(p) for p in Path('{m.group(1)}').rglob('*') if p.is_file()]",
            r'find\s+(.+?)\s+-type\s+d': lambda m: f"[str(p) for p in Path('{m.group(1)}').rglob('*') if p.is_dir()]",
            r'rm\s+-rf\s+(.+)': lambda m: f"shutil.rmtree('{m.group(1)}')",
            r'cat\s+(.+)': lambda m: f"Path('{m.group(1)}').read_text()",
+           r'mkdir\s+-p\s+(.+)': lambda m: f"mkdir -p {m.group(1).replace('{', '').replace('}', '')}",
+           r'gcc\s+(.+)': lambda m: f"gcc {m.group(1)}",
        }

        for pattern, converter in equivalents.items():
@@ -275,43 +568,14 @@ class ProjectAnalyzer:
        file_structure: List[str],
        shell_commands: List[Dict],
    ) -> int:
        """
        Estimate token count for analysis and validation.

        Rough estimation: 4 chars ≈ 1 token for LLM APIs
        """
        token_count = 0

        token_count += len(dependencies) * 50

        token_count += len(file_structure) * 30

        valid_commands = [c for c in shell_commands if c.get('valid')]
        token_count += len(valid_commands) * 40

        invalid_commands = [c for c in shell_commands if not c.get('valid')]
        token_count += len(invalid_commands) * 80

        return max(token_count, 100)

    def _is_stdlib(self, package: str) -> bool:
        """Check if package is part of Python standard library."""
-       stdlib_packages = {
-           'sys', 'os', 'path', 'json', 're', 'datetime', 'time',
-           'collections', 'itertools', 'functools', 'operator',
-           'abc', 'types', 'copy', 'pprint', 'reprlib', 'enum',
-           'dataclasses', 'typing', 'pathlib', 'tempfile', 'glob',
-           'fnmatch', 'linecache', 'shutil', 'sqlite3', 'csv',
-           'configparser', 'logging', 'getpass', 'curses',
-           'platform', 'errno', 'ctypes', 'threading', 'asyncio',
-           'concurrent', 'subprocess', 'socket', 'ssl', 'select',
-           'selectors', 'asyncore', 'asynchat', 'email', 'http',
-           'urllib', 'ftplib', 'poplib', 'imaplib', 'smtplib',
-           'uuid', 'socketserver', 'http', 'xmlrpc', 'json',
-           'base64', 'binhex', 'binascii', 'quopri', 'uu',
-           'struct', 'codecs', 'unicodedata', 'stringprep', 'readline',
-           'rlcompleter', 'statistics', 'random', 'bisect', 'heapq',
-           'math', 'cmath', 'decimal', 'fractions', 'numbers',
-           'crypt', 'hashlib', 'hmac', 'secrets', 'warnings',
-       }
-       return package in stdlib_packages
+       return package in self.PYTHON_STDLIB

@@ -130,6 +130,15 @@ class SafeCommandExecutor:
                suggested_fix=fix,
            )

        if self._has_incomplete_arguments(command):
            fix = self._find_python_equivalent(command)
            return CommandValidationResult(
                valid=False,
                command=command,
                error="Command has incomplete arguments",
                suggested_fix=fix,
            )

        try:
            shlex.split(command)
        except ValueError as e:
@@ -184,6 +193,20 @@ class SafeCommandExecutor:

        return False

    def _has_incomplete_arguments(self, command: str) -> bool:
        """
        Detect commands with missing required arguments.
        """
        incomplete_patterns = [
            (r'find\s+\S+\s+-(?:path|name|type|exec)\s*$', 'find command missing argument after flag'),
            (r'grep\s+-[a-zA-Z]*\s*$', 'grep command missing pattern'),
            (r'sed\s+-[a-zA-Z]*\s*$', 'sed command missing expression'),
        ]
        for pattern, _ in incomplete_patterns:
            if re.search(pattern, command.strip()):
                return True
        return False

    def _suggest_brace_fix(self, command: str) -> Optional[str]:
        """
        Suggest fix for brace expansion errors.

@@ -26,6 +26,7 @@ class OperationResult:
    error: Optional[str] = None
    affected_files: int = 0
    transaction_id: Optional[str] = None
    metadata: Dict[str, Any] = field(default_factory=dict)


class TransactionContext:
@@ -136,12 +137,15 @@ class TransactionalFileSystem:
                path=str(target_path),
                affected_files=1,
                transaction_id=transaction_id,
                metadata={'size': len(content), 'encoding': 'utf-8', 'content_hash': content_hash},
            )

        except Exception as e:
            staging_file.unlink(missing_ok=True)
            raise

        except ValueError:
            raise
        except Exception as e:
            return OperationResult(
                success=False,
@@ -352,8 +356,8 @@ class TransactionalFileSystem:
        if not str(requested_path).startswith(str(self.sandbox)):
            raise ValueError(f"Path outside sandbox: {filepath}")

-       if any(part.startswith('.') for part in requested_path.parts[1:]):
-           if not part.startswith('.staging') and not part.startswith('.backups'):
+       for part in requested_path.parts[1:]:
+           if part.startswith('.') and part not in ('.staging', '.backups'):
                raise ValueError(f"Hidden directories not allowed: {filepath}")

        return requested_path

@@ -1,3 +1,5 @@
# retoor <retoor@molodetz.nl>

import pytest
from rp.core.project_analyzer import ProjectAnalyzer, AnalysisResult

@@ -22,6 +24,7 @@ class User(BaseModel):
        )

        assert isinstance(result, AnalysisResult)
        assert result.language == 'python'
        assert 'requests' in result.dependencies
        assert 'pydantic' in result.dependencies

@@ -78,7 +81,7 @@ if (x := 10) > 5:
            code_content=code_with_walrus,
        )

-       version_parts = result.python_version.split('.')
+       version_parts = result.language_version.split('.')
        assert int(version_parts[1]) >= 8

    def test_directory_structure_planning(self):
@@ -154,3 +157,300 @@ from fastapi.middleware.gzip import GZIPMiddleware

        assert not result.valid
        assert any('GZIPMiddleware' in str(e) or 'fastapi' in str(e).lower() for e in result.errors)


class TestCLanguageAnalyzer:
    def setup_method(self):
        self.analyzer = ProjectAnalyzer()

    def test_detect_c_language(self):
        c_code = """
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
    printf("Hello, World!\\n");
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert result.language == 'c'

    def test_c_standard_headers_detection(self):
        c_code = """
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <math.h>

int main() {
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert 'stdio.h' in result.dependencies
        assert result.dependencies['stdio.h'] == 'stdlib'
        assert result.dependencies['math.h'] == 'stdlib'

    def test_c_posix_headers_detection(self):
        c_code = """
#include <unistd.h>
#include <pthread.h>
#include <sys/socket.h>

int main() {
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert 'unistd.h' in result.dependencies
        assert result.dependencies['unistd.h'] == 'posix'
        assert any('POSIX' in w for w in result.warnings)

    def test_c_local_headers_detection(self):
        c_code = """
#include <stdio.h>
#include "myheader.h"
#include "utils/helper.h"

int main() {
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert 'myheader.h' in result.dependencies
        assert result.dependencies['myheader.h'] == 'local'

    def test_c_external_library_headers(self):
        c_code = """
#include <stdio.h>
#include <curl/curl.h>
#include <openssl/ssl.h>

int main() {
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert 'curl/curl.h' in result.dependencies
        assert result.dependencies['curl/curl.h'] == 'curl'
        assert any('curl' in w for w in result.warnings)

    def test_c_standard_detection_c99(self):
        c_code = """
#include <stdio.h>

int main() {
    for (int i = 0; i < 10; i++) {
        printf("%d\\n", i);
    }
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert result.language_version == 'c99'

    def test_c_standard_detection_c11(self):
        c_code = """
#include <stdio.h>
#include <stdatomic.h>

int main() {
    _Static_assert(sizeof(int) >= 4, "int must be at least 4 bytes");
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert result.language_version == 'c11'

    def test_c_standard_detection_gnu(self):
        c_code = """
#include <stdio.h>

int main() {
    typeof(5) x = 10;
    printf("%d\\n", x);
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert 'gnu' in result.language_version

    def test_c_compiler_flags_suggestion(self):
        c_code = """
#include <stdio.h>
#include <math.h>
#include <pthread.h>

int main() {
    pthread_t thread;
    double x = sqrt(2.0);
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert '-lm' in result.compiler_flags
        assert '-pthread' in result.compiler_flags
        assert any('-std=' in f for f in result.compiler_flags)
        assert '-Wall' in result.compiler_flags

    def test_c_valid_analysis_no_errors(self):
        c_code = """
#include <stdio.h>

int main() {
    printf("Hello\\n");
    return 0;
}
"""
        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
        )

        assert result.valid
        assert len(result.errors) == 0

    def test_c_shell_commands_validation(self):
        c_code = """
#include <stdio.h>
int main() { return 0; }
"""
        commands = [
            "gcc -o main main.c",
            "make clean",
            "./main",
        ]

        result = self.analyzer.analyze_requirements(
            spec_file="main.c",
            code_content=c_code,
            commands=commands,
        )

        valid_commands = [c for c in result.shell_commands if c['valid']]
        assert len(valid_commands) == 3


class TestLanguageDetection:
    def setup_method(self):
        self.analyzer = ProjectAnalyzer()

    def test_detect_python_from_content(self):
        python_code = """
def hello():
    print("Hello, World!")

if __name__ == "__main__":
    hello()
"""
        lang = self.analyzer.detect_language(python_code)
        assert lang == 'python'

    def test_detect_c_from_content(self):
        c_code = """
#include <stdio.h>

int main(int argc, char *argv[]) {
    printf("Hello\\n");
    return 0;
}
"""
        lang = self.analyzer.detect_language(c_code)
        assert lang == 'c'

    def test_detect_cpp_from_content(self):
        cpp_code = """
#include <iostream>

int main() {
    std::cout << "Hello" << std::endl;
    return 0;
}
"""
        lang = self.analyzer.detect_language(cpp_code)
        assert lang == 'cpp'

    def test_detect_rust_from_content(self):
        rust_code = """
fn main() {
    let x = 5;
    println!("x = {}", x);
}
"""
        lang = self.analyzer.detect_language(rust_code)
        assert lang == 'rust'

    def test_detect_go_from_content(self):
        go_code = """
package main

import "fmt"

func main() {
    fmt.Println("Hello")
}
"""
        lang = self.analyzer.detect_language(go_code)
        assert lang == 'go'

    def test_detect_javascript_from_content(self):
        js_code = """
const express = require('express');

function hello() {
    console.log("Hello");
}

export default hello;
"""
        lang = self.analyzer.detect_language(js_code)
        assert lang == 'javascript'

    def test_detect_language_from_file_extension(self):
        lang = self.analyzer.detect_language("", "main.c")
        assert lang == 'c'

        lang = self.analyzer.detect_language("", "app.py")
        assert lang == 'python'

    def test_detect_unknown_language(self):
        weird_content = "some random text without any language patterns"
        lang = self.analyzer.detect_language(weird_content)
        assert lang == 'unknown'