diff --git a/README.md b/README.md
index d0dbcae..3a5767d 100644
--- a/README.md
+++ b/README.md
@@ -18,12 +18,16 @@ Last-In AI is your friendly neighborhood research paper analyzer that turns thos
 - 🔍 **Smart Analysis**: Reads papers so you don't have to (but you probably should anyway)
 - 📊 **PDF Processing**: Turns those pesky PDFs into structured data faster than a grad student's coffee run
 - 🤹 **Orchestration**: Juggles multiple tasks like a circus professional
+- 🔄 **Multi-Provider Support**: Switch between different LLM providers like a DJ switching tracks
+- ⚙️ **Flexible Configuration**: Customize your LLM settings without breaking a sweat
 
 ## 🏗️ Architecture
 
 ```
 src/
 ├── analysis/          # Where the magic happens 🎩
+│   ├── analysis_engine.py   # Core analysis logic
+│   └── llm_provider.py      # LLM provider abstraction
 ├── data_acquisition/  # Paper fetching wizardry 📥
 ├── orchestration/     # The puppet master 🎭
 └── processing/        # PDF wrestling championship 📄
@@ -53,6 +57,7 @@ pip install -r requirements.txt
 ```bash
 cp .env.example .env
 # Edit .env with your favorite text editor
+# Don't forget to add your chosen LLM provider's API key!
 ```
 
 ### Method 2: Docker Installation 🐳
@@ -89,6 +94,24 @@ controller.process_papers("quantum_computing")
 # Now go grab a coffee, you've earned it!
 ```
 
+## ⚙️ LLM Configuration
+
+The system supports multiple LLM providers out of the box.
+Configure your preferred provider in `config/settings.yaml`:
+
+```yaml
+llm:
+  provider: deepseek    # Supported: openai, deepseek
+  temperature: 0.5
+  max_tokens: 4096
+  model: deepseek-r1    # Model name specific to the provider
+```
+
+Don't forget to set your API keys in `.env`:
+```bash
+OPENAI_API_KEY=your-openai-key-here
+DEEPSEEK_API_KEY=your-deepseek-key-here
+```
+
 ## 🛠️ Development
 
 ### Running Tests
@@ -104,6 +127,12 @@ docker compose -f docker/docker-compose.yml run --rm app python -m pytest
 - See `.env.example` for required configuration
 - Docker configurations are documented in `docker/README.md`
 
+### Adding New LLM Providers
+1. Extend the `LLMProvider` class in `src/analysis/llm_provider.py`
+2. Implement the `get_llm()` method for your provider
+3. Add your provider to the factory function
+4. Update configuration as needed
+
 ## 🤝 Contributing
 
 Found a bug? Want to add a feature? Have a brilliant idea? We're all ears!
@@ -123,6 +152,7 @@ MIT License - Because sharing is caring! See [LICENSE](LICENSE) for more details
 - This README was written by an AI (plot twist!)
 - The code is probably smarter than most of us
 - We counted the coffee cups consumed during development, but lost count at 42
+- Our LLM providers are like pizza toppings - everyone has their favorite!
 
 ---
 Made with 💻 and questionable amounts of ☕
\ No newline at end of file
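
For orientation, the four "Adding New LLM Providers" steps in the patch above could be sketched roughly as follows. This is a minimal, self-contained sketch: the class and function names (`LLMProvider`, `get_llm()`, `create_provider`) come from the patch text, but the constructor signature, the stub client objects, and the `_PROVIDERS` registry are assumptions, not the repository's actual code:

```python
from abc import ABC, abstractmethod

# Step 1: the abstract base that src/analysis/llm_provider.py presumably defines.
class LLMProvider(ABC):
    def __init__(self, model: str, temperature: float = 0.5, max_tokens: int = 4096):
        self.model = model
        self.temperature = temperature
        self.max_tokens = max_tokens

    # Step 2: each provider implements get_llm() to return a configured client.
    @abstractmethod
    def get_llm(self):
        """Return a configured LLM client for this provider."""

class OpenAIProvider(LLMProvider):
    def get_llm(self):
        # A real implementation would build an OpenAI client here;
        # a dict stands in so the sketch runs without dependencies.
        return {"provider": "openai", "model": self.model}

class DeepSeekProvider(LLMProvider):
    def get_llm(self):
        # Likewise, a stand-in for a real DeepSeek client.
        return {"provider": "deepseek", "model": self.model}

# Step 3: register the new provider in the factory (hypothetical registry).
_PROVIDERS = {"openai": OpenAIProvider, "deepseek": DeepSeekProvider}

def create_provider(config: dict) -> LLMProvider:
    """Build a provider from a dict shaped like the `llm:` block in settings.yaml."""
    cls = _PROVIDERS[config["provider"]]
    return cls(
        model=config["model"],
        temperature=config.get("temperature", 0.5),
        max_tokens=config.get("max_tokens", 4096),
    )
```

Step 4 then amounts to listing the new provider name in `config/settings.yaml` and adding its API key to `.env`, mirroring the `OPENAI_API_KEY`/`DEEPSEEK_API_KEY` pattern.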