> **Note:** This repository was archived on 2025-02-10. You can view files and clone it, but you cannot push or open issues or pull requests.
[ Work in Progress ]
🤖 Last-In AI: Your Papers Please!
"Because reading research papers manually is so 2023!" 🎯
🎭 What's This All About?
Last-In AI is your friendly neighborhood research paper analyzer that turns those dense academic PDFs into digestible insights. Think of it as your very own academic paper whisperer, but with more silicon and less caffeine.
🌟 Features
- 📚 arXiv Integration: Fetches papers faster than you can say "peer review"
- 🔍 Smart Analysis: Reads papers so you don't have to (but you probably should anyway)
- 📊 PDF Processing: Turns those pesky PDFs into structured data faster than a grad student's coffee run
- 🤹 Orchestration: Juggles multiple tasks like a circus professional
- 🔄 Multi-Provider Support: Switch between different LLM providers like a DJ switching tracks
- ⚙️ Flexible Configuration: Customize your LLM settings without breaking a sweat
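To make the arXiv integration concrete, here is a minimal sketch of how a topic query might be built against the public arXiv API. The function name is hypothetical; the real fetching code lives in `src/data_acquisition/`:

```python
from urllib.parse import urlencode

ARXIV_API = "http://export.arxiv.org/api/query"

def build_arxiv_query(topic: str, max_results: int = 5) -> str:
    """Build an arXiv API query URL for the newest papers on a topic."""
    params = {
        "search_query": f"all:{topic}",
        "start": 0,
        "max_results": max_results,
        "sortBy": "submittedDate",
        "sortOrder": "descending",
    }
    return f"{ARXIV_API}?{urlencode(params)}"

url = build_arxiv_query("quantum_computing")
```

Fetching that URL returns an Atom feed of matching papers, which the processing module can then turn into structured data.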
🏗️ Architecture
```
src/
├── analysis/              # Where the magic happens 🎩
│   ├── analysis_engine.py # Core analysis logic
│   └── llm_provider.py    # LLM provider abstraction
├── data_acquisition/      # Paper fetching wizardry 📥
├── orchestration/         # The puppet master 🎭
└── processing/            # PDF wrestling championship 📄
docker/                    # Container configuration 🐳
├── Dockerfile             # Multi-stage build definition
├── docker-compose.yml     # Service orchestration
└── README.md              # Docker setup documentation
```
🚀 Getting Started
Method 1: Local Installation
- Clone this repository (because good things should be shared):

  ```bash
  git clone https://git.stevanovic.co.uk/kpcto/lastin-ai.git
  cd lastin-ai
  ```

- Install dependencies (they're like friends, but for your code):

  ```bash
  pip install -r requirements.txt
  ```

- Set up your environment (like making your bed, but more technical):

  ```bash
  cp .env.example .env
  # Edit .env with your favorite text editor
  # Don't forget to add your chosen LLM provider's API key!
  ```
Method 2: Docker Installation 🐳
- Clone and navigate to the repository:

  ```bash
  git clone https://git.stevanovic.co.uk/kpcto/lastin-ai.git
  cd lastin-ai
  ```

- Set up environment variables:

  ```bash
  cp .env.example .env
  # Edit .env with your configuration
  ```

- Build and run with Docker Compose:

  ```bash
  docker compose -f docker/docker-compose.yml up --build
  ```
For detailed Docker setup and configuration options, see `docker/README.md`.
🎮 Usage
```python
from src.orchestration.agent_controller import AgentController

# Initialize the brain
controller = AgentController()

# Let it do its thing
controller.process_papers("quantum_computing")

# Now go grab a coffee, you've earned it!
```
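Under the hood, the controller essentially chains the three modules together: fetch, process, analyze. Here is a simplified, hypothetical sketch of that flow (the real `AgentController` in `src/orchestration/` is more involved):

```python
class AgentController:
    """Hypothetical sketch of the fetch -> process -> analyze pipeline."""

    def __init__(self, fetcher=None, processor=None, analyzer=None):
        # Real implementations come from data_acquisition/, processing/, analysis/
        self.fetcher = fetcher or (lambda topic: [f"paper-about-{topic}.pdf"])
        self.processor = processor or (lambda pdf: {"source": pdf, "text": "..."})
        self.analyzer = analyzer or (lambda doc: f"summary of {doc['source']}")

    def process_papers(self, topic: str) -> list[str]:
        papers = self.fetcher(topic)                # 1. fetch papers from arXiv
        docs = [self.processor(p) for p in papers]  # 2. PDFs -> structured data
        return [self.analyzer(d) for d in docs]     # 3. LLM-based analysis

controller = AgentController()
results = controller.process_papers("quantum_computing")
```

Injecting the three stages as callables also makes each stage easy to swap out or test in isolation.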
⚙️ LLM Configuration
The system supports multiple LLM providers out of the box. Configure your preferred provider in `config/settings.yaml`:

```yaml
llm:
  provider: deepseek   # Supported: openai, deepseek
  temperature: 0.5
  max_tokens: 4096
  model: deepseek-r1   # Model name specific to the provider
```

Don't forget to set your API keys in `.env`:

```
OPENAI_API_KEY=your-openai-key-here
DEEPSEEK_API_KEY=your-deepseek-key-here
```
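Provider selection then boils down to mapping the configured `provider` name to the right API key. A minimal sketch of that lookup, assuming the settings shape above (this is illustrative, not the repository's actual code):

```python
# Maps each supported provider to the env var holding its API key
PROVIDER_ENV_KEYS = {
    "openai": "OPENAI_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
}

def resolve_api_key(settings: dict, env: dict) -> str:
    """Look up the API key for the configured provider in the environment."""
    provider = settings["llm"]["provider"]
    try:
        env_var = PROVIDER_ENV_KEYS[provider]
    except KeyError:
        raise ValueError(f"Unsupported provider: {provider!r}") from None
    key = env.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; add it to your .env file")
    return key

settings = {"llm": {"provider": "deepseek", "temperature": 0.5,
                    "max_tokens": 4096, "model": "deepseek-r1"}}
```

In practice you would pass `os.environ` as `env`; taking it as a parameter keeps the lookup easy to test.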
🛠️ Development
Running Tests
```bash
# Local
python -m pytest

# Docker
docker compose -f docker/docker-compose.yml run --rm app python -m pytest
```
Environment Variables
- See `.env.example` for required configuration
- Docker configurations are documented in `docker/README.md`
Adding New LLM Providers
- Extend the `LLMProvider` class in `src/analysis/llm_provider.py`
- Implement the `get_llm()` method for your provider
- Add your provider to the factory function
- Update configuration as needed
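Those steps might look roughly like the sketch below. It assumes `LLMProvider` exposes an abstract `get_llm()` method, and the `MistralProvider` name is purely hypothetical; check `src/analysis/llm_provider.py` for the real interface:

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Sketch of the provider abstraction described above."""

    @abstractmethod
    def get_llm(self):
        """Return a configured LLM client for this provider."""

class MistralProvider(LLMProvider):
    """Hypothetical new provider, added by following the steps above."""

    def __init__(self, model: str = "mistral-large"):
        self.model = model

    def get_llm(self):
        # A real provider would return an SDK client here
        return {"provider": "mistral", "model": self.model}

def make_provider(name: str, **kwargs) -> LLMProvider:
    """Factory function: map a config name to a provider instance."""
    providers = {"mistral": MistralProvider}
    return providers[name](**kwargs)
```

With the factory in place, switching providers is just a one-line change in `config/settings.yaml`.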
🤝 Contributing
Found a bug? Want to add a feature? Have a brilliant idea? We're all ears! Just remember:
- Fork it 🍴
- Branch it 🌿
- Code it 💻
- Test it 🧪
- PR it 🎯
📜 License
MIT License - Because sharing is caring! See LICENSE for more details.
🎭 Fun Facts
- This README was written by an AI (plot twist!)
- The code is probably smarter than most of us
- We counted the coffee cups consumed during development, but lost count at 42
- Our LLM providers are like pizza toppings - everyone has their favorite!
Made with 💻 and questionable amounts of ☕
Description
This was my first attempt at working with AI code editors; I tried Roo Code and Cline. I ran into lots of problems with the DeepSeek API, plus Claude rate limits and API errors galore.