From 16eb77784ec575b435543cd11fce244913a5edf8 Mon Sep 17 00:00:00 2001
From: kpcto
Date: Tue, 11 Feb 2025 01:30:05 +0000
Subject: [PATCH] Readme file update

---
 README.md | 74 ++++++++++++++++++++++++++++---------------------------
 1 file changed, 38 insertions(+), 36 deletions(-)

diff --git a/README.md b/README.md
index 1c483fa..bb5aa46 100644
--- a/README.md
+++ b/README.md
@@ -1,83 +1,85 @@
-# 🧠 LastIn AI: Your Paper's New Overlord
+# 🧠 LastIn AI: The Fluff Whisperer
 
-[![AI Overlord](https://img.shields.io/badge/Powered%20By-Sentient%20Sparks-blue?style=for-the-badge)](https://www.youtube.com/watch?v=dQw4w9WgXcQ)
+[![Fluff Shield](https://img.shields.io/badge/Protected%20By-BS--O--Meter™-yellow?style=for-the-badge)](https://youtu.be/EkTq6m1TeKA?si=0x9sBZ7J3ZvZ1Z1Z)
 
-**An AI system that reads, judges, and organizes papers so you can pretend you're keeping up with the literature**
+**Academic paper analysis that separates the wheat from the chaff... and burns the chaff** 🔥
 
 ## 🚀 Features
 
-- 🤖 **Self-Optimizing Pipeline**: Now with 20% more existential dread about its purpose
-- 🧩 **Modular Architecture**: Swappable components like you're building Ikea furniture for robots
-- 🔍 **Context-Aware Analysis**: Reads between LaTeX equations like a PhD student reads Twitter
-- 🛠 **Self-Healing Storage**: Fixes database issues while questioning why it bothers
-- 🤯 **Flux Capacitor Mode**: Time-aware processing (results may violate causality)
+- 🤖 **Fluff Annihilation Engine**: Detects filler content with surgical precision (and sass)
+- 📊 **BS-o-Meter™**: Rates papers on our patented _How-Dare-You-Waste-My-Time™_ scale
+- 🔍 **Context-Aware Roasting**: Reads between LaTeX equations to ask "where's the beef?"
+- 🛠 **Self-Healing Storage**: Fixes databases while judging your citation choices
+- 🕰 **Time-Aware Snark**: Remembers when papers promised "revolutionary results" (spoiler: they didn't)
 
 ## ⚙️ Installation
 
 ```bash
-# Clone repository (we promise there's no paperclip maximizer)
+# Clone repository (contains 0% filler content)
 git clone https://github.com/your-lab/lastin-ai.git
 cd lastin-ai
 
 # Install dependencies (virtual env recommended)
-pip install -r requirements.txt # Now with non-Euclidean dependency resolution!
+pip install -r requirements.txt # Includes anti-fluff field generators
 
-# Initialize the system (requires PostgreSQL)
-python -m src.main init-db # Creates tables and a small existential crisis
+# Initialize the system (PostgreSQL required)
+python -m src.main init-db # Creates tables and your first existential crisis
 ```
 
 ## 🔧 Configuration
 
-Rename `.env.example` to `.env` and feed it your secrets:
+Rename `.env.example` to `.env` and feed it:
 
 ```ini
-OPENAI_API_KEY=sk-your-key-here # We pinky-promise not to become self-aware
-DEEPSEEK_API_KEY=sk-moon-shot # For when regular AI isn't dramatic enough
+OPENAI_API_KEY=sk-your-key-here # We solemnly swear to misuse this responsibly
+DEEPSEEK_API_KEY=sk-moon-shot # For when regular snark isn't enough
 
-DB_HOST=localhost # Where we store your academic sins
-DB_NAME=paper_analysis # Schema designed during a SIGBOVIK break
+DB_HOST=localhost # Where we store academic sins
+DB_NAME=paper_analysis # Schema designed during a heated Twitter debate
 ```
 
 ## 🧠 Usage
 
 ```bash
-# Harvest papers like an academic combine
+# Harvest papers like a combine harvesting buzzwords
 python -m src.main fetch --categories cs.AI --days 7
 
-# Query your digital hoard
-python -m src.main query "papers that cite GPT but clearly didn't read it"
+# Query your collection with sass
+python -m src.main query "papers using 'transformative' unironically"
 ```
 
-**Pro Mode:** Add `--loglevel DEBUG` to watch neural networks question the meaning of "breakthrough".
+**Pro Tip:** Add `--loglevel DEBUG` to watch the AI question authors' life choices in real time.
 
 ## 🏗 Architecture
 
 ```
 lastin-ai/
 ├── src/
-│   ├── data_acquisition/  # Papers go in, embeddings come out
-│   ├── storage/           # Where vectors go to rethink their life choices
-│   ├── utils/             # Home of AgentController (the real protagonist)
-│   └── main.py            # The big red button (do not push)
+│   ├── data_acquisition/  # Where PDFs go to get judged
+│   ├── storage/           # Home of our disappointment vector space
+│   ├── utils/             # AgentController lives here (the Simon Cowell of AIs)
+│   └── main.py            # The "I Regret Nothing" button
 ```
 
 Our **Agent Controller** handles:
-- 🧬 Data consistency through sheer willpower
-- 📉 Quality metrics that judge us all
-- 🧮 Vector math that would make Von Neumann blush
+- 🧨 Fluff detection with precision-guided snark
+- 📉 Quality metrics that hurt authors' feelings
+- 🧮 Vector math that judges papers by their covers
 
 ## 🌌 Roadmap
 
-- [ ] Implement ethical constraints (optional)
-- [ ] Add support for papers written by AIs about AIs
-- [ ] Quantum-resistant pretension detection
-- [ ] Automated rebuttal generator for peer review
+- [ ] Automated "Why Was This Published?" report generator
+- [ ] Sarcasm intensity slider (default: academic)
+- [ ] Support for papers that cite "inspired by ChatGPT" as methodology
+- [ ] Integration with Imposter Syndrome Detection API
 
 ## 🌟 Acknowledgments
 
-- **ArXiv** for the digital paper avalanche
-- **GPUs** for pretending we're not just matrix multiplying
-- **The concept of attention** - you're the real MVP
+- **ArXiv** for the raw material
+- **GPT-4** for pretending our prompts make sense
+- **The letter F** - for being the first character in "Fluff"
 
 ---
-*Disclaimer: May occasionally generate paper summaries more coherent than the originals. Not liable for recursive self-improvement loops.* 🤖➰
+*Disclaimer: Our fluff scores may correlate with your citation count. Not liable for sudden drops in academic self-esteem.* 📉😭
+*P.S. Coffee-free since 2023 - runs on pure snark* 🤖☕️➡️🗑️
+*P.P.S. Yes, we know the irony of an AI judging human creativity* 🤖🔄🧑‍🎓
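
For anyone reviewing this change, below is a minimal sketch of trying it locally with stock git tooling; the `readme-update.patch` filename is only a placeholder for wherever you save this mail-formatted patch.

```bash
# Hypothetical filename: save the patch text above under any name you like.
PATCH=readme-update.patch

git apply --stat "$PATCH"    # preview the diffstat (1 file changed, 38 insertions, 36 deletions)
git apply --check "$PATCH"   # dry run: confirm the patch applies cleanly to your checkout
git am "$PATCH"              # apply it as a commit, keeping the author and subject line
```

If `git am` objects to the bare author line, `git apply "$PATCH"` followed by a manual commit works just as well.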