Releases: ashes00/deepshell
DeepShell Version 1.3.5!
🚀 Overview
DeepShell 1.3.5 introduces a comprehensive set of interactive mode enhancements that transform the user experience. This release focuses on productivity, usability, and workflow improvements, making DeepShell more powerful and user-friendly than ever before.
✨ New Features
💾 Interactive Mode Save Command
Command: `save [filename]`
Transform your interactive sessions with the ability to save LLM responses directly to markdown files.
Usage Examples:
```text
# Save with automatic filename prompt
> What are the benefits of using a CLI for LLM interaction?
[LLM provides detailed response with markdown formatting]
> save
Enter filename (.md extension will be added automatically): research-notes
Save successful!
File saved to: /home/user/Downloads/research-notes.md

# Save with direct filename
> Explain the concept of machine learning
[LLM provides detailed response]
> save ml-explanation
Save successful!
File saved to: /home/user/Downloads/ml-explanation.md

# Save with existing .md extension
> How do neural networks work?
[LLM provides detailed response]
> save neural-networks-guide.md
Save successful!
File saved to: /home/user/Downloads/neural-networks-guide.md
```

Key Features:
- Auto `.md` extension: Automatically adds `.md` if not provided
- File validation: Prevents invalid characters and empty filenames
- Overwrite protection: Asks for confirmation before overwriting existing files
- Smart directory: Saves to Downloads folder (falls back to home directory)
- Raw markdown: Preserves exact formatting as received from the LLM
- Error handling: Prevents conversion if no previous response exists
📂 Interactive Mode Open Command
Command: `open [filepath]`
Process text files directly through the LLM with custom instructions, making file analysis seamless and powerful.
Usage Examples:
```text
# Open file with custom instructions
> open /home/user/data.txt
Enter instructions for the LLM about this data: Please analyze this data and tell me if it's correct
[LLM processes the file content with your instructions]

# Open file with default analysis
> open ../documents/notes.md
Enter instructions for the LLM about this data: [Press Enter for default]
[LLM processes with "Please analyze this data"]
```

Key Features:
- File validation: Checks if file exists, is readable, and is text-based
- Size limit: Maximum 25MB file size to prevent memory issues
- Path support: Both absolute and relative file paths
- User instructions: Prompt for custom instructions on how to process the file
- Default instructions: "Please analyze this data" if no instructions provided
- Binary detection: Rejects non-text files (images, executables, etc.)
- Error handling: Clear error messages for all failure cases
❓ Interactive Mode Help Command
Command: `help`
Access comprehensive help documentation directly from interactive mode with a professionally formatted command reference.
Features:
- Comprehensive reference: Complete list of all interactive mode commands
- Detailed descriptions: Usage information and examples for each command
- Professional formatting: Clean, easy-to-read help menu
- Usage tips: Best practices and guidance for interactive mode
🔧 Technical Improvements
Enhanced Interactive Mode Interface
- Simplified welcome message: Clean, focused interface with just essential information
- Streamlined command structure: Intuitive command names and syntax
- Improved error handling: Clear, actionable error messages for all scenarios
- Better user experience: Reduced cognitive load with cleaner interface design
Robust File Processing
- Memory management: Efficient handling of large files up to 25MB
- Binary file detection: Smart detection of non-text files to prevent processing errors
- Path resolution: Support for both absolute and relative file paths
- Error recovery: Graceful handling of file access and permission issues
Command History Management
- Separate history entries: User instructions and file content stored as separate history items
- Accurate help documentation: Corrected information about conversation history persistence
- Memory efficiency: Optimized storage and retrieval of conversation data
🐛 Bug Fixes
Ollama Integration Fix
- Fixed history bug: Ollama responses now properly added to conversation history
- Consistent behavior: All LLM services (Ollama, Gemini, OpenRouter) now work identically with save command
- Improved reliability: Eliminated "No previous response found" errors for Ollama users
Error Message Improvements
- Accurate help text: Fixed misleading information about conversation history persistence
- Clear error reporting: Better error messages for file processing failures
- User guidance: More helpful error messages that guide users to solutions
🚀 Getting Started with New Features
Quick Start Guide
- Start interactive mode: `./deepshell -i`
- Get help anytime: `> help`
- Save important responses: `> save my-notes`
- Process files with custom instructions: `> open /path/to/file.txt`, then enter your instructions at the prompt
Use Cases
- Research documentation: Save important findings and explanations
- Knowledge base building: Create markdown files for future reference
- Content creation: Export formatted responses for articles or documentation
- Learning materials: Save educational content for later review
- File analysis: Process documents, code, or data files with custom instructions
- Code review: Analyze source code files for bugs, improvements, or explanations
- Document processing: Extract insights from text files, logs, or reports
- Interactive help: Get comprehensive command reference anytime during sessions
🔄 Migration Guide
From Version 1.3.1 to 1.3.5
- No configuration changes required - existing configurations work seamlessly
- New interactive commands available - use `help` to see all available commands
- Enhanced workflow - take advantage of the `save` and `open` commands for better productivity
- Improved reliability - Ollama users will notice better consistency with other LLM services
Command Changes
- Convert → Save: The `convert` command has been renamed to `save` for better clarity
- New commands: `open` and `help` are now available in interactive mode
- Simplified interface: Cleaner welcome message focuses on essential information
📋 Technical Details
Files Modified
- `interactive.c`: Added save, open, and help command implementations
- `utils.c`: Added file processing and help display functions
- `deepshell.h`: Updated function declarations and version number
- `README.md`: Comprehensive documentation updates
New Functions
- `save_response_to_file()`: Saves LLM responses to markdown files
- `read_file_content()`: Reads and validates text files
- `is_text_file()`: Detects binary vs. text files
- `display_interactive_help()`: Shows comprehensive help menu
- `is_valid_filename()`: Validates filenames for save command
Dependencies
- No new dependencies required
- All existing functionality preserved
- Backward compatible with previous versions
🎯 Performance Improvements
- Memory efficiency: Optimized file processing and memory management
- Faster command processing: Streamlined interactive mode command handling
- Reduced memory footprint: Better memory allocation and cleanup
- Improved responsiveness: Faster file validation and error checking
🔒 Security Considerations
- File validation: Strict validation of file types and sizes
- Path security: Safe handling of file paths and permissions
- Memory protection: Bounds checking and overflow protection
- Error isolation: Graceful handling of file system errors
📚 Documentation Updates
- README.md: Updated with comprehensive interactive mode documentation
- Help system: Built-in help command with detailed usage information
- Examples: Extensive examples for all new features
- Use cases: Practical applications and workflow guidance
🔮 What's Next
This release establishes a solid foundation for interactive mode productivity. Future releases will build upon these capabilities with additional features and enhancements.
🙏 Acknowledgments
Thank you to all users who provided feedback and suggestions that helped shape this release. Your input is invaluable in making DeepShell better for everyone.
Happy Querying! 🤖✨
DeepShell Version 1.3.1!
🎯 Overview
We're excited to announce DeepShell v1.3.1, a focused release that significantly improves the user experience through enhanced interactive mode capabilities, streamlined interface design, and comprehensive API key management features. This release transforms DeepShell into an even more professional and user-friendly LLM command-line interface.
✨ New Features & Enhancements
🖥️ Interactive Mode Line Editing Revolution
Complete terminal editing experience with GNU Readline integration
- Arrow Key Navigation: Use ←/→ arrow keys to move the cursor within your input line
- Command History: Use ↑/↓ arrow keys to recall and edit previous commands
- Standard Terminal Shortcuts: Full support for Home, End, Ctrl+A, Ctrl+E, and other standard editing shortcuts
- In-Place Editing: Edit your commands without having to retype entire lines
- Backspace/Delete Support: Proper character deletion that works as expected
Before: Arrow keys produced strange characters, had to delete entire lines to make corrections
After: Professional terminal editing experience with full cursor control and command history
🔧 Streamlined Settings Menu Architecture
Cleaner, more organized configuration interface
- Unified API Key Management: Consolidated separate Gemini and OpenRouter API key options into a single "Manage API Keys" menu
- Service Selection Flow: Choose your LLM service first, then manage its API keys
- Reduced Menu Clutter: Streamlined from 10 to 9 main options for better navigation
- Scalable Design: Easy to add new LLM services without menu bloat
New Menu Flow: Main Menu → "Manage API Keys" → Service Selection → Existing Management Interface
🔑 Enhanced API Key Visibility & Management
Comprehensive API key administration capabilities
Immediate Active Key Display
- Prominent Active Key Section: See your current API key details immediately upon entering API key management
- Complete Information: Displays both nickname and full API key value for the active key
- Smart Organization: Active Key Details → All Keys Summary → Management Options
Comprehensive Key Viewing
- New Option 4: "Show all API keys & nicknames" in both Gemini and OpenRouter menus
- Complete Key Information: View nickname, full API key value, and active status for all configured keys
- Professional Formatting: Clean presentation with separators between multiple keys
- Enhanced Troubleshooting: Easy verification of API key values and configuration
🎨 Interface & Documentation Improvements
Professional presentation and user guidance
- Help Menu Formatting: Fixed alignment issues with longer command flags for consistent, readable help output
- Code Documentation Cleanup: Removed all references to the original Python implementation for clarity
- Language-Neutral Examples: Updated documentation to focus on the C implementation
- Professional Codebase: Clean, maintainable documentation throughout
🛠️ Technical Improvements
Dependencies & Build System
- GNU Readline Integration: Added `libreadline-dev` dependency for advanced terminal editing
- Enhanced Makefile: Updated build system with readline linking and installation instructions
- Cross-Platform Compatibility: Maintained support for both Linux and Windows (MSYS2/Mingw64)
Code Quality & Architecture
- Unified API Management: New `manage_api_keys_unified()` function for scalable service management
- Enhanced Error Handling: Improved user feedback and validation throughout
- Memory Management: Enhanced cleanup and resource handling
- Consistent Implementation: Standardized API key management across all LLM services
📦 What's Included
Complete Interactive Experience
- Full terminal editing capabilities with cursor movement
- Command history with up/down arrow navigation
- Standard terminal shortcuts (Home, End, Ctrl+A, Ctrl+E, etc.)
- Professional line editing without character corruption
Streamlined Configuration Management
- Unified API key management interface
- Immediate visibility of active API key details
- Comprehensive key viewing and verification
- Clean, organized menu structure
Enhanced User Interface
- Professional help menu formatting
- Consistent visual presentation
- Clear, focused documentation
- Improved navigation flow
🚀 Getting Started with New Features
Experience Enhanced Interactive Mode
```shell
# Start interactive mode with full editing capabilities
./deepshell -i

# Now you can:
# - Use arrow keys to move the cursor
# - Use up/down arrows for command history
# - Use standard terminal shortcuts
# - Edit commands in-place
```

Streamlined API Key Management
```shell
# Access unified API key management
./deepshell -s
# Select "3. Manage API Keys"
# Choose your service (Gemini/OpenRouter)
# See active key details immediately
# Use option 4 to view all keys comprehensively
```

Professional Help Interface
```shell
# View the improved help menu
./deepshell -h
# Notice the clean alignment and formatting
```

🔧 Installation & Dependencies
Updated Prerequisites
```shell
# Install dependencies (now includes readline)
sudo apt-get update
sudo apt-get install -y build-essential libcurl4-openssl-dev libjson-c-dev libreadline-dev
```

Build Instructions
```shell
# Clone and build
git clone https://github.com/ashes00/deepshell.git
cd deepshell
make clean && make
```

🔄 Migration Guide
From v1.3.0
- No Breaking Changes: All existing configurations and functionality preserved
- Automatic Enhancement: Interactive mode improvements work immediately
- New Dependencies: Install `libreadline-dev` for full functionality
- Menu Changes: API key management now under the unified "Manage API Keys" option
Recommended Actions
- Update Dependencies: Install `libreadline-dev` for enhanced interactive mode
- Explore New Features: Try the improved interactive mode and streamlined menus
- Test API Key Management: Experience the enhanced visibility and organization
🎯 What's Next
DeepShell v1.3.1 establishes a solid foundation for professional terminal interaction and streamlined configuration management. Future releases will focus on:
- Additional LLM service integrations
- Advanced conversation management features
- Enhanced interactive mode capabilities
- Performance optimizations and additional terminal features
📚 Documentation
Full documentation with examples and tutorials is available in our README.md.
🙏 Acknowledgments
Special thanks to the community for feature requests, bug reports, and testing that made this release possible. Your feedback continues to drive DeepShell's evolution as the premier LLM command-line interface.
Download DeepShell v1.3.1 from the releases page or build from source.
Need Help? Check our README.md or open an issue on GitHub.
Happy Querying! 🤖✨
📋 Technical Details
Files Modified
- `main.c` - Help menu formatting improvements
- `settings.c` - Unified API key management and enhanced displays
- `utils.c` - GNU Readline integration for the `read_line()` function
- `deepshell.h` - Added readline headers and new function declarations
- `Makefile` - Updated dependencies and build configuration
- `README.md` - Updated installation instructions
New Dependencies
- `libreadline-dev` - Required for advanced terminal editing capabilities
Performance Impact
- Positive: Enhanced user experience with professional terminal editing
- Neutral: No performance impact on LLM queries or API interactions
- Improved: Faster configuration management with streamlined menus
DeepShell Version 1.3.0!
🎯 Overview
🎉 We're excited to announce DeepShell v1.3.0, a major feature release that significantly expands DeepShell's capabilities with OpenRouter.ai integration, secure configuration backup/migration, and numerous enhancements that make DeepShell the most comprehensive LLM command-line interface available.
✨ New Features
🌐 OpenRouter.ai Integration
Complete third LLM service with full feature parity
- 200+ Model Access: Connect to OpenAI, Anthropic, Meta, Google, Mistral, and many more providers through a single interface
- Advanced Model Browser:
  - Paginated model selection (15 models per page)
  - Smart sorting: Free models listed first, then alphabetical
  - Navigate with `n` (next), `p` (previous), `q` (quit)
- Multi-Key Management: Store multiple OpenRouter API keys with custom nicknames
- Full Feature Support: All DeepShell features work seamlessly with OpenRouter
- Free Model Support: Easy access to free models like `openai/gpt-4o-mini:free`
🔐 Configuration Backup & Migration
Enterprise-grade configuration management
- Encrypted Export (`-b filename.config`):
  - Password-protected with confirmation
  - Secure binary format (completely unreadable as text)
  - Saves to Downloads folder automatically
  - Includes ALL settings, API keys, and configurations
- Secure Import (`-c filename.config`):
  - Password verification
  - Confirmation prompt before overwriting
  - Future-proof with version metadata
- Perfect for:
  - Backing up your complete DeepShell setup
  - Migrating between development machines
  - Sharing team configurations securely
  - Disaster recovery
🔑 Enhanced API Key Management
Unified and powerful key management across all services
- Service-Agnostic Commands:
  - `-set-key`: Now manages both Gemini AND OpenRouter keys
  - `-show-key`: Shows the active key for the current LLM service
  - `-a` / `--active-config`: Quick summary of active LLM, model, and API key
- Consistent UX: OpenRouter key management now matches Gemini's interface exactly
- Smart Validation: Both services support nickname-based multi-key workflows
🛠️ Improvements & Fixes
User Experience Enhancements
- Alphabetized Help Menu: All command-line options now sorted alphabetically for easier reference
- Updated Interactive Logo: Now properly shows "Multi-LLM Support (Ollama, Gemini, and OpenRouter)"
- Improved Error Handling: Better error messages and validation throughout
- Enhanced Model Selection: Improved pagination and user-friendly navigation
Technical Improvements
- Auto-Setup Bypass: Export/import flags now correctly bypass automatic setup
- Future-Proof Design: Export format includes version metadata for cross-version compatibility
- Binary Security: Export files are true binary format, preventing accidental text viewing
- Memory Management: Enhanced memory handling and cleanup throughout
Bug Fixes
- Fixed import flag being overridden by default configuration setup
- Resolved model selection issues for OpenRouter
- Corrected API key management inconsistencies
- Fixed various edge cases in configuration handling
📦 What's Included
Complete Configuration Export
When you export your configuration, everything is included:
- All LLM service configurations (Ollama, Gemini, OpenRouter)
- All API keys with their nicknames
- Model selections for each service
- Interactive settings (history limit, streaming, animation)
- Markdown rendering preferences
- Server addresses and site attribution
OpenRouter Model Categories
Access to major model families including:
- OpenAI: GPT-4, GPT-3.5 (including free variants)
- Anthropic: Claude 3.5 Sonnet, Claude 3 Haiku
- Meta: Llama 3.1, Llama 3.2 (including free versions)
- Google: Gemma 2, PaLM models
- Mistral: Mixtral, Mistral 7B
- And 190+ more models from various providers
🚀 Getting Started with New Features
Try OpenRouter
```shell
# Setup OpenRouter
./deepshell -s
# Select option 1 (Manage LLM Services)
# Choose 3 (OpenRouter)

# Quick model change with new pagination
./deepshell -m
# Browse models with n/p navigation, free models listed first

# Manage multiple OpenRouter keys
./deepshell -set-key
# Choose 2 (OpenRouter)
```

Backup Your Configuration
```shell
# Export your complete setup
./deepshell -b my-deepshell-backup.config
# Enter password twice for protection

# Import on another machine
./deepshell -c my-deepshell-backup.config
# Enter password and confirm overwrite
```

Quick Status Check
```shell
# See your current setup at a glance
./deepshell -a
# Shows: LLM Service, Model, API Key (with nickname)
```

🔧 Migration Guide
From v1.2.x
No breaking changes! Your existing configuration will work seamlessly. New features are additive.
Recommended Actions
- Backup First: `./deepshell -b v1-2-backup.config`
- Try OpenRouter: Add it as a third LLM option
- Test New Commands: Explore `-a` for quick status checks
🎯 What's Next
DeepShell v1.3.0 establishes a solid foundation for multi-LLM management with enterprise-grade backup capabilities. Future releases will focus on:
- Additional LLM service integrations
- Advanced conversation management features
- Enhanced interactive mode capabilities
- Performance optimizations
📚 Documentation
Full documentation with examples and tutorials is available in our [README.md](README.md).
🙏 Acknowledgments
Special thanks to the community for feature requests, bug reports, and testing that made this release possible.
Download DeepShell v1.3.0 from the [releases page](https://github.com/ashes00/deepshell/releases) or build from source.
Need Help? Check our [README.md](README.md) or open an issue on GitHub.
Happy Querying! 🤖✨
DeepShell Version 1.2.2!
✨ DeepShell v1.2.2: Interactive Mode Gets a Stunning Visual Makeover! ✨
We're excited to announce the immediate availability of DeepShell v1.2.2! This release brings a major visual enhancement that transforms your interactive mode experience from functional to fabulous. We've added a beautiful, professional logo that makes DeepShell feel like a premium, polished application while maintaining all the powerful functionality you've come to love.
🎉 What's New & Exciting in DeepShell v1.2.2?
🎨 Stunning Interactive Mode Logo
Experience DeepShell like never before! When you launch interactive mode (`deepshell -i`), you'll now be greeted with a beautiful ASCII art logo that spells out "DeepShell" in elegant block characters. The logo features:
- Professional ASCII Art: Eye-catching block-based design that spells "DeepShell" clearly
- Perfect Subtitle Box: A beautifully rendered information box with AI-Powered Shell Interface details
- Dynamic Version Display: Automatically shows the current version (1.2.2) without manual updates
- Feature Highlights: Colorful display of DeepShell's key capabilities
- Status Indicator: Clear indication that DeepShell is ready for your AI interactions
🔧 Smart Version Synchronization
Gone are the days of manually updating logo versions! DeepShell now automatically displays the correct version number in the logo by reading directly from the main program version. This ensures perfect consistency and eliminates the need for manual logo maintenance.
🎯 Enhanced User Experience
The new logo transforms interactive mode from a simple command prompt into a professional, engaging interface that:
- Creates a memorable first impression
- Provides clear visual branding
- Enhances the overall user experience
- Makes DeepShell feel more polished and complete
DeepShell Version 1.2.0!
✨ DeepShell v1.2.0: Live, Blue, and Beautiful Progress While You Query! ✨
We're excited to release DeepShell v1.2.0! This version brings a lively, blue, dot-matrix progress animation that runs while your LLM is processing and stops the instant your answer is ready. It's smooth, flicker-free, and works across both Gemini and Ollama. Your terminal just got a whole lot more alive.
🎉 What's New & Exciting in DeepShell v1.2.0?
🔵 Blue Dot Scanner Window (Front + Rear)
A sleek, braille "dot-matrix" animation frames your status text on both sides as the query runs.
Tuned for terminal readability and motion that feels fast but never frantic.
🧵 True Live Progress (Concurrent, Non-Blocking)
The animation now runs in a background thread and stops immediately when the response is ready.
Eliminates the old post-animation pause so your output feels instant.
🌍 Works Everywhere (Gemini + Ollama)
Implemented centrally; both LLM services pick it up automatically without per-service tweaks.
🧽 Clean, Flicker-Free Rendering
Proper line clearing each frame keeps your terminal spotless and easy to read.
🎛️ Instant Control (Per Run or Persisted)
Settings: "Toggle Progress Animation" to enable/disable globally.
CLI: Use `--no-animation` for one-off runs without changing your config.
🧩 Build Update
Now links `-lpthread` under the hood to power the concurrent animation.
DeepShell Version 1.0.5!
✨ DeepShell v1.0.5: Unleash Deeper Control Over Your LLM Interactions! ✨
We're thrilled to announce the immediate availability of DeepShell v1.0.5! This release isn't just an update; it's a significant leap forward in empowering you with more granular control and a truly tailored experience when interacting with your Large Language Models. We've listened to your feedback and packed this version with features designed to make DeepShell even more intuitive, flexible, and powerful.
🎉 What's New & Exciting in DeepShell v1.0.5?
⚡️ Toggle Response Streaming (Default: Disabled)
Experience the future of LLM interaction on your terms! By default, DeepShell now prioritizes flawless Markdown rendering by waiting for the complete response before displaying it. However, if you crave instant gratification, you can now enable streaming via the settings menu (`deepshell -s`). Get real-time, plain-text output as the LLM generates its response, making your terminal feel even more alive!
🎨 Toggle Markdown Rendering for Active Service (Default: Enabled)
DeepShell's beautiful Markdown output is a core feature, but we understand that sometimes you might prefer raw, unformatted text. With v1.0.5, you can now toggle Markdown rendering on or off for your active LLM service directly from the settings menu. This gives you unparalleled flexibility to adapt DeepShell's output to your workflow or terminal environment.
🧠 Set Interactive History Limit (Default: 25 Turns)
Dive deeper into your conversations! We've significantly increased the default conversational memory in interactive mode (`deepshell -i`) to 25 turns (that's 50 messages!). For those long, complex discussions, DeepShell will now remember more context. Need more? Less? No memory at all? You can easily customize this limit to any number (or 0 for no memory) via the settings menu.
🛠️ How to Get It
Simply update your DeepShell installation to experience these powerful new enhancements!
We're committed to making DeepShell the ultimate command-line companion for your LLM journey. Thank you for your continued support and happy querying!
DeepShell Version 1.0.4!
DeepShell v1.0.4: Interactive Chat Mode is Here!
We are excited to announce the release of DeepShell version 1.0.4, a significant update that fundamentally enhances how you interact with Large Language Models from the command line.
The Game Changer: Interactive Chat Mode 🚀
The centerpiece of this release is the new interactive mode. Gone are the days of one-shot queries. You can now engage in fluid, stateful conversations with your configured LLM, all without leaving your terminal.
This mode introduces conversational memory, allowing the model to remember the context of the last 10 turns of your dialogue. This enables more natural follow-up questions and complex, evolving interactions.
Key Features in v1.0.4:
- Seamless Interactive Sessions: Launch a persistent chat session with the new `-i` or `--interactive` flag.
- Conversational Memory: The model now remembers the context of your conversation, leading to more coherent and intelligent responses.
- Universal Support: Interactive mode works flawlessly for both Ollama and Google Gemini services.
- Effortless Interaction: Simply type your query at the prompt, get a response, and keep the conversation going. Type `exit` or `quit` to end the session gracefully.
How to Use It
Getting started is as simple as running:
```shell
./deepshell -i
```
This update is focused on making your interactions with LLMs more powerful and intuitive. We've also made numerous under-the-hood improvements to argument parsing and configuration management for a more stable and robust experience.
We can't wait for you to try it. Update to version 1.0.4 now and transform your command-line AI assistant into a true conversational partner!
Happy Querying!
DeepShell Version 1.0.3!
Hello DeepShell Users!
We're thrilled to announce the release of DeepShell Version 1.0.3! This update brings some exciting new features and significant improvements to enhance your command-line LLM experience, focusing on better readability, faster workflows, and easier configuration management.
Here's what's new in DeepShell 1.0.3:
✨ Key Features & Enhancements:
Enhanced Readability with Rich Markdown Output! 🎉 Say goodbye to plain text responses! DeepShell now automatically renders LLM outputs as rich Markdown directly in your terminal. This means beautifully formatted lists, code blocks, emphasis, and more, making it much easier to read and understand complex responses from both Ollama and Gemini services. This feature is enabled by default for the best experience.
Introducing LLM Jump (`-j` or `--jump-llm`)! 🚀 Switching between your favorite LLM services just got a whole lot faster! The new "LLM Jump" feature allows you to instantly toggle back to the previously active LLM service. If you frequently switch between, say, a local Ollama model and a Gemini model, this one's for you!
Streamlined and Unified Settings Management! ⚙️ We've completely revamped the configuration experience! The settings menu (`-s` or `--setup`) is now a more intuitive, centralized hub for all your DeepShell configurations. Whether you're:
- Adding or reconfiguring LLM services (`--llm`)
- Changing the default model for an active service (`--model-change`)
- Managing Gemini API keys (`--set-api-key`)
- Viewing your active setup (`--show-full-conf`)
- Or even deleting your configuration (`--delete-config`)

...it's all managed through a clearer, more consistent interface. This also makes the initial setup smoother than ever.
Simplified Service Setup Defaults: When you set up or reconfigure your Ollama and Gemini services, we've enabled Markdown rendering by default, so you get the best output experience without extra steps.
Internal Refinements: We've made several internal code structure improvements for better maintainability, robustness, and to pave the way for future enhancements. This includes resolving some circular dependencies for a cleaner architecture.
We believe these updates will make DeepShell an even more powerful and pleasant tool for interacting with Large Language Models from your command line.
We encourage you to update to version 1.0.3 to take advantage of these new features and improvements. As always, your feedback is invaluable to us. Please let us know your thoughts, suggestions, or any issues you encounter.
🎉 Announcing DeepShell v1.0
🎉 Announcing DeepShell v1.0: Your Universal LLM Command-Line Interface! 🚀
We are thrilled to announce the initial release of DeepShell v1.0! Say goodbye to juggling multiple tools and complex API integrations. DeepShell brings the power of Large Language Models (LLMs) directly into your terminal, offering a unified and efficient way to interact with cutting-edge AI.
What is DeepShell?
DeepShell is a versatile command-line program designed for developers, researchers, and power users. It seamlessly blends your local shell environment with the immense capabilities of various LLMs. Whether you're working with local Ollama instances or cloud-based services like Google's Gemini, DeepShell provides a streamlined, configurable, and intuitive interface.
Transform your command prompt into a conduit for deep AI intelligence. Generate code, summarize documents, brainstorm ideas, or craft creative textโall without leaving your terminal!
🌟 Why DeepShell?
- Unified Experience: Access different LLMs (Ollama, Gemini) without context switching.
- Efficiency: Quickly query models and manage configurations directly from your command line.
- Customizable: Tailor DeepShell to your preferred models and services.
- Developer-Friendly: Built with Python, making it easy to understand and extend.
- Local First, Cloud Ready: Perfect for local development with Ollama and scalable with cloud LLMs like Gemini.
✨ Key Features in v1.0:
- Multi-LLM Support:
  - Seamlessly connect to Ollama servers (local or remote).
  - Integrate with the Google Gemini API.
  - Easily switch between configured LLM services (`-l`).
- Interactive Setup & Configuration:
  - User-friendly setup wizard (`-s`) for initial configuration.
  - Securely stores settings in `~/.deepshell/deepshell.conf`.
- Flexible Model Management:
  - List available models from your connected LLM service.
  - Set and change default models per service (`-model`).
- Robust Gemini API Key Management:
  - Store and manage multiple Gemini API keys with user-defined nicknames.
  - Easily switch between active Gemini API keys (`-set-key`).
  - Display the currently active Gemini API key (`-show-key`).
  - Quick link to check your Gemini API usage (`-gq`).
- Intuitive Querying:
  - Send queries directly from your command line (`-q "Your query here"`).
  - Engaging progress animation while waiting for LLM responses.
- User-Friendly Interface:
  - Clear, colored console output for enhanced readability.
  - Alphabetized and well-formatted help messages (`-h`).
  - View active configuration details (`-show-config`).
  - Option to delete your configuration (`-d`).
🤖 Supported LLMs:
- Ollama: Connect to any Ollama instance (Llama, Mistral, etc.).
- Google Gemini: Access Gemini models (e.g., `gemini-pro`) via the Google AI Studio API.
🚀 Get Started with DeepShell v1.0!
Getting started is easy:
1. Ensure you have Python 3.7+ and pip installed.
2. Download or clone the DeepShell repository.
3. Install the necessary dependencies (such as `requests` and `chardet`).
4. Run the initial setup wizard using the `-s` flag with the `main.py` script. This will guide you through configuring your first LLM service.
📚 Learn More:
Check out the README.md in the repository for detailed usage instructions and command-line options.
We're incredibly excited to share DeepShell v1.0 with the community! We believe it will significantly enhance your productivity and make interacting with LLMs a breeze. We welcome your feedback, suggestions, and contributions.
Happy Hacking!