# page `md__tmp_github_reposRepoArchDocGenContext_Penify_dev_penify_cli_README` {#md__tmp_github_reposRepoArchDocGenContext_Penify_dev_penify_cli_README}

A CLI tool to generate smart commit messages, code documentation, and more.

## Features

* Automatically generate documentation for your code

* Support for multiple programming languages

* Git hook integration for automatic documentation on commits

* Folder and file analysis

## Installation

Install from PyPI:

```bash
pip install penify
```

## Usage

Penify CLI provides several subcommands, organized into basic commands (no login required) and advanced commands (login required).

### Basic Commands (No login required)

#### Commit

Generate smart commit messages using a local LLM:

```bash
penify commit [-m "Optional message"] [-e] [-d]
```

Options:

* `-m, --message`: Optional custom commit message

* `-e, --terminal`: Open editor to modify commit message before committing

* `-d, --description`: Generate commit message with both title and description (without this flag, only title is generated)
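
These flags can be combined; for example, to pass a hint, generate both a title and a description, and review the result in your editor before committing:

```bash
penify commit -m "Fix login flow" -d -e
```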

#### Config

Configure local LLM and JIRA settings:

```bash
# Configure LLM settings
penify config llm --model MODEL_NAME [--api-base API_URL] [--api-key API_KEY]

# Configure LLM settings through web interface
penify config llm-web

# Configure JIRA settings
penify config jira --url JIRA_URL --username USERNAME --api-token TOKEN [--verify]

# Configure JIRA settings through web interface
penify config jira-web
```

### Advanced Commands (Login required)

#### Login

To log in and obtain an API token:

```bash
penify login
```

This command will open a browser window for authentication. After successful login, the API key will be saved locally for future use.
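
If you want to confirm that credentials were stored, you can inspect the local config file. The exact path is an assumption here, based on the `~/.penify` file referenced in the commit troubleshooting docs:

```bash
# Assumption: Penify keeps its local settings (including the saved API token) in ~/.penify
cat ~/.penify
```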

#### Documentation Generation

Generate documentation for a Git diff, or for specific files or folders:

```bash
# Generate documentation for latest Git commit diff
penify docgen

# Generate documentation for specific file or folder
penify docgen -l /path/to/file/or/folder
```

Options:

* `-l, --location`: Path to specific file or folder for documentation generation (defaults to current directory)

#### Git Hook Management

Install or uninstall Git post-commit hooks:

```bash
# Install Git hook
penify docgen install-hook [-l /path/to/repo]

# Uninstall Git hook
penify docgen uninstall-hook [-l /path/to/repo]
```

Options:

* `-l, --location`: Path to the Git repository (defaults to current directory)
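
As an illustration, a typical hook workflow might look like the following. The file name is a placeholder, and the exact post-commit behavior depends on your Penify setup:

```bash
# Install the post-commit hook in the current repository
penify docgen install-hook

# Commit as usual; the post-commit hook is then expected to trigger
# documentation generation for the new commit
git add src/app.py        # placeholder file
git commit -m "Add input validation"

# Remove the hook when it is no longer needed
penify docgen uninstall-hook
```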

## Authentication

Penify CLI uses an API token for authentication with advanced features.

If no token is available and you try to access an advanced feature, you'll be prompted to log in.

## Local LLM Configuration

For commit message generation, Penify can use a local LLM. Configure it using:

```bash
penify config llm --model MODEL_NAME --api-base API_URL --api-key API_KEY
```

Common configurations:

* OpenAI: `--model gpt-3.5-turbo --api-base https://api.openai.com/v1 --api-key YOUR_KEY`

* Anthropic: `--model claude-2 --api-base https://api.anthropic.com --api-key YOUR_KEY`
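
For a fully local setup, any OpenAI-compatible endpoint should work with the same flags. The model name and URL below (an Ollama-style server on localhost) are illustrative assumptions, not values documented by Penify:

```bash
# Hypothetical local, OpenAI-compatible server; adjust model, URL, and key to your setup
penify config llm --model llama3 --api-base http://localhost:11434/v1 --api-key local-key
```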

## JIRA Integration

Configure JIRA integration to enhance commit messages with issue details:

```bash
penify config jira --url https://your-domain.atlassian.net --username your-email@example.com --api-token YOUR_API_TOKEN
```

## Development

To set up the development environment:

* Clone the repository:

  ```bash
  git clone https://github.com/SingularityX-ai/penify-cli.git
  ```

* Install the package in editable mode:

  ```bash
  pip install -e .
  ```
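
As a quick smoke test of the editable install, you can invoke the CLI's help output. The `--help` flag is assumed here (standard for Python argument parsers) rather than taken from the documentation above:

```bash
penify --help
```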

## Running Tests

```bash
pytest
```

## License

This project is licensed under the MIT License.

## Author

Suman Saurabh ([ss.sumansaurabh92@gmail.com](mailto:ss.sumansaurabh92@gmail.com))

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## Issues

If you encounter any problems or have suggestions, please file an issue on the [GitHub repository](https://github.com/SingularityX-ai/penify/issues).

## Support

For automated API Documentation, Architecture Documentation, Code Documentation, Pull Request Documentation, or if you need a demo, please join our [Discord support channel](https://discord.gg/wqrc8JeV).

# page `md__tmp_github_reposRepoArchDocGenContext_Penify_dev_penify_cli_docs_commit_commands` {#md__tmp_github_reposRepoArchDocGenContext_Penify_dev_penify_cli_docs_commit_commands}

The `commit` command allows you to generate smart, AI-powered commit messages for your Git changes. This document explains all available options and combinations.

## Basic Usage

```bash
penify commit
```

By default, this command:

* Analyzes your staged Git changes

* Generates a concise commit title only

* Uses local LLM if configured, or falls back to Penify API

## Command Options

### `-m, --message`

Provide context for the commit message generation:

```bash
penify commit -m "Fix login flow"
```

This hint helps the AI understand your intention and improves the quality of the generated message.

### `-e, --terminal`

Open an editor to review and edit the generated commit message before committing:

```bash
penify commit -e
```

This opens your default Git editor with the generated message for review.

### `-d, --description`

Generate a detailed commit message with both title and description:

```bash
penify commit -d
```

Without this flag, only the commit title is generated.

## Option Combinations

You can combine these options for different workflows:

### Generate Title Only with Context

```bash
penify commit -m "Update login UI"
```

### Generate Title and Description with Context

```bash
penify commit -m "Update login UI" -d
```

### Generate and Edit Full Commit Message

```bash
penify commit -d -e
```

### Generate, Edit, and Provide Context

```bash
penify commit -m "Refactor authentication" -d -e
```

## LLM and JIRA Integration

### Using Local LLM

If you've configured a local LLM using `penify config llm`, the `commit` command will automatically use it for message generation (a short end-to-end sketch follows the list below).

Benefits:

* Privacy: your code changes don't leave your machine

* Speed: no network latency

* Works offline
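
A minimal end-to-end sketch, reusing the OpenAI-style configuration values from the README (substitute your own model, endpoint, and key):

```bash
# Configure the LLM once (values mirror the README's example configuration)
penify config llm --model gpt-3.5-turbo --api-base https://api.openai.com/v1 --api-key YOUR_KEY

# Later, commit-message generation picks up this LLM automatically
git add .
penify commit -d
```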

### JIRA Enhancement

If you've configured JIRA integration using `penify config jira`, the `commit` command will:

* Detect JIRA issue references in your changes

* Fetch issue details from your JIRA instance

* Include issue information in the commit message

* Format the commit message according to JIRA's smart commit format

Example output:
```text
PROJ-123: Fix authentication bug in login flow

- Updated OAuth token validation
- Fixed session timeout handling
- Added unit tests for edge cases

[PROJ-123]
```

## Configuration Requirements

For the `commit` command to work, the following must be in place (a consolidated setup sketch follows the list):

* You must have configured either:

  * Local LLM via `penify config llm`, OR

  * Logged in via `penify login`

* For JIRA enhancement (optional):

  * Configure JIRA via `penify config jira`
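
Putting the requirements together, a one-time setup might look like this; the commands and flags come from the configuration docs, and the JIRA values are placeholders:

```bash
# Auth path A: configure a local LLM ...
penify config llm --model gpt-3.5-turbo --api-base https://api.openai.com/v1 --api-key YOUR_KEY
# ... or auth path B: log in to Penify
penify login

# Optional: enable JIRA enhancement of commit messages
penify config jira --url https://your-domain.atlassian.net --username your-email@example.com --api-token YOUR_API_TOKEN
```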

## Examples

### Basic Commit with Default Settings

```bash
# Stage your changes
git add .

# Generate commit message
penify commit

# Commit with the generated message
git commit -m "Generated message here"
```

### Full Workflow with All Features

```bash
# Stage your changes
git add .

# Generate detailed commit message with JIRA integration,
# provide context, and open editor for review
penify commit -m "Fix login issue" -d -e

# The commit is automatically completed after you save and exit the editor
```

## Troubleshooting

### Common Issues

* **"No LLM model or API token provided"**

  * Run `penify config llm` to configure a local LLM, or

  * Run `penify login` to authenticate with Penify

* **"Failed to connect to JIRA"**

  * Check your JIRA configuration with `cat ~/.penify`

  * Verify your network connection

  * Ensure your JIRA credentials are valid

* **"Error initializing LLM client"**

  * Verify your LLM configuration settings

  * Ensure the LLM API is accessible
