feat: add Gemma 3 and better command and LLM structure #36
Conversation
🦋 Changeset detected

Latest commit: 4e423dc

The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Deploying bashbuddy-landing with Cloudflare Pages

| Latest commit | 7f1a5aa |
|---|---|
| Status | ✅ Deploy successful! |
| Preview URL | https://366244a0.bashbuddy-landing.pages.dev |
| Branch Preview URL | https://pol-w-52-add-gemma-3.bashbuddy-landing.pages.dev |
Pull Request Overview
This PR introduces support for Gemma 3 along with a refactoring of the command and LLM inference structure to improve maintainability and ease of model integration.
- Updated role enums from "assistant" to "model" across multiple modules.
- Refactored LLM interfaces and inference functions to work with an array of messages instead of separate prompt strings.
- Added new model constants and enhanced the CLI command structure with conversation state handling.
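The messages-array refactor described above could look roughly like the following. This is a minimal TypeScript sketch; the type and class names are assumptions for illustration and are not taken from the actual diff (only `infer` and the `"model"` role appear in the review summary):

```typescript
// Hypothetical sketch of the refactored LLM interface.
type Role = "system" | "user" | "model";

interface Message {
  role: Role;
  content: string;
}

interface LLM {
  // Inference receives the whole conversation as an array of messages
  // instead of separate prompt strings, so multi-turn state lives in
  // one place.
  infer(messages: Message[]): Promise<string>;
}

// Toy implementation for illustration: echoes the last message.
class EchoLLM implements LLM {
  async infer(messages: Message[]): Promise<string> {
    return `echo: ${messages[messages.length - 1].content}`;
  }
}
```

Keeping the conversation as a single array makes it straightforward to plug in new models: each backend only has to translate the shared `Message[]` shape into its own prompt format.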
Reviewed Changes
Copilot reviewed 9 out of 15 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| packages/redis/src/index.ts | Changed role enum and updated UUID generation method. |
| packages/api/src/router/chat.ts | Refactored GroqLLM interface and message handling for LLM inputs. |
| packages/agent/src/index.ts | Updated LLM interface to use messages arrays and reordered prompt exports. |
| apps/cli/src/utils/models.ts | Added new model constants (including Gemma 3) and removed duplicate entries. |
| apps/cli/src/llms/parser.ts | Enhanced YAML response parsing with code block marker removal. |
| apps/cli/src/llms/localllm.ts | Updated infer method to work with message arrays and added system prompt handling. |
| apps/cli/src/commands/ask.ts | Refactored CLI command flow to use a conversation state and improved inference processing. |
| .changeset/stale-buckets-roll.md | Updated the changeset note to reflect new Gemma 3 support and command structure improvements. |
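The code-block-marker removal mentioned for parser.ts might look something like this minimal sketch (the function name and regexes are hypothetical; the actual implementation in apps/cli/src/llms/parser.ts may differ):

```typescript
// Hypothetical sketch: strip Markdown code fences that LLMs often wrap
// around structured output, so the reply parses as plain YAML.
function stripCodeFences(raw: string): string {
  return raw
    .trim()
    .replace(/^```(?:ya?ml)?\s*\n?/, "") // drop a leading ```yaml marker
    .replace(/\n?```\s*$/, "");          // drop a trailing ``` marker
}
```

A response already free of fences passes through unchanged, so the cleanup is safe to apply unconditionally before handing the text to a YAML parser.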
Files not reviewed (6)
- apps/cli/package.json: Language not supported
- apps/landing/src/components/Header.svelte: Language not supported
- apps/landing/src/lib/posts/gemma-3-arrives-to-bashbuddy.svx: Language not supported
- apps/landing/src/routes/blog/[slug]/+page.svelte: Language not supported
- packages/redis/package.json: Language not supported
- pnpm-lock.yaml: Language not supported
Comments suppressed due to low confidence (3)
apps/cli/src/utils/models.ts:92
- [nitpick] Consider removing profanity from the comment to maintain a professional tone.
// It's fucking stupid
apps/cli/src/llms/localllm.ts:78
- [nitpick] Consider avoiding in-place mutation of the last user message; instead, create a new message object when modifying its content to prevent unintended side effects in the conversation state.
if (!supportsSystemPrompt(this.model) && messages.length == 2) {
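The non-mutating alternative the comment suggests could be sketched as follows (hypothetical names; the real message shape and conversation state in apps/cli/src/llms/localllm.ts may differ):

```typescript
// Hypothetical sketch of the suggested fix for localllm.ts.
interface ChatMessage {
  role: "system" | "user" | "model";
  content: string;
}

// For models without system-prompt support, fold the system prompt
// into a *copy* of the first user message instead of mutating it, so
// earlier references to the conversation state remain unchanged.
function mergeSystemPrompt(
  systemPrompt: string,
  messages: ChatMessage[],
): ChatMessage[] {
  const [first, ...rest] = messages;
  const merged: ChatMessage = {
    ...first,
    content: `${systemPrompt}\n\n${first.content}`,
  };
  return [merged, ...rest];
}
```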
packages/api/src/router/chat.ts:57
- [nitpick] Consider adding a comment explaining why the 'model' role is being converted to 'assistant' to clarify the intent and maintain consistency for future developers.
role: message.role === "model" ? "assistant" : message.role,
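The conversion the comment refers to is presumably needed because Groq exposes an OpenAI-compatible chat API, which accepts `"assistant"` rather than `"model"` as a role. A hedged sketch of that mapping (type and function names are illustrative, not from the diff):

```typescript
// Hypothetical sketch: map the internal "model" role back to the
// "assistant" role expected by an OpenAI-compatible chat endpoint.
type InternalRole = "system" | "user" | "model";
type ApiRole = "system" | "user" | "assistant";

function toApiRole(role: InternalRole): ApiRole {
  return role === "model" ? "assistant" : role;
}
```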
This PR adds Gemma 3, along with improvements to the command code and a better LLM structure. Previously we were doing some weird things; now we abstract more, making it easier to add new models.