🚀 feat: Support for GPT-3.5 Turbo/0125 Model (#1704)
* 🚀 feat: Support for GPT-3.5 Turbo/0125 Model

* ci: fix tx test
danny-avila authored Feb 2, 2024
1 parent 30e143e commit 8479ac7
Showing 7 changed files with 15 additions and 4 deletions.
4 changes: 2 additions & 2 deletions .env.example
@@ -101,7 +101,7 @@ GOOGLE_KEY=user_provided
#============#

OPENAI_API_KEY=user_provided
-# OPENAI_MODELS=gpt-3.5-turbo-0301,gpt-3.5-turbo,gpt-4,gpt-4-0613,gpt-4-vision-preview,gpt-3.5-turbo-0613,gpt-3.5-turbo-16k-0613,gpt-4-0125-preview,gpt-4-turbo-preview,gpt-4-1106-preview,gpt-3.5-turbo-1106,gpt-3.5-turbo-instruct,gpt-3.5-turbo-instruct-0914,gpt-3.5-turbo-16k
+# OPENAI_MODELS=gpt-3.5-turbo-0125,gpt-3.5-turbo-0301,gpt-3.5-turbo,gpt-4,gpt-4-0613,gpt-4-vision-preview,gpt-3.5-turbo-0613,gpt-3.5-turbo-16k-0613,gpt-4-0125-preview,gpt-4-turbo-preview,gpt-4-1106-preview,gpt-3.5-turbo-1106,gpt-3.5-turbo-instruct,gpt-3.5-turbo-instruct-0914,gpt-3.5-turbo-16k

DEBUG_OPENAI=false

@@ -127,7 +127,7 @@ DEBUG_OPENAI=false
# Plugins #
#============#

-# PLUGIN_MODELS=gpt-4,gpt-4-turbo-preview,gpt-4-0125-preview,gpt-4-1106-preview,gpt-4-0613,gpt-3.5-turbo,gpt-3.5-turbo-1106,gpt-3.5-turbo-0613
+# PLUGIN_MODELS=gpt-4,gpt-4-turbo-preview,gpt-4-0125-preview,gpt-4-1106-preview,gpt-4-0613,gpt-3.5-turbo,gpt-3.5-turbo-0125,gpt-3.5-turbo-1106,gpt-3.5-turbo-0613

DEBUG_PLUGINS=true

3 changes: 3 additions & 0 deletions api/models/tx.js
@@ -12,6 +12,7 @@ const tokenValues = {
'16k': { prompt: 3, completion: 4 },
'gpt-3.5-turbo-1106': { prompt: 1, completion: 2 },
'gpt-4-1106': { prompt: 10, completion: 30 },
+'gpt-3.5-turbo-0125': { prompt: 0.5, completion: 1.5 },
};

/**
@@ -29,6 +30,8 @@ const getValueKey = (model, endpoint) => {

if (modelName.includes('gpt-3.5-turbo-16k')) {
return '16k';
+} else if (modelName.includes('gpt-3.5-turbo-0125')) {
+return 'gpt-3.5-turbo-0125';
} else if (modelName.includes('gpt-3.5-turbo-1106')) {
return 'gpt-3.5-turbo-1106';
} else if (modelName.includes('gpt-3.5')) {
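
The new 'gpt-3.5-turbo-0125' entry in tokenValues and the matching branch in getValueKey are what getMultiplier resolves against (see the spec change below). A minimal sketch of turning those multipliers into a dollar estimate, assuming the prompt/completion values track OpenAI's USD-per-1M-token prices (0.5/1.5 matches the published gpt-3.5-turbo-0125 rate) and using a hypothetical require path:

```js
// Sketch only, not part of this commit. Assumes getMultiplier is exported from
// api/models/tx.js and that multipliers are USD per 1,000,000 tokens.
const { getMultiplier } = require('./api/models/tx'); // path is an assumption

function estimateCostUSD({ model, promptTokens, completionTokens }) {
  const promptRate = getMultiplier({ tokenType: 'prompt', model });
  const completionRate = getMultiplier({ tokenType: 'completion', model });
  return (promptTokens * promptRate + completionTokens * completionRate) / 1e6;
}

// 10,000 prompt + 2,000 completion tokens on the new model:
// 10000 * 0.5 / 1e6 + 2000 * 1.5 / 1e6 = 0.005 + 0.003 = $0.008
console.log(estimateCostUSD({ model: 'gpt-3.5-turbo-0125', promptTokens: 10000, completionTokens: 2000 }));
```
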
3 changes: 3 additions & 0 deletions api/models/tx.spec.js
@@ -90,6 +90,9 @@ describe('getMultiplier', () => {
expect(getMultiplier({ tokenType: 'completion', model: 'gpt-4-turbo-vision-preview' })).toBe(
tokenValues['gpt-4-1106'].completion,
);
+expect(getMultiplier({ tokenType: 'completion', model: 'gpt-3.5-turbo-0125' })).toBe(
+tokenValues['gpt-3.5-turbo-0125'].completion,
+);
});

it('should return defaultRate if derived valueKey does not match any known patterns', () => {
1 change: 1 addition & 0 deletions api/utils/tokens.js
@@ -55,6 +55,7 @@ const openAIModels = {
'gpt-3.5-turbo-16k': 16375, // -10 from max
'gpt-3.5-turbo-16k-0613': 16375, // -10 from max
'gpt-3.5-turbo-1106': 16375, // -10 from max
+'gpt-3.5-turbo-0125': 16375, // -10 from max
'mistral-': 31990, // -10 from max
};

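
The 16375 limit mirrors the other 16k-context entries ("-10 from max", i.e. ten tokens of headroom under the advertised maximum). A minimal sketch of using such a limit to clamp a completion budget, assuming getModelMaxTokens is exported from api/utils/tokens.js (the spec change below calls it) and returns undefined for unrecognized models:

```js
// Sketch only, not part of this commit. The import path and the
// undefined-for-unknown-models behavior are assumptions.
const { getModelMaxTokens } = require('./api/utils/tokens'); // path is an assumption

function clampCompletionBudget(model, promptTokens, requestedMaxTokens) {
  const contextWindow = getModelMaxTokens(model) ?? 4095; // fallback value is an assumption
  // The map already leaves 10 tokens of headroom, so no extra margin is added here.
  return Math.max(0, Math.min(requestedMaxTokens, contextWindow - promptTokens));
}

// gpt-3.5-turbo-0125 resolves to 16375, so a 12,000-token prompt leaves 4,375:
console.log(clampCompletionBudget('gpt-3.5-turbo-0125', 12000, 8000)); // 4375
```
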
3 changes: 3 additions & 0 deletions api/utils/tokens.spec.js
@@ -92,6 +92,9 @@ describe('getModelMaxTokens', () => {
expect(getModelMaxTokens('gpt-4-0125-preview')).toBe(
maxTokensMap[EModelEndpoint.openAI]['gpt-4-0125'],
);
+expect(getModelMaxTokens('gpt-3.5-turbo-0125')).toBe(
+maxTokensMap[EModelEndpoint.openAI]['gpt-3.5-turbo-0125'],
+);
});

test('should return correct tokens for Anthropic models', () => {
4 changes: 2 additions & 2 deletions docs/install/configuration/dotenv.md
@@ -317,7 +317,7 @@ DEBUG_OPENAI=false
- Leave it blank or commented out to use internal settings.

```bash
-OPENAI_MODELS=gpt-3.5-turbo-0301,gpt-3.5-turbo,gpt-4,gpt-4-0613,gpt-4-vision-preview,gpt-3.5-turbo-0613,gpt-3.5-turbo-16k-0613,gpt-4-0125-preview,gpt-4-turbo-preview,gpt-4-1106-preview,gpt-3.5-turbo-1106,gpt-3.5-turbo-instruct,gpt-3.5-turbo-instruct-0914,gpt-3.5-turbo-16k
+OPENAI_MODELS=gpt-3.5-turbo-0125,gpt-3.5-turbo-0301,gpt-3.5-turbo,gpt-4,gpt-4-0613,gpt-4-vision-preview,gpt-3.5-turbo-0613,gpt-3.5-turbo-16k-0613,gpt-4-0125-preview,gpt-4-turbo-preview,gpt-4-1106-preview,gpt-3.5-turbo-1106,gpt-3.5-turbo-instruct,gpt-3.5-turbo-instruct-0914,gpt-3.5-turbo-16k
```

- Titling is enabled by default when initiating a conversation.
@@ -383,7 +383,7 @@ Here are some useful documentation about plugins:
- Identify the available models, separated by commas **without spaces**. The first model in the list will be set as default. Leave it blank or commented out to use internal settings.

```bash
-PLUGIN_MODELS=gpt-4,gpt-4-turbo-preview,gpt-4-0125-preview,gpt-4-1106-preview,gpt-4-0613,gpt-3.5-turbo,gpt-3.5-turbo-1106,gpt-3.5-turbo-0613
+PLUGIN_MODELS=gpt-4,gpt-4-turbo-preview,gpt-4-0125-preview,gpt-4-1106-preview,gpt-4-0613,gpt-3.5-turbo,gpt-3.5-turbo-0125,gpt-3.5-turbo-1106,gpt-3.5-turbo-0613
```

- Set to false or comment out to disable debug mode for plugins
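
As the documentation text above notes, the comma-separated model lists are order-sensitive: the first entry becomes the default model. A small sketch of that behavior; the split-on-comma parsing shown here is an assumption about how such a list is typically consumed, not LibreChat's actual parser:

```js
// Sketch only: illustrates why the docs say "separated by commas **without spaces**"
// and why gpt-3.5-turbo-0125 is placed first. Parsing details are an assumption.
const raw = process.env.OPENAI_MODELS ?? 'gpt-3.5-turbo-0125,gpt-3.5-turbo-0301,gpt-3.5-turbo';
const models = raw.split(','); // a stray space would end up inside a model name
const defaultModel = models[0]; // first entry in the list is used as the default
console.log(defaultModel); // 'gpt-3.5-turbo-0125'
```
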
1 change: 1 addition & 0 deletions packages/data-provider/src/config.ts
@@ -91,6 +91,7 @@ export const defaultModels = {
'claude-instant-1-100k',
],
[EModelEndpoint.openAI]: [
+'gpt-3.5-turbo-0125',
'gpt-3.5-turbo-16k-0613',
'gpt-3.5-turbo-16k',
'gpt-4-turbo-preview',
