
[Fine-tuning][Inference] Integrate mistralai/Mixtral-8x7B-Instruct-v0.1 for CPU #790

name: PR

on:
  pull_request:
    branches:
      - main
    paths:
      - '.github/**'
      - 'docker/**'
      - 'dev/docker/**'
      - 'llm_on_ray/common/**'
      - 'llm_on_ray/finetune/**'
      - 'llm_on_ray/inference/**'
      - 'llm_on_ray/rlhf/**'
      - 'tools/**'
      - 'pyproject.toml'
      - 'tests/**'

jobs:
  Lint:
    uses: ./.github/workflows/workflow_lint.yml
  Tests:
    needs: Lint
    uses: ./.github/workflows/workflow_tests.yml
  Inference:
    needs: Lint
    uses: ./.github/workflows/workflow_inference.yml
  Finetune:
    needs: Lint
    uses: ./.github/workflows/workflow_finetune.yml
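
Each job above delegates to a reusable workflow via `uses:`, which requires the called file to declare a `workflow_call` trigger. Below is a minimal sketch of what such a callable workflow (e.g. workflow_inference.yml) could look like; the `model` input and the run step are illustrative assumptions for this PR's Mixtral target, not the repository's actual steps.

name: Inference
on:
  workflow_call:
    inputs:
      model:
        # Illustrative input; the real workflow may define different inputs.
        type: string
        required: false
        default: 'mistralai/Mixtral-8x7B-Instruct-v0.1'
jobs:
  inference:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run inference check (illustrative placeholder)
        run: echo "Would run CPU inference for ${{ inputs.model }} here"

With this shape, the caller can pass a model per invocation (e.g. `with: { model: mistralai/Mixtral-8x7B-Instruct-v0.1 }`) while the `needs: Lint` dependency in the PR workflow ensures lint passes before inference and finetune jobs start.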