From 5c62f9215ad46010615019662dcf484684e1bb5d Mon Sep 17 00:00:00 2001
From: subrataghosh123
Date: Fri, 20 Dec 2024 19:35:41 +0530
Subject: [PATCH] IBM Granite knowledge being added through MD

Signed-off-by: Subrata Ghosh
---
 .../granite/attribution.txt                        |  5 +
 .../large_language_models/granite/qna.yaml         | 94 +++++++++++++++++++
 2 files changed, 99 insertions(+)
 create mode 100644 knowledge/technology/large_language_models/granite/attribution.txt
 create mode 100644 knowledge/technology/large_language_models/granite/qna.yaml

diff --git a/knowledge/technology/large_language_models/granite/attribution.txt b/knowledge/technology/large_language_models/granite/attribution.txt
new file mode 100644
index 000000000..666e0fa9c
--- /dev/null
+++ b/knowledge/technology/large_language_models/granite/attribution.txt
@@ -0,0 +1,5 @@
+Title of work: IBM Granite knowledge
+Link to work: https://en.wikipedia.org/wiki/IBM_Granite
+Revision: https://en.wikipedia.org/w/index.php?title=IBM_Granite&oldid=1246833397
+License of the work: CC-BY-SA-4.0
+Creator names: Wikipedia Authors
diff --git a/knowledge/technology/large_language_models/granite/qna.yaml b/knowledge/technology/large_language_models/granite/qna.yaml
new file mode 100644
index 000000000..5f90c4b11
--- /dev/null
+++ b/knowledge/technology/large_language_models/granite/qna.yaml
@@ -0,0 +1,94 @@
+created_by: subrataghosh123
+version: 3
+domain: large-language-model
+document_outline: Knowledge contribution for IBM Granite as a test
+seed_examples:
+  - context: >-
+      IBM Granite is a series of decoder-only AI foundation models created by
+      IBM. It was announced on September 7, 2023, and an initial paper was
+      published 4 days later.
+    questions_and_answers:
+      - question: What is IBM Granite?
+        answer: >-
+          IBM Granite is a series of decoder-only AI foundation models created
+          by IBM.
+      - question: When was IBM Granite announced?
+        answer: IBM Granite was announced on September 7, 2023.
+      - question: What is a series of decoder-only AI foundation models created by IBM?
+        answer: IBM Granite
+  - context: >-
+      A foundation model is an AI model trained on broad data at scale such
+      that it can be adapted to a wide range of downstream tasks.
+
+      Granite's first foundation models were Granite.13b.instruct and
+      Granite.13b.chat. The "13b" in their name comes from 13 billion, the
+      number of parameters they have as models, fewer than most of the larger
+      models of the time. Later models vary from 3 to 34 billion parameters.
+    questions_and_answers:
+      - question: What is a foundation model?
+        answer: >-
+          A foundation model is an AI model trained on broad data at scale
+          such that it can be adapted to a wide range of downstream tasks.
+      - question: What were Granite's first foundation models?
+        answer: Granite.13b.instruct and Granite.13b.chat
+      - question: What does "13b" in the name of the foundation model refer to?
+        answer: It refers to 13 billion, the number of parameters the models have.
+  - context: >-
+      On May 6, 2024, IBM released the source code of four variations of
+      Granite Code Models under Apache 2, an open source permissive license
+      that allows completely free use, modification and sharing of the
+      software, and put them on Hugging Face for public use. According to
+      IBM's own report, Granite 8b outperforms Llama 3 on several
+      coding-related tasks within a similar range of parameters.
+    questions_and_answers:
+      - question: When did IBM release the source code for Granite Code Models?
+        answer: May 6, 2024
+      - question: What is Apache 2?
+        answer: >-
+          Apache 2 is an open source permissive license that allows completely
+          free use, modification and sharing of the software.
+      - question: >-
+          How did IBM release the source code for Granite Code Models?
+        answer: >-
+          IBM released the Granite Code Models under Apache 2, an open source
+          permissive license that allows completely free use, modification and
+          sharing of the software, and put them on Hugging Face for public use.
+  - context: >-
+      According to IBM's own report, Granite 8b outperforms Llama 3 on several
+      coding-related tasks within a similar range of parameters. An open
+      source permissive license allows completely free use, modification and
+      sharing of the software.
+    questions_and_answers:
+      - question: Which performs better, Granite 8b or Llama 3?
+        answer: According to IBM's own report, Granite 8b outperforms Llama 3.
+      - question: In which areas does Granite 8b outperform Llama 3?
+        answer: >-
+          Granite 8b outperforms Llama 3 on several coding-related tasks
+          within a similar range of parameters.
+      - question: What is an open source permissive license?
+        answer: >-
+          An open source permissive license allows completely free use,
+          modification and sharing of the software.
+  - context: >-
+      IBM Granite was initially intended for use in IBM's cloud-based data
+      and generative AI platform Watsonx along with other models; IBM later
+      opened the source code of some code models. Granite models are trained
+      on datasets curated from the Internet, academic publications, code
+      datasets, and legal and finance documents.
+    questions_and_answers:
+      - question: What was IBM Granite originally intended for?
+        answer: >-
+          IBM Granite was initially intended for use in IBM's cloud-based data
+          and generative AI platform Watsonx, along with other models.
+      - question: What are Granite models trained on?
+        answer: >-
+          Granite models are trained on datasets curated from the Internet,
+          academic publications, code datasets, and legal and finance
+          documents.
+      - question: Is IBM Granite publicly available?
+        answer: IBM opened the source code of some code models, including Granite.
+document:
+  repo: https://github.com/subrataghosh123/taxonomy-knowledge-docs
+  commit: 512c3583bc936fa422b9ca865503274d41f943b1
+  patterns:
+    - IBM_Granite-20241220T140517274.md
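A quick way to exercise the qna.yaml structure added above is to load it and check its shape. The sketch below is only an illustration, not part of the patch: it assumes PyYAML is installed, that the file sits at the path created by this commit, and that the expected shape (five contexts with three question/answer pairs each) simply mirrors the seed_examples in the diff rather than a rule enforced by any particular tool.

import yaml  # PyYAML, assumed to be available

# Path of the file created by this patch.
path = "knowledge/technology/large_language_models/granite/qna.yaml"

with open(path, encoding="utf-8") as f:
    data = yaml.safe_load(f)

# Top-level keys this contribution uses.
for key in ("created_by", "version", "domain", "document_outline",
            "seed_examples", "document"):
    assert key in data, f"missing top-level key: {key}"

# Each seed example in this patch carries a context and three Q&A pairs.
for i, example in enumerate(data["seed_examples"], start=1):
    assert example.get("context"), f"seed example {i} has an empty context"
    qna = example.get("questions_and_answers", [])
    assert len(qna) == 3, f"seed example {i} has {len(qna)} Q&A pairs, expected 3"
    for pair in qna:
        assert pair.get("question") and pair.get("answer"), \
            f"incomplete Q&A pair in seed example {i}"

total = sum(len(e["questions_and_answers"]) for e in data["seed_examples"])
print(f"OK: {len(data['seed_examples'])} contexts, {total} Q&A pairs")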