
Commit

update readme
neph1 committed Nov 27, 2023
1 parent efab767 commit 3859892
Showing 1 changed file with 7 additions and 62 deletions.
69 changes: 7 additions & 62 deletions README.md
@@ -4,7 +4,7 @@
</p>
<p align="center"> Welcome to LlamaTale! </p>

This fork uses an LLM backend to describe things in a more vivid way. (LLM not included)
This is a fork of the discontinued IF/MUD framework Tale. It uses an LLM backend to describe things in a more vivid way. (LLM not included)

The goal is to merge the structure of MU* frameworks with the creative and generative abilities of LLMs for the ultimate immersive roleplaying experience.

@@ -16,19 +16,16 @@ I try to test before each release, but I'm just one developer. If something does


Features (see https://github.com/neph1/LlamaTale/releases for a more up-to-date list):
* When the MUD describes something, through commands like 'look' or other verbs, it sends the original description as a prompt to the LLM. The LLM either has free rein to write as much as it wants or, for some things, is limited to a certain number of tokens so as not to cause too many delays.
* When the MUD describes something, through commands like 'look' or other verbs, it sends the original description as a prompt to the LLM, which (hopefully) generates a more vivid and verbose one. (A minimal sketch of this flow follows the feature list.)
* Dialogue replies are generated by the LLM, based on the story, the characters, and their sentiment.
* The LLM (tries to) analyze whether items have been handed over during the dialogue, and trigger the exchange in the MUD.
* Since v0.8.0, two characters are entirely generated by the LLM, and sent to walk around The Prancing Llama.
* Since https://github.com/neph1/LlamaTale/releases/tag/v0.13.0, it can generate a wholly custom story based on prompts.
* Combat is decided by the MUD, then acted out by the LLM.
* It has a 'rolling memory' of previous generations, which is inserted before the instructions in the prompt with each request. This should help it keep the context. (As of v0.8.0, this is turned off in most cases, as it forced the prompt too much.)
* Characters have a short memory for conversations to keep them on track, as well as a flexible 'sentiment' towards other characters.
* Locations outside The Prancing Llama are entirely generated by the LLM.
* Characters have memories of past events and conversations, as well as a flexible 'sentiment' towards other characters.
* Locations outside (south of) The Prancing Llama are entirely generated by the LLM.
* Support for KoboldCpp and OpenAI API-based backends.
* Since https://github.com/neph1/LlamaTale/releases/tag/v0.13.0, it can generate a wholly custom story based on some input prompts.
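
To make the description flow above concrete, here is a minimal, hypothetical sketch of the idea: wrap the original text in an instruction and ask the backend to rewrite it. The function name, prompt wording and parameters are illustrative only and are not LlamaTale's actual code; the sketch assumes a locally running KoboldCpp-style backend exposing its `/api/v1/generate` endpoint on the default port.

```python
# Illustrative sketch only (not LlamaTale's actual implementation).
# Assumes a KoboldCpp-style backend listening on its default port and
# exposing the /api/v1/generate JSON endpoint.
import requests

BACKEND_URL = "http://localhost:5001/api/v1/generate"

def evoke_description(original: str, max_tokens: int = 200) -> str:
    """Ask the LLM to rewrite a plain room description more vividly."""
    prompt = (
        "Rewrite the following location description in a more vivid and "
        "verbose style, keeping every stated fact intact:\n\n"
        f"{original}\n"
    )
    response = requests.post(
        BACKEND_URL,
        json={"prompt": prompt, "max_length": max_tokens, "temperature": 0.7},
        timeout=60,
    )
    response.raise_for_status()
    # KoboldCpp replies with {"results": [{"text": "..."}]}
    return response.json()["results"][0]["text"].strip()

if __name__ == "__main__":
    print(evoke_description("A dimly lit tavern. A fire crackles in the hearth."))
```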


Example:
Old Example:

Here's the kind of output you can expect. "Original" is the written description. "Generated" is what the LLM produces.

@@ -67,7 +64,7 @@ Scenario background story: After a grizzling journey through the snow storm you
eager to join in the merriment and warmth that filled the room.


Excerpt from talking to Elid (prompting with his ‘character card’):
Excerpt from talking to Elid:


You say: elid: ’what are you doing here?’.
@@ -114,55 +111,3 @@ You say: elid: ’maybe you’re right.. do you charge a fee for your
you think? Are you willing to take a risk for something truly
unique?””
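
The dialogue features listed earlier (character cards, conversation memory, sentiment) can be pictured as assembling a single prompt per reply, as in the excerpt above. The sketch below is hypothetical: the card fields, prompt layout and Elid's details are invented for illustration and are not LlamaTale's actual format; the resulting string would be sent to the same backend as in the earlier sketch.

```python
# Hypothetical illustration of assembling a dialogue prompt from a character
# card, a sentiment value and a short conversation memory. Field names,
# layout and the character details are invented; not LlamaTale's actual format.
from typing import Dict, List

elid_card: Dict[str, str] = {
    "name": "Elid",
    "description": "a sly, fast-talking patron of The Prancing Llama",  # invented
    "occupation": "trader of dubious goods",  # invented
}

def build_dialogue_prompt(card: Dict[str, str], sentiment: str,
                          memory: List[str], player_line: str) -> str:
    """Combine card, sentiment and the last few exchanges into one prompt."""
    history = "\n".join(memory[-5:])  # short rolling window keeps the prompt small
    return (
        f"You are {card['name']}, {card['description']}. "
        f"Occupation: {card['occupation']}.\n"
        f"Your sentiment towards the player is: {sentiment}.\n"
        f"Recent conversation:\n{history}\n"
        f"Player says: {player_line}\n"
        "Reply in character, in one or two sentences."
    )

prompt = build_dialogue_prompt(
    elid_card,
    sentiment="wary but curious",
    memory=["Player: hello", "Elid: Well met, stranger."],
    player_line="what are you doing here?",
)
print(prompt)  # this string would then be sent to the LLM backend
```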






ORIGINAL README for Tale (public archive):


----------------------



[![saythanks](https://img.shields.io/badge/say-thanks-ff69b4.svg)](https://saythanks.io/to/irmen)
[![Build Status](https://travis-ci.org/irmen/Tale.svg?branch=master)](https://travis-ci.org/irmen/Tale)
[![Latest Version](https://img.shields.io/pypi/v/tale.svg)](https://pypi.python.org/pypi/tale/)

![Tale logo](docs/source/_static/tale-large.png)

'Tale' - mud, mudlib & interactive fiction framework [frozen]
=============================================================

This software is copyright (c) by Irmen de Jong ([email protected]).

This software is released under the GNU LGPL v3 software license.
This license, including disclaimer, is available in the 'LICENSE.txt' file.



Tale requires Python 3.5 or newer.
(If you have an older version of Python, stick to Tale 2.8 or older, which still supports Python 2.7 as well)

Required third party libraries:
- ``appdirs`` (to load and save games and config data in the correct folder).
- ``colorama`` (for stylized console output)
- ``serpent`` (to be able to create save game data from the game world)
- ``smartypants`` (for nicely quoted string output)

Optional third party library:
- ``prompt_toolkit`` (provides a nicer console text interface experience)

Read the documentation for more details on how to get started; see http://tale.readthedocs.io/

EXAMPLE STORIES
---------------

There is a trivial example built into tale; you can start it when you have the library installed
by simply typing: ``python -m tale.demo.story``
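
For completeness, the same demo can also be started from inside a Python session; this is just a hedged equivalent of the ``python -m`` invocation above and assumes the ``tale`` package is installed.

```python
# Equivalent of `python -m tale.demo.story`, run from Python code.
# Assumes the tale package is installed in the current environment.
import runpy

runpy.run_module("tale.demo.story", run_name="__main__")
```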

On github and in the source distribution there are several much larger [example stories](stories/) and MUD examples.
* 'circle' - MUD that interprets CircleMud's data files and builds the world from those
* 'demo' - a random collection of stuff including a shop with some people
* 'zed_is_me' - a small single player (interactive fiction) survival adventure

