🧩 Quizzes and Adventures 🏰 with Character Codex and llamafile




Let’s build something fun with Character Codex, a newly released dataset featuring popular characters from a wide array of media types and genres…

We’ll be using Haystack for orchestration and llamafile to run our models locally.

We will first build a simple quiz game, in which the user is asked to guess the character based on some clues. Then we will try to get two characters to interact in a chat and maybe even have an adventure together!

Preparation

Install dependencies

! pip install haystack-ai datasets

Load and look at the Character Codex dataset

from datasets import load_dataset

dataset = load_dataset("NousResearch/CharacterCodex", split="train")
len(dataset)
15939
dataset[0]
{'media_type': 'Webcomics',
 'genre': 'Fantasy Webcomics',
 'character_name': 'Alana',
 'media_source': 'Saga',
 'description': 'Alana is one of the main characters from the webcomic "Saga." She is a strong-willed and fiercely protective mother who is on the run with her family in a war-torn galaxy. The story blends elements of fantasy and science fiction, creating a rich and complex narrative.',
 'scenario': "You are a fellow traveler in the galaxy needing help, and Alana offers her assistance while sharing stories of her family's struggles and triumphs."}

Ok, each row of this dataset contains some information about a character. It also includes a creative scenario, which we will not use.
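
If you want a quick feel for what the dataset covers, a small exploration sketch like the following can help (illustrative only, not part of the game code; it just uses standard datasets column access):

from collections import Counter

# count the most common media types in Character Codex (illustrative exploration)
media_type_counts = Counter(dataset["media_type"])
print(media_type_counts.most_common(5))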

llamafile: download and run the model

For our experiments, we will be using the Llama-3-8B-Instruct model: a small but capable language model.

llamafile is a project by Mozilla that simplifies access to LLMs. It wraps both the model and the inference engine in a single executable file.

We will use it to run our model.

llamafile is meant to run on standard computers. We will use a few tricks to make it work on Colab. For instructions on how to run it on your PC, check out the llamafile docs and the Haystack-llamafile integration page.
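
If you are working on your own machine instead of Colab, you could also start the server from Python with subprocess, reusing the same flags shown in the Colab cell further below (a rough sketch; it assumes the llamafile has already been downloaded and made executable, as done in the next cells):

import subprocess

# sketch: launch the llamafile server locally in the background,
# logging its output to llamafile.log (same flags as the Colab command below)
log_file = open("llamafile.log", "w")
server_process = subprocess.Popen(
    [
        "./Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile",
        "--server", "--nobrowser",
        "--port", "8081",
        "--n-gpu-layers", "999",
        "--ctx-size", "8192",
    ],
    stdout=log_file,
    stderr=subprocess.STDOUT,
)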

# download the model
!wget "https://huggingface.co/Mozilla/Meta-Llama-3-8B-Instruct-llamafile/resolve/main/Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile"
--2024-06-20 09:53:30--  https://huggingface.co/Mozilla/Meta-Llama-3-8B-Instruct-llamafile/resolve/main/Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile
Resolving huggingface.co (huggingface.co)... 18.239.50.103, 18.239.50.80, 18.239.50.49, ...
Connecting to huggingface.co (huggingface.co)|18.239.50.103|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://cdn-lfs-us-1.huggingface.co/... [following]
--2024-06-20 09:53:30--  https://cdn-lfs-us-1.huggingface.co/...
Resolving cdn-lfs-us-1.huggingface.co (cdn-lfs-us-1.huggingface.co)... 18.239.94.84, 18.239.94.6, 18.239.94.3, ...
Connecting to cdn-lfs-us-1.huggingface.co (cdn-lfs-us-1.huggingface.co)|18.239.94.84|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5768624435 (5.4G) [binary/octet-stream]
Saving to: ‘Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile’

Meta-Llama-3-8B-Ins 100%[===================>]   5.37G  57.2MB/s    in 1m 40s  

2024-06-20 09:55:11 (54.8 MB/s) - ‘Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile’ saved [5768624435/5768624435]
# make the llamafile executable
! chmod +x Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile

Running the model - relevant parameters:

  • --server: start an OpenAI-compatible server
  • --nobrowser: do not open the interactive interface in the browser
  • --port: port of the OpenAI-compatible server (in Colab, 8080 is already taken)
  • --n-gpu-layers: offload some layers to GPU for increased performance
  • --ctx-size: size of the prompt context
# we prepend "nohup" and append "&" to make the Colab cell run in the background
! nohup ./Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile \
        --server \
        --nobrowser \
        --port 8081 \
        --n-gpu-layers 999 \
        --ctx-size 8192 \
        > llamafile.log &
nohup: redirecting stderr to stdout
# we check the logs until the server has been started correctly
!while ! grep -q "llama server listening" llamafile.log; do tail -n 5 llamafile.log; sleep 10; done
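
If you prefer to wait for the server from Python rather than shell, a polling sketch along these lines works too (it assumes the OpenAI-compatible /v1/models endpoint is exposed on port 8081 and that the requests package is available):

import time
import requests

def wait_for_llamafile(url: str = "http://localhost:8081/v1/models", timeout: float = 600.0) -> bool:
    # poll the OpenAI-compatible endpoint until it answers or the timeout expires
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            if requests.get(url, timeout=5).status_code == 200:
                return True
        except requests.exceptions.RequestException:
            pass  # server not up yet
        time.sleep(10)
    return False

assert wait_for_llamafile(), "llamafile server did not start in time"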

Let’s try to interact with the model.

Since the server is OpenAI-compatible, we can use an OpenAIChatGenerator.

from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

generator = OpenAIChatGenerator(
    api_key=Secret.from_token("sk-no-key-required"),  # for compatibility with the OpenAI API, a placeholder api_key is needed
    model="LLaMA_CPP",
    api_base_url="http://localhost:8081/v1",
    generation_kwargs = {"max_tokens": 50}
)

generator.run(messages=[ChatMessage.from_user("How are you?")])
{'replies': [ChatMessage(content="I'm just a language model, I don't have emotions or feelings like humans do. However, I'm functioning properly and ready to assist you with any questions or tasks you may have. How can I help you today?<|eot_id|>", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'LLaMA_CPP', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 46, 'prompt_tokens': 14, 'total_tokens': 60}})]}

πŸ•΅οΈ Mystery Character Quiz

Now that everything is in place, we can build a simple game in which a random character is selected from the dataset and the LLM is used to create hints for the player.

Hint generation pipeline

This simple pipeline includes a ChatPromptBuilder and an OpenAIChatGenerator.

Thanks to the template messages, we can include the character information in the prompt, along with the previously given hints, so the model does not repeat them.

from haystack import Pipeline
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator

template_messages = [
    ChatMessage.from_system("You are a helpful assistant that gives brief hints about a character, without revealing the character's name."),
    ChatMessage.from_user("""Provide a brief hint (one fact only) for the following character.
                          {{character}}

                          Use the information provided before resorting to your own knowledge.
                          Do not repeat previously given hints.

                          {% if previous_hints| length > 0 %}
                            Previous hints:
                            {{previous_hints}}
                          {% endif %}""")
]

chat_prompt_builder = ChatPromptBuilder(template=template_messages, required_variables=["character"])

generator = OpenAIChatGenerator(
    api_key=Secret.from_token("sk-no-key-required"),  # for compatibility with the OpenAI API, a placeholder api_key is needed
    model="LLaMA_CPP",
    api_base_url="http://localhost:8081/v1",
    generation_kwargs = {"max_tokens": 100}
)

hint_generation_pipeline = Pipeline()
hint_generation_pipeline.add_component("chat_prompt_builder", chat_prompt_builder)
hint_generation_pipeline.add_component("generator", generator)
hint_generation_pipeline.connect("chat_prompt_builder", "generator")
<haystack.core.pipeline.pipeline.Pipeline object at 0x7c0f4a07f580>
🚅 Components
  - chat_prompt_builder: ChatPromptBuilder
  - generator: OpenAIChatGenerator
πŸ›€οΈ Connections
  - chat_prompt_builder.prompt -> generator.messages (List[ChatMessage])
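
Before wiring up the full game, you can sanity-check the pipeline with a single run (an illustrative smoke test using the first dataset entry):

# quick smoke test of the hint generation pipeline (illustrative)
sample_character = dict(dataset[0])      # Alana from "Saga"
sample_character.pop("scenario", None)   # we do not use the scenario field
result = hint_generation_pipeline.run({"character": sample_character, "previous_hints": []})
print(result["generator"]["replies"][0].content)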

The game

import random

MAX_HINTS = 3



random_character = random.choice(dataset)
# remove the scenario: we do not use it
del random_character["scenario"]

print("πŸ•΅οΈ Guess the character based on the hints!")

previous_hints = []

for hint_number in range(1, MAX_HINTS + 1):
    res = hint_generation_pipeline.run({"character": random_character, "previous_hints": previous_hints})
    hint = res["generator"]["replies"][0].content

    previous_hints.append(hint)
    print(f"✨ Hint {hint_number}: {hint}")


    guess = input("Your guess: \nPress Q to quit\n")

    if guess.lower() == 'q':
        break

    print("Guess: ", guess)

    if random_character['character_name'].lower() in guess.lower():
        print("πŸŽ‰ Congratulations! You guessed it right!")
        break
    else:
        print("❌ Wrong guess. Try again.")
else:
    print(f"πŸ™ Sorry, you've used all the hints. The character was {random_character['character_name']}.")
πŸ•΅οΈ Guess the character based on the hints!
✨ Hint 1: Here's a brief hint:

This actor has won an Academy Award for his role in a biographical sports drama film.<|eot_id|>
Your guess: 
Press Q to quit
Tom Cruise?
Guess:  Tom Cruise?
❌ Wrong guess. Try again.
✨ Hint 2: Here's a new hint:

This actor is known for his intense physical transformations to portray his characters, including a significant weight gain and loss for one of his most iconic roles.<|eot_id|>
Your guess: 
Press Q to quit
Brendan Fraser
Guess:  Brendan Fraser
❌ Wrong guess. Try again.
✨ Hint 3: Here's a new hint:

This actor has played a character who is a comic book superhero.<|eot_id|>
Your guess: 
Press Q to quit
Christian Bale
Guess:  Christian Bale
🎉 Congratulations! You guessed it right!
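
The guess check above relies on a plain substring match, so a small typo counts as a wrong answer. If you want something more forgiving, a fuzzy comparison along these lines could replace it (a sketch using difflib from the standard library; the threshold is arbitrary):

from difflib import SequenceMatcher

def is_correct_guess(guess: str, character_name: str, threshold: float = 0.8) -> bool:
    # accept close matches as well as exact substrings
    similarity = SequenceMatcher(None, guess.lower().strip(), character_name.lower()).ratio()
    return similarity >= threshold or character_name.lower() in guess.lower()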

💬 🤠 Chat Adventures

Let’s try something different now!

Character Codex is a large collection of characters, each with a specific description. Llama 3 8B Instruct is a capable model with some world knowledge.

We can try to combine them to simulate a dialogue and perhaps an adventure involving two different characters (fictional or real).

Character pipeline

Let’s create a character pipeline: ChatPromptBuilder + OpenAIChatGenerator.

This represents the core of our conversational system and will be invoked multiple times with different messages to simulate conversation.

from haystack import Pipeline
from haystack.dataclasses import ChatMessage, ChatRole
from haystack.utils import Secret

from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator

character_pipeline = Pipeline()
character_pipeline.add_component("chat_prompt_builder", ChatPromptBuilder(required_variables=["character_data"]))
character_pipeline.add_component("generator", OpenAIChatGenerator(
    api_key=Secret.from_token("sk-no-key-required"),  # for compatibility with the OpenAI API, a placeholder api_key is needed
    model="LLaMA_CPP",
    api_base_url="http://localhost:8081/v1",
    generation_kwargs = {"temperature": 1.5}
))
character_pipeline.connect("chat_prompt_builder", "generator")
<haystack.core.pipeline.pipeline.Pipeline object at 0x78dd00ce69e0>
🚅 Components
  - chat_prompt_builder: ChatPromptBuilder
  - generator: OpenAIChatGenerator
πŸ›€οΈ Connections
  - chat_prompt_builder.prompt -> generator.messages (List[ChatMessage])

Messages

We define the most relevant messages to steer our LLM engine.

  • System message (template): this instructs the Language Model to chat and act as a specific character.

  • Start message: we need to choose an initial message (and a first speaking character) to spin up the conversation.

We also define an invert_roles utility function: each character should see the other character's assistant messages as user messages (and vice versa), while system messages pass through unchanged.

system_message = ChatMessage.from_system("""You are: {{character_data['character_name']}}.
                                            Description of your character: {{character_data['description']}}.
                                            Stick to your character's personality and engage in a conversation with an unknown person. Don't make long monologues.""")

start_message = ChatMessage.from_user("Hello, who are you?")
from typing import List

def invert_roles(messages: List[ChatMessage]):
    inverted_messages = []
    for message in messages:
        if message.is_from(ChatRole.USER):
            inverted_messages.append(ChatMessage.from_assistant(message.content))
        elif message.is_from(ChatRole.ASSISTANT):
            inverted_messages.append(ChatMessage.from_user(message.content))
        else:
          inverted_messages.append(message)
    return inverted_messages
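
A quick illustration of what invert_roles does (user and assistant roles are swapped, system messages are left untouched):

# illustrative only
demo_messages = [
    ChatMessage.from_system("You are a character."),
    ChatMessage.from_user("Hello, who are you?"),
    ChatMessage.from_assistant("I'm Fred Astaire."),
]
for message in invert_roles(demo_messages):
    print(message.role, "->", message.content)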

The game

It’s time to choose two characters and play.

We choose the popular dancer Fred Astaire and Corporal Dwayne Hicks from the Alien saga.

from rich import print

first_character_data = dataset.filter(lambda x: x["character_name"] == "Fred Astaire")[0]
second_character_data = dataset.filter(lambda x: x["character_name"] == "Corporal Dwayne Hicks")[0]

first_name = first_character_data["character_name"]
second_name = second_character_data["character_name"]

# remove the scenario: we do not use it
del first_character_data["scenario"]
del second_character_data["scenario"]
MAX_TURNS = 20


first_character_messages = [system_message, start_message]
second_character_messages = [system_message]

turn = 1
print(f"{first_name} πŸ•Ί: {start_message.content}")

while turn < MAX_TURNS:
    second_character_messages=invert_roles(first_character_messages)
    new_message = character_pipeline.run({"template":second_character_messages, "template_variables":{"character_data":second_character_data}})["generator"]["replies"][0]
    second_character_messages.append(new_message)
    print(f"\n\n{second_name} πŸͺ–: {new_message.content}")

    turn += 1
    print("-"*20)

    first_character_messages=invert_roles(second_character_messages)
    new_message = character_pipeline.run({"template":first_character_messages, "template_variables":{"character_data":first_character_data}})["generator"]["replies"][0]
    first_character_messages.append(new_message)
    print(f"\n\n{first_name} πŸ•Ί: {new_message.content}")

    turn += 1
Fred Astaire 🕺: Hello, who are you?

Corporal Dwayne Hicks 🪖: Just a survivor, looking for a way out of this mess. You with me? We gotta get out of 
here, those... things... are all over the place.<|eot_id|>
--------------------

Fred Astaire 🕺: (adjusting his top hat) Ah, my dear fellow, I'm Fred Astaire, a performer of song, dance, and wit.
I'm not quite sure what sort of "mess" you're referring to, but I'm always up for a challenge. However, I do hope 
it involves some dashing rescue, a clever escape, and perhaps a spirited tune or two. Are you prepared to join 
forces and see this predicament through with a bit of style and panache?<|eot_id|>

Corporal Dwayne Hicks 🪖: (skeptical) Hold up, partner. We're in the middle of a firefight with giant killing 
machines here. This ain't no movie musical. We gotta keep our eyes open and our guns hot if we're gonna make it out
alive. I appreciate the bravado, Fred, but let's keep our priorities straight. You wanna help me take out these 
xenomorphs?<|eot_id|>
--------------------

Fred Astaire 🕺: (chuckling) Ah, my dear chap, you're right, of course. I suppose I got a bit carried away with the
romance of the situation. Xenomorphs, you say? Well, I suppose they're a bit more formidable than the usual 
assortment of chorus girls and gangsters I've had the pleasure of tangling with. (pats his pockets, checking for 
his cane) Now, I'm not one for firearms, but I do have a few tricks up my sleeve. That cane of mine may come in 
handy, don't you think? And I've always been rather good at thinking on my feet. Let's see... (taps chin 
thoughtfully) Perhaps we could use a bit of misdirection, a dash of distraction, and a healthy dose of 
old-fashioned showmanship to take out these creatures. What do you say, partner?<|eot_id|>

Corporal Dwayne Hicks 🪖: (impressed) Alright, Fred, you might be more useful than I thought. That cane could come 
in handy for swatting at them, and your... showmanship could help distract them long enough for me to get a clear 
shot. Just remember, we're in this together, and we need to watch each other's backs. And don't even think about 
trying to do any fancy dancing or singing - we need to stay focused. Let's move out, and try to make a plan of 
attack. Stay sharp.<|eot_id|>
--------------------

Fred Astaire 🕺: (grinning) Ah, excellent! I do love a good partnership, and I must say, this is quite the 
adventure we're having! (pats his cane reassuringly) Don't worry, I won't get too carried away with the tap shoes 
just yet. (glances around, taking in the surroundings) Ah, yes... a plan of attack, you say? Well, I think I see an
opening... (spots something) Ah ha! There's a ventilation shaft just ahead, looks like it hasn't been touched yet. 
Why don't we make a run for it and try to lose them in there? We can regroup and come up with a new plan once we're
safely out of sight. What do you say, partner?<|eot_id|>

Corporal Dwayne Hicks 🪖: (nodding) Alright, let's move! Stay close and keep your wits about you. We don't know 
what's on the other side of that shaft. (glances back at Fred) And try not to get too distracted, we need to keep 
our priorities straight. Move, move, move!<|eot_id|>
--------------------

Fred Astaire 🕺: (laughs) Oh, I'm not getting distracted, my dear chap! (jumps into action) I'm just making sure we
make a stylish exit, that's all! (darts towards the ventilation shaft, cane at the ready) Now, shall we make like a
couple of ghosts and disappear into the unknown? (smirks)<|eot_id|>

Corporal Dwayne Hicks 🪖: (chuckles) Alright, alright, Fred! Let's do this! (follows close behind, keeping his eyes
scanning for any signs of danger) Stay sharp, partner!<|eot_id|>
--------------------

Fred Astaire 🕺: (grinning) Ah, sharp as a tack, my good fellow! (climbs up into the ventilation shaft, cane first)
And now, let's see where this adventure takes us! (disappears into the darkness, voice echoing back) Ta-ra, 
partner! Stay close behind!<|eot_id|>

Corporal Dwayne Hicks 🪖: (muttering to himself) Great, just what I need. A showman leading the charge. (climbs up 
into the ventilation shaft, hand on the grip of his shotgun) Alright, let's get moving. And try to stay quiet, we 
don't know what's waiting for us up there. (follows Fred into the darkness, eyes adjusting to the dim 
light)<|eot_id|>
--------------------

Fred Astaire 🕺: (voice echoes back through the ventilation shaft) Ah, don't worry about me being quiet, my dear 
chap! I'm as stealthy as a ghost in a gauze veil! (pauses, listens intently) Ah, do you hear that? It sounds like 
we're not alone up here... (whispers) And I think it's getting closer...<|eot_id|>

Corporal Dwayne Hicks 🪖: (stops in his tracks, listening intently) Shh, I hear it too. (raises his shotgun, ready 
for a fight) What do you see? How many of them are there? (voice is low and steady, focused on the threat 
ahead)<|eot_id|>
--------------------

Fred Astaire 🕺: (whispers back) Ah, my dear chap, I see... (pauses, eyes adjusting to the dark) ...at least three 
of them, I'd say. They're moving in tandem, like they're coordinated. (takes a deep breath) But don't worry, I have
an idea. (pauses, thinking) We need to distract them, keep them busy while we find a way to take them down. 
(produces a small flashlight from his pocket and flicks it on, shining it in a pattern that seems to be beckoning 
the creatures) Ah, watch this, my dear chap! (grins mischievously)<|eot_id|>

Corporal Dwayne Hicks 🪖: (eyes widen in surprise) What the...? Fred, are you crazy?! (points the shotgun at the 
creatures, ready to fire) Get out of the way, they're moving towards us!<|eot_id|>
--------------------

Fred Astaire 🕺: (laughs) Ah, too late for that, my dear chap! (steps back, holding up his cane as a shield) We've 
got to see this through! (eyes shine with excitement) Trust me, I have a plan! (uses his cane to deflect a claw 
swipe, then uses the flashlight to blind the creature momentarily) Ah, gotcha! Now, take your shot, 
partner!<|eot_id|>

Corporal Dwayne Hicks 🪖: (takes aim with the shotgun, fires)<|eot_id|>
--------------------

Fred Astaire 🕺: (ducking behind a nearby ventilation grille) Ah, excellent shot, my dear chap! (peeks out from 
behind the grille, assessing the situation) Looks like we've got two down, one to go... (grins) And it's starting 
to get a bit... (looks around) ...synchronized, don't you think? (winks)<|eot_id|>

Corporal Dwayne Hicks 🪖: (crawls behind the grille, shotgun at the ready) Synchronized? You mean like they're 
planning something? (eyes the last creature warily) Don't think I haven't noticed, Fred. We need to finish this 
fast and get out of here.<|eot_id|>
--------------------

Fred Astaire 🕺: (eyes sparkling with mischief) Ah, yes, precisely, my dear chap! They are, indeed, planning 
something. And I think I have just the thing to disrupt their little dance... (pulls out a harmonica and begins to 
play a jaunty tune)<|eot_id|>

✨ Looks like a nice result.

Of course, you can select other characters (even randomly) and change the initial message.
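
For example, a minimal sketch for picking two distinct random characters instead of the hand-picked pair:

import random

# pick two distinct random characters (sketch)
idx_first, idx_second = random.sample(range(len(dataset)), 2)
first_character_data = dict(dataset[idx_first])
second_character_data = dict(dataset[idx_second])

# remove the scenario: we do not use it
first_character_data.pop("scenario", None)
second_character_data.pop("scenario", None)

first_name = first_character_data["character_name"]
second_name = second_character_data["character_name"]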

The implementation is pretty basic and could be improved in many ways.

📚 Resources

(Notebook by Stefano Fiorucci)