
How do I immediately cancel an asyncio task that is generating an answer with the Ollama Python library?

Tags: python, multithreading, asynchronous, async-await, ollama

I am using Ollama to generate answers from a large language model (LLM) via the Ollama Python API. I want to cancel the response generation by clicking a stop button. The problem is that the cancellation only works once the response has already started printing. If the task is still processing and has not yet begun to print, the cancellation has no effect and the response is printed anyway. More concretely, the function prompt_mistral("Testing") keeps executing and prints its response even after the button is clicked.

My code:

import ollama
import asyncio
import threading
from typing import Optional
import tkinter as tk

# Create the main window
root = tk.Tk()
root.title("Tkinter Button Example")

worker_loop: Optional[asyncio.AbstractEventLoop] = None
task_future: Optional[asyncio.Future] = None

async def get_answer_from_phi3():

    print("Trying")

    messages = [
        {"role": "system", "content": "Hello"}
    ]

    client = ollama.AsyncClient()
    stream = await client.chat(
        model='phi3',
        messages=messages,
        stream=True,
        options={"top_k": 1}
    )

    try:
        async for chunk in stream:
            # Print each token as it arrives
            print(chunk['message']['content'], end='', flush=True)

    except asyncio.CancelledError:
        print("Cancelled")

    except Exception as e:
        print(e)
        return "Sorry, an error occurred while processing your request."


async def prompt_mistral(query):
    messages = []
    messages.append({"role": "assistant", "content": "Write a song that celebrates the beauty, diversity, and importance of our planet, Earth. The song should evoke vivid imagery of the natural world, from lush forests and majestic mountains to serene oceans and vast deserts. It should capture the essence of Earth as a living, breathing entity that sustains all forms of life. Incorporate themes of harmony, unity, and interconnectedness, emphasizing how all elements of nature are intertwined and how humanity is an integral part of this complex web. The lyrics should reflect a sense of wonder and appreciation for the planet's resources and ecosystems, highlighting the delicate balance that sustains life. Include references to various landscapes, climates, and wildlife, painting a picture of Earth's diverse environments. The song should also touch on the responsibility we have to protect and preserve the planet for future generations, addressing issues like climate change, deforestation, pollution, and conservation efforts. Use poetic language and metaphors to convey the grandeur and fragility of Earth, and infuse the song with a hopeful and inspiring tone that encourages listeners to take action in safeguarding our shared home. The melody should be uplifting and emotionally resonant, complementing the powerful message of the lyrics"})
    generated_answer = ''
    try:
        client = ollama.AsyncClient()
        stream = await client.chat(
            model='mistral',
            messages=messages,
            stream=True,
            options={"top_k": 1}
        )

        async for chunk in stream:
            # Accumulate and print the generated answer
            generated_answer += chunk['message']['content']
            print(chunk['message']['content'])

    except asyncio.CancelledError:
        print("Cancelled response")
        return

    except Exception as e:
        print(e)
        return "Sorry, an error occurred while processing your request."

def prompt_llama(message):

    async def prompt():
        messages = [{"role": "assistant", "content": message}]
        try:
            client = ollama.AsyncClient()
            stream = await client.chat(
                model='llama2',
                messages=messages,
                stream=True,
                options={"top_k": 1}
            )

            generated_answer = ''
            async for chunk in stream:
                # Accumulate and print the generated answer
                generated_answer += chunk['message']['content']
                print(chunk['message']['content'])

            if "help" in generated_answer:
                await prompt_mistral("Testing")
            else:
                print(generated_answer)

        except asyncio.CancelledError:
            print("Cancelled")
            return

        except Exception as e:
            print(e)
            return "Sorry, an error occurred while processing your request."

    def mistral_worker_function():
        global worker_loop, task_future
        worker_loop = asyncio.new_event_loop()
        task_future = worker_loop.create_task(prompt())
        worker_loop.run_until_complete(task_future)

    print("Starting thread")
    thread = threading.Thread(target=mistral_worker_function)
    thread.start()

# Define the function to be called when the button is pressed
def on_button_click():
    global worker_loop, task_future
    # The loop and the future are not thread-safe, so marshal the
    # cancellation onto the worker's own loop
    worker_loop.call_soon_threadsafe(
        lambda: task_future.cancel()
    )

    def phi3_worker_function():
        global worker_loop, task_future
        worker_loop = asyncio.new_event_loop()
        task_future = worker_loop.create_task(get_answer_from_phi3())
        worker_loop.run_until_complete(task_future)

    print("Starting thread")
    thread = threading.Thread(target=phi3_worker_function)
    thread.start()

# Create the button
button = tk.Button(root, text="Stop", command=on_button_click)

# Place the button on the window
button.pack(pady=20)

prompt_llama("Hi")

# Start the Tkinter event loop
root.mainloop()

The problem with this code is that the asyncio task is run on a different thread: the stop button lives in the Tkinter main thread, while the Ollama API call runs in its own event loop on a worker thread. To fix this, make sure both run in the same event loop, and invoke the cancellation from that same thread.
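
To see both halves of that in isolation, here is a minimal sketch with no Ollama and no Tkinter (plain asyncio, so every name in it is illustrative): cancellation is delivered at await points, and it takes effect reliably when cancel() is called on the thread that runs the loop.

import asyncio

async def slow_stream():
    try:
        for i in range(10):
            await asyncio.sleep(1)  # cancellation is delivered at await points
            print("chunk", i)
    except asyncio.CancelledError:
        print("cancelled cleanly")
        raise  # let the cancellation propagate to the awaiting caller

async def main():
    task = asyncio.create_task(slow_stream())
    await asyncio.sleep(2.5)  # let a couple of chunks through
    task.cancel()             # same thread, same loop: effective at the next await
    try:
        await task
    except asyncio.CancelledError:
        print("task cancelled:", task.cancelled())

asyncio.run(main())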

Here is how to modify the code:

import ollama
import asyncio
import tkinter as tk
from typing import Optional

# Create the main window
root = tk.Tk()
root.title("Tkinter Button Example")

task_future: Optional[asyncio.Future] = None

async def get_answer_from_phi3():
    print("Trying")

    messages = [
        {"role": "system", "content": "Hello"}
    ]

    client = ollama.AsyncClient()
    stream = await client.chat(
        model='phi3',
        messages=messages,
        stream=True,
        options={"top_k": 1}
    )

    try:
        async for chunk in stream:
            # Store generated answer
            print(chunk['message']['content'], end='', flush=True)
    except asyncio.CancelledError:
        print("Cancelled")
    except Exception as e:
        print(e)
        return "Sorry, an error occurred while processing your request."

async def prompt_mistral(query):
    messages = [{"role": "assistant", "content": "Write a song that celebrates the beauty, diversity, and importance of our planet, Earth..."}]
    generated_answer = ''
    try:
        client = ollama.AsyncClient()
        stream = await client.chat(
            model='mistral',
            messages=messages,
            stream=True,
            options={"top_k": 1}
        )

        async for chunk in stream:
            # Store generated answer
            generated_answer += chunk['message']['content']
            print(chunk['message']['content'])
    except asyncio.CancelledError:
        print("Cancelled response")
        return
    except Exception as e:
        print(e)
        return "Sorry, an error occurred while processing your request."

async def prompt_llama(message):
    messages = [{"role": "assistant", "content": message}]
    try:
        client = ollama.AsyncClient()
        stream = await client.chat(
            model='llama2',
            messages=messages,
            stream=True,
            options={"top_k": 1}
        )

        generated_answer = ''
        async for chunk in stream:
            # Store generated answer
            generated_answer += chunk['message']['content']
            print(chunk['message']['content'])

        if "help" in generated_answer:
            await prompt_mistral("Testing")
        else:
            print(generated_answer)
    except asyncio.CancelledError:
        print("Cancelled")
        return
    except Exception as e:
        print(e)
        return "Sorry, an error occurred while processing your request."

def on_button_click():
    global task_future
    if task_future is not None and not task_future.done():
        task_future.cancel()

# Create the button
button = tk.Button(root, text="Stop", command=on_button_click)
button.pack(pady=20)

async def run_app():
    global task_future
    task_future = asyncio.create_task(prompt_llama("Hi"))
    await task_future

# Tkinter and asyncio each have a blocking event loop; step the asyncio
# loop from inside the Tkinter loop so both share the main thread
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.create_task(run_app())

def poll_asyncio():
    loop.call_soon(loop.stop)  # run the callbacks that are ready, then return
    loop.run_forever()
    root.after(10, poll_asyncio)

root.after(10, poll_asyncio)
# Start the Tkinter event loop
root.mainloop()

In this modified code, we create an async function, run_app, which starts our main task, prompt_llama. Because Tkinter's mainloop and asyncio's event loop would each block the other if run naively, we drive the asyncio loop in short steps from inside the Tkinter loop using root.after. Everything, including the button-click handler and the Ollama API call, now runs on the same thread and the same event loop, so pressing the stop button cancels the Ollama API call immediately, regardless of how far generation has progressed.

In addition, the unnecessary threads were removed and the code was simplified for readability.
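
If you would rather keep the worker-thread design from the question, the same rule applies: the cancel call must be marshalled onto the worker's loop with call_soon_threadsafe, and the button handler must not start a second task. Here is a minimal sketch under those assumptions (start_worker and stop_worker are illustrative names, reusing the question's worker_loop and task_future globals):

import asyncio
import threading
from typing import Optional

worker_loop: Optional[asyncio.AbstractEventLoop] = None
task_future: Optional[asyncio.Future] = None

def start_worker(coro_factory):
    # Run coro_factory() as a task on a fresh loop in a daemon thread
    def run():
        global worker_loop, task_future
        worker_loop = asyncio.new_event_loop()
        asyncio.set_event_loop(worker_loop)
        task_future = worker_loop.create_task(coro_factory())
        try:
            worker_loop.run_until_complete(task_future)
        except asyncio.CancelledError:
            pass  # re-raised here when the task is cancelled
        finally:
            worker_loop.close()
    threading.Thread(target=run, daemon=True).start()

def stop_worker():
    # Task.cancel() is not thread-safe; hand it to the worker's loop
    if worker_loop is not None and task_future is not None:
        worker_loop.call_soon_threadsafe(task_future.cancel)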

From: 78767823
