# OpenAI Adapter
A lot of people get started with OpenAI but want to explore other models. LangChain's integrations with many model providers make this easy. While LangChain has its own message and model APIs, we've made it as easy as possible to explore other models by exposing an adapter that adapts LangChain models to the OpenAI API.

At the moment this only deals with output and does not return other information (token counts, stop reasons, etc.).
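To make the idea concrete, here is a minimal sketch (not LangChain internals — all names below are illustrative) of what such an adapter does: it calls some underlying chat model and re-packages the reply into the dict layout that `openai.ChatCompletion.create` returns, so downstream code sees the familiar shape.

```python
# Illustrative sketch of an OpenAI-shaped adapter. `fake_model` stands in for
# any chat model; only the response *shape* matters here.
def to_openai_response(role: str, content: str) -> dict:
    """Wrap a model reply in the OpenAI chat-completion response layout."""
    return {"choices": [{"message": {"role": role, "content": content}}]}


def fake_model(messages: list) -> str:
    """Stand-in for an underlying chat model call."""
    return "Hello! How can I assist you today?"


def chat_completion_create(messages: list, model_fn=fake_model) -> dict:
    """Adapter: call the model, return an OpenAI-shaped response dict."""
    return to_openai_response("assistant", model_fn(messages))
```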
```python
import openai
from langchain.adapters import openai as lc_openai
```
API Reference:
- `openai` from `langchain.adapters`
## ChatCompletion.create
```python
messages = [{"role": "user", "content": "hi"}]
```
### Original OpenAI call
```python
result = openai.ChatCompletion.create(
    messages=messages,
    model="gpt-3.5-turbo",
    temperature=0,
)
result["choices"][0]["message"].to_dict_recursive()
```

```
{'role': 'assistant', 'content': 'Hello! How can I assist you today?'}
```
### LangChain OpenAI wrapper call
```python
lc_result = lc_openai.ChatCompletion.create(
    messages=messages,
    model="gpt-3.5-turbo",
    temperature=0,
)
lc_result["choices"][0]["message"]
```

```
{'role': 'assistant', 'content': 'Hello! How can I assist you today?'}
```
### Swapping out model providers
```python
lc_result = lc_openai.ChatCompletion.create(
    messages=messages,
    model="claude-2",
    temperature=0,
    provider="ChatAnthropic",
)
lc_result["choices"][0]["message"]
```

```
{'role': 'assistant', 'content': ' Hello!'}
```
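Because the adapter preserves the OpenAI response shape, code written against the OpenAI API does not need to change when the provider is swapped. A small sketch, using the response dicts shown above:

```python
def extract_reply(result: dict) -> str:
    """Pull the assistant's text out of an OpenAI-shaped response dict."""
    return result["choices"][0]["message"]["content"]


# The same accessor works whether the response came from openai or from the
# LangChain adapter, since both use the same layout.
openai_style = {"choices": [{"message": {"role": "assistant", "content": " Hello!"}}]}
```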
## ChatCompletion.stream
### Original OpenAI call
```python
for c in openai.ChatCompletion.create(
    messages=messages,
    model="gpt-3.5-turbo",
    temperature=0,
    stream=True,
):
    print(c["choices"][0]["delta"].to_dict_recursive())
```

```
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': '!'}
{'content': ' How'}
{'content': ' can'}
{'content': ' I'}
{'content': ' assist'}
{'content': ' you'}
{'content': ' today'}
{'content': '?'}
{}
```
### LangChain OpenAI wrapper call
```python
for c in lc_openai.ChatCompletion.create(
    messages=messages,
    model="gpt-3.5-turbo",
    temperature=0,
    stream=True,
):
    print(c["choices"][0]["delta"])
```

```
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': '!'}
{'content': ' How'}
{'content': ' can'}
{'content': ' I'}
{'content': ' assist'}
{'content': ' you'}
{'content': ' today'}
{'content': '?'}
{}
```
### Swapping out model providers
```python
for c in lc_openai.ChatCompletion.create(
    messages=messages,
    model="claude-2",
    temperature=0,
    stream=True,
    provider="ChatAnthropic",
):
    print(c["choices"][0]["delta"])
```

```
{'role': 'assistant', 'content': ' Hello'}
{'content': '!'}
{}
```
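The streamed chunks printed above can be stitched back into a complete message with a small accumulator. This is plain Python operating on the delta dicts shown in the examples, not a LangChain helper:

```python
def accumulate_deltas(deltas) -> dict:
    """Merge OpenAI-style streaming delta dicts into one message dict."""
    message = {"role": None, "content": ""}
    for delta in deltas:
        if "role" in delta:
            message["role"] = delta["role"]
        if "content" in delta:
            # Each delta carries a fragment of the reply; concatenate in order.
            message["content"] += delta["content"]
    return message


# The deltas printed in the Anthropic streaming example above.
deltas = [{"role": "assistant", "content": " Hello"}, {"content": "!"}, {}]
```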