AzureML Online Endpoint
AzureML is a platform used to build, train, and deploy machine learning models. Users can explore the types of models to deploy in the Model Catalog, which provides Azure Foundation Models and OpenAI models. Azure Foundation Models include various open-source models and popular Hugging Face models. Users can also import their favorite models into AzureML.
This notebook goes over how to use an LLM hosted on an AzureML online endpoint.
from langchain.llms.azureml_endpoint import AzureMLOnlineEndpoint
Set Up
To use the wrapper, you must deploy a model on AzureML and obtain the following parameters:

endpoint_api_key: Required - The API key provided by the endpoint
endpoint_url: Required - The REST endpoint URL provided by the endpoint
deployment_name: Not required - The deployment name of the model using the endpoint
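The examples below read these values from environment variables; a minimal sketch of that wiring (the variable names here are illustrative, the later cells use BART_* and DOLLY_* equivalents, so match whatever you export):

```python
import os

# Illustrative environment variable names; use whatever names you chose
# when you stored your endpoint's credentials.
endpoint_api_key = os.getenv("AZUREML_ENDPOINT_API_KEY", "")
endpoint_url = os.getenv("AZUREML_ENDPOINT_URL", "")
deployment_name = os.getenv("AZUREML_DEPLOYMENT_NAME")  # optional, may be None
```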
Content Formatter
The content_formatter parameter is a handler class that transforms the request and response of an AzureML endpoint to match the required schema. Since there is a wide range of models in the Model Catalog, each of which may process data differently, a ContentFormatterBase class is provided so that users can transform data to their liking. The following content formatters are provided:

GPT2ContentFormatter: Formats request and response data for GPT2
DollyContentFormatter: Formats request and response data for Dolly-v2
HFContentFormatter: Formats request and response data for text-generation Hugging Face models
LLamaContentFormatter: Formats request and response data for LLaMa2

Note: OSSContentFormatter is being deprecated and replaced with GPT2ContentFormatter. The logic is the same, but GPT2ContentFormatter is a more suitable name. You can still continue using OSSContentFormatter, as the changes are backwards compatible.
Below is an example using a summarization model from Hugging Face.
Custom Content Formatter
from typing import Dict

from langchain.llms.azureml_endpoint import AzureMLOnlineEndpoint, ContentFormatterBase
import os
import json


class CustomFormatter(ContentFormatterBase):
    content_type = "application/json"
    accepts = "application/json"

    def format_request_payload(self, prompt: str, model_kwargs: Dict) -> bytes:
        input_str = json.dumps(
            {
                "inputs": [prompt],
                "parameters": model_kwargs,
                "options": {"use_cache": False, "wait_for_model": True},
            }
        )
        return str.encode(input_str)

    def format_response_payload(self, output: bytes) -> str:
        response_json = json.loads(output)
        return response_json[0]["summary_text"]
content_formatter = CustomFormatter()

llm = AzureMLOnlineEndpoint(
    endpoint_api_key=os.getenv("BART_ENDPOINT_API_KEY"),
    endpoint_url=os.getenv("BART_ENDPOINT_URL"),
    model_kwargs={"temperature": 0.8, "max_new_tokens": 400},
    content_formatter=content_formatter,
)
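You can exercise the formatter's two methods on sample data without hitting the endpoint. The sketch below mirrors CustomFormatter's logic as plain functions (the fake response assumes the BART-style summarization schema, a JSON list of {"summary_text": ...} objects):

```python
import json


def format_request_payload(prompt, model_kwargs):
    # Mirrors CustomFormatter.format_request_payload above.
    return json.dumps(
        {
            "inputs": [prompt],
            "parameters": model_kwargs,
            "options": {"use_cache": False, "wait_for_model": True},
        }
    ).encode("utf-8")


def format_response_payload(output):
    # Mirrors CustomFormatter.format_response_payload: the summarization
    # endpoint is assumed to return a JSON list of {"summary_text": ...}.
    return json.loads(output)[0]["summary_text"]


payload = format_request_payload("Summarize this.", {"max_new_tokens": 50})
fake_response = b'[{"summary_text": "A short summary."}]'
print(format_response_payload(fake_response))  # A short summary.
```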
large_text = """On January 7, 2020, Blockberry Creative announced that HaSeul would not participate in the promotion for Loona's
next album because of mental health concerns. She was said to be diagnosed with "intermittent anxiety symptoms" and would be
taking time to focus on her health.[39] On February 5, 2020, Loona released their second EP titled [#] (read as hash), along
with the title track "So What".[40] Although HaSeul did not appear in the title track, her vocals are featured on three other
songs on the album, including "365". Once peaked at number 1 on the daily Gaon Retail Album Chart,[41] the EP then debuted at
number 2 on the weekly Gaon Album Chart. On March 12, 2020, Loona won their first music show trophy with "So What" on Mnet's
M Countdown.[42]
On October 19, 2020, Loona released their third EP titled [12:00] (read as midnight),[43] accompanied by its first single
"Why Not?". HaSeul was again not involved in the album, out of her own decision to focus on the recovery of her health.[44]
The EP then became their first album to enter the Billboard 200, debuting at number 112.[45] On November 18, Loona released
the music video for "Star", another song on [12:00].[46] Peaking at number 40, "Star" is Loona's first entry on the Billboard
Mainstream Top 40, making them the second K-pop girl group to enter the chart.[47]
On June 1, 2021, Loona announced that they would be having a comeback on June 28, with their fourth EP, [&] (read as and).
[48] The following day, on June 2, a teaser was posted to Loona's official social media accounts showing twelve sets of eyes,
confirming the return of member HaSeul who had been on hiatus since early 2020.[49] On June 12, group members YeoJin, Kim Lip,
Choerry, and Go Won released the song "Yum-Yum" as a collaboration with Cocomong.[50] On September 8, they released another
collaboration song named "Yummy-Yummy".[51] On June 27, 2021, Loona announced at the end of their special clip that they are
making their Japanese debut on September 15 under Universal Music Japan sublabel EMI Records.[52] On August 27, it was announced
that Loona will release the double A-side single, "Hula Hoop / Star Seed" on September 15, with a physical CD release on October
20.[53] In December, Chuu filed an injunction to suspend her exclusive contract with Blockberry Creative.[54][55]
"""
summarized_text = llm(large_text)
print(summarized_text)
HaSeul won her first music show trophy on Mnet's M Countdown; she did not participate in the promotion for Loona's next album because of her mental health concerns. On February 5, 2020, Loona released their second EP titled [#] (read as hash), along with the title track "So What". Although HaSeul did not appear in the title track, her vocals are featured on three other songs on the album, including "365". The EP peaked at number 1 on the daily Gaon Retail Album Chart, then debuted at number 2 on the weekly Gaon Album Chart. On March 12, 2020, Loona won their first music show trophy with "So What" on Mnet's M Countdown.
On October 19, 2020, Loona released their third EP titled [12:00] (read as midnight), accompanied by its first single "Why Not?". HaSeul was again not involved in the album, out of her own decision to focus on the recovery of her health. The EP became their first album to enter the Billboard 200, debuting at number 112. On November 18, Loona released the music video for "Star", another song on [12:00]. Peaking at number 40, "Star" made them the second K-pop girl group to enter the Billboard Mainstream Top 40.
On June 1, 2021, Loona announced that they would be having a comeback on June 28 with their fourth EP, [&] (read as and). The following day, on June 2, a teaser was posted to Loona's official social media accounts showing twelve sets of eyes, confirming the return of member HaSeul, who had been on hiatus since early 2020. On June 12, group members YeoJin, Kim Lip, Choerry, and Go Won released the song "Yum-Yum" as a collaboration with Cocomong. On September 8, they released another collaboration song named "Yummy-Yummy". On June 27, 2021, Loona announced at the end of their special clip that they would be making their Japanese debut on September 15 under Universal Music Japan sublabel EMI Records. On August 27, it was announced that Loona would release the double A-side single "Hula Hoop / Star Seed" on September 15, with a physical CD release on October 20. In December, Chuu filed an injunction to suspend her exclusive contract with Blockberry Creative.
Dolly with LLMChain
from langchain import PromptTemplate
from langchain.llms.azureml_endpoint import DollyContentFormatter
from langchain.chains import LLMChain

formatter_template = "Write a {word_count} word essay about {topic}."

prompt = PromptTemplate(
    input_variables=["word_count", "topic"], template=formatter_template
)
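The PromptTemplate performs plain Python format-style substitution, so you can preview exactly what the model will receive. A standalone sketch using str.format, which mirrors what PromptTemplate.format does with these two variables:

```python
formatter_template = "Write a {word_count} word essay about {topic}."

# str.format performs the same substitution as PromptTemplate.format here:
prompt_text = formatter_template.format(word_count=100, topic="how to make friends")
print(prompt_text)  # Write a 100 word essay about how to make friends.
```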
content_formatter = DollyContentFormatter()

llm = AzureMLOnlineEndpoint(
    endpoint_api_key=os.getenv("DOLLY_ENDPOINT_API_KEY"),
    endpoint_url=os.getenv("DOLLY_ENDPOINT_URL"),
    model_kwargs={"temperature": 0.8, "max_tokens": 300},
    content_formatter=content_formatter,
)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run({"word_count": 100, "topic": "how to make friends"}))
Many people are willing to talk about themselves; the problem is that others seem reserved. Try to understand where other people are coming from. Like-minded people can build a tribe together.
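In essence, the chain above just formats the prompt and forwards the result to the LLM. A minimal standalone sketch of that flow (fake_llm is a stand-in for the AzureML endpoint call, not part of LangChain):

```python
def fake_llm(prompt: str) -> str:
    # Stand-in for the deployed endpoint: echoes what it was asked.
    return f"(completion for: {prompt})"


template = "Write a {word_count} word essay about {topic}."


def run_chain(inputs: dict) -> str:
    # What LLMChain.run does conceptually: format the prompt, call the LLM.
    return fake_llm(template.format(**inputs))


result = run_chain({"word_count": 100, "topic": "how to make friends"})
print(result)
```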
Serializing an LLM
You can also save and load LLM configurations.
from langchain.llms.loading import load_llm
from langchain.llms.azureml_endpoint import AzureMLEndpointClient
save_llm = AzureMLOnlineEndpoint(
    deployment_name="databricks-dolly-v2-12b-4",
    model_kwargs={
        "temperature": 0.2,
        "max_tokens": 150,
        "top_p": 0.8,
        "frequency_penalty": 0.32,
        "presence_penalty": 72e-3,
    },
)
save_llm.save("azureml.json")
loaded_llm = load_llm("azureml.json")
print(loaded_llm)
AzureMLOnlineEndpoint
Params: {'deployment_name': 'databricks-dolly-v2-12b-4', 'model_kwargs': {'temperature': 0.2, 'max_tokens': 150, 'top_p': 0.8, 'frequency_penalty': 0.32, 'presence_penalty': 0.072}}