# Pipelines
When you run `import ruprompts`, two custom pipelines are added to `transformers`:

- `text-generation-with-prompt`
- `text2text-generation-with-prompt`

These pipelines are then accessible with the standard syntax:

```python
transformers.pipeline('text-generation-with-prompt', ...)
```

Read more about pipelines in the HF docs.
## class ruprompts.pipelines.TextGenerationWithPromptPipeline

Adds the trained prompt as a prefix before passing text to `TextGenerationPipeline`.

Alias: `transformers.pipeline('text-generation-with-prompt', ...)`.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` or `Prompt` | Prompt used to format input entries. If a string is given, the prompt is loaded with `Prompt.from_pretrained`. | required |
| `**kwargs` | | Arguments for `TextGenerationPipeline`. | required |
Examples:
>>> from ruprompts import TextGenerationWithPromptPipeline, Prompt
>>> from transformers import AutoModelForCausalLM
>>> prompt = Prompt.from_pretrained(...)
>>> model = AutoModelForCausalLM.from_pretrained(...)
>>> ppln = TextGenerationWithPromptPipeline(prompt=prompt, model=model)
>>> from transformers import pipeline
>>> ppln = pipeline('text-generation-with-prompt', prompt=prompt, model=model)
>>> ppln = pipeline('text-generation-with-prompt', prompt=prompt)
>>> ppln = pipeline('text-generation-with-prompt', prompt='konodyuk/prompt_rugpt3large_joke')
>>> a = ppln(text="Заходят в бар")
>>> b = ppln("Заходят в бар")
>>> assert a == b
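The core behavior described above, prepending the trained prompt to the user text, can be sketched in plain Python. The `<P*N>` placeholder notation mirrors ruprompts' `PromptFormat` templates, but the helper functions below are illustrative, not library API:

```python
import re

def expand_placeholders(template: str) -> str:
    # Expand "<P*3>" into "<P><P><P>" so each trainable prompt token
    # occupies one explicit slot in the formatted string.
    return re.sub(r"<P\*(\d+)>", lambda m: "<P>" * int(m.group(1)), template)

def format_with_prompt(template: str, text: str) -> str:
    # Insert the user text into the expanded prompt template; the result
    # is what would be fed to the underlying text-generation pipeline.
    return expand_placeholders(template).format(text=text)

print(format_with_prompt("<P*3>{text}", "Заходят в бар"))
# → <P><P><P>Заходят в бар
```

In the real pipeline the `<P>` slots are replaced by trained embeddings rather than literal tokens; this sketch only shows where the prefix lands relative to the input text.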
## class ruprompts.pipelines.Text2TextGenerationWithPromptPipeline

Formats text with the given prompt before passing it to `TextGenerationPipeline`.

Alias: `transformers.pipeline('text2text-generation-with-prompt', ...)`.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` or `Prompt` | Prompt used to format input entries. If a string is given, the prompt is loaded with `Prompt.from_pretrained`. | required |
| `**kwargs` | | Arguments for `TextGenerationPipeline`. | required |
Examples:
>>> from ruprompts import Text2TextGenerationWithPromptPipeline, Prompt
>>> from transformers import AutoModelForCausalLM
>>> prompt = Prompt.from_pretrained(...)
>>> model = AutoModelForCausalLM.from_pretrained(...)
>>> ppln = Text2TextGenerationWithPromptPipeline(prompt=prompt, model=model)
>>> from transformers import pipeline
>>> ppln = pipeline('text2text-generation-with-prompt', prompt=prompt, model=model)
>>> ppln = pipeline('text2text-generation-with-prompt', prompt=prompt)
>>> ppln = pipeline('text2text-generation-with-prompt', prompt='konodyuk/prompt_rugpt3large_qa_sberquad')
>>> ppln(context="Трава зеленая.", question="Какая трава?")
>>> ppln = pipeline('text2text-generation-with-prompt', prompt='konodyuk/prompt_rugpt3large_detox_russe')
>>> a = ppln(text="Отвали, дурак")
>>> b = ppln("Отвали, дурак")
>>> assert a == b
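Unlike the single-`text` case, the QA example above passes several named fields (`context`, `question`). A minimal sketch of how a template with multiple named slots could be filled, again using the `<P*N>` notation of ruprompts' `PromptFormat` but with a hypothetical helper rather than the library's own code:

```python
import re

def expand_placeholders(template: str) -> str:
    # "<P*N>" -> N trainable-token slots, mirroring ruprompts' notation.
    return re.sub(r"<P\*(\d+)>", lambda m: "<P>" * int(m.group(1)), template)

def format_fields(template: str, **fields: str) -> str:
    # Fill every named slot ({context}, {question}, ...) in the template,
    # producing the string handed to the generation backend.
    return expand_placeholders(template).format(**fields)

print(format_fields(
    "<P*2>{context}<P*1>{question}",
    context="Трава зеленая.",
    question="Какая трава?",
))
# → <P><P>Трава зеленая.<P>Какая трава?
```

This is why such pipelines accept arbitrary keyword arguments: each keyword corresponds to a named slot in the prompt the model was trained with.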