Pipelines

Running import ruprompts registers two custom pipelines with transformers:

  • text-generation-with-prompt
  • text2text-generation-with-prompt

These pipelines are then accessible with standard syntax:

transformers.pipeline('text-generation-with-prompt', ...)

Read more about pipelines in the HF docs.
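
Under the hood this amounts to registering the two extra task names with transformers. The sketch below only illustrates the idea, using the PIPELINE_REGISTRY API available in recent transformers versions; the actual registration code in ruprompts may differ.

from transformers import AutoModelForCausalLM
from transformers.pipelines import PIPELINE_REGISTRY

from ruprompts import TextGenerationWithPromptPipeline

# Illustration only: make the custom task name resolvable by transformers.pipeline()
PIPELINE_REGISTRY.register_pipeline(
    "text-generation-with-prompt",
    pipeline_class=TextGenerationWithPromptPipeline,
    pt_model=AutoModelForCausalLM,
)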

class ruprompts.pipelines.TextGenerationWithPromptPipeline

Adds the trained prompt as a prefix before passing the text to TextGenerationPipeline.

Alias: transformers.pipeline('text-generation-with-prompt', ...).

Parameters:

  • prompt (Prompt or str, required): prompt used to format input entries. If a string is given, the prompt is loaded with Prompt.from_pretrained.
  • **kwargs (required): arguments passed to transformers.Pipeline.

Examples:

>>> # Explicit construction from a pretrained prompt and model
>>> from ruprompts import TextGenerationWithPromptPipeline, Prompt
>>> from transformers import AutoModelForCausalLM, pipeline
>>> prompt = Prompt.from_pretrained(...)
>>> model = AutoModelForCausalLM.from_pretrained(...)
>>> ppln = TextGenerationWithPromptPipeline(prompt=prompt, model=model)

>>> # Construction through transformers.pipeline
>>> ppln = pipeline('text-generation-with-prompt', prompt=prompt, model=model)
>>> ppln = pipeline('text-generation-with-prompt', prompt=prompt)
>>> ppln = pipeline('text-generation-with-prompt', prompt='konodyuk/prompt_rugpt3large_joke')

>>> # Positional and keyword inputs are equivalent
>>> a = ppln(text="Заходят в бар")
>>> b = ppln("Заходят в бар")
>>> assert a == b
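
Conceptually, the pipeline prepends the trained prompt tokens to the input and then runs ordinary text generation. The snippet below is a rough sketch of that behaviour rather than the actual implementation; PROMPT_PREFIX is a hypothetical stand-in for the prompt's trained special-token string.

>>> # Illustration only: roughly equivalent behaviour with a plain text-generation pipeline
>>> PROMPT_PREFIX = "<P_0><P_1><P_2>"  # hypothetical placeholder for the trained prompt tokens
>>> plain = pipeline('text-generation', model=..., tokenizer=...)
>>> plain("Заходят в бар", prefix=PROMPT_PREFIX)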

class ruprompts.pipelines.Text2TextGenerationWithPromptPipeline

Formats text with the given prompt before passing it to TextGenerationPipeline.

Alias: transformers.pipeline('text2text-generation-with-prompt', ...).

Parameters:

  • prompt (Prompt or str, required): prompt used to format input entries. If a string is given, the prompt is loaded with Prompt.from_pretrained.
  • **kwargs (required): arguments passed to transformers.Pipeline.

Examples:

>>> # Explicit construction from a pretrained prompt and model
>>> from ruprompts import Text2TextGenerationWithPromptPipeline, Prompt
>>> from transformers import AutoModelForCausalLM, pipeline
>>> prompt = Prompt.from_pretrained(...)
>>> model = AutoModelForCausalLM.from_pretrained(...)
>>> ppln = Text2TextGenerationWithPromptPipeline(prompt=prompt, model=model)

>>> # Construction through transformers.pipeline
>>> ppln = pipeline('text2text-generation-with-prompt', prompt=prompt, model=model)
>>> ppln = pipeline('text2text-generation-with-prompt', prompt=prompt)

>>> # Prompts with several fields take keyword arguments
>>> ppln = pipeline('text2text-generation-with-prompt', prompt='konodyuk/prompt_rugpt3large_qa_sberquad')
>>> ppln(context="Трава зеленая.", question="Какая трава?")

>>> # Single-field prompts accept positional or keyword input
>>> ppln = pipeline('text2text-generation-with-prompt', prompt='konodyuk/prompt_rugpt3large_detox_russe')
>>> a = ppln(text="Отвали, дурак")
>>> b = ppln("Отвали, дурак")
>>> assert a == b
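
The keyword arguments correspond to the named fields of the prompt's template. As a rough illustration (the real template of the pretrained prompt may differ), the formatting step behaves like filling those fields into a string that also carries the trained prompt tokens:

>>> # Illustration only: a hypothetical two-field template
>>> template = "<P*20>{context}<P*20>{question}<P*10>"
>>> template.format(context="Трава зеленая.", question="Какая трава?")
'<P*20>Трава зеленая.<P*20>Какая трава?<P*10>'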