maxframe.learn.contrib.llm.text.generate#
- maxframe.learn.contrib.llm.text.generate(data, model: TextLLM, prompt_template: List[Dict[str, Any]], params: Dict[str, Any] = None)[source]#
Generate text using a text language model based on given data and prompt template.
- Parameters:
data (DataFrame or Series) – Input data used for generation. Can be a MaxFrame DataFrame or Series containing the text to be processed.
model (TextLLM) – Language model instance used for text generation.
prompt_template (List[Dict[str, Any]]) –
List of message dictionaries forming the conversation template. Use
{col_name}
as a placeholder to reference column data from the input data. Usually in the format [{"role": "user", "content": "{query}"}], matching the OpenAI API chat schema.
params (Dict[str, Any], optional) – Additional parameters for generation configuration, by default None. Can include settings such as temperature, max_tokens, etc.
- Returns:
Generated text raw response and success status. If success is False, the generated text contains the error message instead.
- Return type:
Examples
>>> from maxframe.learn.contrib.llm.models.managed import ManagedTextLLM
>>> from maxframe.learn.contrib.llm.text import generate
>>> import maxframe.dataframe as md
>>>
>>> # Initialize the model
>>> llm = ManagedTextLLM(name="Qwen2.5-0.5B-instruct")
>>>
>>> # Prepare prompt template
>>> messages = [
...     {
...         "role": "user",
...         "content": "Help answer following question: {query}",
...     },
... ]
>>>
>>> # Create sample data
>>> df = md.DataFrame({"query": ["What is machine learning?"]})
>>>
>>> # Generate response
>>> result = generate(df, llm, prompt_template=messages)
>>> result.execute()
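To make the {col_name} placeholder behavior concrete, here is a minimal, maxframe-free sketch of the substitution the template performs: for each row of the input data, every {col_name} in a message's content is replaced with that row's column value. The helper `render_messages` below is hypothetical, for illustration only; the actual substitution happens inside generate when the job executes.

```python
# Sketch of the {col_name} placeholder substitution described above.
# Illustration only: in maxframe, generate() renders the template
# against each row of the input DataFrame/Series on the service side.

def render_messages(template, row):
    """Fill {col_name} placeholders in each message's content from a row dict."""
    return [
        {**msg, "content": msg["content"].format(**row)}
        for msg in template
    ]

messages = [
    {"role": "user", "content": "Help answer following question: {query}"},
]
row = {"query": "What is machine learning?"}

rendered = render_messages(messages, row)
print(rendered[0]["content"])
# -> Help answer following question: What is machine learning?
```

Note that the role field is passed through unchanged; only content fields are formatted, which is why column names must match the placeholders used in the template.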