AI.Accumulator (fnord v0.4.34)

When a file or other input is too large for the model's context window, this module can be used to process the input in chunks. It automatically modifies the supplied agent prompt to include instructions for accumulating a response across multiple chunks, with chunk size governed by the max_tokens parameter supplied to the get_response function.
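The sketch below is a hypothetical illustration of how a caller might use this, not the documented API. Only the get_response(ai, opts \\ []) signature and the role of max_tokens are taken from this page; the option names (:prompt, :input), the ai value, and the shape of the return tuple are assumptions.

```elixir
# Assumes `ai` is a previously constructed AI client struct for this library.
file_contents = File.read!("lib/some/large_module.ex")

# Option names below are illustrative; consult the function docs for the
# actual keys accepted by get_response/2.
{:ok, response} =
  AI.Accumulator.get_response(ai,
    # Agent prompt; the module appends its own chunk-accumulation
    # instructions to whatever prompt is supplied here.
    prompt: "Summarize this module's public API.",
    # The oversized input that will be split into chunks.
    input: file_contents,
    # Token budget used to size each chunk for the model's context window.
    max_tokens: 8_000
  )
```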

Summary

Functions

get_response(ai, opts \\ [])