I’m interested in automatically generating lengthy, coherent stories of 10,000+ words from a single prompt using an open-source local large language model (LLM) on low-spec hardware: a laptop with no GPU, an i5-8250U, and 16 GB of DDR4-2400 RAM. I came across the “Awesome-Story-Generation” repository, which lists papers describing promising methods such as “Re3: Generating Longer Stories With Recursive Reprompting and Revision”, announced in this Twitter thread from October 2022, and “DOC: Improving Long Story Coherence With Detailed Outline Control”, announced in this Twitter thread from December 2022. However, these papers used GPT-3, and I was hoping to find similar techniques implemented with open-source tools that I could run locally. If anyone has experience or knows of resources that could help me achieve long, coherent story generation with an open-source LLM on low-spec hardware, I would greatly appreciate any advice or guidance.
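For anyone skimming: the core idea in both Re3 and DOC is an outline-then-expand loop with a rolling summary fed back into each prompt. Below is a minimal sketch of that control flow; the model call is stubbed out so it runs without a backend, and every function name here is illustrative, not taken from the papers’ released code.

```python
def generate(prompt: str) -> str:
    # Stub standing in for a local-model call; swap in a real
    # backend (e.g. a llama.cpp binding) to produce actual prose.
    return f"[model output for: {prompt[:40]}...]"

def write_story(premise: str, n_chapters: int = 3) -> str:
    # 1. Ask the model for a chapter-level outline. Parsing is
    #    stubbed: a real implementation would split the model's
    #    outline text into individual story beats.
    _outline_text = generate(f"Outline {n_chapters} chapters for: {premise}")
    beats = [f"beat {i + 1}" for i in range(n_chapters)]

    # 2. Expand each beat, feeding back a rolling summary so the
    #    model's limited context window still "sees" the story so far.
    summary, chapters = premise, []
    for beat in beats:
        prose = generate(f"Story so far: {summary} | Continue with: {beat}")
        chapters.append(prose)
        # 3. Re-summarize after each chapter to keep the next prompt
        #    short -- this is the "recursive reprompting" part.
        summary = generate(f"Summarize: {summary} {prose}")
    return "\n\n".join(chapters)
```

The point is that no single prompt ever has to hold 10,000 words: each generation step only sees one beat plus a compressed summary, which is what makes this feasible on a small context window.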
Have a peruse of this article; it covers various options for running LLMs locally.
You can get a really cool, coherent story of any length you want by writing one or hiring a writer.
GPT4All
You aren’t going to get a response that long; that’s just a limitation of LLMs. Even if you do manage to get something that long, it won’t make sense, since the model can’t hold enough context as it generates.
Try out some of the options listed in this comment https://ttrpg.network/comment/6729305