I’ve recently been writing fiction and using an AI as a critic/editor to help me tighten things up (as I’m not a particularly skilled prose writer myself). So far I’ve tried two approaches: writing in a basic text editor and then either uploading the files to a hosted LLM or copy-pasting into a local one, or using PyCharm with its AI integration plugins.

Neither is particularly satisfactory, and I’m wondering if anyone knows of a good setup for this (preferably open source, but that’s not necessary). Integration with at least one of Ollama or OpenRouter would be needed.
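For what it’s worth, the copy-paste-into-a-local-model step can be scripted against Ollama’s local REST API. This is just a minimal sketch assuming a stock Ollama install on its default port; the model name, file name, and the system prompt wording are my own placeholders, not anything from a particular tool.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_critique_request(model: str, excerpt: str) -> dict:
    """Build an Ollama /api/chat payload asking the model to critique an excerpt."""
    return {
        "model": model,
        "stream": False,  # return one complete response instead of a token stream
        "messages": [
            {
                "role": "system",
                "content": "You are a fiction editor. Point out weak prose, "
                           "repetition, and pacing problems. Be specific.",
            },
            {"role": "user", "content": excerpt},
        ],
    }


def critique(model: str, excerpt: str) -> str:
    """Send the excerpt to a locally running Ollama instance and return the critique."""
    payload = json.dumps(build_critique_request(model, excerpt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


# Usage (requires a running Ollama instance and a pulled model):
#   print(critique("llama3.1", open("chapter1.txt").read()))
```

Pointing the same script at OpenRouter instead would mostly mean swapping the URL for their OpenAI-compatible endpoint and adding an API key header.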

Edit: Thanks for the recommendations everyone, lots of things for me to check out when I get the time!

  • nibby@sh.itjust.works · 5 points · 10 months ago

    If you’re up for learning Emacs, it has several packages for integrating with Ollama, such as ellama. It has worked satisfactorily for me.

    • Womble@lemmy.world (OP) · 1 point · 10 months ago

      I actually already use Emacs, I just find configuring it a complete nightmare. Good to know it’s an option though.

      • tal@lemmy.today · 1 point · 10 months ago

        I installed the Emacs ellama package, and I don’t think it required any configuration to use, though I’m not at my computer to check.

  • hendrik@palaver.p3x.de · 2 points · 10 months ago

    I’m not sure if this is what you’re looking for, but for AI-generated novels there’s Plot Bunni. It’s made specifically for storywriting: you can organize ideas, draft, and generate an outline, then chapters, then the story. It has a lot of rough edges, though, and it’s not an editor; I’ve only had very limited success with it. But it exists and caters to storywriting.

  • brucethemoose@lemmy.world · 2 points · 10 months ago

    Mikupad is incredible:

    https://github.com/lmg-anon/mikupad

    I think my favorite feature is the ‘logprobs’ mouseover, aka showing the probability of each token as it’s generated. It’s like a built-in thesaurus, a great way to dial in sampling, and you can regenerate from that point.

    Once you learn how instruct formatting works (and how it auto inserts tags), it’s easy to maintain some basic formatting yourself and question it about the story.

    It’s also fast. It can handle 128K context without being too laggy.

    I’d recommend the llama.cpp server or TabbyAPI as backends (depending on the model and your setup), though you can use whatever you wish.

    I’d recommend exui as well, but seeing how exllamav2 is being deprecated, it’s probably not the best idea to use anymore… But another strong recommendation is kobold.cpp (which can use external APIs if you want).
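    As a concrete sketch of driving one of these backends, here’s how you might query llama.cpp’s bundled `llama-server` through its OpenAI-compatible endpoint while also requesting per-token probabilities, i.e. the same data behind a Mikupad-style logprobs display. The port is llama-server’s default; the prompt and `top_logprobs` count are placeholder choices of mine, and the exact logprobs response shape is my reading of the OpenAI-style schema, so treat it as an assumption.

    ```python
    import json
    import urllib.request

    API_URL = "http://localhost:8080/v1/chat/completions"  # llama-server's default port


    def build_request(prompt: str, top_logprobs: int = 5) -> dict:
        """Build an OpenAI-style chat request that also asks for token log-probabilities."""
        return {
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 64,
            "logprobs": True,              # include log-probabilities in the response
            "top_logprobs": top_logprobs,  # top-N alternative tokens at each position
        }


    def complete_with_logprobs(prompt: str) -> list:
        """Query a running llama-server; return (token, [(alt, logprob), ...]) pairs."""
        payload = json.dumps(build_request(prompt)).encode()
        req = urllib.request.Request(
            API_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            choice = json.loads(resp.read())["choices"][0]
        return [
            (tok["token"], [(alt["token"], alt["logprob"]) for alt in tok["top_logprobs"]])
            for tok in choice["logprobs"]["content"]
        ]


    # Usage (requires a running llama-server with a loaded model):
    #   for token, alts in complete_with_logprobs("Suggest a stronger verb than 'walked'."):
    #       print(repr(token), alts)
    ```

    The same request works against any backend exposing an OpenAI-compatible API, which is why front ends like Mikupad can stay backend-agnostic.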