The prompt looked normal, wasn't exceptionally long, and nothing NSFW. I don't yet see any pattern causing it, and I didn't save the prompts or try to remake it with the same seed. I will on the next one. I'm addicted to generating beautiful MLP ponies, so I'll be at it for the next 700 hours, and I'm sure I'll gen another unless you solve and fix it first.
Also, because the image and topic involve AI, I predict 4 downvotes from the wider Lemmy ecosystem. 2 or 3 seems more reasonable, but I'm feeling 4.
Same with me, but just today. Fingers crossed, maybe the next t2i update is imminent.
😂 what's the trick to making such a precise prediction?
intuition ✨️🦄✨️
Normally this happens very rarely, but I've had it happen to me before. I think there was some confusion when transferring the prompt, or something was changed in the code at that moment. I use Perchance myself 95% of the time to generate MLP images, and it actually works really well! I use my own private generator where I can write the code myself, which is at least possible in Perchance. It's really very simple.
Starlight is my fav by far, and I consider her my spirit animal.
Celestia, Luna, and Twilight are the others I like enough to include as characters in things I make that can have characters.
Who do you like?

There is actually a glitch in our backend… you still use the T2I backend, or whatever it was… All I know is Google pointed me to the fact that this kind of common, random glitch-out for no apparent reason, like the one you experienced, has something to do with calls dropping out and possibly our sloppy asynchronous-functions code.
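For illustration only, here is a minimal sketch of how an unhandled asynchronous call can silently drop a generation request and surface as a "random" glitch instead of a visible error. The endpoint, types, and retry logic below are assumptions for the sake of the example, not the actual Perchance/T2I backend code:

```typescript
// Hypothetical sketch: how a dropped async call can look like a random glitch.
// The URL, types, and retry strategy are assumptions, not the real backend.

type GenResult = { imageUrl: string };

// Buggy pattern: errors from the fetch are swallowed, so a dropped connection
// produces an undefined result downstream instead of a reported failure.
function generateImageBuggy(prompt: string): Promise<GenResult | undefined> {
  return fetch("https://example.invalid/t2i", {
    method: "POST",
    body: JSON.stringify({ prompt }),
  })
    .then((res) => res.json() as Promise<GenResult>)
    .catch(() => undefined); // error silently discarded -> caller sees nothing
}

// Safer pattern: await the call, check the status, and retry a few times
// so a transient drop-out doesn't surface to the user as a glitched image.
async function generateImage(prompt: string, retries = 3): Promise<GenResult> {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const res = await fetch("https://example.invalid/t2i", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ prompt }),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return (await res.json()) as GenResult;
    } catch (err) {
      if (attempt === retries) throw err; // give up after the last attempt
      await new Promise((r) => setTimeout(r, 500 * attempt)); // brief backoff
    }
  }
  throw new Error("unreachable");
}
```

If the backend does something close to the buggy variant, a dropped call would indeed look like an unreproducible one-off from the user's side, which matches the "no apparent pattern" described above.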
