Honestly, I'm not sure what we'd bring to that that you can't already get from a standalone AI app; it wouldn't even be any faster, since you can bring up your preferred LLM more or less instantly with a button shortcut on both iOS and Android. And since this would entail us paying substantial API fees to whichever company made the AI we were using, it seems hard to justify as a feature addition.
We do plan to offer some prompts you can use to have your preferred AI generate content in a way that will be nicely formatted in Pleco - dictionary entries, flashcards, interactive reader documents, etc. - so people who want to incorporate AI content into their use of Pleco could do it that way. But I'm not currently seeing any clear advantage to having an LLM chatbot built into Pleco that would justify the substantial usage costs.