<aside> 💡
This refers to the ability to both create and run code live during an LLM chat session.
Most of the major LLM vendors now provide this capability built in, but it must be enabled. For example, in ChatGPT, you must turn on “Code Interpreter”.
</aside>
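The same capability can also be switched on outside the chat UI. Below is a minimal sketch using the OpenAI Python SDK’s Assistants API, which exposes the sandboxed code interpreter as a tool; the assistant name, model, and prompt are illustrative assumptions, not details from this post.

```python
# Minimal sketch: enabling OpenAI's code interpreter tool programmatically.
# Assumes the openai Python SDK is installed and OPENAI_API_KEY is set;
# the name, model, and prompt below are hypothetical.
from openai import OpenAI

client = OpenAI()

# Create an assistant with the code_interpreter tool enabled -- the API
# equivalent of toggling "Code Interpreter" on in the ChatGPT UI.
assistant = client.beta.assistants.create(
    name="Live coding helper",                # hypothetical name
    model="gpt-4o",                           # hypothetical model choice
    instructions="Write and run Python code to answer data questions.",
    tools=[{"type": "code_interpreter"}],
)

# Start a thread, ask a question, and let the assistant write and execute code.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Compute the first 10 Fibonacci numbers and plot them.",
)
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=assistant.id,
)

# Print the assistant's replies; any code it wrote ran in OpenAI's sandbox.
for msg in client.beta.threads.messages.list(thread_id=thread.id, order="asc"):
    for part in msg.content:
        if part.type == "text":
            print(f"{msg.role}: {part.text.value}")
```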
For example, my “Morphicator,” which generates custom combinations of the genAI input & output cards, was created that way. Play with it here: https://claude.site/artifacts/2e804a1c-e62d-465c-a24e-70f0fa262d87
When building custom GPTs with OpenAI, you can tie them to other applications through Zapier for lightweight automation, or use OpenAI’s native “Actions” to connect applications directly.
Read about Zapier actions here: https://actions.zapier.com/docs/platform/gpt
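For a rough sense of what a native Action involves: a custom GPT Action is defined by an OpenAPI schema that tells the GPT how to call an external API. The sketch below builds a minimal schema in Python for a hypothetical weather endpoint; the service URL, path, and parameters are assumptions for illustration, not from this post.

```python
# Minimal sketch of an OpenAPI schema for a custom GPT "Action".
# The endpoint (api.example.com) and its parameters are hypothetical;
# in practice a schema like this is pasted into the GPT builder's Actions panel.
import json

action_schema = {
    "openapi": "3.1.0",
    "info": {
        "title": "Weather lookup",   # hypothetical service
        "version": "1.0.0",
    },
    "servers": [{"url": "https://api.example.com"}],  # hypothetical base URL
    "paths": {
        "/weather": {
            "get": {
                # The GPT decides when to call this by its operationId and summary.
                "operationId": "getCurrentWeather",
                "summary": "Get current weather for a city",
                "parameters": [
                    {
                        "name": "city",
                        "in": "query",
                        "required": True,
                        "schema": {"type": "string"},
                    }
                ],
                "responses": {
                    "200": {"description": "Current conditions as JSON"}
                },
            }
        }
    },
}

# Dump the schema so it can be pasted into the GPT builder.
print(json.dumps(action_schema, indent=2))
```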
***Last updated: November 21st, 2024***