Is anyone using GPT APIs (OpenAI etc) in an AwareIM app to build AI Assistants in Aware Apps?
Curious to know what you're doing, how's it going, what your Aware build structure is like, struggles, etc.
For us: we have used the OpenAI API in a few limited places in our Aware Apps for specific use cases, e.g. to re-write email templates, to generate content for vision statements, etc. We have also built and integrated an external chatbot into an Aware App.
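For context, the template re-write is really just a single REST call to the chat completions endpoint. Below is a stand-alone Python sketch of roughly what we wrap (the model name, prompt wording and EMAIL_TEMPLATE value are illustrative only, not our production setup):

```python
# Minimal sketch of the kind of call used for template re-writes.
# Model, prompt and EMAIL_TEMPLATE are placeholders -- adjust to your own app.
import requests

API_KEY = "sk-..."  # stored securely in the real app, never hard-coded
EMAIL_TEMPLATE = "Hi {FirstName}, thanks for signing up..."

payload = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "system",
         "content": "Rewrite the email template in a friendlier tone. "
                    "Keep all {placeholders} intact."},
        {"role": "user", "content": EMAIL_TEMPLATE},
    ],
}

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
rewritten = resp.json()["choices"][0]["message"]["content"]
print(rewritten)  # in Aware this result is written back into the template attribute
```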
We're looking to build some more complex stuff now, things like:
- chat style interface
- taking inputs that are typed in by the user and saving them into attributes
- accessing internal knowledge bases and answering questions in a chat interface
- processing uploads (CSV, PDF, video etc.), extracting information, and saving the results to Aware attributes (rough sketch of this one after the list)
- accessing other systems (LinkedIn, Gmail, Outlook, calendars etc.)
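To make the extraction item concrete, here is a hypothetical sketch of that step: the uploaded document's text goes to the API with a JSON-only instruction, and the returned keys would then be mapped onto Aware attributes. The field names, model and extract_fields() helper are assumptions, not existing Aware functionality:

```python
# Hypothetical sketch: ask the model to pull structured fields out of an
# uploaded document and return JSON, which a process could then map onto
# Aware attributes. Field names and helper are assumptions.
import json
import requests

API_KEY = "sk-..."

def extract_fields(document_text: str) -> dict:
    payload = {
        "model": "gpt-4o-mini",
        "response_format": {"type": "json_object"},
        "messages": [
            {"role": "system",
             "content": "Extract InvoiceNumber, SupplierName and TotalAmount "
                        "from the document. Reply with a JSON object only."},
            {"role": "user", "content": document_text},
        ],
    }
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=payload,
        timeout=120,
    )
    return json.loads(resp.json()["choices"][0]["message"]["content"])

# Each key/value pair would then be written into the matching attribute
# on the Aware object (e.g. Invoice.InvoiceNumber).
print(extract_fields("INVOICE #1043 from Acme Pty Ltd, total $1,250.00"))
```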
Depending on how this thread goes, perhaps we might want to share ideas in a subgroup?
Aware + AI (GPTs/LLMs) + Workflows
Rod. Aware V9 (latest build), Developer Edition, on OS Linux (Ubuntu) using GUI hosted on AWS EC2, MYSQL on AWS RDS
Re: Aware + AI (GPTs/LLMs) + Workflows
I am also interested in hearing what people are doing in this area (and also in hearing more about what you are doing, Rod). AI seems to be in everything today, so we need these options in Aware as well if we want to stay up to date with today's and tomorrow's technology.
Henrik (V8 Developer Ed. - Windows)
Re: Aware + AI (GPTs/LLMs) + Workflows
We are doing something similar in one of our projects and will hopefully share it soon.
From,
Himanshu Jain
AwareIM Consultant (since version 4.0)
OS: Windows 10.0, Mac
DB: MYSQL, MSSQL
Re: Aware + AI (GPTs/LLMs) + Workflows
Implementing AI into apps is key. I have been investigating deep integration into some of our apps; however, doing it inside AwareIM runs into a limiting factor:
In order to successfully build ChatGPT into an AwareIM application, support for the WebSocket protocol must be in place. Hopefully support will see the light in this regard and make it a built-in feature, as they did with REST.
ChatGPT's response on WebSockets:
Simulating real-time user interaction with an AwareIM-based app could benefit from using WebSockets, especially if you want two-way communication between ChatGPT and the AwareIM system without constant polling or refreshing.
WebSockets provide a persistent connection between the server and the client, allowing real-time data exchange. This would be useful in several scenarios:
Live updates: If you want ChatGPT to provide real-time status updates or responses from, e.g., an ERP (showing the user when a transaction is completed or an inventory item is updated), WebSockets would be a good choice.
Continuous data streams: For scenarios where ChatGPT needs to interact with ERP data as it changes (e.g., tracking transactions, live inventory changes, or order processing updates), WebSockets provide a way to push updates immediately without requiring the client to make repeated API requests.
Simulating user actions in real-time: If your goal is to mimic user interactions such as placing orders, processing invoices, or updating records within the ERP in a responsive, live manner, WebSockets would be ideal for sending actions or updates back and forth between the ERP system and ChatGPT.
In summary, setting up WebSockets would allow you to create a dynamic, real-time interaction layer between ChatGPT, your ERP system, and the users.
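For anyone who wants to try the pattern ChatGPT is describing outside of Aware first, here is a bare-bones sketch using the Python websockets package. It only illustrates the push model (persistent connections, server-initiated updates); the bridge process, port and sample event are made up, since AwareIM has no built-in WebSocket endpoint today, which is exactly the gap discussed above.

```python
# Toy illustration of the push model: a small bridge process keeps one
# persistent connection per chat client and broadcasts events to them,
# with no client-side polling. Everything here is an assumption, not an
# AwareIM feature.
import asyncio
import websockets

CONNECTED = set()

async def handler(websocket, path=None):  # path kept for older websockets releases
    # Each chat client keeps one persistent connection open.
    CONNECTED.add(websocket)
    try:
        async for message in websocket:
            # Incoming chat messages would be forwarded to the LLM / AwareIM here.
            print("from client:", message)
    finally:
        CONNECTED.discard(websocket)

async def broadcast(update: str):
    # Push an ERP-style event (order completed, stock changed, ...) to every
    # open connection immediately.
    for ws in list(CONNECTED):
        await ws.send(update)

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        while True:
            await asyncio.sleep(5)
            await broadcast("inventory item updated")  # placeholder event

if __name__ == "__main__":
    asyncio.run(main())
```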