Curious to know what you're doing, how's it going, what your Aware build structure is like, struggles, etc.
Implementing AI into apps is key. I've been investigating deep integration into some of our apps; however, doing it inside AwareIM, there is a limiting factor:
In order to successfully build ChatGPT into an AwareIM application, support for the WebSocket protocol must be in place. Hopefully support will see the light in this regard and make it a built-in feature, like they did with REST.
ChatGPT's response on WebSockets:
Simulating real-time user interaction with an AwareIM-based app could benefit from using WebSockets, especially if you want two-way communication between ChatGPT and the AwareIM system without constant polling or refreshing.
WebSockets provide a persistent connection between the server and the client, allowing real-time data exchange. This would be useful in several scenarios:
Live updates: If you want ChatGPT to provide real-time status updates or responses from an ERP (e.g., showing the user when a transaction is completed or an inventory item is updated), WebSockets would be a good choice.
Continuous data streams: For scenarios where ChatGPT needs to interact with ERP data as it changes (e.g., tracking transactions, live inventory changes, or order processing updates), WebSockets provide a way to push updates immediately without requiring the client to make repeated API requests.
Simulating user actions in real-time: If your goal is to mimic user interactions such as placing orders, processing invoices, or updating records within the ERP in a responsive, live manner, WebSockets would be ideal for sending actions or updates back and forth between the ERP system and ChatGPT.
In summary, setting up WebSockets would allow you to create a dynamic, real-time interaction layer between ChatGPT, your ERP system, and the users.
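To make the push-vs-poll distinction concrete, here's a toy Python sketch. This is not AwareIM or actual WebSocket code: an in-memory asyncio.Queue stands in for the socket, and the event names are invented. It just shows the shape of the interaction: the ERP side pushes updates the moment they happen, and the client side simply awaits them instead of repeatedly asking.

```python
import asyncio

async def erp_side(channel: asyncio.Queue) -> None:
    """Simulate the ERP pushing status updates the moment they occur."""
    for event in ("order 1001 placed", "invoice 1001 processed", "stock updated"):
        await asyncio.sleep(0)    # stand-in for real work happening in the ERP
        await channel.put(event)  # push the update, instead of waiting to be polled
    await channel.put(None)       # sentinel: stream finished

async def client_side(channel: asyncio.Queue) -> list:
    """Receive updates as they arrive, like a WebSocket onmessage handler."""
    received = []
    while True:
        event = await channel.get()
        if event is None:
            break
        received.append(event)
    return received

async def main() -> list:
    channel = asyncio.Queue()
    _, received = await asyncio.gather(erp_side(channel), client_side(channel))
    return received

print(asyncio.run(main()))
```

With a real WebSocket in place, the queue would be replaced by the persistent connection, but the client logic stays the same: react to messages as they arrive, no refresh loop.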