Serviceability is usually the last thing on the minds of most product managers and developers. Yet 90% of product users have little time to play around with various configurations to make the product work, because it is only one of many products they manage. Nobody has time to read the docs.
Product managers like to say their product is as intuitive as Apple's, but they still need to provide guardrails and alerts in a way that makes the product self-serviceable or even self-healing.
In this blog, we will look at how GenAI can improve a product's usability and serviceability, and how product managers can make their product easy for LLMs to learn. We will cover two areas:
- Making the product and its documents GenAI-ready
- Using GenAI/LLMs inside the product for proactive monitoring, self-healing, and better usability/serviceability
GenAI Ready Product:
1. Logs:
Logs generated by the product should have a clear, consistent structure so that LLMs can train on them easily, and any PII data should be easy to identify (and exclude). For example:
Error: Timestamp: Message in clear English
Info: Timestamp: Message in clear English
All the processes in your product should follow a similar pattern.
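Below is a minimal sketch of what such structured, LLM-friendly logging could look like in Python, using only the standard library. The field names (level, timestamp, message, contains_pii) are illustrative, not any standard.

```python
# A minimal sketch of structured, LLM-friendly logging using the standard library.
import json
import logging
from datetime import datetime, timezone

class JsonFormatter(logging.Formatter):
    def format(self, record):
        entry = {
            "level": record.levelname,                       # ERROR / INFO / ...
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "message": record.getMessage(),                  # clear English text
            "contains_pii": getattr(record, "contains_pii", False),
        }
        return json.dumps(entry)

logger = logging.getLogger("product")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("Backup job completed for 42 virtual machines")
logger.error("Snapshot failed: datastore out of space",
             extra={"contains_pii": False})
```

Because every process emits the same JSON shape, a model (or a plain parser) can consume the logs without per-component rules.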
2. Guardrails LLM
Guide the customer to an optimal solution rather than letting them shoot themselves in the foot. These guardrails can be backed by product-specific LLMs running within the product (a smaller-footprint model that acts like a well-trained product user).
For example, do not allow the customer to install new software if the system is already short on storage or memory.
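A minimal pre-install guardrail could look like the sketch below. The disk threshold is an arbitrary illustration; in practice a product-trained LLM would supply the explanation and remediation advice rather than a hard-coded message.

```python
# A minimal guardrail sketch: block an install when the system is already
# short on disk space. The threshold is a hypothetical example value.
import shutil

MIN_FREE_DISK_GB = 10

def install_allowed(install_path: str = "/") -> bool:
    free_gb = shutil.disk_usage(install_path).free / 1024**3
    if free_gb < MIN_FREE_DISK_GB:
        print(f"Blocked: only {free_gb:.1f} GB free, "
              f"{MIN_FREE_DISK_GB} GB required. "
              "Free up space or expand storage before installing.")
        return False
    return True

if install_allowed():
    print("Pre-checks passed, proceeding with installation.")
```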
3. Customer pattern learner LLM
This LLM can sit in the product or run in the vendor's SaaS to learn the customer's usage and use cases and propose solutions. It can also alert the customer when an anomaly is spotted.
For example, customers running an older code version with a bug that affects their specific use case can be alerted to upgrade (a version recommender).
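A deterministic sketch of the version-recommender idea follows. The KNOWN_ISSUES table, version numbers, and use-case names are all hypothetical; in practice an LLM would infer the use case from telemetry rather than from an explicit set.

```python
# A hypothetical "version recommender": match the customer's running version
# and observed use case against known-bug fingerprints and suggest an upgrade.
KNOWN_ISSUES = [
    {"affected_versions": ["4.1", "4.2"], "use_case": "sql-log-backup",
     "fixed_in": "4.3",
     "summary": "SQL transaction-log backups may hang under heavy load"},
]

def recommend_upgrade(running_version: str, observed_use_cases: set) -> str | None:
    for issue in KNOWN_ISSUES:
        if (running_version in issue["affected_versions"]
                and issue["use_case"] in observed_use_cases):
            return (f"You are on {running_version} and use "
                    f"'{issue['use_case']}': {issue['summary']}. "
                    f"Upgrade to {issue['fixed_in']} or later.")
    return None

alert = recommend_upgrade("4.2", {"sql-log-backup", "vm-backup"})
if alert:
    print(alert)
```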
4. Utilizing LLMs for insights/analytics and file-walker algorithms (backup vendors and others already browse files to identify patterns and can apply GenAI techniques such as LLMs and vector databases).
Convert the existing ML-based analytics to LLM-based analytics.
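The sketch below shows the general shape of an embedding-based file walker. Here, embed_text() is a placeholder for whatever embedding model or vector-database client the product actually uses; the random vectors exist only to keep the example self-contained.

```python
# A sketch of an LLM-era "file walker": embed file contents and compare them
# with cosine similarity to spot related or duplicate content.
import os
import numpy as np

def embed_text(text: str) -> np.ndarray:
    # Placeholder: call the real embedding model / vector DB client here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def walk_and_embed(root: str) -> dict:
    vectors = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    vectors[path] = embed_text(fh.read(4096))
            except OSError:
                continue
    return vectors

# Usage idea: vectors = walk_and_embed("/data"), then rank files by
# cosine(vectors[reference_file], v) to surface similar content.
```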
5. Prompt engineering: simplify the UI experience for the customer; the current UX/UI can remain available for advanced users.
Example prompt: "Identify current bottlenecks and suggest a solution."
Example response: "A CPU bottleneck was identified because there are too many zombie processes." A follow-up prompt can then ask the product to identify those processes and kill them.
Prompt examples for backup software: "Show me the current job that protects VMware-SQL-Server-5" or "Protect SQL-Server6" (the product either provides the steps or configures the job automatically).
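A thin sketch of how such a prompt could be routed to existing product actions is shown below. The llm_complete() function and the action names are hypothetical stand-ins for the vendor's model and automation layer.

```python
# A sketch of a prompt-driven front end over existing product APIs.
import json

def llm_complete(prompt: str) -> str:
    # Placeholder for the product-embedded or hosted LLM call.
    return '{"action": "create_protection_job", "target": "SQL-Server6"}'

SYSTEM_PROMPT = (
    "Translate the user's request into one of the product's actions "
    "(create_protection_job, show_job, identify_bottleneck) as JSON."
)

def handle_prompt(user_prompt: str) -> dict:
    raw = llm_complete(f"{SYSTEM_PROMPT}\nUser: {user_prompt}")
    plan = json.loads(raw)
    print(f"Proposed action: {plan['action']} on {plan.get('target')}")
    # In a real product, confirm with the user, then call the backing API.
    return plan

handle_prompt("Protect SQL-Server6")
```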
6. If you ship hardware with your product, LPU/GPU-ready hardware may be the future; alternatively, you can ship the call-home data to GPU clusters in Amazon to run insights and analytics.
7. Better product APIs to interact with LLMs: LLMs should be able to connect to the data source, log in to the product, and change the product's configuration automatically based on the prompt. This will help with AI-powered automation. There is a new development in this area called AI APIs, which take things a step further by using machine learning and natural language processing to understand requests, generate relevant responses, and complete tasks.
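One common way to expose a product API to an LLM is a tool/function-calling schema, sketched below. The set_retention() endpoint and its fields are hypothetical; a real product would add authentication and validation before executing anything.

```python
# A sketch of exposing a product API as an LLM-callable "tool". The schema
# mirrors common function-calling conventions; the endpoint is hypothetical.
TOOLS = [
    {
        "name": "set_retention",
        "description": "Change the backup retention period for a policy",
        "parameters": {
            "type": "object",
            "properties": {
                "policy": {"type": "string"},
                "days": {"type": "integer", "minimum": 1},
            },
            "required": ["policy", "days"],
        },
    },
]

def set_retention(policy: str, days: int) -> str:
    # Placeholder for a real, authenticated product API call.
    return f"Retention for policy '{policy}' set to {days} days"

def dispatch(tool_call: dict) -> str:
    # The LLM returns a tool name plus arguments; the product validates and runs it.
    handlers = {"set_retention": set_retention}
    return handlers[tool_call["name"]](**tool_call["arguments"])

print(dispatch({"name": "set_retention",
                "arguments": {"policy": "gold", "days": 30}}))
```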
Documents suitable for GenAI:
If the product vendor generates the product documents, they should be structured so they can be parsed and indexed by AI. Reinforced/supervised training is useful for verifying that the model produces clear, concise, and correct answers for a specific vendor software version, without hallucinations or contradictions, and that it translates correctly into multiple languages.
LLMs are good at reading and summarizing documents. A quick test run of various prompts against an LLM trained on the new docs for every version or white paper will improve confidence.
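Such a test run could be as simple as the regression-style check sketched below. Here, ask_docs_llm(), the questions, and the expected phrases are all placeholders for whatever the vendor's doc-trained model and release checklist actually contain.

```python
# A sketch of a per-release doc check: a fixed set of prompts goes to the
# doc-trained model and answers are checked for expected key phrases.
def ask_docs_llm(question: str, version: str) -> str:
    # Placeholder: query the model trained/indexed on this version's docs.
    return "Use the web UI: Settings > Upgrade, minimum 10 GB free space."

TEST_PROMPTS = [
    ("How do I upgrade to the latest release?", ["Settings > Upgrade"]),
    ("What is the minimum free space for an upgrade?", ["10 GB"]),
]

def run_doc_checks(version: str) -> None:
    for question, must_contain in TEST_PROMPTS:
        answer = ask_docs_llm(question, version)
        missing = [p for p in must_contain if p not in answer]
        status = "PASS" if not missing else f"FAIL (missing {missing})"
        print(f"[{status}] {question}")

run_doc_checks("5.0")
```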
LLMs for proactive monitoring and self-healing:
- Pros (of running the LLM locally in the product): secure, quick identification, trained on local data; useful in the same way a local model on a camera monitors for a break-in
- Cons: requires local admin access and regular upgrades; limited processing power
- Identify an anomaly and its corresponding fingerprint, then provide a solution, or apply the solution automatically if one exists.
- Walk through the logs/alerts to identify new issues and alert the respective teams (Engineering, Field Notice, CSMs), including creating a draft of the field-notice doc.
- LLMs can identify the blast radius of a given fingerprint.
- LLMs can be live monitors of your product and can be enabled to fix the issue, or to create the docs or scripts needed to resolve it.
- LLMs can scan these logs much faster than the current log parsers/file walkers (see the sketch below).
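A minimal sketch of this monitoring loop follows. The fingerprints, team names, and suggested fixes are illustrative; a production version would feed unmatched lines to an LLM for triage instead of relying only on fixed regexes.

```python
# A sketch of a proactive log monitor: scan recent log lines, match known
# fingerprints, and either suggest a documented fix or alert the owning team.
import re

FINGERPRINTS = [
    {"pattern": re.compile(r"snapshot failed.*out of space", re.I),
     "fix": "expand the datastore or prune old snapshots",
     "team": "Field Notice / CSM"},
    {"pattern": re.compile(r"too many zombie processes", re.I),
     "fix": "restart the job scheduler service",
     "team": "Engineering"},
]

def scan_logs(lines):
    findings = []
    for line in lines:
        for fp in FINGERPRINTS:
            if fp["pattern"].search(line):
                findings.append({"line": line, "fix": fp["fix"], "team": fp["team"]})
    return findings

sample = ["2024-05-01T10:02:11Z ERROR Snapshot failed: datastore out of space"]
for hit in scan_logs(sample):
    print(f"Alert {hit['team']}: {hit['line']!r} -> suggested fix: {hit['fix']}")
```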