Exploring the Role of LLMs in the Problem Resolution Process (IT / Product Support)

With over two decades of experience at multinational Fortune 500 companies, I have witnessed and been an integral part of the evolution of IT and product support. Throughout my journey, from handling Level 1 through Level 3 support roles to leading diverse teams and managing intricate projects, I've observed how critical problem resolution is to maintaining client trust and operational excellence. In recent years, the emergence of Large Language Models (LLMs) has promised a paradigm shift in the support arena. This article delves into the transformative potential of LLMs, reflecting on their capability to diagnose, understand, and resolve complex, vendor-specific challenges, all while working collaboratively with traditional support structures.

The diagram above illustrates the structured communication and support framework between a client's location and a vendor's location in a technology-driven environment. Central to this interaction mechanism are the "Interaction Channels," which serve as the medium for communication. These channels encompass various modalities including Telephone, Web Chat, and Remote Access, each tailored to address specific support needs while ensuring seamless communication.

On one hand, the "Problem Process" delineates the client's responsibility in recognizing and reporting issues, emphasizing the client's active role in the initial stages of troubleshooting. On the other, the "Support Level" compartmentalizes the vendor's technical support into three distinct tiers: Level 1, Level 2, and Level 3. These gradations allow for an organized and efficient escalation matrix, ensuring that technical challenges are addressed with increasing expertise and specialization as they progress through the levels.
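The tiered escalation matrix described above can be sketched in a few lines. The level names follow the article; the ticket and outcome shapes are purely illustrative stand-ins for a real ticketing workflow:

```python
# Minimal sketch of a three-tier escalation matrix.
# Level names mirror the article; everything else is illustrative.

LEVELS = ["Level 1", "Level 2", "Level 3"]

def escalate(ticket, resolved_at_level):
    """Walk a ticket up the support tiers until one resolves it.

    `resolved_at_level` maps a level name to True/False, standing in
    for the outcome of that tier's troubleshooting effort."""
    path = []
    for level in LEVELS:
        path.append(level)
        if resolved_at_level.get(level, False):
            return level, path
    return None, path  # exhausted all tiers without resolution

# A ticket that Level 1 cannot fix passes through to Level 2.
level, path = escalate("printer offline", {"Level 2": True})
```

Each tier only sees tickets the tier below could not close, which is the organizing principle behind the escalation matrix.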

Current State Problem Resolution Process as of Oct. 2023

The "LLM (Proprietary)" in the depicted framework signifies a specialized customer-facing support system, tailored with specific domain knowledge. Unlike generic AI models, the proprietary nature of the LLM indicates that it has been fine-tuned or adapted for a particular industry or application, ensuring that its responses and solutions are more relevant and accurate for the user's context. This advanced level of customization allows the LLM to effectively bridge the gap between the client user and the AI, facilitating a more seamless and efficient knowledge-sharing process.

Within the modern IT support landscape, the integration of such domain-specific AI models like the LLM (Proprietary) has become imperative. They not only enhance the user experience by providing more context-aware support but also enable users to extract maximum value from the technology. This represents the current evolutionary state of IT support, where AI is no longer just a passive tool but an active participant in problem recognition and resolution, symbiotically working alongside human users.

LLM as OS Phase

The presented diagram offers a groundbreaking integration of Large Language Models (LLMs) into the realm of IT support and operations. Central to this concept is the "LLM (OS)" component, which serves as an innovative blend of an operating system and a communication conduit. In essence, the LLM (OS) functions as a task director within the machine. It orchestrates various operations, and when faced with challenges, it liaises directly with the Vendor LLM. Once directives are received, the LLM (OS) executes the necessary tasks, promising streamlined operations and swift issue resolution.

A core innovation that makes this feasible is the MemGPT system, as referenced in the research paper (Link). MemGPT empowers LLMs to surpass their traditional limitations concerning context windows. This is achieved through an adept memory management system, which retains and retrieves comprehensive data from extended interactions. Such capabilities are invaluable in IT support, where maintaining a cohesive understanding of machine states, user interactions, and ongoing issues is paramount. The MemGPT system ensures the LLM can manage long troubleshooting sequences and comprehend a wide range of machine operations.
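As a rough illustration of the idea behind MemGPT (not its actual API), a bounded working context can page older conversation turns out to an archive and recall them on demand, so a long troubleshooting session never outgrows the model's context window:

```python
from collections import deque

class BoundedContext:
    """Toy illustration of MemGPT-style memory paging: keep a fixed-size
    working context, spill the oldest turns to an archive, and recall
    archived turns by keyword. (Illustrative only; the real MemGPT
    system is far more sophisticated.)"""

    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)  # in-context "main memory"
        self.archive = []                        # out-of-context store

    def add(self, turn):
        if len(self.window) == self.window.maxlen:
            self.archive.append(self.window[0])  # page the oldest turn out
        self.window.append(turn)

    def recall(self, keyword):
        # Retrieve archived turns relevant to the current step.
        return [t for t in self.archive if keyword in t]

ctx = BoundedContext(window_size=3)
for turn in ["user: printer offline", "llm: check cable",
             "user: cable fine", "llm: restart spooler"]:
    ctx.add(turn)
# The window holds the 3 most recent turns; the first was archived
# and remains retrievable via recall().
```

The key property is that nothing is lost when the window fills: earlier machine states and user reports stay retrievable, which is what makes long troubleshooting sequences tractable.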

Lastly, AI expert Andrej Karpathy (Link) envisions a transformative role for LLMs, not just as chatbots, but as the nucleus of next-gen operating systems. LLMs, in this capacity, are anticipated to handle diverse modalities like text, audio, and vision, making them adaptable to a range of IT platforms. Even more impressively, their potential to self-program and self-optimize showcases a future where machines can autonomously evolve. By fusing concepts like MemGPT with Karpathy's vision, IT professionals are introduced to a revolutionary approach to machine operations and IT support, setting the stage for the next era of IT service management.

Edge LLM Phase

During the GTC Spring 2023 conference, Nvidia CEO Jensen Huang gave an insightful discussion of the transformative role of Large Language Models (LLMs) within edge computing. A standout feature of his discourse is the introduction of the "Edge LLM," a groundbreaking solution that strategically positions LLMs on edge devices. This positioning facilitates swift data processing and decision-making. But beyond the immediate benefits of rapid data analytics, the integration of the Edge Agent LLM is set to play a pivotal role when connected to the overarching LLM Operating System (OS). Huang's keynote is available here [Link].

The diagram above represents a sophisticated workflow highlighting the integration and collaboration of various Large Language Models (LLMs) in the realm of problem resolution within IT support structures.

At the machine or client location, there are two pivotal AI entities: the Edge Agent LLM and the LLM (OS). The Edge Agent LLM is responsible for problem recognition, which can either be proactive or triggered based on specific events or anomalies. Upon recognizing an issue, the Edge Agent LLM communicates with the LLM (OS) to report the problem, ensuring a seamless flow of information between machine-initiated responses and operating system-centric diagnostics.

This local problem identification and communication mechanism then intersects with the vendor location, primarily through a series of interaction channels including APIs, inter-LLM communication, and manual remote access capabilities. The Vendor LLM, represented symbolically with a brain, holds proprietary knowledge and insights, positioning it as the Level 1 support. It utilizes its specialized understanding to diagnose and potentially resolve issues. If the Vendor LLM cannot resolve the issue, it escalates the problem to human-centered Level 2 and Level 3 tech support. This tiered, collaborative approach ensures swift, precise, and effective problem resolution, leveraging both AI-driven insights and human expertise.
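A minimal sketch of this flow, with hypothetical metric names, thresholds, and fixes standing in for the real Edge Agent LLM and Vendor LLM:

```python
def edge_agent_detect(metrics):
    """Edge Agent LLM stand-in: flag anomalies in local machine metrics.
    (Metric names and thresholds are hypothetical.)"""
    issues = []
    if metrics.get("disk_used_pct", 0) > 90:
        issues.append("disk nearly full")
    if metrics.get("service_up", True) is False:
        issues.append("service down")
    return issues

def vendor_llm_level1(issue):
    """Vendor LLM (Level 1) stand-in: resolve issues covered by its
    proprietary knowledge, escalate everything else to human tiers."""
    known_fixes = {"disk nearly full": "rotate logs"}
    if issue in known_fixes:
        return {"status": "resolved", "action": known_fixes[issue]}
    return {"status": "escalated", "to": "Level 2 (human)"}

# The Edge Agent recognizes problems, the report flows through the
# LLM (OS), and the Vendor LLM triages each one as Level 1 support.
issues = edge_agent_detect({"disk_used_pct": 95, "service_up": False})
outcomes = [vendor_llm_level1(i) for i in issues]
```

The division of labor mirrors the diagram: detection lives at the client edge, proprietary diagnosis lives with the vendor, and humans enter only on escalation.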

Problem Resolution Process (Auto Execute) Phase

Transitioning to the vendor location, the diagram depicts an innovative blend of AI and human support. The 'AE' notation stands for 'Auto Execute', a feature indicating that this particular LLM possesses both proprietary knowledge and automation capabilities. This means that for Level 1 and Level 2 support, the LLM can not only diagnose issues but can also execute automated solutions, streamlining the resolution process. However, it's essential to note that any auto-execution requires prior approval from tech support, ensuring a human oversight layer to the AI's decisions. If a problem surpasses the AI's capacity or requires a more nuanced approach, it gets escalated to Level 3, where human tech support takes over. This synergy of AI-driven automation and human expertise ensures efficient and accurate problem resolution while retaining the much-needed human touch.
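The approval gate described above might be sketched as follows; the action names and the pre-approved list are illustrative, not drawn from any real tooling:

```python
def auto_execute(action, approved_actions):
    """AE ('Auto Execute') sketch: the LLM may only run an automated
    fix that tech support has approved in advance; anything outside
    that list is escalated to human Level 3 support.
    (Action names and the approval set are illustrative.)"""
    if action in approved_actions:
        return {"executed": True, "action": action}
    return {"executed": False, "escalated_to": "Level 3"}

# Tech support maintains the approval list, preserving human oversight.
approved = {"restart print spooler", "clear temp files"}

ok = auto_execute("restart print spooler", approved)   # runs automatically
blocked = auto_execute("reimage workstation", approved)  # needs a human
```

Keeping the approval list under human control is what preserves the oversight layer while still letting routine fixes run unattended.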

Complete Auto Execute Problem Resolution Process Phase

The diagram encapsulates the revolutionary shift in IT support paradigms, where Levels 1 to 3 have been seamlessly automated through cutting-edge artificial intelligence. Traditionally reliant on manual intervention, these levels encompassed critical stages of problem diagnosis, isolation, and resolution. However, the image illustrates an environment where these functions are now overseen by a sophisticated AI system. This AI doesn't just perform tasks—it comprehends, adapts, and resolves with an efficiency that surpasses human capability, ensuring immediate responses and resolutions to a vast array of common technical glitches and queries.

In this new age of IT support, the traditional barriers and delays inherent to manual troubleshooting are eradicated. The AI-driven support system is proactive, preemptively diagnosing and mitigating potential issues before they escalate, optimizing system performance, and ensuring a smoother user experience. The implication here is not just about speed, but precision and consistency, with the AI minimizing the margin of error in problem resolutions.

Yet, the visual also subtly acknowledges the irreplaceable value of human expertise. While the majority of support levels have transitioned to AI, Level 4 (implied) remains steadfastly manual, handling specialized hardware concerns and unique, one-off issues. This distinction emphasizes that, despite the prowess of AI, there are nuanced challenges and specialized tasks where human intuition, experience, and hands-on skills are paramount. 

Vendor LLM Client Location Phase

The image underscores the strategic integration of the Vendor's Large Language Model AI (LLM) within the client's machine framework, where it's designed to supplement, and in many scenarios, supplant the native LLM (OS). With a specialized reservoir of vendor-specific knowledge, this Vendor LLM is adept at diagnosing and tackling nuanced issues that might be beyond the capability of standard operating system-based LLMs. The coexistence of these two models epitomizes the next level of IT support, with the Vendor LLM being tailor-made to understand intricacies tied to specific machine types, software variations, and proprietary challenges.

A significant catalyst in this enhanced support system is the Edge Agent LLM. Acting as a primary filter, the Edge Agent LLM captures, evaluates, and if possible, resolves issues at the first point of contact. However, when faced with vendor-specific challenges or those requiring a deeper dive, it seamlessly hands off the tasks to the Vendor LLM. This synergy between Edge Agent LLM and Vendor LLM, powered by insights into user and machine behavior, offers IT and ITSM professionals an unparalleled support mechanism, ensuring rapid issue identification and resolution with minimal user intervention.
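The primary-filter handoff can be sketched as a simple routing rule: generic issues stay with the Edge Agent LLM, while anything in the vendor's specialty domains is handed off. The domain tags here are hypothetical:

```python
def route(issue, vendor_domains):
    """Primary-filter sketch: the Edge Agent LLM resolves generic issues
    at the first point of contact and hands vendor-specific ones to the
    Vendor LLM. (Domain tags are illustrative.)"""
    if issue["domain"] in vendor_domains:
        return "Vendor LLM"
    return "Edge Agent LLM"

# Hypothetical domains requiring proprietary vendor knowledge.
vendor_domains = {"firmware", "licensing"}

generic = route({"domain": "networking"}, vendor_domains)
specialized = route({"domain": "firmware"}, vendor_domains)
```

In practice the routing decision would itself be model-driven, but the contract is the same: filter first at the edge, escalate by domain.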

Hybrid Client Vendor LLM

The diagram illustrates a transformative approach to IT support, epitomizing the synergy between customer and vendor through a collaborative framework. Situated at the Machine/Client Location is a sophisticated AI model that blends the nuances of machine and user behavior data from the customer side with the comprehensive problem management suite from the vendor, encompassing problem reporting, diagnosis, isolation, identification, and resolution.

Central to this amalgamated setup is the Hybrid Client Vendor Large Language Model (LLM), a cutting-edge tool that underscores the diminishing boundaries of ownership between the vendor and the client. Unlike traditional setups where tools were distinctly owned and operated, this model signifies a paradigm where the lines of demarcation are blurred, fostering a more integrated, cohesive, and responsive IT environment.

Furthermore, there's a pronounced shift in the skills landscape. The traditional reactive skills, once predominant in IT support, are now becoming ancillary. The ascent of MLops and LLMOps champions a proactive stance, driven by the power of machine learning and LLM capabilities. By leveraging the baselining and modeling of machine data coupled with user behavior, the system anticipates and preempts issues rather than just responding to them. As a result, the reactive skillset is relegated to the periphery, emphasizing the industry's progression towards predictive and preemptive IT operations.
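As a toy example of the baselining described above, a z-score check against historical samples can flag a reading that deviates from the machine's normal behavior before a user ever reports a problem. The metric and threshold are illustrative:

```python
from statistics import mean, stdev

def is_anomalous(history, current, z_threshold=3.0):
    """Proactive-ops sketch: baseline a metric from historical samples
    and flag current readings beyond a z-score threshold.
    (The 3-sigma threshold is an illustrative choice.)"""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_threshold

# Hypothetical CPU-utilization baseline (percent) for one machine.
cpu_history = [22, 25, 24, 23, 26, 24, 25, 23]

normal = is_anomalous(cpu_history, 24)   # within the baseline
spike = is_anomalous(cpu_history, 95)    # far outside the baseline
```

Real MLOps/LLMOps pipelines would model many correlated signals over time, but the shift in posture is the same: the system acts on the deviation before it becomes a ticket.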

