How It Works
When your MCP client connects to the InfoTrack Gateway, a structured sequence of interactions takes place. This flow ensures secure, efficient communication between your AI agent and InfoTrack's services.
1. Connection
Your client sends an authenticated HTTP request to the Gateway URL. The InfoTrack MCP Gateway uses the Streamable HTTP transport, so make sure your MCP client is configured for HTTP connections rather than the stdio transport.
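To make the transport concrete, here is a minimal sketch of the handshake at the wire level. Streamable HTTP carries JSON-RPC 2.0 messages over HTTP POST; the URL, token, client name, and protocol version below are all placeholders, and in practice an MCP SDK builds this message for you:

```python
import json

GATEWAY_URL = "https://gateway.example.com/mcp"  # placeholder, not the real Gateway URL
API_TOKEN = "YOUR_TOKEN"  # placeholder credential

def build_initialize_request(token: str) -> tuple[dict, dict]:
    """Build the headers and JSON-RPC `initialize` message that open an MCP session."""
    headers = {
        "Content-Type": "application/json",
        # A Streamable HTTP server may answer with plain JSON or an SSE stream.
        "Accept": "application/json, text/event-stream",
        "Authorization": f"Bearer {token}",  # assumed auth scheme for illustration
    }
    body = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-06-18",
            "capabilities": {"elicitation": {}},  # advertise elicitation support (see step 4)
            "clientInfo": {"name": "example-agent", "version": "0.1.0"},
        },
    }
    return headers, body

headers, body = build_initialize_request(API_TOKEN)
print(json.dumps(body, indent=2))
```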
2. Tool Discovery
The Gateway responds with a list of available tools, their descriptions, required inputs, and expected outputs. This happens dynamically on each connection, so your agent does not need a static tool manifest. When InfoTrack adds new capabilities, your agent picks them up automatically without any code changes.
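In JSON-RPC terms, discovery is the result of a `tools/list` request: an array of tool definitions, each with a name, description, and JSON Schema for its inputs. A sketch of parsing one, using an invented `title_search` tool as the example (the actual tool names and schemas come from the Gateway at runtime):

```python
def summarize_tools(tools_list_result: dict) -> dict[str, list[str]]:
    """Map each advertised tool name to its required input fields."""
    summary = {}
    for tool in tools_list_result.get("tools", []):
        schema = tool.get("inputSchema", {})
        summary[tool["name"]] = schema.get("required", [])
    return summary

# Example shape of a `tools/list` result; the tool itself is hypothetical.
example = {
    "tools": [
        {
            "name": "title_search",
            "description": "Search land title records for an address.",
            "inputSchema": {
                "type": "object",
                "properties": {"address": {"type": "string"}},
                "required": ["address"],
            },
        }
    ]
}
print(summarize_tools(example))  # {'title_search': ['address']}
```

Because the list is fetched fresh on each connection, this parsing step is all an agent needs; there is no manifest file to keep in sync.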
3. Tool Invocation
When a user asks a question (for example, "Who owns 135 King Street, Sydney?"), the agent selects the appropriate tool, prepares the inputs, and calls the Gateway. The Gateway authenticates the request, routes it to the correct downstream service, and injects any required context such as authentication tokens and matter details.
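On the wire, invocation is a `tools/call` request. A sketch using the same hypothetical `title_search` tool; note that the client sends only the tool name and arguments, while the Gateway injects downstream authentication tokens and matter context server-side:

```python
import itertools

_ids = itertools.count(2)  # request id 1 was consumed by `initialize`

def build_tool_call(name: str, arguments: dict) -> dict:
    """Build a JSON-RPC `tools/call` message for the Gateway."""
    return {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Hypothetical tool and arguments, derived from the user's question.
call = build_tool_call("title_search", {"address": "135 King Street, Sydney"})
```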
4. Elicitation
For actions that have a cost or require a choice (such as selecting an extract type or confirming a price), the Gateway prompts the user for confirmation before proceeding. This uses the MCP elicitation protocol to request additional information from the user within the conversation flow.
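With elicitation the direction of the conversation reverses: the server sends the client a request (`elicitation/create` in the MCP specification) and the agent surfaces it to the user. A sketch of a client-side handler, with the price-confirmation prompt and schema invented for illustration:

```python
def handle_elicitation(request: dict, user_accepted: bool, user_input: dict) -> dict:
    """Turn the user's decision into a JSON-RPC response to an elicitation request."""
    if not user_accepted:
        return {"jsonrpc": "2.0", "id": request["id"], "result": {"action": "decline"}}
    return {
        "jsonrpc": "2.0",
        "id": request["id"],
        "result": {"action": "accept", "content": user_input},
    }

# Hypothetical server-to-client request confirming a priced extract.
incoming = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "elicitation/create",
    "params": {
        "message": "This title extract has a fee. Proceed?",
        "requestedSchema": {
            "type": "object",
            "properties": {"confirm": {"type": "boolean"}},
            "required": ["confirm"],
        },
    },
}
response = handle_elicitation(incoming, user_accepted=True, user_input={"confirm": True})
```

The key design point is that the billable action never proceeds on the agent's judgment alone; the Gateway waits for an explicit "accept" from the user before routing the order downstream.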
5. Results
The Gateway returns structured data and files to the agent. The agent then presents the results in a clear, readable format. Where applicable, search results can also be automatically delivered to the relevant matters in your Document Management System in real time.
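A tool result arrives as an array of content items: text blocks, and embedded resources for files. A sketch of separating the two so the agent can summarize the text and hand files to the user, with the payload shape following the MCP tool-result format and the document itself invented:

```python
def split_result(result: dict) -> tuple[list[str], list[dict]]:
    """Separate text snippets from embedded file resources in a tool result."""
    texts, files = [], []
    for item in result.get("content", []):
        if item.get("type") == "text":
            texts.append(item["text"])
        elif item.get("type") == "resource":
            files.append(item["resource"])
    return texts, files

# Hypothetical result for a title search; URI and blob are placeholders.
example_result = {
    "isError": False,
    "content": [
        {"type": "text", "text": "Registered proprietor: Example Pty Ltd"},
        {
            "type": "resource",
            "resource": {
                "uri": "resource://results/title-extract.pdf",
                "mimeType": "application/pdf",
                "blob": "...base64...",
            },
        },
    ],
}
texts, files = split_result(example_result)
```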
Recommended Models
Higher-capability LLMs deliver better results because they are stronger at reasoning about which tools to use, how to chain them together, and how to present results. For the best tool orchestration and output quality, use the most capable model available in your environment.