Microsoft Agent Framework

Inspired by Erik van Appeldoorn's talk at NNUG's April Meetup, I decided to dive into the Microsoft Agent Framework, which went GA a few days ago.

What is Microsoft Agent Framework?

The Microsoft Agent Framework is an open-source framework for building, orchestrating, and deploying multi-agent AI applications in either Python or .NET. It enables developers to create agents that leverage Azure OpenAI models with built-in tool calling, conversation management, and security through managed identities.

In this article, we're going to use the C# library and integrate with an LLM hosted in Azure AI Foundry. We'll build an internal application that understands our developer documentation and helps new developers get up to speed quickly by answering questions about our development process.

All the code is open and available on my GitHub. Feel free to explore, clone or contribute.

Before we dive into the code, it's worth understanding the core building blocks of the Microsoft Agent Framework.

Agent: The central abstraction. An agent receives input, decides what to do, and produces a response, optionally by invoking tools or delegating to other agents. You shape its behavior through a system prompt and the tools you attach to it.

AgentThread: Represents a conversation session. The AgentThread maintains message history across multiple turns, giving the agent context from earlier in the conversation. Think of it as the memory of a single chat session.
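To make that memory concrete, here is a minimal sketch of reusing a thread across calls. It assumes an `agent` built like the one in the demo later in this article, and the `GetNewThread` / `RunAsync` surface of the GA .NET package, so treat it as illustrative rather than definitive:

```csharp
// Create a thread once and pass it to every call; the framework then
// carries the accumulated history so follow-up questions have context.
AgentThread thread = agent.GetNewThread();

Console.WriteLine(await agent.RunAsync("What is our branching strategy?", thread));
// The follow-up can say "that" because the thread remembers the first exchange.
Console.WriteLine(await agent.RunAsync("How does that affect releases?", thread));
```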

Tools: This is where agents go from conversational to genuinely useful. Tools are functions you expose to the agent, from reading files and querying a database to calling an API. The agent decides when and how to call them based on the user's input. In our example, we use a tool to make all documentation available to the agent at query time.
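The repository's actual documentation tool isn't reproduced here, but a hypothetical sketch of such a tool could look like the following. The `docs` folder is an assumed location, and the `[Description]` attribute gives the model a hint about when to call the function:

```csharp
using System.ComponentModel;
using System.IO;
using System.Linq;

public static class Tools
{
    [Description("Returns the full contents of the developer documentation.")]
    public static string GetDocumentation()
    {
        // Concatenate every markdown file under the docs folder so the
        // agent can ground its answers in the actual documentation.
        var files = Directory.GetFiles("docs", "*.md");
        return string.Join("\n\n", files.Select(File.ReadAllText));
    }
}
```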

Chat History: A structured list of messages (SystemMessage, UserMessage, AssistantMessage) passed into the model on every call. This is how the agent maintains conversational context, and how you inject the system prompt that defines its personality and constraints.
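The framework builds on the Microsoft.Extensions.AI abstractions, where a history is essentially a list of `ChatMessage` values tagged with a role (the exact message type names vary between abstraction layers). A minimal, illustrative sketch:

```csharp
using Microsoft.Extensions.AI;

List<ChatMessage> history =
[
    new(ChatRole.System, "You are a helpful documentation assistant."),
    new(ChatRole.User, "Where do I find the onboarding guide?"),
    new(ChatRole.Assistant, "It lives in the docs folder of the repository."),
];
```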

Multi-Agent Collaboration: The framework is built with orchestration in mind. Multiple agents can work together, and one agent can hand off tasks to another, making it easy to build systems where a router agent delegates to specialized agents based on the type of question.

Demonstration

The first thing you'll need to get started is an Azure subscription and a Microsoft Foundry resource. Creating one adds two resources to your resource group: a Foundry resource and a Foundry project.

Head over to the Foundry project, expand Resource Management in the menu, and select Endpoints. Note down the endpoint value, as we are going to pass it as a parameter to our application later.

From the Foundry project, click Go to Foundry portal, select Agents from the menu, deploy gpt-4o-mini, and set the name to DocsAssistant. The name is important, as the code refers to the agent by it.

We are using DefaultAzureCredential to authenticate, so make sure you are logged into Azure:

az login

When logged in, you also need the correct role assigned to your user: Azure AI User. To verify that you have it, run this command:

az role assignment list --assignee your@email.com --all --output table
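If the role is missing, someone with permission to manage role assignments can grant it, scoped to the resource group holding the Foundry resource (the subscription and resource group values below are placeholders):

```shell
az role assignment create \
  --assignee your@email.com \
  --role "Azure AI User" \
  --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>
```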

With the endpoint, correct permissions and agent name in hand, you're ready to run the application.

Next, let's clone the repository.

git clone git@github.com:Thorstensen/microsoft-agent-framework-demo.git

I assume you have the .NET SDK installed; if not, install it first.

Navigate to the folder where you cloned the repository and run the application with your endpoint (the model deployment name defaults to gpt-4o-mini and can be overridden with --model):
dotnet run --endpoint YOUR_AZURE_AI_FOUNDRY_ENDPOINT

You will be prompted to ask any questions you have about the documentation.

Show me the code!

var endpoint = GetArguments(args, "--endpoint") ?? throw new ArgumentException("--endpoint must point to your Azure AI Foundry endpoint");
var modelName = GetArguments(args, "--model") ?? "gpt-4o-mini";

const string instructions =
    """
    You are a friendly helpful assistant answering questions about documentation.
    Your users are typically developers, software architects or other technically savvy people.
    You should answer questions about the documentation as concisely as possible,
    and if you don't know the answer, say you don't know.
    """;

AIAgent agent = new AIProjectClient(
        new Uri(endpoint),
        new DefaultAzureCredential()
    )
    .AsAIAgent(
        model: modelName,
        instructions: instructions,
        name: "DocsAssistant",
        tools: [AIFunctionFactory.Create(Tools.GetDocumentation)]
    );
    
while (true)
{
    Console.WriteLine("Ask me anything about the documentation!");
    Console.WriteLine();

    var question = Console.ReadLine();

    if (string.IsNullOrWhiteSpace(question)) continue;

    Console.WriteLine(await agent.RunAsync(question));
}

This code will:

  1. Parse the arguments given to the console app.
  2. Throw an exception if the endpoint is not provided.
  3. Read the model deployment name to use in Foundry, defaulting to gpt-4o-mini if not specified.
  4. Give the agent a set of instructions (its system prompt).
  5. Add a custom tool to the agent, making all documentation available.
  6. Loop forever, prompting the user to enter questions about the documentation.
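The `GetArguments` helper is referenced but not shown above; the repository's version may differ, but a minimal sketch could look like this:

```csharp
static string? GetArguments(string[] args, string name)
{
    // Return the value that follows the given flag, e.g. for
    // "--endpoint https://..." this yields the URL; null if the flag is absent.
    var index = Array.IndexOf(args, name);
    return index >= 0 && index < args.Length - 1 ? args[index + 1] : null;
}
```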

This example is a fairly simple one, but it demonstrates how easy it is to create powerful applications leveraging LLMs in Azure AI Foundry.

Thanks for reading!

Header photo by Volodymyr Hryshchenko on Unsplash