#1: Keynote
In a recent keynote address, Microsoft executives Balan and Slava shared their insights on the transformative power of integration services and generative AI. The session highlighted the critical role of integration in digital transformation and how AI is reshaping the operational landscape for organizations. Below is a comprehensive summary of the main points covered by both speakers.
Balan’s Key Points
Introduction
- Balan opened the keynote by expressing excitement about the advancements in integration services since his last appearance in 2022.
- He noted that the technological landscape has evolved significantly since then, particularly with the rise of generative AI.
The Reality of Generative AI
- Generative AI is no longer hype; it is a reality that is actively reshaping industries.
- AI is making previously impossible tasks feasible, from natural language processing to visual understanding.
- Organizations are increasingly viewing AI as a strategic differentiator rather than a cost center, leading to a surge in innovative use cases across various sectors.
New Use Cases and Applications
Balan highlighted several exciting applications of AI, including:
- Customer Experience Personalization: Retailers are leveraging AI to enhance customer interactions.
- Drug Discovery: AI is being utilized to streamline processes in the pharmaceutical industry.
- Insurance Industry Innovations: AI agents are becoming more prevalent in managing insurance claims and customer inquiries.
The Importance of Integration
- Integration is becoming the backbone of modern applications.
- Just as organizations once needed a website or a mobile app, they now require AI capabilities to remain competitive.
- The integration of AI into existing applications is essential for maximizing efficiency and effectiveness.
Characteristics of Intelligent Applications
Intelligence is the new baseline for modern apps, and Balan outlined two main characteristics of intelligent applications:
- Ease of Information Retrieval: Intelligent applications should simplify the process of finding relevant information.
- Data Surfacing: They should leverage AI to present the right data at the right time, enhancing decision-making processes.
The Role of AI in Business Processes
- AI is not just for customer-facing applications; it is also being used internally to improve employee productivity.
- Balan shared examples of how Microsoft employees are utilizing generative AI for market research, content creation, and product usage analysis.
What’s Coming Next: The Rise of Agents in Integration
Looking ahead, the introduction of agents is becoming increasingly significant. Many organizations are exploring how agents can be introduced into both customer-facing processes and internal operations. This trend underscores the need for integration developers to consider how agents will affect their integrations and how integrations will support the creation of new agents.
Challenges and Opportunities
- Despite advancements, many generative AI applications are still not moving into production.
- Agentic AI development challenges such as model selection, complex workflows, content safety, observability & governance, and tool sprawl remain significant hurdles.
- However, the potential for AI to drive efficiency and innovation is immense.
Understanding the Difference: Agents vs. Tools
A key distinction between an agent and a tool lies in their use of large language models (LLMs). While tools can utilize LLMs for tasks like summarization, they do not engage in planning, which is a fundamental characteristic of agents.
This shift may seem daunting, but thoughtful integration of agents into business processes is essential. LLMs can now turn lengthy workflows into natural-language instructions for planning and execution. Agents carry out these plans using various tools, and the Model Context Protocol (MCP) gives them a standard way to discover and call those tools.
Additionally, agents can receive inputs from end users in conversational contexts and can communicate with each other through a two-way protocol supported by Azure. This means you don’t have to worry about managing these protocols, as Azure handles the complexities, whether you’re developing conversational chatbots or autonomous systems.
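To make the agent-versus-tool distinction concrete, here is a minimal, language-model-agnostic sketch in C#. The ILanguageModel and ITool interfaces are hypothetical placeholders, not any Azure SDK; a real agent would plug in a hosted model and tools registered, for example, via MCP. The point it illustrates is simply that the agent plans with the model in a loop, while a tool only executes.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-ins: a real implementation would call an LLM endpoint
// and registered tools; these interface names are illustrative only.
public interface ILanguageModel
{
    // Given the goal and what has happened so far, propose the next step,
    // e.g. "CALL LookupOrder 12345" or "DONE <final answer>".
    string ProposeNextStep(string goal, IReadOnlyList<string> history);
}

public interface ITool
{
    string Name { get; }
    string Invoke(string input);   // A tool just executes; it does not plan.
}

public class Agent
{
    private readonly ILanguageModel _model;
    private readonly Dictionary<string, ITool> _tools = new();

    public Agent(ILanguageModel model, IEnumerable<ITool> tools)
    {
        _model = model;
        foreach (var t in tools) _tools[t.Name] = t;
    }

    public string Run(string goal, int maxSteps = 10)
    {
        var history = new List<string>();
        for (int i = 0; i < maxSteps; i++)
        {
            // The agent's defining behaviour: it asks the model to plan the next step.
            string step = _model.ProposeNextStep(goal, history);

            if (step.StartsWith("DONE"))
                return step.Substring(4).Trim();

            // Expected form: "CALL <toolName> <input>"
            var parts = step.Split(' ', 3);
            if (parts.Length >= 2 && parts[0] == "CALL" && _tools.TryGetValue(parts[1], out var tool))
            {
                string observation = tool.Invoke(parts.Length > 2 ? parts[2] : string.Empty);
                history.Add($"{step} => {observation}");   // Feed the result back into planning.
            }
            else
            {
                history.Add($"{step} => unrecognized step");
            }
        }
        return "Stopped after reaching the step limit.";
    }
}
```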
Azure AI Foundry
Azure AI Foundry is a platform that provides a flexible range of AI models — from OpenAI to lightweight models — which can be hosted serverlessly or on dedicated compute.
It includes an agent service that enables durable communication between AI agents, supports testing, and helps deploy them into production.
Foundry also offers specialized services like speech translation (used successfully in real-world scenarios), and strong observability, giving full visibility into both agentic workflows and broader integration workflows (like those built with Logic Apps).
The platform is deeply integrated with Azure Integration Services, making it ideal for building intelligent, end-to-end automated solutions.
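As a rough illustration of how an application consumes a model hosted in Foundry, the sketch below calls a chat-completions deployment over REST. The endpoint shape, API version, and property names follow the Azure OpenAI convention and are placeholders; exact URLs depend on the model and hosting option, so treat this as an assumption-level example rather than the definitive client code.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading.Tasks;

public static class FoundryModelClient
{
    // Calls a chat-completions deployment over REST. Endpoint, deployment name,
    // API version, and key are placeholders.
    public static async Task<string> AskAsync(string endpoint, string deployment, string apiKey, string question)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Add("api-key", apiKey);

        var url = $"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version=2024-06-01";
        var body = new
        {
            messages = new[] { new { role = "user", content = question } },
            max_tokens = 200
        };

        using HttpResponseMessage response = await http.PostAsJsonAsync(url, body);
        response.EnsureSuccessStatusCode();

        // Pull the assistant's reply out of the standard chat-completions payload.
        using JsonDocument doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("choices")[0]
                  .GetProperty("message").GetProperty("content").GetString() ?? string.Empty;
    }
}
```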
Highlights from Microsoft Build
- Logic Apps + Azure AI Search: Native integration now allows real-time, automated updates to vector store indices as source data changes.
- Tool Calling from AI Foundry: Agents in Azure AI Foundry can now invoke Logic Apps and Azure Functions — enabling richer AI-driven workflows.
- Serverless GPU Hosting: Experiment with generative AI without waiting for GPU capacity, using Azure Container Apps (ACA) to host models serverlessly for POCs.
- Advanced API Management Use Cases:
  - Load balancing across models (across regions or clouds)
  - Usage tracking across apps and agents
  - Real-world example: A Korean telecom built a ChatGPT-like B2B platform using APIM to monitor and control usage.
- Semantic Caching in APIM: Reduces unnecessary model calls by serving cached responses for semantically similar prompts, optimizing model cost and response time (see the sketch after this list).
- Content Safety via APIM: Enforce AI content moderation across apps with centralized filtering policies.
- Continuous Feedback Loop for AI Tuning: Log prompts and responses via APIM to feed back into Foundry and fine-tune models.
- MCP (Model Context Protocol): An open protocol that standardizes how agents discover and call tools and exchange context. Supported natively in APIM and Azure Functions.
- Serverless APIM for Generative AI: Lightweight, cost-efficient setup for managing APIs and MCP servers.
- Credential Management Enhancements: On-behalf-of auth and fine-grained operation-level access control for MCP.
- Organizational Registries in API Center: Centralized MCP management for large-scale enterprise adoption.
- MCP + Dev Tools: MCP servers are now easily usable in GitHub Copilot, VS Code, and Copilot Studio.
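APIM's semantic caching is configured declaratively with policies rather than written in code; the C# sketch below only illustrates the underlying idea behind the Semantic Caching item above: reuse a cached response when a new prompt is close enough to a previous one in embedding space. The caller is assumed to have obtained prompt embeddings from an embeddings model, and the 0.92 threshold is an arbitrary example value.

```csharp
using System;
using System.Collections.Generic;

// Conceptual sketch of semantic caching: responses are keyed by a prompt
// embedding and reused when a new prompt is sufficiently similar.
public class SemanticCache
{
    private readonly List<(float[] Embedding, string Response)> _entries = new();
    private readonly double _threshold;

    public SemanticCache(double similarityThreshold = 0.92) => _threshold = similarityThreshold;

    public string? TryGet(float[] promptEmbedding)
    {
        foreach (var (embedding, response) in _entries)
            if (CosineSimilarity(embedding, promptEmbedding) >= _threshold)
                return response;                 // Close enough: skip the model call.
        return null;                             // Miss: call the model, then Add().
    }

    public void Add(float[] promptEmbedding, string response) =>
        _entries.Add((promptEmbedding, response));

    private static double CosineSimilarity(float[] a, float[] b)
    {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.Sqrt(na) * Math.Sqrt(nb) + 1e-12);
    }
}
```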
The Future of Integration Services
Balan emphasized the need for organizations to invest in three key areas:
- Understanding Use Cases: Clearly defining the business metrics that AI will impact.
- Modernizing Application Infrastructure: Addressing data silos and disconnected workflows to facilitate AI integration.
- Establishing Robust Operations: Creating a feedback loop to continuously improve AI applications based on user interactions.
Conclusion
- Balan concluded with a call to action for organizations to embrace the integration of AI into their workflows.
- He expressed confidence that the future of integration services, powered by AI, will enable businesses to operate more efficiently and effectively.
Slava’s Key Points
Introduction and Context
- Slava began by acknowledging the excitement surrounding the advancements in AI and integration services.
- He emphasized that enterprise automation is not just about responding to messages but coordinating multiple steps and systems to achieve specific outcomes.
Agentic Workflows
- Slava introduced the concept of agentic workflows, which may involve one or multiple agents operating sequentially or in parallel.
- He explained that some workflows may involve human interaction, while others run fully autonomously.
Logic Apps and Agent Loop
- Slava discussed how Logic Apps provide the control, integration, and governance needed to build real-life workflows that incorporate agentic capabilities.
- He introduced the Agent Loop, a new feature in Logic Apps that allows users to build autonomous or conversational agents directly within Logic Apps Standard.
- Users can bring their models, system prompts, and tools, enabling agents to run natively as part of the workflow.
Conversational Agents
- Slava highlighted that agents can not only think but also communicate, thanks to the flexible concept of channels that allow for real-time, interactive conversations across platforms like Teams and Slack.
- The chat testing capabilities embedded in Logic Apps Designer enable users to simulate and debug conversations without leaving the workflow context.
Multi-Agent Orchestration
- The Agent Loop supports multi-agent orchestration, allowing for various execution patterns, including sequential and concurrent execution, loops, branching, and handoffs between agents.
- This capability enables organizations to mix agentic steps with deterministic steps in the same workflow, providing a comprehensive orchestration solution.
Integration with AI Foundry
- Slava emphasized the deep integration between Agent Loop and AI Foundry, allowing users to build agents that run inside Logic Apps while leveraging the capabilities of AI Foundry.
- Agents registered with AI Foundry can access models and benchmarking for ongoing quality checks, integrate with responsible AI services, and utilize shared knowledge and grounding tools.
Adoption and Use Cases
- Slava shared that since the public preview of Agent Loop, hundreds of customers across various industries have already built and run over 1,500 agents, generating significant usage of AI services.
- He noted the variety of use cases, including project estimation, data mapping, ticket handling, and even game-playing agents.
Code Full Workflows
Slava announced the introduction of Code Full Workflows in Logic Apps, which combines the power of traditional coding with the simplicity of Logic Apps. This new authoring model allows developers to write workflows using C# while still benefiting from the operational reliability of the Logic Apps platform.
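The session did not go into the exact authoring API, so the sketch below uses the Azure Durable Functions programming model as a stand-in to illustrate the idea: an orchestration written in ordinary C#, with the platform checkpointing state between steps so the workflow survives restarts. The actual Code Full Workflows API in Logic Apps may differ; the function and activity names here are purely illustrative.

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using System.Threading.Tasks;

public static class OrderWorkflow
{
    // Orchestration written in plain C#: the platform persists state between
    // awaits, giving the operational reliability of a designer-built workflow.
    [FunctionName("ProcessOrder")]
    public static async Task<string> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var order = context.GetInput<string>();
        string validated = await context.CallActivityAsync<string>("ValidateOrder", order);
        string invoice = await context.CallActivityAsync<string>("CreateInvoice", validated);
        await context.CallActivityAsync("NotifyCustomer", invoice);
        return invoice;
    }

    // Activities hold the individual steps; each runs and completes independently.
    [FunctionName("ValidateOrder")]
    public static string ValidateOrder([ActivityTrigger] string order) => $"validated:{order}";

    [FunctionName("CreateInvoice")]
    public static string CreateInvoice([ActivityTrigger] string order) => $"invoice-for:{order}";

    [FunctionName("NotifyCustomer")]
    public static void NotifyCustomer([ActivityTrigger] string invoice) { /* e.g. send an email */ }
}
```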
Future of Integration
Slava emphasized that as the demand for integration grows, Azure Integration Services must evolve to meet the needs of organizations. With the projected increase in the use of agents, a strong integration backbone will be essential for success.
Conclusion
- Slava concluded by encouraging attendees to explore the capabilities of Agent Loop and Logic Apps, emphasizing that these tools provide a fast and easy way to build agents that are grounded in real-time data and business applications.
- He invited participants to join dedicated sessions to learn more about these new capabilities.
Final Thoughts
The keynote by Balan and Slava provided a comprehensive overview of the current state and future of integration services and generative AI at Microsoft. Their insights highlighted the importance of embracing AI and integration to drive efficiency, innovation, and competitive advantage in today’s rapidly evolving digital landscape. Organizations that prioritize these areas will be well-positioned to thrive in the future.
#2: What’s new in Azure Logic Apps
The session “What is the Future of Logic Apps?” at the INTEGRATE event featured key insights from Microsoft integration experts, including Kent Weare, Parth Shah, and Divya Swarnkar. Here’s a summary of the main points discussed:
- Investment Focus: The future of Azure Logic Apps will focus on three main areas:
- Reshaping Business Processes: Leveraging AI to enhance business processes.
- Optimizing Operations: Improving operational efficiency through integration environments and business process tracking.
- Enhancing Developer Experiences: Providing better tools and experiences for developers.
- New Capabilities:
- Agent Loop: A new feature allowing users to build agents in Azure Logic Apps, integrating AI models to enhance business process orchestration.
- Agent Parameters: This feature simplifies workflow creation by allowing dynamic population of parameters using large language models.
- SRE Agent: A proactive agent that identifies issues and assists in triaging and resolving operational tasks.
- Code for Workflows: A new capability that allows developers to write code within Logic Apps, providing more control over complex logic.
- Data Mapper Enhancements: The data mapper has been improved to facilitate easier data transformations, with a focus on performance and reliability. It now supports custom XPath and XSLT functions (illustrated in the sketch after this list).
- Hybrid Logic Apps: The hybrid model allows for local processing and storage, catering to customers who require local control over their data and processes.
- Organizational Templates: A new feature enabling organizations to create and manage custom templates for workflows, enhancing development productivity and standardization.
- AI Integration: The session emphasized the integration of AI capabilities, including prompt templates for Azure OpenAI, allowing for more dynamic and context-aware interactions.
- Feedback and Future Plans: The team encouraged feedback from users regarding the Consumption model of Logic Apps, as well as the future direction of the platform, particularly concerning the balance between the Consumption and Standard models.
- Q&A Session: The session concluded with a Q&A segment where attendees raised questions about specific features, including the data mapper’s schema requirements and the singleton design pattern without Service Bus dependencies.
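The Data Mapper item above mentions support for custom XPath and XSLT functions. The Data Mapper itself is a visual tool, so the sketch below is only a general .NET illustration of calling a custom function from an XSLT transform via an extension object; it does not use the Data Mapper's own extension mechanism, and the namespace and helper names are hypothetical.

```csharp
using System.IO;
using System.Xml;
using System.Xml.Xsl;

public static class CustomXsltFunctionDemo
{
    // A .NET class whose public methods become callable from XSLT
    // through an extension-object namespace.
    public class StringHelpers
    {
        public string ToUpper(string value) => value?.ToUpperInvariant() ?? string.Empty;
    }

    private const string Stylesheet = @"<?xml version='1.0'?>
<xsl:stylesheet version='1.0'
    xmlns:xsl='http://www.w3.org/1999/XSL/Transform'
    xmlns:ext='urn:demo-extensions'>
  <xsl:template match='/order'>
    <order>
      <customer><xsl:value-of select='ext:ToUpper(string(customer))'/></customer>
    </order>
  </xsl:template>
</xsl:stylesheet>";

    public static string Transform(string inputXml)
    {
        var xslt = new XslCompiledTransform();
        using (var stylesheet = XmlReader.Create(new StringReader(Stylesheet)))
            xslt.Load(stylesheet);

        // Register the helper object under the namespace the stylesheet expects.
        var args = new XsltArgumentList();
        args.AddExtensionObject("urn:demo-extensions", new StringHelpers());

        var output = new StringWriter();
        using (var input = XmlReader.Create(new StringReader(inputXml)))
        using (var writer = XmlWriter.Create(output))
            xslt.Transform(input, args, writer);

        return output.ToString();
    }
}

// Usage: CustomXsltFunctionDemo.Transform("<order><customer>contoso</customer></order>");
```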
Overall, the session highlighted Microsoft’s commitment to enhancing Azure Logic Apps through AI integration, improved developer experiences, and robust operational capabilities, while also seeking user feedback to guide future developments.
#3: How are we going to support all of this AI and Integration stuff?
In his session at INTEGRATE 2025, Michael Stephenson (aka Mike), the product owner for Turbo360 and an Azure MVP, emphasized the pivotal role of Turbo360 in supporting AI and integration applications. He addressed the increasing demand for effective support mechanisms as organizations rapidly develop these solutions. The tool simplifies the support experience, enabling developers to focus on innovation rather than maintenance.
Mike showcased key features of Turbo360, including Business Activity Monitoring (BAM) and the Cost Analyzer, which empower organizations to manage their integration solutions and optimize Azure costs more effectively. BAM allows non-technical stakeholders to track and analyze integration performance, fostering collaboration between technical and business teams.
He also highlighted the importance of simplifying the cloud operating model by providing a unified view of various cloud services, which reduces complexity for developers and architects. This single-pane approach streamlines monitoring and management of integrations, making it easier for teams to navigate the complexities of modern technology.
Ultimately, Mike positioned Turbo360 as an essential tool for organizations facing integration challenges, emphasizing its ability to drive efficiency and deliver business value. His insights underscored the importance of Turbo360 in helping organizations adapt to the fast-changing technological landscape while maximizing their integration capabilities.
#4: Transforming BizTalk: A journey to modern integration
At INTEGRATE 2025, Harold Campos, Principal Program Manager for Azure Integration Services, presented a compelling session on modernizing BizTalk Server solutions using Azure Logic Apps.
The Evolution of Integration
Harold began by discussing the significant changes in integration over the past few decades. He outlined the progression from file-based interfaces to Service-Oriented Architecture (SOA), and then to Enterprise Service Bus (ESB) and Integration Platform as a Service (iPaaS) solutions. This evolution reflects the growing complexity and demands of modern business environments, necessitating more agile and scalable integration solutions.
As a long-standing solution, BizTalk Server has effectively addressed many traditional integration needs. However, with the introduction of Azure Logic Apps approximately eight years ago, Microsoft has positioned Logic Apps as the successor to BizTalk. Harold emphasized that Logic Apps is designed to be the quickest and most cost-effective option for customers migrating from BizTalk Server.
Migration Support and Timeline
Harold highlighted that BizTalk Server 2020 will continue to receive mainstream support through 2028, giving customers a longer runway for migration. He described a phased migration strategy—rehost, replatform, and refactor—tailored to organizational readiness.
Key Features of Azure Logic Apps
He emphasized the power of Azure Logic Apps as a modern platform, with support for rich XML integration, custom .NET code, and a hybrid hosting model that runs across on-prem and cloud environments. He also showcased its deep integration with Azure Monitor and Application Insights, enabling better observability and diagnostics.
Harold closed by encouraging the community to share feedback and help shape the future of Logic Apps, reinforcing Microsoft’s commitment to supporting modern integration needs.
#5: Real-time event streaming ingestion and processing with Fabric
Introduction
Kevin begins by expressing his enthusiasm for the conference and the community, sharing his long-standing connection with the event. He introduces his role overseeing Azure Messaging, which encompasses various services like Service Bus, Event Hubs, Event Grid, and Stream Analytics.
Challenges in Real-Time Data Processing
Kevin outlines the challenges organizations face in adapting to the rapid influx of real-time data and AI. He highlights the limitations of traditional batch processing systems, which are often slow and cumbersome. The need for organizations to react quickly to incoming data from various sources—such as manufacturing devices, applications, and vehicles—is emphasized. He notes the difficulty in finding relevant insights amidst vast amounts of data and the fragmentation of existing tech stacks, which require expertise in multiple technologies and lead to data silos.
Microsoft Fabric Overview
To address these challenges, Kevin introduces Microsoft Fabric, a unified platform designed to streamline data acquisition and analytics. He emphasizes the goal of moving from a fragmented tech stack to a single, cohesive model that democratizes access to data analytics across organizations. Key features of Fabric include:
- Real-Time Streaming: Moving beyond faster batch processing to true real-time data ingestion and reaction.
- Unified Data Estate: Eliminating data silos and creating a single source of truth for data.
- AI-Powered Insights: Utilizing AI to identify anomalies and insights within the data.
Components of Real-Time Intelligence
Kevin discusses the components of real-time intelligence within Microsoft Fabric, which include:
- Event Ingestion: Connecting to various data sources for real-time data streaming (see the sketch after this list).
- Real-Time Analytics: Leveraging Azure Event Hubs, Event Grid, and Stream Analytics for analytics at scale.
- Digital Twin Builder: Creating digital representations of physical assets for better monitoring and analysis.
- Real-Time Dashboards: Interactive dashboards for visualizing and analyzing streaming data.
- Rules and Actions: Automating responses based on predefined rules triggered by incoming data.
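To make the Event Ingestion component concrete, here is a minimal producer that publishes sensor readings to Azure Event Hubs, one of the sources an event stream can ingest from (the demo later in the session uses an MQTT broker instead). The connection string, hub name, and payload shape are placeholders; this is a sketch of the ingestion side only, not of the Fabric configuration.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

public static class SensorPublisher
{
    // Publishes a small batch of sensor readings to an Event Hub, which a
    // downstream event stream or Stream Analytics job can then pick up.
    public static async Task PublishAsync(string connectionString, string eventHubName)
    {
        await using var producer = new EventHubProducerClient(connectionString, eventHubName);

        using EventDataBatch batch = await producer.CreateBatchAsync();
        for (int i = 0; i < 3; i++)
        {
            // Hypothetical JSON payload for a temperature sensor reading.
            var reading = $"{{\"sensorId\":\"room-{i}\",\"temperatureC\":{21 + i},\"timestamp\":\"{DateTimeOffset.UtcNow:O}\"}}";
            if (!batch.TryAdd(new EventData(BinaryData.FromString(reading))))
                throw new InvalidOperationException("Reading too large for the batch.");
        }

        await producer.SendAsync(batch);   // One network call for the whole batch.
    }
}
```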
Demonstration of Real-Time Intelligence
Alicia Li takes the stage to showcase new features in Microsoft Fabric, focusing on the Event Stream capabilities. She introduces several key features:
- SQL Operator: Allows users to express business logic using familiar SQL syntax, enhancing productivity and enabling more advanced scenarios.
- Schema Inferencing: Automatically infers the schema of incoming data, allowing for multiple event types within a single event stream.
- Event Schema Set: Enables users to define and manage schemas explicitly for better data quality and governance.
Alicia demonstrates how to build a smart building environment monitoring system using real-time data from temperature and power usage sensors. The demo highlights:
- Setting Up Event Streams: Connecting to an MQTT broker to ingest sensor data.
- Data Transformation: Using filters and managing fields to prepare data for analysis.
- Activating Alerts: Setting up rules to trigger alerts when certain conditions are met (e.g., temperature thresholds).
- Building Dashboards: Creating real-time dashboards to visualize data and monitor conditions.
Future Roadmap
Kevin wraps up the session by discussing the roadmap for Microsoft Fabric, which includes:
- Enhanced Integration: Improved connections with various data sources, including Azure SQL and Microsoft 365 Graph data.
- Custom Business Events: Allowing users to define their own events based on processing outcomes.
- Anomaly Detection: Utilizing AI to identify anomalies in real-time data streams.
- Fabric Functions: Integrating Azure Functions into Fabric for enhanced event-driven architecture.
#6: Building agentic workflows using Azure Logic Apps
The session features Kent Weare and Divya Swarnkar from Microsoft, who discuss the evolution of automation through the introduction of AI agents, particularly focusing on the “agent loop” concept within Logic Apps.
Introduction
Kent opens the session by emphasizing the transformative potential of AI in automation and integration architectures. He outlines the historical evolution of integration methods, from file-based interfaces to modern iPaaS systems, and highlights the recent emergence of AI and agents as a new paradigm in this space.
Logic Apps Strategy
Kent describes the Logic Apps strategy, which focuses on three core areas:
- Reshaping Business Processes: Utilizing agents to enhance and automate workflows.
- Optimizing Operations: Streamlining processes through automation.
- Enhancing Developer Experiences: Making it easier for developers to create and manage integrations.
The Role of AI Agents
Kent explains that AI agents represent an evolution in integration architecture, allowing for more dynamic and adaptable workflows. He emphasizes that agents are not meant to replace existing systems but to complement and enhance them. The architecture of an AI agent includes:
- Tools: Connectors, workflows, APIs, and custom code.
- Knowledge: Contextual information that helps agents make informed decisions.
- Autonomy: Agents can operate based on events or schedules, rather than following a static path.
Customer Success Stories
Kent shares examples of how companies like AT&T and DocuSign are leveraging AI to improve productivity and streamline operations. For instance, DocuSign uses Azure Logic Apps to orchestrate their intelligent agreement management system, resulting in significant time savings for their clients.
Demonstration of Agent Loop
Divya takes the stage to demonstrate the agent loop in action, focusing on a recruitment agent designed to assist recruiters with tasks such as candidate screening, interview scheduling, and team notifications. The demo showcases:
- Creating an Agent Workflow: Using Logic Apps to set up an agent workflow with triggers and actions.
- Configuring the Agent: Defining the agent’s personality, goals, and boundaries through system instructions.
- Dynamic Connections: Allowing agents to use user-specific identities for actions like scheduling meetings.
Observability and Transparency
Divya highlights the importance of observability in agent operations, showcasing how users can track agent decisions, interactions, and tool usage through run history and agent chat panels. This transparency helps users understand the agent’s actions and decisions.
Mindset Shift
Kent discusses the shift in mindset required for adopting AI agents, moving from deterministic workflows to more flexible, objective-driven approaches. He emphasizes the importance of focusing on outcomes rather than predefined paths, allowing for greater adaptability in business processes.
Multi-Agent Scenarios
The session also covers multi-agent orchestration patterns, demonstrating how different agents can work together to accomplish complex tasks. Divya presents examples of a travel agent scenario and a sales report generation scenario, illustrating how agents can hand off tasks and collaborate effectively.
Conclusion
Kent concludes by encouraging attendees to explore the potential of AI agents within their organizations, emphasizing that Logic Apps already provide a robust platform for building these solutions. He invites participants to engage with the broader Microsoft AI ecosystem and to attend future sessions for more insights.
Overall, the session highlights the transformative potential of AI agents in automation, showcasing practical applications and encouraging a shift in how organizations approach integration and process automation.
#7: Messaging for the enterprise
The final session of the day featured Christina Compy, Principal PM Manager, and Eldert Grootenboer, Senior Product Manager from Microsoft, who discussed advancements in Azure messaging services. They highlighted four main components of Azure’s messaging ecosystem:
- Event Grid: A broker for discrete events that supports a variety of messaging patterns and event handling across Azure.
- Service Bus: An enterprise queueing service dating back to Azure’s earliest days, offering robust messaging capabilities (see the sketch after this list).
- Event Hubs: A high-throughput streaming platform used for massive data loads, including internal Azure services.
- Stream Analytics (ASA): A foundational stream-processing service, used for workloads such as Azure billing events.
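As a small illustration of the Service Bus component above, the sketch below sends one message to a queue and receives it back using the Azure.Messaging.ServiceBus SDK. The connection string and queue name are placeholders, and a production client would typically use a processor with error handling rather than a single receive call.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class QueueExample
{
    // Sends a message to a Service Bus queue and then receives and settles it.
    public static async Task RunAsync(string connectionString, string queueName)
    {
        await using var client = new ServiceBusClient(connectionString);

        // Send
        ServiceBusSender sender = client.CreateSender(queueName);
        await sender.SendMessageAsync(new ServiceBusMessage("order-12345 created"));

        // Receive and settle
        ServiceBusReceiver receiver = client.CreateReceiver(queueName);
        ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync(TimeSpan.FromSeconds(5));
        if (message != null)
        {
            Console.WriteLine(message.Body.ToString());
            await receiver.CompleteMessageAsync(message);   // Remove it from the queue.
        }
    }
}
```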
The speakers emphasized Azure’s reliability, boasting a five-nines SLA and handling 377 trillion requests monthly. They also introduced Fabric, a new SaaS layer built on existing Azure services, aimed at simplifying event stream management.
Key updates included infrastructure improvements focused on security and resilience, such as zone redundancy for new deployments and modernization efforts to enhance performance and security.
Eldert outlined the roadmap for upcoming features, including:
- Geo-replication general availability (GA) for multi-region data handling.
- Batch delete capabilities for efficient message management.
- Durable terminus support for connection recovery.
- Enhanced migration options between service tiers and regions.
They also discussed geo-replication, allowing for data and metadata replication across regions with minimal client changes required. The session concluded with a focus on the importance of regular testing and automation in failover processes, as well as the introduction of cross-namespace forwarding to enhance message routing flexibility.
The speakers encouraged audience engagement and provided resources for further learning, while also inviting attendees to reach out with questions.