
Senior Software Engineer
- Bangalore, Karnataka
- Permanent
- Full-time
A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers’ supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, which means we carry an elevated responsibility to keep building an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners.
The Brief
We are seeking a Senior Software Engineer with deep backend expertise to lead the development of scalable infrastructure for LLM inferencing, Model Context Protocol (MCP) integration, Agent-to-Agent (A2A) communication, prompt engineering, and robust API platforms. This role sits at the core of our AI systems stack, enabling structured, contextual, and intelligent communication between models, agents, and services. You'll design modular backend services that interface seamlessly with inferencing engines, orchestrate model contexts, and expose capabilities via APIs for downstream products and agents.
What I'll be doing – your accountabilities?
- Architect and implement backend services that support dynamic model context management via MCP for LLM-based systems.
- Build scalable and token-efficient inference pipelines with support for streaming, context merging, memory, and retrieval (a brief illustrative sketch follows this list).
- Enable Agent-to-Agent (A2A) messaging and task coordination through contextual protocols, message contracts, and execution chains.
- Design and maintain developer-friendly, secure, and versioned APIs for agents, tools, memory, context providers, and prompt libraries.
- Lead efforts in prompt engineering workflows including templating, contextual overrides, and programmatic prompt generation.
- Collaborate across engineering, ML, and product teams to define and implement context-aware agent systems and inter-agent communication standards, enabling closed-loop AI services ready for enterprise consumption.
- Own end-to-end delivery of infrastructure, inferencing, backend, API, and communication management in multi-agent systems.
- Ensure models are modular, extensible, and easily integrated with external services/platforms (e.g., dashboards, analytics, AI agents).
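To give a flavour of the kind of backend service these accountabilities point at, here is a purely illustrative sketch (not production code) of a FastAPI endpoint that merges caller-supplied context and streams tokens back. The `/v1/infer` route and the `fake_llm_stream` generator are hypothetical placeholders standing in for a real inference engine.

```python
# Illustrative only: a minimal streaming inference endpoint with naive context merging.
import asyncio
from typing import AsyncIterator

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI(title="inference-gateway", version="0.1.0")


class InferenceRequest(BaseModel):
    prompt: str
    context: list[str] = []          # prior turns, retrieved snippets, etc.
    max_tokens: int = 256


async def fake_llm_stream(prompt: str, max_tokens: int) -> AsyncIterator[str]:
    # Stand-in for a real inference engine (vLLM, a hosted API, ...):
    # streams the prompt back word by word.
    for word in prompt.split()[:max_tokens]:
        await asyncio.sleep(0)       # yield control, as a real token stream would
        yield word + " "


@app.post("/v1/infer")
async def infer(req: InferenceRequest) -> StreamingResponse:
    # Naive context merge: prepend prior context ahead of the new prompt.
    merged = "\n".join(req.context + [req.prompt])
    return StreamingResponse(
        fake_llm_stream(merged, req.max_tokens),
        media_type="text/plain",
    )
```

With a real engine behind it, the same shape supports the streaming and context merging described above, and a client can consume it with a plain `curl -N` POST of a JSON body.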
- Bachelor’s, Master’s, or PhD in Computer Science, Engineering, or a related technical field.
- 8+ years of experience in backend systems design and development — ideally in AI/ML or data infrastructure domains.
- Strong proficiency in Python (FastAPI preferred); additional experience with Node.js, Go, or Rust is a plus.
- Experience with LLM inferencing pipelines, context windowing, and chaining prompts with memory/state persistence.
- Familiarity with or active experience implementing Model Context Protocol (MCP) or similar abstraction layers for context-driven model orchestration.
- Strong understanding of REST/GraphQL API design, OAuth2/JWT-based auth, and event-driven backend architectures.
- Practical knowledge of Redis, PostgreSQL, and one or more vector databases (e.g., Weaviate, Qdrant).
- Comfortable working with containerized applications, CI/CD pipelines, and cloud-native deployments (AWS/GCP/Azure).
- Experience building or contributing to agent frameworks (e.g., LangGraph, CrewAI, AutoGen, Agno).
- Background in multi-agent systems, dialogue orchestration, or synthetic workflows.
- Familiarity with OpenAI, Anthropic, HuggingFace, or open-weight model APIs and tool-calling protocols.
- Strong grasp of software security, observability (OpenTelemetry, Prometheus), and system performance optimization.
- Experience designing abstraction layers for LLM orchestration across different provider APIs (OpenAI, Claude, local inference); see the sketch below for one shape such a layer can take.
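To make the last point concrete, here is a minimal, hedged sketch of one way a provider-agnostic layer can be shaped; `LLMProvider`, `get_provider`, and the stubbed provider classes are hypothetical names rather than an existing library API, and real network calls are omitted.

```python
# Illustrative only: a common interface so orchestration code stays provider-agnostic.
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    @abstractmethod
    def complete(self, prompt: str, max_tokens: int = 256) -> str: ...


class OpenAIProvider(LLMProvider):
    def complete(self, prompt: str, max_tokens: int = 256) -> str:
        # A real implementation would call the OpenAI API here; stubbed for illustration.
        return f"[openai] {prompt[:max_tokens]}"


class LocalProvider(LLMProvider):
    def complete(self, prompt: str, max_tokens: int = 256) -> str:
        # A real implementation would call a locally hosted model here; stubbed for illustration.
        return f"[local] {prompt[:max_tokens]}"


def get_provider(name: str) -> LLMProvider:
    # Simple registry lookup keeps callers unaware of concrete providers.
    registry = {"openai": OpenAIProvider(), "local": LocalProvider()}
    return registry[name]


if __name__ == "__main__":
    print(get_provider("local").complete("Hello, Maersk"))
```

Keeping callers against a single `complete` interface is what lets the same orchestration code target OpenAI, Claude, or a locally hosted model without change.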
- Opportunity to lead backend architecture for cutting-edge, LLM-native systems.
- High-impact role in shaping the future of context-aware AI agent communication.
- Autonomy to drive backend standards, protocols, and platform capabilities across the org.
- Collaborative, remote-friendly culture with deep technical peers.