
NVIDIA and ServiceNow Unveil Project Arc: Autonomous AI Agents Built for the Enterprise


Enterprise AI has learned to generate. It has learned to reason. Now companies are asking the next question: how should AI act?

💡 Quick answer: NVIDIA and ServiceNow have unveiled Project Arc, a long-running autonomous desktop agent for knowledge workers, built on NVIDIA OpenShell and ServiceNow's AI Control Tower. The collaboration brings governance, security, and Blackwell-class efficiency to autonomous AI — turning it from a demo into something enterprises can actually deploy.

At ServiceNow Knowledge 2026, the two companies extended a multi-year partnership with a single, sharper goal: make autonomous AI agents safe, governed, and economically viable inside the enterprise. The result is a stack that pairs NVIDIA's compute and runtime layer with ServiceNow's workflow context — so an agent isn't just clever, it's accountable.

For the past two years, agents have been mostly experimental. They could draft a reply, summarize a ticket, or chain a few tools together. But putting them in front of real enterprise data, with permission to act, required a level of control most teams simply didn't have. That's the gap this announcement closes.

From Generating to Reasoning to Acting

The shift in enterprise AI has happened in three quick waves. First came generation — large models writing copy, code, and summaries. Then came reasoning — models that could plan, evaluate, and decide. The third wave, now underway, is action: AI that opens applications, navigates file systems, executes multistep workflows, and reports back.

Action introduces hard problems that generation never had to solve. Who authorized the agent? What can it touch? How is the work audited? Project Arc is built around answers to those exact questions.

What Project Arc Actually Is

Project Arc is a long-running, self-evolving desktop agent designed for the knowledge worker — developers, IT teams, administrators, and operations staff who live inside dozens of overlapping tools all day.

Unlike a standalone chatbot or a one-shot script, Arc connects natively to the ServiceNow AI Platform through ServiceNow Action Fabric. That connection is what brings governance, auditability, and workflow intelligence to every action it performs — local file edits, terminal commands, and cross-application sequences that traditional automation cannot stitch together.

"Project Arc represents the next step in our ongoing collaboration with NVIDIA, bringing autonomous execution to the desktop. By combining OpenShell's runtime layer with ServiceNow AI Control Tower, and powered by ServiceNow Action Fabric, we're delivering the governance and security that enterprise AI requires." — Jon Sigler, EVP & GM of AI Platform, ServiceNow

The Three Foundations of Enterprise-Grade Agents

According to NVIDIA's Kari Briski, every long-running autonomous agent depends on three pillars working together. Skip one, and the system either becomes a security liability or never reaches break-even economics.

1. Open, Domain-Adaptable Models

Agents must be customizable to the enterprise's data and skills — not trapped inside a closed black box.

2. Governed Security

Sandboxing, policy controls, and observability must be built in from the runtime up.

3. Efficient Tokenomics

AI factories must deliver enough cost-per-token improvement to make multistep agents economically viable at scale.

OpenShell: The Secure Runtime Underneath

Project Arc runs on NVIDIA OpenShell, an open-source secure runtime for developing and deploying autonomous agents inside sandboxed, policy-governed environments. ServiceNow is now contributing back to OpenShell — a meaningful signal that the foundation is meant to be a shared, open standard rather than a vendor lock-in.

With OpenShell, enterprises define exactly:

  • What the agent is allowed to see — files, systems, data sources
  • Which tools it may invoke — APIs, terminals, applications
  • How its actions are contained and rolled back when something looks off
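To make the shape of such a policy concrete, here is a minimal, purely illustrative sketch in Python. This is not actual OpenShell syntax or API — NVIDIA hasn't published the runtime's configuration format in this article — just a model of the three controls listed above: visible data, invocable tools, and containment.

```python
from dataclasses import dataclass, field

@dataclass
class AgentPolicy:
    """Hypothetical sandbox policy: what an agent may see, what it may
    invoke, and whether anomalous actions are rolled back. Illustrative
    only — not the real OpenShell interface."""
    readable_paths: set[str] = field(default_factory=set)  # files/data the agent may see
    allowed_tools: set[str] = field(default_factory=set)   # APIs, terminals, applications
    rollback_on_anomaly: bool = True                       # contain and undo suspicious actions

    def permits(self, tool: str, path: str) -> bool:
        # An action passes only if BOTH the tool and the target path
        # are explicitly allowed — default-deny, as a sandbox should be.
        return tool in self.allowed_tools and any(
            path.startswith(root) for root in self.readable_paths
        )

policy = AgentPolicy(
    readable_paths={"/srv/tickets"},
    allowed_tools={"terminal", "itsm_api"},
)

print(policy.permits("terminal", "/srv/tickets/INC0012.txt"))  # True
print(policy.permits("browser", "/etc/passwd"))                # False
```

The point of the sketch is the default-deny posture: anything not explicitly granted is refused, which is what lets a security team reason about the blast radius of an autonomous agent.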

Pair that with the ServiceNow AI Control Tower for governance and observability across the lifecycle, and you have something an internal security team will actually approve.

The Open Model Stack

Capability without flexibility is a dead end in the enterprise. The collaboration leans heavily on an open ecosystem so customers can specialize models for their own domain.

Building blocks

  • NVIDIA AI-Q Blueprint — a framework for building deep research agents that can gather context and make complex multi-document decisions.
  • NVIDIA Agent Toolkit with Nemotron open models — flexible building blocks teams can fine-tune.
  • NVIDIA NeMo Gym — the library powering benchmarking and training of agentic systems.

Benchmarks built for real enterprise work

General reasoning benchmarks don't tell you how an agent behaves over a 40-step IT workflow. So the partners built two suites designed for exactly that:

  • NOWAI-Bench — an open benchmarking suite for enterprise AI agents, integrated with NeMo Gym.
  • EnterpriseOps-Gym — described as the industry's most challenging enterprise agent benchmark. Nemotron 3 Super currently ranks #1 among open-source models on it.

The Tokenomics Argument

Autonomous agents burn tokens. A single complex multistep job can consume what a chatbot uses in a week. That's why infrastructure efficiency isn't a footnote — it's the entire business case.

  • 50× more tokens per watt vs. Hopper
  • ~35× lower cost per million tokens
  • #1 — Nemotron 3 Super on EnterpriseOps-Gym, among open-source models

The NVIDIA Blackwell platform's efficiency gains over the prior Hopper generation are what move long-running agents from "interesting demo" to "deployable across thousands of seats." When the AI Control Tower plugs into the validated NVIDIA Enterprise AI Factory design, customers also gain real-time observability across the entire AI lifecycle.
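A back-of-envelope calculation shows why the cost multiplier is the whole business case. Only the ~35× cost ratio comes from the announcement; the token counts and the baseline price below are invented purely for illustration.

```python
# Back-of-envelope agent economics. Only the 35x cost ratio is from the
# announcement; token counts and the baseline price are illustrative guesses.
BASELINE_COST_PER_M = 3.50               # hypothetical $/1M tokens, prior-gen hardware
BLACKWELL_COST_PER_M = BASELINE_COST_PER_M / 35

CHATBOT_TOKENS_PER_QUERY = 1_500         # one Q&A exchange
AGENT_TOKENS_PER_WORKFLOW = 400_000      # a long multistep workflow with tool calls and retries

def cost(tokens: int, price_per_million: float) -> float:
    """Dollar cost of consuming `tokens` at `price_per_million` $/1M tokens."""
    return tokens / 1_000_000 * price_per_million

old = cost(AGENT_TOKENS_PER_WORKFLOW, BASELINE_COST_PER_M)
new = cost(AGENT_TOKENS_PER_WORKFLOW, BLACKWELL_COST_PER_M)
print(f"one agent workflow — before: ${old:.2f}, after: ${new:.3f}")
```

Under these assumed numbers, a workflow that consumes hundreds of times a chatbot query's tokens drops from dollars to pennies — which is the difference between a pilot and a deployment across thousands of seats.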


Why This Matters for Enterprises Right Now

Most enterprises are sitting on a backlog of automations they couldn't reliably build with traditional RPA — too many edge cases, too much unstructured data, too many tools to bridge. That backlog is exactly where autonomous agents earn their keep.

But the buyer's calculus has shifted. Capability is now table stakes. The questions on every CIO's whiteboard are: Can I prove what the agent did? Can I scope what it can touch? Will the unit economics hold at scale? Project Arc and the underlying stack are an answer to all three at once.

"AI is becoming a new way that work gets done." — NVIDIA & ServiceNow, ServiceNow Knowledge 2026

Key takeaways

  • Project Arc is ServiceNow's autonomous desktop agent for knowledge workers — it lives on the desktop and acts across files, terminals, and apps.
  • It runs on NVIDIA OpenShell, an open-source secure runtime, with ServiceNow contributing back upstream.
  • Governance and observability come through ServiceNow AI Control Tower and the Action Fabric.
  • An open model stack — Nemotron, AI-Q Blueprint, NeMo Gym — keeps customers in control of customization.
  • New benchmarks (NOWAI-Bench, EnterpriseOps-Gym) measure what enterprises actually care about: long, real workflows.
  • Blackwell economics make multistep agents finally affordable at scale.

Frequently Asked Questions

What is Project Arc?

Project Arc is a long-running, self-evolving autonomous desktop agent from ServiceNow, built in collaboration with NVIDIA. It's designed for knowledge workers — developers, IT teams, and administrators — and can access local file systems, terminals, and applications to complete complex multistep tasks under enterprise governance.

What is NVIDIA OpenShell?

NVIDIA OpenShell is an open-source secure runtime for developing and deploying autonomous AI agents inside sandboxed, policy-governed environments. It lets enterprises define what agents can see, which tools they can invoke, and how their actions are contained.

How does this compare to traditional RPA?

Traditional RPA follows fixed scripts and breaks on edge cases or UI changes. Autonomous agents like Project Arc reason about goals, adapt to unstructured situations, and chain across applications — including those without clean APIs. They handle the long-tail workflows RPA could never reliably automate.

What benchmark does Nemotron 3 Super lead?

Nemotron 3 Super currently ranks #1 among open-source models on EnterpriseOps-Gym, described as the industry's most challenging enterprise agent benchmark. It evaluates models on the kind of long, multistep workflows enterprises run in production rather than on general reasoning tasks.

Why are tokenomics central to enterprise AI agents?

Long-running agents consume far more tokens than chatbots — a single multistep workflow can chew through what a Q&A bot uses in a week. NVIDIA's Blackwell platform delivers around 50× more tokens per watt than Hopper and roughly 35× lower cost per million tokens, which is what shifts agents from a pilot curiosity to viable enterprise infrastructure.

How is governance enforced?

Governance comes from two layers working together. OpenShell sandboxes execution and enforces policy at the runtime level. The ServiceNow AI Control Tower, integrated with the Action Fabric, provides observability, audit trails, and lifecycle controls across every agent and action.

Where was this announced?

At ServiceNow Knowledge 2026, the company's annual customer and partner conference, on May 5, 2026.