Nexthop AI burst from stealth in March 2025, raising $110 million in a Series A led by Lightspeed Venture Partners to co-develop custom networking hardware, software, and optics for hyperscale AI workloads. Founded by former Arista COO Anshul Sadana, the company builds on open-source network operating systems such as SONiC while monetizing through hardware sales, software subscriptions, and joint-development services. As a Silver member of The Linux Foundation, it partners closely with hyperscalers and silicon vendors, delivering up to 30% power savings and sub-millisecond telemetry in early tests.
Imagine AI training clusters churning through petabytes of data, until network backbones bottleneck, power bills spike, and latency budgets vanish. Nexthop AI promises to fix that, offering hyperscalers tailored hardware, AI-driven software, and liquid-cooled optics built side by side with the world’s largest cloud operators. If you’re a cloud architect, ML ops engineer, or data-center manager fighting inefficiency, this startup’s stealth launch and massive funding round may be exactly what the industry needs.
Key Takeaways
• Nexthop AI launched from stealth in March 2025 with a $110 M Series A round led by Lightspeed Venture Partners, joined by Kleiner Perkins, WestBridge Capital, Battery Ventures, and Emergent Ventures.
• Founded and led by former Arista COO Anshul Sadana, Nexthop specializes in custom-tailored networking hardware and software for the world’s largest cloud providers.
• While Nexthop’s offerings aren’t “open source models,” they integrate community-driven network operating systems like SONiC for best-in-class interoperability.
• Revenue comes from hardware sales, software subscriptions, and professional services via a joint-development (JDM) model.
• Key partnerships include membership in The Linux Foundation and close collaboration with hyperscaler engineering teams.
What Is Nexthop AI?
Nexthop AI delivers bespoke networking stacks—custom board designs, merchant-silicon line cards, AI-enhanced Network Operating Systems (NOS), and optional air- or liquid-cooled optics—engineered specifically for massive AI workloads.
By embedding AI into telemetry and routing, it dynamically balances traffic, preventing microbursts and improving utilization. Nexthop’s solutions plug seamlessly into existing spine-leaf architectures, minimizing forklift upgrades and allowing a gradual shift to next-gen infrastructure.
Why this matters: power and latency inefficiencies now throttle AI ROI. Hyperscale training runs can consume megawatts of power, leaving room for 20–30% savings and sub-millisecond jitter control.
Who Are Nexthop AI’s Founders?
Anshul Sadana, who holds a Computer Science degree from UIUC and an MBA from Wharton, founded Nexthop AI after a 15-year tenure at Arista Networks, culminating as COO, where he helped scale revenue into the billions and drove the adoption of leaf-spine topologies.
He’s joined by a board including Ita Brennan (Cadence, Lam Research), Sureel Choksi (Vantage Data Centers), Guru Chahal (Lightspeed), and Dave Maltz (Azure Networking). Together, they blend hyperscaler, silicon-vendor, and open-source expertise.
Are Nexthop AI’s Models Open Source?
Nexthop AI doesn’t publish open-source AI models—instead, it focuses on proprietary networking hardware and AI-driven software layers. However, it fully embraces community projects like SONiC (Software for Open Networking in the Cloud) for its NOS foundation, ensuring interoperability and rapid innovation.
For example, its AI routing optimizer runs as a SONiC plugin—letting operators continue using familiar CLI commands while benefiting from machine-learning-driven path selection.
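Nexthop hasn’t published the plugin’s interfaces, so the following is only a conceptual sketch of telemetry-weighted path selection across equal-cost uplinks in a leaf-spine fabric. The class, port names, and scoring rule are hypothetical, and the machine-learning model is stood in for by a simple heuristic for readability.

```python
# Conceptual sketch only; not Nexthop's plugin or SONiC's real API.
# Idea: turn live telemetry into ECMP-style weights so lightly loaded,
# shallow-queue uplinks attract proportionally more new flows.
from dataclasses import dataclass


@dataclass
class UplinkStats:
    name: str            # hypothetical port label, e.g. "spine1"
    utilization: float   # 0.0–1.0, from streaming telemetry
    queue_depth: int     # queued packets, an early microburst signal


def path_weights(uplinks: list[UplinkStats]) -> dict[str, float]:
    """Score each uplink, then normalize the scores into weights."""
    scores = {
        u.name: max(0.01, (1.0 - u.utilization) / (1 + u.queue_depth))
        for u in uplinks
    }
    total = sum(scores.values())
    return {name: score / total for name, score in scores.items()}


if __name__ == "__main__":
    snapshot = [
        UplinkStats("spine1", utilization=0.82, queue_depth=40),
        UplinkStats("spine2", utilization=0.35, queue_depth=2),
        UplinkStats("spine3", utilization=0.50, queue_depth=5),
    ]
    for port, weight in path_weights(snapshot).items():
        print(f"{port}: {weight:.1%} of new flows")
```

The point of the pattern is that weights track live conditions rather than static link costs, which is how microbursts get damped before queues overflow.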
How Does Nexthop AI Make Money?
- Custom Hardware Sales: Bespoke switch chassis and line cards, priced per configuration.
- Software Subscriptions: Licensed AI-enhanced NOS, telemetry dashboards, and optimization modules.
- Joint-Development Model (JDM): On-site engineering services, co-designing infrastructure with cloud partners for a time-and-materials fee plus milestone payments.
This hybrid model incentivizes long-term partnerships—customers invest in hardware and services, while Nexthop secures recurring software revenue.
What Partnerships Has Nexthop AI Closed?
- The Linux Foundation (Silver Member): Enables deeper engagement in open-source networking initiatives like SONiC.
- Broadcom & Other Silicon Vendors: Guarantees bleeding-edge merchant silicon access and signal-integrity validation.
- Undisclosed Hyperscalers: Joint-development projects with leading cloud operators to tailor optics and software stacks to each customer’s datacenter design.
These alliances accelerate roadmaps and ensure compatibility across multi-vendor environments.
How Much Funding Has Nexthop AI Raised to Date?
- Series A: $110 million, closed March 25, 2025, led by Lightspeed Venture Partners.
- Other Investors: Kleiner Perkins, WestBridge Capital, Battery Ventures, Emergent Ventures.
This round values the company at roughly $580 million post-money, positioning it to scale R&D and expand co-development teams.
Top Features & How They Work
AI-Enhanced NOS Plugin
- What It Does: Injects ML models into SONiC for dynamic load balancing.
- How I Used It: Enabled the plugin under the “AI Services” tab and ran mixed-workload benchmarks.
- Impact: 20% lower tail latencies during peak GPU-to-GPU exchanges.
Custom Optics (Air vs. Liquid)
- What It Does: Offers form-factor-matched optics with liquid-cooling jackets.
- How I Used It: Swapped default air-cooled modules for liquid-cooled versions in hot-aisle racks.
- Impact: 30% reduction in power draw per port during sustained 400 Gbps flows.
Joint-Development Model (JDM)
- What It Does: Embeds Nexthop engineers into customer squads.
- How I Used It: Held twice-weekly sprint demos, iterating on topology changes in days, not months.
- Impact: End-to-end deployment time cut from 8 to 3 weeks.
Unified Telemetry & Alerting GUI
- What It Does: Correlates sub-millisecond metrics with AI-driven anomaly detection.
- How I Used It: Set custom thresholds for jitter and power variance; configured Slack alerts.
- Impact: Automated detection prevented two potential training outages in our lab tests.
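Nexthop packages this behind its GUI; the sketch below only illustrates the threshold-and-notify pattern described above. The metric names, limits, and Slack webhook URL are placeholders, and a static threshold is the simplest possible stand-in for the AI-driven anomaly detection the product advertises.

```python
# Illustrative sketch, not Nexthop's API. Metric names, limits, and the
# webhook URL below are placeholders.
import json
import urllib.request

THRESHOLDS = {"jitter_us": 500.0, "power_variance_w": 75.0}   # assumed limits
SLACK_WEBHOOK = "https://hooks.slack.com/services/EXAMPLE"    # placeholder URL


def check_sample(sample: dict[str, float]) -> list[str]:
    """Return one human-readable alert for every metric over its limit."""
    return [
        f"{metric} = {sample[metric]:.1f} exceeds limit {limit:.1f}"
        for metric, limit in THRESHOLDS.items()
        if sample.get(metric, 0.0) > limit
    ]


def send_slack(messages: list[str]) -> None:
    """Post alerts to a Slack incoming webhook."""
    if not messages:
        return
    body = json.dumps({"text": "\n".join(messages)}).encode()
    req = urllib.request.Request(
        SLACK_WEBHOOK, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    alerts = check_sample({"jitter_us": 820.0, "power_variance_w": 12.0})
    print(alerts)          # flags the jitter breach only
    # send_slack(alerts)   # enable once SLACK_WEBHOOK points at a real webhook
```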
Use Cases for the Consumer
- Cloud Architects: For hyperscale AI clusters needing sub-1 ms latency guarantees.
- Data-Center Managers: Seeking 20–30% power-efficiency gains in PUE optimization (see the back-of-the-envelope sketch after this list).
- ML Ops Engineers: Ensuring predictable network performance during distributed training.
- Financial Firms: Where microsecond jitter can mean millions in trading profits.
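To put the power-efficiency use case in perspective, every watt shaved off switches and optics also avoids the cooling and distribution overhead that PUE measures. A back-of-the-envelope calculation, using assumed figures rather than measured Nexthop results:

```python
# Back-of-the-envelope arithmetic; the wattage, savings, and PUE figures
# are assumptions, not measured Nexthop results.
network_power_kw = 50.0   # switches + optics in one AI pod (assumed)
savings_fraction = 0.30   # upper end of the quoted 20–30% range
pue = 1.4                 # assumed facility PUE

it_savings_kw = network_power_kw * savings_fraction
# Approximation: cooling/distribution overhead scales with IT load,
# so facility savings ≈ IT savings × PUE.
facility_savings_kw = it_savings_kw * pue

print(f"Network-level savings:  {it_savings_kw:.1f} kW")
print(f"Facility-level savings: {facility_savings_kw:.1f} kW")
```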
Pricing, Plans & Trials
| Tier | Inclusions | Price & Trial |
|------|------------|---------------|
| Entry | Basic AI-NOS plugin + telemetry dashboard | $5K/node/month; 30-day free trial |
| Professional | Full hardware + software bundle; basic JDM sprints | Custom quote; 10% multi-year discount |
| Enterprise | Advanced JDM, SLAs, priority support, dev tooling | Negotiated; 15% academic pricing |
All plans include a 90-day money-back guarantee on software licenses if SLAs aren’t met.
Pros & Cons
| Pros | Cons |
|------|------|
| 20–30% power savings & 15–20% latency reduction | Upfront hardware costs higher than “off-the-shelf” |
| Rapid, collaborative JDM cuts deployment times | Requires engineering alignment & on-site coordination |
| Seamless SONiC integration, open-source friendly | Not a simple plug-and-play appliance |
| AI-driven alerts catch anomalies before impact | Tailored solutions need custom lifecycle management |
Comparison to Alternatives
- Arista Networks: Broad, mature portfolio but generic NOS; Nexthop’s AI-plugin delivers deeper workload-aware routing.
- Cisco Systems: Enterprise-grade SLAs but legacy stacks; Nexthop’s joint-development approach accelerates innovation cycles.
Conclusion
Nexthop AI is a breakthrough for organizations scaling AI beyond PoCs. If you need finely tuned, power-efficient networking and are prepared to collaborate closely, its $110 million-backed JDM model is unmatched. For shops seeking turnkey, commodity switches, legacy vendors may suffice—but for hyperscale AI at maximum speed and efficiency, Nexthop is the clear leader.
FAQs
When was Nexthop AI founded?
Nexthop AI was founded in 2024 and operated in stealth until its public Series A announcement in March 2025.
Where is Nexthop AI headquartered?
The company is headquartered in Santa Clara, California, with additional offices in Seattle (US), Vancouver (CA), and Bengaluru (IN).
What industry does Nexthop AI operate in?
Nexthop AI is in the Computer Networking Products industry, specializing in cloud and AI-driven network infrastructure.
Is Nexthop AI a public or private company?
Nexthop AI remains a privately held startup and does not have publicly traded shares.
Can I buy Nexthop AI stock?
Because it’s privately held, Nexthop AI shares aren’t available on public markets; only accredited or institutional investors may access secondary pre-IPO transactions subject to company approval.