
The first FaaS that scales like serverless.

DataVec delivers horizontal elasticity with vertical performance, enabling near-zero-latency workloads at massive scale.

  • 104 KB per actor
  • 300 K invocations/sec/core
  • Zero cold starts

Why DataVec?

Serverless unlocked horizontal scale, but at the cost of vertical performance. DataVec delivers both.

The next evolution of serverless: horizontal scale, vertical power.

Traditional FaaS platforms traded away vertical performance for elasticity. DataVec’s zero-overhead runtime delivers both — allowing teams to develop fast, scale instantly, and run with the efficiency of handcrafted servers.

Our Performance

Performance & Locality — Solving the Core Problems of Serverless
Current Platforms
  • Heavy cold starts (10–100 ms)
  • "Throw money at the problem"
  • Stateless, hash-partitioned scaling fragments locality

DataVec
  • Instant dispatch (µs — single page fault)
  • Save money through predictable performance
  • Local state persistence keeps computation and data together

DataVec executes functions instantly and locally, eliminating the hidden costs of stateless cloud scaling.

Our Approach

DataVec runs on mnvkd, a unified C runtime that merges functions, services, actors, and threads into a single efficient system. By aggregating I/O and scheduling instead of isolating workloads, mnvkd removes runtime waste and achieves microsecond dispatch. Its locality model keeps data close to where it’s computed, so workloads stay fast and consistent even at massive scale. The result: native-speed performance with the flexibility of serverless.
Aggregation + Locality = C-speed serverless.
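
To make the aggregation idea concrete, here is a minimal, hypothetical sketch of cooperative actor dispatch in plain C. The names and structure (actor, step_fn, run_all) are illustrative assumptions, not mnvkd's actual API: one scheduler loop drives many small actors on a core, so there are no threads to contend and nothing to cold-start.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

/* Hypothetical illustration of cooperative actor dispatch (not mnvkd's API):
 * every actor is a small state machine driven by one scheduler loop on a
 * single core, so work is aggregated instead of isolated per request. */

struct actor;
typedef bool (*step_fn)(struct actor *self);   /* returns false when done */

struct actor {
    step_fn step;      /* continuation to run on the next turn */
    int     counter;   /* tiny bit of local, persistent state  */
};

/* An actor body: does one bounded unit of work, then yields by returning. */
static bool count_to_three(struct actor *self)
{
    printf("actor %p at step %d\n", (void *)self, self->counter);
    return ++self->counter < 3;
}

/* One scheduler loop drives every actor cooperatively (round-robin). */
static void run_all(struct actor *actors, size_t n)
{
    bool live = true;
    while (live) {
        live = false;
        for (size_t i = 0; i < n; i++) {
            if (actors[i].step == NULL)
                continue;                  /* this actor already finished */
            if (actors[i].step(&actors[i]))
                live = true;               /* still runnable next turn    */
            else
                actors[i].step = NULL;     /* actor completed its work    */
        }
    }
}

int main(void)
{
    struct actor actors[2] = {
        { count_to_three, 0 },
        { count_to_three, 0 },
    };
    run_all(actors, 2);
    return 0;
}
```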

AI-Powered Development

AI can design apps — but it can’t optimize platforms. DataVec bridges that gap.

We give AI a runtime that’s fast, local, and resource-aware—so it can deploy apps that run at C-speed with zero overhead. You get all the simplicity of high-level frameworks, backed by the performance of hand-tuned systems.

AI builds the logic. DataVec makes it production-grade.

Key Features

A runtime engineered for instant dispatch, local state, and limitless concurrency.
  • Single-page I/O loads: One huge-page fault brings an entire micro-process into memory for instant dispatch (a generic mapping sketch follows this list).
  • Persistent, protected memory: Huge-page–backed mappings keep state local and secure between invocations.
  • Lightweight by design: Each actor consumes just 104 KB (8 KB core + 96 KB tunable buffers), enabling millions of concurrent functions.
  • Efficient concurrency: Cooperative, isolated scheduling achieves real-time responsiveness without thread contention.
  • Built for any language: Native bindings for C, JavaScript, and others—bringing FaaS performance to every stack.
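
The single-fault loading and huge-page-backed memory described above can be pictured with ordinary Linux system calls. The sketch below is a generic illustration using mmap(2) with MAP_HUGETLB, not DataVec's internal loader, and it assumes a Linux host with 2 MB huge pages pre-reserved (for example via vm.nr_hugepages).

```c
#define _GNU_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

/* Generic illustration only: reserve one 2 MB huge page and treat slices of
 * it as actor working sets, so touching the region costs a single page fault
 * instead of many 4 KB faults. Requires pre-allocated huge pages. */

#define HUGE_PAGE_SIZE   (2UL * 1024 * 1024)   /* typical x86-64 huge page   */
#define ACTOR_FOOTPRINT  (104UL * 1024)        /* 8 KB core + 96 KB buffers  */

int main(void)
{
    void *page = mmap(NULL, HUGE_PAGE_SIZE,
                      PROT_READ | PROT_WRITE,
                      MAP_PRIVATE | MAP_ANONYMOUS | MAP_HUGETLB,
                      -1, 0);
    if (page == MAP_FAILED) {
        perror("mmap(MAP_HUGETLB)");           /* no huge pages reserved?   */
        return 1;
    }

    /* Lay several 104 KB actor footprints into the same huge page: the first
     * write faults the whole page in once; later actors reuse the mapping. */
    size_t actors_per_page = HUGE_PAGE_SIZE / ACTOR_FOOTPRINT;
    for (size_t i = 0; i < actors_per_page; i++)
        memset((char *)page + i * ACTOR_FOOTPRINT, 0, ACTOR_FOOTPRINT);

    printf("mapped %lu bytes, room for %zu actors of %lu KB each\n",
           HUGE_PAGE_SIZE, actors_per_page, ACTOR_FOOTPRINT / 1024);

    munmap(page, HUGE_PAGE_SIZE);
    return 0;
}
```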

Use Cases

From cloud functions to edge AI, DataVec brings C-speed efficiency to any workload.

Cloud Functions at C-Speed

Deploy complete cloud and edge applications through the WinterTC interface. DataVec isolates workloads without overhead—delivering instant startup, consistent latency, and up to 10× better cost efficiency.


Ultra-Efficient Services

Design data services and custom protocols on our Super-Server layer to handle millions of connections with minimal resource consumption.
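
To illustrate the kind of I/O aggregation a Super-Server layer performs, here is a generic, self-contained Linux sketch: a single epoll loop multiplexing every connection on one thread, so a connection costs a file descriptor plus a small buffer rather than a thread. The names and structure are illustrative assumptions, not DataVec's Super-Server API.

```c
#define _GNU_SOURCE
#include <netinet/in.h>
#include <stdio.h>
#include <sys/epoll.h>
#include <sys/socket.h>
#include <unistd.h>

/* Generic single-threaded multiplexing sketch (not DataVec's API): one epoll
 * instance watches the listener and every client socket. */

int main(void)
{
    int lfd = socket(AF_INET, SOCK_STREAM | SOCK_NONBLOCK, 0);
    int one = 1;
    setsockopt(lfd, SOL_SOCKET, SO_REUSEADDR, &one, sizeof(one));

    struct sockaddr_in addr = { 0 };
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(8080);
    if (bind(lfd, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
        listen(lfd, 1024) < 0) {
        perror("bind/listen");
        return 1;
    }

    int ep = epoll_create1(0);
    struct epoll_event ev = { .events = EPOLLIN, .data.fd = lfd };
    epoll_ctl(ep, EPOLL_CTL_ADD, lfd, &ev);

    char buf[4096];
    for (;;) {
        struct epoll_event events[64];
        int ready = epoll_wait(ep, events, 64, -1);
        for (int i = 0; i < ready; i++) {
            int fd = events[i].data.fd;
            if (fd == lfd) {
                /* Drain the accept queue; each client is just another fd. */
                int cfd;
                while ((cfd = accept4(lfd, NULL, NULL, SOCK_NONBLOCK)) >= 0) {
                    struct epoll_event cev = { .events = EPOLLIN, .data.fd = cfd };
                    epoll_ctl(ep, EPOLL_CTL_ADD, cfd, &cev);
                }
            } else {
                /* Echo back whatever arrived; close on EOF or error. */
                ssize_t r = read(fd, buf, sizeof(buf));
                if (r <= 0) {
                    epoll_ctl(ep, EPOLL_CTL_DEL, fd, NULL);
                    close(fd);
                } else {
                    (void)write(fd, buf, (size_t)r);
                }
            }
        }
    }
}
```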


Edge AI & Real-Time Execution

Run soft real-time actors for AI inference, streaming, or event processing with <2 ms latency and predictable per-core performance.


Roadmap

We’re building carefully, with transparency and a focus on real performance.
  • Q4 2025: Managed Cloud Service
  • Q1 2026: Public Beta Launch
  • Q3 2026: Managed Cloud Service

Our Team

Ben Woolley — Platform Development

Full-stack engineer with deep experience across all layers—from web frameworks to operating system internals. Two decades in marketing technology and high-performance runtime design.

Shane Kutzer — Business Development

Veteran operator and founder with decades of experience building customer-driven businesses. Focused on strategic partnerships and enterprise adoption.