v0.1.0 — Now in public preview

European. Cloud‑native.
AI‑first.

kern is a systems language built in Europe for the next era of computing. Designed from the ground up for cloud infrastructure and AI workloads — with the safety, speed, and sovereignty your team demands.

main.kern
use std::io
use std::ai

type Prompt {
    system: String
    input:  String
}

fn main() -> async Result<(), Error> {
    let model = await ai::load("llama-3")?
    let doc = "A long document to summarize and translate."

    let prompts = [
        Prompt { system: "Summarize",  input: doc },
        Prompt { system: "Translate", input: doc },
    ]

    prompts
        |> map(|p| model.infer(p))
        |> async::all()
        |> await?
        |> each(|r| io::println("{r}"))
}

Three pillars. One language.

kern isn't another general-purpose language. It's purpose-built around the forces shaping the next decade of software.

European Native

Built in Europe.
Governed in the open.

kern is developed under European open-source governance. No single corporate owner. Privacy-by-design principles baked into the standard library. Built for teams that need to meet GDPR, the AI Act, and digital sovereignty requirements without friction.

  • Open governance under EU foundation
  • Privacy-by-design standard library
  • GDPR & AI Act compliance primitives
  • Digital sovereignty by default

Cloud Native

Born in containers.
Scales to zero.

kern compiles to tiny, dependency-free binaries that start in under a millisecond. First-class support for gRPC, health checks, structured logging, and distributed tracing. Designed to be the best language for Kubernetes and serverless.

  • Sub-millisecond cold starts
  • Static binaries, no runtime deps
  • Native gRPC & OpenTelemetry
  • Kubernetes-ready health & probes

AI Native

Tensors in the type system.
GPU at your fingertips.

kern treats AI as a first-class workload. Built-in tensor types with shape checking at compile time. Automatic differentiation. Transparent GPU offloading. Write inference pipelines and training loops in the same language as your application.

  • Compile-time tensor shape checking
  • Built-in automatic differentiation
  • Transparent GPU compute offloading
  • Native model loading & inference

Solid foundations.

Memory Safety

Ownership tracking at compile time. No GC, no dangling pointers, no data races.

Algebraic Types

Sum types, pattern matching, and exhaustive checking. Make illegal states unrepresentable.

Fearless Concurrency

Structured concurrency with compile-time race detection. Correct by construction.
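
The structured-concurrency model above can be sketched in kern's pipe-first style. This is an illustrative sketch only: `async::scope`, `spawn`, and the fetch helpers are assumed names, not confirmed standard-library API.

tasks.kern
use std::async

fn main() -> async Result<(), Error> {
    // Child tasks are bound to the scope: both are joined
    // (or cancelled) before the scope expression completes,
    // and an unsynchronized shared write would be a compile error.
    async::scope(|s| {
        s.spawn(|| fetch_users())
        s.spawn(|| fetch_orders())
    })
    |> await?
}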

Zero-Cost Abstractions

Compiles to native code via LLVM. You never pay for what you don't use.

First-Class Tooling

Built-in formatter, linter, test runner, and package manager. One tool, zero config.
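
In that spirit, a typical check loop might look like the following; the `fmt`, `lint`, and `test` subcommand names are assumptions based on the tools listed above, not confirmed CLI.

terminal
$ kern fmt    # built-in formatter
$ kern lint   # built-in linter
$ kern test   # built-in test runner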

Pipe-First Syntax

Data flows left to right. Chain, compose, and transform with clarity.

Expressive by nature.

kern's syntax reads as easily as it's written. See for yourself.

shapes.kern
type Shape =
    | Circle(f64)
    | Rect(f64, f64)
    | Triangle(f64, f64, f64)

fn area(shape: Shape) -> f64 {
    match shape {
        Circle(r) =>
            std::math::PI * r * r,
        Rect(w, h) =>
            w * h,
        Triangle(a, b, c) => {
            let s = (a + b + c) / 2.0
            (s * (s - a) * (s - b) * (s - c)).sqrt()
        }
    }
}

// The compiler ensures every variant is handled.
// Add a new shape? Every match must be updated.

Exhaustive pattern matching

Sum types let you define every possible variant. The compiler guarantees you handle them all — add a variant, and it tells you exactly where to update.

  • Compiler-enforced exhaustiveness
  • Destructuring in match arms
  • No null, no undefined, no surprises

service.kern
use std::http
use std::cloud::{health, telemetry}
use std::db

// A cloud-native HTTP service in 20 lines

fn main() -> async Result<(), Error> {
    let app = http::router()
        |> get("/users/:id", handle_user)
        |> post("/users",    create_user)
        |> with(telemetry::middleware())
        |> with(health::check("/healthz"))

    http::serve(app, ":8080")
        |> await
}

fn handle_user(req: Request) -> async Response {
    let id = req.params.get("id")?
    let user = await db::find_user(id)?
    Response::json(user)
}

Cloud-native by default

Health checks, structured logging, and distributed tracing are part of the standard library — not afterthoughts bolted on via third-party packages.

  • Built-in OpenTelemetry tracing
  • Kubernetes-ready health probes
  • Pipe-friendly middleware composition

inference.kern
use std::ai
use std::io
use std::tensor::{Tensor, f32}

// Shape-checked tensors catch dimension
// errors at compile time, not at 3 AM.

fn classify(
    image: Tensor<f32, [224, 224, 3]>
) -> async Result<Label, Error> {
    let model = await ai::load("resnet-50")?

    image
        |> normalize(0.485, 0.229)
        |> model.forward()
        |> argmax()
        |> to_label()
}

fn main() -> async Result<(), Error> {
    let img = await Tensor::from_image("photo.jpg")?
    let label = await classify(img)?
    io::println("Detected: {label}")
}

AI as a first-class citizen

Tensor dimensions are part of the type system. Shape mismatches are compile errors, not runtime crashes. Load models, run inference, and train — all in one language.

  • Compile-time tensor shape verification
  • Automatic GPU offloading
  • Native model loading & inference

Fast isn't a feature.
It's the foundation.

Faster builds Compilation speed vs. Go
0 GC pauses Deterministic memory management
<1 ms cold start Native binary, serverless-ready
Lower memory Peak RSS vs. Java equivalent

Up and running in seconds.

1

Install kern

terminal
$ curl -fsSL https://kern-lang.org/install.sh | sh

2

Create a project

terminal
$ kern new my-service
  Created project `my-service` in ./my-service
$ cd my-service

3

Run it

terminal
$ kern run
   Compiling my-service v0.1.0
    Finished in 0.08s
    Running `target/release/my-service`
Server listening on :8080

Build what
Europe needs next.

Join a growing community of developers building sovereign, cloud-native, AI-powered software with kern.