Abstract

This document is Part I of a five-part immersive tutorial that builds an enterprise-grade, fully open-source search indexing pipeline and search engine, one that scales for both indexing and searching your document corpus.

The project builds on itself: we start from a clean slate and build the product up incrementally.

The sections are planned as follows:

Part I (this document)

  1. Deep-dive into the Micronaut framework

  2. How to set up a multi-module enterprise project using

    1. Gradle

    2. JDK21

  3. Creating Google Protocol Buffers

Part II: Creating a search pipeline

  1. Creating a Kafka pipeline with pure open-source projects

  2. Expanding a Kafka pipeline for a search engine

  3. Search microservices

    1. Chunker

    2. Embeddings

    3. NLP - NER, Keyword Extraction

    4. Mapping protocol buffers

Part III: Designing a search API

  1. Creating a search API

  2. Simple or complex?

  3. Handling vector search in OpenSearch or Solr

  4. Handling BM25 search in OpenSearch or Solr

  5. Tuning relevancy

Part IV: Analytics

  1. How to measure analytics

  2. Choosing a search analytics package

  3. Creating a front end

Part V: Deploying Microservices

All of the code in this tutorial is open source and demonstrates how one can build a multi-module build for containers in a single repository using the Micronaut framework. At the same time, this tutorial builds a Kafka/gRPC-based text processing pipeline.

We build the entire project using the following components:

  1. Base components

    1. Micronaut framework

    2. JDK21

    3. Gradle build system (Kotlin syntax)

  2. External servers

    1. Kafka (latest vanilla version)

    2. Apicurio (Schema registry)

    3. Consul (Service discovery)

    4. OpenSearch (Search Engine)

    5. MongoDB (Document Store)

Managing dependencies and ensuring build consistency across multiple projects can be a significant challenge, especially as systems grow in complexity. This tutorial provides a step-by-step guide to establishing a robust multi-project build structure using Gradle with the Kotlin DSL (even if your projects use Java), targeting JDK 21.

What We’re Building

Imagine a system designed for processing data pipelines. This system consists of several parts:

  • Shared Libraries: A core library containing the main pipeline logic (pipeline-service-core), data models defined using Protocol Buffers (protobuf-models), and common helper functions (util).

  • Testing Utilities: A dedicated library (pipeline-service-test-utils) to assist in testing the pipeline components.

  • Microservices: Specific implementations of pipelines as runnable Micronaut applications (e.g., pipeline-instance-A).

The Goal

Our goal is to manage this system effectively within a single repository (monorepo) using Gradle. We’ll focus on:

  1. Centralized Dependency Management: Creating a custom Bill of Materials (BOM) and using Gradle’s version catalog (libs.versions.toml) to ensure all modules use consistent library versions.

  2. Consistent Build Environment: Using Gradle Kotlin DSL and configuring for JDK 21.

  3. Modular Structure: Defining clear dependencies between the different project modules.

  4. Efficient CI/CD: Discussing strategies to build and deploy only the parts of the system that have changed.
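
To make the first goal concrete, here is a minimal, hypothetical subproject dependency block in the style we will build up in Steps 4 and 5: the BOM supplies the versions, so individual dependency declarations can omit them.

```kotlin
// Hypothetical subproject build.gradle.kts fragment (module names follow this tutorial's layout)
plugins {
    `java-library`
}

dependencies {
    // Import version constraints from the custom BOM (created in Step 5)
    implementation(platform(project(":bom")))

    // Catalog aliases resolve to coordinates and versions from gradle/libs.versions.toml (Step 4)
    implementation(libs.slf4j.api)
    testImplementation(libs.bundles.testing.jvm)
}
```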

Project Dependency Overview

The following diagrams illustrate the relationships between the different project modules we’ll be configuring.

Figure 1. High-Level Module Dependencies
Figure 2. Dependency on the Custom BOM

This tutorial will guide you through setting up this structure step-by-step, providing CLI commands and code examples along the way.

Prerequisites

  • Linux or macOS environment with a Bash-compatible shell.

  • JDK 21 installed and configured (e.g., JAVA_HOME environment variable set).

  • Git installed.

  • Docker installed (optional, for containerization steps).

Let’s begin!

Step 1: Verify Project Setup and Initialize Gradle Wrapper (CLI)

This step assumes your project’s directory structure is already in place. We will verify the structure and ensure the Gradle wrapper is initialized for consistent builds.

  1. Verify Project Structure: Ensure your project root (e.g., micronaut-multiproject-example) contains the necessary subproject directories and configuration files:

    micronaut-multiproject-example/
    ├── bom/
    ├── gradle/
    │   └── libs.versions.toml  # Created in Step 4
    ├── pipeline-instance-A/
    ├── pipeline-service-core/
    ├── pipeline-service-test-utils/
    ├── protobuf-models/
    ├── util/
    ├── build.gradle.kts        # Created in Step 3
    ├── settings.gradle.kts     # Created in Step 2
    └── ... (other project files)

    Navigate into your project’s root directory:

    cd path/to/your/micronaut-multiproject-example
  2. Initialize or Verify Gradle Wrapper: Run this command in the root directory to ensure the correct Gradle version is configured via the wrapper. If the wrapper files (gradlew, gradlew.bat, gradle/wrapper) already exist, this command can update them if needed. We’ll use Gradle 8.13 (or your preferred compatible version). Check the Gradle Compatibility Matrix for the latest recommendations.

    # Ensure you have a system Gradle installed OR download manually if wrapper doesn't exist yet
    gradle wrapper --gradle-version 8.13

    This ensures gradlew, gradlew.bat, and the gradle/wrapper directory are present and configured. From now on, use ./gradlew to run Gradle tasks for consistency.

  3. Initialize Git Repository (if not already done): If your project isn’t already a Git repository, initialize it and create a .gitignore file.

    # Only run if not already a git repository
    git init
  4. Create the .gitignore file:

    nano .gitignore
    
    # Compiled class file
    *.class
    
    # Log file
    *.log
    
    # BlueJ files
    *.ctxt
    
    # Mobile Tools for Java (J2ME)
    .mtj.tmp/
    
    # Package Files #
    *.jar
    *.war
    *.nar
    *.ear
    *.zip
    *.tar.gz
    *.rar
    
    # virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
    hs_err_pid*
    replay_pid*
    
    .gradle/
    build/
    .idea/
    *.iml
    out/
    
    /out.txt
    
    docs/.asciidoctor
        

Step 2: Configure Project Settings (settings.gradle.kts)

This file defines which subprojects are included in the build and configures plugin and dependency resolution.

  1. Edit settings.gradle.kts: Open the settings.gradle.kts file and ensure it has the following content:

    nano settings.gradle.kts
    
    // settings.gradle.kts
    pluginManagement {
        repositories {
            gradlePluginPortal()
            mavenCentral()
        }
    }
    
    dependencyResolutionManagement {
        repositories {
            mavenCentral()
        }
        versionCatalogs {
            create("mn") {
                from("io.micronaut.platform:micronaut-platform:4.8.2")
            }
        }
    }
    
    rootProject.name = "my-pipeline-system"
    
    // Include all the subprojects
    include(
        "bom",
        "protobuf-models",
        "pipeline-service-core",
        "pipeline-service-test-utils",
        "pipeline-service-test-utils:micronaut-kafka-registry-core",
        "pipeline-service-test-utils:micronaut-kafka-registry-moto",
        "pipeline-service-test-utils:micronaut-kafka-registry-apicurio",
        "pipeline-instance-A",
        "util"
    )
        

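A note on the nested entries above: a colon-separated include path such as "pipeline-service-test-utils:micronaut-kafka-registry-core" maps to the nested directory pipeline-service-test-utils/micronaut-kafka-registry-core/, and other modules reference that subproject by the same colon-separated path. A hedged sketch:

```kotlin
// In some consuming module's build.gradle.kts (hypothetical example)
dependencies {
    implementation(project(":pipeline-service-test-utils:micronaut-kafka-registry-core"))
}
```
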
Step 3: Configure Root Build File (build.gradle.kts)

Configure global settings and apply common test configuration across subprojects.

  1. Create/Edit build.gradle.kts: In the root directory, ensure build.gradle.kts contains:

    build.gradle.kts (Root project)
    nano build.gradle.kts
    
    import org.asciidoctor.gradle.jvm.AsciidoctorTask
    
    plugins {
        id("org.asciidoctor.jvm.convert") version "4.0.4"
    }
    
    repositories {
        mavenCentral()
    }
    
    asciidoctorj {
        modules {
            diagram.use()
            // optional: pin version
            diagram.version("2.3.2")
        }
        attributes (
            mapOf("source-highlighter" to "coderay",
                       "docinfo1"           to "shared"))
    }
    tasks.named<AsciidoctorTask>("asciidoctor") {
        setSourceDir(file("src/docs"))
        setOutputDir(file("docs"))
        logDocuments = true
    
        attributes(
            mapOf(
                // tell HTML where images “live”
                "imagesdir"       to "images",
                // tell Diagram to also dump images here
                "imagesoutdir"    to "docs/images",
                "plantuml-format" to "svg",
                "docinfodir"      to "src/docs"
            )
        )
    }
    
    
    group = "com.krickert.search"
    version = "1.0.0-SNAPSHOT"
    
    subprojects {
        repositories {
            mavenCentral()
        }
    
        // Apply Java toolchain configuration to all subprojects with Java plugin
        plugins.withId("java-base") {
            configure<JavaPluginExtension> {
                toolchain {
                    languageVersion.set(JavaLanguageVersion.of(21))
                }
            }
    
            // Configure JUnit 5 for all subprojects with Java capabilities
            afterEvaluate {
                tasks.withType<Test>().configureEach {
                    useJUnitPlatform()
                    testLogging {
                        events("passed", "skipped", "failed")
                    }
                }
            }
        }
    
        // Apply publishing configuration defaults
        plugins.withId("maven-publish") {
            configure<PublishingExtension> {
                repositories {
                    mavenLocal()
                }
            }
        }
    }
    
        
The subprojects {} block configures useJUnitPlatform() globally, so you won’t need it in individual subproject files.

Step 4: Configure Version Catalog (gradle/libs.versions.toml)

Centralize dependency versions and aliases in a single file.

  1. Create/Verify gradle/libs.versions.toml:

    nano gradle/libs.versions.toml
    
    # gradle/libs.versions.toml
    
    [versions]
    micronautPlatform = "4.8.2"  # For the platform BOM
    micronautPlugins = "4.5.1"   # For the Micronaut Gradle plugins
    kotlin = "1.9.25"
    gradleProtobufPlugin = "0.9.5"
    gradleReleasePlugin = "3.1.0"
    protobuf = "3.25.3"
    grpc = "1.72.0"
    junit = "5.10.0"
    slf4j = "2.0.13"
    jackson = "2.18.3"
    guava = "33.4.8-jre"
    commonsLang3 = "3.14.0"
    
    [libraries]
    # Protobuf / gRPC
    protobuf-java = { module = "com.google.protobuf:protobuf-java", version.ref = "protobuf" }
    grpc-stub = { module = "io.grpc:grpc-stub", version.ref = "grpc" }
    grpc-protobuf = { module = "io.grpc:grpc-protobuf", version.ref = "grpc" }
    grpc-protocGen = { module = "io.grpc:protoc-gen-grpc-java", version.ref = "grpc" }
    
    # Testing with explicit versions
    junit-jupiter-api = { module = "org.junit.jupiter:junit-jupiter-api", version.ref = "junit" }
    junit-jupiter-engine = { module = "org.junit.jupiter:junit-jupiter-engine", version.ref = "junit" }
    
    # Other libraries
    slf4j-api = { module = "org.slf4j:slf4j-api", version.ref = "slf4j" }
    slf4j-simple = { module = "org.slf4j:slf4j-simple", version.ref = "slf4j" }
    jackson-databind = { module = "com.fasterxml.jackson.core:jackson-databind", version.ref = "jackson" }
    guava = { module = "com.google.guava:guava", version.ref = "guava" }
    commons-lang3 = { module = "org.apache.commons:commons-lang3", version.ref = "commonsLang3" }
    micronaut-platform = { module = "io.micronaut.platform:micronaut-platform", version.ref = "micronautPlatform" }
    
    [bundles]
    testing-jvm = ["junit-jupiter-api", "junit-jupiter-engine"]
    
    [plugins]
    # Update these to use micronautPlugins version
    micronaut-application = { id = "io.micronaut.application", version.ref = "micronautPlugins" }
    micronaut-library = { id = "io.micronaut.library", version.ref = "micronautPlugins" }
    protobuf = { id = "com.google.protobuf", version.ref = "gradleProtobufPlugin" }
    release = { id = "net.researchgate.release", version.ref = "gradleReleasePlugin" }
        

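Gradle turns each catalog entry into a type-safe accessor: hyphens in alias names become dots, bundles expand to all of their member libraries, and plugin aliases are applied with alias(...). A brief sketch of how a subproject might consume the entries above:

```kotlin
plugins {
    `java-library`
    alias(libs.plugins.protobuf)  // [plugins] entry: com.google.protobuf 0.9.5
}

dependencies {
    implementation(libs.protobuf.java)            // [libraries] alias "protobuf-java"
    testImplementation(libs.bundles.testing.jvm)  // [bundles] entry: both JUnit artifacts
}
```
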
Step 5: Create the Custom BOM Project (bom)

Define and publish your custom Bill of Materials.

  1. Create/Verify the Bill of Materials build file, bom/build.gradle.kts:

    bom/build.gradle.kts (bom project)
    nano bom/build.gradle.kts
    
    // File: bom/build.gradle.kts
    plugins {
        `java-platform`
        `maven-publish`
    }
    
    group = rootProject.group
    version = rootProject.version
    
    javaPlatform {
        // allowDependencies()
    }
    
    dependencies {
        constraints {
            // Import Micronaut Platform BOM (provides JUnit constraints etc.)
            // For BOM imports in constraints, use the specific notation:
            api(libs.micronaut.platform)  // This is the correct way to reference the BOM
    
            // Explicitly add JUnit with version
            api("org.junit.jupiter:junit-jupiter-api:5.10.0")
            api("org.junit.jupiter:junit-jupiter-engine:5.10.0")
    
            // Other constraints remain the same
            api(libs.protobuf.java)
            api(libs.grpc.stub)
            api(libs.grpc.protobuf)
            api(libs.guava)
            api(libs.jackson.databind)
            api(libs.commons.lang3)
            api(libs.slf4j.api)
    
            // Constrain own modules
            api("${rootProject.group}:protobuf-models:${rootProject.version}")
            api("${rootProject.group}:util:${rootProject.version}")
            api("${rootProject.group}:pipeline-service-core:${rootProject.version}")
            api("${rootProject.group}:pipeline-service-test-utils:${rootProject.version}")
            // Kafka registry modules
            api("${rootProject.group}:pipeline-service-test-utils.micronaut-kafka-registry-core:${rootProject.version}")
            api("${rootProject.group}:pipeline-service-test-utils.micronaut-kafka-registry-moto:${rootProject.version}")
            api("${rootProject.group}:pipeline-service-test-utils.micronaut-kafka-registry-apicurio:${rootProject.version}")
        }
    }
    
    publishing {
        publications {
            create<MavenPublication>("mavenJavaPlatform") {
                from(components["javaPlatform"])
                groupId = project.group.toString()
                artifactId = project.name
                version = project.version.toString()
                pom {
                    name.set("My Pipeline System BOM")
                    description.set("Bill of Materials for My Pipeline System components")
                }
            }
        }
    }
        
  2. Build & Publish BOM Locally (CLI):

    ./gradlew :bom:publishToMavenLocal
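
Once the BOM is in your local Maven repository, other builds can also import it by its Maven coordinates instead of as a project dependency. A minimal sketch, using the group and version set in the root build file:

```kotlin
repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    // Import the locally published BOM by coordinates
    implementation(platform("com.krickert.search:bom:1.0.0-SNAPSHOT"))
    // The version is supplied by the BOM's constraint
    implementation("org.slf4j:slf4j-api")
}
```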

Step 6: Configure Subprojects

Configure subprojects to use the BOM. (Ensure your subproject build files and source code exist and match the following configurations).

6a. Protobuf Models (protobuf-models)

  1. Verify .proto files exist: (Ensure files like pipeline_models.proto are in protobuf-models/src/main/proto/)

    mkdir -p protobuf-models/src/main/proto/
    nano protobuf-models/src/main/proto/pipeline_models.proto
    
    syntax = "proto3";
    package com.krickert.search.model;
    option java_multiple_files = true;
    import "google/protobuf/timestamp.proto";
    import "google/protobuf/struct.proto";
    import "google/protobuf/empty.proto";
    
    message PipeDoc {
      string id = 1;
      string title = 2;
      string body = 3;
      repeated string keywords = 4;
      string document_type = 5;
      string revision_id = 6;
      google.protobuf.Timestamp creation_date = 7;
      google.protobuf.Timestamp last_modified = 8;
      google.protobuf.Struct custom_data = 9;
      SemanticDoc chunk_embeddings = 10;
      map<string, Embedding> embeddings = 11;
    }
    
    message Embedding {
      repeated float embedding = 1;
    }
    
    message SemanticDoc {
      string parent_id = 1;
      string parent_field = 2;
      string chunk_config_id = 3;
      string semantic_config_id = 4;
      repeated SemanticChunk chunks = 5;
    }
    
    message SemanticChunk {
      string chunk_id = 1;
      int64 chunk_number = 2;
      ChunkEmbedding embedding = 3;
    }
    
    message ChunkEmbedding {
      string embedding_text = 1;
      repeated float embedding = 2;
    }
    
    message Route {
      RouteType routeType = 1;
      // For Kafka: the topic name; for gRPC: the endpoint tag or destination identifier.
      string destination = 2;
    }
    
    enum RouteType {
      UNKNOWN = 0;
      NULL_TERMINATION = 1;
      KAFKA = 2;
      GRPC = 3;
    }
    
    message PipeRequest {
      PipeDoc doc = 1;
      map<string, string> config = 2;
      repeated Route destinations = 3;
    }
    
    message OutputResponse {
      bool success = 1;
      oneof reply {
        PipeDoc outputDoc = 2;
        ErrorData errorData = 3;
      }
    }
    
    message ErrorData {
      string errorMessage = 2;
      repeated Route failedRoutes = 3;
      optional PipeRequest errorRequest = 4;
    }
    
    message PipeResponse {
      bool success = 1;
      optional ErrorData errorData = 2;
    }
    
    message PipeStream {
      PipeRequest request = 1;
      repeated PipeResponse pipeReplies = 2;
      repeated string streamLogs = 3;
    }
    
    service PipelineService {
      rpc forward(PipeStream) returns (google.protobuf.Empty);
      rpc getOutput(PipeRequest) returns (OutputResponse);
    }
        
  2. Verify/Create protobuf-models/build.gradle.kts:

    nano protobuf-models/build.gradle.kts
    
    
    plugins {
        `java-library`
        alias(libs.plugins.protobuf)
    }

    // The plugins {} block must be the first statement in a build script,
    // so these version lookups come after it.
    val grpcVersion = libs.versions.grpc.get()
    val protobufVersion = libs.versions.protobuf.get()
    
    group = rootProject.group
    version = rootProject.version
    
    repositories {
        mavenCentral()
    }
    
    java {
        toolchain {
            languageVersion.set(JavaLanguageVersion.of(21))
        }
    }
    
    dependencies {
        implementation(platform(project(":bom")))
        testImplementation(platform(project(":bom")))
        testImplementation(mn.mockito.core)
    
        implementation(libs.protobuf.java)
        implementation(libs.grpc.protobuf)
        implementation(libs.grpc.stub)
        implementation(libs.slf4j.api)
        implementation(mn.javax.annotation.api)
        implementation(libs.guava)
        implementation(libs.commons.lang3)
    
        testImplementation(libs.junit.jupiter.api)
        testRuntimeOnly(libs.junit.jupiter.engine)
    }
    
    // Simplified protobuf configuration
    // Inform IDEs like IntelliJ IDEA, Eclipse or NetBeans about the generated code.
    sourceSets {
        main {
            java {
                srcDirs("build/generated/source/proto/main/grpc")
                srcDirs("build/generated/source/proto/main/java")
            }
        }
    }
    
    protobuf {
        protoc {
            artifact = "com.google.protobuf:protoc:$protobufVersion"
        }
        plugins {
            create("grpc") {
                artifact = "io.grpc:protoc-gen-grpc-java:$grpcVersion"
            }
        }
        generateProtoTasks {
            all().forEach { task ->
                task.plugins {
                    create("grpc")
                }
            }
        }
    }
        
  3. Verify sample code & test exist:

    Create ProtobufUtils.java in src/main/java/com/krickert/search/model/

    mkdir -p protobuf-models/src/main/java/com/krickert/search/model
    nano protobuf-models/src/main/java/com/krickert/search/model/ProtobufUtils.java
    
    package com.krickert.search.model;
    
    import com.google.protobuf.ListValue;
    import com.google.protobuf.Message;
    import com.google.protobuf.Timestamp;
    import com.google.protobuf.Value;
    
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.time.Instant;
    import java.util.Collection;
    import java.util.UUID;
    import java.util.concurrent.atomic.AtomicInteger;
    
    import org.apache.commons.lang3.StringUtils;
    
    
    /**
     * Utility class for working with protobuf messages.
     */
    public class ProtobufUtils {
    
        /**
         * Returns the current timestamp as a Timestamp object.
         *
         * @return the current timestamp
         */
        public static Timestamp now() {
            Instant time = Instant.now();
            return Timestamp.newBuilder().setSeconds(time.getEpochSecond())
                    .setNanos(time.getNano()).build();
        }
    
        /**
         * Creates a Timestamp object from the given epoch seconds.
         *
         * @param epochSeconds the number of seconds since January 1, 1970
         * @return a Timestamp object representing the given epoch seconds
         */
        public static Timestamp stamp(long epochSeconds) {
            return Timestamp.newBuilder().setSeconds(epochSeconds)
                    .setNanos(0).build();
        }
    
        /**
         * Saves a Protobuf message to disk.
         *
         * @param dst  The destination file path.
         * @param item The Protobuf message to be saved.
         * @throws IOException If an I/O error occurs while writing to the file.
         */
        public static <T extends Message> void saveProtobufToDisk(String dst, T item) throws IOException {
            try (FileOutputStream fos = new FileOutputStream(dst)) {
                item.writeTo(fos);
            }
        }
    
    
        /**
         * Saves a collection of Protocol Buffer messages to disk.
         *
         * @param dstPrefix The prefix of the destination file path.
         * @param items     The collection of Protocol Buffer messages to be saved.
         * @param <T>     The type of Protocol Buffer message.
         * @throws RuntimeException If an I/O error occurs while saving the messages.
         */
        public static <T extends Message> void saveProtocoBufsToDisk(String dstPrefix, Collection<T> items) {
            int leftPad = (String.valueOf(items.size())).length();
            saveProtocoBufsToDisk(dstPrefix, items, leftPad);
        }
    
        /**
         * Saves a collection of Protocol Buffer messages to disk.
         *
         * @param dstPrefix The prefix of the destination file path.
         * @param items     The collection of Protocol Buffer messages to be saved.
         * @param leftPad   The number of digits used for left padding the index of each saved message in the file name.
         * @param <T>     The type of Protocol Buffer message.
         * @throws RuntimeException If an I/O error occurs while saving the messages.
         */
        public static <T extends Message> void saveProtocoBufsToDisk(String dstPrefix, Collection<T> items, int leftPad) {
            AtomicInteger i = new AtomicInteger();
            items.forEach((item) -> {
                try {
                    saveProtobufToDisk(dstPrefix + StringUtils.leftPad(String.valueOf(i.getAndIncrement()), leftPad, "0") + ".bin", item);
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
            });
        }
    
        /**
         * Creates a UUID key from a given string identifier.
         *
         * @param id The string identifier.
         * @return The UUID key.
         */
        public static UUID createKey(String id) {
            return UUID.nameUUIDFromBytes(id.getBytes());
        }
    
        /**
         * Creates a UUID key from a given PipeDocument object.
         *
         * @param pipeDocument The PipeDocument object to generate the key from.
         * @return The generated UUID key.
         */
        public static UUID createKey(PipeDoc pipeDocument) {
            return createKey(pipeDocument.getId());
        }
    
        /**
         * Creates a ListValue object from a collection of strings.
         *
         * @param collectionToConvert The collection of strings to be converted.
         * @return A ListValue object containing the converted strings.
         */
        public static ListValue createListValueFromCollection(Collection<String> collectionToConvert) {
            ListValue.Builder builder = ListValue.newBuilder();
            collectionToConvert.forEach((obj) -> builder.addValues(Value.newBuilder().setStringValue(obj).build()));
            return builder.build();
        }
    }
        

    Don’t worry about what these utilities do yet; they are sample protocol buffer helper methods that we’ll reuse in other projects. Now let’s see what we are doing by writing some tests…​
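
Two behaviors of these helpers are worth seeing in isolation before testing: createKey uses name-based (type 3) UUIDs, so the same string always produces the same key, and the batch saver left-pads each file's index so that generated names sort in numeric order. A stdlib-only sketch of both ideas (String.format stands in here for Commons Lang's StringUtils.leftPad):

```java
import java.util.UUID;

public class UtilsSketch {
    public static void main(String[] args) {
        // Name-based UUIDs are deterministic: the same input bytes yield the same UUID
        UUID key1 = UUID.nameUUIDFromBytes("doc-id-1".getBytes());
        UUID key2 = UUID.nameUUIDFromBytes("doc-id-1".getBytes());
        System.out.println(key1.equals(key2)); // true

        // Pad width is the digit count of the collection size, as in saveProtocoBufsToDisk
        int itemCount = 12;
        int padWidth = String.valueOf(itemCount).length(); // 2
        String fileName = "doc-" + String.format("%0" + padWidth + "d", 3) + ".bin";
        System.out.println(fileName); // doc-03.bin
    }
}
```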

    Create ProtobufUtilsTest.java in protobuf-models/src/test/java/com/krickert/search/model/ProtobufUtilsTest.java

    mkdir -p protobuf-models/src/test/java/com/krickert/search/model
    nano protobuf-models/src/test/java/com/krickert/search/model/ProtobufUtilsTest.java
    
    package com.krickert.search.model;
    
    import com.google.protobuf.ListValue;
    import com.google.protobuf.Timestamp;
    import com.krickert.search.model.PipeDoc; // Example message
    import org.junit.jupiter.api.Assertions;
    import org.junit.jupiter.api.Test;
    import org.junit.jupiter.api.io.TempDir;
    
    
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.time.Instant;
    import java.util.Arrays;
    import java.util.Collection;
    import java.util.Collections;
    import java.util.List;
    import java.util.UUID;
    
    import static org.junit.jupiter.api.Assertions.*;
    
    class ProtobufUtilsTest {
    
        // --- Timestamp Tests ---
    
        /**
         * Tests the {@link ProtobufUtils#now} method.
         * The method should return a {@link Timestamp} object representing the current time
         * as an Instant object, converted to seconds and nanoseconds.
         */
        @Test
        void testNowReturnsCurrentTimestamp() {
            // Act
            Timestamp timestamp = ProtobufUtils.now();
    
            // Assert
            Instant currentInstant = Instant.now();
            assertNotNull(timestamp);
            assertTrue(timestamp.getSeconds() <= currentInstant.getEpochSecond());
            assertTrue(timestamp.getNanos() >= 0 && timestamp.getNanos() < 1_000_000_000);
            // Allow for a small buffer due to execution time between now() calls
            assertTrue(timestamp.getSeconds() >= (currentInstant.getEpochSecond() - 2), "Timestamp seconds should be very close to current time");
        }
    
        @Test
        void nowIsNowNotThen() throws InterruptedException {
            Timestamp now1 = ProtobufUtils.now();
            Assertions.assertInstanceOf(Timestamp.class, now1);
            Thread.sleep(10); // Sleep briefly
            Timestamp now2 = ProtobufUtils.now();
            Thread.sleep(1001); // Sleep for over a second
            Timestamp now3 = ProtobufUtils.now();
    
            assertTrue(now2.getSeconds() >= now1.getSeconds()); // Could be same second
            // If same second, nanos should generally increase (though not guaranteed if clock resolution is low)
            if (now2.getSeconds() == now1.getSeconds()) {
                 assertTrue(now2.getNanos() >= now1.getNanos());
            }
    
            assertTrue(now3.getSeconds() > now1.getSeconds(), "Timestamp after 1s sleep should have larger seconds value");
            assertTrue(now3.getSeconds() > now2.getSeconds());
    
        }
    
        @Test
        void stamp() {
            long time = System.currentTimeMillis() / 1000; // Current epoch seconds
            Timestamp stamp = ProtobufUtils.stamp(time);
            assertEquals(time, stamp.getSeconds());
            assertEquals(0, stamp.getNanos());
    
            Timestamp stampZero = ProtobufUtils.stamp(0);
            assertEquals(0, stampZero.getSeconds());
            assertEquals(0, stampZero.getNanos());
    
            Timestamp stampNegative = ProtobufUtils.stamp(-1234567890L);
            assertEquals(-1234567890L, stampNegative.getSeconds());
            assertEquals(0, stampNegative.getNanos());
        }
    
        // --- UUID Tests ---
    
        @Test
        void createKeyFromString() {
            String id1 = "test-id-1";
            String id2 = "test-id-2";
            String id1Again = "test-id-1";
    
            UUID key1 = ProtobufUtils.createKey(id1);
            UUID key2 = ProtobufUtils.createKey(id2);
            UUID key1Again = ProtobufUtils.createKey(id1Again);
    
            assertNotNull(key1);
            assertNotNull(key2);
            assertNotNull(key1Again);
    
            assertEquals(key1, key1Again); // Same input string -> same UUID
            assertNotEquals(key1, key2);   // Different input string -> different UUID
    
            // Test empty string
            UUID keyEmpty = ProtobufUtils.createKey("");
            assertNotNull(keyEmpty);
    
            // Test null string - should throw NullPointerException
            assertThrows(NullPointerException.class, () -> {
                //noinspection DataFlowIssue
                ProtobufUtils.createKey((String) null);
            });
        }
    
        @Test
        void createKeyFromPipeDoc() {
            PipeDoc doc1 = PipeDoc.newBuilder().setId("doc-id-1").build();
            PipeDoc doc2 = PipeDoc.newBuilder().setId("doc-id-2").build();
            PipeDoc doc1Again = PipeDoc.newBuilder().setId("doc-id-1").setTitle("Different Title").build(); // ID is the same
            PipeDoc docEmptyId = PipeDoc.newBuilder().setId("").build();
    
            UUID key1 = ProtobufUtils.createKey(doc1);
            UUID key2 = ProtobufUtils.createKey(doc2);
            UUID key1Again = ProtobufUtils.createKey(doc1Again);
            UUID keyEmpty = ProtobufUtils.createKey(docEmptyId);
    
    
            assertNotNull(key1);
            assertNotNull(key2);
            assertNotNull(key1Again);
            assertNotNull(keyEmpty);
    
            assertEquals(key1, key1Again); // Same ID -> same UUID
            assertNotEquals(key1, key2);   // Different ID -> different UUID
    
            // Test null document - should throw NullPointerException
            assertThrows(NullPointerException.class, () -> {
                //noinspection DataFlowIssue
                ProtobufUtils.createKey((PipeDoc) null);
            });
    
             // Test document with null ID - should throw NullPointerException when accessing id
             PipeDoc docNullId = PipeDoc.newBuilder().build(); // ID defaults to "", not null technically
             UUID keyFromDefaultEmptyId = ProtobufUtils.createKey(docNullId);
             assertEquals(keyEmpty, keyFromDefaultEmptyId);
    
    
        }
    
        // --- ListValue Test ---
        @Test
        void createListValueFromCollection() {
            Collection<String> strings = Arrays.asList("hello", "world", "", "another");
            ListValue listValue = ProtobufUtils.createListValueFromCollection(strings);
    
            assertNotNull(listValue);
            assertEquals(4, listValue.getValuesCount());
            assertEquals("hello", listValue.getValues(0).getStringValue());
            assertEquals("world", listValue.getValues(1).getStringValue());
            assertEquals("", listValue.getValues(2).getStringValue());
            assertEquals("another", listValue.getValues(3).getStringValue());
    
            // Test with empty collection
            ListValue emptyListValue = ProtobufUtils.createListValueFromCollection(Collections.emptyList());
            assertNotNull(emptyListValue);
            assertEquals(0, emptyListValue.getValuesCount());
    
             // Test with collection containing null - Protobuf Value doesn't allow null strings directly, check behavior
             // The current implementation would likely throw NPE on addValues(Value.newBuilder().setStringValue(null)...)
             Collection<String> listWithNull = Arrays.asList("a", null, "c");
             assertThrows(NullPointerException.class, ()-> ProtobufUtils.createListValueFromCollection(listWithNull), "setStringValue(null) should throw NPE");
    
    
            // Test with null collection - should throw NullPointerException
            assertThrows(NullPointerException.class, () -> {
                //noinspection DataFlowIssue
                ProtobufUtils.createListValueFromCollection(null);
            });
        }
    
    
        // --- Disk Saving Tests (Requires Temp Directory) ---
    
        @TempDir
        Path tempDir; // JUnit 5 Temp Directory injection
    
        @Test
        void saveProtobufToDisk_Single() throws IOException {
            PipeDoc doc = PipeDoc.newBuilder()
                    .setId("save-test-1")
                    .setTitle("Save Me")
                    .build();
            Path filePath = tempDir.resolve("single_doc.bin");
            String dst = filePath.toString();
    
            ProtobufUtils.saveProtobufToDisk(dst, doc);
    
            // Verify file exists
            assertTrue(Files.exists(filePath));
            assertTrue(Files.size(filePath) > 0);
    
            // Verify content can be parsed back
            try (FileInputStream fis = new FileInputStream(dst)) {
                PipeDoc readDoc = PipeDoc.parseFrom(fis);
                assertEquals(doc, readDoc);
            }
        }
    
        @Test
        void saveProtobufToDisk_Error() {
            PipeDoc doc = PipeDoc.newBuilder().setId("error-test").build();
            String invalidPath = tempDir.resolve("non_existent_dir/file.bin").toString(); // Invalid directory
    
            // Expect an IOException because the parent directory does not exist
            assertThrows(IOException.class, () -> ProtobufUtils.saveProtobufToDisk(invalidPath, doc));
        }
    
    
        @Test
        void saveProtocoBufsToDisk_Multiple_DefaultPadding() throws IOException {
            PipeDoc doc1 = PipeDoc.newBuilder().setId("multi-1").build();
            PipeDoc doc2 = PipeDoc.newBuilder().setId("multi-2").build();
            List<PipeDoc> docs = Arrays.asList(doc1, doc2);
            String prefix = tempDir.resolve("multi_default_").toString();
    
            ProtobufUtils.saveProtocoBufsToDisk(prefix, docs);
    
            // Check files (padding based on size=2 -> 1 digit)
            Path path1 = tempDir.resolve("multi_default_0.bin");
            Path path2 = tempDir.resolve("multi_default_1.bin");
    
            assertTrue(Files.exists(path1));
            assertTrue(Files.exists(path2));
            assertEquals(doc1, PipeDoc.parseFrom(Files.readAllBytes(path1)));
            assertEquals(doc2, PipeDoc.parseFrom(Files.readAllBytes(path2)));
        }
    
        @Test
        void saveProtocoBufsToDisk_Multiple_CustomPadding() throws IOException {
            PipeDoc doc1 = PipeDoc.newBuilder().setId("multi-pad-1").build();
            PipeDoc doc2 = PipeDoc.newBuilder().setId("multi-pad-2").build();
            PipeDoc doc11 = PipeDoc.newBuilder().setId("multi-pad-11").build();
            List<PipeDoc> docs = Arrays.asList(doc1, doc2, doc11); // Size 3
            String prefix = tempDir.resolve("multi_pad_").toString();
            int leftPad = 3; // Custom padding
    
            ProtobufUtils.saveProtocoBufsToDisk(prefix, docs, leftPad);
    
            // Check files with custom padding
            Path path1 = tempDir.resolve("multi_pad_000.bin");
            Path path2 = tempDir.resolve("multi_pad_001.bin");
            Path path3 = tempDir.resolve("multi_pad_002.bin"); // Index 2 for 3rd item
    
            assertTrue(Files.exists(path1));
            assertTrue(Files.exists(path2));
            assertTrue(Files.exists(path3));
            assertEquals(doc1, PipeDoc.parseFrom(Files.readAllBytes(path1)));
            assertEquals(doc2, PipeDoc.parseFrom(Files.readAllBytes(path2)));
            assertEquals(doc11, PipeDoc.parseFrom(Files.readAllBytes(path3))); // Check 3rd item
        }
    
         @Test
        void saveProtocoBufsToDisk_EmptyList() throws IOException {
            List<PipeDoc> docs = Collections.emptyList();
            String prefix = tempDir.resolve("multi_empty_").toString();
    
            // Should not throw error and not create any files
            ProtobufUtils.saveProtocoBufsToDisk(prefix, docs);
    
            // Verify no files with the prefix were created
            @SuppressWarnings("resource")
            List<Path> files = Files.list(tempDir)
                                    .filter(p -> p.getFileName().toString().startsWith("multi_empty_"))
                                    .toList();
            assertTrue(files.isEmpty());
        }
    
    }
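The determinism these tests assert (the same ID always yields the same UUID, and different IDs yield different UUIDs) is exactly the behavior of a name-based UUID. A minimal JDK-only sketch, assuming `createKey` delegates to `UUID.nameUUIDFromBytes` (an assumption for illustration; the real helper lives in `ProtobufUtils`):

```java
import java.nio.charset.StandardCharsets;
import java.util.Objects;
import java.util.UUID;

public class KeyDemo {
    // Hypothetical stand-in for ProtobufUtils.createKey(String): a name-based
    // (version 3) UUID, so equal inputs always map to equal keys.
    static UUID createKey(String id) {
        Objects.requireNonNull(id, "id must not be null"); // mirrors the NPE the test expects
        return UUID.nameUUIDFromBytes(id.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        System.out.println(createKey("doc-id-1").equals(createKey("doc-id-1"))); // true
        System.out.println(createKey("doc-id-1").equals(createKey("doc-id-2"))); // false
    }
}
```

Because the key is derived purely from the ID, it is stable across JVM restarts, which is what makes it usable as a Kafka record key later in the pipeline.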
        

  4. Test & Build (CLI):

    ./gradlew :protobuf-models:test
    ./gradlew :protobuf-models:build
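The padded-filename behavior exercised by the `saveProtocoBufsToDisk` tests can be sketched with plain JDK formatting. This is a reconstruction of the naming scheme the assertions imply (default pad width = digit count of the list size), not the actual implementation:

```java
public class PadDemo {
    // Reconstructed naming scheme: <prefix><zero-padded index>.bin
    static String fileName(String prefix, int index, int leftPad) {
        return prefix + String.format("%0" + leftPad + "d", index) + ".bin";
    }

    // The default pad width appears to be the number of digits in the list size
    // (size 2 -> 1 digit, matching multi_default_0.bin in the test above).
    static int defaultPad(int size) {
        return String.valueOf(size).length();
    }

    public static void main(String[] args) {
        System.out.println(fileName("multi_default_", 0, defaultPad(2))); // multi_default_0.bin
        System.out.println(fileName("multi_pad_", 2, 3));                 // multi_pad_002.bin
    }
}
```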

6b. Utility Library (util)

  1. Verify Java code exists: (Ensure files exist in util/src/main/java/)

  2. Verify/Create util/build.gradle.kts:

    nano util/build.gradle.kts
    
    // File: util/build.gradle.kts
    plugins {
        `java-library`
    }
    
    group = rootProject.group
    version = rootProject.version
    
    dependencies {
        implementation(platform(project(":bom")))
        testImplementation(platform(project(":bom")))
        implementation(mn.protobuf.java.util)
        api(libs.guava) // Expose Guava via API
    
        // Testing dependencies
        testImplementation(libs.junit.jupiter.api)
        testRuntimeOnly(libs.junit.jupiter.engine)
    }
        
  3. Build (CLI):

    ./gradlew :util:build
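The `api` vs `implementation` split in the listing above is worth internalizing: `api(libs.guava)` means Guava types may appear in util's public signatures, so consumers of util get Guava on their compile classpath; `implementation` keeps a dependency internal. A hypothetical utility class illustrating the distinction (JDK-only so it stays self-contained; imagine Guava's `ImmutableList` in place of `List` to see why Guava must be `api` here):

```java
import java.util.Arrays;
import java.util.List;

public class StringsUtil {
    // If this returned com.google.common.collect.ImmutableList, Guava would
    // leak into the public API and would have to be declared with api(...).
    // A JDK-only signature would let the dependency stay implementation(...).
    public static List<String> tokens(String s) {
        return Arrays.asList(s.trim().split("\\s+"));
    }
}
```

Keeping as much as possible on `implementation` shrinks consumers' compile classpaths and speeds up incremental builds, since internal dependency changes do not force downstream recompilation.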

6c. Core Library (pipeline-service-core)

  1. Verify Java code exists: (Ensure files exist in pipeline-service-core/src/main/java/)

  2. Verify/Create pipeline-service-core/build.gradle.kts:

    nano pipeline-service-core/build.gradle.kts
    
    // File: pipeline-service-core/build.gradle.kts
    plugins {
        `java-library`
        `maven-publish`
        alias(libs.plugins.micronaut.library)
    }
    
    group = rootProject.group
    version = rootProject.version
    
    java {
        withJavadocJar()
        withSourcesJar()
        sourceCompatibility = JavaVersion.VERSION_21
        targetCompatibility = JavaVersion.VERSION_21
    }
    
    micronaut {
        version("4.8.2")
        processing {
            incremental(true)
            annotations("com.krickert.search.pipeline.*")
        }
    }
    
    dependencies {
        implementation(platform(project(":bom")))
        annotationProcessor(platform(project(":bom")))
        testImplementation(platform(project(":bom")))
        testAnnotationProcessor(platform(project(":bom")))
    
        // Micronaut dependencies using mn catalog
        annotationProcessor(mn.micronaut.inject.java)
        annotationProcessor(mn.lombok)
        annotationProcessor(mn.micronaut.validation)
    
        // API dependencies - these are exposed to consumers of the library
        api(mn.micronaut.inject)
        api(mn.micronaut.serde.api)
        api(mn.micronaut.serde.jackson)
        api(mn.micronaut.runtime)
        api(mn.micronaut.validation)
        api(mn.micronaut.grpc.server.runtime)
        api(mn.micronaut.grpc.annotation)
        api(mn.micronaut.kafka)
    
    
        // Project dependencies
        api(project(":protobuf-models"))
        api(project(":util"))
    
        // Implementation dependencies - these are not exposed to consumers
        implementation(libs.slf4j.api)
        compileOnly(mn.lombok)
    
        // Testing dependencies
        testImplementation(mn.micronaut.test.junit5)
        testAnnotationProcessor(mn.micronaut.inject.java)
    }
    
    // Publishing configuration
    publishing {
        publications {
            create("mavenJava") {
                from(components["java"])
    
                pom {
                    name.set("Pipeline Service Core")
                    description.set("Core library for pipeline service implementation")
    
                    licenses {
                        license {
                            name.set("The Apache License, Version 2.0")
                            url.set("http://www.apache.org/licenses/LICENSE-2.0.txt")
                        }
                    }
                }
            }
        }
    }
        
  3. Build (CLI):

    ./gradlew :pipeline-service-core:build

6d. Test Utilities (pipeline-service-test-utils)

  1. Verify Java code exists: (Ensure files exist in pipeline-service-test-utils/src/main/java/)

  2. Verify/Create pipeline-service-test-utils/build.gradle.kts:

    nano pipeline-service-test-utils/build.gradle.kts
    
    // File: pipeline-service-test-utils/build.gradle.kts
    plugins {
        `java-library`
    }
    
    group = rootProject.group
    version = rootProject.version
    
    dependencies {
        implementation(platform(project(":bom")))
    
        // Depend on core library
        api(project(":pipeline-service-core"))
    
        // Include testing libraries using mn catalog
        api(mn.micronaut.test.junit5)
    
        // May depend on other utils
        api(project(":util"))
    }
        
  3. Build (CLI):

    ./gradlew :pipeline-service-test-utils:build

6e. Micronaut Application (pipeline-instance-A)

  1. Verify Java code exists: (Ensure Application.java and other files exist in pipeline-instance-A/src/main/java/…​)

  2. Verify/Create pipeline-instance-A/build.gradle.kts:

    nano pipeline-instance-A/build.gradle.kts
    
    // pipeline-instance-A/build.gradle.kts
    plugins {
        id("java")
        alias(libs.plugins.micronaut.application)
        // id("com.google.cloud.tools.jib") version "..."
    }
    
    group = rootProject.group
    version = rootProject.version
    
    // Repositories inherited from root project
    // Java toolchain inherited from root project
    
    micronaut {
        runtime("netty")
        testRuntime("junit5")
        processing {
            incremental(true)
            annotations("com.krickert.search.pipeline.instanceA.*")
        }
        mainClass("com.krickert.search.pipeline.instanceA.Application")
    }
    
    dependencies {
        implementation(platform(project(":bom")))
        annotationProcessor(platform(project(":bom")))
        testImplementation(platform(project(":bom")))
        testAnnotationProcessor(platform(project(":bom")))
    
        // Micronaut dependencies
        annotationProcessor(mn.micronaut.inject.java)
        implementation(mn.micronaut.inject)
        implementation(mn.micronaut.runtime)
        implementation(mn.micronaut.http.server.netty)
        implementation(mn.micronaut.http.client)
        implementation(mn.micronaut.jackson.databind)
    
        // Project dependencies
        implementation(project(":pipeline-service-core"))
    
        // Logging implementation
        runtimeOnly(libs.slf4j.simple)
    
        // Testing
        testImplementation(mn.micronaut.test.junit5) // Micronaut test pulls in JUnit
        // testImplementation(libs.bundles.testing.jvm) // Removed bundle
        testImplementation(project(":pipeline-service-test-utils"))
        testAnnotationProcessor(mn.micronaut.inject.java)
    }
    
    application {
        mainClass.set(micronaut.mainClass.get())
    }
    
    // Test task configuration inherited from root project
        
  3. Build (CLI):

    ./gradlew :pipeline-instance-A:build
  4. Run (CLI):

    ./gradlew :pipeline-instance-A:run
  5. Build Docker Image (Optional - CLI):

    ./gradlew :pipeline-instance-A:dockerBuild
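The `mainClass(...)` setting in the build file points at a class whose source is not shown in this part. A Micronaut entry point conventionally looks like the following sketch (hypothetical; it assumes the package named in the build file and requires the Micronaut runtime on the classpath, which the `micronaut-application` plugin provides):

```java
package com.krickert.search.pipeline.instanceA;

import io.micronaut.runtime.Micronaut;

// Hypothetical Application class matching mainClass in build.gradle.kts.
public class Application {
    public static void main(String[] args) {
        // Boots the DI context and, with runtime("netty"), the embedded HTTP server
        Micronaut.run(Application.class, args);
    }
}
```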

Step 7: Common Build Commands (CLI)

Here are commands run from the root directory (your-monorepo) that affect the whole project:

  • Clean All Build Outputs:

    ./gradlew clean
  • Build Everything (Compile, Test, Assemble):

    ./gradlew build
  • Run All Tests:

    ./gradlew test
  • Build without Running Tests:

    ./gradlew assemble
  • Publish All Publishable Artifacts to Maven Local:

    ./gradlew publishToMavenLocal
  • List Project Dependencies: (Useful for debugging)

    ./gradlew :pipeline-instance-A:dependencies

Step 8: Conditional CI/CD (Conceptual)

As discussed previously, the goal for CI/CD is to build/test/deploy only what changed. This typically involves:

  1. Detecting Changes: Using git diff, Nx, or a Gradle plugin.

  2. Identifying Affected Projects: Including downstream dependents.

  3. Running Tasks Selectively:

    • Using specific project paths: ./gradlew :pipeline-service-core:build :pipeline-instance-A:build

    • Using built-in tasks: ./gradlew :pipeline-service-core:buildDependents (builds core plus every project that depends on it, such as pipeline-instance-A)

    • Using tools: nx affected -t build

Implementing this requires additional scripting or tooling setup in your CI environment (e.g., GitHub Actions, GitLab CI).

Step 9: Versioning and Releasing (Conceptual)

Choose a versioning strategy (Unified or Independent). Use a release plugin for automation.

  • If using gradle-release (Unified Versioning Example - CLI):

    # Ensure gradle.properties has the version, e.g., version=1.0.0-SNAPSHOT
    # Run the release task (interactive)
    ./gradlew release

    The task prompts for the release version (e.g., 1.0.0) and the next snapshot version (e.g., 1.0.1-SNAPSHOT), then commits the version change, tags the release, runs the configured build tasks (such as publish), and finally commits the next snapshot version.

  • Independent Versioning: Requires more sophisticated tooling or scripting integrated with your change detection mechanism to version and release only affected modules.

Conclusion: Building for the Future

This step-by-step guide provides a practical path to setting up a well-structured, maintainable multi-project build using Gradle Kotlin DSL, a custom BOM, and modern dependency management techniques. Remember to adapt the specific configurations and commands to your exact project needs.