B.Nye Job Engine

AI Agent Orchestration Platform

Python · Claude SDK · MCP · React · Playwright · Generative UI + Agentic AI

An enterprise-scale workflow required coordinating 12 data sources, 4 AI models, and 6 output systems with zero manual handoffs. I built an event-driven platform where 5 microservices and 4 specialized agents dispatch in parallel — research, content, outreach, and identification — completing full cycles in under 60 seconds.

20K+ Lines of Code

4 AI Agents

12 Data Sources

25 Test Files

Problem

Running an enterprise-scale workflow across 12 data sources, 4 AI models, and 6 output systems required coordination that no existing tool provided. Each step — research, content generation, outreach, and pipeline tracking — was manual, sequential, and error-prone. The system needed to run autonomously, learn from outcomes, and improve without manual tuning.

Solution

I built an event-driven microservices platform. 5 independent services communicate through an append-only event log with file-level locking. 4 specialized AI agents dispatch in parallel with a 5-second target SLA and typed input/output contracts. A 12-source discovery engine aggregates, deduplicates, and scores results — then feeds outcomes back into scoring weights. The system gets smarter every cycle.

Architecture

┌─────────────────────────────────────────────────┐
│               12-Source Discovery               │
│ Indeed │ Dice │ LinkedIn │ Firecrawl │ ATS │ ...│
└────────────────────┬────────────────────────────┘
                     │
             ┌───────▼───────┐
             │ Scout Scanner │──▶ Event Log (JSONL)
             │ (score+dedup) │          │
             └───────────────┘          │
                                 ┌──────▼───────┐
    ┌────────────────────────────│   Pipeline   │
    │         Parallel           │    State     │
    │      Agent Dispatch        └──────────────┘
    │
    ├─────────┬─────────┬─────────┐
┌───▼──┐ ┌────▼───┐ ┌───▼──┐ ┌────▼────┐
│ Sid  │ │  Judy  │ │Reggie│ │   HM    │
│Rsrch │ │Content │ │Outrch│ │Identify │
└───┬──┘ └────┬───┘ └───┬──┘ └────┬────┘
    │         │         │         │
    └─────────┼─────────┴─────────┘
              │
       ┌──────▼──────┐    ┌──────────────┐
       │ Gmail       │    │ Decay        │
       │ Sentinel    │    │ Alerter      │
       └──────┬──────┘    └───────┬──────┘
              │                   │
              └────────┬──────────┘
                       ▼
                Morning Brief
             (daily priority queue)

System Components

Service · Scout Scanner: 12-source parallel search with ecosystem gating and 9-criteria scoring

Service · Gmail Sentinel: Real-time email monitoring with NLP classification and pipeline matching

Service · Morning Brief: Daily aggregator — priority queue generation from overnight events

Service · Decay Alerter: Pipeline stagnation detection — ghosted, dormant, stale classifications

Service · Remote Control: HTTP/WebSocket server for webhook integration and service management

AI Agent · Sid (Research Agent): Sonnet — company deep-dive, JD analysis, comp benchmarking

AI Agent · Judy (Content Agent): Opus — cover letter generation, voice-validated output

AI Agent · Reggie (Outreach Agent): Sonnet — contact research, outreach drafting, follow-up cadence

AI Agent · HM Researcher: Sonnet — hiring manager identification, org chart mapping

Data Store · Event Log: Append-only JSONL with fcntl file locking — 8 event types

Data Store · Pipeline State: JSON with lifecycle stages, trajectory signals, outcome tracking

Key Engineering

Event-Driven Microservices

5 independent services communicate through an append-only event log. File-level locking ensures safe concurrent writes. Each service reads events it cares about and writes its own.

# Append-only event log with fcntl file locking
import fcntl
import json
from datetime import datetime, timezone
from uuid import uuid4

class EventLog:
    def __init__(self, path: str):
        self.path = path

    def append(self, event_type: str, payload: dict):
        event = {
            "id": uuid4().hex,
            "type": event_type,  # NEW_OPPORTUNITY | EMAIL_CLASSIFIED | STATUS_CHANGED
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "payload": payload,
        }
        with open(self.path, "a") as f:
            fcntl.flock(f, fcntl.LOCK_EX)  # exclusive lock: safe concurrent appends
            try:
                f.write(json.dumps(event) + "\n")
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)
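The consumer side is symmetric. A sketch of how a service might replay the log and keep only the event types it subscribes to — `read_events` and the subscription set are illustrative, assuming the JSONL event shape above.

```python
# Companion reader sketch: replay the log, keep only subscribed event types.
import json


def read_events(path: str, subscribed: set[str]) -> list[dict]:
    events = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # tolerate a trailing blank line
            event = json.loads(line)
            if event["type"] in subscribed:
                events.append(event)
    return events
```

Because the log is append-only, a service can also remember its last byte offset and resume from there instead of replaying from the top.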

4 Parallel AI Agents

Research, content, outreach, and identification agents dispatch simultaneously with a 5-second target SLA. Each agent has typed inputs/outputs and domain-specific tools via MCP.
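A minimal sketch of that dispatch pattern with `asyncio` — the dataclass contracts, the stub `run_agent`, and `dispatch_all` are illustrative stand-ins, not the production agents.

```python
# Parallel agent dispatch with typed contracts and a per-agent timeout budget.
import asyncio
from dataclasses import dataclass


@dataclass
class AgentTask:           # typed input contract
    agent: str
    opportunity_id: str


@dataclass
class AgentResult:         # typed output contract
    agent: str
    ok: bool
    detail: str


async def run_agent(task: AgentTask) -> AgentResult:
    # Stand-in for a real MCP-backed agent call
    await asyncio.sleep(0.01)
    return AgentResult(agent=task.agent, ok=True, detail="done")


async def dispatch_all(opportunity_id: str, sla: float = 5.0) -> list[AgentResult]:
    tasks = [AgentTask(a, opportunity_id) for a in ("sid", "judy", "reggie", "hm")]
    # Fire all four agents at once; an agent that blows the SLA surfaces
    # as a failed result instead of stalling the whole cycle.
    results = await asyncio.gather(
        *(asyncio.wait_for(run_agent(t), timeout=sla) for t in tasks),
        return_exceptions=True,
    )
    return [
        r if isinstance(r, AgentResult)
        else AgentResult(agent=t.agent, ok=False, detail=str(r))
        for t, r in zip(tasks, results)
    ]
```

Usage: `asyncio.run(dispatch_all("opp-123"))` returns four results in dispatch order, with timeouts downgraded to failed results.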

12-Source Discovery Engine

Aggregates results from Indeed, Dice, LinkedIn, Firecrawl, ATS boards, career pages, industry boards, and more. Deduplicates across sources and applies learned scoring modifiers from outcome feedback.
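A sketch of the aggregate–dedup–score pass, assuming a (company, title) dedup key and per-source weight modifiers; the field names, `aggregate` helper, and weights are hypothetical.

```python
# Illustrative cross-source dedup plus learned per-source score modifiers.
def dedupe_key(posting: dict) -> tuple:
    # Same role at the same company counts as one opportunity,
    # regardless of which source surfaced it.
    return (posting["company"].lower(), posting["title"].lower())


def aggregate(postings: list[dict], weights: dict[str, float]) -> list[dict]:
    seen: dict[tuple, dict] = {}
    for p in postings:
        key = dedupe_key(p)
        # Keep the copy with the richer base score
        if key not in seen or p["base_score"] > seen[key]["base_score"]:
            seen[key] = p
    scored = []
    for p in seen.values():
        # Learned modifier: sources that historically converted get a boost
        p = {**p, "score": p["base_score"] * weights.get(p["source"], 1.0)}
        scored.append(p)
    return sorted(scored, key=lambda p: p["score"], reverse=True)
```

Duplicates collapse before scoring so a posting syndicated to five boards doesn't crowd the queue five times.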

Learned Optimization

Outcome signals (which approaches converted, which were ghosted) feed back into scoring weights. The system gets smarter with every cycle — no manual tuning required.
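One way such a feedback loop can be sketched is an exponential-moving-average update per source; the `apply_outcomes` helper, targets, and learning rate here are hypothetical, not the system's actual rule.

```python
# Hypothetical outcome feedback: nudge each source's scoring weight toward
# a reward target when it converts and a penalty target when it ghosts.
def apply_outcomes(weights: dict[str, float],
                   outcomes: list[tuple[str, bool]],
                   lr: float = 0.1) -> dict[str, float]:
    # outcomes: (source, converted?) pairs from closed pipeline entries
    updated = dict(weights)
    for source, converted in outcomes:
        w = updated.get(source, 1.0)           # unseen sources start neutral
        target = 1.5 if converted else 0.5     # reward conversion, punish ghosting
        updated[source] = w + lr * (target - w)
    return updated
```

With a small learning rate the weights drift gradually, so one lucky conversion doesn't dominate the next cycle's scoring.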