Core Concepts

All NCI concepts defined in plain English with analogies. Organized by session.


Session 1

Resonance Node

The fundamental unit of knowledge in NCI. Each node has a signature (fingerprint) and a threshold (sensitivity). When an input arrives, the node checks how similar it is to its fingerprint. If similar enough, it fires. If not, it stays completely silent.

Analogy

A tuning fork. It only hums when the right frequency is played near it. Everything else leaves it silent.
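A minimal sketch of the idea in numpy. The class name, field names, and the 0.75 threshold are illustrative assumptions, not NCI's actual API — only the behaviour (cosine check against a stored fingerprint, fire or stay silent) comes from the description above.

```python
import numpy as np

class ResonanceNode:
    """Toy resonance node: a 64-number fingerprint plus a sensitivity."""

    def __init__(self, signature, threshold=0.75):
        # Store the signature at unit length so the dot product below
        # is exactly cosine similarity.
        sig = np.asarray(signature, dtype=np.float64)
        self.signature = sig / np.linalg.norm(sig)
        self.threshold = threshold

    def resonate(self, query):
        # Cosine similarity in [-1.0, 1.0]: how alike query and fingerprint are.
        q = np.asarray(query, dtype=np.float64)
        score = float(self.signature @ (q / np.linalg.norm(q)))
        return score, score >= self.threshold  # fires only if similar enough

rng = np.random.default_rng(0)
sig = rng.standard_normal(64)
node = ResonanceNode(sig)
score, fired = node.resonate(sig + 0.1 * rng.standard_normal(64))  # near match
_, silent = node.resonate(rng.standard_normal(64))                 # unrelated input
```

A near match resonates strongly and fires; an unrelated input scores near zero and leaves the node silent — the tuning fork only hums for its own frequency.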

Signature

A 64-number vector that acts as a fingerprint for a concept. Similar concepts produce similar signatures. This is how the system finds related knowledge quickly.

Analogy

A person's face. You don't need their entire life history to recognize them — just enough distinguishing features.

Resonance

A score between -1.0 and 1.0 measuring how similar an input is to a node's signature. Computed using cosine similarity. Score ≥ threshold = node fires.

Threshold

Each node's personal sensitivity setting: how similar an input must be before the node fires. Thresholds are tuned so that a typical input activates only 1–4% of nodes — brain-like sparsity.

Analogy

Deciding whether to answer the door. High threshold = only answer for a loud knock. Right threshold = answer when someone's actually there.

Sparsity

Only 1–4% of nodes activate for any given input. The rest stay silent and cost nothing. This is where NCI's hardware efficiency comes from.

Analogy

A city at night. Most windows are dark. Only the relevant buildings are lit up.
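A quick numeric check of the sparsity claim. The 5,000 random signatures and the 0.25 cut-off are illustrative stand-ins for tuned per-node thresholds; with random 64-dim unit vectors, a cut-off around 0.25 lands in the 1–4% range.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, dim = 5000, 64

# Random unit-length signatures standing in for a knowledge landscape.
sigs = rng.standard_normal((n_nodes, dim))
sigs /= np.linalg.norm(sigs, axis=1, keepdims=True)

query = rng.standard_normal(dim)
query /= np.linalg.norm(query)

scores = sigs @ query      # cosine similarity against every node at once
fired = scores >= 0.25     # boolean mask of nodes that resonate
sparsity = fired.mean()    # fraction of the landscape that lit up
```

Most windows stay dark: the overwhelming majority of nodes score near zero and cost nothing beyond this one vectorized pass.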

Compression Index

The master lookup system. Groups similar nodes into buckets using locality-sensitive hashing (LSH). A search only checks the relevant bucket — less than 1% of all nodes.

Analogy

A library card catalog. You don't read every book — you check the catalog, go to the right section, open the right page.
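One plausible way to build such buckets is signed-random-projection LSH; the sketch below uses that scheme, though NCI's actual bucketing may differ. Each of 8 random hyperplanes contributes one bit, giving up to 256 buckets.

```python
import numpy as np

rng = np.random.default_rng(2)
dim, n_planes = 64, 8
planes = rng.standard_normal((n_planes, dim))   # 8 planes -> up to 256 buckets

def bucket_key(signature):
    # One bit per hyperplane: which side the signature falls on. Similar
    # signatures tend to land on the same sides, hence the same bucket.
    bits = (planes @ signature) >= 0
    return int(np.packbits(bits, bitorder="little")[0])

# Build the "card catalog": bucket key -> list of node ids.
index = {}
for i in range(1000):
    sig = rng.standard_normal(dim)
    index.setdefault(bucket_key(sig), []).append(i)

probe = rng.standard_normal(dim)
candidates = index.get(bucket_key(probe), [])   # only this bucket is searched
```

A query is hashed once and compared only against its bucket's occupants — checking the catalog instead of reading every book. Note the key depends only on direction, so scaling a signature never changes its bucket.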


Session 2

Concept Cluster

A group of related ResonanceNodes that activate together. Built automatically using k-means. When one fires, it pulls its neighbors along.

Analogy

When someone says "home networking" — router, firewall, DNS, subnet all light up together. Same section of the library.
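A self-contained sketch of the cluster-building step with a plain two-centroid k-means pass (numpy only, no sklearn). The two synthetic "topics", the iteration budget, and the farthest-point initialization are all illustrative choices, not NCI's tuned pipeline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two synthetic topics: 50 signatures scattered around each of two anchors.
anchors = rng.standard_normal((2, 64)) * 5
sigs = np.vstack([a + rng.standard_normal((50, 64)) for a in anchors])

def two_means(points, iters=20):
    # Seed one centroid at the first point and the other at the point
    # farthest from it, avoiding the empty-cluster pitfall in a tiny demo.
    far = points[np.linalg.norm(points - points[0], axis=1).argmax()]
    centroids = np.stack([points[0], far])
    for _ in range(iters):
        # Assign each signature to its nearest centroid...
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # ...then move each centroid to the mean of its members.
        centroids = np.stack([points[labels == c].mean(axis=0) for c in (0, 1)])
    return labels, centroids

labels, centroids = two_means(sigs)
```

Signatures from the same topic end up sharing a label — the same section of the library — without anyone hand-assigning them.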

Context Boost

Active clusters temporarily lower member thresholds. Related knowledge becomes easier to activate.

Analogy

You've been studying Spanish all week. Someone says "banco" and you instantly think "bank" — not a park bench. Your Spanish cluster is context-boosted.
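The mechanism can be sketched in a few lines. The class names, the two-method activate/deactivate shape, and the 0.10 boost amount are assumptions about how NCI wires this up; the behaviour — active clusters temporarily lowering member thresholds — is from the description above.

```python
class Node:
    def __init__(self, threshold):
        self.base_threshold = threshold   # permanent sensitivity
        self.threshold = threshold        # effective sensitivity right now

class Cluster:
    def __init__(self, members, boost=0.10):
        self.members = members
        self.boost = boost

    def activate(self):
        # While the cluster is hot, related knowledge fires more easily.
        for node in self.members:
            node.threshold = node.base_threshold - self.boost

    def deactivate(self):
        # The boost is temporary: thresholds snap back afterwards.
        for node in self.members:
            node.threshold = node.base_threshold

spanish = Cluster([Node(0.75), Node(0.80)])
spanish.activate()
```

After a week of Spanish study, "banco" clears the lowered bar easily; once the cluster cools down, deactivate() restores the original thresholds.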

Centroid

The average signature of all nodes in a cluster. Used as a fast relevance check — compare query against centroids instead of all nodes.

Centroid-first search finds the most relevant cluster, then checks ALL of its member nodes. This breaks through the bucket ceiling to achieve brain-like sparsity.
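The two-step search can be sketched as follows. The ten clusters of a hundred synthetic nodes each are made up for the demo; the shape of the search — a cheap centroid scan, then a full check of only the winning cluster — is from the description above.

```python
import numpy as np

rng = np.random.default_rng(4)

def unit(v):
    return v / np.linalg.norm(v)

# Ten clusters of 100 nodes each: 1,000 signatures total.
clusters = []  # list of (unit centroid, member signatures)
for _ in range(10):
    anchor = rng.standard_normal(64) * 5
    members = anchor + rng.standard_normal((100, 64))
    clusters.append((unit(members.mean(axis=0)), members))

query = unit(clusters[3][1][0])   # an input close to cluster 3's nodes

# Step 1: cheap pass over 10 centroids instead of 1,000 node signatures.
best = int(np.argmax([c @ query for c, _ in clusters]))

# Step 2: full resonance check on the winning cluster's members only.
scores = np.array([unit(m) @ query for m in clusters[best][1]])
```

Ten centroid comparisons plus one hundred member comparisons replace a thousand — and every member of the relevant cluster still gets a full look.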


Session 3

Knowledge Landscape

The top-level brain object. Owns all nodes, clusters, connections, and metadata. The skull that holds everything together.

Consolidation

A score from 0.0 (fresh) to 1.0 (crystallized). Reinforced nodes consolidate — threshold drops, they resist decay. Unused nodes fade.

Analogy

Learning to drive. Day 1: conscious of every action. Year 10: automatic. The knowledge crystallized.
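A sketch of one possible update rule. The step sizes (0.1 per reinforcement, 0.01 per idle tick) and the way consolidation discounts the threshold are illustrative, not the project's tuned values; the direction of each effect is from the description above.

```python
class Node:
    def __init__(self):
        self.consolidation = 0.0   # 0.0 fresh -> 1.0 crystallized
        self.threshold = 0.80

    def reinforce(self):
        # Repeated use pushes the score toward crystallized...
        self.consolidation = min(1.0, self.consolidation + 0.1)
        # ...and crystallized knowledge fires more easily.
        self.threshold = 0.80 - 0.2 * self.consolidation

    def decay_tick(self):
        # Idle time lets the score drift back toward fresh. Note the node
        # itself is never deleted by decay.
        self.consolidation = max(0.0, self.consolidation - 0.01)

driver = Node()
for _ in range(5):
    driver.reinforce()   # five sessions behind the wheel
```

After five reinforcements the node is halfway crystallized and its threshold has dropped — year-ten driving takes less of a nudge to fire than day one.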

.nci File Format

Custom binary format for saving/loading a complete Knowledge Landscape. Magic bytes "NCI!", packed numpy arrays for speed, JSON for cluster metadata.

Analogy

Freezing a brain and thawing it later with all memories intact.
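A hedged mock-up of a container in this spirit: magic bytes, a packed numpy array, and length-prefixed JSON metadata. The actual .nci field layout is not documented here, so the record order and the little-endian length prefixes below are assumptions.

```python
import io
import json
import struct

import numpy as np

def save_nci(buf, signatures, meta):
    buf.write(b"NCI!")                                # magic bytes
    buf.write(struct.pack("<II", *signatures.shape))  # rows, cols
    buf.write(signatures.astype(np.float32).tobytes())  # packed signatures
    blob = json.dumps(meta).encode()
    buf.write(struct.pack("<I", len(blob)))
    buf.write(blob)                                   # cluster metadata as JSON

def load_nci(buf):
    assert buf.read(4) == b"NCI!", "not an NCI file"
    rows, cols = struct.unpack("<II", buf.read(8))
    sigs = np.frombuffer(buf.read(rows * cols * 4), np.float32)
    meta = json.loads(buf.read(struct.unpack("<I", buf.read(4))[0]))
    return sigs.reshape(rows, cols), meta

buf = io.BytesIO()
save_nci(buf, np.ones((3, 64), np.float32), {"clusters": ["networking"]})
buf.seek(0)
sigs, meta = load_nci(buf)
```

Freeze, thaw, and everything round-trips: the array comes back with the same shape and values, the metadata with the same structure.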

Inter-Cluster Connections

Clusters connect to their most similar neighbors automatically. Creates pathways: networking → security → linux.
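The wiring step can be sketched as a nearest-centroid pass. The three cluster names echo the pathway above, but the random centroids and the "link to single most similar neighbour" rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
names = ["networking", "security", "linux"]
cents = rng.standard_normal((3, 64))
cents /= np.linalg.norm(cents, axis=1, keepdims=True)  # unit centroids

connections = {}
for i, name in enumerate(names):
    sims = cents @ cents[i]        # cosine similarity to every centroid
    sims[i] = -np.inf              # never link a cluster to itself
    connections[name] = names[int(sims.argmax())]
```

Each cluster ends up pointing at its closest neighbour, forming the pathways that let activation travel between related topics.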

Design Principle

Nodes are never removed through decay — decay is priority sorting only. A node is only removed by explicit deprecation, with a logged reason.


Session 4

Visual Brain

A standalone browser visualizer for the Knowledge Landscape. D3.js force-directed graph — nodes sized by fired count, coloured by status, glow on active clusters.


Session 5

Semantic Signatures

Meaning-based signatures that replace the earlier random numbers. Uses sentence-transformers to convert words into 384-dim embeddings, projected down to 64 dimensions.

Analogy

Phone numbers with area codes. Similar concepts share the same "area code."
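The projection step can be sketched as below. A real embedding would come from sentence-transformers (384 dims); here a random vector stands in so the example runs offline, and the fixed random projection matrix is an assumption about how NCI compresses dimensions — random projections approximately preserve angles, so similar embeddings stay similar after the squeeze.

```python
import numpy as np

rng = np.random.default_rng(5)

# Fixed 384 -> 64 projection; scaling by 1/sqrt(64) keeps magnitudes sane.
projection = rng.standard_normal((384, 64)) / np.sqrt(64)

def to_signature(embedding_384):
    sig = embedding_384 @ projection   # 384 dims -> 64-number signature
    return sig / np.linalg.norm(sig)   # unit length for cosine similarity

# Stand-in for a sentence-transformer embedding of a word.
fake_embedding = rng.standard_normal(384)
sig = to_signature(fake_embedding)
```

Two nearby 384-dim embeddings come out as two nearby 64-dim signatures — the "area code" survives the compression.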

Word Embeddings

Vectors where geometry encodes meaning. king − man + woman ≈ queen. That's not magic — it's geometry.
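A toy demonstration of that geometry with hand-built two-dimensional vectors: each word is just (royalty, gender), so the relation holds by construction rather than being learned — real embeddings discover such directions from text.

```python
import numpy as np

# Axes: (royalty, gender). Not learned embeddings, just an illustration.
words = {
    "king":  np.array([1.0,  1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0,  1.0]),
    "woman": np.array([0.0, -1.0]),
}

# king - man + woman: keep royalty, swap the gender direction.
target = words["king"] - words["man"] + words["woman"]
nearest = min(words, key=lambda w: float(np.linalg.norm(words[w] - target)))
```

The arithmetic lands exactly on queen's vector: subtracting "man" removes the gender component, adding "woman" replaces it, and royalty is untouched. Geometry, not magic.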

Concept Learning

Teaching NCI named concepts using real language. brain.learn("firewall") generates a semantic signature and stores the concept label.
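A sketch of the learn() bookkeeping. To keep the example offline, a hash-seeded random vector stands in for the real sentence-transformer embedding — only the label-plus-signature storage mirrors the behaviour described; the Brain class and _embed helper are hypothetical names.

```python
import hashlib

import numpy as np

class Brain:
    def __init__(self):
        self.concepts = {}   # concept label -> 64-number signature

    def _embed(self, word):
        # Deterministic stand-in for a semantic embedding model: the same
        # word always yields the same unit-length 64-dim vector.
        seed = int.from_bytes(hashlib.sha256(word.encode()).digest()[:4], "big")
        v = np.random.default_rng(seed).standard_normal(64)
        return v / np.linalg.norm(v)

    def learn(self, word):
        # Generate a signature and store it under the concept label.
        self.concepts[word] = self._embed(word)
        return self.concepts[word]

brain = Brain()
brain.learn("firewall")
```

After learning, "firewall" exists as a named, addressable fingerprint in the landscape, and re-learning the same word reproduces the same signature.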