The Next Algorithms

Knowledge Compression & Adaptive Complexity

"AGI is just a few more algorithms away..."

Two fundamental ideas that could transform how AI understands and interacts with the world

Idea 1: Knowledge Compression

Complex patterns can be represented with minimal information when we understand their underlying structure.

Example: 1000 equilateral triangles on a screen.

The same visual information can be stored as either 6,000 coordinate values or ~10 parameters that describe the pattern.

Traditional Storage

Store every vertex position

6000 numbers

(1000 triangles × 3 vertices × 2 coordinates)

• Large storage requirement
• No pattern understanding

Compressed Representation

Store pattern parameters

~10 numbers
  • Triangle size
  • Count
  • Distribution randomness
  • Average spacing
  • Color parameters

• Same visual result
• 600:1 compression ratio
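The compressed representation above can be sketched as a small parameter dictionary plus a decompression step that regenerates explicit geometry. This is an illustrative sketch, not a real API: the parameter names, the grid-plus-jitter placement rule, and the seed-based reproducibility are all assumptions made for the example.

```python
import math
import random

# Hypothetical parameter set for the triangle pattern (names are illustrative):
# instead of 6,000 raw coordinates, we keep the handful of numbers needed to
# regenerate a statistically identical scene.
pattern = {
    "count": 1000,    # number of triangles
    "size": 15.3,     # side length of each equilateral triangle
    "jitter": 0.2,    # placement randomness (std dev, as a fraction of spacing)
    "spacing": 25.0,  # average distance between triangle centers
    "seed": 42,       # reproduces the exact same layout on demand
}

def decompress(p):
    """Expand the compact pattern description back into explicit vertices."""
    rng = random.Random(p["seed"])
    grid = math.isqrt(p["count"])  # lay centers on a rough grid
    triangles = []
    for i in range(p["count"]):
        cx = (i % grid) * p["spacing"] + rng.gauss(0, p["jitter"] * p["spacing"])
        cy = (i // grid) * p["spacing"] + rng.gauss(0, p["jitter"] * p["spacing"])
        r = p["size"] / math.sqrt(3)  # circumradius of an equilateral triangle
        verts = [(cx + r * math.cos(a), cy + r * math.sin(a))
                 for a in (math.pi / 2,
                           math.pi / 2 + 2 * math.pi / 3,
                           math.pi / 2 + 4 * math.pi / 3)]
        triangles.append(verts)
    return triangles

tris = decompress(pattern)
print(len(pattern), "parameters expand to", len(tris) * 3 * 2, "coordinate values")
```

Note that the seed makes the "same visual result" claim concrete: the compact form regenerates one specific scene, not merely a scene with the same statistics.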

Orthogonal Representation

Each orthogonal (independent) aspect of a system can be represented, and varied, separately.

This is analogous to an eigenvector decomposition: finding the most efficient basis in which to represent the information.

Example orthogonal parameter values:

RGB(120, 80, 200)
15.3 units
0.73
Gaussian(μ=0.5, σ=0.2)
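One way to make orthogonality concrete is to give each independent aspect its own field, so that editing one axis never forces a change on another. The field names below, and the mapping of the example values onto them, are assumptions made for illustration.

```python
from dataclasses import dataclass, replace

# Sketch: each orthogonal aspect lives in its own field. The names (and the
# interpretation of the example values, especially 0.73) are assumptions.
@dataclass(frozen=True)
class PatternParams:
    color: tuple          # e.g. RGB(120, 80, 200)
    size: float           # e.g. 15.3 units
    density: float        # e.g. 0.73 (interpretation assumed)
    spacing_dist: tuple   # e.g. Gaussian(mu=0.5, sigma=0.2)

base = PatternParams(color=(120, 80, 200), size=15.3,
                     density=0.73, spacing_dist=(0.5, 0.2))

# Because the axes are independent, an edit along one axis is a one-field change:
bigger = replace(base, size=30.6)
assert bigger.color == base.color and bigger.spacing_dist == base.spacing_dist
```

The frozen dataclass plus `replace` mirrors the basis-vector intuition: moving along one basis direction leaves every other coordinate untouched.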

Idea 2: Adaptive Level of Detail

Part of intelligence is knowing what level of detail is needed to solve a given problem.

Example: Modeling a biological cell

From billions of DNA base pairs to just the parameters you need.

Cellular Level

Viewing the whole cell structure and major organelles: a cell overview needs on the order of 1,000 parameters.

Task-Based Detail Selection

Level 1-3: Cell Type
Shape, size, basic structure
100-1,000 parameters

Level 4-7: Organelle Analysis
Mitochondria, nucleus, ER detail
1,000-10,000 parameters

Level 8-10: Molecular Dynamics
Proteins, DNA, chemical reactions
10,000-100,000 parameters
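The three tiers above can be sketched as a lookup from detail level to parameter budget. The level ranges and budgets mirror the table; the idea of mapping a task to the coarsest sufficient level, and the function name itself, are assumptions made for this sketch.

```python
# Task-based detail selection, sketched: pick the scope and parameter budget
# that a requested detail level implies. Ranges and budgets follow the table.
DETAIL_LEVELS = [
    (range(1, 4),  "cell type",          (100, 1_000)),
    (range(4, 8),  "organelle analysis", (1_000, 10_000)),
    (range(8, 11), "molecular dynamics", (10_000, 100_000)),
]

def budget_for_level(level):
    """Return the model scope and maximum parameter budget for a detail level."""
    for levels, scope, (lo, hi) in DETAIL_LEVELS:
        if level in levels:
            return scope, hi
    raise ValueError(f"unknown detail level: {level}")

print(budget_for_level(2))   # ('cell type', 1000)
print(budget_for_level(9))   # ('molecular dynamics', 100000)
```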


The Path to AGI

Combine optimal compression with adaptive complexity

Result: AI that can efficiently model any system at the right level of detail

Current AI (LLMs)

  • Brute force compression
  • Billions of parameters
  • Fixed level of detail
  • Context-limited understanding

Next-Gen AI

  • Optimal orthogonal representation
  • Dynamic parameter allocation
  • Adaptive detail levels
  • On-demand model creation

Key Innovations Needed

  • Training models to find optimal compressions
  • Agents that create models on-demand
  • Dynamic complexity adjustment based on feedback
  • Integration of multiple representation levels
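The feedback-driven adjustment item can be sketched as a loop that refines the model only while its error exceeds the task's tolerance. Everything here is illustrative: `evaluate` stands in for any error measure, and the toy error curve (error halving per level) is invented for the example.

```python
# Minimal sketch of dynamic complexity adjustment: start at the coarsest
# level and add detail only while feedback says the model is too coarse.
def adjust_detail(evaluate, tolerance, max_level=10):
    level = 1
    while level < max_level and evaluate(level) > tolerance:
        level += 1  # error still above tolerance: refine the model
    return level

# Toy error curve (assumption): error halves with each added level of detail.
error = lambda level: 1.0 / (2 ** level)

print(adjust_detail(error, tolerance=0.01))  # → 7
```

The loop stops at the first level whose error is within tolerance, which is exactly the "right level of detail" behavior the section argues for: no more parameters than the task demands.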

The Vision

With these algorithms in place, the path to AGI becomes clearer.

The combination of optimal knowledge compression and adaptive complexity could enable AI systems that truly understand and model the world at any required level of detail.

"When you see how you can go from billions of DNA pairs to just 10k numbers that describe the same complexity... it's pretty crazy."