LLM Memory Consolidation and Augmentation

Authors: Terry Chen, Kaiwen Che, Matthew Song

Abstract: Despite advances in large language model (LLM) capabilities, a fundamental limitation persists: the inability to retain context across long-lived interactions. In this paper, we present a novel human-inspired three-tiered memory architecture that addresses this limitation through biomimetic design principles rooted in cognitive science. Our approach aligns human working memory with the LLM context window, episodic memory with vector stores of experience-based knowledge, and semantic memory with structured knowledge triplets. ...
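The three-tier mapping in the abstract can be sketched in a few dataclasses. This is a minimal illustration, not the paper's implementation: all class and method names are hypothetical, and the episodic vector store is stubbed as a plain list.

```python
from dataclasses import dataclass, field

@dataclass
class WorkingMemory:
    """Analog of the LLM context window: a bounded buffer of recent messages."""
    capacity: int
    messages: list = field(default_factory=list)

    def add(self, msg: str) -> None:
        self.messages.append(msg)
        # Evict the oldest messages once the "context window" is full.
        if len(self.messages) > self.capacity:
            self.messages = self.messages[-self.capacity:]

@dataclass
class EpisodicMemory:
    """Analog of a vector store of experience-based knowledge (stubbed)."""
    episodes: list = field(default_factory=list)

    def store(self, text: str) -> None:
        self.episodes.append(text)

@dataclass
class SemanticMemory:
    """Analog of structured knowledge: (subject, relation, object) triplets."""
    triplets: set = field(default_factory=set)

    def add_fact(self, subj: str, rel: str, obj: str) -> None:
        self.triplets.add((subj, rel, obj))

# Working memory forgets; episodic and semantic memory persist.
wm = WorkingMemory(capacity=3)
for m in ["hi", "how are you", "tell me about memory", "ok"]:
    wm.add(m)

sm = SemanticMemory()
sm.add_fact("working memory", "maps_to", "LLM context window")
```

The design point the tiers capture: overflow from the bounded working memory would be consolidated into the episodic and semantic stores rather than simply dropped.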

March 10, 2025 • 📖 5 min read • Terry Chen

Realtime Conversational Learning Aid

AI-powered study group assistant that analyzes real-time conversations, detects misconceptions, and facilitates deeper learning through Socratic questioning and contextual knowledge retrieval.

November 10, 2024 • 📖 2 min read • Terry Chen

Human Quirks

Observing and understanding the strange quirks of individuals and crowds

October 1, 2024 • 📖 1 min read • Terry Chen

Tech History

Exploring the evolution of technology and its impact on society.

March 19, 2024 • 📖 1 min read • Terry Chen

Agentic Workforce

What does an organization look like when a meaningful portion of the workforce is non-human?

August 20, 2023 • 📖 1 min read • Terry Chen

Service Localization

Technology built for one market rarely works in another without rethinking the product itself, not just translating the language.

August 18, 2023 • 📖 1 min read • Terry Chen