Designing effective systems for thinking requires moving beyond a "brain-bound" model of cognition toward an "extended" or "embodied" view. Information architects and interaction designers often mistakenly treat digital interfaces as isolated rectangles, ignoring how human cognition relies on physical environments, bodily movement, and external tools to process complex information. Epistemic actions, such as gesturing while speaking or physically rearranging objects like chess pieces, are not errors or superfluous behaviors; they are cognitive shortcuts that offload mental work onto the body and the world. Much artificial intelligence and robotics research falls short by assuming a rigid "perceive-think-act" sequence, which misses the fluid, intertwined nature of human interaction with the environment. To build true cognitive partners, designers must shift their focus from optimizing static screens to supporting the dynamic, embodied interactions that actually drive human understanding and problem-solving.