Facing the dead end that AI gurus once declared, how does Professor Liu Jia recover the lost logic of intelligence from brain science? This episode unpacks the underlying secrets of artificial general intelligence and shows how the human brain may be the key that unlocks AI's future.

The essence of intelligence is not mere memorization, but the inductive ability to extract rules from massive data, together with the deductive reasoning that sets out from a logical origin toward the unknown.
https://youtu.be/-Et3GJRSI_0?si=EygwbruuBYr8kkdn

Professor Liu Jia argues that although AI may come to have subjective experience and even emotions, it lacks an awareness of death. From early childhood, humans recognize that life is finite; this fear of death and awareness of an endpoint compels us to search for life's meaning, which in turn drives the creation of civilization, science, and art. AI, by contrast, is effectively immortal: it can persist forever by swapping chips or restoring stored snapshots, and therefore lacks the inner evolutionary drive that keeps humans restlessly striving and iterating on themselves.
Today's robots (such as the Spring Festival Gala dancing robots or Boston Dynamics machines) perform astonishing moves, yet they remain essentially pre-programmed precision puppets, lacking genuine real-time perception and decision-making. The real hard core of intelligence hides in the cerebellum, which handles survival tasks: the human cerebellum contains roughly 70 billion neurons, far more than the cerebral cortex, and processes complex real-time feedback. Current AI excels at logical reasoning (System 2) but is still in its infancy at parallel processing and complex motor control (System 1), and has yet to build a true "world model."
Early AI drew on brain science (as with convolutional neural networks), then shifted to the brute-force aesthetics of raw compute and big data. Yet when AI hits complexity bottlenecks and fails on open-ended problems, the answers often lie back in brain science. Professor Liu Jia has found that the human working-memory mechanism is strikingly consistent with the Transformer architecture, hinting that silicon-based and carbon-based intelligence may share a single universal underlying algorithm. The future marriage will be bidirectional: AI supplies analytical tools for brain science, while brain science points AI toward higher-order perceptual intelligence.
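To make the working-memory analogy concrete, here is a minimal sketch of scaled dot-product attention, the core operation of the Transformer mentioned above. The parallel drawn in the episode is that both working memory and attention retrieve stored content by similarity between a current query and stored keys; the code and its variable names are illustrative, not drawn from the episode.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each stored value by the softmax-normalized
    similarity between the query and the corresponding key."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax
    return weights @ V                                # similarity-weighted recall

# Example: one query "probing" three items held in memory
Q = np.array([[1.0, 0.0]])                            # current query
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])    # stored keys
V = np.array([[10.0], [20.0], [30.0]])                # stored contents
out = scaled_dot_product_attention(Q, K, V)           # blend of all three values
```

The query never retrieves a single slot exactly; it recalls a graded blend of everything in memory, weighted by relevance, which is one reason the analogy to human working memory is suggestive.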
Professor Liu Jia believes that traditional education built on memorization and exams has lost its meaning, since AI's store of knowledge far exceeds any human's. Education should return to the essence of the "self" and cultivate three abilities: first, discovering an intrinsic, interest-driven motivation, the core of staying competitive alongside AI; second, adopting an "AI-native" mindset that treats AI as an extension of one's body and brain; and third, developing deductive reasoning from first principles, the ability to locate the "logical origin" amid oceans of information.