Significant new inventions in computing since 1980

This question arose from comments about different kinds of progress in computing over the last 50 years or so.

Some of the other participants asked me to raise it as a question to the whole forum.

The basic idea here is not to bash the current state of things, but to try to understand something about the progress of coming up with fundamental new ideas and principles.

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't find them, then we should ask "Why?" and "What should we be doing?"

The Internet itself predates 1980, but the World Wide Web ("distributed hypertext via simple mechanisms"), as proposed and implemented by Tim Berners-Lee, started in 1989/90.

While the idea of hypertext had existed before (Nelson's Xanadu had tried to implement a distributed scheme), the WWW was a new approach to implementing a distributed hypertext system. Berners-Lee combined a simple client-server protocol, a markup language, and an addressing scheme in a way that was powerful and easy to implement.
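
As a reminder of just how simple those mechanisms are, here is a minimal sketch in Python that speaks the original protocol by hand: a URL-style address, an HTTP request over a plain socket, and HTML markup coming back. (The host is just a placeholder; any public web server would do.)

```python
# The Web's three ingredients in one sketch: an address, a simple
# client-server protocol (HTTP), and markup (HTML) in the response.
import socket

host, path = "example.com", "/"                            # the addressing scheme
request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"   # the protocol

with socket.create_connection((host, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):        # read until the server closes
        response += chunk

print(response.decode("utf-8", errors="replace")[:300])    # headers + markup
```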

I think most innovations come from re-combining existing pieces in an original way. Each of the pieces of the WWW had existed in some form before, but the combination was obvious only in hindsight.

And I am sure you are using it right now.

The Free Software Foundation (established 1985).

Even if you aren't a wholehearted supporter of their philosophy, the ideas of free software and open source have had an enormous influence on the software industry and on content in general (e.g. Wikipedia).

I think it is fair to say that in 1980, if you were using a computer, you were either getting paid for it or you were a geek... so what has changed?

  • Printers and consumer-level desktop publishing. Meant you didn't need a printing press to make high-volume, high-quality printed material. Of course, nowadays we take it completely for granted, and mostly we don't even bother with the printing part, because everyone is online anyway.

  • Color. Seriously. Color screens made a huge difference to non-geeks' perception of games and applications. Suddenly games seemed less like hard work and more like watching TV, which opened the doors for Sega, Nintendo, Atari et al. to bring consumer gaming into the home.

  • Media compression (MP3s and video files). And a whole bunch of things, like TiVo and iPods, that we don't really think of as computers any more because they are so ubiquitous and so user-friendly. But they are.

I think the common thread here is stuff that was once impossible (making printed documents; reproducing color images accurately; sending messages around the world in real time; distributing audio and video material), was then expensive because of the equipment and logistics involved, and is now consumer-level. So: what are the big corporations doing now that used to be impossible, but might be cool if we could work out how to do it small and cheap?

Anything that still involves physical transportation is worth looking at. Video conferencing hasn't replaced real meetings yet, but with the right technology it still might. Some recreational travel could be eliminated by a full-sensory immersive environment; home cinema is a trivial example, and another is the "virtual golf course" in an office building in Soho, where you play 18 holes of real golf on a simulated course.

For me, though, the next really big thing is going to be fabrication. Making things. Spoons and guitars and chairs and clothing and cars and tiles and stuff. Things that still rely on a manufacturing and distribution infrastructure. I no longer have to go to a store to buy a movie or an album; how long until I don't have to go to the store for clothing and kitchenware?

Sure, there are interesting developments going on with OLED displays and GPS and mobile broadband and IoC containers and scripting and "the cloud", but they are all still just new ways of putting pictures on a screen. I can print my own photos and write my own web pages, but I want to be able to fabricate a linen basket that fits exactly into the nook beside my desk, a mounting bracket for fixing my guitar effects unit to my desk, and a clip for clamping my phone to my bike handlebars.

Not programming related? No... but in 1980, neither was sound production. Or video distribution. Or sending messages to your relatives in Zambia. Think big, people... 🙂

Package management and distributed revision control.

These patterns in the way software is developed and distributed are quite recent, and are only just beginning to make an impact.

Ian Murdock called package management "the single biggest advancement Linux has brought to the industry". Well, he would, but he has a point. The way software is installed has changed significantly since 1980, yet most computer users still haven't experienced this change.

Joel and Jeff talked about revision control (or version control, or source control) with Eric Sink in Podcast #36. It seems that most developers haven't even caught up with centralized systems, and DVCS is widely seen as mysterious and unnecessary.

From the Podcast 36 transcript:

0:06:37

Atwood: ...if you assume – and this is a big assumption – that most developers have more or less mastered fundamental source control – which, frankly, I find not to be true...

Spolsky: No. Most of them, even if they have, understand checking in and checking out, but branching and merging confuse them.

BitTorrent. It completely overturned what previously seemed like an obviously immutable rule: that the time it takes one person to download a file over the Internet grows in proportion to the number of people downloading it. It also addresses the flaws of earlier peer-to-peer solutions, particularly leeching, in a way that is organic to the solution itself.

BitTorrent elegantly turns what is normally a disadvantage – many users trying to download a single file simultaneously – into an advantage, distributing the file geographically as a natural part of the download process. Its strategy for optimizing bandwidth use between two peers discourages leeching as a side effect: it is in every participant's best interest to enforce throttling.

It is one of those ideas that, once someone else has invented it, seems simple, if not obvious.
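
One of the mechanisms behind that reversal is "rarest first" piece selection: each peer preferentially fetches the piece that the fewest other peers hold, so copies of scarce data multiply quickly across the swarm. A toy sketch (the names and data are illustrative, not BitTorrent's actual wire format):

```python
# Toy "rarest first" piece selection: fetch the piece held by the fewest
# peers, so rare pieces replicate fast and the swarm stays healthy.
from collections import Counter

def rarest_first(needed, peer_bitfields):
    """Pick the piece from `needed` that the fewest peers currently have."""
    availability = Counter()
    for bitfield in peer_bitfields:            # which pieces each peer holds
        availability.update(i for i, has in enumerate(bitfield) if has)
    candidates = [p for p in needed if availability[p] > 0]
    return min(candidates, key=lambda p: availability[p], default=None)

peers = [
    [1, 1, 0, 0],   # peer A has pieces 0 and 1
    [1, 1, 0, 1],   # peer B has pieces 0, 1 and 3
    [1, 1, 0, 0],   # peer C has pieces 0 and 1
]
print(rarest_first({0, 1, 3}, peers))   # -> 3, since only one peer has it
```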

Damas-Milner type inference (often called Hindley-Milner type inference) was published in 1983 and has been the basis of every sophisticated static type system since. It was a genuinely new idea in programming languages (admittedly based on ideas published in the 1970s, but not made practical until after 1980). In terms of importance I put it up there with Self and the techniques used to implement Self; in terms of influence it has no peer. (The rest of the OO world is still doing variations on Smalltalk or Simula.)

Variations on type inference are still playing out; the variation I would single out most is Wadler and Blott's type-class mechanism for resolving overloading, which was later found to offer very powerful mechanisms for programming at the type level. The end of this story is still being written.
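
The engine room of Damas-Milner is unification: solving equations between types that contain type variables. A deliberately tiny Python sketch of that one step, under made-up conventions (single-letter strings are type variables; a real inferencer also needs the occurs check and let-generalization, which are omitted here):

```python
# Toy unification, the core step of Damas-Milner inference. Conventions
# (purely illustrative): single-letter strings like "a" are type variables,
# longer strings like "int" are type constants, and function types are
# tuples ("->", argument, result). No occurs check, no generalization.

def is_var(t):
    return isinstance(t, str) and len(t) == 1

def resolve(t, subst):
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst):
    """Extend `subst` so that t1 and t2 become equal, or raise TypeError."""
    t1, t2 = resolve(t1, subst), resolve(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            subst = unify(a, b, subst)
        return subst
    raise TypeError(f"cannot unify {t1} with {t2}")

# Applying a function of type int -> int to some x : a forces a = int.
print(unify(("->", "int", "int"), ("->", "a", "b"), {}))
# {'a': 'int', 'b': 'int'}
```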

Here's a plug for Google's MapReduce, not just for its own sake, but as a proxy for Google's achievement of running fast, reliable services on top of farms of unreliable commodity machines. Definitely an important invention, and totally different from the big-iron mainframe approach to heavyweight computation that ruled the roost in 1980.
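
The programming model itself fits in a few lines. A minimal word-count sketch in Python showing the map, shuffle, and reduce phases (the hard part Google solved, fault-tolerant distribution across thousands of machines, is exactly what this sketch leaves out):

```python
# Word count in the MapReduce style: `map` emits key/value pairs, a shuffle
# groups them by key, and `reduce` folds each group to a result.
from collections import defaultdict

def map_phase(doc):
    for word in doc.split():
        yield word, 1

def reduce_phase(word, counts):
    return word, sum(counts)

docs = ["the quick brown fox", "the lazy dog", "the fox"]

shuffled = defaultdict(list)                 # the shuffle: group by key
for doc in docs:
    for key, value in map_phase(doc):
        shuffled[key].append(value)

result = dict(reduce_phase(k, v) for k, v in shuffled.items())
print(result["the"])   # 3
```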

Tagging, the way information is categorized. Yes, the little boxes of text under each question.

It is amazing that it took about 30 years to invent tagging. We used lists and tables of contents; we used things that were optimized for printed books.

However, 30 years is much shorter than the time people needed to realize that printed books could come in smaller formats, so that people could hold them in their hands.

I think the tagging concept is underestimated among core CS people. All the research is focused on natural language processing (a top-down approach). But tagging is the first language that both computers and people understand well. It is a bottom-up way of making computers use natural language.
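
Part of what makes tags so effective is how little machinery they need: an inverted index from tag to items, and set operations do the rest. A minimal sketch with made-up data:

```python
# Tags as an inverted index: bottom-up categorization with no taxonomy
# designed up front, and boolean queries for free via set operations.
questions = {
    1: {"python", "types"},
    2: {"python", "networking"},
    3: {"haskell", "types"},
}

index = {}                          # tag -> set of question ids
for qid, tags in questions.items():
    for tag in tags:
        index.setdefault(tag, set()).add(qid)

# Intersection answers the query "python AND types".
print(index["python"] & index["types"])   # {1}
```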

I think we are looking at this the wrong way and drawing the wrong conclusions. If I have this right, the cycle goes:

Idea -> first implementation -> minority adoption -> critical mass -> commodity

From the first idea to the commodity you often have centuries, assuming the idea ever makes it to that stage. Da Vinci may have drawn some kind of helicopter in 1493, but it took about 400 years to get an actual machine capable of lifting itself off the ground.

From William Bourne's first description of a submarine in 1580 to the first implementation around 1800, you have 220 years, and current submarines are still in their infancy: we know almost nothing of underwater travel (with two thirds of the planet under sea, think of the potential real estate ;).

And there is no telling that there weren't earlier ideas that we simply never heard of. Based on some legends, it looks like Alexander the Great used some kind of diving bell in 332 BC (which is the basic idea of a submarine: a device to carry people and an air supply below the sea). Counting that, we are looking at 2000 years from idea (even with a basic prototype) to product.

What I am saying is that looking today for implementations, let alone products, that were not even ideas before 1980 is... I'd bet the "quicksort" algorithm was used by some nameless file clerk in ancient China. So what?

There were networked computers 40 years ago, sure, but that didn't compare with today's Internet. The basic idea/technology was there, but regardless, you couldn't play a game of Warcraft online.

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't find them, then we should ask "Why?" and "What should we be doing?"

Historically, we have never been able to find them that close to the idea, that fast. I think the cycle is getting faster, but computing is still awfully young.

Currently, I am trying to figure out how to make a hologram (the Star Wars kind, without any physical support). I think I know how to make it work. I haven't even gathered the tools, materials, or funding, and yet even if I were to succeed to any degree, the actual idea would already be several decades old at the very least, and related implementations/technologies have been in use for just as long.

As soon as you start listing actual products, you can be pretty sure that the concepts and first implementations existed a while earlier. It doesn't matter.

You could argue with some reason that nothing is ever new, or that everything is new, always. That's philosophy, and both viewpoints can be defended.

From a practical viewpoint, the truth lies somewhere in between. Truth is not a binary concept, boolean logic be damned.

The Chinese may have come up with the printing press a while back, but it has only been about 10 years that most people have been able to print decent color photos at home at a reasonable price.

Invention is nowhere and everywhere, depending on your criteria and frame of reference.

Google's PageRank algorithm. While it could be seen as just an improvement on web-crawling search engines, I would point out that those, too, were developed after 1980.
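
The core idea fits in a short power iteration: a page's rank is the damped sum of the ranks of the pages linking to it. A toy sketch over a three-page link graph (real implementations also handle dangling pages and enormous sparse matrices):

```python
# Toy power-iteration PageRank with the usual damping factor of 0.85.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
n, d = len(links), 0.85
rank = {page: 1.0 / n for page in links}

for _ in range(50):                             # iterate toward a fixed point
    new = {page: (1 - d) / n for page in links}
    for page, outgoing in links.items():
        share = d * rank[page] / len(outgoing)  # split rank over out-links
        for target in outgoing:
            new[target] += share
    rank = new

print({p: round(r, 3) for p, r in sorted(rank.items())})
```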

DNS, 1983, and the dependent advances, such as email host resolution via MX records instead of bang paths. *shudder*
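
For the curious, here is what an MX lookup looks like today, assuming the third-party dnspython package (`pip install dnspython`). Before MX records, mail routing meant spelling out every hop by hand as a bang path like `host1!host2!user`:

```python
# MX resolution via DNS, using the dnspython package (an assumption: any
# resolver library would do). Prints preference and mail exchanger host.
import dns.resolver

for record in dns.resolver.resolve("gmail.com", "MX"):
    print(record.preference, record.exchange)
```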

Zeroconf working on top of DNS, 2000. I plug my printer into the network and my laptop sees it. I start a web server on the network and my browser sees it. (Assuming they broadcast their availability.)

NTP (1985), based on Marzullo's algorithm (1984). Accurate time over jittery networks.
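
Marzullo's algorithm is pleasantly small: given time intervals reported by several sources, find the range that the largest number of sources agree on, so a wildly wrong clock simply fails to overlap with the rest. A compact sketch:

```python
# Marzullo's algorithm: sweep the interval endpoints and track how many
# source intervals overlap; the best range is where the count peaks.
def marzullo(intervals):
    # At equal points, begin events (0) sort before end events (1), so
    # touching intervals still count as overlapping.
    events = sorted([(lo, 0) for lo, hi in intervals] +
                    [(hi, 1) for lo, hi in intervals])
    best = count = 0
    best_range = None
    for i, (point, kind) in enumerate(events):
        count += 1 if kind == 0 else -1
        if count > best and i + 1 < len(events):
            best = count
            best_range = (point, events[i + 1][0])
    return best, best_range

# Three sources agree near 10.0; one is wildly off and gets ignored.
print(marzullo([(9.9, 10.1), (9.95, 10.05), (10.0, 10.2), (42.0, 42.5)]))
# -> (3, (10.0, 10.05))
```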

The mouse scroll wheel, 1995. Using a mouse without one feels so primitive now. And no, it is not something that Engelbart's team thought of and forgot to mention, at least not according to someone who was on the team at the time, when I asked. (That was at some Engelbart event in 1998 or so, where I got to handle one of the first mice.)

Unicode, 1987, and its dependent advances for different types of encodings, normalization, bidirectional text, and so on.
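
Normalization is a good example of why those dependent advances matter: two strings can look identical yet compare unequal until they are normalized. A quick illustration using only Python's standard library:

```python
# Two visually identical strings built from different code points; NFC
# normalization makes comparison and searching behave sanely.
import unicodedata

composed = "é"            # U+00E9, a single code point
decomposed = "e\u0301"    # 'e' followed by a combining acute accent

print(composed == decomposed)                                  # False
print(unicodedata.normalize("NFC", decomposed) == composed)    # True
```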

Yes, it is pretty common for people to use all five of these every day.

Are these "really new ideas", though? After all, there were mice, there were character encodings, there was network timekeeping. Tell me how to distinguish "new" from "really new" and I will answer that question for you. My intuition says that these are new.

In smaller domains there are easily newer advances. In bioinformatics, for example, Smith-Waterman (1981) and especially BLAST (1990) effectively made the field possible. But it sounds like you are asking for ideas that are very broad across the entire field of computing, and the low-hanging fruit gets picked first. Thus it is always with a new field.
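
To make the bioinformatics example concrete, here is a bare-bones Smith-Waterman local alignment score in Python (the match/mismatch/gap values are illustrative). BLAST's later contribution was heuristics fast enough to replace this exhaustive dynamic program for database-scale search:

```python
# Smith-Waterman local alignment score (match +2, mismatch -1, gap -1).
# "Local" means the score matrix is floored at zero, so the best-scoring
# local region stands out regardless of what surrounds it.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            score = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,                       # never drop below zero
                          H[i - 1][j - 1] + score, # align a[i-1] with b[j-1]
                          H[i - 1][j] + gap,       # gap in b
                          H[i][j - 1] + gap)       # gap in a
            best = max(best, H[i][j])
    return best

print(smith_waterman("GGTTGACTA", "TGTTACGG"))   # best local alignment score
```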

What about digital cameras?

According to Wikipedia, the first true digital camera appeared in 1988, and mass-market digital cameras became affordable in the late 1990s.

Modern shading languages and the prevalence of modern GPUs.

The GPU is also a low-cost parallel supercomputer, with tools like CUDA and OpenCL enabling blazingly fast, high-level parallel code. Thanks to all the gamers out there driving down the prices of these increasingly impressive hardware marvels. In the next five years, I hope every new computer sold (and iPhones too) will be able to run massively parallel code as a basic assumption, much like 24-bit color or 32-bit protected mode.

JIT compilation was invented in the late 1980s.

To address the two questions of "Why the death of new ideas?" and "What to do about it?":

I suspect a lot of the lack of progress is due to the massive influx of capital and the entrenched wealth in the industry. Sounds counterintuitive, but I think it has become conventional wisdom that any new idea gets one shot; if it doesn't make it on the first try, it can't come back. It gets bought by someone with entrenched interests, or it just fails and the energy is gone. A couple of examples are tablet computers and integrated office software. The Newton and several others had real potential, but ended up (through competitive attrition and bad judgment) squandering their birthrights, killing whole categories. (I was especially fond of Ashton-Tate's Framework, but I'm still stuck with Word and Excel.)

What to do? The first thing that comes to mind is Wm. Shakespeare's advice: "Let's kill all the lawyers." But they're too well armed now, I'm afraid. I actually think the best alternative is to find an open-source initiative of some kind. They seem to maintain accessibility and incremental improvement better than the alternatives. But the industry has gotten big enough that some kind of organic collaborative mechanism is necessary to gain traction.

I also think there is a dynamic that says the entrenched interests (especially platforms) require a substantial amount of change – churn – to justify continuing revenue streams, and this absorbs a lot of creative energy that could have been spent in better ways. Look how much time we spend treading water with the newest iteration from Microsoft or Sun or Linux or Firefox, making changes to systems that for the most part already work fine. It's not because they are evil; it's just built into the industry. There is no such thing as stable equilibrium; all the feedback mechanisms are positive, favoring change over stability. (Have you ever seen a feature withdrawn, or a change retracted?)

The other clue that has been discussed on SO is the Skunkworks Syndrome (ref: Geoffrey Moore): real innovation in large organizations almost always (90%+) shows up in unauthorized projects that emerge spontaneously, fueled exclusively by individual or small-group initiative (and more often than not opposed by the formal management hierarchy). So: question authority, buck the system.

One thing that astounds me is the humble spreadsheet. Non-programmers build wild and wonderful solutions to real-world problems with a simple grid of formulas. Replicating their efforts in a desktop application often takes 10 to 100 times longer than it took to write the spreadsheet, and the resulting application is often harder to use and full of bugs!

I believe the key to the spreadsheet's success is automatic dependency analysis. If spreadsheet users were forced to use the observer pattern, they would have no chance of getting it right.

So the big advance is automatic dependency analysis. Why hasn't any modern platform (Java, .NET, web services) built this into the core of the system? Especially in a day and age of scaling through parallelization: a graph of dependencies leads to parallel recomputation trivially.
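
To make the point concrete, here is a toy recalculation engine in Python: each "cell" declares the cells it reads, and the engine evaluates everything in dependency order, which is exactly the plumbing a spreadsheet user never has to write (the cell names and formulas are made up):

```python
# A toy spreadsheet engine: each cell is (dependencies, formula), and the
# engine pulls values in dependency order, memoizing as it goes. Independent
# subgraphs of the dependency graph could be evaluated in parallel.
cells = {
    "A1": ((),           lambda: 2),
    "A2": ((),           lambda: 3),
    "B1": (("A1", "A2"), lambda a1, a2: a1 + a2),
    "C1": (("B1",),      lambda b1: b1 * 10),
}

def recalc(cells):
    values = {}
    def evaluate(name):
        if name not in values:
            deps, formula = cells[name]
            args = [evaluate(d) for d in deps]   # pull dependencies first
            values[name] = formula(*args)
        return values[name]
    for name in cells:
        evaluate(name)
    return values

print(recalc(cells))   # {'A1': 2, 'A2': 3, 'B1': 5, 'C1': 50}
```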

Edit: Dang – just checked. VisiCalc was released in 1979, so let's pretend it's a post-1980 invention.

Edit 2: It seems the spreadsheet has already been noted by Alan anyway – if the question that brought him to this forum is right!

Software:

  • Virtualization and emulation

  • P2P data transfer

  • Community-driven projects such as Wikipedia, SETI@home, ...

  • Web crawlers and web search engines, i.e. indexing information that is spread all over the world

Hardware:

  • The modular PC

  • E-paper

The rediscovery of the monad by functional programming researchers. The monad was instrumental in letting a pure, lazy language (Haskell) become a practical tool; it has also influenced the design of combinator libraries (monadic parser combinators have even found their way into Python).

Moggi's "A category-theoretic account of program modules" (1989) is generally credited with bringing monads into view for effectful computation; Wadler's work (for example, "Imperative functional programming" (1993)) presented monads as a practical tool.
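
Since monadic parser combinators get a mention, here is a miniature version of the idea in Python: a parser is a function from input text to either (value, remaining text) or None, `unit` and `bind` give it the monad structure, and tiny parsers compose into bigger ones. (All names here are illustrative, not any particular library's API.)

```python
# A miniature monadic parser combinator. `unit` wraps a value in a parser
# that consumes nothing; `bind` sequences a parser with a function that
# builds the next parser from the previous result.
def unit(value):
    return lambda text: (value, text)

def bind(parser, f):
    def parse(text):
        result = parser(text)
        if result is None:          # failure propagates automatically
            return None
        value, rest = result
        return f(value)(rest)
    return parse

def char(c):
    return lambda text: (c, text[1:]) if text.startswith(c) else None

# A parser for "ab", built by monadic sequencing.
ab = bind(char("a"), lambda a: bind(char("b"), lambda b: unit(a + b)))
print(ab("abc"))   # ('ab', 'c')
print(ab("bbc"))   # None
```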

Shrinkwrap software

Before 1980, software was mostly specially written. If you ran a business, and wanted to computerize, you'd typically get a computer and compiler and database, and get your own stuff written. Business software was typically written to adapt to business practices. This is not to say there was no canned software (I worked with SPSS before 1980), but it wasn't the norm, and what I saw tended to be infrastructure and research software.

Nowadays, you can go to a computer store and find, on the shelf, everything you need to run a small business. It isn't designed to fit seamlessly into whatever practices you used to have, but it will work well once you learn to work more or less according to its workflow. Large businesses are a lot closer to shrinkwrap than they used to be, with things like SAP and PeopleSoft.

It isn't a clean break, but after 1980 there was a very definite shift from expensive custom software to low-cost off-the-shelf software, and flexibility shifted from software to business procedures.

It also affected the economics of software. Custom software solutions can be profitable, but they don't scale. You can only charge one client so much, and you can't sell the same thing to multiple clients. With shrinkwrap software, you can sell lots and lots of the same thing, amortizing development costs over a very large sales base. (You do have to provide support, but that scales. Just consider it a marginal cost of selling the software.)

Theoretically, where there are big winners from a change, there are going to be losers. So far, the business of software has kept expanding, so that as areas become commoditized other areas open up. This is likely to come to an end sometime, and moderately talented developers will find themselves in a real crunch, unable to work for the big boys and crowded out of the market. (This presumably happens for other fields; I suspect the demand for accountants is much smaller than it would be without QuickBooks and the like.)

Outside of hardware innovations, I tend to find that there is little or nothing new under the sun. Most of the really big ideas date back to people like von Neumann and Alan Turing.

A lot of things that are labelled 'technology' these days are really just a program or library somebody wrote, or a retread of an old idea with a new metaphor, acronym, or brand name.

Computer worms were researched in the early eighties of the last century at the Xerox Palo Alto Research Center.

From John Shoch and Jon Hupp's "The 'Worm' Programs – Early Experience with a Distributed Computation" (Communications of the ACM, March 1982, Volume 25, Number 3, pp. 172-180):

In The Shockwave Rider , J. Brunner developed the notion of an omnipotent "tapeworm" program running loose through a network of computers – an idea which may seem rather disturbing, but which is also quite beyond our current capabilities. The basic model, however, remains a very provocative one: a program or a computation that can move from machine to machine, harnessing resources as needed, and replicating itself when necessary.

In a similar vein, we once described a computational model based upon the classic science-fiction film, The Blob : a program that started out running in one machine, but as its appetite for computing cycles grew, it could reach out, find unused machines, and grow to encompass those resources. In the middle of the night, such a program could mobilize hundreds of machines in one building; in the morning, as users reclaimed their machines, the "blob" would have to retreat in an orderly manner, gathering up the intermediate results of its computation. Holed up in one or two machines during the day, the program could emerge again later as resources became available, again expanding the computation. (This affinity for nighttime exploration led one researcher to describe these as "vampire programs.")

Quoting Alan Kay: "The best way to predict the future is to invent it."

Better user interfaces.

Today's user interfaces still suck. And I don't mean in small ways but in large, fundamental ways. I can't help but notice that even the best programs still have interfaces that are either extremely complex or that require a lot of abstract thinking in other ways, and that just don't approach the ease of conventional, non-software tools.

Granted, this is due to the fact that software allows you to do so much more than conventional tools. That's no reason to accept the status quo, though. Additionally, most software is simply not well done.

In general, applications still lack a certain "just works" feeling and are oriented too much toward what can be done, rather than what should be done. One point that has been raised time and again, and that is still not solved, is saving. Applications crash, destroying hours of work. I have the habit of pressing Ctrl+S every few seconds (of course, this no longer works in web applications). Why do I have to do this? It's mind-numbingly stupid. This is clearly a task for automation. Of course, the application also has to save a diff for every modification I make (basically an infinite undo list) in case I make an error.

Solving this problem isn't even actually hard. It would just be hard to implement in every application, since there is no good API for it. Programming tools and libraries would have to improve significantly before such features could be implemented effortlessly across all platforms and programs, for all file formats, with arbitrary backup storage and no required user interaction. But it is a necessary step before we finally start writing "good" applications instead of merely adequate ones.
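
A sketch of the save-a-diff-per-modification idea using only Python's standard library: every change is recorded as a unified diff, giving an effectively infinite undo trail that could survive a crash if flushed to disk (the in-memory list here is just for illustration):

```python
# Record every modification as a unified diff. A real implementation would
# persist each diff to durable storage as it is made.
import difflib

history = []          # list of diffs, oldest first

def save(old_text, new_text):
    diff = list(difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(), lineterm=""))
    history.append(diff)

doc = "hello world"
edited = "hello brave world"
save(doc, edited)
print("\n".join(history[0]))   # the recorded change, replayable later
```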

I believe that Apple currently comes closest to the "just works" feeling in some regards. Take, for example, their newest version of iPhoto, which features face recognition that automatically groups photos by the people appearing in them. That is a classic task that the user does not want to do manually and doesn't understand why the computer doesn't do automatically. And even iPhoto is still a very long way from a good UI, since the feature still requires final confirmation by the user (for each photo!), because the face recognition engine isn't perfect.

HTM systems (Hierarchical Temporal Memory).

A new approach to Artificial Intelligence, initiated by Jeff Hawkins through the book "On Intelligence".

Now active as a company called Numenta where these ideas are put to the test through development of "true" AI, with an invitation to the community to participate by using the system through SDKs.

It's more about building machine intelligence from the ground up, rather than trying to emulate human reasoning.

The use of physics in human-computer interaction to provide an alternative, understandable metaphor. This, combined with gestures and haptics, will likely result in a replacement for the current common GUI metaphor, invented in the 70's and in common use since the mid-to-late 80's.

The computing power wasn't present in 1980 to make that possible. I believe games likely led the way here. An example can easily be seen in list scrolling on the iPod Touch/iPhone. The interaction mechanism relies on the intuition of how momentum and friction work in the real world to provide a simple way to scroll a list of items, and the usability relies on the physical gesture that causes the scroll.
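
The physics involved is tiny, which is part of why it reads as intuitive. A sketch of momentum scrolling in Python: a flick sets an initial velocity, and per-frame friction decays it until the list coasts to a stop (the constants are made up for illustration, not Apple's):

```python
# Momentum-and-friction scrolling: a flick sets a velocity, and friction
# exponentially decays it each frame until it falls below a threshold.
def scroll_positions(velocity, friction=0.95, dt=1 / 60, threshold=1.0):
    position = 0.0
    while abs(velocity) > threshold:   # coast until friction wins
        position += velocity * dt
        velocity *= friction
        yield position

frames = list(scroll_positions(velocity=600.0))   # flick at 600 px/s
print(f"{len(frames)} frames, stops at {frames[-1]:.0f}px")
```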

I believe Unit Testing, TDD and Continuous Integration are significant inventions after 1980.
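
For readers who haven't seen the discipline in its smallest form, here is the red-green rhythm with Python's standard unittest module: the tests are written first and pin down the behavior, then `slugify` (a made-up example function) is written to satisfy them:

```python
# A minimal TDD step: the tests below would fail ("red") before slugify
# exists; writing just enough code turns them green.
import unittest

def slugify(title):
    return title.strip().lower().replace(" ", "-")

class TestSlugify(unittest.TestCase):
    def test_spaces_become_dashes(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_surrounding_whitespace_is_trimmed(self):
        self.assertEqual(slugify("  Trim Me  "), "trim-me")

if __name__ == "__main__":
    unittest.main()
```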

Mobile phones.

While the first "wireless phone" patent was in 1908, and they were cooking for a long time (0G in 1945, 1G launched in Japan in 1979), modern 2G digital cell phones didn't appear until 1991. SMS didn't exist until 1993, and Internet access appeared in 1999.

I started programming Jan 2nd 1980. I've tried to think about significant new inventions over my career. I struggle to think of any. Most of what I consider significant were actually invented prior to 1980 but then weren't widely adopted or improved until after.

  1. Graphical User Interface.
  2. Fast processing.
  3. Large memory (I paid $200.00 for 16k in 1980).
  4. Small sizes – cell phones, pocket pc's, iPhones, Netbooks.
  5. Large storage capacities. (I've gone from carrying a large 90k floppy to an 8 gig USB thumb drive.)
  6. Multiple processors. (Almost all my computers have more than one now, software struggles to keep them busy).
  7. Standard interfaces (like USB) to easily attach hardware peripherals.
  8. Multiple Touch displays.
  9. Network connectivity – leading to the mid 90's internet explosion.
  10. IDE's with Intellisense and incremental compiling.

While the hardware has improved tremendously the software industry has struggled to keep up. We are light years ahead of 1980, but most improvements have been refinements rather than inventions. Since 1980 we have been too busy applying what the advancements let us do rather than inventing. By themselves most of these incremental inventions are not important or powerful, but when you look back over the last 29 years they are quite powerful.

We probably need to embrace the incremental improvements and steer them. I believe that truly original ideas will probably come from people with little exposure to computers and they are becoming harder to find.

Nothing.

I think it's because people have changed their attitudes. People used to believe that if they would just find that "big idea", then they would strike it rich. Today, people believe that it is the execution and not the discovery that pays out the most. You have mantras such as "ideas are a dime a dozen" and "the second mouse gets the cheese". So people are focused on exploiting existing ideas rather than coming up with new ones.

Open Source community development.

The iPad (released April 2010): surely such a concept is absolutely revolutionary!


No way Alan Kay saw that coming from the 1970's!
Imagine such a "personal, portable information manipulator"…


Wait? What!? The Dynabook you say?


Thought out by Alan Kay as early as 1968, and described in great detail in this 1972 paper??

NOOOooo ooooo….

Oh well… never mind.
