Being reminded about the Japanese Fifth Generation Computing project, I guess I still don't quite grasp all the subtleties of the name.

From the Wikipedia summary it seems like "fifth generation of computing hardware" was (Japanese, and especially MITI) code for "parallelism". The five generations being: vacuum tube, solid-state, integrated circuit, "microprocessor", and then massive parallelism.

By this standard, we're still in the fifth generation. GPUs and cloud.

IIRC, the fourth generation was query languages like SQL. The general idea being that as you go up the generations, the ratio of compiler effort to programmer effort goes up.


That was the fourth generation of software/languages, yes: domain-specific, mostly database-searching languages. Third generation being "general purpose high level languages", which I think covers everything from FORTRAN/COBOL on? First gen being machine language, second gen assembler?
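That "ratio of compiler effort to programmer effort" idea can be made concrete with a toy sketch (mine, not from the thread) using Python's stdlib `sqlite3`: the same lookup written third-generation style, where the programmer spells out *how* to find the answer with an explicit loop, and fourth-generation style, where a declarative SQL query states *what* is wanted and the engine plans the rest. The table and names are invented for illustration.

```python
# Toy illustration: 3GL-style explicit iteration vs 4GL-style
# declarative query, over the same in-memory SQLite table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, born INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [("Ada", 1815), ("Alan", 1912), ("Grace", 1906)])

# 3GL style: fetch everything, then say HOW to filter it, step by step.
rows = conn.execute("SELECT name, born FROM people").fetchall()
found_3gl = [name for name, born in rows if born > 1900]

# 4GL style: say WHAT is wanted; the query engine decides how.
found_4gl = [name for (name,) in
             conn.execute("SELECT name FROM people WHERE born > 1900")]

print(sorted(found_3gl), sorted(found_4gl))  # same answer both ways
```

Both paths return the same result; the difference is how much of the "how" the programmer carries versus the language implementation.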

But I think the MITI idea of "fifth generation" seems to have been thinking in terms of hardware, not software.

But^2... Prolog as a "fifth gen language" maybe paired with parallelism as "fifth gen hardware".


So that, I think, is partly why the MITI Fifth Generation concept was so murky: it was trying to jump ahead of both the American hardware AND software paradigms at once. It combined massively parallel hardware - already risky - with the European fascination with logic programming (meant to be both general purpose AND domain-specific), rather than the MIT AI Lab's interest in Lisp. Possibly it deliberately blurred the lines between the two concepts of "generation".


I guess Forth also flirted with the concept of a "fourth generation language", in that you could write very terse domain-specific abstractions in it... even though it was still very low-level, it probably wasn't any worse at that than dBASE II...

but I'm not sure if that was intentional or just a naming pun that the Forth community embraced during the time when "fourth gen language" was in a hype cycle?
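Those terse domain vocabularies are easy to sketch. Here's a minimal toy (mine, not from the thread) of the Forth idea in Python: a handful of named "words" operating on a shared data stack, where a new word is just a string of old ones - the rough equivalent of a colon definition like `: square dup * ;`. All the names here are invented for illustration.

```python
# Minimal Forth-flavoured sketch: postfix words over a data stack,
# with new words defined as sequences of existing ones.
def make_interpreter():
    stack = []
    words = {
        "dup":  lambda: stack.append(stack[-1]),
        "swap": lambda: stack.extend([stack.pop(), stack.pop()]),
        "+":    lambda: stack.append(stack.pop() + stack.pop()),
        "*":    lambda: stack.append(stack.pop() * stack.pop()),
    }

    def define(name, body):
        # A "colon definition": the new word just runs its body.
        words[name] = lambda: run(body)

    def run(source):
        for token in source.split():
            if token in words:
                words[token]()        # execute a known word
            else:
                stack.append(int(token))  # literals push themselves
        return stack

    return define, run

define, run = make_interpreter()
define("square", "dup *")                  # : square dup * ;
define("sum-sq", "square swap square +")   # : sum-sq square swap square + ;
print(run("3 4 sum-sq"))                   # [25]
```

The appeal for DSL-building is that `3 4 sum-sq` reads like domain notation, even though each word bottoms out in very low-level stack shuffling.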


I feel like part of the idea behind "fifth generation" -ness, and 1980s AI / parallelism hype at the time, was the idea of machine learning. Fourth generation software being domain-specific languages, fifth generation then maybe being the machine starting to learn and write its own code?

There seemed to be this cluster of tech that now seems unrelated but was all linked:

* parallelism
* connectionism / neural-nets
* machine learning (but including symbolic AI)
* objects
* GUIs


Like, for quite a while there, it felt like "AI" was linked to two separate worlds:

* 'connectionism', the idea that AI would emerge naturally out of parallelism - rooted in the Internet, but also in object-oriented programming and parallel hardware. There were connections to the West Coast New Age scene too, with its emphasis on Social Networks.

* 'simulationism' maybe? GUIs and virtual reality and objects as high-bandwidth human/machine interfaces. The Matrix.

Both about 'intelligence augmentation'.


@natecull @enkiv2 this is a fascinating glimpse of connections not made and paths not taken

For some time I have felt that the modern GUI has stagnated because the required link with machine learning / data processing has not been there (ML today being mostly used to exploit users rather than augment them)

Quite an eye-opener (but, on second thought, logical). Software is so much just a reflection of social structure, it's scary.
