
It is interesting that so many of the largest languages were developed within a couple of years of each other in the early-to-mid 90s. Python, JavaScript, Java, Lua, R. All of these were developed between 1991 and 1995 and make up the bulk of development today.


It is an interesting observation!

I'm sure there are a few unrelated factors influencing that, plus some inadvertent cherry-picking. But I do think there is something to the observation too.

If I had to make a guess, I'd point to a combination of:

1. This was right around the time that computers were fast enough to afford the runtime cost of garbage collection while still delivering usable apps. The GC languages before the 90s (Lisp, Scheme, Smalltalk, Self) had reputations for abysmal performance and have largely (but not entirely) died out.

2. This was also the boom of the Internet and web which shifted a lot of computation to server-side after the PC boom had shifted it client-side. That also enabled somewhat slower languages with GC to thrive because first-mover advantage was more valuable than using your servers efficiently. You could just throw more hardware at it.

3. That boom also created a huge influx of new developers who picked up whatever language happened to be hot at the time. And once they had, they tended to stick with it. I can't find a good graph of the number of software engineers over time but I'd bet that there's a noticeable uptick around the dotcom boom.


After mulling it over a bit, plus seeing a few other responses (PHP, Ruby, etc.), I think the internet is the reason.

C, C++, Ada, Basic, Fortran. Those were written assuming they would be used for systems code and numerical code. There is then some inertia keeping the most successful languages from those periods going in their spaces.

Then these new languages with new ideas (GC, dynamic typing, etc.) came out and became successful in the newer web and application spaces. Now, why they won out and not Lisp or Smalltalk or what have you, I am not sure. But my hypothesis is that the web is a big part of it.


Languages developed in that time matured just as good binary package managers started popping up, is my pet theory. Before that, getting a development environment for a new language was serious work, and people did things like stick to Perl since it happened to be installed already.


Not true.

I was programming in the 90s when these languages emerged. Development environments were Emacs, vi, Brief, the Borland IDEs, etc. There were a few other IDEs available, but they cost about $200 per seat.

None of the scripting languages mentioned came by default with Unix or Windows. You had to download them from their own websites.

It was mostly Visual Basic, C, and COBOL that were popular.


I think that's what I mean. After the time you talk about ('90s), these languages matured, and they happened to mature around the same time binary package managers became a thing, i.e. in the early-to-mid '00s.


There was also ELK Scheme, the Extension Language Kit, a Scheme interpreter designed to be used as an extension language for other applications.

https://www.usenix.org/legacy/publications/compsystems/1994/...

>Elk, the Extension Language Kit, is a Scheme implementation that is intended to be used as a general, reusable extension language subsystem for integration into existing and future applications. Applications can define their own Scheme data types and primitives, providing for a tightly-knit integration of the C/C++ parts of the application with Scheme code. Library interfaces, for example to the UNIX operating system and to various X window system libraries, show the effectiveness of this approach. Several features of Elk such as dynamic loading of object files and freezing of fully customized applications into executables (implemented for those UNIX environments where it was feasible) increase its usability as the backbone of a complex application. Elk has been used in this way for seven years within a locally-developed ODA-based multimedia document editor; it has been used in numerous other projects after it could be made freely available five years ago.

Also Gnu Guile:

https://en.wikipedia.org/wiki/GNU_Guile

>GNU Ubiquitous Intelligent Language for Extensions[3] (GNU Guile) is the preferred extension language system for the GNU Project[4] and features an implementation of the programming language Scheme. Its first version was released in 1993.[1] In addition to large parts of Scheme standards, Guile Scheme includes modularized extensions for many different programming tasks.[5][6]

Also Winterp, which used XLisp:

https://dl.acm.org/doi/10.1145/121994.121998

>Winterp is an interactive, language-based user-interface and application-construction environment enabling rapid prototyping of applications with graphical user interfaces based on the OSF/Motif UI Toolkit. Winterp also serves as a customization environment for delivered applications by providing a real programming language as an extension language. Many existing user-interface languages only have the expressive power to describe static layout of user interface forms; by using a high-level language for extensions and prototyping, Winterp also handles the dynamic aspects of UI presentation, e.g. the use of direct manipulation, browsers, and dialog. Winterp makes rapid prototyping possible because its language is based on an interpreter, thereby enabling interactive construction of application functionality and giving immediate feedback on incremental changes. Winterp's language is based on David Betz's public domain Xlisp interpreter which features a subset of Common Lisp's functionality. The language is extensible, permitting new Lisp primitives to be added in the C language and allowing hybrid implementations constructed from interpreted Lisp and compiled C. Hybrid implementation gives Winterp-based applications the successful extension and rapid-prototyping capabilities of Lisp-based environments, while delivering the multiprocessing performance of C applications running on personal Unix workstations.

And Tcl/Tk of course!

https://www.tcl-lang.org/

And on the commercial side, there was Visix Galaxy, which was extensible in PostScript, inspired by NeWS:

https://www.ambiencia.com/products.php

https://0-hr.com/Wolfe/Programming/Visix.htm

https://groups.google.com/g/comp.lang.java.programmer/c/LPkz...

https://donhopkins.com/home/interval/pluggers/galaxy.html

https://wiki.c2.com/?SpringsAndStruts

>The Visix Galaxy project was a ridiculously overpriced and overfeatured portable GraphicalUserInterface. You could do things like swivel an entire panel full of their custom widgets 32 degrees clockwise, and it would render all its text at this new angle without jaggies. The company went out of business after gaining only a handful of customers. For USD$ 10,000 a seat they sure didn't see the OpenSource movement coming. Their last attempt before going under was (guess what?) a Java IDE.

Galaxy competed with Neuron Data in the "cross-platform GUI framework" space (which got steamrolled permanently by the web, and for a window of time by Java):

https://donhopkins.com/home/interval/pluggers/neuron.html

Here is a great overview of User Interface Software and Tools by Brad Myers:

https://www.cs.cmu.edu/~bam/uicourse/2001spring/lecture05too...

https://www.cs.cmu.edu/~bam/toolnames/

https://docs.google.com/document/d/1hQbMwK_iyjX-wpu_Xw_H-3zL...


Don't forget Ruby in 1995!


PHP too.


Java is from 1996, C++ only got standardized in 1998, C in 1990 (technically the standard is from 1989, but there was a short ratification process in 1990), and Delphi is from 1995 (not that big a player nowadays, but plenty of its influences live on in C#, TypeScript, and Kotlin).

It goes to show how much investment is required for a programming language to actually take off at scale.

However, in a couple of years we will be asking the computers to perform tasks for us, and the actual compiler frontend will be irrelevant to the AI runtime.


I think that's an illusion.

The language of R is S, which originated at Bell Labs in 01976. Python began development in 01989, although Guido didn't release it until 01991. And the top 20 on https://www.tiobe.com/tiobe-index/ are Python, C (01972?), C++ (01982?), Java, C# (01999? though arguably it's just a dialect of Java), JS, Visual Basic (first released 01991, within your window), Golang (02007), Delphi (under this name in 01995 but a dialect of Object Pascal from 01986, in turn a dialect of Pascal, from 01970), SQL (01973), Fortran (01957), Perl (01987), R, PHP (01995, within your window!), assembly (01947), Rust (02006), MATLAB/Octave (01984), Scratch (! 02003), Ada (01978?), and Kotlin (02011).

By decade, that's one language from the 40s, one language from the 50s, no languages from the 60s, 5 languages from the 70s, 5 languages from the 80s, 4 languages from the 90s, 3 languages from 0200x, one language from the 02010s, and no languages from the 02020s.

Lua is #33 on TIOBE's list, but given its prevalence in Roblox (as Luau), WoW, and other games, I suspect it should be much higher.

For some reason, CUDA (a dialect of C++) and shader languages like GLSL don't show up in the list at all.

— ⁂ —

I think most of what's going on here is that it takes a new language a long time to get good, and it takes a new good language a long time to get popular. Perl, Python, Java, PHP, and JS became popular because of the Web; https://philip.greenspun.com/panda/server-programming explains why Perl, Python, and PHP did, and of course Java and JS became popular because they were the only languages you could make interactive web pages in:

> You would think that picking a Web site development language would be trivial. Obviously the best languages are safe and incorporate powerful object systems. So let's do everything in Common Lisp or Java. Common Lisp can run interpreted as well as compiled, which makes it a more efficient language for developers. So Common Lisp should be the obvious winner of the Web server language wars. Yet nobody uses Common Lisp for server-side scripting. Is that because Java-the-hype-king has crushed it? No. In fact, to a first approximation, nobody uses Java for server-side scripting. Almost everyone is using simple interpreted languages such as Visual Basic, PHP, Perl, or Tcl.

> How could a lame string-oriented scripting language possibly compete in power with systems programming languages? Well, guess what? The only data type that you can write to a Web browser is a string. And all the information from the relational database management system on which you are relying comes back to the Web server program as strings. So maybe it doesn't matter whether your scripting language has an enfeebled type system.


Why do you write your years with a leading zero?


So you can instantly recognize at a glance that it's Kragen's post! ;)

It's a Long Now Foundation thing: slower, deeper, longer. Y10K compliance.

https://longnow.org/ideas/long-now-years-five-digit-dates-an...

"The present moment used to be the unimaginable future." -Steward Brand

"How can we invest in a future we know is structurally incapable of keeping faith with its past? The digital industries must shift from being the main source of society’s ever-shortening attention span to becoming a reliable guarantor of long-term perspective. We’ll know that shift has happened when programmers begin to anticipate the Year 10,000 Problem, and assign five digits instead of four to year dates. 01998 they’ll write, at first frivolously, then seriously." -Steward Brand

10,000 Year Clock:

https://longnow.org/clock/


I wonder if somebody offers a therapy for that.



Appreciated.


Some people think that writing years as 2025 is wrong because it will lead to problems in the year 9999 (the Y10K bug? I'm not sure if they call it that), so they decided to introduce a leading zero, as if it would solve something and not just postpone the problem to 99999.
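For illustration (my own sketch, not anything the five-digit convention itself prescribes), it amounts to zero-padding the year to five digits:

    /* Zero-pad years to five digits so dates keep sorting correctly
       past the year 9999 (pushing the problem out to 99999). */
    #include <stdio.h>

    int main(void) {
        printf("%05d\n", 2025);   /* prints 02025 */
        printf("%05d\n", 12025);  /* prints 12025 */
        return 0;
    }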


So they are assuming that:

- this comment will still be around in 8000 years

- we will still be using the same calendar system in 8000 years

- people 8000 years in the future will leave off the leading 1 of years for some reason, and will use a leading 0 to disambiguate dates from the previous 10000 year period.


I would say it is a symbolic reminder to care about the long term consequences of our actions. In the same way we have holidays to remind us about the environment or mortality.


Somehow both incredibly optimistic and also unbelievably resigned at the same time


That's silly. The y2k bug was because the year was written as 65, instead of the full year being 1965, so information was lost. Writing 2025 has no missing information.


I do agree with your point, but I also think that there is a lot of inertia in the sector (rightfully so!) and it is very difficult for languages to become established if they don't come with a "unique selling point" of sorts, which to me explains why new popular languages have become rarer.

That selling point, for Lua, is the super easy integration via the C API (making existing compiled applications scriptable), thanks to its uncomplicated, dependency-free build, the straightforward C API, and the ease of exposing C "modules" to Lua.
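To make that concrete, here is a minimal sketch of embedding Lua in a C host and exposing one C function to scripts (assuming Lua 5.x headers and linking with -llua; the greet function is just a made-up example):

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* C function made callable from Lua as greet(name). */
    static int l_greet(lua_State *L) {
        const char *name = luaL_checkstring(L, 1);  /* arg 1 must be a string */
        printf("Hello, %s!\n", name);
        return 0;                                   /* no Lua return values */
    }

    int main(void) {
        lua_State *L = luaL_newstate();     /* fresh interpreter state */
        luaL_openlibs(L);                   /* load the standard libraries */
        lua_register(L, "greet", l_greet);  /* expose the C function */
        luaL_dostring(L, "greet('embedded world')");
        lua_close(L);
        return 0;
    }

The whole round trip is a handful of calls, which is the point: no binding generator or build-system dance gets in the way.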

On a sidenote:

Don't you think that Y10k-safe dates are somewhat inconsistent with referencing previous decades directly? Those dates are also obnoxious to parse for humans (myself, at least).


>C# (01999? though arguably it's just a dialect of Java)

That's like saying Java is a dialect of C++. Java was specifically designed as a "fuck you" to C++, and C# was specifically designed as a "fuck you" to Java.


While at a political level that's reasonable, at both the semantic and the syntactic level, the first version of C# was very close to Java, much closer than the first version of Java was to C++. https://learn.microsoft.com/en-us/dotnet/csharp/whats-new/cs... is a very vague overview.


More like a better Objective-C, and with a syntax that was appealing to C++ developers.

https://cs.gmu.edu/~sean/stuff/java-objc.html

.NET was originally being designed around J++, Microsoft's Java extensions; the Cool research language only became C# and took over J++'s role in .NET because of Sun's lawsuit.

The lawsuit is more than well known, and the background on .NET's originally planned use of J++ is in the papers published by Don Syme of F# fame on the history of .NET and F# (HOPL).


Regarding C# and Java part of your comment, I think you might want to take a look at the following Wikipedia entries:

- Microsoft Java Virtual Machine: https://en.wikipedia.org/wiki/Microsoft_Java_Virtual_Machine

- Visual J++: https://en.wikipedia.org/wiki/Visual_J%2B%2B


I've known and worked with James Gosling for years before Java (Live Oak), on his earlier projects, Emacs at UniPress and NeWS at Sun, and fought alongside him against Sun management trying to make NeWS free in 1990 (and I left Sun because they broke the promises they made us and spilled a lot of blood), so I didn't need to learn about Java's history from Wikipedia.

James's email that convinced me to go work with him at Sun on NeWS in 1990:

https://news.ycombinator.com/item?id=22457490

James' original 1985 paper on SunDew (later called NeWS):

https://www.chilton-computing.org.uk/inf/literature/books/wm...

David Rosenthal on NeWS -vs- X11 in 2024:

https://www.theregister.com/2024/07/10/dshr_on_news_vs_x/

James Gosling on how he'd do it over again in 2002:

https://web.archive.org/web/20240126041327/https://hack.org/...

Me on the X-Windows Disaster, comparing X11 and NeWS in the 1994 Unix Haters Handbook:

https://donhopkins.medium.com/the-x-windows-disaster-128d398...

Here's a Stanford talk James Gosling gave about Java that I attended in 1995, where he talks about C++, his original tape copy program that turned into a satellite ground control system, how he holds the world record for writing the largest number of cheesy little extension languages to go, and his implementation of Emacs sold by UniPress (which RMS calls "Evil Software Hoarder Emacs"), and his design and implementation of NeWS (formerly SunDew), a PostScript based network extensible window system.

James Gosling - Sun Microsystems - Bringing Behavior to the Internet - 1995-12-1:

https://www.youtube.com/watch?v=dgrNeyuwA8k

>Video of James Gosling's historic talk about Java, "Bringing Behavior to the Internet", presented to Terry Winograd's user interface class at Stanford University, December 1, 1995.

In that talk I asked him a couple questions about security and the "optical illusion attack" that he hedged on (44:53, 1:00:35). (The optical illusion attack is when the attacker simply draws a picture of a "secure" pop up dialog from your bank asking for your password.)

He mentioned offhand how a lot of the command and control systems for Operation Desert Storm were written in PostScript. That was his NeWS dialect of PostScript, in a system called "LGATE" written primarily by Josh Siegel at LANL; Josh later came to work at Sun in 1990 and rewrote the NeWS PostScript interpreter himself, then went on to write an X11 window manager in PostScript, again proving James's point that people always did a lot more with his cheesy little extension languages than he ever expected (which also held true with Java).

Josh's work on simulating Desert Storm and WWIII with NeWS at LANL:

https://news.ycombinator.com/item?id=44540509

Some of Terry Winograd's other guest speakers:

https://news.ycombinator.com/item?id=39252103

I also saw Bill Joy's much earlier talk at the 1986 Sun Users Group in Washington DC, where he announced a hypothetical language he wanted to build called "C++++-=", and that he talked about in subsequent presentations.

I think that was the same talk when Bill said "You can't prove anything about a program written in C or FORTRAN. It's really just Peek and Poke with some syntactic sugar". More Bill Joy quotes:

https://www.donhopkins.com/home/catalog/unix-haters/slowlari...

James eventually realized that concept as Java, showing that the kernel of the inspiration for writing a "fuck you to C++" language existed long before James invented "Live Oak", even soon after C++ was invented. But "Java" was a much better name than "Live Oak" or "C++++-=", fortunately -- thanks to Kim Polese -- though not as succinct and musically inspired as "C#"!

https://en.wikipedia.org/wiki/Bill_Joy#Joy's_law

https://news.ycombinator.com/item?id=30113944

Bill Joy’s Law: 2^(Year-1984) Million Instructions per Second

https://donhopkins.medium.com/bill-joys-law-2-year-1984-mill...

>The peak computer speed doubles each year and thus is given by a simple function of time. Specifically, S = 2^(Year-1984), in which S is the peak computer speed attained during each year, expressed in MIPS. -Wikipedia, Joy’s law (computing)
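As a quick sanity check of that formula (my own arithmetic, not from the talk or the article):

    /* Joy's law, S = 2^(Year - 1984) MIPS, tabulated for a few years. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        for (int year = 1984; year <= 1992; year += 4)
            printf("%d: %.0f MIPS\n", year, pow(2.0, year - 1984));
        return 0;  /* prints 1984: 1, 1988: 16, 1992: 256 */
    }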

>“C++++-= is the new language that is a little more than C++ and a lot less.” -Bill Joy

>In this talk from 1991, Bill Joy predicts a new hypothetical language that he calls “C++++-=”, which adds some things to C++, and takes away some other things.

>“Java is C++ without the guns, knives, and clubs.” -James Gosling


I think alasr meant to suggest that you might learn more about the history of C# by reading Wikipedia, not about the history of Java.


True



