Know Your Language: The Slow Flickering Star-Death of Java (Part One)
Image: Casey Reed/NASA

How and why Java rules object-oriented programming, and why it's the beginning of the once-crucial language's long, long winding down.

Java is at a turning point, it seems. According to a former Oracle insider quoted by InfoWorld, the future of Java at the language's corporate parent is "planned obsolescence"—an indeterminate winding down.

At the same time, the company has reportedly canned some of its "Java evangelists." These are (or were) the Oracle staffers tasked with promoting the language to the software community: technically skilled marketers, essentially.

And so many are speculating that it's the beginning of the end for what is one of the two or three most foundational and widely used languages out there—a doom that won't actually surprise very many people. Of course, the "end" for Java is sort of like the death of a star. It will heave and dim over the course of billions of years (which is tens of years at programming time-scales), only to end up a slowly burning white dwarf, a warm, compact sphere of legacy enterprise applications.

Java's supernova days are over.

Despite its stature as the most popular programming language—a hair above C and C++, according to a recent IEEE report—I rarely use Java directly. It's an immensely powerful and important language, and pretty much unavoidable in real-world programming, but it's also uniquely un-fun.

The Java virtual machine

Here is the most important thing about Java: the Java Virtual Machine.

A common Java pitch is that its programs are portable. This has to do with its "virtual machine." If I write a program in C or C++, my next step is going to be feeding it into a compiler, which will turn my human-readable code into machine code. Java adds another layer in between, which is the JVM. So, instead of compiling programs into machine code corresponding to the actual physical machine, Java compiles code into bytecode that runs on the virtual machine, which is itself software.

The JVM connects to the actual computing hardware via an interpreter. Different operating systems have different interpreters written for them. The bytecode produced by the Java compiler and fed to the JVM is eventually converted into a given machine's "native" machine language, where it becomes just like any non-JVM language: instructions that tell a physical machine what to do and when.

This has advantages, and portability is the big one. A program written in Java is cross-platform: it will produce the same bytecode on whatever operating system, with the differentiation arriving during the interpretation stage. The same bytecode does not produce the same assembly code, in other words.

Another advantage of this scheme is known as "garbage collection." A programmer in Java generally doesn't deal directly with physical computer memory. This is automatically handled further down the line. The upshot is that the programmer doesn't have to worry about allocating and deallocating that memory, which in C and C++ can be a headache to end all headaches. Not dealing directly with memory is also considered to be a safer way of doing things—no buffer overruns, etc. A programmer is less likely to make something that's gonna crash.
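
To get a feel for what that means in practice, here's a minimal, purely illustrative sketch (the class and variable names are made up): objects are allocated with new and simply abandoned, and the JVM's garbage collector reclaims them on its own schedule.

    import java.util.ArrayList;
    import java.util.List;

    public class GarbageDemo {
        public static void main(String[] args) {
            // Allocate freely; Java has no free() or delete.
            for (int i = 0; i < 1_000_000; i++) {
                List<String> scratch = new ArrayList<>();
                scratch.add("temporary value " + i);
                // When 'scratch' falls out of scope, nothing references the
                // list anymore, and the garbage collector can reclaim it.
            }
            System.out.println("Done; the JVM cleaned up after us the whole time.");
        }
    }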

That all sounds pretty great, right? So why isn't every programming language handled in the same way? The answer is that extra layer. Running a Java program tends to be slower because of this whole extra step, and it's a lot harder to optimize in Java because the language kind of just plays dumb when it comes to the physical machine. Abstraction in computing—simply defined as the degrees of separation between code and hardware—is never free.

A touch of class

I learned Java in a class on object-oriented programming, a highly dominant coding worldview in which the notion of modularity is worshiped above all. There is no other language more attuned to the object-oriented paradigm than Java—the language exists as a vast sea of small, discrete code units, all of which are available from the Java API and all of which are immaculately documented within that API. It's sort of like buying (or taking) code from Ikea: The pieces and materials are all already there, ready and packaged together. There's no need for saws and drills and measuring tape. Instead, there's just a blank white box with a reassuring rattle inside.

We say that Java has a rich standard library, which means that instead of combing through a bunch of random, semi-functioning libraries stocked with different implementations of common (or not-so-common) functions, a programmer has a highly standardized, centralized source. This can be a streamlined and reassuring way to go about assembling software from pre-built objects.

First, what do we really mean by "object"? This is kind of tricky because it's a literal, technical way to build things, but it's also a way to interpret programming itself—again, a worldview. In this worldview, nothing is really ever only a part of something else. It's a self-contained, clearly defined, and self-consistent unit that may be combined with other discrete units in useful ways. An object isn't created for one specific program; it's a thing that might be used in any number of different programs.

To make things complicated, strictly speaking an object could be anything. It could be a data type like an integer or array, a method (also known as a function or procedure) that operates on data in a specific way, a class that collects these data types and methods together in logical ways, or the program itself. The overall idea is encapsulation.

Object-oriented programming centers around one particular and highly amorphous sort of object called a class. In class-based object-oriented programming, an object is a self-contained unit made of different variables, functions, structures, classes, and/or data structures. The idea is that all of this stuff can be defined in some external file that can be stashed away out of sight, but its functionalities can be invoked by name in the current file. It's like linking on the internet.

There's nothing keeping a programmer from writing a randomShit object that's just full of useless unrelated garbage.

Ideally, all of the various pieces within a class-object are related to each other and support each other, but there's nothing keeping a programmer from writing a randomShit object that's just full of useless unrelated garbage. This is true of any object-oriented language.

Part of the aim of an object is to keep code from being rewritten again and again. It's axiomatic in programming that once a programmer begins to cut and paste code, they're doing something wrong. Instead of rewriting that code again and again (or copying and pasting it), resulting in a bunch of extra work and the promise of maintenance hassles down the line, it can be packaged up and hidden away somewhere. The result is an abstraction that can be invoked again and again where needed.
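
As a made-up sketch of that idea (the names here are invented for illustration), the greeting logic below is written exactly once as a method and then invoked by name wherever it's needed, rather than pasted into three different places.

    public class Greeter {
        // The string-building logic lives in exactly one place.
        static String greeting(String name) {
            return "Hello, " + name + "!";
        }

        public static void main(String[] args) {
            // Invoke the abstraction wherever it's needed.
            System.out.println(greeting("Ada"));
            System.out.println(greeting("Grace"));
            System.out.println(greeting("Linus"));
        }
    }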

That's sort of the canonical object idea, where an object is manifested as a construct known as a class. A class defined in a file in some code library somewhere can be accessed by name in the current program, and the stuff inside of it (functions, variables, data structures) can then be accessed in a number of different ways. Once instantiated in a program, a class yields an instance of that class: an abstraction brought to life within a specific context.

This is surely easier to understand with an example. Say my program needs to have a list of related values: a bunch of names, each associated with an ID number. The data structure for this is known as a map, which is just a big list where each item of the list has two values associated with it. One is the key and the other is the value associated with the key. If I want to access a value within a map structure, I reference it using its key, which is unique among all items in the map. A map contains key-value pairs.

There are a lot of ways to implement this with ground-up code and it wouldn't be all that hard, but I don't need to do that nor should I do that (if we're all implementing our maps differently, things will get ugly eventually). So, instead, I head over to the Java API and search for "map." There it is. Cool.

To use all of the wonderful tools provided by map, I just need to call them by name and tell the program that they're part of map. If I want to add a new item to my map (a concrete instance of the generic map class from the Java API), I can just use the class's built-in "put" function. I don't need to create the put function myself, nor do I even need to know how it works. Just that it does.
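
Concretely, a sketch of that names-and-IDs roster might look something like this, using the HashMap implementation of Map from the Java API (the IDs and names are made up):

    import java.util.HashMap;
    import java.util.Map;

    public class Roster {
        public static void main(String[] args) {
            // Keys are ID numbers; values are the names associated with them.
            Map<Integer, String> roster = new HashMap<>();

            // The built-in put method adds a key-value pair. I don't need to
            // know how it works internally, just that it does.
            roster.put(1001, "Ada Lovelace");
            roster.put(1002, "Grace Hopper");

            // Values are looked up by their unique keys.
            System.out.println(roster.get(1002)); // prints "Grace Hopper"
        }
    }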

That's the whole idea. Everyone uses the same put function for the same map class. (It's possible to write a new map class and a new put function in Java, just as in any other language, but that really defeats the purpose.) Lots of languages do almost this same thing, but Java takes it to the extreme. Part of this is just how "built in" objects are to Java. As a result, Java programs are very modular and also relatively easy to write. They're pretty much already written in pseudocode.

Hello, World
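
The canonical first Java program looks something like this:

    public class HelloWorld {
        public static void main(String[] args) {
            System.out.println("Hello, World");
        }
    }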

Here, "Hello, World" is implemented as its own class. This is an object that would be included within a program to be used by that program. Within the class is a function, which is another object, and within the "main" function-object, is an actual instruction, "println." This instruction is a member of another class called System, which is an object found within the Java API containing several tools and assorted sub-objects primarily used for input-output operations. Objects in objects in objects in objects. Objects all the way down. It has a certain elegance.

The Java Applet

Java isn't really considered to be a web language (now), but it paved the way for browser-based programming via the Applet. This is essentially just a tiny Java program meant to be run in a web browser via a plug-in. For a long time, the Java Applet was the only game in town for web pages needing to do cool, dynamic things; once upon a time, even a simple rollover effect would require Java (or later Flash). With JavaScript and CSS, things began to fall off for the Applet.
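
For the historically curious, a bare-bones Applet looked something like the sketch below, built on the java.applet API (long since deprecated in modern Java); the browser plug-in would instantiate the class and call paint() whenever the embedded region needed redrawing.

    import java.applet.Applet;
    import java.awt.Graphics;

    // Strictly a historical sketch: the java.applet API is deprecated
    // in modern Java releases.
    public class HelloApplet extends Applet {
        @Override
        public void paint(Graphics g) {
            g.drawString("Hello from an Applet", 20, 20);
        }
    }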

Even as interactivity capabilities materialized elsewhere, Applets could still offer some performance advantages for the simple reason that they had hardware access beyond the browser itself via the Virtual Machine. This changed around 2011 with the broad adoption of the HTML canvas element, which enabled JavaScript to catch up to Java Applets performance-wise. Meanwhile, the long-fought adoption of CSS and its brave new UI world had begun to become a reality by the late 2000s.

Nowadays, using an Applet in a webpage is a good way to get yelled at.

In part II: The early-days internet origins of Java; why people hate Java.

Read more Know Your Language.