The future of Java.

Soldato
Joined
23 Dec 2010
Posts
3,483
Hey guys,

Well I'm coming up to the end of my first year and we've all been asked to look at the modules that we can pick next year.

This is a pretty important thread, so no "PYTHON IS BETTER HERP DERP" please.

I remember reading in 2009 that 'experts' were saying that Java was coming to the end of its lifespan.

But from what I've seen and read, Java has grown in the past 4 years as ubiquitous computing has become a necessity.

I know that HTML5 is starting to come into its own with multi-platform application development - but I just want to know whether it's still worth going through with Java, or should I go with something different like C/C++, Python etc.?

Thanks.
 
Associate
Joined
10 Dec 2012
Posts
254
I finished my Comp Sci degree 2 years ago and it was taught almost exclusively in Java. It depends almost entirely on what career you wish to pursue within the IT industry... I have found myself in a software testing job having to know quite a bit about VB6 and C#... neither of which I was taught at uni.

Your course will teach you the basics of structuring classes and developing programs from a professional standpoint, and in my opinion that's the thing you should focus on while at university. The programming itself you can teach yourself; each of the languages has its own unique way of doing things, but they can all do the same thing at the end of the day. It's just syntax. Having learnt Java, I have found VB6 and C# very easy to pick up, and I can read C and C++ without much problem. It's all logic at the end of the day. What you can't teach yourself is the importance of well-structured code and good coding practices, because you won't ever learn them from books or code snippets online.

I have just noticed that you are located at Aberystwyth... I graduated from there 2 years ago. Good uni, but they teach most of the programming modules in Java. This is a good language to keep with. I now prefer C#, but that's because I can never be bothered with doing the GUI stuff. Learning Java will put you in good stead for understanding all of the other languages. But I cannot emphasise enough just how important the rest of the stuff they teach you is. Having spent a year looking at code written by someone without a university degree, I can point out glaring errors that someone who had been formally trained would not make.

The other thing that I can recommend is to take the debugging seriously... they drill it into you and you may think it's dull... but my god does it make programming easier...

I hope this helps.
 
Soldato
Joined
30 Jan 2007
Posts
15,466
Location
PA, USA (Orig UK)
As a Java dev myself... looking at job adverts, it's Java + C# + .NET (and then things like scripting languages as well). Do some coding in other languages too, as it will help your understanding of programming more generally.

If learning Java, then learn things like Hibernate and Spring as well.

Agree on the debugger as well. I always tried to avoid it and use printlns etc., but it's saved me a lot of time in the past, even though it can be a pain to track down bugs with at times.
 
Soldato
Joined
9 May 2005
Posts
4,528
Location
Nottingham
I'm also a Java developer and I would say it's in quite a strong position at the moment. You will probably find it being used more in enterprise development as back-end infrastructure rather than in desktop applications, though. As Scougar mentioned, things like Hibernate are also very valuable, as the concepts it uses translate into many other programming languages. The .NET version, NHibernate, is based on the same code, and the knowledge from Java would translate easily to C#.
 
Associate
Joined
28 Jan 2013
Posts
236
Oh no. Java is so early 90s. Still no lambda expressions; functions are second-class citizens. The good devs have long moved to functional languages. Some of those compile to Java bytecode (e.g. Scala, Clojure), so they can use all Java libs, compile to the Android Dalvik VM, etc.

Seriously, one line of Clojure does as much work as 10+ lines of Java. Functional languages are way better both on the client (easy to use all cores without error-prone and hard-to-scale locks) and on the server (RESTful services and HATEOAS are very functional in nature).
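For a sense of the verbosity gap being described, here's a minimal sketch (names illustrative, not from any post in the thread): the same filter/map/reduce written first as the explicit loop Java required at the time, then with the lambda syntax that was, when this thread was written, still only scheduled for Java 8:

```java
import java.util.Arrays;
import java.util.List;

public class LambdaSketch {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5);

        // Pre-Java-8 style: an explicit loop over mutable state,
        // summing the squares of the even numbers.
        int sum = 0;
        for (int n : nums) {
            if (n % 2 == 0) {
                sum += n * n;
            }
        }

        // Java 8 style: the same filter/map/reduce as one expression.
        int sum8 = nums.stream()
                       .filter(n -> n % 2 == 0)
                       .map(n -> n * n)
                       .reduce(0, Integer::sum);

        System.out.println(sum);   // 20
        System.out.println(sum8);  // 20
    }
}
```

The second form is the sort of thing Clojure and Scala already offered in one line; Java eventually closed much of that gap, but not Java 6/7 as discussed here.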
 
Soldato
Joined
17 Oct 2002
Posts
4,308
Location
Bristol
I would agree with the above and also think Java is headed for a sharp decline, but I also think there is great value in learning it: its successor(s) are likely to run on the JVM, thus sitting on its libraries and leveraging a lot of its current tooling that works well.

I did Java at uni and for the last 10 years since. I now work for a very cutting-edge software house within Nokia and have written only minimal Java for about a year; we've moved to nearly all Clojure now, and what a revelation it's been.

However, there is a lot to be learned from doing Java: not a day goes by where I don't code some interop stuff with it or write a macro for some Java utility. Then there is the enterprise stack associated with Java (Maven, JBoss, etc.); it's massively far-reaching, and a lot of companies will probably continue to use it even with a functional adoption.

But yes, unless something massively changes in the hardware world, functional is the only real logical path: more cores and more concurrency push devs closer to the land of Lisps, where it's handled almost transparently.
 
Caporegime
Joined
18 Oct 2002
Posts
29,491
Location
Back in East London
Meh. In ten years everyone will have had enough of functional languages. There's verbose code, there's declarative code, there's terse code, there's imperative code, and then there is what usually results from functional programming, which is not easy to pick up and/or maintain.

Everyone got tired of C/C++ and how easy it was to be too imperative, so Java and the like gained popularity because they promote abstraction and encourage a declarative style. Then everyone got tired of too much abstraction and verbosity, so the pendulum swung to the opposite extreme and Clojure and the like gained momentum.

Meanwhile the skilled developers use whatever tool fits the job and write code that sits near the middle: declarative yet terse.

Which language someone can write is not rated that highly compared to their ability to communicate with other developers and customers and to demonstrate their understanding of the fundamentals of programming and software design.
 
Soldato
Joined
17 Oct 2002
Posts
4,308
Location
Bristol
While I do agree there is an element of "flavour of the month" with functional at the moment, I feel there are also real technical underpinnings driving the current adoption, above and beyond syntax or "cool" factor or whatever.

In this age of multi-core CPUs and "Big Data", purity, immutability and things like lazy seqs just make so much sense. And tbh I don't think most skilled developers use whatever fits the job; they have to use what the company says they can... and a few companies are starting to go big on functional adoption.
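The "immutability plus cores" point can be sketched even in plain Java (a minimal illustration, not anyone's production code): because nothing in the pipeline is mutated, the runtime is free to split the range across cores and combine partial results, with no locks or shared mutable state.

```java
import java.util.stream.LongStream;

public class ParallelSum {
    public static void main(String[] args) {
        // Sum of squares of 1..1_000_000. Each core reduces its own chunk
        // of the range independently; the partial sums are then combined.
        long sum = LongStream.rangeClosed(1, 1_000_000)
                             .parallel()
                             .map(n -> n * n)
                             .sum();

        // Matches the closed form n(n+1)(2n+1)/6.
        System.out.println(sum == 1_000_000L * 1_000_001L * 2_000_001L / 6);
    }
}
```

Functional languages like Clojure take the same idea further by making immutability the default rather than a discipline the programmer has to maintain.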
 
Caporegime
Joined
18 Oct 2002
Posts
29,491
Location
Back in East London
Companies that are doing it right let their developers choose their own tools. :)

Don't get me wrong. Functional languages have their place, but like any popular thing they will get misused and then blamed for that misuse. Much like Agile is really starting to get flak from a lot of people who blame it for making things difficult, without actually understanding what Agile does for them and requires from them to be successful.

Of course joining an existing project usually means the choice has already been made.
 
Caporegime
Joined
18 Oct 2002
Posts
32,623
Thing is, functional languages are nothing new; they've been around for decades and have just come back into vogue at the moment.

One language that is coming back after a small decline is C++. Computers may be getting faster, but at slower rates than before, and data requirements are getting larger and larger, much faster than CPU increases; languages like Java and C#, let alone anything functional, are proving too slow in many applications. Our company almost exclusively uses C++, with some Fortran for back-end number crunching and some Python for scripting.


But anyway, a software engineer should never know only 1 or 2 languages, and the languages they know should be of little importance compared to their ability and understanding. Our company has no requirements for language experience; we hire based on ability, which normally means Maths majors get hired far more frequently than CS majors!
 
Associate
Joined
20 Mar 2013
Posts
813
Location
London
It all depends on which industry sectors you want to work in, e.g. banking, Big Data, real-time embedded systems, new media, etc.

Big Data is quite big on Python, R and C++, whereas new media favours Java and HTML5 skills. Big companies will use a variety of languages across their business, ranging from Java to COBOL.
 
Soldato
Joined
18 Oct 2002
Posts
3,926
Location
SW London
Companies that are doing it right let their developers choose their own tools. :)

I'm pretty sure I've discussed this with you before, but that's easier said than done for large companies. Where I am now there are masses of infrastructure for certain frameworks, so we're pretty much restricted to what is provisioned there. That's not even counting the effort involved with legal and compliance to introduce anything new!

Having said that, the fact that we have things like Clojure and Scala on the JVM, and F# on the .NET framework, makes it easier to adopt some new stuff. I certainly don't think Java is going anywhere fast, as there are many large companies where ditching Java for the latest new thing is just far too expensive.

We are actually using some Scala at my current place (thanks to its use of the JVM and easy integration with the rest of the Java infrastructure, as said above), and I certainly think learning some functional concepts will benefit any developer. It's not a panacea, as others have said, but it's always good to have another tool in your belt.
 
Caporegime
Joined
18 Oct 2002
Posts
29,491
Location
Back in East London
I fully agree. The choice is not just about technical merit. However, it should really be the developer(s) that make the choice, and the fact that the company has invested in a particular stack is a factor in that decision. Often the primary factor.
 