From my personal experience producing "enterprise" code for the past 20 years, in languages starting with Fortran and ending up with VB.net, with Python/C/C++/Visual C++/Delphi etc. in between, the language/stack choice is orthogonal to the quality of the code base.
You can find wonderful code in VB.net and pure crap in Ruby, and vice-versa. It's the company/coder culture that influences the end result the most. Blaming the tools here is, to me, the mark of a lack of experience in both programming and long-term software development.
> the language/stack choice is orthogonal to the quality of the code base
technically true, but in reality there's going to be a correlation between the technology stack used and development practices. whether that's Javascript/Swift at super-agile-brogrammer $startup or Java/.NET at Enterprise Corp, Inc., stereotypes exist for a reason.
here's what you can do: evaluate how much you hate an average/mediocre/horrible workflow in said language (for me, I hate anything Javascript, no matter how smart/great the devs are; for some people it's the verbosity of Java, etc).
then, go work for/with people you've worked with before. i have yet to find another way of evaluating company culture truthfully. (interviews just don't cut it, too much potential to lie.)
> interviews just don't cut it, too much potential to lie.
That, and an interview is a handful of hours at most. There's no way to really evaluate a culture in that amount of time, even if nobody is hiding anything. The real problems, the frustrating politics, etc. only become apparent after you've been "in it" a while, I think.
It is all about making developers cogs. The language and frameworks are only useful insofar as they make you more replaceable. Don't ever do anything clever or advanced, even if it makes the company a ton of money or boosts productivity, because the next cheap grad they hire to replace you might not be able to figure it out, seeing as he only knows Java or .NET.
At my job, they have started trying to convince us that "fungibility" is something that we should aspire to. As though we can't see what that means.
I'm not against good documentation and clean code that makes it easier on the next developer, but this is taking things to an extreme.
> In the end though, it's absolutely about culture, and Enterprise IT will suffocate you no matter what language.
The one thing I really worry about is being stuck in the MS stack for years and not learning as much as I can (or want to).
I still ~really~ dislike the term "IT" or "IT Dept" - since the majority of the time you are seen as some guy who works on computers or can fix your problem.
Don't worry so much about the quantity (learning as much as you can) as about the quality. You could basically kill yourself trying to learn every modern piece of technology. It's not really worth it -- it'll be obsolete before you get a chance to actually use that knowledge.
It's one thing to be stuck with terrible, stagnant technology, but .NET is a good stack that is progressing nicely. Conceptually you'll be fine.
You will always be stuck to a certain stack, especially in countries like Germany, where what the HR department cares about is which languages/tooling you used in the most recent years before the interview.
Usually knowledge learned on side projects isn't taken seriously by them, as it wasn't on the job. Oh, and they love paper (certifications of all sorts).
Algorithms and data structures always transfer to any stack. Learn their mathematical CS foundations.
A good way to avoid being stuck with a certain stack, as you put it, is working for a consulting company. It is way easier to jump stacks when switching projects than trying to convince a random HR department of what you actually know.
This is the main issue; I have been seeing it since enterprise applications were written in xBase, Turbo Pascal and C.
In the Enterprise, politics and developers being used as cogs trump any kind of technology stack one can think of.