(Computer-programming) language wars a bit silly, but not irrational
I don't know where I first heard it (it was probably not first hand), but someone once observed how weird it is that, in the 21st century, computer professionals segregate by the language they use to talk to the machine. It does seem silly, doesn't it?
Programming language discussions (R vs. Python for data science, C++ vs. Python for computer vision, Java vs. C# vs. Ruby for webapps, ...) are a staple of geekdom and easy to dismiss as silly. In this short post, I'll argue that, while silly, they are not completely irrational.
Programming languages are mostly about tooling
Some languages are better than others, but most of what matters is not whether the language itself is any good, but how large the ecosystem around it is. You can have a perfect language, but if there is no support for it in your favorite editor/IDE, and no good HTTP libraries that can handle HTTP/2, then working in it will be less efficient and even less pleasant than working in Java. On the other hand, PHP is a terrible, terrible language, but its ecosystem is (for its limited domain) very nice. R is a slightly less terrible version of this: not a great language, but a lot of nice libraries and a good culture of documentation.
Haskell is a pretty nice programming language, but working in it got much nicer once stack appeared on the scene. The language is the same, even the set of libraries is the same, but having a better way to install packages is enough to fundamentally change your experience.
On the other hand, Haskell is (still?) enough of a niche language that nobody has yet written a tool comparable to ccache in the C/C++ world (near-instantaneous rebuilds are amazing for a compiled language).
The value of your code increases if you program in a popular language
This is not strictly true: if the work is self-contained, it may be very useful on its own even if you wrote it in COBOL. But often, the more people who can build upon your work, the more valuable that work is. So if your work is written in C or Python as opposed to Haskell or Ada, everything else being equal, it will be more valuable (not that everything else is equal, though).
This is somewhat field-dependent. Knowing R is great if you're a bioinformatician, but almost useless if you're writing webserver code. Even general-purpose languages get niches based on history and tools. Functional programming languages somehow seem to be more popular in the financial sector than in other fields (R has a lot of functional elements, but is not typically thought of as a functional language, probably because functional languages are "advanced" and R is "for beginners").
Still, a language that is popular in its field will make your own code more valuable. Packages upon which you depend will be more likely to be maintained, tools will improve. If you release a package yourself, it will be more used (and, if you are in science, maybe even cited).
Changing languages is easy, but costly
Any decent programmer can "pick up" a new language in a few days. I can probably even debug code in a procedural language I have never seen before. However, to really become proficient often takes much longer: you need to encounter and internalize the most natural ways to do things in the new language, learn the quirks of the interpreter/compiler, get to know the libraries and tools, &c. None of this is "hard", but it all takes a long time.
Programming languages have network effects
This is all a different way of saying that programming languages have network effects. Thus, if I use language X, it is generally better for me if others use it too. It is rarely made explicit, but I think this is the underlying rationale for programming language discussions.