A former colleague wrote about something I occasionally think about. In my view the post touches on at least three areas that could result in a holy war ("The need for bare metal programming skills", "The need for strong concurrent programming skills", "The right approach to concurrent programming"). What fascinates me is that I am not aware of a coherent narrative for at least some of them.
To begin with, the notion of bare metal is a blurred and moving target. From Cliff Click's detailed discussions to Joshua Bloch's higher-level recommendations, there seems to be a widespread belief that contemporary hardware is virtually incomprehensible to most programmers. There are too many moving parts and levels of indirection to build a mental model good enough for reliable predictions. And it's not just about high-level Java-like platforms.
As the history of computing demonstrates, hardware progress enables programming in terms of higher-level abstractions. And that pushes the border with bare metal higher over time. It's true for programming languages (consider how VM-based or interpreted/dynamic languages became practical in the last ten years) and for particular sub-fields such as concurrent programming (think about the recent surge of interest in Actors and STM).
Concurrent programming is a similarly gray area, also because of the strong influence of hardware. On the one hand, shared memory and actors are just models and can be used to solve the same problems. On the other, software runs on real hardware, and commodity hardware nowadays means you start from the XCHG instruction and layer levels of abstraction on top of it. Be it Clojure or Scala, you are looking at java.util.concurrent (and, ultimately, the very same CPU-level support for CAS) in disguise.
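To make that layering concrete, here is a minimal sketch in plain Java (the `CasCounter` class is mine, purely for illustration) of the compare-and-set retry loop that sits under higher-level constructs such as Clojure's atoms. On x86 the JIT compiles the `compareAndSet` call down to a single `LOCK CMPXCHG` instruction:

```java
import java.util.concurrent.atomic.AtomicLong;

// A lock-free counter built directly on CAS. AtomicLong.incrementAndGet
// would do the same thing; the explicit loop just makes the mechanism visible.
public class CasCounter {
    private final AtomicLong value = new AtomicLong();

    public long increment() {
        while (true) {
            long current = value.get();
            long next = current + 1;
            // Succeeds only if no other thread changed the value
            // since we read it; otherwise we retry with a fresh read.
            if (value.compareAndSet(current, next)) {
                return next;
            }
        }
    }
}
```

However functional the surface syntax looks, every retry in that loop is the shared-memory model showing through.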
So with time some indirection levels become so low-level that only very few people have the time, need, or desire to look at them. Today java.util.concurrent is the cornerstone, and very few people bother even to look inside (for example, compare the number of people intimately familiar with JCiP with those fluent in AoMP). In a few years it may well be Actors, if not STM/HTM. And then knowledge of j.u.c will count as low-level black magic. It's not easy to see when such a transition happens and adjust one's definition of "bare" or "low-level" accordingly.
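As an illustration of j.u.c-as-cornerstone, here is a sketch of the Memoizer idiom popularized by JCiP: a correct concurrent cache assembled entirely from j.u.c parts treated as black boxes (the names are adapted, and the full JCiP version also handles cancellation):

```java
import java.util.concurrent.*;

// A concurrent, compute-at-most-once cache. Correctness rests entirely on
// the documented contracts of ConcurrentHashMap and FutureTask; knowing
// their internals is optional, knowing their contracts is not.
public class Memoizer<A, V> {
    private final ConcurrentMap<A, Future<V>> cache =
            new ConcurrentHashMap<A, Future<V>>();

    public V compute(final A arg, final Callable<V> function)
            throws InterruptedException, ExecutionException {
        Future<V> f = cache.get(arg);
        if (f == null) {
            FutureTask<V> ft = new FutureTask<V>(function);
            f = cache.putIfAbsent(arg, ft);  // atomic check-then-act
            if (f == null) {
                f = ft;
                ft.run();                    // only the winning thread computes
            }
        }
        return f.get();                      // everyone else blocks for the result
    }
}
```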
In addition, there are entire domains such as big data. People in the map-reduce land think in terms of [dozens of] servers, not threads. Actually, even in mainstream software not many are lucky enough to work with j.u.c-level abstractions, and some explicitly prefer even higher-level ones.
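For a taste of that mindset, here is a sketch of the canonical word-count job against Hadoop's org.apache.hadoop.mapreduce API. Note that nothing in it mentions threads or locks; the framework decides how many machines and task slots run this code:

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// The programmer thinks in terms of keys flowing from map to reduce
// across a cluster; concurrency is entirely the framework's business.
public class WordCount {
    public static class Map
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context ctx)
                throws IOException, InterruptedException {
            for (String token : line.toString().split("\\s+")) {
                if (token.isEmpty()) continue;
                word.set(token);
                ctx.write(word, ONE);            // emit (word, 1)
            }
        }
    }

    public static class Reduce
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text word, Iterable<IntWritable> counts, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable c : counts) {
                sum += c.get();
            }
            ctx.write(word, new IntWritable(sum)); // total per word
        }
    }
}
```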
But in general, my experience confirms that serious usage of any technology implies a comprehensive understanding of its design and some of its implementation details. As an example, you do not need to know about the GC to program in Java, but if you actually do not, you are probably working on something trivial.
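A contrived sketch of what that looks like in practice (the class name is made up): the loop below "works" with no GC knowledge at all, yet each iteration re-boxes the accumulator and produces millions of short-lived Long objects. Running it with -verbose:gc makes the churn visible, and switching the accumulator to a primitive long removes it entirely:

```java
// Compiles and runs fine; the problem only shows up as GC pressure.
public class GcChurn {
    public static void main(String[] args) {
        Long boxedSum = 0L;              // boxed accumulator
        for (long i = 0; i < 50000000L; i++) {
            boxedSum += i;               // unbox, add, re-box -> garbage on every pass
        }
        System.out.println(boxedSum);
    }
}
```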
For complicated technologies this can be hard and take time, and so one is necessarily limited and cannot be familiar with everything even within a particular language universe (just think about Java, from J2ME to Hadoop, "and still counting"). At least in this day and age, most software comes with source code, so you can always dig deeper to learn the details.