Recently I read an article asking, "Where are the software engineers of tomorrow?"
Written by a President and Vice-President of the company AdaCore, it laments the spread of Java as the first programming language taught on many Computer Science courses, and a reduction in the formal and theoretical grounding that students are given.
While I can't say I agree with every word they say (for example, they appear somewhat dismissive of dynamic languages), they certainly have a lot of good points. You certainly don't learn to be a good programmer, developer or architect by learning one or two programming languages. To do that you need to understand the whole stack of abstractions from top to bottom, be aware of many programming paradigms, and understand (or at least be aware of the existence of) the theory that explains it all.
It's important to know what assembly language is and have a rough idea of what the hardware is doing. There's no need to be proficient in the assembly for any one CPU, though seeing the trade-offs between different architectures is useful. Layered on top of that are languages like C: decidedly high-level, but still pretty close to the metal. I'm not saying that people need to be taught C, but there are some things that people really should understand, especially explicit memory management and pointers. For both assembly and C, people graduating from a computer science course should be able to grab the appropriate reference material and use either of them effectively, because they understand the underlying concepts.
Above this level, we have languages that either sit atop virtual machines or run with substantial runtimes. These languages, free of low-level issues, are good for teaching different paradigms, illustrating algorithms and exploring software engineering issues. No one language should be taught exclusively here, for a simple reason: no one language exemplifies all paradigms and approaches. Programming languages are tools, and they are suited to different jobs.
Java is useful for teaching object oriented programming and reflective programming, and perhaps generic programming too. However, it has nothing to offer with regard to functional and higher-order programming, and only shows one approach to typing: static typing with explicit type annotations. If Java's capabilities are all a programmer has ever seen on their university course, then they are under-equipped to deal with C# 3.0, which offers both higher-order programming and limited type inferencing. Someone who has seen a range of languages and paradigms, however, will be able to use C# 3.0 to good effect, rather than writing it like they would write Java and complaining about how anonymous methods and lambda expressions are damaging what was once a "pure object oriented language". (Also, anyone who says that doesn't know what pure object orientation is; neither C# nor Java has ever been a pure object oriented language.)
So far I've just talked about programming, because it's what most computer scientists are recruited to do. However, ideally the ability to program should be a natural by-product of an understanding of computer science and software engineering. That is, programming should not seem that big a deal once you understand the concepts behind hardware architectures, know a range of data structures and algorithms, and grasp fundamental issues such as algorithmic complexity (and sorry, but if your computer science course didn't explain at least that, the word "science" deserves no place in the course name). To build real world systems takes an understanding of managing software complexity, safety and security engineering, and a range of concepts such as concurrency, databases and so on - all things that should have a place on a computer science course.
If there's any one word that should describe a computer science course, it would probably be "holistic". Being equipped to deal with the challenges the computing field faces requires a wide skill set, encompassing mathematics, electronics, engineering, psychology and more. And it involves taking the word "science" seriously and teaching concepts, rather than bringing up a bunch of code monkeys who are ill-equipped to deal with whatever lies ten years down the line, because it's not Java or PHP.