On Tue, Feb 22, 2011 at 10:52 AM, Ndungi Kyalo
<ndungi@gmail.com> wrote:
This 'firm base of fundamentals' is, ironically, composed of elaborate examples drawn from the very contemporary fields you have listed - or else you would like them to be inferred from terse mathematical formulae written in the alphabet of a strange tongue.
Ideally, they should be based on what's on the market... Ideally...
To use a more practical example: suppose you had spent three months of your core curriculum at university learning how to code for the Symbian platform, because it was the most widely used phone platform. Relevance, right? No basic concepts... That semester would have been rather useless, given the move to Microsoft last week...
Wrong! You know very well that programming concepts learnt on one platform carry over seamlessly to another; a Symbian developer would fare better on WinMo than your average complete n00b.
This proves my point, actually... If you think about it, you are talking about a student who graduates from university unable to Google how to do a server installation... These are the same students who get confused because they 'learnt to code in VB' and there is no wizard when they jump to PHP... The problem is that the student leaves campus *unable* to transfer the thinking to another platform, and hence believes the platform they learnt is the nirvana of platforms... Think about it: we tell students to learn 'Oracle' to make money, when in actual fact we should be preparing them for a career as a DBA, regardless of the product. I had a chat a couple of weeks back with a student who wanted to do Oracle "because it has money", but really did not know "what Oracle" he wanted to do.
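To make that concrete, here is a rough Python sketch of what the transferable skill actually looks like (sqlite3 stands in for a vendor driver such as cx_Oracle or psycopg2 - those are just example names, pick any). The SQL and the connect/cursor/execute pattern (Python's DB-API) are what the student should carry between jobs; the driver import is the only vendor-specific part.

    # Minimal sketch: the DBA's transferable skill is SQL plus the
    # DB-API connect/cursor/execute pattern, not any one product.
    # Swap sqlite3 for cx_Oracle or psycopg2 and little else changes.
    import sqlite3

    conn = sqlite3.connect(":memory:")   # e.g. cx_Oracle.connect(...) for Oracle
    cur = conn.cursor()

    cur.execute("CREATE TABLE students (name TEXT, course TEXT)")
    # Placeholder style ('?' vs ':name' vs '%s') is one of the few
    # details that genuinely differs between drivers.
    cur.execute("INSERT INTO students VALUES (?, ?)", ("Ndungi", "CS"))
    conn.commit()

    cur.execute("SELECT name, course FROM students")
    print(cur.fetchall())                # [('Ndungi', 'CS')]
    conn.close()

A student taught the pattern can sit down in front of any database; a student taught "Oracle" is lost the day the shop runs something else.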
Guys (the CS-theory purists, that is), CS concepts do not exist in a vacuum. They were not conceived in a vacuum either. How then can we expect the current crop of scholars to come up with new concepts, theory and ideas addressing contemporary problems if they are not exposed (in a raw way) to current technology?
Or do we suppose that all the solutions to CS problems already exist, that they were described long before us, and that all we need is to read the books more carefully... blah blah... and so we shouldn't 're-invent the wheel'?
I would prefer to be taught the technology first, then the theory, history, etc. later, to put all these things into perspective; otherwise the science might as well be taught in a foreign language. In this regard, I have always held that CS students MUST go for their industrial attachments from as early as their first year. It even helps the young mind in self-discovery, which is more important than all the lofty concepts mentioned here.
My problem with teaching the technology is that it changes. Give the students a platform that they can build upon, e.g. a unit in embedded systems will cover most new-age devices, from routers to media players... that's a good fundamental. Teaching someone about the iPod may not be as useful...
Think about it... Most 'market technologies' have certification paths outside of university (Cisco, EMC, HP, Microsoft, Oracle, etc.) that you can get certified in without any prerequisites. Yet I have yet to come across anyone teaching the fundamentals of networking or programming in any of these training centres. That is information you can get primarily in institutes of higher learning, where the objective is to give you a firm and wide base, not to get you certified in a particular product...