Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Kosh on January 08, 2008, 03:02:18 am
-
http://developers.slashdot.org/developers/08/01/08/0348239.shtml
Any thoughts from the programmers among us?
-
I didn't read the full article (only the Slashdot summary and a quick scan of the article itself), but I didn't find anything I disagree with. They're not saying Java is bad, they're saying Java ONLY is bad. And I agree with them. Learning only Java means missing out on certain fundamental parts of programming.
Java teaches a lot of important skills (learning OO is much easier in Java, for instance, as is learning multi-threading), but that very ease means there are lots of things fundamental to programming in other languages that the student will leave university not knowing. Turning out graduates of a programming course who don't know what a pointer is isn't anything to be proud of.
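To make that concrete, here's a minimal C++ sketch (my own illustration, nothing from the article) of the pointer and memory mechanics a Java-only student never has to confront:
[code]
// A minimal sketch of the pointer mechanics Java deliberately hides.
#include <iostream>

int main() {
    int value = 42;
    int* ptr = &value;            // ptr stores the address of value, not a copy of it
    *ptr = 7;                     // writing through the pointer changes the original
    std::cout << value << '\n';   // prints 7

    int* heap = new int(5);       // a heap allocation the programmer has to manage
    std::cout << *heap << '\n';   // prints 5
    delete heap;                  // forget this and you leak; use heap afterwards
                                  // and you have a dangling pointer
    return 0;
}
[/code]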
Java needs to be taught alongside the other languages. Not as an alternative to them.
-
I just hate Java, period. But that's me :P
-
Java is the devil.
but I'm biased, as are these guys (who both happen to be on the Ada board)
-
They teach Java only at my IT department as well. It is stupid. I wish I had the ability to choose which language I'm taught.
-
Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging.
Agree. I learned C++ first, and despise Java.
-
I don't mind Java.
Anyway, at my University, CSC101 is C/C++, while CSC102 and 103 are both Java. Not sure about the upper classes though.
-
The CS departments at many American universities started using Java exclusively for their intro level courses a few years ago, ever since the College Board switched the AP exams to it. I had to learn it to satisfy a requirement, even though I don't have much use for it in applied math and would have done something else if I had a choice. (although I ended up learning other stuff later on anyway)
-
This article really rubs me the wrong way, and the more I read it the more uncomfortable I become.
As far as I know, Ada is a relatively obscure language. I've heard of it before, but mostly for its programming concepts (and lest that suggest it's in any way superior to another language, I've also heard of Java, C, C++, D, Lisp, Fortran, Pascal, Basic, Assembly, Lua, Ruby, and so on and so forth for their programming concepts.)
From the very start the article has issues. The three items for instance:
Mathematics requirements in CS programs are shrinking.
CS continues to depend less and less on raw mathematics as it becomes more possible to abstract things to the point where you can rely on previously written code instead. You don't have to actually write the graphics engine, for instance, if you want to program a 3D game. That cuts down hugely on the math you actually need to know. You don't need to figure out how to distort a texture from a mathematical formula - you just feed the coordinates and orientation to the video card and it does the work for you.
Naturally, as this becomes more and more common in the field, more and more possibilities will open up - that don't require math. Simply because the requirements are shrinking doesn't mean that the options are.
The development of programming skills in several languages is giving way to cookbook approaches using large libraries and special-purpose packages.
Though the article elaborates on this, I say good. It shows that the tools that have been developed in the past are versatile enough to be used now, and save a lot of time. You don't need to recode a windowing system; it's done. Now all you need to do is write a few lines of code and Java will do it all for you. This same kind of specialization is, AFAIK, occurring in several other fields as well.
The only downside is when somebody decides that what they learned is not for them, or their skills become obsolete because of automation. However, that does not seem to be a concern of the article, as the article repeatedly reinforces the idea that math, computer hardware, and fine-tuned "bug-free" programming ought to be the goal of software engineering programs. Coincidentally, that required curriculum would also force SE graduates to gain the skills required to work on Ada-based projects (e.g. aerospace and defense), which would give any companies using Ada a larger pool of employees to select from, and thus an easier time hiring people for less. But I digress...
The resulting set of skills is insufficient for today’s software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals.
Are people in China or India somehow inferior to people in America? Why should it be surprising, then, that they can do the same things that Americans can do, and just as well? Especially when the US is around to provide education to those people, and many companies (including companies that work on defense-related products) outsource because of the cost savings, which promotes further outsourcing and gives the people in those countries training that they would not otherwise have had easy access to.
It [Texas A&M] did [teach Java as the first language]. Then I started teaching C++ to the electrical engineers and when the EE students started to out-program the CS students, the CS department switched to C++. [5]
It will be interesting to see how many departments follow this trend. At AdaCore, we are certainly aware of many universities that have adopted Ada as a first language because of similar concerns.
So, does Ada actually improve the situation, or make it worse? They fail to say. :p
And does that have anything to do with Ada, or does it have something to do with the fact that it's a university that takes enough of an interest in its curriculum to actually care about what it's teaching rather than blindly following the crowd? Such a university would also be more likely to attract students who are genuinely interested in programming, who actually learn on their own and do more than the minimum requirements.
Funny, I'd rather put my "safety and security" in the hands of someone who is genuinely interested in what they're doing and how well they're doing it - the person who spent their extra time learning the things they wanted to know that college didn't require, rather than the person who got drunk through college and barely passed the requirements because that was what they needed to make a decent wage.
So if the people who want to make crappy Flash games or do art programming and have no aptitude for advanced math can't do jobs that they're poorly suited for, don't have an interest in, and could possibly jeopardize the safety of others...I'm not too concerned. :rolleyes:
But what really gets me is this:
In this day and age, the fear of terrorist cyber attacks has given a new urgency to the building of software that is not only bug free, but is also immune from malicious attack. Such high-security software relies even more extensively on formal methodologies, and our students need to be prepared for this new world.
So, don't teach Java so much...because if you do, the terrorists will win. (http://fs2source.warpcore.org/temp/wmc/doh.gif)
-
I've never used Java, but as far as beginner languages go, I'm going to have to vote for Visual Basic. Because of the way it's scripted (at least in VB6), the program will nail you for making a mistake before you even try to run it, and the single-stepping feature makes debugging infinitely easier. I just think it's good for getting a general feel for programming.
-
Visual C++ catches certain errors at compile-time and will let you step through programs when debugging. Java does the former, but it doesn't do the latter, although I think there are tools out there which do provide such functionality.
I've used text editors with the default Java compiler for all my programs, and I haven't found the lack of a debugger a real practical limitation. Then again, most of what I've used Java for is GUI-centered, so it's rather easy to debug.
-
Wait, Java doesn't have a debugger!? :wtf:
-
No, it does, but I personally haven't used it.
-
Visual C++ catches certain errors at compile-time
Yeah, but often it isn't as specific. Not doing something right here might cause a slew of errors somewhere else, and sometimes the compiler itself would cause problems.
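To illustrate, here's a deliberately broken C++ fragment of my own (not specific to any one compiler):
[code]
// Deliberately broken: the ';' after the class body is missing, so many
// compilers report a confusing error on the following function instead.
class Widget {
    int size;
}               // <-- the missing ';' belongs here

int main() {    // the error typically gets reported around this line
    return 0;
}
[/code]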
I do recall in my university C class that the people who never took programming before in their life had a hell of a time trying to get a handle on it, much more so than people starting with VB.
-
I think the best language for beginners is
wait for it
microsoft quick pascal - the davros edition
why?
well the great thing about it was the error messages were stored as plain text
so they could be changed
and if youve ever seen a compiler error they can be pretty cryptic
mine were much better
eg: "unexpected end of line" was changed to "youve missed out a ' you tosser"
it also included such classics as "something very bad has just happened" and "im afraid i cant do that dave"
-
I wouldn't mind seeing the missing semi-colon error in VC replaced with a simple Doh!
-
...
I completely agree with you. The whole article smacks of elitism and has very much the opposite view of my professors, who always said that increasing complexity in modern applications ABSOLUTELY NECESSITATES using and/or modifying existing libraries because of time constraints. Today's world (especially on the web) runs on frameworks and you'd be stupid to build your own lower-level libraries unless you were going for performance, which isn't a priority for the vast majority of software. I was taught Java at university and it hasn't done me any harm. The only great divides I've encountered between Java and C are in typing, memory management and object referencing and they don't require great intellects to get to grips with if you've got the willpower to make the mistakes and learn from them. I haven't had any great difficulties in making the transition.
And believe it or not, C++ is becoming less relevant in the web-connected world. The number of graduate development positions for traditional offline, platform-restricted applications is dwarfed by the number of companies crying out for those familiar with ASP.net and PHP, and they are far more 'forgiving' languages than Java. What's the point in teaching an unnecessarily complex language that comparatively few graduates will use in their careers?
As founders of a company that specializes in Ada programming tools for mission-critical systems, we find it harder to recruit qualified applicants who have the right foundational skills.
That says it all really. 90%+ of graduates will never have to write Ada code.
-
I don't see any harm in teaching other languages though. I think having an understanding of memory leaks and pointers isn't a bad thing for a CS graduate to have.
I'll agree with you both on their dumb insistence on reinventing the wheel. I can't understand why anyone would insist on writing their own code to do a task if there is a method already available on the user's PC for doing it. Not only is it a waste of time, effort and disk space, but it usually results in buggier code, since the standard libraries in any language get very heavy bug testing.
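As a trivial C++ sketch of my own (just to illustrate the point): one heavily-tested library call instead of a hand-rolled sort that somebody now has to debug and maintain.
[code]
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> scores;
    scores.push_back(42);
    scores.push_back(7);
    scores.push_back(19);
    scores.push_back(3);

    // One call to the (heavily bug-tested) standard library...
    std::sort(scores.begin(), scores.end());

    // ...instead of a hand-rolled bubble sort with its own off-by-one bugs.
    for (std::vector<int>::size_type i = 0; i < scores.size(); ++i)
        std::cout << scores[i] << ' ';
    std::cout << '\n';
    return 0;
}
[/code]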
-
Although programming computers is not really my main job, I thought I could give my point of view, which comes from the applied research side.
--- BEGIN my viewpoint ---
At the moment, the problem we are facing is that there are not enough people who know how to write fast, error-free code that doesn't strive for visual perfection.
If my eyes serve me correctly, basic calculus, geometry and optimization are pretty bizarre subjects for people who started their studies in Computer Science. This is the point where the math really becomes relevant, as it is not a good idea to use an optimizer without knowing how it actually works. Also, low-level optimization tricks seem to have been lost; there are only a handful of people who know how to do this stuff, and some of the ones I know of work exactly with Ada.
As a little personal history, I started programming at about eight years old with a Sinclair ZX Spectrum, using the built-in Basic language. At that point I had no idea the actual programming should have been done at the Assembly level on that machine. Later came QBasic with DOS 5.0 on a fresh 486, then Visual Basic. Later still, working part-time during my studies, I had to learn Matlab, and I still use it today. In the later years at university I took a look at Assembly; capability in C++ and Mathematica was required by the faculty. C++ was actually quite a pleasant surprise; Mathematica I hated. Decoding some decades-old optimization routines, I also had to read some Pascal and Fortran. Fortran, luckily, does not differ too much from Matlab.
--- END of my viewpoint ---
Java is a good beginner's language, but I'm against writing serious applications with it, even though it could be faster to code that way. The language reminds me of a certain piece of software that I need to use at work; it is coded in a strange Perl derivative and partly in C or Assembly, I suppose. The macro language that controls it is somewhat close to Java, if everything I have seen of it is correct. To tell the truth, the memory management of that software totally sucks, and there are some interesting "features" that will always crash the program. The user cannot even clear the memory or a single variable with the macro-language commands, since that is supposed to happen automatically (yeah, right - during the times when the processor load is high, which is about all the time?)!
The bad thing is that the company that wrote the software could not find an error in their code, even though many people have certainly reported these bugs. So I have a sneaking suspicion that it is related to the way the language it was programmed in automatically clears memory when something is deleted. Unfortunately, during processor-intensive moments (most of the time), the language doesn't get the chance to reclaim that memory, so it ends up reserving huge chunks of RAM and slowing the computer down to a crawl.
In conclusion, I support the article, because there is a huge number of people who get to learn Java and related languages without any deeper understanding of the libraries they apply. The competition for jobs in that area will be fierce. If one chooses to learn mathematical computing instead, the competition will be drastically lighter. So you have a better chance of getting a decent job if you choose the more mathematical route.
Mika
-
who always said that increasing complexity in modern applications ABSOLUTELY NECESSITATES using and/or modifying existing libraries because of time constraints. Today's world (especially on the web) runs on frameworks and you'd be stupid to build your own lower-level libraries unless you were going for performance, which isn't a priority for the vast majority of software.
All true, but shouldn't they still be taught how libraries work and understand the libraries they're using? What better way to understand how a library works than to write one or two yourself (obviously in academia and not the Real World)?
-
first language i learned was c, then c++. but i wasn't that good back then, so i did most of my early programs in vb (and messed around some with quake c). this was in my high school computer science class. then i just focused on graphics and didn't try programming again until fairly recently.
i've never used java, and from the horror stories i hear i don't think i want to.
-
My opinion tends to be that if you have a go at Java you're only showing your own ignorance. Java is very good at what it was designed for. If you're using it to write graphics intensive tasks and then complaining it doesn't work well then you're an idiot.
It's like complaining that screwdrivers are rubbish because you can't hammer in nails with them.
-
funny, i've set nails with a screwdriver before :D
-
Exactly. You can do it. You just have to recognise it isn't the best tool for the job and you have to know how to get around its limitations.
-
That article can be as misleading, elitist, biased, and crackpot-filled as you guys want. Unfortunately, that doesn't change the underlying message, which is what I agree with.
Pumping out CS majors who don't know what pointers are or how a computer allocates memory is not a good thing. I don't care if you're programming Half Life 3 or making a bank statement tracking program. You just went to college to get this degree; you ought to know how a computer works instead of relying on prebuilt code to do it all for you - without understanding what that code actually does.
There is nothing wrong with using prebuilt code to avoid reinventing the wheel. There is something wrong with college grads who rely on prebuilt code to do all this stuff because they don't know how to do it themselves.
I don't care what language they use. I do, however, care about what they're teaching.
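For what it's worth, a small C++ sketch of my own of what "use the prebuilt code, but know what it does" means in practice - std::vector quietly reallocates and copies its storage on the heap as it grows:
[code]
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    std::vector<int>::size_type lastCapacity = 0;

    for (int i = 0; i < 100; ++i) {
        v.push_back(i);
        if (v.capacity() != lastCapacity) {
            // Every capacity jump means a fresh heap allocation plus a copy.
            std::cout << "size " << v.size()
                      << " -> capacity " << v.capacity() << '\n';
            lastCapacity = v.capacity();
        }
    }
    return 0;
}
[/code]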
-
Pumping out CS majors who don't know what pointers are or how a computer allocates memory is not a good thing. I don't care if you're programming Half Life 3 or making a bank statement tracking program. You just went to college to get this degree; you ought to know how a computer works instead of relying on prebuilt code to do it all for you - without understanding what that code actually does.
There is nothing wrong with using prebuilt code to avoid reinventing the wheel. There is something wrong with college grads who rely on prebuilt code to do all this stuff because they don't know how to do it themselves.
this.
-
I have to agree here..
Even I know how to use pointers and allocate memory... I think... haven't done it in a loooong while.
Point is, one has to learn the basics of how a computer works at the fundamental level, and how the software - the code - works.
I started with Pascal and then moved to Basic, C, C++ and Java (and a dozen others I can now barely remember... LISP and a few more :P )
-
Real programmers use notepad. ;)
-
notepad++ is better
-
Real programmers use notepad. ;)
I write all my programs using character map and a pipe on the command line to the empty executable file.
-
I use Programmer's Notepad... that close enough?
-
Notepad is only the start. The Real Programmers calculate, talk, read and write in hexadecimal. The Really Advanced Programmers can do binary.
No, I don't belong to either group. But I have seen some older people checking the voltage levels on CPU lines with a multichannel oscilloscope and reading off the data from the voltages (0 V, 5 V). That was pretty crazy already.
Mika
-
But I have seen some older people checking the voltage levels on CPU lines with a multichannel oscilloscope and reading off the data from the voltages (0 V, 5 V). That was pretty crazy already.
Mika
Uhh, if such a thing were possible to measure on an oscilloscope, the CPU would have to be running at <10 Hz to see it in real time. And no silicon has ever run that slow.
-
I only barely understand how you'd actually do that, but I've used oscilloscopes (in a class, no less; these were old analog things) that went down to about 2 ms before. Given that, it seems like the resolution would be good enough to see a signal of several MHz, up to at least 16 MHz (since the entire screen would have a "width" of 500 Hz).
With a newer/more expensive oscilloscope you could probably push that value up quite a bit; once you reach 66-100 MHz you're at the lower end of the FSB speeds that were common 5 years ago.
-
I only barely understand how you'd actually do that, but I've used oscilloscopes (in a class, no less; these were old analog things) that went down to about 2 ms before. Given that, it seems like the resolution would be good enough to see a signal of several MHz, up to at least 16 MHz (since the entire screen would have a "width" of 500 Hz).
With a newer/more expensive oscilloscope you could probably push that value up quite a bit; once you reach 66-100 MHz you're at the lower end of the FSB speeds that were common 5 years ago.
You could do, but I'd imagine it'd take weeks or months of work to make sense of one second's worth of data.
-
I almost forgot this thread. Too much (drunken) fun with buddies from my student days.
It is relatively easy to check the information content of the processor lines; attaching a voltage meter to a line will give you an idea. Unfortunately, you cannot do this with normal handheld voltage meters because of the refresh rates and integration times. This is also the reason why I think any copy protection system will ultimately fail. It is a question of how easy this is to do, and currently it is probably too difficult for your common pirate or geek to handle.
Instead, one could use an oscilloscope, but current processors have 32 to 64 lines, each of which must be monitored if the information content is to be deciphered. I doubt you can easily find a digital oscilloscope that supports so many channels, and signal frequencies above 3 GHz (a period of about 0.3 ns - still doable) might also make things difficult for the oscilloscope. Instead, one can use a logic analyzer, which doesn't bother with accurate voltage sampling; the only thing that matters is whether the line is high or low. That way the sampling frequency can be much higher, so 3 GHz is quite easy.
But the real point is, as I understand it, that the 8086 processors are still quite nice for teaching students what is actually going on in the computer. With the 8086 you can nicely follow the flow of instructions to the processor, and the processor answering each of them in its own turn, in Assembly-controlled loops for example. Newer processors (since the 80386, IIRC) have some kind of internal reordering: some operations are completed earlier and some later, but the processor keeps track of their order so that the rest of the computer sees nothing special in the data, even though the processor computed them in a totally different order. It is relatively difficult to follow that kind of data flow.
For the programmers:
How did you actually think processor testing would happen, if you couldn't follow the data flow physically?
Mika