
Why Learning to Code is So Damn Hard


Blue Lion:
The joy is I just got a job as a developer and have no idea what I'm doing. Yay. My life is this graph.

Mito [PL]:
Sooooo, I'm not even close to being a "coder" or "developer", though I have my own experience with programming 8-bit AVR microcontrollers (quite successful, but nothing awesome) in Bascom/C++ (even with a tiny bit of Assembly), and Mongoose's post gave me something to think about.

To get a supercomputer to do something, you don't need a brain made of one - you just need some basic understanding of what coding is and several ready-to-use procedures with their documentation. That's basically what modern coding is: clamping building blocks together so that they work as needed... right?
Think of it: a procedure in a high-level language is composed of many others written in a lower-level language (if anybody knows about them: Arduino vs AVR C++), and a low-level procedure is in turn made of others in an even lower-level language (like AVR C++ vs Assembly). Then there is machine code. *Someone* made the machine code, *someone* made C, Java, Python, not to mention the stuff needed for convenient 3D rendering. Someone designed all of those so they don't collide (too much) with each other. Someone wrote the compilers.
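To make that concrete, here's roughly what one friendly Arduino call hides on an ATmega328 (the Uno's chip) - a minimal, self-contained sketch; the register and bit names come from the ATmega328 datasheet, and pin 13 of an Uno is bit 5 of port B:

--- Code: ---
// Blink an LED at two levels of abstraction: the Arduino calls in the
// comments are (approximately) what the raw AVR C++ below them does.
// Build: avr-g++ -mmcu=atmega328p -DF_CPU=16000000UL -Os blink.cpp

#include <avr/io.h>       // register definitions (DDRB, PORTB, ...)
#include <util/delay.h>   // busy-wait delays, needs F_CPU

int main() {
    // Arduino layer:          pinMode(13, OUTPUT);
    DDRB |= (1 << DDB5);       // configure PB5 (Uno pin 13) as output

    for (;;) {
        // Arduino layer:      digitalWrite(13, HIGH); delay(500);
        PORTB |= (1 << PORTB5);    // set the bit -> LED on
        _delay_ms(500);
        // Arduino layer:      digitalWrite(13, LOW); delay(500);
        PORTB &= ~(1 << PORTB5);   // clear the bit -> LED off
        _delay_ms(500);
    }
}
--- End code ---

And digitalWrite itself sits on more layers still: the compiler, avr-libc, the instruction set, the silicon.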
Let's look deeper. Somebody designed the architecture, the entire system inside a computer, and coordinated all of the devices present into a single, coherent, functional system.
You do not need to know everything about computers to code - or to be a good coder. Just feel this: even with the simplest code possible, you are wielding decades of brilliant work by tens of thousands of people. The little #include directive, the compiler, everything you use to get your program going is the result of the incredible brilliance of very many people.
That's why you don't need to know everything: history does that for you.

<Maybe if I knew English a bit better, this thing up there wouldn't be so silly>


By the way, it's amazing how many cool things one can do with a single 8 MHz ALU. For some time now, I've been thinking of chips like the Z80 with enormous respect, instead of being sure they're relics and pieces of junk.
:P

CP5670:
I do a lot of software development but am not really an engineer either, more of a researcher/data scientist. I think increasingly more types of work involve some amount of programming, and it's becoming almost as important a skill as writing. In the figure above, the III and IV stuff is much more important than details of specific languages or libraries, which you can look up quickly online.


--- Quote ---I am working on an implementation of AlexNet, a neural network that recognizes images (I need to port it to an embedded architecture). My boss just asked me how much it would cost, even though I hadn't started the implementation yet. I just computed with pen and paper how many multiply-accumulates (basically doing S = S + a * b) the computation would require (there are a lot of them in convolutions, and there are a lot of convolutions in neural networks), and that gave them an idea of how much time it would take to process an image.

Usually there are two bottlenecks: either you can't compute fast enough, or the data doesn't arrive at your processor fast enough. You need to consider both aspects to know the capabilities of your machine. If you use MPI (distributed computing and the like), it's more complex to assess, though.

Then there are also all the parallelism issues - typical example: you take a program, you make it parallel, and oddly you realize that the parallel version actually runs slower than the sequential version. So you dig in, and you realize that you have spawned too many threads, and that the program spends half its time switching between them.
--- End quote ---
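(That pen-and-paper estimate is easy to turn into a few lines of code, by the way. A back-of-the-envelope sketch - the layer shape below is AlexNet's first convolution as described in the paper, and the throughput figure is a made-up placeholder to replace with your own hardware's:)

--- Code: ---
// Pen-and-paper MAC count, as code. For one convolutional layer:
// MACs = out_h * out_w * out_channels * (kernel_h * kernel_w * in_channels)
#include <cstdio>

int main() {
    // AlexNet conv1: 96 kernels of 11x11x3 at stride 4 -> 55x55 output
    long long out_h = 55, out_w = 55, out_c = 96;
    long long k_h = 11, k_w = 11, in_c = 3;

    long long macs = out_h * out_w * out_c * k_h * k_w * in_c;
    printf("conv1: %lld MACs per image\n", macs);   // ~105 million

    // If the layer is compute-bound, time ~= MACs / throughput.
    // 1e9 MACs/s is an assumed placeholder - measure your target instead.
    double macs_per_sec = 1e9;
    printf("~%.0f ms per image for this layer alone\n",
           1e3 * macs / macs_per_sec);
}
--- End code ---

Sum that over every layer and you get the kind of number you can put in front of a boss.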

Training convolutional neural networks is actually one of the rare problems that can be parallelized easily, and it's the main thing GPUs are used for in machine learning. There are several frameworks like Theano and Caffe that let you set up the model at a high level without getting into the details, although they might not be suitable if the model needs to run on special hardware.
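The thread-switching trap in the quote is easy to reproduce on any machine, incidentally. A toy sketch - the workload size and thread counts are arbitrary assumptions, and exact timings will vary with your hardware:

--- Code: ---
// Oversubscription demo: the same fixed amount of work, split across
// ever more threads. Past the core count, extra threads mostly add
// creation and context-switch overhead.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

double run_seconds(int n_threads, long long total_work) {
    auto t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int t = 0; t < n_threads; ++t)
        pool.emplace_back([=] {
            volatile double s = 0;  // volatile: keep the loop from being optimized away
            for (long long i = 0; i < total_work / n_threads; ++i)
                s = s + i * 0.5;    // a multiply-accumulate, fittingly
        });
    for (auto& th : pool) th.join();
    return std::chrono::duration<double>(
        std::chrono::steady_clock::now() - t0).count();
}

int main() {
    const long long work = 200000000;  // arbitrary fixed workload
    unsigned hc = std::thread::hardware_concurrency();
    const int cores = hc ? (int)hc : 4;  // fall back if the core count is unknown
    for (int n : {1, cores, cores * 32, cores * 512})
        printf("%6d threads: %.2f s\n", n, run_seconds(n, work));
}
--- End code ---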

potterman28wxcv:

--- Quote from: CP5670 on November 01, 2016, 06:14:10 pm ---
--- Quote ---I am working on an implementation of AlexNet, a neural network that recognizes images (I need to port it to an embedded architecture). My boss just asked me how much it would cost, even though I hadn't started the implementation yet. I just computed with pen and paper how many multiply-accumulates (basically doing S = S + a * b) the computation would require (there are a lot of them in convolutions, and there are a lot of convolutions in neural networks), and that gave them an idea of how much time it would take to process an image.

Usually there are two bottlenecks: either you can't compute fast enough, or the data doesn't arrive at your processor fast enough. You need to consider both aspects to know the capabilities of your machine. If you use MPI (distributed computing and the like), it's more complex to assess, though.

Then there are also all the parallelism issues - typical example: you take a program, you make it parallel, and oddly you realize that the parallel version actually runs slower than the sequential version. So you dig in, and you realize that you have spawned too many threads, and that the program spends half its time switching between them.
--- End quote ---

Training convolutional neural networks is actually one of the rare problems that can be parallelized easily, and it's the main thing GPUs are used for in machine learning. There are several frameworks like Theano and Caffe that let you set up the model at a high level without getting into the details, although they might not be suitable if the model needs to run on special hardware.

--- End quote ---
That's precisely the latter in my case. I'm working on porting AlexNet to an embedded system - which requires extending the support in Theano/Caffe/TensorFlow/whatever framework we choose, which in turn requires understanding what happens underneath the abstraction.

FreeSpaceFreak:

--- Quote from: Blue Lion on November 01, 2016, 12:36:57 pm ---The joy is I just got a job as a developer and have no idea what I'm doing. Yay. My life is this graph.

--- End quote ---

That's exactly how I started three years ago! Straight out of college, I got a job as a C++ developer - not even knowing the difference between a pointer and a reference... Three years in, I've been teaching others to code, having fun with performance-critical pieces of software (SSE instructions are pretty cool, once you get your head around them), found a (much) better-paying job, and I still quite enjoy solving new challenges every day :)
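(For the curious: "SSE instructions" let one instruction operate on several values at once. A minimal, self-contained taste - x86 only, and the numbers are arbitrary:)

--- Code: ---
// Four float additions in a single SSE instruction.
// Build: g++ -msse2 sse_demo.cpp (x86-64 compilers enable SSE2 by default)
#include <cstdio>
#include <xmmintrin.h>   // SSE intrinsics

int main() {
    alignas(16) float a[4] = {1, 2, 3, 4};
    alignas(16) float b[4] = {10, 20, 30, 40};
    alignas(16) float c[4];

    __m128 va = _mm_load_ps(a);      // load 4 floats into one 128-bit register
    __m128 vb = _mm_load_ps(b);      // load 4 more
    __m128 vc = _mm_add_ps(va, vb);  // one instruction, four additions
    _mm_store_ps(c, vc);             // write the 4 results back

    printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);  // 11 22 33 44
}
--- End code ---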

Just keep your eyes and ears open, read (and understand) StackOverflow, and learn from your more experienced colleagues - you'll get there. (If I can do it, you definitely can ;) ) Courage!
