Becoming a Professional Computer Program Developer
Here are a few responses to a question on Slashdot from real professionals in the computer programming field. The questioner asks for advice on what to teach new programmers during a workshop to help them become more professional. Linked Here.
One of the responses lists books that have been helpful to many people over the years. Another has to do with the current state of the culture of computer-related businesses. And one has to do with how getting the best results begins with getting requirements right the first time.
One piece of advice I would offer to anyone reading these: as you read, try to place yourself both as the person talking and as the person trying to get a new programmer to understand these points. The people responding were once the "younglings" going to the workshop and had to learn these lessons themselves, either at a workshop or in real-life workplaces.
Disclaimer: I did not copy and link to these responses because I am getting to be an old curmudgeon.
I have been hearing and reading about these same issues all through my career in computers, and, as the old saying goes, those who fail to learn from history are doomed to repeat it. Many of our current software and operating system woes could be fixed if our business people and developers learned (or re-learned) these lessons: how to write good code; how to fix bugs as they are found (i.e., not waiting until your customers are hacked and broken into before fixing them, and, by the same token, learning to take more time to avoid the bugs in the first place); how to get good programming requirements; how to work well in groups; and how to be a professional. Not that many people would accuse me of being good at any of these, but at least I am trying to recognize, and pass on, what I see as the big issues people are having and how to fix them.
Re:Focus on a few key things (Score:5, Informative)
by ShanghaiBill ( 739463 ) on Sunday March 12, 2017 @02:19AM (#54021379). Linked Here
Could we get a full list of those books?
This is the list I currently use. I welcome additional recommendations. What CS books have you read recently that you really wished you had read 10 years ago?
Programming:
- Clean Code
- Code Complete
- Programming Pearls
- The Pragmatic Programmer
- Regular Expressions
- Algorithms by Robert Sedgewick
- Introduction to Algorithms by Tom Cormen
- Hacker's Delight by Henry Warren
Interface design:
- Don't Make Me Think
- The Design of Everyday Things
- Microinteractions
Software Engineering:
Theory:
We worship at the altar of youth here. (Score:5, Insightful)
by w3woody ( 44457 ) on Sunday March 12, 2017 @09:47AM (#54022425) Homepage. Linked Here
The problem is that our industry, unlike every other industry except acting and modeling (and note that neither is known for "intelligence"), worships at the altar of youth. I can't count the number of people I've encountered who tell me that, being older, my experience is worthless, since all the stuff I've learned has become obsolete.
This, despite the fact that the dominant operating systems used in most systems are based on an operating system that is nearly 50 years old, [wikipedia.org] the "new" features being added to many "modern" languages are really concepts from languages that are between 50 [wikipedia.org] and 60 years old or older, [wikipedia.org] and most of the concepts we bandy about as cutting edge were developed from 20 [wikipedia.org] to 50 years ago. [wikipedia.org]
It also doesn't help that the youth whose accomplishments we worship usually get concepts wrong. I can't count the number of times I've seen someone claim code was refactored along some new-fangled "improvement" [www.objc.io] over an "outdated" design pattern, yet write objects that bear no resemblance to the pattern they claim to be following. (In the case above, the classes they used included "modules" and "models", neither of which is part of the VIPER backronym.) And when I point out that the "massive view controller" problem often represents a misunderstanding as to what constitutes a model and what constitutes a view, I'm told that I have no idea what I'm talking about, despite having more years of experience than the critic has been alive, and despite graduating from Caltech (meaning I'm probably not a complete idiot).
Our industry is rife with arrogance, and often it is the arrogance of the young and inexperienced. Our industry seems to value "cowboys" despite doing everything it can (with the management-technique "flavor of the month" [techrepublic.com]) to stop "cowboys." Our industry is ageist, [businessinsider.com] sexist, [cbsnews.com] one where the blind lead the blind, and seminal works attempting to understand the problem of development [wikipedia.org] go ignored.
How many of you have seen code which seems to have been developed using "design pattern" roulette? Don't know what you're doing? Spin the wheel!
Ours is also one of the few industries based on scientific research that blatantly ignores that research, unless it is popularized in shallow books which rarely explore anything in depth. We have a constant churn [cleancoder.com] of technologies which are often pointless, introducing new languages with extreme hype [apple.com] that is often unwarranted, as those languages seldom expand beyond a basic domain representing a subset of LISP. [paulgraham.com] I can't think of a single developer I've met professionally who belongs to the ACM [paulgraham.com] or to the IEEE, [ieee.org] and when they run into an interesting problem they tend to search GitHub or Stack Overflow, [techcrunch.com] even when it is a basic algorithm problem. (I've met programmers with years of experience who couldn't write code to maintain a linked list.)
So what do we do?
Beats the hell out of me. You cannot teach if your audience revels in its ignorance and doesn't think all that "old junk" has any value. You cannot teach if your students have no respect for experience and knowledge. You cannot teach if your audience is both unaware of its ignorance and uninterested in learning anything not hyped [wikipedia.org] in "The Churn."
Sometimes there are a rare few out there who do want to learn; for those, it is worth spending your time. It's been my experience that most software developers who don't bother to develop their skills, and who are not interested in learning from those with experience, burn out after a few years. In today's mobile development expansion there is still more demand than supply of programmers, but that will change, as it did with the dot-com bubble, [wikipedia.org] and a lot of those who have no interest in honing their skills (whether out of arrogance or ignorance) will find themselves in serious trouble.
Ultimately, I don't know that those of us who have been around for a while can do very much of anything as individuals, except offer our wisdom and experience to whoever may want to learn. It is also incumbent on us to continue to learn and hone our own skills; just in the past few years I picked up another couple of programming languages and have been playing around with a new operating system.
And personally I have little hope. Sure, there is a lot of cutting-edge stuff taking place, but as an industry we're also producing a lot of crap. We've created working environments that are hostile (and I see sexism as the canary in the coal mine of a much deeper cultural problem), and we are creating software which is increasingly hostile to its users, despite decades of research showing us alternatives. We are increasingly ignoring team structures that worked well in the 1980s and 1990s: back then we had support staff (such as QA, QAE, and tech writers) who worked alongside software developers; today, in the teams I've worked on, I'm hard pressed to find more than one or two QA engineers alongside teams of a dozen or more developers. I haven't seen a technical writer embedded in a development team (helping to document API interfaces and internal workings) for at least 20 years. And we are increasingly being seen as line workers in a factory rather than as technically creative workers.
I'm not bitter; I just believe we've fallen into a bunch of bad habits in our industry which need a good recession and some creative destruction to weed out what is limping along. And I hope that others will eventually realize what we've lost and where we're failing.
But for me, I'll just keep plugging along: a 50+ year old developer in an industry where 30 is considered "old," writing software and quietly fixing flaws created by those who can't be bothered to understand that "modules" are not part of VIPER, or that MVC can include database access methods in the "model," and who believe all code is "self-documenting."
And complaining about the problems when asked.
Meeting requirements 40 years in the future (Score:5, Interesting)
by raymorris ( 2726007 ) on Sunday March 12, 2017 @03:19AM (#54021475) Journal. Linked Here
As you said, the common way of getting software requirements doesn't work too well, and certainly doesn't work *reliably*.
I have a book from the 1970s that describes many of the programs I use every week. They still serve the requirements 40 years later. I'll come back to that set of programs, and how they predicted requirements 40 years down the road, at the end of my post.
Before getting to the 40 year old programs that are still used daily around the world, this topic reminds me of one of the best software design tips that I've been taught. In retrospect it seems obvious, but many programmers haven't thought to do it, and most don't insist on doing it.
90% of the time, you're writing software to better do something that's currently being done some other way. Perhaps you're replacing legacy software; perhaps it's currently being done "manually", with people entering data one item at a time. Perhaps you're replacing a paper-based system. Most of the time, you're replacing *some* method of doing the same task.
It logically follows, then, that to fully understand the process, its requirements and idiosyncrasies, you can watch the people actually doing it. Even better, have them show you how they do it, then try to do the job yourself while they watch and correct you or point out things to be careful of. Take notes during this. Most likely, the way they are using the old system is NOT how it was designed to be used, because the designers of the old system weren't clear on the requirements. But users find a way of meeting their requirements. Watching how they do that shows you what they actually need to get their job done.
Already, just by watching them do the task, you'll understand the requirements far better than you would by having a meeting with their boss's boss (the common, and bad, way to get requirements). After watching them do the task, next ask them two questions:
- What about the current process or tools is frustrating for you, or slows you down?
- Pretending *anything* is possible, what would your impossible wishes be for this?
The second question often elicits ideas that allow the programmer to say "I can do that, that's easy". Then you begin to glow with heavenly lights because they thought their wish couldn't possibly be granted. Truly, I've done EASY programming tasks that have garnered me a reputation for being able to do the impossible, simply by asking the users what impossible features they wish I could provide. Their conception of what's easy and what's impossible is totally unrelated to what a good programmer can actually do. (You've probably noticed users often think it should be easy for us to do something that's actually nearly impossible. The flip side of that same ignorance is that they think we can't do stuff that we can actually do pretty easily.)
I didn't come up with any of this myself; these aren't my genius ideas. These are things I've been taught along the way, and I wouldn't expect another programmer to think of them until they, too, are taught these ideas.
One more thought, or set of thoughts, about foreseeing requirements. I was also taught that you can, fairly easily, plan and program for future requirements without knowing what those future requirements will be. There are two major ways of doing that, both closely related. One is to avoid hardcoding unnecessary limitations. As an example, the configuration for my software never has the user provide a single configuration value. Instead, each configuration item is a LIST. If my software can send email notifications, it isn't configured with an email address to send to; it's configured with a list of email addresses. If it can read from a data file, it can read from a list of data files, etc. In the code, the added flexibility requires just this additional code:
foreach { }
That's it. Just using "foreach" wherever a configured value is used makes the whole system far more flexible. This is an example of not arbitrarily coding a requirement that it must work with exactly one Foo. Instead it can work with any number of Foos, at the cost of 10 characters of code, "foreach {}".
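To make that concrete, here is a minimal sketch of the idea in Python (my own illustration, not the poster's code; the configuration names and the print stand-in for real mail-sending code are hypothetical):

    # Every configuration item is a list, even when only one value is expected.
    config = {
        "notify_emails": ["ops@example.com", "oncall@example.com"],
        "data_files": ["readings.csv"],  # still a list, even with a single entry
    }

    def send_notifications(message):
        # The only extra code at the point of use is the loop itself.
        for address in config["notify_emails"]:
            print("sending to %s: %s" % (address, message))  # stand-in for real email code

    send_notifications("nightly backup finished")

Nothing else in the program has to change when a second address shows up in the list; the loop absorbs the new requirement.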
The other piece is small modules/functions that do one simple, and therefore generic, task. I have a book from the 1970s that describes many of the programs I use every week. They still serve the requirements 40 years later. These programs include "grep", "sort", and "head". Forty years ago, they didn't know what kinds of things I might need to search, what I might need to sort, or what I'd need the "top 10" of. But they did know that in the future people would need to search things, sort things, and get the top items of something. So they wrote code that searches and code that sorts - totally independent of *what* it's searching or sorting. We can write 90% of our code in a similar fashion, so even if requirements completely change, 90% of our code doesn't need to change. In fact I wrote code 15 years ago that's still part of some operating systems, because it does one straightforward generic task. It's as relevant today as it was 15 years ago. The operating system around my code is very different today, but my code was written to just do its one job, not knowing or caring about the rest of the OS. The added bonus is that small, simple modules that do one simple thing are also much less likely to have bugs.
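In the same spirit, here is a rough sketch (again Python, with hypothetical names and data) of small functions that each do one generic job and compose the way grep and head do, knowing nothing about what they process:

    def grep(pattern, lines):
        # Yield only the lines containing the pattern; works on any iterable of strings.
        return (line for line in lines if pattern in line)

    def head(n, items):
        # Return the first n items from any iterable.
        result = []
        for item in items:
            if len(result) == n:
                break
            result.append(item)
        return result

    log = ["INFO start", "ERROR disk full", "INFO done", "ERROR timeout"]
    print(head(1, grep("ERROR", log)))  # prints ['ERROR disk full']

Neither function knows whether it is processing log lines, file names, or zoo inventory; that ignorance is exactly what keeps them reusable when the requirements change.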
The object-oriented people have a special insight into this. If you ask a good object-oriented programmer to write a program to track birds in a zoo, he'll very likely think to himself, "If I do a good job and they like it, next they might ask for a program to track bats or frogs or snakes. I think I'll write code to track animals, with the special attributes of birds separated into a specific module." So he'll create an Animal class, with a subclass called Bird. He'll write most of the code to handle Animal. Suppose he's wrong. The customer doesn't later decide to add other types of animals. Rather, they want to track a bunch of further details about birds. Awesome! He already put "details about birds" into a separate class. The bulk of his code doesn't know or care about which details are stored; it handles animals generically. So he adds a few attributes to class Bird and he's done, without even opening the files that contain most of his code.
Maybe his prediction is wrong in a different way. Maybe later they want to track visitors. Well, visitors are people, people are animals. His old code for tracking birds has half of what he needs for tracking visitors. Using subclasses for the details, with most of the code working with more generic superclasses, he's created code that can easily be adjusted to handle requirements nobody ever thought about when he first wrote the system.
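A bare-bones sketch of that design might look like this (Python; the attributes and the report function are hypothetical illustrations of the idea, not the poster's code):

    # Generic code handles Animal; bird-specific details live only in Bird.
    class Animal:
        def __init__(self, name, enclosure):
            self.name = name
            self.enclosure = enclosure

        def describe(self):
            return "%s in %s" % (self.name, self.enclosure)

    class Bird(Animal):
        def __init__(self, name, enclosure, wingspan_cm):
            super().__init__(name, enclosure)
            self.wingspan_cm = wingspan_cm  # a bird-only detail, isolated here

    def daily_report(animals):
        # The bulk of the code works with Animal and never mentions birds.
        for animal in animals:
            print(animal.describe())

    daily_report([Bird("Kea", "Aviary 3", wingspan_cm=90)])

Adding more bird details touches only Bird; adding bats or visitors means one more small subclass, while daily_report and everything else written against Animal stays untouched.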
See also module systems à la WordPress, Apache, and Moodle.