
Re: sysadmin qualifications (Re: apt-get vs. aptitude)



Jerry Stuckle wrote:
On 10/15/2013 2:26 PM, Miles Fidelman wrote:



Geeze Jerry, you're just so wrong, on so many things.

What's a "coder"? In over 40 years of programming, I've met many programmers, but no "coders". Some were better than others - but none had a "limited and low-level skill set". Otherwise they wouldn't have been employed as programmers.

If you've never heard the term, you sure have a narrow set of experiences over those 40 years. And I've seen a LOT of people with very rudimentary skills hired as programmers. Not good ones, mind you. Never quite sure what they've been hired to do (maybe .NET coding for business applications?). All the serious professionals I've come across have titles like "software engineer" and "computer engineer."


And "Systems Programming" has never meant someone who writes operating systems; I've known a lot of systems programmers whose job was to ensure the system ran. In some cases it meant compiling the OS with the required options; other times it meant configuring the OS to meet their needs. But they didn't write OSes.

It's ALWAYS meant that, back to the early days of the field.

Can't find any "official definition" - but the Wikipedia definition is reasonably accurate: "*System programming* (or *systems programming*) is the activity of computer programming system software. The primary distinguishing characteristic of systems programming when compared to application programming is that application programming aims to produce software which provides services to the user (e.g. word processor), whereas systems programming aims to produce software which provides services to the computer hardware (e.g. disk defragmenter). It requires a greater degree of hardware awareness."

That's been the usage since the days I took courses in it at MIT (early 1970s), and how the term is used in all the textbooks by folks like Jerry Saltzer, John Donovan, Corbato - names you should recognize if you've been in the field for 40 years.


The programmers where I'm currently working - application systems for
buses (vehicle location, engine monitoring and diagnostics, scheduling,
passenger information) -- yeah, they have to worry about things like how
often vehicles send updates over-the-air, the vagaries of data
transmission over cell networks (what failure modes to account for),
etc., etc., etc.


That's not "real time". "Real time" is when you get an input and you have to make an immediate decision and output it. Motor controllers are one example; constantly adjusting motor speed to keep a conveyor belt running at optimum capacity as the load changes. Another is steering a radiotelescope to aim at a specific point in the sky and keep it there. Oh, and once the radiotelescope is properly aimed, process the information coming from it and 30-odd others spaced over a couple of hundred square miles, accounting for the propagation delay from each one, and combining the outputs into one digital signal which can be further processed or stored.
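The delay-and-sum combining described there can be sketched in a few lines - a toy version only, with whole-sample delays (a real correlator handles fractional-sample delays and fringe rotation; none of this code is from the systems being discussed):

```python
def combine_streams(streams, delays_samples):
    """Coherently sum sample streams from several antennas.

    Each stream is shifted by its propagation delay (in whole
    samples, for simplicity) before summing, so that samples
    originating from the same wavefront line up.
    """
    # Only as many output samples as every delayed stream can supply.
    n = min(len(s) - d for s, d in zip(streams, delays_samples))
    return [sum(s[d + i] for s, d in zip(streams, delays_samples))
            for i in range(n)]
```

With two copies of the same signal, one arriving two samples late, the aligned sum simply doubles the signal.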

There's a spectrum of real-time. Everything from radar jamming (picoseconds count), to things that happen on the order of seconds (reporting and predicting vehicle locations). Basically anything where timing and i/o count.
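That spectrum can be made concrete with a deadline monitor - a soft-real-time sketch (names and numbers are illustrative, not from this thread): run a task at a fixed period and count how often it overruns its deadline. A general-purpose OS can detect misses like this, but unlike a hard-real-time system it can't prevent them:

```python
import time

def run_periodic(task, period_s, deadline_s, iterations):
    """Run `task` at a fixed period and count deadline misses.

    Soft real-time: misses are detected and counted, not
    prevented -- no guarantee is possible on a desktop OS.
    """
    misses = 0
    next_start = time.monotonic()
    for _ in range(iterations):
        start = time.monotonic()
        task()
        if time.monotonic() - start > deadline_s:
            misses += 1
        # Schedule relative to the ideal timeline, not to "now",
        # so timing error does not accumulate across iterations.
        next_start += period_s
        sleep_for = next_start - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)
    return misses
```

For a vehicle-location feed a deadline of a second or so is plenty; for jamming or fire control, this whole approach is inadequate by construction.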


And worrying about the vagaries of data transmission over cellular networks requires no network knowledge below the application level. In fact, I doubt your programmers even know HOW the cellular network operates at OSI layers 6 and below.

Ummm... yes. Our guys write protocols for stuffing vehicle state data into UDP packets, drivers for talking across funny buses (e.g. J1708 for talking to things like message signs, engine control units, fareboxes). Occasionally we have to deal with controlling on-board video systems and distributing video streams. Buses also typically have multiple systems that share a router that talks both cellular (on-the-road) and WiFi (at the depot) - lots of resource management going on. And don't get me started on running data over trunked radio networks designed for voice.
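A minimal sketch of the "vehicle state into UDP packets" idea - the wire format below is invented for illustration (the actual protocol isn't described in the thread):

```python
import socket
import struct

# Hypothetical wire format, network byte order:
#   uint32 vehicle_id, uint32 unix_time,
#   int32  lat_e7, lon_e7  (degrees * 1e7),
#   uint16 speed_cmps      (centimeters per second)
STATE_FMT = "!IIiiH"

def pack_state(vehicle_id, unix_time, lat, lon, speed_mps):
    """Pack one vehicle-state report into a fixed-size datagram."""
    return struct.pack(STATE_FMT, vehicle_id, unix_time,
                       int(lat * 1e7), int(lon * 1e7),
                       int(speed_mps * 100))

def send_state(sock, addr, payload):
    # UDP is fire-and-forget: on a cellular link the application
    # layer must tolerate loss, duplication, and reordering itself.
    sock.sendto(payload, addr)
```

Fixed-point lat/lon and a fixed-size record keep every report to 18 bytes - the kind of penny-pinching that matters when thousands of vehicles report over metered cellular links.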


When I worked on military systems - trainers, weapons control, command &
control, intelligence, ..... - you couldn't turn your head without
having to deal with "real world" issues - both of the hardware and
networks one was running on, and the external world you had to interact
with.


I'm sorry your military systems were so unstable. But programmers don't worry about the hardware.

Hah.. Tell that to someone who's designing a radar jammer, or a fire control system. Now we're talking "hard real-time" - no timing jitter allowed as control loops execute. Yeah, now we're talking about custom microcode to execute funny algorithms, and interrupt-driven programming.

For that matter, anybody who has to deal with displays - e.g. for simulators or games - has to worry about hardware.

Then again, we never hired "programmers" - we hired software engineers, who were expected to have some serious electrical engineering and computer hardware background.


If you think anybody can code a halfway decent distributed application,
without worrying about latency, transmission errors and recovery,
network topology, and other aspects of the underlying "stuff" - I'd sure
like some of what you've been smoking.


Been doing it for 30+ years - starting when I was working for IBM back in '82.

For example? Distributed systems were still a black art in the early 1980s. Those were the days of SNA, DECNET, and X.25 public networks - most of what went across those was remote job entry and remote terminal access. Distributed applications were mostly research projects. Network links were still 56kbps if you were lucky (and noisy as hell), and an IBM 3081 ran at what, 40 MIPS or so.

What could you have been working on in the 1980s that WASN'T incredibly sensitive to memory use, disk usage, cpu use, and network bandwidth?


Oh, and by the way, an awful lot of big data applications are run on
standard x86 hardware - in clusters and distributed across networks.
Things like network topology, file system organization (particularly
vis-a-vis how data is organized on disk) REALLY impacts performance.


That's what I mean about what people think is "big data". Please show me ANY x86 hardware which can process petabytes of data before the end of the universe. THAT is big data.

Too many people who have never seen anything outside of the PC world think "big data" is a few gigabytes or even a terabyte of information. In the big data world, people laugh at such small amounts of data.

There are an awful lot of Beowulf clusters out there. Stick a few hundred (or a few thousand) multi-core processors in a rack and you've got a supercomputer - and that's what most of the big data processing I know of is being done on. Heck, Google's entire infrastructure is built out of commodity x86 hardware. (There's a reason we don't see a lot of surviving supercomputer manufacturers.)

Disk i/o is the problem - again, a problem that requires fine-tuning everything from algorithms to file system design to how one stores and accesses data on the physical drives.
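The access-pattern half of that point can be shown crudely: the same bytes, visited in two different orders. This is a toy (real tuning happens at the file-system and drive level), but sequential streaming lets the kernel prefetch, while scattered seeks defeat readahead entirely on spinning disks:

```python
import os

CHUNK = 1 << 20  # 1 MiB: large sequential reads let the kernel prefetch

def checksum_sequential(path):
    """Stream the file front to back in large chunks (disk-friendly)."""
    total = 0
    with open(path, "rb") as f:
        while block := f.read(CHUNK):
            total = (total + sum(block)) % (1 << 32)
    return total

def checksum_scattered(path, stride=4096):
    """Same bytes, visited block-by-block in reverse order --
    a seek-heavy pattern that kills throughput on spinning disks."""
    total = 0
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        for off in range(((size - 1) // stride) * stride, -1, -stride):
            f.seek(off)
            total = (total + sum(f.read(stride))) % (1 << 32)
    return total
```

Both produce the same 32-bit checksum; on a large file on rotating media, the wall-clock difference between them is the whole argument.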

I might also mention all the folks who have been developing algorithms
that take advantage of the unique characteristics of graphics processors
(or is algorithm design outside your definition of "programming" as well?).


What about them?

Do you consider figuring out algorithms to be part of "programming?" If so, how do you exclude a knowledge of machine characteristics from the practice of "programming?"

I'm trying to figure out what kinds of things you see "programmers" working on that don't need serious knowledge of the underlying operating system, computer hardware, and i/o environment.


--
In theory, there is no difference between theory and practice.
In practice, there is.   .... Yogi Berra

