
Re: Effectively criticizing decisions you disagree with in Debian



2014/09/25 9:15 "lee" <lee@yun.yagibdah.de>:
>
> Joel Rees <joel.rees@gmail.com> writes:
>
> >> On Tue, Sep 23, 2014 at 7:16 AM, lee <lee@yun.yagibdah.de> wrote:
> >>
> >> I could guess that implicit linkage might refer to side effects of
> >> intentional entanglement which may be undesirable or may occur without
> >> being noticed (until a problem shows up which then might be hard to
> >> track down and to fix because there are unknown side effects due to
> >> implicit linkage caused by broken design).  Or does it refer to
> >> unintentional entanglement (and its side effects)?  Or what is it?
> >
> > As an example.
> >
> > If you call a read() function in a C program, there are a bunch of
> > global variables and constants and types, etc., that you do not
> > explicitly reference in your call -- buffer size and count, the
> > location of the buffers, etc. Yes, most of them are explicitly
> > declared somewhere, or explicitly allocated at run-time, but you don't
> > see them in the call itself. This is one kind of implicit linkage.
> >
> > I once considered a language where every such call would have to
> > explicitly declare all globals and side-effects, etc. It's not
> > possible. Still, reducing the number of globals is a good thing.
> >
> > Another kind of implicit linkage is undeclared "magic" constants. For
> > instance, during the early days of PC-DOS, the buffers in the I/O
> > system were assumed to be either 128 bytes or 256, and everyone who
> > worked with the code knew that, so no one bothered to actually declare
> > it as a named constant. You just knew that, if you saw that number, it
> > was probably the buffer size. We don't see many of those any more.
> >
> > In fact, we go a little overboard, defining named constants that never
> > get used, and the namespace clutter becomes another problem, and can
> > induce subtle bugs of its own, which is part of the reason for wanting
> > to be sure that everything you use is declared where you think it is.
> >
> > Protocols are another. In early computer systems, there was often an
> > expected order of procedure:
> >
> > When reading, allocate and clear the buffer first,
> > then check that the DMA controller is not being used and wait if necessary.
> > The DMA controller has its own set of rules: maybe you have to set
> > the address before the count, or maybe you have to set the count first
> > and the buffer after, etc. The order of these things should be
> > encompassed in subroutines, but sometimes there are optional steps,
> > and until callback routines became more common, you generally
> > (manually) copied template code to be sure you followed the protocol.
> > Even now, you may have a call-back routine to pass, and no clues as to
> > what that callback is supposed to do.
> >
> > There are other kinds of implicit linkage; basically, it's all the
> > stuff that you're expected to know when reading, modifying, or using
> > the code, but that isn't written down in front of you anywhere, or is
> > so hard to find that looking it up is going to cause schedule slip.
>
> Hmm.  So linkage is a result of complexity,

What is complexity?

Complexity is not a simple topic. :-\

> and implicitness is a result of
> undeclaredness (or unawareness of declaredness).

Sort of, but not quite.

I would rather say,

    Implicitness is the lack of explicit declaration at the point
where the linkage is expressed (or occurs).

but I'm not sure that would be universally well understood, either.

> That means by trying to combine (existing) stuff with other stuff to
> make things easier than (re-)inventing the wheel, you actually make
> things more difficult because they are becoming more complex.

Refactoring, when poorly done, can produce even greater entanglement
and implicit linkage. Of course. People do make mistakes.

> If you
> were to (re-)invent the wheel in order to /not/ make things more
> difficult, you could still make them so complex that you could as well
> use (existing) stuff because it makes things easier for you and no
> significant difference in complexity.

There is always that possibility. It's one of the reasons for the old
adage, "If it ain't broke, don't fix it."

> In any case, you must reduce complexity in order to avoid implicit
> linkage, because even declaredness in itself (or the problem of being
> aware of the declarations) can become so complex (or difficult) that it
> can make things difficult.

Generally, reducing complexity and reducing linkage are related, but
not necessarily. The degree to which linkage is implicit, or to which
entanglement is hidden, does not necessarily depend on the degree of
either complexity or linkage. These can be independent variables,
depending on the case in question. In some cases, you can even make
them independent variables when they didn't start out that way in
your analysis.

> Since you cannot make things less complex,

I'm not sure what you're trying to say.

If you know you can make things more complex, you know that there must
be things that can be made less complex.

Some engineers subscribe to the theory that you don't actually change
the total complexity, you just move it from one place to another, but
they are definitely not talking about the kind of perceived complexity
that we usually talk about when we talk about complexity. (Assuming
they understand the theory.)

> the concept of implicit linkage has been forgotten.

Ah, you mean historically. I think the concept just got left behind in
the rush to train programmers who didn't know enough math to be able
to disagree with managers who make impossible plans. But that's partly
speculation on my part.

> Is that about right?

Well, since you ask, I'll mention a bit about complexity. There are
several kinds of complexity.

One is purely subjective -- perceived complexity: "It's different, so
it's complicated." or "I don't understand it, so it's complicated." We
can talk about the parsing of a problem by the human brain, but it
wouldn't help yet. We should set perceptions of complexity aside here.

If you have a device with 100 inputs and 100 outputs, that's going to
look complicated, right? But if all the inputs just feed directly to
the outputs, it's not really all that complicated after all. This is
one kind of complexity. Analysis is straightforward.

If some of the inputs are inverted or amplified, that's a little more
complicated, but it's the same kind of complexity. Also, if some of
the inputs combine with others, so that some outputs are a function of
multiple inputs, this is a bit more complicated, but it's still the
same kind of complexity.

If some outputs feed back into their own inputs, this changes the kind
of complexity. Small circuits aren't too bad, but if you have even 10
inputs and 10 outputs, and you have some outputs that are a function
both of themselves and their own inputs, analysis gets difficult in a
hurry. If all ten are fed back and mixed with other inputs, you have a
circuit that is more complex (and more complicated) than the simple
one with a hundred inputs and outputs that don't feed back.

If you can take the device with feedback and split it into five
separate devices, where there is no interconnection between the five
devices, even if there is feedback within individual devices, the
resulting collection of devices is generally much easier to analyze
than the single device with ten inputs and ten outputs with feedback.
And much easier to design and build correctly.

Programs and program functions are similar, the inputs being input
parameters, and the outputs being the result, including side-effects.

> If it is, we can't seriously object systemd, can we?

Well, that's what the pro-systemd folks seem to be saying, isn't it?

Joel Rees

Computer storage is nothing more than fancy paper,
and the CPUs nothing more than fancy pens.
All is text, streaming from the past to the future
in patterns that some read for fate and others for fun.
Those who look for fate think they can manipulate fate
by manipulating that stream of text.
They make weapons of metal and semiconductor,
and weapons of grammar and vocabulary,
and fight for control.
Control of what?
To control oneself is to control the world, and why would anyone want a
weapon for that?

