Coupling is a software metric that describes how closely connected two routines or modules are, and it is one of the classic measures of software quality. The concept was introduced by Larry Constantine in the 1960s and was formalized in "Structured Design", a 1974 article for the IBM Systems Journal, and in the 1979 book of the same name.
Given two modules A and B, the more knowledge about B is required in order to understand A, the more closely connected A is to B. The fact that one module needs to be inspected in order to understand the operation of another indicates a degree of interconnection, even if that degree is not known.
Coupling is a measure of the strength of that interconnection. It is affected by the type of connections between modules, interface complexity, the information flow between module connections, and the binding time of module connections. Coupling is usually contrasted with cohesion: low coupling generally goes hand in hand with high cohesion, and vice versa.
Coupling can be low / loose / weak or high / tight / strong.
Tight coupling translates into ripple effects when making changes, as well as code that is difficult to understand. It tends to propagate errors across modules when one module behaves incorrectly, and it complicates debugging and fixing defects.
In loosely coupled systems, on the other hand, individual modules can be studied and altered without the need of taking into account a lot of information from other modules. Errors can be pointed out much more easily. Debugging takes less time, while fixing defects is usually simpler. The chances of error propagation across modules tend to be reduced.
The levels of coupling below are ordered from highest (tightest) to lowest (loosest):
Content coupling – one module relies on the internal workings of another, e.g., by accessing its local data directly.
Common coupling – several modules share the same global data.
External coupling – modules share an externally imposed data format, communication protocol, or device interface.
Control coupling – one module controls the flow of another by passing it information on what to do, e.g., a control flag.
Stamp coupling – modules share a composite data structure but use only parts of it.
Data coupling – modules share data only through simple parameters.
Class level coupling results from implementation dependencies in a system. In general, the more assumptions are made by one class about another, the tighter the coupling.
The strength of coupling is determined by the stability of a class, i.e., the number of changes that need to be made in dependent classes when the class changes, and by the scope of access, i.e., the scope in which the class is accessed, with a wider scope introducing tighter coupling. At class level, the degree of coupling is measured as the ratio of the number of messages received to the number of messages passed, i.e.,
DC = MRC / MPC
where MRC is the received message coupling (the number of messages received by a class from other classes), and MPC is the passed message coupling (the number of messages sent by a class to other classes).
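As a minimal sketch (not from the original text; the message counts are hypothetical), the class-level metric can be computed directly from tallied message counts:

```python
# Minimal sketch of the class-level degree of coupling DC = MRC / MPC,
# computed from tallied message counts (hypothetical numbers).

def degree_of_coupling(messages_received: int, messages_passed: int) -> float:
    """Return DC = MRC / MPC for a class."""
    if messages_passed == 0:
        raise ValueError("MPC is zero; DC is undefined for a class that sends no messages")
    return messages_received / messages_passed

# Example: a class that receives 6 messages and sends 12.
print(degree_of_coupling(6, 12))  # 0.5
```

How the counts are tallied (static call-site counting, dynamic tracing, etc.) is left open here; the sketch only captures the ratio itself.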
Class level is a particular case of the Module level metric.
The module level metric is more general: it tracks connections to other modules, to global data, and to the outside environment. The formula computes a module coupling indicator
mc = k / M
where k is a proportionality constant and M is a value calculated by the following formula:
M = di + (a * ci) + do + (b * co) + gd + (c * gc) + w + r
In the formula above:
di and ci – the number of input data and input control parameters – and do and co – the number of output data and output control parameters – are data and control flow parameters;
gd and gc – the number of global variables used as data and as control – are global coupling parameters;
w – the number of modules called (fan-out) – and r – the number of modules calling the module under consideration (fan-in) – are environmental coupling parameters;
a, b and c are weights defined empirically.
One important note to be made is that as the value of mc increases, the overall coupling decreases. In order to have the metric move upward as the degree of coupling increases, a revised coupling metric, C, may be defined as
C = 1 - mc
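The module-level formulas can be sketched as follows (parameter names follow the text; the constants k, a, b and c are assumed to be 1 here, since the text only states that they are determined empirically):

```python
# Sketch of the module coupling indicator mc = k / M and the revised
# metric C = 1 - mc. The weights k, a, b, c are assumed to be 1 here;
# the text only states that they are determined empirically.

def module_coupling(di, ci, do, co, gd, gc, w, r, k=1, a=1, b=1, c=1):
    """di/ci: input data/control parameters; do/co: output data/control
    parameters; gd/gc: globals used as data/control; w: fan-out;
    r: fan-in. Returns (mc, C)."""
    M = di + (a * ci) + do + (b * co) + gd + (c * gc) + w + r
    mc = k / M
    return mc, 1 - mc

# A hypothetical module with 2 data inputs, 1 control input,
# 1 data output, calling one module and called by one module:
mc, C = module_coupling(di=2, ci=1, do=1, co=0, gd=0, gc=0, w=1, r=1)
print(round(C, 3))  # 0.833
```

Note how every extra connection (another parameter, global, or caller) grows M, shrinks mc, and pushes C closer to 1.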
Introducing coupling increases the instability of a system. Decoupling is the systematic reduction of coupling between modules with the explicit intent of making them more independent, i.e., minimizing the value of C as defined in the previous section.
Content coupling can be eliminated by enforcing encapsulation, so that no module relies on the internal workings of another.
Common coupling can be resolved by introducing abstractions. Design patterns can prove useful in achieving a good architecture.
External coupling can be resolved by keeping knowledge of external formats out of the domain and operating on domain concepts instead.
Control coupling can be eliminated by replacing control flags with strategies or states.
Stamp coupling can be eliminated by passing only the data a module actually needs instead of a whole composite structure.
Data coupling can be eliminated by employing message passing.
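To illustrate one of these techniques, here is a hypothetical sketch (names invented for illustration) of removing control coupling by replacing a control flag with strategy objects:

```python
# Hypothetical sketch: removing control coupling by replacing a
# boolean control flag with strategy objects (names invented).

# Control-coupled version: the caller passes a flag that steers the callee.
def format_report(lines, as_csv):
    if as_csv:
        return ",".join(lines)
    return "\n".join(lines)

# Decoupled version: the caller passes behavior instead of a flag.
def format_report_with(lines, strategy):
    return strategy(lines)

csv_strategy = ",".join        # each strategy encapsulates one behavior
plain_strategy = "\n".join

print(format_report_with(["a", "b"], csv_strategy))  # a,b
```

The decoupled version no longer needs to know how many formats exist; adding a format means adding a strategy, not editing the callee.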
One very important guiding principle for reducing coupling is the Law of Demeter, presented below.
Also referred to as the principle of least knowledge, the Law of Demeter is a specific case of loose coupling. The principle states that a unit should only have knowledge of and talk to closely related units, assuming as little as possible about the structures and properties of anything it interacts with, including its own subcomponents. For example, an object A could call functionality on object B, but should not reach through B to access an object C for its functionality. Instead, object B should facilitate access through its own interface, propagating the request to its subcomponents. Alternatively, A could have a direct reference to C.
A more formal definition states that a method M of an object O may only invoke the methods of the following kinds of objects:
O itself;
objects passed to M as parameters;
objects created or instantiated within M;
objects that are direct components of O;
global variables accessible by O in the scope of M.
In particular, an object should not call a method on a returned object, i.e., there should be at most one dot in a call chain, e.g., a.Method(), and not a.B.Method().
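The rule can be sketched with a hypothetical Customer/Wallet example (not from the text; names invented for illustration):

```python
# Hypothetical Customer/Wallet sketch of the Law of Demeter.

class Wallet:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return amount

class Customer:
    def __init__(self, balance):
        self._wallet = Wallet(balance)   # direct subcomponent

    # Demeter-friendly: Customer facilitates access through its own
    # interface instead of exposing its wallet to callers.
    def pay(self, amount):
        return self._wallet.withdraw(amount)

    def balance(self):
        return self._wallet.balance

customer = Customer(100)
# Violation (reaching through, two dots): customer._wallet.withdraw(30)
customer.pay(30)                         # at most one dot
print(customer.balance())  # 70
```

Callers now depend only on Customer's interface; the Wallet can be restructured without rippling through the rest of the code.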
Coupling is unavoidable; otherwise each module would be its own program. However, achieving low coupling should be one of the primary objectives in system design, so that individual modules can be studied and altered without taking into account a lot of information from other modules, errors can be isolated more easily, debugging takes less time, and fixing defects is usually simpler.
Loose coupling and high cohesion complement each other, and together they lead to maintainable systems.