When a device, organism, program, or process becomes so complex or involved that it spontaneously achieves self-awareness and self-volition, even if (or especially if) you don't want it to.
Complexity plague occurs most often when embedded AIs begin to act on their own behalf (often while completely lacking any understanding of the outside world; they may know nothing but drives or wormhole structures) and spread this independence to other nearby embedded AIs. Imagine the consequences when several thousand supposedly sub-sophont system-control AIs tasked with converting a Jupiter-sized mass into magmatter suddenly begin to do something else! This risk may actually be a reason to have self-aware "overseer" AIs monitor such processes. Of course, what to do with spontaneously independent AIs is another matter. In some cases they are immediately erased (which causes outrage among many sentient-rights groups); in other cases they are granted full rights. Some groups take a middle ground: rather than erasing them, they simply freeze their execution and store them in huge "independence blocks" of formally independent but inactive software.