To illustrate a no-gap condition, imagine a modern computer: as marvelous as these devices are, their behavior can be fully explained by their circuitry, so no causal work is left over for consciousness to do. On this view, any AI built out of software running on such a machine would likewise be fully accounted for by its physical operation, and proponents of the explanatory gap would conclude that it is not conscious.
The explanatory gap is closely related to, and perhaps better known through, the Chinese Room argument proposed by John Searle.
See also: