There are two opposing conventions for the representation of data.
The first of these was first published by G. E. Thomas in 1949 and is followed by numerous authors (e.g., Tanenbaum). It specifies that for a 0 bit the signal levels will be Low-High (assuming an amplitude physical encoding of the data), with a low level in the first half of the bit period and a high level in the second half. For a 1 bit the signal levels will be High-Low.
The second convention is also followed by numerous authors (e.g., Stallings) as well as by the IEEE 802.4 standard. It states that a logic 0 is represented by a High-Low signal sequence and a logic 1 is represented by a Low-High signal sequence.
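For concreteness, the following is a minimal Python sketch of the two conventions. The function name manchester_encode and the representation of signal levels as 0/1 half-bit samples are illustrative assumptions, not part of either convention or standard.

```python
def manchester_encode(bits, convention="thomas"):
    """Encode data bits as half-bit signal levels (0 = low, 1 = high).

    convention="thomas": 0 -> Low-High, 1 -> High-Low (Thomas / Tanenbaum)
    convention="ieee":   0 -> High-Low, 1 -> Low-High (Stallings / IEEE 802.4)
    """
    halves = []
    for bit in bits:
        if convention == "thomas":
            pair = (0, 1) if bit == 0 else (1, 0)
        else:  # "ieee" convention
            pair = (1, 0) if bit == 0 else (0, 1)
        halves.extend(pair)
    return halves

# The bit pattern 1, 0, 1, 1 under each convention:
print(manchester_encode([1, 0, 1, 1], "thomas"))  # [1, 0, 0, 1, 1, 0, 1, 0]
print(manchester_encode([1, 0, 1, 1], "ieee"))    # [0, 1, 1, 0, 0, 1, 0, 1]
```

Note that the two conventions produce exactly inverted signals for the same data, which is why it matters that transmitter and receiver agree on one of them.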
A consequence of having a transition in every bit period is that the bandwidth required for Manchester-encoded signals is roughly double that of simple non-return-to-zero (NRZ) signalling, and the signal spectrum is considerably wider. Although Manchester encoding is a highly reliable form of communication, this bandwidth requirement is seen as a disadvantage, and more bandwidth-efficient line codes are generally preferred for modern high-speed communication.
One consideration with Manchester encoding is synchronising the receiver with the transmitter. At first sight it might seem that an alignment error of half a bit period would simply produce an inverted output at the receiver, but further consideration shows that on typical data such a misalignment produces code violations (bit periods with no mid-bit transition). The receiving hardware can detect these code violations and use them to lock onto the correct alignment and interpretation of the data.
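The sketch below illustrates this idea under simplifying assumptions: the receiver already has an idealised stream of thresholded half-bit samples, and the hypothetical function decode_with_alignment simply tries both possible half-bit alignments and keeps the one with the fewest code violations.

```python
def decode_with_alignment(halves, convention="thomas"):
    """Decode half-bit samples, choosing the alignment with fewest code violations."""
    best = None
    for offset in (0, 1):                      # the two candidate half-bit alignments
        bits, violations = [], 0
        usable = halves[offset:]
        for i in range(0, len(usable) - 1, 2):
            first, second = usable[i], usable[i + 1]
            if first == second:                # no mid-bit transition: code violation
                violations += 1
                bits.append(None)
                continue
            if convention == "thomas":
                bits.append(0 if (first, second) == (0, 1) else 1)
            else:
                bits.append(0 if (first, second) == (1, 0) else 1)
        if best is None or violations < best[0]:
            best = (violations, offset, bits)
    return best                                # (violations, chosen offset, decoded bits)

# A stream encoded with the Thomas convention (bits 1, 0, 1, 1):
stream = [1, 0, 0, 1, 1, 0, 1, 0]
print(decode_with_alignment(stream))           # (0, 0, [1, 0, 1, 1]) - correct alignment
print(decode_with_alignment([0] + stream))     # (0, 1, [1, 0, 1, 1]) - extra half bit absorbed
```

As the second call shows, shifting the stream by half a bit period does not invert the output; the misaligned interpretation produces violations, so the decoder falls back to the other alignment. The exception is pathological data such as an unbroken run of identical bits, where no boundary transitions exist to betray the misalignment.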
A related technique is differential Manchester encoding.
In summary: