A megabyte is a unit of measurement for computer storage, memory and information; while its exact definition varies, it is approximately equal to one million bytes. The abbreviation for megabyte is MB.
Three definitions for 1 MB are in common use:
1. 1,000,000 bytes (10⁶): the decimal definition, consistent with the SI meaning of the prefix "mega-"; used, for example, by hard drive manufacturers and in networking.
2. 1,048,576 bytes (2²⁰): the binary definition, traditionally used for computer memory.
3. 1,024,000 bytes (1,000 × 1,024): a mixed definition, used for example to describe the capacity of the "1.44 MB" floppy disk.
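The gap between these definitions is easy to quantify. The following minimal sketch (in Python, chosen here purely for illustration; the constant names are hypothetical) computes the three values and their ratio to the decimal megabyte.

    # Hypothetical constant names; values follow the three definitions above.
    MB_DECIMAL = 10**6        # definition (1): 1,000,000 bytes
    MB_BINARY  = 2**20        # definition (2): 1,048,576 bytes
    MB_MIXED   = 1000 * 1024  # definition (3): 1,024,000 bytes

    for label, value in [("decimal", MB_DECIMAL), ("binary", MB_BINARY), ("mixed", MB_MIXED)]:
        print(f"{label:>7}: {value:>9,} bytes  ({value / MB_DECIMAL:.1%} of 10^6)")

Run as written, this prints 1,000,000, 1,048,576 and 1,024,000 bytes, showing that the binary megabyte is about 4.9% larger and the mixed megabyte about 2.4% larger than the decimal one.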
To reduce this confusion and distinguish between meanings (1) and (2) above, the International Electrotechnical Commission (IEC) adopted an international standard in December 1998 which reserves the term megabyte for 10⁶ bytes and introduces the new term mebibyte (abbreviated MiB) for 2²⁰ bytes. Similarly, the terms kibibyte (KiB, equal to 2¹⁰ bytes) and gibibyte (GiB, equal to 2³⁰ bytes) were introduced. These naming conventions, though strongly endorsed by the IEEE and the CIPM, have not yet been widely adopted and are ignored by most people in everyday usage.
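A short sketch, again in Python and with illustrative (non-standard) helper names, shows how the same byte count reads under the SI prefix (MB, 10⁶ bytes) and the IEC binary prefix (MiB, 2²⁰ bytes):

    # Illustrative helper functions (not from any standard library).
    def to_megabytes(n_bytes: int) -> float:
        """Express a byte count in megabytes (1 MB = 10**6 bytes)."""
        return n_bytes / 10**6

    def to_mebibytes(n_bytes: int) -> float:
        """Express a byte count in mebibytes (1 MiB = 2**20 bytes)."""
        return n_bytes / 2**20

    size = 512 * 2**20                      # e.g. a 512 MiB memory module
    print(f"{to_megabytes(size):.1f} MB")   # -> 536.9 MB
    print(f"{to_mebibytes(size):.1f} MiB")  # -> 512.0 MiB

The same amount of memory is thus reported as 512.0 MiB or about 536.9 MB, which is the discrepancy the IEC prefixes are meant to remove.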
Note the distinction between a megabyte (about one million bytes) and a megabit (about one million bits). A megabit is abbreviated as Mbit (preferably) or as Mb with a lower case "b". There are eight bits in one byte, so a megabyte (MB) is eight times as large as a megabit (Mb or Mbit). Megabits are often used in applications where a serial bitstream is the item of interest, particularly in communications and in specifying the internal data rate of a computer hard drive. In these contexts, one megabit is almost invariably defined as 10⁶ bits. In practice, the abbreviation Mb is frequently encountered as a mistaken notation for MB. In most cases, an examination of the context will indicate which unit of measure was intended.
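This eight-to-one relationship is the usual conversion when, for example, a network data rate quoted in Mbit/s is restated in MB/s. A minimal sketch, assuming the decimal definition (10⁶) for both units and using hypothetical names:

    BITS_PER_BYTE = 8

    def mbit_s_to_mb_s(rate_mbit_s: float) -> float:
        """Convert a data rate from megabits per second to megabytes per second."""
        return rate_mbit_s / BITS_PER_BYTE

    link = 100.0  # a hypothetical 100 Mbit/s link
    print(f"{link:.0f} Mbit/s = {mbit_s_to_mb_s(link):.1f} MB/s")  # -> 12.5 MB/s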
Similarly, a Gb or Gbit is a gigabit and a kb or kbit is a kilobit; these lowercase-"b" abbreviations, too, are often written in error where the corresponding byte-based units (GB, kB) were intended.