Big endian

In computing, endianness is the ordering of individually addressable sub-units (words, bytes, or even bits) within a longer data word stored in memory. The most common case is the ordering of bytes within a 16-, 32-, or 64-bit word, where endianness is often simply called byte order. The usual contrast is between storing the most significant byte first, called big-endian, and the least significant byte first, called little-endian. Mixed forms are also possible: the ordering of bytes within a 16-bit word may differ from the ordering of 16-bit words within a 32-bit word. Such cases are fairly rare but do exist, and are sometimes called mixed-endian or middle-endian.
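A small sketch can make the contrast concrete. Using Python's standard struct module, the same 32-bit value (0x0A0B0C0D, chosen here purely for illustration) can be serialized most-significant-byte-first or least-significant-byte-first:

```python
import struct

value = 0x0A0B0C0D  # an example 32-bit word

# Pack the same value with each byte order and inspect the byte layout.
big = struct.pack(">I", value)     # big-endian: most significant byte first
little = struct.pack("<I", value)  # little-endian: least significant byte first

print(big.hex())     # 0a0b0c0d
print(little.hex())  # 0d0c0b0a
```

The bytes are the same; only their order in memory differs, which is exactly what endianness describes.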

Endianness may be seen as a low-level attribute of a particular representation format, for example the order in which the two bytes of a UCS-2 character are stored in memory. Byte order is an important consideration in network programming, since two communicating computers may use different byte orders. Failure to account for differing endianness when writing code for mixed platforms can lead to bugs that are difficult to detect. The terms Little-Endians and Big-Endians were introduced on April Fools' Day 1980 by Danny Cohen in his paper "On Holy Wars and a Plea for Peace". Cohen drew on Gulliver's Travels by Jonathan Swift (1726), in which two religious sects of Lilliputians argue over whether to crack open their soft-boiled eggs from the little end or the big end,[3] as an allegory for the byte-order issue, which became crucial once computers were interconnected by networks.
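The network-programming concern above is conventionally resolved by agreeing on a single byte order on the wire: Internet protocols use big-endian, known as "network byte order". A minimal sketch in Python, using the standard socket and struct modules (the value 0x12345678 is just an example):

```python
import socket
import struct

value = 0x12345678

# "!" in a struct format string means network (big-endian) byte order,
# so the wire representation is the same regardless of host endianness.
wire = struct.pack("!I", value)
print(wire.hex())  # 12345678

# socket.htonl reorders a 32-bit integer from host to network order;
# packing its result in native order ("=") yields the same wire bytes,
# whether the host itself is big- or little-endian.
assert struct.pack("=I", socket.htonl(value)) == wire
```

Code that always converts to network byte order at the boundary, as sketched here, avoids the mixed-platform bugs described above.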