The definition of entropy that really connects with the notion of complexity is the one that comes from Boltzmann's principle. Suppose the number of distinct microscopic states that correspond to the same macroscopic thermodynamic state is n; for example, there are n ways of arranging the positions and momenta of gas molecules that give rise to the same temperature, pressure, volume, energy, etc. Then the entropy S is given by

    S = k ln(n),

where k is Boltzmann's constant and ln() is the natural logarithm.
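As a toy illustration of the formula (my own sketch, not part of the original argument: the particle numbers and site counts here are made up for the example), here is how the entropy of a small system with n equally likely microstates could be computed:

```python
import math

k = 1.380649e-23  # Boltzmann's constant in J/K (CODATA 2018 value)

def boltzmann_entropy(n_microstates):
    """Entropy S = k * ln(n) for n equally likely microstates."""
    return k * math.log(n_microstates)

# Toy example: 4 distinguishable particles, each free to occupy
# any of 10 coarse-grained sites -> 10**4 equally likely arrangements.
n = 10 ** 4
S = boltzmann_entropy(n)
print(S)  # k * ln(10000), roughly 1.27e-22 J/K
```

Note that a system with exactly one possible arrangement gets S = k ln(1) = 0, which is the sense in which a perfectly ordered state carries zero entropy.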
(You might be wondering how you count an apparently infinite number of microscopic states. To do it classically you have to use a "coarse graining" trick, but a finite count arises naturally if you treat everything quantum mechanically.)
It's now possible to see the connection between entropy and order. The more equivalent arrangements there are, the higher the entropy. If we compare a gas and a regular crystalline solid, we can imagine that there are many more possible equivalent arrangements of the gas molecules than there are arrangements of atoms or molecules within a crystal. In a gas, the molecules can be pretty much anywhere, doing "their own thing", whereas in a crystal the atoms are confined to the points of the crystal lattice.
Similarly, one can see that a gas, with its many equivalent states, is more complex than a regular crystalline solid: you need far fewer parameters to specify the locations of the atoms in the crystal than you do for the atoms in the gas.
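To make the gas-versus-crystal comparison concrete, here is a hedged toy model (the numbers and the model itself are my own illustration, not from the text): put N distinguishable atoms on a lattice of M sites. The "crystal" pins every atom to its own assigned site, giving a single arrangement, while the "gas" lets each atom sit on any site independently, giving M**N arrangements:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(n_arrangements):
    # S = k * ln(n), per Boltzmann's principle
    return k * math.log(n_arrangements)

N_atoms, M_sites = 20, 1000

# "Crystal": every atom is pinned to its own lattice site -> 1 arrangement.
S_crystal = entropy(1)

# "Gas": each atom may occupy any of the M sites independently,
# giving M**N equally likely arrangements.
S_gas = entropy(M_sites ** N_atoms)

print(S_crystal, S_gas)  # 0.0 versus k * 20 * ln(1000)
```

The gas's entropy grows with both the number of atoms and the number of available sites, while the crystal's stays at zero, matching the intuition that the less constrained system has far more equivalent arrangements.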
This has probably skimmed over a few of the subtleties, but I hope it has captured a flavour of what is going on.