Why different wireless network frequencies?
802.11 Wi-Fi standards use different unlicensed frequency bands, the most common being 2.4 GHz and 5 GHz. Here is a quick breakdown for your reference:
900 MHz (802.11ah): lower-rate, extended-range wireless
2.4 GHz (802.11b/g/n): 14 channels of ~20 MHz each, 11 usable in the US
5 GHz (802.11a/n/ac): more and wider channels, higher data rate per channel
60 GHz (802.11ad, aka WiGig): very high bandwidth, same-room environments
The most notable difference between these bands is that as frequency increases, so does available bandwidth, at the expense of range. Higher-frequency networks can carry much more information, but over shorter distances. The 60 GHz band is an extreme example: it is mostly used for same-room, line-of-sight streaming of UHD video, since it cannot penetrate walls. The more common 2.4 and 5 GHz bands are the most widely used for residential applications, while the lower 900 MHz band is mostly used in rural environments for extended-range, lower-bandwidth, low-power networks.
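The range penalty at higher frequencies can be made concrete with the standard free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) − 147.55 (d in meters, f in Hz). A minimal sketch comparing 2.4 GHz and 5 GHz at the same distance:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB; the -147.55 constant assumes
    distance in meters and frequency in Hz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# At 10 m, 5 GHz loses roughly 6.4 dB more than 2.4 GHz --
# over a quarter of the received power for each 3 dB.
print(round(fspl_db(10, 2.4e9), 1))  # 60.1
print(round(fspl_db(10, 5.0e9), 1))  # 66.4
```

Real indoor range is worse than free space suggests (walls and furniture attenuate higher frequencies more), but the formula captures why the trade-off exists at all.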
The 2.4 GHz band offers good range and bandwidth for residential applications. However, the limited number of channels (11 or 14 total, only 3-4 non-overlapping), interference from neighboring networks, cordless phones, microwave ovens, and other EMI sources, and ever-increasing bandwidth requirements have caused a shift to the 5 GHz band for newer devices.
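The overlap problem follows directly from the channel plan: 2.4 GHz channels 1-13 are centered at 2407 + 5n MHz, so centers sit only 5 MHz apart while each channel is roughly 20 MHz wide. A small sketch of that arithmetic:

```python
def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz Wi-Fi channel (valid for channels 1-13)."""
    return 2407 + 5 * channel

def overlaps(ch_a: int, ch_b: int, width_mhz: int = 20) -> bool:
    """Two channels overlap if their centers are closer than one channel width."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

# Channels 1, 6 and 11 are 25 MHz apart, so they are mutually clear;
# anything closer shares spectrum with its neighbors.
print(overlaps(1, 6))  # False
print(overlaps(1, 3))  # True
```

This is why the classic non-overlapping set in the US is 1, 6, 11: it is the densest packing of 20 MHz channels into the band.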
802.11ac uses the 5 GHz band exclusively, with a second radio in the 2.4 GHz band for backward compatibility with 802.11n networks. The slightly reduced range is offset by newer techniques such as beamforming, MU-MIMO, wider channel bonding, more available channels, 256-QAM modulation, and reduced interference from other devices.
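The 256-QAM gain is easy to quantify: an M-QAM constellation carries log2(M) bits per symbol, so 802.11ac's 256-QAM packs 8 bits per symbol versus 6 for 802.11n's 64-QAM, a 33% raw rate increase per subcarrier. A quick illustration:

```python
import math

def bits_per_symbol(qam_order: int) -> int:
    """An M-QAM constellation encodes log2(M) bits in each symbol."""
    return int(math.log2(qam_order))

# 64-QAM (802.11n) vs 256-QAM (802.11ac): 6 -> 8 bits per symbol.
print(bits_per_symbol(64))   # 6
print(bits_per_symbol(256))  # 8
```

The denser constellation needs a cleaner signal, which is another reason it pairs well with the less congested 5 GHz band.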
If there is one thing to take away from all this, remember that higher frequencies offer higher bandwidth, while lower frequencies offer longer range and better obstacle penetration.
See also: Is 5 GHz Wireless better than 2.4 GHz?