What's an operating system?
The operating system is the piece of software that sits between the hardware and the running programs; its *only* job is to share the hardware among those programs. Its importance is overshadowed only by its lack of glamor.
Some companies have produced operating systems, bundled them up with a bunch of other software like graphical user interfaces, web browsers, and games nobody wants to play, and marketed them with pretentious names like "Windows 8 Professional" and "OS X Yosemite". Due to their prevalence, this is what most people think of when they hear "operating system", but the fact is that the operating system itself, often called the *kernel*, is really a very small corner of what they're selling.
Note that picking an operating system can limit your software choices: programs compiled for Windows won't necessarily run on OS X or Linux or FreeBSD and vice versa.
Linux is an open-source POSIX-compatible operating system. Wikipedia has good descriptions, but briefly: open source means you can download and modify its source code, and POSIX-compatible means it complies with the rules that define a modern UNIX-y system.
The existence of the POSIX standard implies the existence of other UNIX-y operating systems besides Linux, and this is true. I will not recount the history of UNIX here (Wikipedia has an article); suffice it to say that there now exist two main families, Linux and BSD, though a third, Solaris, stubbornly refuses to go away.
The BSD family includes FreeBSD, OpenBSD, NetBSD, DragonflyBSD, and others. "BSD" stands for Berkeley Software Distribution, as it came out of UC Berkeley in the 70s.
Solaris was originally produced by Sun Microsystems, which has since been bought by Oracle. As far as I know, it's still alive; an open-source relative exists under the name OpenSolaris.
As mentioned previously, Linux is just an operating system, so where does one get all of the other software one might want to use on a Linux system?
Digression: installing software
Most of the software you'll use on Linux is publicly released as source code rather than compiled, binary executables. Absent any other facilities, if you want to install a piece of software, you'd download the source code and compile it yourself.
Running gcc by hand on every file you download isn't practical,
especially when a single program comprises a few hundred (or thousand or
hundreds of thousands) of files. Fortunately, as this is a tedious,
repetitive, mechanical task, people have written software to do this for you; of
these, GNU make is the most common. It
has a bunch of other features that make it pretty awesome for compiling
programs and, indeed, other tasks as well.
In the simplest case, you'd download the source code, uncompress it, and let GNU make have its way with it:
$ wget http://www.example.com/blargh-1.0.2.tar.gz
$ tar -zxf blargh-1.0.2.tar.gz
$ cd blargh-1.0.2
$ make
When make is run with no arguments, it usually compiles the software.
Makefile authors usually include an additional *target* that installs the
software in an appropriate place on your system. Because this "appropriate
place" is usually a system directory like
/usr, you typically need to be
root to perform the operation:
$ sudo make install
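What a bare make builds, and what make install copies where, is spelled out in the project's Makefile. A stripped-down, hypothetical Makefile for blargh might look like this (real ones are far longer, and /usr/local/bin is a common but not universal install location):

```make
# Default target: a bare `make` builds the program.
blargh: blargh.c
	gcc -o blargh blargh.c

# The additional "install" target: `make install` copies the result into place.
install: blargh
	cp blargh /usr/local/bin/blargh
```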
Authors of open-source software frequently want their programs to work on a
variety of operating systems (not just Linux!) and other operating systems
often have different ways of doing things that affect how a program must be
compiled (for instance, the cryptography library might be stored in a
different location in the filesystem). Unfortunately,
make has difficulty
dealing with these variations on its own. Therefore, unsurprisingly, people have written
software that *can* deal with them: it scours the system to
discover what varies and then generates the Makefile that is then read by
make. So now the build process looks something like this:
$ ./configure
$ make
$ sudo make install
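The idea behind a configure script can be sketched in a few lines of shell. This is a hand-rolled illustration, not actual GNU autoconf output; the compiler probe here stands in for checks like the library-location example above:

```shell
#!/bin/sh
# Sketch of what a configure script does: probe the system for things that
# vary, then generate a Makefile with the discovered values baked in.

# Probe: which C compiler is installed? (cc vs. gcc varies across systems.)
CC=cc
command -v gcc >/dev/null 2>&1 && CC=gcc

# Generate: write a Makefile for make to consume.
# (The \t is the tab character that Makefile recipe lines must begin with.)
printf 'CC = %s\nblargh: blargh.c\n\t$(CC) -o blargh blargh.c\n' "$CC" > Makefile

echo "configure: using CC=$CC"
```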
Except even this leaves something to be desired. Why make every user of a piece of software download the source and compile it themselves when they're all going to be producing the exact same resultant compiled binaries? This is where operating system *distributions* come into play.
What's a distribution?
A distribution, of which there are a huge number (see Links below), is primarily defined by the install media, the set of compiled software one can install, and the means by which one installs it. Historically, distributions have also been differentiated by the software that sets up the system when it's turned on, but that is becoming less of a factor these days (some would say to the detriment of many).
When I say, "the means by which one installs it", I'm referring to how the compiled software (which may comprise many separate files) is packaged up into a single archive so that people can conveniently download it. Once downloaded, the files in the archive need to be extracted and moved to appropriate places in the filesystem. There are different ways one can orchestrate this process; the two most prevalent package formats are "rpm" and "deb". Distributions using rpm-format packages include Red Hat, Fedora, and CentOS; distributions using deb-format packages include Debian, Ubuntu, and Linux Mint.
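In practice you rarely handle these archives by hand: each family ships a package manager front end that downloads a package and puts its files in place for you. Installing a hypothetical package named blargh might look like this (apt and dnf are the usual front ends for the deb and rpm families, respectively):

```shell
# deb family (Debian, Ubuntu, Linux Mint):
$ sudo apt install blargh

# rpm family (Fedora, CentOS; older releases use yum):
$ sudo dnf install blargh
```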
(Aside: some distributions don't produce precompiled packages---these are called "source distributions", which differentiates them from distributions that primarily distribute binary packages---and chief among the Linux-based source distributions is Gentoo, which took its inspiration from the FreeBSD ports system. Ironically, FreeBSD has since gravitated towards a binary distribution model. There are also distributions that mix the two; Arch Linux is a good example of this.)
DistroWatch tracks the absurd number of open-source operating system distributions, including non-Linux distributions.