I've been wanting to delve back into audio/video playback for a while now, and the three libraries mentioned above seem like a good place to start. How do they work together, though? I did some research, and it looks like libogg is used only to read an Ogg file containing Vorbis and Theora packet streams into basic structs holding the encoded data plus some metadata about each packet (stream serial number, data length, etc.). An Ogg file can multiplex multiple logical streams, though a typical video file has two: one Vorbis stream (audio) and one Theora stream (video). It seems that libvorbis and libtheora are then used to decode their respective packets into raw, uncompressed audio/video data that can be handed to a lower-level API for playback, such as OpenAL (audio) or OpenGL (video).
That said, Ogg is really a container format, and Vorbis and Theora files are conventionally given different extensions: .oga (or historically .ogg) for Vorbis files, which contain only an audio stream, and .ogv for Theora files, which contain a video stream with an optional audio stream. Does this sound correct so far?