As others have pointed out, what you're calling DOD is more akin to procedural-style programming, as is typical of C code. You can even do OOP in C; the language just doesn't give you convenient tools for it. Likewise, you can do actual DOD using OOP techniques, procedural techniques, functional techniques, or others besides.
When we talk about Object-Oriented, Procedural, Functional, Declarative (and more) styles of programming, we typically call those programming paradigms. A language designed to fit one (or a blend) of those paradigms typically has language-level features, and makes language and library design decisions, that support and encourage programmers in adopting a certain mindset when expressing their solutions at the level of source code.
As of yet, I'm not aware of any language that adopts Data-Oriented Design in the way that, say, C++ adopts Object-Oriented Design, and I (and most people, I would assume) tend to think of DOD as existing on a separate plane, mostly orthogonal to the plane where OOP, Procedural, and other programming paradigms live. That's because actual DOD isn't really about how a programmer maps their solution to source code; it's about how their solution maps to the realities of hardware, with an emphasis on which data belong physically together and how data flows through the program logic. DOD says that this mapping from solution to real hardware is more important than the mapping from a programmer's solution to source code -- thus, in DOD, hardware realities drive the solution, and the solution drives the source code.

This is the reverse of the typical approach, where programmers either don't deeply consider the realities of hardware (indeed, some schools of programming actively discourage such considerations) or, if they do, attempt to retrofit hardware considerations as optimizations after the program structure, according to whichever programming paradigm, has already crystallized and become difficult to fundamentally change. DOD has to be considered from the start since it dictates how your data will be organized and how it will flow, at least for the processing-intensive parts of your program that will benefit from it; DOD can't be an afterthought.
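To make that concrete, here's a minimal C++ sketch (the `Particle` type and its fields are made up for illustration) contrasting the object-centric "array of structs" layout with the "struct of arrays" layout a DOD practitioner might choose after observing that the hot loop only touches positions and velocities:

```cpp
#include <cstddef>
#include <vector>

// Object-centric layout: each particle's fields travel together, so a
// loop that only needs position and velocity still drags color and
// lifetime through the cache with every particle it reads.
struct Particle {
    float x, y, z;      // position
    float vx, vy, vz;   // velocity
    unsigned color;
    float lifetime;
};

void update_aos(std::vector<Particle>& ps, float dt) {
    for (Particle& p : ps) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

// Data-oriented layout: fields that are processed together are stored
// together, so the hot loop streams through densely packed,
// cache-friendly arrays that also vectorize well.
struct Particles {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
    std::vector<unsigned> color;
    std::vector<float> lifetime;
};

void update_soa(Particles& ps, float dt) {
    for (std::size_t i = 0; i < ps.x.size(); ++i) {
        ps.x[i] += ps.vx[i] * dt;
        ps.y[i] += ps.vy[i] * dt;
        ps.z[i] += ps.vz[i] * dt;
    }
}
```

Notice that switching from the first layout to the second changes every access site that touches a particle, which is exactly why this kind of decision has to be made up front rather than retrofitted later.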
On OOP, one of the troubles is that what's taught as "OOP" in books and in college classrooms tends to be a very shallow and dogmatic view of it. Most colleges today teach OOP using Java, which as a language is particularly dogmatic (there are many reasonable choices the language simply disallows because its designers deemed their one true way automatically superior) and needlessly verbose as a result. Thus, Java is all the OOP many people know when they leave college, and they go on to program in C# or C++ or other "OOP" languages as if they were Java.
Java has no free-standing functions, Java has no operator overloading, Java is garbage-collected, and Java is intrinsically independent of any real hardware because it targets a fictional, homogeneous virtual machine.
Java made choices largely opposite to C++'s, even though the two look superficially similar. C++ has free-standing functions, supports operator overloading, is not garbage collected (or even reference counted by default), and does not make itself independent of real hardware, but it defines explicitly where platform differences may appear (simultaneously discouraging, but allowing, reliance on such platform-specific behavior). These are just a few examples, and both languages have their place, but it should come as no surprise that programming either one as if it's the other, where that's even possible, does a disservice to the program. It's like trying to speak Spanish by mixing Spanish words with English grammar -- you might be able to communicate your ideas in the end, but you sound like an idiot, and everyone wonders why you're so confident of your ability to speak Spanish.
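For instance, here's a toy `Vec2` type of my own (not from any particular library) showing one of those differences in miniature: C++ lets arithmetic on a user-defined type be expressed with a free-standing operator, which Java has no way to write at all:

```cpp
#include <cstdio>

struct Vec2 {
    float x, y;
};

// A free-standing operator overload: not a member of Vec2, yet part of
// Vec2's natural interface. In Java the closest you can get is a
// method, so callers write a.add(b) instead of a + b.
Vec2 operator+(Vec2 a, Vec2 b) {
    return Vec2{a.x + b.x, a.y + b.y};
}

int main() {
    Vec2 a{1.0f, 2.0f}, b{3.0f, 4.0f};
    Vec2 c = a + b; // reads like built-in arithmetic
    std::printf("(%f, %f)\n", c.x, c.y);
    return 0;
}
```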
In C++, for example, it's good practice for a class to be as small as possible, containing only the member variables necessary and only the member functions that must be able to manipulate those member variables directly. What's more, in C++, free-standing functions inside the same namespace as a class, if they operate on that class, are every bit as much a part of that class's interface as member functions are, because of how C++ name lookup and overload resolution work (see: Koenig Lookup, also called argument-dependent lookup; a sketch follows below). In Java-style OOP this cannot be, because the language says every function must be part of a class -- and as a result every function can manipulate member variables directly even if it doesn't need to (Java's approach is worse for encapsulation, and makes testing more difficult in much the same way that global state does). This one difference makes good, idiomatic program design fundamentally different between the two languages. All of this is a long way of saying that even within "OOP", there are different, competing flavors that dominate in one language or another.

Finally, while I have no love for Java, I don't mean to leave you with the impression that C++-style OOP is the best style of OOP. C++ happens to be a particularly popular and mostly-good blending of OOP with control over low-level hardware concerns, which, combined with its mostly-C-compatible roots, has made it very attractive for game development and other computationally-intensive domains where efficient hardware utilization pays dividends. C++ is not even a "pure" form of OOP, and many computer scientists argue that languages like Simula (the first OOP language) and Smalltalk (another very early OOP language, influenced by Simula) have never been surpassed as examples of the OOP paradigm.
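Here's that Koenig lookup sketch (the `geo` namespace and `Circle` type are hypothetical): a free function in the class's namespace is found without qualification, so it reads exactly like part of the class's interface, yet it can only use the public API and so cannot break the class's invariants:

```cpp
#include <cstdio>

namespace geo {

class Circle {
public:
    explicit Circle(double r) : radius_(r) {}
    double radius() const { return radius_; }
private:
    double radius_; // the only state; only members touch it directly
};

// Free-standing function in the same namespace: part of Circle's
// interface in every practical sense, but with no special access to
// Circle's internals.
double area(const Circle& c) {
    return 3.141592653589793 * c.radius() * c.radius();
}

} // namespace geo

int main() {
    geo::Circle c(2.0);
    // Unqualified call: argument-dependent (Koenig) lookup finds
    // geo::area because the argument's type lives in namespace geo.
    std::printf("area = %f\n", area(c));
    return 0;
}
```

Because `area` isn't a member, it can't accidentally depend on or corrupt `radius_`, and it can be tested or replaced without touching the class. That's the encapsulation benefit Java's everything-is-a-member rule gives up.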
In the end, the best programs tend to balance pragmatism with just enough looking forward. Programs that see the light of day tend to do only what they need, without caring overmuch about how pretty or fast or ideologically pure they are. At the same time, they avoid painting themselves into a corner -- too much specialization too soon, in the wrong places, or without good reason often ends up as wasted effort when it proves inflexible in the face of necessary changes later on. There isn't a formula for this balance; it's something you gain a feel for through experience, and to a lesser extent by learning from others who are experienced. It's the art of knowing when "better" has become "good enough", and accepting that past that point, "better still" is rarely a justification unto itself. It's accepting, even embracing, that we know less about a problem today than we ever will, and so not making big bets on unknowns, for or against (as a side note, this is not at odds with DOD, since hardware realities are known and essentially immutable).