So the wrong thing to focus on is the "10 bit monitor" part, at least initially. The right thing to ask is "what is the colorspace". A "colorspace" defines the range of colors a display can reproduce. Here's a handy chart
The total area of the chart represents the colors the human eye can see; the triangles represent different "colorspaces". The points of each triangle are the shades of red, green, and blue that get mixed together to create the colors in between. Right now we are at the smallest triangle, REC 709, and have been practically since color TV was invented. The "bits", as in "10 bit", "8 bit", etc., come in when you want to display the colors in between the far points of the triangle.
Right now we have (for the most part) 8-bit monitors; in binary that works out to each color (red, green, blue) having 256 shades to combine into the colors in between. For REC 709 that's fine, you (mostly) won't see banding. But when we get to the bigger triangles, we need more shades of each color to cover the space in between unless we want banding. E.g.
This is supposed to be smooth, but there aren't enough colors to represent a smooth change to our eye, so we see obvious "jumps" in color. That's where the extra bits come in: more colors to put in between. Ten bits gives 1024 shades of each color, which is enough for the middle triangle, the DCI colorspace, or rather what movies are (ideally) projected in at theaters. It's also what the first wave of "HDR!" screens support.
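To make the bit-depth point concrete, here's a rough sketch in plain Python (illustrative only, not tied to any real display pipeline) that counts shades per channel and quantizes a smooth gradient; the big steps at 8 bits are exactly the visible "jumps" above.

```python
# Rough sketch: shades per channel at a given bit depth, and what
# quantizing a smooth 0..1 ramp to that depth does to it.

def shades(bits: int) -> int:
    """Number of distinct levels per color channel at a given bit depth."""
    return 2 ** bits

def quantize(value: float, bits: int) -> float:
    """Snap a 0..1 value to the nearest representable level."""
    levels = shades(bits) - 1
    return round(value * levels) / levels

for bits in (8, 10, 12):
    step = 1.0 / (shades(bits) - 1)
    print(f"{bits}-bit: {shades(bits)} shades per channel, "
          f"step between adjacent shades = {step:.6f}")

# A finely sampled smooth ramp collapses onto far fewer levels at 8 bits;
# spread over a wider gamut, those levels are far enough apart to see as bands.
ramp = [i / 999 for i in range(1000)]
banded = [quantize(v, 8) for v in ramp]
print("distinct levels in the 8-bit ramp:", len(set(banded)))
```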
Unfortunately, or fortunately depending on your point of view, there's also the bigger triangle, the REC 2020 colorspace, which is what's supposed to be supported eventually but couldn't quite make it out this year. Covering that area without banding would take 12 bits per color. Which colorspace will win is a complicated mess and no one knows. Regardless, now that I've covered what's actually going on, on to the question.
For one part of production, the shades of color and the 10+ bits, it's going to be easy. Right now albedo textures generally take up 24 bits (8 bits per color channel). To cover the bigger triangle, i.e. to do it properly, you just up the bits, maybe to say 11,11,10, and hope people don't notice banding if the textures are dithered. Other things get upped too; some people still use 10 bits per color channel for HDR rendering (think going into a bigger triangle virtually, then shrinking it back down), which shows some banding on 8-bit output. But for 10+ bit output the minimum HDR render target will probably be 16 bits per channel. So, more rendering power, ouch, but relatively easy to do. Though it should be noted that right now GPUs have automatic "support" for REC 709, converting back and forth between REC 709's gamma curve and linear values (which are easier and proper to do math with), while the bigger triangles use different gamma curves and will need that conversion either done manually or handled by new GPUs that do it quickly and automatically.
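As a rough illustration of that gamma/linear round trip the GPU normally does for you, here's a small Python sketch. The constants are the standard sRGB ones (which is what most "REC 709" game content actually uses); an HDR/wide-gamut pipeline would swap in a different transfer function (e.g. PQ) and work at higher precision, so treat this as a sketch, not the actual method described above.

```python
# Sketch of the gamma <-> linear round trip that GPU texture hardware
# normally does automatically for 8-bit sRGB/Rec.709-style content.

def srgb_to_linear(v: float) -> float:
    """Decode a stored (gamma-encoded) 0..1 value to linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l: float) -> float:
    """Encode linear light back to the 0..1 value you'd store or display."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * (l ** (1 / 2.4)) - 0.055

# Lighting math (adding, blending, multiplying) should happen on linear
# values; encode only at the very end when writing to the display.
a, b = srgb_to_linear(0.5), srgb_to_linear(0.25)
blended = (a + b) / 2                 # correct: average in linear space
wrong   = (0.5 + 0.25) / 2            # common mistake: average encoded values
print(linear_to_srgb(blended), wrong)
```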
Asset creation will be the hard part. Right now every studio does REC 709 textures, colorspace, and so on. The cameras used for texture sampling are set up for REC 709, as are the camera color checkers, the monitors, the GPUs, and the modeling software. All of that will have to be replaced, ideally from the moment real production begins (years in advance of release for triple-A games), to produce assets that are meant to be displayed in the higher colorspaces. You can convert older textures into the newer spaces automatically, but that doesn't mean artists are going to be happy with the results; it might take a while to go over all the textures manually to get them looking acceptable. You might also see older 8-bit textures used in games that "cover" the higher colorspaces (according to marketing...), relying only on lighting values that reach into the higher colorspaces. Obviously not ideal, but I wouldn't doubt that at least one or more games will go for it.
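For a sense of what that "automatic" conversion looks like, here's a minimal sketch in plain Python. The 3x3 matrix is the commonly published BT.709-to-BT.2020 primary conversion and is applied to linear values; a real asset pipeline would also deal with bit depth, dithering, tone mapping, and an artist reviewing the result on a wide-gamut monitor.

```python
# Minimal sketch: re-expressing a linear Rec.709 RGB color in Rec.2020
# primaries. Matrix values are the commonly published BT.709 -> BT.2020
# conversion, applied to LINEAR light (decode gamma first, re-encode after).

REC709_TO_REC2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Convert one linear Rec.709 RGB triple into Rec.2020 primaries."""
    return tuple(
        sum(REC709_TO_REC2020[row][col] * rgb[col] for col in range(3))
        for row in range(3)
    )

# Pure Rec.709 red lands well inside the Rec.2020 triangle: the same color,
# just described with the wider gamut's primaries. It doesn't gain any
# saturation, which is why artists may not love automatic conversion.
print(rec709_to_rec2020((1.0, 0.0, 0.0)))   # ~ (0.627, 0.069, 0.016)
```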
But ideally you'd have all assets designed from the start for whatever higher colorspace ends up being used. With (right now) quite slow adoption of "HDR!" screens, little to no support from anything else (Netflix, image formats, etc.), and the need for more processing power, I'd say you aren't going to see many games supporting "HDR!" for years, and quite possibly not uniformly until yet another new generation of consoles (or whatever, depending on how long these last).
Hopefully that covers everything you, and anyone else for that matter, wanted to know.