Why does MSVC warn that "not all control paths return a value" with enum class?



With pre-C++11 enums, MSVC printed a warning on a switch statement that had no default case, even when all enum values were covered.

Since those enums were essentially aliases for int values, this was reasonable.

The behavior is still the same with enum class, which is strongly typed. However, I was expecting the strong typing to make the compiler consider that the only valid values are the ones declared between the braces of the enum class declaration.

I know that it's valid to cast an int to an enum class value; however, I thought such a cast was undefined behaviour if the int value was outside the enum class's set of values.

Is there a reason MSVC does this?

Clang and gcc don't seem to emit such a warning (clang only prints a warning if not all enum values are covered, and lists the missing ones).
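For concreteness, here is a minimal sketch of the situation (the names are mine, not from the thread): every enumerator is handled, yet MSVC still considers the path after the switch reachable and emits its "not all control paths return a value" warning (C4715) unless a trailing return or default is added.

```cpp
#include <string>

enum class Color { Red, Green, Blue };

std::string color_name(Color c)
{
    // All three enumerators are covered, yet MSVC assumes c may hold
    // a value outside { Red, Green, Blue } and warns about the path
    // that falls off the end of the function.
    switch (c)
    {
    case Color::Red:   return "red";
    case Color::Green: return "green";
    case Color::Blue:  return "blue";
    }
    return "unknown"; // added only to silence the warning; unreachable for valid values
}
```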

Hmm, what would happen if I did this?

enum class Foo : int
{
    Bar = 0,
    Baz = 1
};

Foo bar = Foo::Bar;
*reinterpret_cast<int*>(&bar) = 3;
switch (bar)
{
case Foo::Bar:
    // ...
    break;
case Foo::Baz:
    // ...
    break;
}

Edited by Oberon_Command


Per the standard as of November '14, you invoke undefined behavior if you do this and the value of a is not a valid enum value:

TL;DR: This is perfectly fine; leaving out the type specifier (the ": int" part) results in an unspecified value (so any value may be assigned if a is not 0 or 1):

#include <iostream>

enum class Foo : int
{
    Bar,
    Baz
};

int main() {
    int a;
    std::cin >> a;
    Foo foo = static_cast<Foo>(a);
    std::cout << static_cast<int>(foo) << std::endl;
    return 0;
}


This compiled and ran without warnings in gcc, and no dog was shot/poisoned in the process. Maybe the MSVC compiler implements an older version of the standard? Back then, foo was an unspecified value of its underlying type, allowed to be outside of its value range.

From a compiler standpoint, the easiest way to handle such values is to simply convert them to the underlying type and ignore potential issues (at least gcc does this). If your program is 100% UB-free C++ conforming to the standard, the default case in the switch should not be needed.

Personally, I think having a default case in a switch is good practice and a nice way to catch subtle bugs, so I can totally see why MSVC complains about it.

Edited by duckflock


C# has a similar behavior, which I thought was odd given .NET's much smaller allowance of undefined behavior.  But accepting that as the way it was, I was recently looking into people's recommendations for what type of exception to throw in C# for the same situation, and I encountered the suggestion to throw a NotImplementedException.  The reasoning being that in the future you (or someone else) might add a new value to the enum, but forget to update every last location in code that switches on that enum.  In that scenario, complaining loudly at runtime that a path hasn't yet been implemented seems appropriate.

Granted, having the compiler complain at compile time that not all paths return a value (or initialize a variable before it is used in C#) is even better, but not all uses of a switch statement would produce that warning/error.  Even if they would, the problematic switch could be in an already compiled executable or library file which is still binary compatible with the expanded enum.  With no chance for the compiler to warn the developer, a noisy runtime error would still be welcome.
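The same idea carries over to C++ (this is an illustrative sketch, not code from the thread): a default case that fails loudly at runtime doubles as a "not implemented yet" marker and also satisfies MSVC's return-path analysis.

```cpp
#include <stdexcept>

enum class Shape { Circle, Square };

int corner_count(Shape s)
{
    switch (s)
    {
    case Shape::Circle: return 0;
    case Shape::Square: return 4;
    default:
        // Fails loudly at runtime if Shape grows an enumerator that
        // this switch was never updated to handle.
        throw std::logic_error("unhandled Shape value");
    }
}
```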


Funnily, this is just what I've been asking for from C++ (more unspecified and less undefined). I'm surprised to actually see it; I would have taken every bet that C++ made this undefined. Well, a good surprise.

I'm not sure it changed.

C++98 basically said the enum was whatever the underlying integral type was, and that it behaved like that type. Many items were listed as "implementation defined" rather than "unspecified".

The standard is quite clear about this, though: "It is possible to define an enumeration that has values not defined by any of its enumerators."  And I think that is the crux of the warning message about all paths.
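A small illustration of that sentence, using the classic flags idiom (my example, not from the standard): 3 is not named by any enumerator, yet it lies within the enumeration's range of values, so the conversion is well defined.

```cpp
enum Flags { Read = 1, Write = 2 }; // unscoped, no fixed underlying type

// The range of Flags covers [0, 3] (two value bits are needed for 1 and 2),
// so 3 is a valid Flags value even though no enumerator names it.
Flags rw = static_cast<Flags>(Read | Write);
```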


(duckflock's reply above, quoted in full)

I use the "treat warnings as errors" compilation flag. I originally wanted to ensure that if someone adds a value to an enum, the build will break wherever a switch is missing the extra case (I prefer compile-time errors over runtime ones). But MSVC doesn't seem to allow this.


Not sure this is true. As usual in C++, the answer is not immediately obvious. (sigh)

True

5.2.9/9: A value of a scoped enumeration type (7.2) can be explicitly converted to an integral type. [blah bool...] For [the remaining] integral types, the value is unchanged if the original value can be represented by the specified type. Otherwise, the resulting value is unspecified.

Ugh, you're correct. This is why I hate defect reports; they take forever to be incorporated. See http://www.open-std.org/jtc1/sc22/wg21/docs/cwg_defects.html#1766, which proposes:

A value of integral or enumeration type can be explicitly converted to an enumeration type. The value is unchanged if the original value is within the range of the enumeration values (7.2 [dcl.enum]). Otherwise, [struck out: the resulting value is unspecified (and might not be in that range)] the behavior is undefined.

The reasoning is that right now statically casting a constant to an enum produces a constexpr, which is incorrect if you think about it: several compilers might implement the unspecified value differently, so it is definitely not a portable constant. Unspecified is too weak here. This is only a minor nitpick, and in practice things will still work the same way.

To summarize (hopefully I'm right on this one, please correct me if I'm not):

Current Situation for static_cast<Foo>(value):

• If the value is outside the range of a specified underlying type (e.g. enum class Foo : char), you get undefined behavior.
• If the enum has no specified underlying type, its range is defined as the range of a value with exactly as many bits as needed to fit the enumerators. That is the bmin, bmax part in the standard. Example: enum class Foo { Bar = 0, Baz = 28 } => the range is [0, 31] (in two's complement).
• Which integral type an enum with unspecified underlying type gets is implementation defined, but it is not larger than int unless necessary.
• If you cast a value to the enum and the value is outside the enum's range, you get an unspecified value.

Right now dogs will be left alone and the result of static_cast<Foo>(FOO_MAX_PLUS_ONE) may hold any value if you do not specify an underlying type.

So this

#include <iostream>

enum class Foo : int
{
    Bar,
    Baz
};

int main() {
    int a;
    std::cin >> a;
    Foo foo = static_cast<Foo>(a);
    std::cout << static_cast<int>(foo) << std::endl;
    return 0;
}


is completely fine and will always assign the value of a to foo.

#include <iostream>

enum class Foo
{
    Bar,
    Baz
};

int main() {
    int a;
    std::cin >> a;
    Foo foo = static_cast<Foo>(a);
    std::cout << static_cast<int>(foo) << std::endl;
    return 0;
}


This is unspecified behavior as of now. Dogs will be left alone, but foo may hold any value after the static_cast if a is neither 0 nor 1.

This is especially dangerous, because

enum class Foo {
    Bar,
    Baz
};

constexpr Foo foo = static_cast<Foo>(4);


is perfectly fine and constexpr, even though one compiler may assign 0 to foo and another one -1 and yet another one 4 (most probably all will assign 4).


For MSVC, you can use __assume(0) to hint to the compiler that you know what you're doing (i.e. that the path in question is unreachable).
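A sketch of that hint, guarded so the same code still compiles with gcc/clang (the macro name is mine). Be aware that __assume(0) makes the program's behavior undefined if the "unreachable" path is ever actually reached:

```cpp
#ifdef _MSC_VER
#  define UNREACHABLE() __assume(0)
#else
#  define UNREACHABLE() __builtin_unreachable() // gcc/clang equivalent
#endif

enum class Foo { Bar, Baz };

int to_int(Foo f)
{
    switch (f)
    {
    case Foo::Bar: return 0;
    case Foo::Baz: return 1;
    }
    UNREACHABLE(); // promise to the compiler: f is always a listed enumerator
}
```

Placing the hint after the switch rather than in a default case keeps gcc/clang's -Wswitch exhaustiveness check alive for the enumerator list.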


is perfectly fine and constexpr, even though one compiler may assign 0 to foo and another one -1 and yet another one 4 (most probably all will assign 4).

I think constexpr isn't really the issue. It's fine for one compiler to assign 0 and another to assign -1. Indeed, "makes no sense" is a perfectly good constant expression, no matter what its value may be (as long as that value doesn't change during the lifetime of the program). Note that it is also perfectly fine to do any of these:

enum class TrafficLight{ red = 1, yellow = 2, green = 4 }; // assumes you are neither in UK nor Germany where there is red-yellow too
enum class AlertCondition { none = 0, yellow = 1, red = 2 };

TrafficLight elm_street = static_cast<TrafficLight>(3);                                        // WTF?
TrafficLight main_street = static_cast<TrafficLight>(TrafficLight::red + TrafficLight::green); // WTF?
TrafficLight second_street = static_cast<TrafficLight>(0);                                     // Does that mean it's turned off?


Thing is, it is entirely meaningless to do any of the above. Assigning a red traffic light to an alert condition would generate a yellow alert (the value fits within the range, so it is preserved). That was the primary example in the reasoning behind adding strongly typed enums (which are a blessing and a curse: why, just why, can you not pull them into a function with a using-declaration the way you could with an anonymous enum wrapped in a namespace? And neither can the compiler figure out from a function's signature that only one type is possible, so you always have to type out the fully qualified name when you shouldn't have to).

But also, it is possible, and legitimate, to have red and green on a traffic light at the same time (and indeed this very rarely happens in real life; for several decades, traffic lights in Germany were not allowed to be operated by computer systems for exactly that reason, even though the technology was readily available; they were strictly mechanical, with forcible, mutually exclusive switches). Depending on whether you follow the "underlying type" or the "otherwise" clause in the two contradicting paragraphs of the standard, this will or will not preserve the numeric value. But in any case the numeric value has no meaning, so that is inconsequential.

It just doesn't make sense to do certain things. Because of that, however, if your program allows such a thing to happen, it is perfectly harmless for the value you get to be arbitrary. There is no difference between one arbitrary "doesn't make sense" and any other "doesn't make sense".

If, however, such a thing can be detected at compile time, it is immensely helpful for the compiler to warn you about it, because you are almost certainly making a gigantic, stupid mistake. To that extent, I deem this warning very valuable.

Edited by samoth


is perfectly fine and constexpr, even though one compiler may assign 0 to foo and another one -1 and yet another one 4 (most probably all will assign 4).

I think constexpr isn't really the issue. It's fine for one compiler to assign 0 and another to assign -1. Indeed, I think that "makes no sense" is certainly a perfectly good constant expression, no matter what its value may be (as long as the value doesn't change during the lifetime of the program).

Thinking a bit more about the problem: constexpr evaluation already differs from compiler to compiler (e.g. for floating point), so this is in fact a non-issue.

I agree with everything you have said (except that TrafficLight::red + TrafficLight::green does not compile, because there is no operator+ and an enum class is not implicitly convertible to its underlying numeric type).

IMHO, just add a default case to switches; it may come in handy at some point in the future and catch some errors. There is no harm in doing that.

Doesn't MSVC have a compiler flag to turn this warning off?


Doesn't MSVC have a compiler flag to turn this warning off?

You can turn specific warnings off locally using #pragma warning(push), #pragma warning(disable: ...), and #pragma warning(pop). That doesn't mean it's a good solution, though.
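For reference, a sketch of that pattern (4715 is assumed here to be the warning number for "not all control paths return a value"); gcc and clang simply ignore these MSVC-specific pragmas:

```cpp
enum class Foo { Bar, Baz };

#pragma warning(push)
#pragma warning(disable : 4715) // "not all control paths return a value"
int to_int(Foo f)
{
    switch (f) // no default and no trailing return: MSVC would warn without the pragma
    {
    case Foo::Bar: return 0;
    case Foo::Baz: return 1;
    }
}
#pragma warning(pop)
```

Scoping the disable with push/pop keeps the warning active for the rest of the translation unit.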


Is there some way, with MSVC, to detect at build time whether all enum values are present in every switch statement of a code base, then? Adding a default case only fails at runtime, and I don't see any "type" that can hold only symbolic values while making the actual binary representation inaccessible.


In D, you can use a final switch with an enum. This will cause the compiler to error out if you don't have a case for every member and won't let you use runtime-initialized values. It's very handy. It would be great to see something like that in C++.
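The closest C++ approximation I'm aware of relies on compiler flags rather than the language (an assumption about your toolchain, not something from this thread): gcc and clang's -Wswitch fires when a switch over an enum has no default case and misses an enumerator, so combining it with -Werror turns a newly added enumerator into a build break.

```cpp
// Compile with: g++ -Wswitch -Werror  (or clang++)
enum class State { Idle, Running };

int code_for(State s)
{
    switch (s) // no default case, so -Wswitch reports any unhandled enumerator
    {
    case State::Idle:    return 0;
    case State::Running: return 1;
    }
    return -1; // unreachable for valid values; keeps MSVC's return-path analysis happy
}
```

Adding a third enumerator to State without touching code_for would then fail the build instead of falling through silently at runtime.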
