Why must a variable be initialized?

Started by
49 comments, last by Aardvajk 12 years, 10 months ago
Hello forum, Vaironl here.

I was working on a very basic calculator program some weeks ago, and someone told me to initialize my variables like

int x,y;
x = 0;
y = 0;


But I was wondering, if a value will be inputted later like 3 or 4 or whatever... why do I have to say x = 0; or y = 0?

Thanks in advance



You don't have to, but for consistency, clarity, and easier debugging, you should.
Edge cases will show the design flaws in your code!
Variables need to be in a consistent state before they are read from. The easiest way to ensure they are consistent is to initialise them during declaration. This way, re-ordering the code won't move a read before the first write. Reading them from input can be considered a form of initialisation, even though it is usually performed as an assignment.
When you create a variable without initializing it, it will be null. Here's an example:


int x;

x = null (nothing)




int x = 0;

x = 0


You can assign a value to a variable that is null, but if you try to read a variable that is null, your program will probably crash if you don't use a try-catch statement. This is very logical, because you are telling the program to read something that doesn't exist.
It's like a person without a name ;)

Two reasons (maybe more) why you want to do it in the beginning:
1. Some compilers require that no variables are null.
2. It's easier to initialize all variables in one place, so you can easily go back later and change some values if you need to.
Maybe you could write your code like this:


int x=0;
int y=0;


I prefer declaring every variable on its own. Looks cleaner, I think.

When you create a variable without initializing it, it will be null. [...] Some compilers require that no variables are null.


This is only true in certain languages. For instance, in C and C++, the value of an uninitialized variable is undefined, not null.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Oh the wonders of google! An opinion on the subject:

By always initializing variables, instead of assigning values to them before they are first used, the code is made more efficient since no temporary objects are created for the initialization. For objects having large amounts of data, this can result in significantly faster code.

And the relevant wiki:

In computing, an uninitialized variable is a variable that is declared but is not set to a definite known value before it is used. It will have some value, but not a predictable one. As such it is a programming error and a common source of bugs in software.
A common assumption made by novice programmers is that variables are set to a known value, such as zero, when they are declared. While this is true for many languages, it is not true for all of them, and so the potential for error is there. Languages such as C use stack space for variables, and the collection of variables allocated for a subroutine is known as a stack frame. While the computer will set aside the appropriate amount of space for the stack frame, it usually does so simply by adjusting the value of the stack pointer, and does not set the memory itself to any new state. Therefore whatever contents of that memory at the time will appear as initial values of the variables which occupy those addresses.

I had a bug recently where I was using an uninitialized variable, and the first operation I performed with it took into account whatever value it already had. Too much confidence in oneself can lead to this kind of noob mistake. That's why it's recommended to stick to good coding practices like initializing variables, just in case you were driving your programming locomotive like crazy and made a stupid mistake like that.
Hello,

new member here too, although I'm not new to programming. I saw this topic here while browsing the forum and thought I give my opinion as well.

I never initialize variables at the point of definition, and here is why: if you don't initialize a variable and you use it in a statement that consumes its value, for example on the right side of an assignment or in a function call as a non-reference parameter, the compiler will give a big warning that you are using an uninitialized variable, and it will assert at run time. That is, if you compile in debug mode and you are using Visual Studio. I don't know about other compilers, but I would imagine they have similar options.

If you initialize a variable at the time of definition and you use it accidentally without assigning it a "real" value, the compiler and runtime will not warn you.

Now this assumes that you don't know what "real" value to assign to the variable at the point of definition, for example if you define all your variables near the top of the function and the initial value depends on calculations you do later in the function. If you already know the initial value at that point, you can of course set it; however, I prefer to do it close to the code section where the variable is used in context, to have everything close together and not spread all over the function.

I never initialize variables at the point of definition, and here is why: if you don't initialize a variable and you use it [...] the compiler will give a big warning that you are using an uninitialized variable, and it will assert at run time.


This is a tremendously bad idea.

The compiler can't catch every case where you screw up initialization. Especially in the presence of proper, idiomatic C++ (which is heavy on RAII) relying on your compiler to do this for you is a major mistake.

Remember: resource acquisition is initialization. There should be nothing in your code - even primitive types - which is not constructed correctly to at least some sane value, period.




1. Some compilers require that no variables are null.


Too true! With some compilers, if you leave a variable uninitialized, it simply keeps whatever randomness is/was in that memory space, and that is the value you get if you attempt an operation with that variable unassigned.

So, for example, if I have three variables, call them FirstNumber, SecondNumber, and ThirdNumber, and I don't initialize them, then I get some user input and assign that to FirstNumber, and do ThirdNumber = FirstNumber*SecondNumber ...... ThirdNumber is gonna be something totally random, because SecondNumber was never initialized and is now whatever random garbage was in that memory space. Added bonus: I'll have zero errors and zero warnings. In a big block of code it could be a pain in the keister trying to figure out why (maybe not so much why, as where) I keep getting random answers when nothing is wrong. Granted, having something like that happen means you goofed big time, but maybe that was the moment you started thinking about a ham sandwich.

This topic is closed to new replies.
