I see a lot of people, and also libraries, defining constants (like pin numbers, lengths of items, etc.) with #define:
#define LENGTH 5
while the recommendation is to use static const int in such cases:
static const int LENGTH = 5;
I did a test, and memory-wise it makes no difference.
Is there some specific Arduino reason why #define is (mis)used so often?
- Old habits from plain C? – Edgar Bonet, Aug 27, 2019 at 15:23
- @EdgarBonet I was already a bit 'afraid' of a remark like that. – Michel Keijzers, Aug 27, 2019 at 15:26
- People copying examples from people that have bad habits, or who learned from people with bad habits. – Majenko, Aug 27, 2019 at 15:53
- I would use a byte for values that are <= 255, to save on memory. – CrossRoads, Aug 27, 2019 at 16:06
- @CrossRoads You are right... although I mostly use ints when I just want to test something. For a program where I expect memory to become an issue (even in the future, which happens very quickly on an Arduino), I use smaller types. In this case it doesn't matter; it doesn't use any SRAM. – Michel Keijzers, Aug 27, 2019 at 16:18
1 Answer
For numbers, certainly, const <type> is preferred. This is chiefly because it imposes a type (a #define is just untyped text), which can have a knock-on effect on arithmetic.
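A quick sketch of that knock-on effect (the names here are illustrative, not from the question):
#define RATIO 1 / 3                // untyped text: expands to integer division
const float ratio = 1.0f / 3.0f;   // typed: forces floating-point arithmetic

float a = 2 * RATIO;   // expands to 2 * 1 / 3, which is 0
float b = 2 * ratio;   // 0.6666..., as intended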
That's not to say that you should always use const <type> instead of #define, though. #define has its place.
One of the benefits of #define over const <type> is that it is a direct literal text replacement. The replacement is done before compilation, so anything that is valid in your code can be put in a #define, whereas only things which evaluate to the correct type at runtime can be placed in a const <type>.
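For instance, a macro can expand to an entire statement, which no const <type> could hold (HALT is a made-up macro for this sketch):
#define HALT for (;;) {}   // a whole statement, not a value

void checkVoltage(float v) {
  if (v > 5.5) HALT        // expands to: if (v > 5.5) for (;;) {}
}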
Take strings for example. Yes, you can use:
const char *foo = "This is text";
And you can use that wherever a const char * is expected. However, if you use:
#define FOO "This is text"
you can use it anywhere a string literal is expected. That makes for useful things such as C's way of concatenating string literals at compile time:
Serial.println("I say: " FOO);
The two string literals "I say: " and FOO (which expands to "This is text") are concatenated into a single string literal at compile time, giving:
Serial.println("I say: This is text");
You can't do that with a const char *.
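To get the same output from the pointer version, you have to pay for the concatenation at runtime, roughly like this (the buffer size here is chosen by hand for the sketch):
char buffer[32];             // must be sized by hand
strcpy(buffer, "I say: ");   // copy the prefix at runtime
strcat(buffer, foo);         // append the string foo points to
Serial.println(buffer);
Or you simply make two calls: Serial.print("I say: "); Serial.println(foo);.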
Another big difference between const <type> and #define is when expressions are evaluated. Take for example:
const float sinpi = sinf(3.141592653);
That will calculate the sine of PI once at startup. However:
#define SINPI sinf(3.141592653)
will calculate the sine of PI every time it is used.
So in this instance const float not only gives you an imposed type of float for all calculations, it also reduces calculation overhead by running the calculation only once and saving the result.
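To make the difference concrete (assuming the two definitions above):
float a = SINPI + SINPI;   // the macro expands twice: sinf() runs twice at runtime
float b = sinpi + sinpi;   // two reads of a float that was computed once at startup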
So as you can see there are pros and cons to each. But in general, for numbers:
- const <type> is preferred for storing internal numerical data that doesn't change, and
- #define is most often used for user-configurable data, since it doesn't impose the ; at the end of the line, which is often forgotten by (newer) users; a short sketch follows.
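For example, a typical user-configurable block at the top of a sketch (the pin number is illustrative):
#define LED_PIN 13           // no terminating ; to forget
const uint8_t ledPin = 13;   // typed equivalent, but the ; is mandatory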
- Thanks for the insight... the last remark is 'interesting' (forgotten by newer users). – Michel Keijzers, Aug 27, 2019 at 16:21
- Even though it is not directly related to the subject of the question, perhaps you could mention the use of #define for conditional compiling. – jsotola, Aug 28, 2019 at 0:47