Re: math.abs(math.mininteger) (Was: Must "attempt to get length of a number value" be an error?)
- Subject: Re: math.abs(math.mininteger) (Was: Must "attempt to get length of a number value" be an error?)
- From: Lorenzo Donati <lorenzodonatibz@...>
- Date: 16 May 2019 23:19:58 +0200
On 14/05/2019 21:41, Will Crowder wrote:
> On Mon, 13 May 2019 at 09:49 +0200, Lorenzo Donati wrote:
> > [snip]
> > I'm not a big fan of C (or of C++, which has almost the same
> > definition of UB as C), but I have had to reconsider my views over
> > the last 5 years, since I had to cope with microcontroller (MCU)
> > software development.
> > I agree that, in an ideal world, no sane high-level application
> > should be developed in C (or C++), because they are intrinsically
> > unsafe thanks to UB. C++ mitigates that a little by providing
> > high-level constructs that a programmer can use to avoid the most
> > common cases where a mistake causes UB (e.g. dynamic memory
> > management).
> C in the right hands (e.g., folks who understand language standards
> and how to avoid or allow for undefined behavior) is incredibly
> useful for high level applications, especially where performance is
> an issue. It's a matter of understanding the tools you're using.
Sorry, but I don't exactly get the point you are trying to make.
"C in the right hands...is incredibly useful for high level
applications, especially where performance is an issue."
OK, so what? How does this contradict the fact that the literature
(technical, academic, etc.) is full of cases where C's lack of "safety
checks" ends up biting the programmer and the user with
vulnerabilities?
Developers aren't perfect. Training a programmer to be as productive in
C, with the same rate of bug introduction as in another, more
constrained language, is extremely costly.
"In the right hands" even assembly can be used to do high level
applications. This doesn't mean that assembly is a good choice for that.
I stand by my opinion that C is not meant to be a safe language, and
that raw speed is the need of only a handful of application areas
(where C can be used effectively).
If C were the magic tool that "in the right hands" would solve all
programming problems, there wouldn't have been decades of development
of new, safer languages.
C, IMO, is not /the/ right tool for /any/ job. It is the right tool for
/some/ very specific jobs.
In other cases C is simply the only available alternative to assembly,
so it turns out to be immensely better, but only for lack of anything
better.
And as an example of the fact that C is insane in some of its aspects,
consider namespacing. What kind of sane modern language would require
programmers to remember that:
1. any identifier beginning with "str" followed by a lowercase letter
is reserved,
2. any identifier beginning with "mem" followed by a lowercase letter
is reserved,
3. any identifier defined in any included header is reserved,
4. etc.
If you redefine such an identifier, you trigger UB!
So I, as a C programmer, must know EVERY SINGLE identifier defined in
<stdio.h>... but no! If <stdio.h> includes other headers, I must also
know every identifier defined in them. So I must either know the
inclusion tree of every library header or "simply" remember every
identifier defined in the whole library!
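To make this concrete, here is a minimal sketch (my own illustration,
not from the original message; the function name strcount is
hypothetical): the program compiles cleanly, yet its behavior is
undefined simply because of the name it picked.

    /* Hypothetical illustration: identifiers beginning with "str"
       followed by a lowercase letter are reserved for future use by
       the standard library (C11 7.31.13), so defining one with
       external linkage is undefined behavior. */
    #include <stdio.h>
    #include <string.h>

    size_t strcount(const char *s, char c)   /* reserved name: UB */
    {
        size_t n = 0;
        while (*s)
            n += (*s++ == c);
        return n;
    }

    int main(void)
    {
        printf("%u\n", (unsigned)strcount("banana", 'a'));
        return 0;
    }

And no compiler is required to diagnose this, which is exactly the
problem: the mistake is invisible until it isn't.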
That's an enormous burden on a developer's memory, which is plainly
silly, and it is justifiable only because modern C is backward
compatible with its C89 standard, which is in turn somewhat compatible
with that old thing K&R devised in the early 70s! It's a relic of the
past.
Its limitations were justifiable in the 80s, when SW engineering was in
its infancy and computers with MBs of RAM were a dream. Nowadays they
are plainly silly. It's only the economics of the huge C codebase that
prevents people from designing and/or using other, better /system/
languages.
The very fact that C++ has found fertile ground is proof that C is not
the silver bullet.
Although C++ retains all the risks of C (by design), its higher-level
features allow the programmer to build powerful abstractions that
shield them from most causes of UB (e.g. RAII techniques, OO support,
templates, an extremely rich library with lots of data structures that
spare the user direct memory management, and decent namespacing!).
Again, as I said, I have had to reconsider my extremely critical view
of C in recent years. It has its advantages even today: it is as
universal as a computer language can get, and it is simple enough to be
learned easily. If you are careful and you don't try to be too smart,
you can avoid many pitfalls. But you still pay for things you don't
need or use: even if you don't need ultra-optimized code, maximum
speed, or access to the bare metal, you must be extra wary of even the
innocuous-looking expression (1 << 16), which is UB on implementations
where ints are 16 bits wide!
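A minimal sketch of what I mean (my own example, not from the original
message): on an implementation with 16-bit int, the shift count equals
the width of the promoted operand, which C11 6.5.7 makes undefined,
while an unsigned fixed-width type keeps the result well defined
everywhere.

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* int bad = 1 << 16;  -- UB where int is 16 bits wide: the
           shift count equals the width of the promoted type */
        uint32_t ok = (uint32_t)1 << 16;   /* well defined: 65536 */
        printf("%lu\n", (unsigned long)ok);
        return 0;
    }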
> And in fact, many of the "higher" level languages have
> implementations written in C. Does that mean it would be "insane" to
> code a "high level application" in any language whose underlying
> implementation was written in C? I'd think not.
How does the fact that a higher-level language implementation may be
written in C render the higher-level language as intrinsically unsafe
as the underlying C?
If the language is implemented correctly, and it is meant to be safer
than C, the compiler/interpreter programmer will have coded checks and
systems that shield the end user (the would-be programmer of the
higher-level language) from the underlying C problems.
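As a minimal sketch of what I mean (my own illustration, not actual Lua
source; all the names are hypothetical): the implementation validates
everything before performing the unsafe C operation, so the
script-level programmer can never reach C's UB.

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        double *items;
        size_t  len;
    } Array;

    /* The interpreter checks the index itself, turning what would be
       out-of-bounds UB in C into a well-defined runtime error in the
       hosted language. */
    double array_get(const Array *a, long i)
    {
        if (i < 1 || (size_t)i > a->len) {   /* 1-based, Lua style */
            fprintf(stderr, "runtime error: index %ld out of bounds\n", i);
            exit(EXIT_FAILURE);              /* fail fast, fail loudly */
        }
        return a->items[i - 1];              /* provably in range here */
    }

    int main(void)
    {
        double data[] = { 1.5, 2.5, 3.5 };
        Array a = { data, 3 };
        printf("%g\n", array_get(&a, 2));    /* prints 2.5 */
        printf("%g\n", array_get(&a, 4));    /* errors out, loudly */
        return 0;
    }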
I never said that you can't write a safe application in C (though it is
very difficult, with all the loopholes). I said that C is unsafe as
specified: you have to provide all the necessary checks yourself to
make the application safe, and that is a huge amount of work for most
applications. A simple signed integer overflow is UB in C. And that's
not because it can't be made well-defined, but because doing so would
thwart optimization opportunities on some platforms (even if none of
them is YOUR platform!), just for the sake of portability.
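To illustrate (my own sketch; checked_add is a hypothetical helper, and
__builtin_add_overflow is a GCC/Clang extension, not standard C):
overflow /can/ be made well defined, but only by writing the check
yourself, for every arithmetic expression that might need it.

    #include <limits.h>
    #include <stdio.h>

    /* Returns 1 and stores a + b in *out on success; returns 0 if the
       addition would overflow (instead of silently invoking UB). */
    int checked_add(int a, int b, int *out)
    {
    #if defined(__GNUC__) || defined(__clang__)
        return !__builtin_add_overflow(a, b, out);
    #else
        if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
            return 0;
        *out = a + b;
        return 1;
    #endif
    }

    int main(void)
    {
        int r;
        if (!checked_add(INT_MAX, 1, &r))
            puts("overflow detected");       /* well defined, not UB */
        return 0;
    }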
Should an application programmer worry that any innocent-looking
integer expression he writes could trigger UB in some obscure corner
case? That's insane, I repeat, for most applications.
"Fail fast and fail loudly". That's one of my favorite SW engineering
mottos. UB is utterly unpredictable. Change compiler version down the
road and some code that has worked for ages now is broken just for a
silly bug that could have been spotted from the beginning, had the
language be, for example, Lua.
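A classic sketch of that failure mode (my own example): this "overflow
check" relies on UB, so an optimizing compiler is entitled to fold it
to a constant, and code that worked for years can quietly break after a
compiler upgrade.

    #include <limits.h>
    #include <stdio.h>

    /* Intended as an overflow check, but x + 1 is UB when
       x == INT_MAX, so an optimizer may assume it never overflows
       and fold the whole test to 0, silently removing the "check". */
    int will_overflow(int x)
    {
        return x + 1 < x;
    }

    int main(void)
    {
        /* May print 1 at -O0 and 0 at -O2: the behavior is simply
           undefined, not merely unspecified. */
        printf("%d\n", will_overflow(INT_MAX));
        return 0;
    }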
> Will
Cheers!
-- Lorenzo