[En-Nut-Discussion] Yagarto from 23.12.2009 and newlib again...

Harald Kipp harald.kipp at egnite.de
Thu Feb 4 12:47:17 CET 2010

Hi Ulrich,

Ulrich Prinz wrote:

> Just a suggestion...
> If we insert a definition like this
> #define NLIB_CHAR char
> #endif
> isalnum((NLIB_CHAR)*ptr1)

Sorry for not having been able to follow this in detail yet. I'm still
struggling with OpenOCD problems.

From a short look I had the feeling that the problem is mainly caused by
the ambiguity of the char type. It is handled differently by compilers,
and some even provide options to switch it from signed to unsigned.

In early releases we generally used u_char for strings. Using negative
values for letters above ASCII 127 looked weird to us. Over the years
GCC became more picky on sign mismatch. In order to keep Nut/OS free of
compiler warnings, we were forced to replace u_char strings by signed
char strings.

It looks to me as if the C community has now decided to move away from
"char" being signed by default.

Although GCC offers this option, I would not recommend treating char as
an unsigned type by default. This may break existing code.

But if I'm right and the problems are caused by the ambiguity of the
type, then it may help to replace "char" with "signed char". Or, even
better, does the latest GCC allow specifying char as signed by default?
In that case we would simply need to modify a few Makefiles.
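[Editor's note: GCC does offer such a switch, -fsigned-char (and its
counterpart -funsigned-char). A hedged Makefile sketch; the CFLAGS
variable name follows common Makefile convention and is not taken from
a specific Nut/OS build file:

```make
# Force plain "char" to be signed on all targets, overriding the
# target ABI default (ARM GCC defaults to unsigned char).
CFLAGS += -fsigned-char
```
]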

In my view

strcpy(signed char *dst, const signed char *src);

is no longer C. If it continues this way, we will end up with something
like

strcpy(char_where_sign_is_specified_by_a_compile_option_t *dst,
       const char_where_sign_is_specified_by_a_compile_option_t *src);

Compiler warnings at this line:

Warning #1:
strcpy is deprecated, use strncpy instead

Warning #2:
The name of the variable dst is too short, consider using a more
descriptive name

Warning #3:
The name of the variable src is too short, consider using a more
descriptive name

Warning #4:
Too many warnings in one line, consider using the automatic code creator
instead of trying to write your own code

