The following code (a reduced varint encoder) works as expected on Linux x86_64 when built with no flags or with just one of -O and -release, but fails when both -O -release are specified:

---
import core.stdc.stdio;

ubyte foo(uint n)
{
    ubyte[5] buf = void;
    ubyte wsize;
    while (true)
    {
        if ((n & ~0x7F) == 0)
        {
            buf[wsize++] = cast(ubyte)n;
            break;
        }
        else
        {
            buf[wsize++] = cast(ubyte)((n & 0x7F) | 0x80);
            n >>= 7;
        }
    }
    printf("%hhu\n", wsize);
    return buf[0];
}

void main()
{
    printf("%hhx\n", foo(3));
}
---

More specifically, the optimized build prints »1 e0« instead of the expected »1 3« (printf()s are used instead of assertions to keep the generated assembly short), and the program crashes most of the time with varying errors (segfaults, illegal instruction, glibc free() asserts, ...) – stack corruption?
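For reference, here is what the encoding in the test case is supposed to produce. This is only an illustrative sketch of the same LEB128-style layout (low 7 bits per byte, high bit set on continuation bytes); the names encodeVarint and demo are not part of the original reproducer, and the expected values are derived by hand from the algorithm, not from any particular build:

---
import core.stdc.stdio;

// Same varint layout as foo() above, written against a caller-supplied buffer.
ubyte encodeVarint(uint n, ref ubyte[5] buf)
{
    ubyte i;
    while ((n & ~0x7F) != 0)
    {
        buf[i++] = cast(ubyte)((n & 0x7F) | 0x80); // low 7 bits, continuation bit set
        n >>= 7;
    }
    buf[i++] = cast(ubyte)n; // final group, continuation bit clear
    return i;
}

void demo()
{
    ubyte[5] buf;
    ubyte len = encodeVarint(3, buf);
    printf("%hhu %hhx\n", len, buf[0]);              // 1 3 (what the reproducer should print)
    len = encodeVarint(300, buf);
    printf("%hhu %hhx %hhx\n", len, buf[0], buf[1]); // 2 ac 2 (two-byte case for comparison)
}
---

So a correct build of the reproducer should print »1« from foo() and »3« from main(); the »e0« byte and the crashes only appear with -O -release.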
https://github.com/D-Programming-Language/dmd/commit/683624c4d7ae2f58fe2d8164ba7be77e08a4424f https://github.com/D-Programming-Language/dmd/commit/3d21709860f771a7376eef7e7d5f924941421d02