[openssl-dev] [openssl.org #4479] OS X 10.8 (x86_64): Compile errors when using "no-asm -ansi"
jeremy.farrell at oracle.com
Fri Mar 25 17:42:11 UTC 2016
On 25/03/2016 17:01, Jeffrey Walton wrote:
> On Fri, Mar 25, 2016 at 12:49 PM, Richard Levitte via RT <rt at openssl.org> wrote:
>> On Fri, 25 Mar 2016 at 16.31.14, noloader at gmail.com wrote:
>>> To configure:
>>> ./config shared no-asm -ansi -D_DEFAULT_SOURCE=__STRICT_ANSI__
>>> I'm not sure if Configure should set _DEFAULT_SOURCE=__STRICT_ANSI__
>> Why do you give it the value __STRICT_ANSI__? All documentation I find suggests
>> it's enough to simply define it. See man page feature_test_macros(7) on Linux
>> (at least)
>> The alternative is, of course, to define _DEFAULT_SOURCE in the files where
>> -ansi becomes a problem.
> That was based on examining /usr/include/features.h and the comment
> for _DEFAULT_SOURCE:
>
>     _DEFAULT_SOURCE    The default set of features (taking precedence
>                        over __STRICT_ANSI__).
> How do you convey features by just defining it? It seems like it needs
> an argument, like _DEFAULT_SOURCE=__STRICT_ANSI__ or
> But it's definitely not my area of expertise. I've never had to define it before.
It's the fact of its being defined which indicates features - it's
tested in the GNU headers to decide what functionality to make visible.
The norm is just to define it, or to define it to 1; setting it to
__STRICT_ANSI__ would be a very confusing thing to do since the whole
point of defining it is to say that you don't want __STRICT_ANSI__.
Why do you want to be able to build on an OS released in 2012 with a
C89-only compiler? I'm probably missing something, but I'm struggling to
understand the point of this.
J. J. Farrell
Not speaking for Oracle.