I’m using this on Windows and Linux, but it seems that on Mac the function is not implemented.

It seems there has been some code in there for several years, but it’s commented out.
Why is it disabled? Does it work if I uncomment it?

I don’t remember exactly, but I don’t think there were any appropriate functions in Cocoa to get that information. I doubt the commented-out code will work.

I can confirm it doesn’t (getAGLAttribute doesn’t exist, but I’ll look into this further).
I don’t understand any of the Cocoa / Obj-C stuff, but there is a class called NSOpenGLPixelFormat in the Apple documentation. Maybe it’s worth a look?

Maybe; it’s been a long time since I looked at that stuff.

OK, so from what I’ve understood by reading other people’s code, if you use aglChoosePixelFormat, it’s up to you to test each format.
So instead of querying a list of available formats from the system, you have to provide a list of candidates yourself, and then check whether each one works or not.
What do you think? It could probably be handled by checking the classic 32-bit ARGB format and the luminance formats (Y8 and Y16), and that’s all. I don’t think anyone is still using RGB565 or the other odd formats; am I wrong?

OK, I’ve got this code to work:

#ifdef JUCE_MAC
    GLint attribs [64];
    int n = 0;
    attribs[n++] = AGL_RGBA;
    attribs[n++] = AGL_DOUBLEBUFFER;
    attribs[n++] = AGL_ACCELERATED;
    attribs[n++] = AGL_NO_RECOVERY;
    attribs[n++] = AGL_NONE;

    AGLPixelFormat p = aglChoosePixelFormat (0, 0, attribs);
    GLint val = 0;

    while (p != 0)
    {
        OpenGLPixelFormat* const pf = new OpenGLPixelFormat();
        aglDescribePixelFormat (p, AGL_RED_SIZE, &val);
        pf->redBits = val;
        aglDescribePixelFormat (p, AGL_GREEN_SIZE, &val);
        pf->greenBits = val;
        aglDescribePixelFormat (p, AGL_BLUE_SIZE, &val);
        pf->blueBits = val;
        aglDescribePixelFormat (p, AGL_ALPHA_SIZE, &val);
        pf->alphaBits = val;
        aglDescribePixelFormat (p, AGL_DEPTH_SIZE, &val);
        pf->depthBufferBits = val;
        aglDescribePixelFormat (p, AGL_STENCIL_SIZE, &val);
        pf->stencilBufferBits = val;
        aglDescribePixelFormat (p, AGL_ACCUM_RED_SIZE, &val);
        pf->accumulationBufferRedBits = val;
        aglDescribePixelFormat (p, AGL_ACCUM_GREEN_SIZE, &val);
        pf->accumulationBufferGreenBits = val;
        aglDescribePixelFormat (p, AGL_ACCUM_BLUE_SIZE, &val);
        pf->accumulationBufferBlueBits = val;
        aglDescribePixelFormat (p, AGL_ACCUM_ALPHA_SIZE, &val);
        pf->accumulationBufferAlphaBits = val;
        results.add (pf);
        p = aglNextPixelFormat (p);
    }
#endif

This tests for RGBA only. The other tests should follow the same syntax. aglNextPixelFormat doesn’t return the next format in the list, but only the next one that EXACTLY matches the requirements (so in this example, the next one that is RGBA).
On the Mac it doesn’t seem to work without specifying the AGL_RGBA attribute to get all the other formats, but if you can test on your Mac, I’d be glad if you could confirm this.

AGL is deprecated now - there may or may not be a Cocoa equivalent, but using AGL’s not the way to do it.

It’s not marked as deprecated in the Apple documentation (some individual functions are, but those come with a replacement).
The issue I have with Cocoa is that I don’t understand Objective-C code very well. Anyway, can you try to fix this? I think it could work:

    #ifdef JUCE_MAC
        NSOpenGLPixelFormatAttribute attributes[] =
        {
            // Please try with the following five lines both commented-out and not:
            NSOpenGLPFADepthSize, (NSOpenGLPixelFormatAttribute) 16,
            NSOpenGLPFAAlphaSize, (NSOpenGLPixelFormatAttribute) 8,
            NSOpenGLPFARedSize,   (NSOpenGLPixelFormatAttribute) 8,
            NSOpenGLPFAGreenSize, (NSOpenGLPixelFormatAttribute) 8,
            NSOpenGLPFABlueSize,  (NSOpenGLPixelFormatAttribute) 8,
            (NSOpenGLPixelFormatAttribute) 0
        };

        NSOpenGLPixelFormat* p = [[[NSOpenGLPixelFormat alloc] initWithAttributes: attributes] autorelease];
        GLint ret = 0;

        if (p != nil)
        {
            OpenGLPixelFormat* const pf = new OpenGLPixelFormat();
            [p getValues: &ret forAttribute: NSOpenGLPFARedSize forVirtualScreen: 0];
            pf->redBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAGreenSize forVirtualScreen: 0];
            pf->greenBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFABlueSize forVirtualScreen: 0];
            pf->blueBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAAlphaSize forVirtualScreen: 0];
            pf->alphaBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFADepthSize forVirtualScreen: 0];
            pf->depthBufferBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAStencilSize forVirtualScreen: 0];
            pf->stencilBufferBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAAccumRedSize forVirtualScreen: 0];
            pf->accumulationBufferRedBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAAccumGreenSize forVirtualScreen: 0];
            pf->accumulationBufferGreenBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAAccumBlueSize forVirtualScreen: 0];
            pf->accumulationBufferBlueBits = ret;
            [p getValues: &ret forAttribute: NSOpenGLPFAAccumAlphaSize forVirtualScreen: 0];
            pf->accumulationBufferAlphaBits = ret;
            results.add (pf);
        }
    #endif

The forVirtualScreen value should really be derived from the component, but my knowledge is too limited to get that information.
Since there is no “monitor index” in juce, I don’t know how you could resolve this.

Well, I’ve not tested it, but this compiles:

static int getPixelFormatAttribute (NSOpenGLPixelFormat* p, NSOpenGLPixelFormatAttribute att)
{
    GLint val = 0;
    [p getValues: &val forAttribute: att forVirtualScreen: 0];
    return (int) val;
}

void OpenGLPixelFormat::getAvailablePixelFormats (Component* /*component*/,
                                                  OwnedArray <OpenGLPixelFormat>& results)
{
    NSOpenGLPixelFormatAttribute attributes[] =
    {
        NSOpenGLPFADepthSize,  (NSOpenGLPixelFormatAttribute) 16,
        NSOpenGLPFAAlphaSize,  (NSOpenGLPixelFormatAttribute) 8,
        NSOpenGLPFAColorSize,  (NSOpenGLPixelFormatAttribute) 24,
        NSOpenGLPFAAccumSize,  (NSOpenGLPixelFormatAttribute) 32,
        (NSOpenGLPixelFormatAttribute) 0
    };

    NSOpenGLPixelFormat* p = [[[NSOpenGLPixelFormat alloc] initWithAttributes: attributes] autorelease];

    if (p != nil)
    {
        OpenGLPixelFormat* const pf = new OpenGLPixelFormat();

        pf->redBits = pf->greenBits = pf->blueBits = getPixelFormatAttribute (p, NSOpenGLPFAColorSize) / 3;
        pf->alphaBits = getPixelFormatAttribute (p, NSOpenGLPFAAlphaSize);
        pf->depthBufferBits = getPixelFormatAttribute (p, NSOpenGLPFADepthSize);
        pf->stencilBufferBits = getPixelFormatAttribute (p, NSOpenGLPFAStencilSize);
        pf->accumulationBufferRedBits = pf->accumulationBufferGreenBits
            = pf->accumulationBufferBlueBits = pf->accumulationBufferAlphaBits
            = getPixelFormatAttribute (p, NSOpenGLPFAAccumSize) / 4;

        results.add (pf);
    }
}


There were no attributes to get the individual red/green/blue sizes, so I’ve bodged that, and I’m not 100% sure what the NSOpenGLPFAAccumSize value means, since there’s no documentation about it - but this should at least be a start!

…and it looks like the constructor could also use a bit of a clean-up:

WindowedGLContext (Component& component,
                   const OpenGLPixelFormat& pixelFormat_,
                   NSOpenGLContext* sharedContext)
    : renderContext (0),
      pixelFormat (pixelFormat_)
{
    NSOpenGLPixelFormatAttribute attribs[] =
    {
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFAAccelerated,
        NSOpenGLPFAMPSafe,
        NSOpenGLPFAClosestPolicy,
        NSOpenGLPFANoRecovery,
        NSOpenGLPFAColorSize,   (NSOpenGLPixelFormatAttribute) (pixelFormat.redBits
                                                                  + pixelFormat.greenBits
                                                                  + pixelFormat.blueBits),
        NSOpenGLPFAAlphaSize,   (NSOpenGLPixelFormatAttribute) pixelFormat.alphaBits,
        NSOpenGLPFADepthSize,   (NSOpenGLPixelFormatAttribute) pixelFormat.depthBufferBits,
        NSOpenGLPFAStencilSize, (NSOpenGLPixelFormatAttribute) pixelFormat.stencilBufferBits,
        NSOpenGLPFAAccumSize,   (NSOpenGLPixelFormatAttribute) (pixelFormat.accumulationBufferRedBits
                                                                  + pixelFormat.accumulationBufferGreenBits
                                                                  + pixelFormat.accumulationBufferBlueBits
                                                                  + pixelFormat.accumulationBufferAlphaBits),
        NSOpenGLPFASampleBuffers, (NSOpenGLPixelFormatAttribute) 1,
        0
    };

I don’t think the colour size was being set correctly before.

Off topic, but how do you get Xcode to rebuild the amalgamated files automatically when modifying the src/native/ files?
Mine doesn’t, so whenever I change a source file it still refers to the amalgamated version.

right click > touch

Touch what?
It seems none of the amalgamatedX.cpp files are being rebuilt, and none of them contains the native/ code, or am I missing something?

Touch the amalgamated files so it rebuilds them from scratch.

no… the only way to rebuild the amalg files is to run the amalgamator on them. There’s no build step to do it automatically.

You shouldn’t use the amalgamated files directly; instead, use the files in the amalgamation folder, as is done in the juce demo.
In that case, touching those files works.

No, it doesn’t.
In fact, the amalgamated files in this folder should probably have a specific makefile rule that re-runs the amalgamator whenever any of the source files is modified.
This is not really hard to do under Linux, since you only need to run gcc -MM on them to generate the dependency list, but on the other platforms it’s a bit harder.
If I were you, I would either drop the amalgamation altogether, let the preprocessor include the multiple cpp files in an amalgamation cpp file, and compile that, or modify the amalgamator so that it writes out the dependency rules in the different projects.
Something like:

# In the Makefile:
amalgamatedX.cpp: $(DEP)
	./amalgamator $(JUCE_DIR)

# ...or, if the amalgamator can also write out the dependency list itself:
amalgamatedX.cpp: $(DEP)
	./amalgamator --build-dependencies $(JUCE_DIR)
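For the GNU-make case, the `$(DEP)` list could itself be gathered from the source tree. A sketch, assuming the sources live under `$(JUCE_DIR)/src` and `./amalgamator` is the tool’s path (both hypothetical):

```makefile
# Gather every source file the amalgamation is built from.
NATIVE_SRCS := $(shell find $(JUCE_DIR)/src -name '*.cpp' -o -name '*.h' -o -name '*.mm')

# Re-run the amalgamator whenever any of them changes.
amalgamatedX.cpp: $(NATIVE_SRCS)
	./amalgamator $(JUCE_DIR)
```

The `$(shell find ...)` trick avoids maintaining the dependency list by hand, at the cost of a directory scan on every make invocation.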

@Jules, the enumeration works. I haven’t checked the constructor stuff however.
Don’t forget to add AppKit framework to the Xcode project.

What’s AppKit needed for? I’ve never had any link problems without it.

Well, it seems I can’t link without the AppKit framework added to the Xcode project.
Maybe you’re including it automatically in the new tip?