Comment 1 by mohamed khairy, Oct 24, 2011
After reading some other issues, I see that Chameleon has a problem with injecting a secondary gfx card :(
Comment 2 by Cosmosis Jones, Nov 3, 2011
You cannot use GraphicsEnabler=Yes with dual video cards, ESPECIALLY when they are two different chipsets. You have to use DSDT or EFI dev-prop strings.
Status: Invalid
Comment 3 by mohamed khairy, Nov 9, 2011
I solved it nearly two weeks ago: I edited Chameleon's nvidia.c and renamed Parent to Child, and now the two cards work with Chameleon's GraphicsEnabler. I think it would be good if there were a boot flag that lets you choose between Parent and Child, and if Chameleon, on finding two cards, made the second one the Child, that would be amazing. Thanks very much.
Comment 7 by mohamed khairy, Nov 10, 2011
I just replaced this at line 81:

```c
const char *nvidia_device_type[] = { "device_type", "NVDA,Parent" };
```

with this:

```c
const char *nvidia_device_type[] = { "device_type", "NVDA,Child" };
```

If you are asking me for a patch that enables choosing between Parent and Child, I am not a programmer :) Thanks.
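A minimal sketch of what such a patch might look like, assuming a hypothetical NVidiaChild boot key read with Chameleon's getBoolForKey helper (the key name is invented here, and the name of the config handle varies between Chameleon revisions):

```c
/* Sketch only, intended to sit in nvidia.c where getBoolForKey, bool,
 * and bootInfo are already in scope. "NVidiaChild" is a hypothetical
 * org.chameleon.Boot.plist key, not an existing one. */
static const char *nvidia_device_type_value(void)
{
	bool use_child = false;
	getBoolForKey("NVidiaChild", &use_child, &bootInfo->chameleonConfig);
	return use_child ? "NVDA,Child" : "NVDA,Parent";
}
```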
Comment 8 by mohamed khairy, Nov 10, 2011
I think the code would work like this: if there is one gfx card (nvidia), inject Parent; if there are two cards and the first card is nvidia, inject Parent, and if the second card is nvidia, inject Child. Now someone transform that into Cocoa programming :D hhhhhhhhhh
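That logic might look something like the following in C, assuming the devprop setup code kept a running count of NVIDIA cards (the counter is hypothetical; nothing like it exists in nvidia.c as shipped):

```c
/* Hypothetical helper: the first NVIDIA card found is injected as the
 * Parent, every later one as a Child. */
static int nvidia_cards_seen = 0;

static const char *pick_nvidia_device_type(void)
{
	return (nvidia_cards_seen++ == 0) ? "NVDA,Parent" : "NVDA,Child";
}
```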
Comment 9 by Jeremy Agostino, Nov 13, 2011
Would an approach to solving this multiple-video-card problem be to make the setup_*_devprop functions aware of how many video cards have already been detected? In mohamed's particular case, the nVidia card being the second card in his system, he needed a dev-prop string with "NVDA,Child" instead of "NVDA,Parent". Presumably for an ATI card the format string at line 1410 of ati.c,

```c
sprintf(name_parent, "ATY,%sParent", card->cfg_name);
```

would have to be changed to "ATY,%sChild"? I'm hesitant to make a patch because 1) I don't have a dual-card system to test with, 2) the requisite refactoring would involve changing the prototypes of the setup_*_devprop functions, and 3) I have no conception of what the side effects of doing this would be.
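A self-contained sketch of the naming scheme such a refactor would produce, with a hypothetical card_index parameter standing in for the changed prototypes; cfg_name and "Hoolock" are stand-ins for card->cfg_name and a sample framebuffer name from ati.c:

```c
#include <stdio.h>

/* Hypothetical: the first card of a vendor becomes the Parent,
 * later ones become Children. */
static void ati_name_for_index(char *out, size_t outlen,
                               const char *cfg_name, int card_index)
{
	snprintf(out, outlen, "ATY,%s%s", cfg_name,
	         (card_index == 0) ? "Parent" : "Child");
}

int main(void)
{
	char name[64];
	ati_name_for_index(name, sizeof(name), "Hoolock", 0);
	printf("%s\n", name);  /* ATY,HoolockParent */
	ati_name_for_index(name, sizeof(name), "Hoolock", 1);
	printf("%s\n", name);  /* ATY,HoolockChild */
	return 0;
}
```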
Comment 10 by mohamed khairy, Nov 13, 2011
@Jeremy Agostino: do you know of a device_type other than Geforce, Parent, and Child?
Comment 12 by mohamed khairy, Nov 14, 2011
For nvidia card injection you choose between "NVDA,Geforce", "NVDA,Parent", and "NVDA,Child". Is there another option to choose? I am asking this for nvidia Optimus card injection.
Comment 13 by Jeremy Agostino, Nov 14, 2011
Not that I know of, but I'm not really the one who would know. My *guess* is that those strings are being matched in the Geforce kernel extension. The keys in nvidia.c (device_type, name, compatible, etc) are the keys that you can match against with IOPCIDevice when you're writing a Mac driver. From what I understand, Chameleon is just creating an OF/EFI hardware index that the Mac OS X driver stack can understand ("injecting" devices, so to speak). I wish I had some hardware with this Optimus configuration so I could experiment and see exactly how the device(s) are being parsed and reported. I can only speculate that the inability to switch between GPUs on
Comment 14 by Jeremy Agostino, Nov 14, 2011
...between GPUs on a Hackintosh stems from the Apple graphics drivers being written to handle those Macs that have dual GPUs very specifically. As far as I know it's almost always possible to force usage of either one of the GPUs in Mac OS X, either through a DSDT patch or BIOS settings.
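As an aside on the IOPCIDevice keys mentioned in comment 13: the properties Chameleon injects can be read back from the I/O Registry in user space, which is one way to verify what the bootloader actually did. A minimal sketch using IOKit's C API (illustrative only; production code would check the property's CF type before treating it as data):

```c
/* Build with: cc devtype.c -framework IOKit -framework CoreFoundation */
#include <stdio.h>
#include <IOKit/IOKitLib.h>
#include <CoreFoundation/CoreFoundation.h>

int main(void)
{
	io_iterator_t iter;
	if (IOServiceGetMatchingServices(kIOMasterPortDefault,
	                                 IOServiceMatching("IOPCIDevice"),
	                                 &iter) != KERN_SUCCESS)
		return 1;

	io_service_t dev;
	while ((dev = IOIteratorNext(iter)) != IO_OBJECT_NULL) {
		/* device_type is stored as raw bytes, e.g. "NVDA,Parent\0". */
		CFDataRef type = (CFDataRef)IORegistryEntryCreateCFProperty(
			dev, CFSTR("device_type"), kCFAllocatorDefault, 0);
		if (type) {
			printf("%.*s\n", (int)CFDataGetLength(type),
			       (const char *)CFDataGetBytePtr(type));
			CFRelease(type);
		}
		IOObjectRelease(dev);
	}
	IOObjectRelease(iter);
	return 0;
}
```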
Comment 15 by mohamed khairy, Nov 14, 2011
I know that; I tried all the device types above and none of them worked. My thoughts about Optimus on a Mac: making the nvidia card work as a secondary card means the two cards work at the same time, and only then can you think about switching. I found that on my Dell Inspiron N5110 the Intel card is directly connected to LVDS and VGA, so it does the framebuffering for both of those connectors, and the nvidia card is connected to HDMI, so it does the framebuffering for that. The problem is that I don't have a display with HDMI to test with, but anyway, so far I haven't arrived at the right injection in the DSDT. Note that Chameleon can't read the VBIOS, as it is part of the system BIOS, so I must extract the VBIOS and tell Chameleon to use it, or Chameleon stops when using GraphicsEnabler. This is my DSDT injection, according to my thoughts:

```asl
"@0,compatible",     Buffer (0x0B) { "NVDA,NVMac" },
"@0,connector-type", Buffer (0x04) { 0x00, 0x08, 0x00, 0x00 },
"@0,device_type",    Buffer (0x08) { "display" },
"@0,name",           Buffer (0x0F) { "NVDA,Display-A" },
"NVCAP",             Buffer (0x14)
{
    /* 0000 */ 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0D, 0x00,
    /* 0008 */ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0A,
    /* 0010 */ 0x00, 0x00, 0x00, 0x00
},
"VRAM,totalsize",    Buffer (0x04) { 0x00, 0x00, 0x00, 0x40 },
"device_type",       Buffer (0x0D) { "NVDA,Child" },
"hda-gfx",           Buffer (0x0A) { "onboard-1" },
"model",             Buffer (0x17) { "nVidia GeForce GT 525M" },
"rom-revision",      Buffer (0x0F) { "70.8.56.0.a" }
```
Comment 16 by Jeremy Agostino, Nov 14, 2011
That looks in line with what I'm using on my ThinkPad T410. Some models of the T410 have the switchable graphics. While mine just has the discrete, I've found a big ol' debug string in my DSDT decompile pointing out the Optimus DSM method (NVOP).