Issue 193: Problem with XFX GeForce 210

Reported by mohamed khairy, Oct 24, 2011

I got this card to use as a PhysX card under Windows.

My PC has an HD 5770 in the first PCIe x16 slot (running at x8)
and the new card in the second PCIe x16 slot (also at x8).

It works under Windows, but under Mac OS X there is a problem patching the VBIOS.

First, Chameleon reads the card's memory as 1 MB, so I added this line to nvidia.c:

		case 0x0A65: vram_size = 1024*1024*1024; break; // 210           

But there is still a problem. Here is the Chameleon log:

Using PCI-Root-UID value: 0
Framebuffer @0xB0000000  MMIO @0xF7FC0000	I/O Port @0x0000B000 ROM Addr @0xF7FA0000
ATI card POSTed, reading VBIOS from legacy space
Framebuffer set to device's default: Vervet
(AtiPorts) Nr of ports set to: 3
ATI Juniper AMD Radeon HD 5700 Series 1024MB (Vervet) [1002:68b8] (subsys [174b:1482]):: PciRoot(0x0)/Pci(0x2,0x0)/Pci(0x0,0x0)
nVidia GeForce 210 1024MB NV1ff [10de:0a65] :: 
Bad display config block signature (0x2608578b)
ERROR: nVidia ROM Patching Failed!

I also tried the DSDT method to inject the card, but it is not working. :(

Need help, please.

Comment 1 by mohamed khairy, Oct 24, 2011

After reading some other issues, I see that Chameleon has a problem injecting a secondary graphics card.
Comment 2 by Cosmosis Jones, Nov 3, 2011

You cannot use GraphicsEnabler=Yes with dual video cards, especially when they are two different chipsets. You have to use a DSDT patch or EFI device-property strings.
Status: Invalid

Comment 3 by mohamed khairy, Nov 9, 2011

I solved it nearly two weeks ago.

I edited Chameleon's nvidia.c and renamed the device type from Parent to Child, and now both cards work with Chameleon's GraphicsEnabler.

I think it would be good if there were a boot flag that lets you choose between Parent and Child, and if Chameleon finds two cards, it could make the second one the Child. That would be amazing.

Thanks very much.

Comment 4 by Cosmosis Jones, Nov 9, 2011

your patch?

Comment 5 by Cosmosis Jones, Nov 9, 2011

Status: Accepted

Comment 6 by Cosmosis Jones, Nov 9, 2011

Status: AwaitingInformation

Comment 7 by mohamed khairy, Nov 10, 2011

I just replaced this line:

const char *nvidia_device_type[]    =    { "device_type",    "NVDA,Parent" };

with this:

const char *nvidia_device_type[]    =    { "device_type",    "NVDA,Child" };

If you are asking for a patch that enables choosing between Parent and Child, I am not a programmer. :)


Comment 8 by mohamed khairy, Nov 10, 2011

I think the code would work like this:

if there is one graphics card (nVidia)
    inject Parent

if there are two cards
    if the first card is nVidia
        inject Parent
    if the second card is nVidia
        inject Child

Transform that into Cocoa programming. :D
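The pseudocode above could be sketched in C roughly as follows. This is an illustrative sketch only, not Chameleon's actual code: the function name and the card counter are assumptions, though the two string constants are the ones nvidia.c actually injects.

```c
/* Hypothetical sketch of the Parent/Child selection described above.
 * In Chameleon's real nvidia.c the device_type string is a fixed
 * constant; here we assume a counter of nVidia cards seen so far. */
static const char *nvda_device_type(int nvidia_cards_seen)
{
    /* The first nVidia card injected gets "NVDA,Parent";
     * any subsequent card gets "NVDA,Child". */
    return (nvidia_cards_seen == 0) ? "NVDA,Parent" : "NVDA,Child";
}
```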

Comment 9 by Jeremy Agostino, Nov 13, 2011

Would an approach to solving this multiple-video-card problem be to make the setup_*_devprop functions aware of how many video cards have already been detected? In mohamed's particular case, since the nVidia card is the second card in his system, he needed a dev-prop string with "NVDA,Child" instead of "NVDA,Parent". Presumably for an ATI card the format string at line 1410 of ati.c

1410> sprintf(name_parent, "ATY,%sParent",

would have to be changed to "ATY,%sChild"?

I'm hesitant to make a patch because 1) I don't have a dual-card 
system to test with, 2) the requisite refactoring would involve 
changing the prototypes of setup_*_devprop functions, and 3) I have 
no conception of what the side-effects of doing this would be.
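A minimal sketch of the refactor described above, assuming the setup functions gain a card-index parameter. The helper name and prototype here are hypothetical and not Chameleon's actual API; only the "ATY,...Parent"/"ATY,...Child" string shapes come from the ati.c line quoted above.

```c
#include <stdio.h>

/* Hypothetical helper: build the ATI name string based on how many
 * video cards were detected before this one. The real ati.c
 * hard-codes the "Parent" suffix in the sprintf at line 1410. */
static void make_aty_name(char *out, size_t outlen,
                          const char *fb_name, int card_index)
{
    snprintf(out, outlen, "ATY,%s%s", fb_name,
             card_index == 0 ? "Parent" : "Child");
}
```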

Comment 10 by mohamed khairy, Nov 13, 2011

@Jeremy Agostino
Do you know of any device_type other than GeForce, Parent, and Child?

Comment 11 by Jeremy Agostino, Nov 13, 2011

Sorry, what are you asking me?

Comment 12 by mohamed khairy, Nov 14, 2011

For nVidia card injection you choose between these device types.

Is there another option to choose? I am asking because of nVidia Optimus card injection.

Comment 13 by Jeremy Agostino, Nov 14, 2011

Not that I know of, but I'm not really the one who would know. My *guess* is that those strings are being matched in the GeForce kernel extension. The keys in nvidia.c (device_type, name, compatible, etc.) are the keys you can match against with IOPCIDevice when you're writing a Mac driver. From what I understand, Chameleon is just creating an OF/EFI hardware index that the Mac OS X driver stack can understand ("injecting" devices, so to speak).

I wish I had some hardware with this Optimus configuration so I 
could experiment and see exactly how the device(s) are being parsed 
and reported. I can only speculate that the inability to switch 
between GPUs on

Comment 14 by Jeremy Agostino, Nov 14, 2011

...between GPUs on a Hackintosh stems from the Apple graphics 
drivers being written to handle those Macs that have dual GPUs very 
specifically. As far as I know it's almost always possible to force 
usage of either one of the GPUs in Mac OS X, either through a DSDT 
patch or BIOS settings.

Comment 15 by mohamed khairy, Nov 14, 2011

I know that.

I tried all the device types above; none of them work.

My thoughts about Optimus on the Mac: making the nVidia card work as a secondary card means the two cards work at the same time; then we can think about switching.

I found that on my Dell Inspiron N5110 the Intel card is directly connected to the LVDS and VGA outputs, so it does the framebuffering for both, and the nVidia card is connected to HDMI, so it does the framebuffering for that.

The problem is that I don't have a display with HDMI input to test with.

Anyway, so far I have not reached the right injection in the DSDT.

Note: Chameleon can't read the VBIOS, since it is part of the system BIOS, so I must extract the VBIOS and tell Chameleon to use it, or Chameleon stops when using GraphicsEnabler.

And this is my DSDT injection, according to my thoughts:

                                Buffer (0x0B)
                                Buffer (0x04)
                                    0x00, 0x08, 0x00, 0x00
                                Buffer (0x08)
                                Buffer (0x0F)
                                Buffer (0x14)
                                    /* 0000 */    0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0D, 0x00,
                                    /* 0008 */    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0A,
                                    /* 0010 */    0x00, 0x00, 0x00,
                                Buffer (0x04)
                                    0x00, 0x00, 0x00, 0x40
                                Buffer (0x0D)
                                Buffer (0x0A)
                                Buffer (0x17)
                                    "nVidia GeForce GT
                                Buffer (0x0F)

Comment 16 by Jeremy Agostino, Nov 14, 2011

That looks in line with what I'm using on my ThinkPad T410. Some 
models of the T410 have the switchable graphics. While mine just has 
the discrete, I've found a big ol' debug string in my DSDT decompile 
pointing out the Optimus DSM method (NVOP).

Comment 17 by Cosmosis Jones, May 8, 2012

Owner: mingy

Comment 18 by Cosmosis Jones, Oct 29, 2012

No movement.
Status: Invalid

Created: 8 years 5 months ago by mohamed khairy

Updated: 7 years 5 months ago

Status: Invalid

Owner: Moik The

Followed by: 2 persons