Ok, hello again GameDev - it's been a few years since I was last here.
Anyway, I am in the middle of writing a VBE driver for my kernel. I currently have a double buffer, which does stop the flashing, but even at low resolutions I cannot copy the entire buffer into active video memory within one refresh interval (at 640x480x8 the copy takes about 2.5 refresh cycles, which causes tearing).
In real mode I have set the display resolution, enabled the LFB, and grabbed the protected-mode interface table (VBE Function 0Ah) - and passed all of that information to the kernel.
The issue I have been facing is with SetDisplayStart (Function 07h): no matter what I put in DX/CX, I end up with the same result.
Currently my video memory contains color 0x83 repeated 640*480 times starting at offset 0, followed immediately by color 0x28 repeated 640*480 times - i.e. two full pages stacked one above the other.
The current VBE mode info contains:
BytesPerLine = 0x280 (640)
LinBytesPerScanLine = 0x280
I am calling SetDisplayStart the following way (Intel syntax):
FlipPage:
    pusha
    mov     eax, DWORD [SecondBuffer]
    test    eax, eax
    jz      .NotSecond
    ; currently showing the second page -> flip back to the first
    mov     dx, 0                     ; first displayed scan line
    mov     cx, 0                     ; first displayed pixel in scan line
    mov     DWORD [SecondBuffer], 0
    jmp     .flipme
.NotSecond:
    ; currently showing the first page -> flip to the second
    mov     dx, WORD [Height]         ; start display at scan line 480
    mov     cx, 0
    mov     DWORD [SecondBuffer], 1
.flipme:
    ; now let's page flip
    xor     bx, bx                    ; BX = 0 -> set display start
    mov     ax, 0x4F07
    xchg    bx, bx                    ; Bochs magic breakpoint (no-op)
    call    DWORD [VBE_FUNCTION7]     ; PM entry from the Function 0Ah table
    test    ah, ah                    ; AH != 0 -> call failed
    jnz     ERROR
    cmp     al, 0x4F                  ; AL != 0x4F -> function unsupported
    jne     ERROR
    popa
    ret
No matter how DX/CX are filled, the result is always the screen below (or a "Y out of range" panic from Bochs):
[attachment=28809:2015-08-20-014835_642x555_scrot.png]
And yes, even with DX and CX both zero this is the result.
So I'm hoping against hope that someone here has done this before (the internet seems to have no reference for this other than the VBE specification itself...).
I know we are going back to the early-to-mid 90s here, but I cannot write a driver per card (most have no public documentation), so I'm stuck with VBE support and software rendering.