This game doesn't even open a GSFrame window, but it's the first time I've seen Dbgconsole open.
Dbgconsole
Quote:[STARTUP] 2048k consumed by modules, 0k consumed by SPURS, 0k consumed by exception handling.
[STARTUP] Restricting 257024k heap to 69632k+3072k+112640k=185344k.
[STARTUP] Allocating 181M
[STARTUP] Heap allocated from [30300000,3b800000)
[STARTUP] Remaining user memory: 74688kb; USED=181M TSS=1536kb EXTRA=448kb (RESIDUAL)
[POOL] Residual allocator has 456384 bytes available at creation.
[FATAL] Critsec didn't init either control word
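As a sanity check on those [STARTUP] figures, the sizes in the log are internally consistent; a quick sketch (the three region sizes are just what the log prints, their meaning is the emulator's business):

```python
# Heap figures from the [STARTUP] log, in KiB.
total_heap_kb = 257024
regions_kb = [69632, 3072, 112640]   # the three regions the log sums up

restricted_kb = sum(regions_kb)
print(restricted_kb)                  # 185344, matching the log line
print(restricted_kb // 1024)          # 181, matching "Allocating 181M"

# The heap address range [0x30300000, 0x3b800000) spans the same amount.
span_kb = (0x3b800000 - 0x30300000) // 1024
print(span_kb)                        # 185344
```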
Log attached, of course.
CPU : Intel Core i7-3770S
GPU : Nvidia GeForce GTX 980
RAM : 8 GB DDR3
HDD : Western Digital 1TB+500GB
MOBO : Asus P8Z77-V LK
OS : Windows 10 Pro (1607) 64bit
(03-15-2014, 09:10 PM)Blackbird Wrote: With the sys_lwmutex functions implemented, the log is a lot shorter: http://pastebin.com/RzvBkD4F
The game even exits itself. Well, it seems to need SPURS, like many other commercial games right now. Put very briefly, SPURS is about multithreading and such things; quite essential stuff.
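For anyone wondering what "multithreading and such things" means here: SPURS is essentially a scheduler that keeps the SPUs fed with small jobs pulled from shared queues, instead of pinning one big thread per SPU. A loose concept sketch in Python (nothing here is the real cellSpurs API, just the worker/taskset idea, with threads standing in for SPUs):

```python
import queue
import threading

def run_taskset(jobs, num_workers=6):
    """Toy 'taskset': workers drain a shared job queue until empty.
    num_workers=6 mirrors the six SPUs a PS3 game can use."""
    q = queue.Queue()
    for job in jobs:
        q.put(job)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                fn, arg = q.get_nowait()
            except queue.Empty:
                return                 # queue drained, "SPU" goes idle
            r = fn(arg)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Usage: square a batch of numbers across the "SPUs".
out = run_taskset([(lambda x: x * x, n) for n in range(8)])
print(sorted(out))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Completion order is nondeterministic (hence the sort), which is also why games that depend on SPURS are hard to fake with stubs.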
Asus N55SF, i7-2670QM (~2,8 ghz under typical load), GeForce GT 555M (only OpenGL)
03-15-2014, 11:23 PM (This post was last modified: 03-15-2014, 11:27 PM by Blackbird.)
(03-15-2014, 09:49 PM)ssshadow Wrote:
(03-15-2014, 09:10 PM)Blackbird Wrote: With the sys_lwmutex functions implemented, the log is a lot shorter: http://pastebin.com/RzvBkD4F
The game even exits itself. Well, it seems to need SPURS, like many other commercial games right now. Put very briefly, SPURS is about multithreading and such things; quite essential stuff.
It's self-aware, I guess.
And the GCM API that RSX uses, I think. What I'm interested in is how many games actually utilize all the SPUs rather than relying purely on the PPU. In theory, all early third-party games could be easier to emulate, as developers weren't yet familiar with the HW.
03-16-2014, 12:26 AM (This post was last modified: 03-16-2014, 12:37 AM by ssshadow.)
(03-15-2014, 11:23 PM)Blackbird Wrote:
(03-15-2014, 09:49 PM)ssshadow Wrote:
(03-15-2014, 09:10 PM)Blackbird Wrote: With the sys_lwmutex functions implemented, the log is a lot shorter: http://pastebin.com/RzvBkD4F
The game even exits itself. Well, it seems to need SPURS, like many other commercial games right now. Put very briefly, SPURS is about multithreading and such things; quite essential stuff.
It's self-aware, I guess.
And the GCM API that RSX uses, I think. What I'm interested in is how many games actually utilize all the SPUs rather than relying purely on the PPU. In theory, all early third-party games could be easier to emulate, as developers weren't yet familiar with the HW.
Pretty much 9 out of 10 games with high-end graphics utilize the SPUs, and SPURS, to some extent at least, from what I have tested. Obviously, games like the Arkedo series have no real need to, which is why they work.
Same goes for GCM. Every graphically advanced game is going to be using it a lot.
A good exception to this is Disgaea 3, though. It's not a pretty game, but it is 3D nonetheless. It doesn't use SPURS, and doesn't seem to use any advanced GCM commands either. If that strange video error could be fixed, it might actually reach in-game from there. (Or maybe not. )
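Since GCM keeps coming up: it's basically a thin layer for writing RSX commands into a buffer that the GPU then drains in order. A rough sketch of that push-buffer pattern (purely illustrative; the method names are made up, not real libgcm calls):

```python
class PushBuffer:
    """Toy command buffer: the CPU appends (method, arg) pairs, and a
    consumer applies them in order, the way RSX drains a GCM buffer."""

    def __init__(self):
        self.commands = []

    def emit(self, method, arg):
        self.commands.append((method, arg))

    def flush(self, state):
        # "GPU" side: apply each command to the device state in order,
        # then mark the buffer as consumed.
        for method, arg in self.commands:
            state[method] = arg
        self.commands.clear()
        return state

buf = PushBuffer()
buf.emit("SET_VIEWPORT", (1280, 720))      # hypothetical command names
buf.emit("SET_BLEND_ENABLE", True)
print(buf.flush({}))
```

A game that only ever emits a handful of basic commands (like Disgaea 3, apparently) needs far less of this layer implemented than one that streams thousands of state changes per frame.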