Absolutely awesome... Runs flawlessly, everything works. Game runs fast and smooth nearly out of the box.
What does not work
What was not tested
***System Information***
CPU: AMD Phenom X4 3.4GHz
RAM: 16GB
GPU: ATI 6990
VGA Driver: latest Catalyst 12.4

***Game Settings***
Resolution: 1920x1080, fullscreen
Texture Cache: Large
Vsync: On

***Installation***
1. Operate in a clean WINEPREFIX (don't know if that's necessary, but I'm using a new prefix for every game/program I install).
2. env WINEPREFIX=~/yourfoolwinepath/ winetricks xact_jun2010
3. env WINEPREFIX=~/yourfoolwinepath/ wine Setup.exe

The installer runs without any problems. After that, the game crashed for me after leaving the first room and getting rescued by the car guy: the game freezes. So I applied Update 1 from id (normally you'll get it through Steam if you have the Steam version ^^). After that the game runs really flawlessly without any problems (okay, I got some black graphics issues, but that's not the place to report them). I would even give the game platinum status. id is definitely the best choice.

***Additional Notes***
Take a look at the little "dancing" character sitting in the buggy in front of you when you get rescued from the bandits right after starting the game. Here we go :-) Kneel down for the one and only DOOM character taking you back in the day :-D The older ones may know what I mean ^^ id rulez!!
|Operating system||Test date||Wine version||Installs?||Runs?||Rating||Submitter|
|openSUSE 12.1 x86_64||May 03 2012||1.5.3||Yes||Yes||Gold||akI*|
|openSUSE 11.4 x86_64||Mar 15 2012||1.4||Yes||Yes||Gold||akI*|
|Slackware -current||Dec 27 2011||1.3.35||Yes||Yes||Gold||an anonymous user|
|Ubuntu 11.10 "Oneiric" amd64 (+ variants like Kubuntu)||Dec 26 2011||1.3.35||Yes||Yes||Gold||Nonoo|
|Linux Mint 12 "Lisa" x86_64||Nov 27 2011||1.3.33||N/A||Yes||Gold||Xpander|
|Bug #||Description||Status||Resolution|
|12182||Multiple games need X3DAudio1_1.dll (Supreme Commander)||CLOSED||FIXED|
|27779||Desktop mouse pointer always visible in Steam games||CLOSED||FIXED|
|28679||Corrupt sound in many apps||CLOSED||FIXED|
|28723||Sound stutter in Rage when emulated windows version is set to "Windows 7" (XAudio2 -> mmdevapi sound output path)||CLOSED||FIXED|
|28730||The game "RAGE" crashes at startup with "SetPixelFormat failed" error||CLOSED||INVALID|
The game requires XAudio2 engine version 2.7, which is part of the Microsoft DirectX June 2010 redistributable. If you downloaded and launched the game the usual way, via the Steam client in a clean Wine prefix, it should install the DirectX runtime for you automatically. If that failed for some reason, use a fresh version of winetricks to install xact_jun2010. Be careful not to confuse it with xact, which installs an older version (2.6) of the XAudio2 engine from the February 2010 DirectX redist; that one is too old and won't work for this game.
It has also been reported that some people needed to set sound to "Emulation" mode in the winecfg preferences. This setting is not available (and most likely not required) in Wine 1.3.29 and later. If you use an older Wine version and experience sound problems, your best bet is to upgrade to a more recent Wine version. If you have strong reasons not to upgrade, you can give "Emulation" mode a try. YMMV.
It has been reported that sometimes Rage starts up OK but fails to produce any sound output for unclear reasons. Chances are that your Wine prefix is configured to emulate WinXP and you've got stale DirectSound registry entries that cause trouble for unknown reasons. Try running Rage in a fresh, clean Wine prefix and check if that helps. You may also try deleting the entire HKCU\Software\Wine\DirectSound key and its subkeys from the registry; in my case that fixed the problem and got Rage working correctly in an old and polluted Wine prefix. Switching the prefix to emulate Windows 7 or Vista is another possible workaround for this problem.
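If you prefer not to click through regedit by hand, the key can be removed with a registry file. This is only a sketch (the filename is made up); the leading minus tells regedit to delete the key and everything under it:

```
REGEDIT4

[-HKEY_CURRENT_USER\Software\Wine\DirectSound]
```

Save it as, say, fix-dsound.reg and import it into the affected prefix with "wine regedit fix-dsound.reg".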
RAGE is based on the idTech5 engine, which uses "Virtual Texture" (a.k.a. "megatexture") technology to texture the game world. This technology is very computation intensive and extremely hungry for CPU power, for GPU power (if you use Cuda-accelerated VT transcoding) and for overall disk subsystem throughput. The engine is designed in such a way that it should be tweaked to run most effectively using all available CPU cores your system is equipped with. There's a built-in CPU core count detector that is supposed to tweak the engine automatically, but it fails to do its job properly due to a limitation in Wine (as of Wine 1.5.2). The problem is that the game uses a technique for determining the CPU core count which isn't properly implemented in Wine, causing the game to assume the system is a single-CPU, single-core box.
To tweak the engine manually you should set the values of two game cvars, namely "vt_maxPPF" and "jobs_numThreads". The first determines how many texture pieces should be decoded per displayed frame in the worst case; the second tells the game how many threads to use simultaneously for its tasks.
The correct value for "jobs_numThreads" depends on the number of cores your CPU has. Quad core CPU owners should use 3 or 4; dual core CPUs work better with 1 or 2. If your CPU is Hyper-Threaded, you'd better set this cvar to the number of physical cores your CPU has (i.e. for a CPU with 4 cores/8 HT execution threads the best value would be 4). In some rare cases (notably dual core CPU + AMD GPU) it might be better not to use a separate job thread at all due to limitations imposed by the GPU driver. In that case, set jobs_numThreads to 0 to turn off job threads completely.
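Since the engine cannot detect your core count under Wine, check it on the host and pick the value yourself. A minimal sketch; the helper name and the exact mapping are mine, distilled from the guidance above:

```shell
# Logical CPUs the OS sees (includes HT threads)
getconf _NPROCESSORS_ONLN

# Rough jobs_numThreads suggestion from a *physical* core count,
# following the guide: quad core -> 4, dual core -> 2, otherwise 0
suggest_jobs_threads() {
    cores=$1
    if [ "$cores" -ge 4 ]; then
        echo 4
    elif [ "$cores" -ge 2 ]; then
        echo 2
    else
        echo 0
    fi
}

suggest_jobs_threads 4
```

Remember that on a Hyper-Threaded CPU the logical count reported above is twice the physical core count, so halve it before feeding it to the helper.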
The best place to set "jobs_numThreads" is in the Steam game Launch Options (details on how to do it). Add something like "+jobs_numThreads 4" (without the quotes) there, picking the number as described above.
With Rage patch 1.2 (released on the 2nd of February, 2012) it became possible to benchmark the speed of the texture transcode operation. Unless you have compiled and installed the wine-cuda DLL wrapper for your system, the speed of this operation is mostly determined by the speed of your CPU and the number of threads the game is instructed to use through the "jobs_numThreads" variable. Don't be fooled by the number this benchmark presents: having the best texture transcoding performance is not the same as having a smooth gaming experience with consistent FPS. If you want to maximize transcoding performance despite this warning, you can try the following:
It is important to understand that there's no point in chasing maximum texture transcode performance at any cost, because it might result in extremely unstable FPS and a non-smooth gaming experience. For example, with an AMD FX 8120 eight core CPU one gets maximum transcoding performance (benchmark result ~80 megatexels) when "jobs_numThreads" is set to 8, while the most stable FPS level is achieved with "jobs_numThreads" equal to 2 (smooth and almost constant 60FPS), despite the benchmark result for the latter case being significantly lower, around 40 megatexels. In other words, if the benchmark result is 35-40 or more, there's no point in trying to improve it. On the other hand, a benchmark result below 20 will give a jerky gaming experience with huge FPS drops during viewport movement.
The second cvar to tweak, "vt_maxPPF", effectively acts as a balancer between FPS stability and the amount of texture pop-in. Depending on the speed and type of your CPU, and on whether you use Cuda-accelerated texture decoding, a good value for "vt_maxPPF" might be anywhere from 8 to 256. Owners of a low-speed dual core CPU should use 8 or 16; a typical value for a quad core AMD Phenom II X4 955 CPU would be 32 or 64. Users with Cuda-enabled setups might be happy with 128 or 256. If you can live with a moderate amount of texture pop-in but can't stand FPS drops below 60 when moving the viewport, your best bet is a low value for vt_maxPPF like 8 or even 4. To set this cvar, add something like "+vt_maxPPF 16" (without the quotes) to the same Steam launch options.
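The ranges above can be collapsed into a tiny helper. The function name and setup categories are mine; the numbers come straight from the paragraph above:

```shell
# Starting vt_maxPPF suggestion by setup class (hypothetical helper)
suggest_vt_maxppf() {
    case "$1" in
        slow-dual)  echo 8   ;;  # low-speed dual core CPU
        quad)       echo 32  ;;  # e.g. AMD Phenom II X4 955
        cuda)       echo 128 ;;  # Cuda-accelerated transcoding
        smooth)     echo 4   ;;  # prioritize stable 60 FPS over pop-in
        *)          echo 16  ;;  # middle-of-the-road default
    esac
}

suggest_vt_maxppf quad
```

Treat the output as a starting point and adjust up or down after watching for pop-in versus FPS drops in-game.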
If your system configuration is similar to "quad-core CPU + GeForce 550 Ti with 1GB", you could use "jobs_numThreads" set to 2 and "vt_maxPPF" set to 16 as good starting values. That allows a smooth 60FPS gaming experience at 1680x1050 with 4x antialiasing, "big" texture cache and "high" aniso settings.
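Putting the two cvars together, the Steam launch options line for such a system might look like this (the values are the starting points suggested above, not universal):

```
+jobs_numThreads 2 +vt_maxPPF 16
```

Change either number independently once you see how your own system balances pop-in against FPS stability.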
If your video card has more than 1GB of video RAM, it is reasonable to increase the game's virtual texture page sizes (think of it as a kind of texture cache) to 8K per page. To do this, create a new file named «Rageconfig.cfg» in the «Steam Install Dir»/steamapps/common/rage/base/ folder and populate it with the following lines:
vt_pageimagesizeuniquediffuseonly2 "8192"
vt_pageimagesizeuniquediffuseonly "8192"
vt_pageimagesizeunique "8192"
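A quick way to create the file from a terminal. The Steam library path below is a guess for a typical install; point it at your actual one:

```shell
# Path is an assumption; adjust to where your Steam library actually lives
RAGE_BASE="$HOME/.steam/steam/steamapps/common/rage/base"
mkdir -p "$RAGE_BASE"

# Write the config file; quoted EOF prevents any shell expansion inside
cat > "$RAGE_BASE/Rageconfig.cfg" <<'EOF'
vt_pageimagesizeuniquediffuseonly2 "8192"
vt_pageimagesizeuniquediffuseonly "8192"
vt_pageimagesizeunique "8192"
EOF
```

Note that cat with > overwrites an existing Rageconfig.cfg, so merge by hand if you already keep other cvars in it.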
Create «Rageconfig.cfg» as described earlier and populate it with the following lines:
// Do not show the unskippable introduction video
com_skipIntroVideo "1"
// Enable the built-in engine FPS counter
com_showFPS "1"
con_noPrint "0"
// Increase mouse sensitivity and use a 2-tap mouse input filter to fix "jerky" mlook
m_sensitivity 10
m_smooth 2

Note: It might be required to specify "+com_skipIntroVideo 1" in Steam's game "Launch properties..." for the intro video skip to work. I have it set in both places and it works like a charm.
You should never force AA/ANISO levels through the GPU driver control panel. The game engine uses a complicated rendering technique that is incompatible with the way drivers force AA/ANISO levels, resulting in visible render glitches when forcing is turned on. nVIDIA users should also turn off "Texture Sharpening", as it causes huge render glitches too. The correct way to set the AA level is the in-game graphics settings menu. Setting the anisotropic filtering level requires modifying a game configuration file. Create «Rageconfig.cfg» as described earlier and populate it with the following lines:
image_anisotropy "1"
vt_maxaniso "2"

These two settings control the ANISO level the game uses for texture filtering. The official tweak guide states that the engine only supports two levels of ANISO filtering, namely 2 and 4, selected via the vt_maxaniso cvar. My experiments showed that the visual difference between levels 2 and 4 is almost negligible while the performance drop is pretty huge. YMMV; experiment and pick what fits your needs. I haven't been able to notice any visual or performance difference when changing the image_anisotropy cvar. Still, it might be wise to set it to 8 or 16 in case it affects visual quality in some rare circumstances.
The Rage engine is designed to support a brand-new style of vsync control. The main idea is to control vsync dynamically: turn it on when the system is fast enough to render at 60+ FPS, and off when FPS falls below 60. This technology allows tear-free 60 FPS gameplay on decent systems without the undesired drops down to 30 FPS in performance-constrained situations. Unfortunately, current GPU drivers lack the OpenGL extension required for this vsync control scheme to work. Although a member of the nVIDIA Linux driver development team has reported that the implementation of this extension is complete and it will appear in future driver versions, as of driver version 295.40 the extension in question is still unavailable. To work around the problem and get a tear-free gaming experience in Rage, reconfigure the game to use the traditional vsync scheme (FPS drops down to 30/15 when the system is not fast enough to render at 60+/30+ FPS). The recommended way to enable the traditional vsync scheme is to create the «Rageconfig.cfg» file as explained in the previous section and add the following line to it:
It should do the trick in most cases. With some buggy GPU drivers this setting might have no effect; in that case, try forcing vsync "on" in the GPU driver control panel. It might help, but be prepared to face crashes to desktop and all kinds of render glitches, as the game engine wasn't designed to work with vsync forced on through the driver control panel.