Everything I have tested so far works. Installation is obviously handled by Steam, so no problems there. The game itself starts up fine and runs without any problems whatsoever. Everything seems to work:
What does not
What was not tested
Playing the game from start to finish: I had only played it for about 30 minutes for testing purposes before writing this report.
Giving it a "Gold" rating as the game works exceptionally well, smoothly and without any significant problems (except for a high mouse sensitivity issue, which is not a show-stopper), but it requires the XAudio2 engine to be installed (third-party software/native DLL).

You may run into graphical glitches on AMD/ATI GPU hardware (the Catalyst driver is known to be very buggy) or with an outdated nVIDIA driver. I had to upgrade to version 280.13 to play the game without visual glitches (as of this test I was using driver version 295.40). It is very important not to force AA/Aniso through the driver control panel and not to enable "Texture sharpening" - using either would result in visual artifacts that look like visible seams between polygons. The AA level should be controlled through the game settings menu; the Aniso level can be changed by editing the configuration file.

System specs I tested the game on: AMD FX(tm)-8120 eight-core CPU @ 3.1GHz, 8GB DDR3 RAM, nVIDIA GeForce GTX 550 Ti GPU with 1GB VRAM. With the virtual texture 8K page size tweak in place (and the texture cache size set to "big" through the video settings menu) I got a fairly smooth and steady 60fps and a nice enough visual experience. Using the wine-cuda DLL wrapper gives a nice boost to texture transcoding performance, which helps reduce the texture pop-in effect to a negligible amount. The best gaming experience was achieved using "+com_skipIntroVideo 1 +r_swapInterval 1 +vt_maxPPF 32 +jobs_numThreads 3 +g_fov 100" as the game startup command line options.
|Operating system|Test date|Wine version|Installs?|Runs?|Used workaround?|Rating|Submitter|
|FreeBSD 11.1 amd64|Sep 28 2017|2.15-staging|Yes|Yes|No|Platinum|SF|
|Ubuntu 16.04 "Xenial" amd64 (+ variants like Kubuntu)|Sep 14 2016|1.9.18|Yes|Yes| |Garbage|Kari Saaranen|
|Debian GNU/Linux 8.x "Jessie" x86_64|Mar 06 2016|1.9.1|Yes|Yes| |Platinum|Roman Hargrave|
|Arch Linux x86_64|Jul 24 2013|1.6|Yes|Yes| |Platinum|Artur h0m3|
|Slackware64 14.0|Jan 05 2013|1.5.21|Yes|Yes| |Gold|an anonymous user|
The game requires XAudio2 engine version 2.7, which is part of the Microsoft DirectX June 2010 redistributable. If you download and launch the game the usual way, using the Steam client in a clean Wine prefix, it should automatically install the DirectX runtime for you. In the unfortunate case it fails for some reason, use a fresh version of winetricks to install xact_jun2010. Be careful not to confuse it with xact, which would install the older version (2.6) of the XAudio2 engine from the February 2010 DirectX redist - that one is too old and will not work for this game.
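As a sketch, the manual install looks like the following (the prefix path ~/.wine is an assumption here; point WINEPREFIX at whatever prefix your Steam/RAGE install actually lives in):

```shell
# Target the Wine prefix the game runs in
# (~/.wine is an assumption; substitute your real prefix path).
export WINEPREFIX="$HOME/.wine"

# Install XAudio2 2.7 from the June 2010 DirectX redistributable.
# Note: the verb is "xact_jun2010", NOT plain "xact" - the latter pulls
# the older February 2010 redist with XAudio2 2.6, which this game rejects.
winetricks -q xact_jun2010
```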
It has also been reported that some people needed to set sound to "Emulation" mode in the winecfg preferences. This setting is not available (and most likely not required) in Wine versions 1.3.29+. If you use an older Wine version and experience sound problems, your best bet is to upgrade to a more recent Wine version. If you have strong reasons not to upgrade, you can give "Emulation" mode a try. YMMV.
It has been reported that sometimes Rage starts up OK but fails to produce any sound output for uncertain reasons. Chances are your Wine prefix is configured to emulate WinXP and you have stale DirectSound registry entries that cause trouble for unknown reasons. Try running Rage in a fresh, clean Wine prefix and check whether that helps. You may also try deleting the entire HKCU\Software\Wine\DirectSound key and its subkeys from the registry; in my case that fixed the problem and got Rage working correctly in an old and polluted Wine prefix. Switching the prefix to emulate Windows 7 or Vista is another possible workaround for this problem.
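A minimal sketch of the registry cleanup, assuming the affected prefix is at ~/.wine (adjust WINEPREFIX to match your setup):

```shell
# Remove the stale DirectSound settings from the affected prefix.
# (~/.wine is an assumption; set WINEPREFIX to your actual prefix.)
export WINEPREFIX="$HOME/.wine"

# Wine ships a "reg" tool mirroring the Windows one; /f suppresses
# the confirmation prompt when deleting the key and its subkeys.
wine reg delete "HKCU\\Software\\Wine\\DirectSound" /f
```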
RAGE is based on the idTech5 engine, which uses "Virtual Texture" (a.k.a. "megatexture") technology to texture the game world. This technology is very computation intensive and extremely hungry for CPU power, for GPU power (if you use CUDA-accelerated VT transcoding) and for overall disk subsystem throughput. The engine is designed to be tweaked so that it runs most effectively using all the CPU cores your system is equipped with. There is a built-in CPU core count detector that is supposed to tweak the engine automatically, but it fails to do its job properly due to a limitation in Wine (as of Wine 1.5.2): the game determines the CPU core count using a technique that is not properly implemented in Wine, causing the game to assume the system is a single-CPU, single-core box.
To tweak the engine manually you should set the value of two game cvars, namely "vt_maxPPF" and "jobs_numThreads". The first determines how many texture pieces should be decoded per displayed frame in the worst case; the second tells the game how many threads to use simultaneously for its tasks.
The correct value for "jobs_numThreads" depends on the number of cores your CPU has. Quad-core CPU owners should use 3 or 4; dual-core CPUs work better with 1 or 2. If your CPU is Hyper-Threaded, you should set this cvar to the number of physical cores your CPU has (i.e. for a CPU with 4 cores/8 HT execution threads the best value is 4). In some rare cases (notably dual-core CPU + AMD GPU) it may be better not to use separate job threads at all due to limitations imposed by the GPU driver; in that case set jobs_numThreads to 0 to turn job threads off completely.
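Since the right value hinges on physical (not logical) cores, here is a quick sketch for checking the physical core count on Linux using lscpu from util-linux, falling back to nproc (which counts logical CPUs) if lscpu is unavailable:

```shell
# Count unique (core, socket) pairs = number of physical cores.
cores=$(lscpu -p=Core,Socket 2>/dev/null | grep -v '^#' | sort -u | wc -l)

# Fall back to the logical CPU count if lscpu produced nothing.
[ "$cores" -gt 0 ] || cores=$(nproc)

echo "$cores"
```

On a 4-core/8-thread Hyper-Threaded CPU this prints 4, which is the value you would feed to jobs_numThreads.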
The best place to set "jobs_numThreads" is in the Steam game Launch Options (right-click the game in the Steam library, choose "Properties...", then "Set Launch Options"). Add something like "+jobs_numThreads 4" there, picking the number as described above.
With Rage patch 1.2 (released on the 2nd of February, 2012) it became possible to benchmark the speed of the texture transcode operation. Unless you have compiled and installed the wine-cuda DLL wrapper on your system, the speed of this operation is mostly determined by the speed of your CPU and the number of threads the game is instructed to use via the "jobs_numThreads" variable. Don't be fooled by the number this benchmark presents: having the best texture transcoding performance is not the same as having a smooth gaming experience with consistent FPS. If you want to maximize transcoding performance despite this warning, you can try the following:
It is important to understand that there is no point in getting maximum texture transcode performance at any cost, because it can result in extremely unstable FPS and a non-smooth gaming experience. For example, with an AMD FX 8120 eight-core CPU you get maximum transcoding performance (benchmark result ~80 megatexels) when "jobs_numThreads" is set to 8, while the most stable FPS is achieved with "jobs_numThreads" equal to 2 (smooth and almost constant 60FPS), despite the benchmark result in the latter case being significantly lower - around 40 megatexels. In other words, if the benchmark result is 35-40 or more, there is no point in trying to improve it. On the other hand, a benchmark result below 20 will give you a jerky gaming experience with huge FPS drops while moving the viewport.
The second cvar to tweak - "vt_maxPPF" - effectively acts as a balance between FPS stability and the amount of texture pop-in. Depending on the speed and type of your CPU, and on whether you use CUDA-accelerated texture decoding, a good value for "vt_maxPPF" might be anywhere from 8 to 256. Owners of slow dual-core CPUs should use 8 or 16; a typical value for a quad-core AMD Phenom II X4 955 CPU would be 32 or 64. Users with CUDA-enabled setups might be happy setting vt_maxPPF to 128 or 256. If you can live with a moderate amount of texture pop-in but can't stand FPS drops below 60 while moving the viewport, your best bet is a low value for vt_maxPPF like 8 or even 4. To set this cvar, add something like "+vt_maxPPF 16" to the Steam game Launch Options.
If your system configuration is similar to "quad-core CPU + GeForce 550 Ti with 1GB", you could use "jobs_numThreads" set to 2 and "vt_maxPPF" set to 16 as good starting values. That should give you a smooth 60FPS gaming experience at 1680x1050 resolution with 4x antialiasing, the "big" texture cache and "high" aniso settings.
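Put together, the Steam Launch Options line for that starting configuration would look like this (treat the numbers as a starting point to tune, not a prescription):

```
+jobs_numThreads 2 +vt_maxPPF 16
```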
If your video card has more than 1GB of video RAM, it is reasonable to increase the game's virtual texture page sizes (think of them as a kind of texture cache) to 8K per page. To do this, create a new file named «Rageconfig.cfg» in the «Steam install dir»/steamapps/common/rage/base/ folder and populate it with the following lines:
vt_pageimagesizeuniquediffuseonly2 "8192"
vt_pageimagesizeuniquediffuseonly "8192"
vt_pageimagesizeunique "8192"
Create «Rageconfig.cfg» as described earlier and populate it with the following lines:
// Do not show the unskippable introduction video
com_skipIntroVideo "1"
// Enable the built-in engine FPS counter
com_showFPS "1"
con_noPrint "0"
// Increase mouse sensitivity and use a 2-tap mouse input filter to fix "jerky" mlook
m_sensitivity 10
m_smooth 2

Note: it might be necessary to also specify "+com_skipIntroVideo 1" in the Steam game "Launch properties..." for the intro video skip to work. I have it set in both places and it works like a charm.
You should never force AA/ANISO levels through the GPU driver control panel. The game engine uses a complicated rendering technique that is incompatible with the way drivers force AA/ANISO levels, and forcing them results in visible render glitches. nVIDIA users should also turn off "Texture Sharpening", as it causes huge render glitches too. The correct way to set the AA level is the in-game graphics settings menu. Setting the anisotropic filtering level requires modifying the game configuration file. Create «Rageconfig.cfg» as described earlier and populate it with the following lines:
image_anisotropy "1"
vt_maxaniso "2"

These two settings control the ANISO level the game uses for texture filtering. The official tweak guide states that the engine only supports two levels of ANISO filtering, namely 2 and 4, selected via the vt_maxaniso cvar. My experiments showed that the visual difference between levels 2 and 4 is almost negligible while the performance drop is pretty large. YMMV; experiment and choose what best fits your needs. I was not able to notice any visual or performance difference when changing the image_anisotropy cvar. Still, it might be wise to set it to 8 or 16 in case it affects visual quality in some rare circumstances.
The Rage engine is designed to support a brand-new style of vsync control. The main idea is to control vsync dynamically, turning it on when the system is fast enough to render at 60+ FPS and off when FPS falls below 60. This technology allows tear-free 60 FPS gameplay on decent systems without undesired drops to 30 FPS in performance-constrained situations. Unfortunately, current versions of the GPU drivers lack the OpenGL extension required for this vsync control system to work. Although a member of the nVIDIA Linux driver development team has reported that the implementation of this extension is complete and it will appear in future driver versions, as of driver version 295.40 the extension in question is still unavailable. To work around the problem and get a tear-free gaming experience in Rage, you have to reconfigure the game to use the traditional vsync control scheme (FPS drops to 15/30 when the system is not fast enough to render at 60+/30+ FPS). The recommended way to enable the traditional vsync control scheme is to create the «Rageconfig.cfg» file as explained in the previous section and add the following line to it:

r_swapInterval "1"

(This matches the "+r_swapInterval 1" startup option mentioned earlier.)
That should do the trick in most cases. With some buggy GPU drivers this setting may have no effect; in that case, try forcing vsync "on" in the GPU driver control panel. It might help, but be prepared to face crashes to desktop and all kinds of render glitches, as the game engine was not designed to work with vsync forced through the GPU driver control panel.