LuxMark v3.1beta2


Postby Dade » Fri Jul 31, 2015 5:35 am

For more details about LuxMark v3.x, check: viewtopic.php?f=34&t=11585

This version includes the LuxRender v1.5RC2 render engine and:

- the latest OpenCL optimization suggested by NVIDIA
- OpenCL "overclocking" (viewtopic.php?f=8&t=12265)
- a new command-line --ext-info option (viewtopic.php?f=8&t=12278#p115645); see the usage sketch after this list
- a fix for OpenCL devices with weird names (viewtopic.php?f=34&t=11585&start=50#p115646)
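
For illustration, one possible way to combine the new option with a benchmark run; this is only a sketch, under the assumption that --ext-info is a plain switch that can be added to the usual benchmark options (check the linked post for the exact behaviour) and that the executable is called luxmark (luxmark.exe on Windows):

Code: Select all
luxmark --scene=LUXBALL_HDR --mode=BENCHMARK_OCL_GPU --single-run --ext-info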

Note: While LuxMark v3.0 and v3.1beta2 deliver similar results, it is only fair to compare v3.1beta2 results with v3.1beta2 and v3.0 with v3.0.

Binaries

Windows 64bit: http://www.luxrender.net/release/luxmar ... 1beta2.zip (note: you may have to install the Visual Studio 2013 C++ runtime => https://www.microsoft.com/en-US/downloa ... x?id=40784)
MacOS 64bit: http://www.luxrender.net/release/luxmar ... x86_64.zip
Linux 64bit: http://www.luxrender.net/release/luxmar ... a2.tar.bz2

Re: LuxMark v3.1beta2

Postby crosley09 » Fri Jul 31, 2015 10:16 am

How do we enable the 'overclocking' feature? (Or is it running by default?)

I'm currently trying:

Code: Select all
--scene=LUXBALL_HDR --mode=BENCHMARK_OCL_GPU --single-run --opencl.kernel.options="-cl-fast-relaxed-math -cl-strict-aliasing -cl-mad-enable"


to no avail.

Also, while I'm asking, is there documentation of the command-line options available to LuxMark somewhere? I've pieced together the above from various posts.
i7 5930k, GTX 980 ti, 32 GB ddr4, 512 GB PCIe SSD...Windows 10

Re: LuxMark v3.1beta2

Postby Dade » Fri Jul 31, 2015 11:47 am

crosley09 wrote:How do we enable the 'overclocking' feature? (Or is it running by default?)


It is enabled by default in all 3 scenes.

crosley09 wrote:Also, while I'm asking, is there documentation of the command-line options available to LuxMark somewhere? I've pieced together the above from various posts.


Just run LuxMark with --help; it will print a help message.
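
For example, a minimal invocation, assuming the executable is named luxmark and is run from the install directory (luxmark.exe on Windows):

Code: Select all
./luxmark --help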

Re: LuxMark v3.1beta2

Postby crosley09 » Fri Jul 31, 2015 11:51 am

Excellent. Thanks!

Re: LuxMark v3.1beta2

Postby cwizou » Mon Aug 03, 2015 11:57 am

Apologies for the delay. I can confirm the DeviceName string bug is indeed fixed, thanks again!

(Side note: you may want to consider making the OpenCL "overclocking" something that can be disabled?)

Re: LuxMark v3.1beta2

Postby Dade » Tue Aug 04, 2015 5:58 am

cwizou wrote:(Side note: you may want to consider making the OpenCL "overclocking" something that can be disabled?)


It can be disabled by removing the following line inside .cfg files:

Code: Select all
opencl.kernel.options = "-cl-fast-relaxed-math -cl-strict-aliasing -cl-mad-enable"
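
For instance, a sketch of the edit; the scenes/luxball/render.cfg path is an assumption (each scene has its own .cfg), and the '#' comment syntax of LuxCore property files is assumed here, so simply deleting the line is the safest option:

Code: Select all
# in scenes/luxball/render.cfg (path assumed), comment out or delete:
# opencl.kernel.options = "-cl-fast-relaxed-math -cl-strict-aliasing -cl-mad-enable"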

Re: LuxMark v3.1beta2

Postby cwizou » Tue Aug 04, 2015 1:56 pm

Dade wrote:
cwizou wrote:(Side note: you may want to consider making the OpenCL "overclocking" something that can be disabled?)


It can be disabled by removing the following line inside .cfg files:

Code: Select all
opencl.kernel.options = "-cl-fast-relaxed-math -cl-strict-aliasing -cl-mad-enable"

Good to know, thanks.

I'm still perplexed by the choice to enable it by default, considering the caveats you mentioned in this thread: http://www.luxrender.net/forum/viewtopic.php?f=8&t=12265

- It's not enabled by default in LuxCore
- It can lead to artifacts
- It gives a sizeable advantage to one manufacturer

The third point wouldn't be an issue for me without the second, to be honest. I'd suggest you ask other reviewers' opinions on the matter. Sane defaults are a must, considering most reviewers won't take the time to change a config file to conform to something "fairer" that actually represents the way LuxCore performs (isn't that supposed to be the point of LuxMark?). Anyway, just my 2 cents! Thanks again for the work you put into making this benchmark.

Re: LuxMark v3.1beta2

Postby jensverwiebe » Wed Aug 05, 2015 6:01 am

OSX build: http://www.luxrender.net/release/luxmar ... x86_64.zip (not codesigned)

Side note: with the Intel OpenCL SDK on Linux, -cl-strict-aliasing breaks kernel compiles, as I mentioned before; let's see how many other people complain ;)
One solution could be using the AMD APP SDK for the CPU (with the appropriate ICD).
In any case, the option has not been part of OpenCL since version 1.1; see: https://www.khronos.org/registry/cl/sdk ... ogram.html
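
A possible workaround on such setups, assuming the per-scene .cfg is edited as described earlier in the thread, is to drop only the offending flag and keep the rest of the line:

Code: Select all
opencl.kernel.options = "-cl-fast-relaxed-math -cl-mad-enable"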

Jens
Last edited by jensverwiebe on Fri Aug 07, 2015 4:56 pm, edited 2 times in total.

Re: LuxMark v3.1beta2

Postby pciccone » Wed Aug 05, 2015 12:30 pm

Thank you Jens.

Re: LuxMark v3.1beta2

Postby pciccone » Wed Aug 05, 2015 12:32 pm

Jens, the build does not contain the Qt framework, so it crashes at startup.
