GPU table outdated, only up to GTX 1070

hsdecalc

Joined: 22 Feb 19
Posts: 3
Credit: 3,275,629
RAC: 460
Message 3267 - Posted: 16 Jun 2022, 14:06:11 UTC

On WIN 10 I have a gpuLookupTable_v402.txt.
My GTX 1080 Ti is not listed and is running with little load / in standard mode (per stderr.txt).
Is there a table or set of values that utilizes the GPU better?
ID: 3267
Profile Eric Driver
Project administrator
Project developer
Project tester
Project scientist

Joined: 8 Jul 11
Posts: 1212
Credit: 247,700,737
RAC: 223,199
Message 3269 - Posted: 17 Jun 2022, 15:23:54 UTC - in response to Message 3267.  

On WIN 10 I have a gpuLookupTable_v402.txt.
My GTX 1080 Ti is not listed and is running with little load / in standard mode (per stderr.txt).
Is there a table or set of values that utilizes the GPU better?


The default values should work well enough. The lookup table is for fine tuning and usually gives less than a 10% improvement.

If GPU utilization is less than 100%, you can try using the app_config.xml file to have it run more than one GPU task at a time. I have done that myself to raise my usage from 90% to 100%. Individual run times go up, but you process 2 WUs at a time.
ID: 3269
Speedy51

Joined: 13 Apr 19
Posts: 24
Credit: 6,747,342
RAC: 9,967
Message 3270 - Posted: 18 Jun 2022, 3:41:37 UTC - in response to Message 3269.  

When you are running 2 tasks at a time, how much CPU do you give to each task? For example, 0.5?
ID: 3270
Profile Eric Driver
Project administrator
Project developer
Project tester
Project scientist

Joined: 8 Jul 11
Posts: 1212
Credit: 247,700,737
RAC: 223,199
Message 3271 - Posted: 18 Jun 2022, 4:01:28 UTC - in response to Message 3270.  

When you are running 2 tasks at a time, how much CPU do you give to each task? For example, 0.5?


I think you are asking how to set the "cpu_usage" tag in the app_config.xml file.

This will vary depending on your CPU. I have mine set to 0.20. I got that value by looking at the CPU usage of the GPU task while it was running as a single task.

In case it helps, here are the contents of my app_config.xml file:
<app_config>
  <app>
    <name>GetDecics</name>
    <fraction_done_exact/>
    <gpu_versions>
      <gpu_usage>0.49</gpu_usage>
      <cpu_usage>0.20</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
ID: 3271
Speedy51

Joined: 13 Apr 19
Posts: 24
Credit: 6,747,342
RAC: 9,967
Message 3272 - Posted: 18 Jun 2022, 7:56:16 UTC - in response to Message 3271.  

Thanks Eric, that was the information I was after. I tried it, and I feel 8 minutes for one task when running 2 tasks is too long; when running 1 task I haven't seen runtimes like that on my card. GPU utilization is 90% or above the majority of the time.
ID: 3272
hsdecalc

Joined: 22 Feb 19
Posts: 3
Credit: 3,275,629
RAC: 460
Message 3273 - Posted: 18 Jun 2022, 14:54:46 UTC

Ok, thanks. With:
numBlocks = 9600
threadsPerBlock = 32
my GPU utilization is 99% and power consumption is around 175 W.
With the standard:
numBlocks = 1024
threadsPerBlock = 32
my GPU utilization is 98% and power consumption is around 160 W, roughly 10% less.
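For anyone else comparing lookup-table entries, here is a quick sketch of the arithmetic behind those two configurations, assuming (as is usual for GPU kernel launches) that numBlocks × threadsPerBlock gives the total number of threads in flight:

```python
# Illustrative arithmetic only; the real kernel launch happens inside the
# GetDecics app. Assumes total threads = numBlocks * threadsPerBlock.

def total_threads(num_blocks: int, threads_per_block: int) -> int:
    return num_blocks * threads_per_block

tuned = total_threads(9600, 32)    # the tuned entry above
default = total_threads(1024, 32)  # the standard entry

print(tuned)            # 307200
print(default)          # 32768
print(tuned / default)  # 9.375 -> ~9x more threads in flight
```

So the tuned entry launches far more parallel work, which lines up with the slightly higher utilization and power draw reported above.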
ID: 3273
k29000

Joined: 30 Apr 19
Posts: 2
Credit: 650
RAC: 0
Message 3274 - Posted: 19 Jun 2022, 11:43:08 UTC
Last modified: 19 Jun 2022, 11:43:42 UTC

Mr Eric,

Is there a noobs guide for what you're talking about in this thread?

For years I've been just running a spare computer with whatever the default BOINC setup does: CPU only, no GPUs. I'm wondering if there's anything I can do to get more performance out of it.
ID: 3274
Profile Eric Driver
Project administrator
Project developer
Project tester
Project scientist

Joined: 8 Jul 11
Posts: 1212
Credit: 247,700,737
RAC: 223,199
Message 3275 - Posted: 19 Jun 2022, 15:20:26 UTC - in response to Message 3274.  

Mr Eric,

Is there a noobs guide for what you're talking about in this thread?

For years I've been just running a spare computer with whatever the default BOINC setup does: CPU only, no GPUs. I'm wondering if there's anything I can do to get more performance out of it.


This link explains the app_config.xml file:
https://boinc.berkeley.edu/wiki/Client_configuration

The app_config.xml file goes in the project directory. It gives you extra control over the client. If it is not present, the default configuration is used, which is good enough for most users. I personally didn't need one until I had a much better GPU and wanted to run more than one task at a time.
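Beyond the gpu_usage and cpu_usage tags discussed in this thread, the BOINC Client_configuration wiki page also documents a per-app <max_concurrent> tag that caps how many tasks of an app run at once. A sketch only; the tag names follow the BOINC docs, and the numeric values are just the examples from this thread:

```xml
<!-- Sketch, not a recommendation: values are the examples from this thread. -->
<app_config>
  <app>
    <name>GetDecics</name>
    <max_concurrent>2</max_concurrent>  <!-- never run more than 2 tasks at once -->
    <gpu_versions>
      <gpu_usage>0.5</gpu_usage>  <!-- 0.5 lets two tasks share one GPU -->
      <cpu_usage>0.2</cpu_usage>  <!-- fraction of a CPU core reserved per task -->
    </gpu_versions>
  </app>
</app_config>
```

After editing the file, recent clients can re-read it without a restart via BOINC Manager's Options menu ("Read config files").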
ID: 3275

Copyright © 2022 Arizona State University