cluster:167 [2018/06/28 11:48] hmeij07 [CPU vs GPU]
  * That period covers 600 hours of time
  * Assume 99% utilization of cpu core or gpu device
  * Available time is measured per physical cpu core but by gpu device
  * There is no good/bad metric
  * Never collated such data before
  * The GPU usage is based on detecting gpu reservations (gpu= flag)
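The assumptions above can be turned into the Avail Hours figures in the tables below with simple arithmetic. A minimal sketch, using the core and device counts from the June 2018 table; the helper names are illustrative, not the actual accounting script:

```python
# Sketch of the arithmetic behind the Avail Hours and Job Hours % rows.
# Counts come from the June 2018 table; the reporting period is the
# 600 hours (25 days) mentioned above.

HOURS = 600  # hours in the reporting period

def avail_hours(units, hours=HOURS):
    # available time is measured per physical cpu core but by gpu device
    return units * hours

def job_hours_pct(job_hours, units):
    # consumed job hours as a percentage of available hours
    return round(100 * job_hours / avail_hours(units))

print(avail_hours(1192))  # 1,192 physical cpu cores -> 715200
print(avail_hours(24))    # 24 gpu devices -> 14400
```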
^ Metric ^ CPU ^ Ratio ^ GPU ^ Notes June 2018 ^
| Device Count | 72 | 3:1 | 24 | cpu all intel, gpu all nvidia |
| Core Count | 1,192 | 1:54 | 64,300 | physical only |
| Memory | 7,408 | 51:1 | 144 | GB |
| Teraflops | 38 | 1.5:1 | 25 | double precision, floating point, theoretical |
| Job Count | 2,834 | 3:1 | 1,045 | scheduled jobs regardless of exit status |
| Avail Hours | 715,… | | | |
| Job Hours | 221,… | | | |
| Job Hours % | 31 | 6:1 | 5 | as a percentage |
| Avail Hours2 | 561,… | | | |
| Job Hours % | 39 | 8:1 | 5 | more realistic... hp12 rarely used in June18 |
+ | |||
+ | The logs showing gpu %util confirm the extremely low GPU usage. When concatenating the four gpu %util values into a string, since 01Jan2017, the string ' | ||
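That concatenation check can be sketched as follows; the sampling command is an assumption (a standard ''nvidia-smi'' query), since the actual logging script is not shown here:

```python
# Hypothetical sketch: read the four per-device gpu %util values on a
# node and join them into one comma string, the form scanned in the logs.
import subprocess

def util_string(raw):
    # raw is one reading per line, e.g. "0\n0\n0\n0\n" on an idle node
    return ",".join(line.strip() for line in raw.splitlines() if line.strip())

def sample_node():
    # assumes nvidia-smi is installed on the gpu node
    raw = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return util_string(raw)

print(util_string("0\n0\n0\n0\n"))  # 0,0,0,0  (an idle 4-gpu node)
```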
+ | |||
+ | So were these 25 days in June 2018 an oddity? March is Honors' | ||
+ | |||
+ | ^ Total Monthly CPU+GPU Hours ^^^^^^^^^^^ | ||
+ | ^Ju17^Aug17^Sep17^Oct17^Nov17^Dec17^Jan18^Feb18^Mar18^Apr18^May18^ | ||
+ | |313, | ||
+ | |||
+ | ^ Metric ^ CPU ^ Ratio ^ GPU ^ Notes July 2017 ^ | ||
+ | | Device Count | 72 | 4:1 | 20 | cpu all intel, gpu all nvidia | | ||
+ | | Core Count | 1,192 | 1:42 | 50,000 | physical only | | ||
+ | | Memory | 7,408 | 74:1 | 100 | GB | | ||
+ | | Teraflops | 38 | 1.7:1 | 23 | double precision, floating point, theoretical | | ||
+ | | Job Count | 12, | ||
+ | | Avail Hours | 886, | ||
+ | | Job Hours | 260, | ||
+ | | Job Hours % | 30 | 1:1 | 26 | as a percentage | | ||
+ | | Avail Hours2 | 696, | ||
+ | | Job Hours % | 37 | 1.5:1 | 26 | more realistic...hp12 rarely used in June18| | ||
+ | |||
+ | * Some noise in this data with the inability to match start and end of job (~15% of records) | ||
+ | * The assumption that '' | ||
+ | * 939591860 | ||
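The start/end matching noise in the first bullet can be illustrated with a toy pairing pass; the record layout here is invented for illustration, not the scheduler's actual log format:

```python
# Toy sketch: pair scheduler start/end records by job id and sum the
# hours between them; records whose partner falls outside the window
# are counted as unmatched (the ~15% noise mentioned above).
from datetime import datetime

def job_hours(records):
    starts, hours, unmatched = {}, 0.0, 0
    for jobid, event, stamp in records:
        t = datetime.fromisoformat(stamp)
        if event == "start":
            starts[jobid] = t
        elif jobid in starts:
            hours += (t - starts.pop(jobid)).total_seconds() / 3600
        else:
            unmatched += 1          # end record with no matching start
    unmatched += len(starts)        # starts that never ended in the window
    return hours, unmatched

recs = [
    ("101", "start", "2018-06-01T00:00:00"),
    ("101", "end",   "2018-06-01T06:00:00"),
    ("102", "end",   "2018-06-01T03:00:00"),  # start fell outside the window
]
print(job_hours(recs))  # (6.0, 1)
```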
**[[cluster:…]]**