cluster:189 — revised 2020/01/22 13:46 by hmeij07 (previous revision 2020/01/13 13:01)
==== History ====
In 2006, four Wesleyan faculty members approached ITS with a proposal to centrally manage a high performance computing center (HPCC), seeding the effort with an NSF grant (about $190K: two racks of Dell PE1950 servers, 256 physical CPU cores in total, on Infiniband). ITS offered 0.5 FTE for a dedicated "hpcadmin" position.
The Advisory Group meets with the user base yearly during the reading week of the Spring semester (early May), before everybody scatters for the summer. At this meeting, the hpcadmin reviews the past year and previews the coming year, and the user base contributes feedback on progress and problems.
Rstore is a platform for storing static research data. The hope is to move static data off the HPCC and mount it read-only back onto the HPCC login nodes.
The Data Center has recently been renovated, so the HPCC has no more cooling problems (it used to be that, in the event of a cooling tower failure, temperatures in the HPCC would climb above 85F within 3 hours). No more. We have sufficient rack space (5 racks) and power for expansion. For details on that "live renovation"