cluster:151 [2016/11/21 19:33] hmeij07 [Resync Data]
cluster:151 [2016/11/29 19:08] hmeij07 [Resync Data #2]
Line 87:
 3678
  
-redefine with numtargets=2
-resync
 +# with numtargets=1 beegfs still writes to all primary targets found in all buddygroups
 +
 +# rebuild test servers from scratch with numparts=2
 +# drop hmeij/ into home1/ and obtain slightly more files (a couple of hundred), not double the amount
 +# /home/hmeij has 7808 files in it which get split over the primaries, but numparts=2 would yield 15,616 files?
 +# drop another copy into home2/ and per-target file counts double to circa 7808
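 +# aside: a sketch of how the pattern could be redefined (assumes BeeGFS v6 beegfs-ctl
 +# syntax; --numtargets is the flag behind the numtargets/numparts values above):
 +# beegfs-ctl --setpattern --numtargets=2 --chunksize=512k /mnt/beegfs/home1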
 +[root@cottontail2 ~]# beegfs-ctl --getentryinfo  /mnt/beegfs/home1 
 +Path: /home1 
 +Mount: /mnt/beegfs 
 +EntryID: 0-583C50A1-FA 
 +Metadata node: cottontail2 [ID: 250] 
 +Stripe pattern details: 
 ++ Type: Buddy Mirror 
 ++ Chunksize: 512K 
 ++ Number of storage targets: desired: 2 
 +[root@cottontail2 ~]# beegfs-ctl --getentryinfo  /mnt/beegfs/home2 
 +Path: /home2 
 +Mount: /mnt/beegfs 
 +EntryID: 1-583C50A1-FA 
 +Metadata node: cottontail2 [ID: 250] 
 +Stripe pattern details: 
 ++ Type: Buddy Mirror 
 ++ Chunksize: 512K 
 ++ Number of storage targets: desired: 2 
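 +# the buddy groups behind these mirrored targets could be listed with something like
 +# (assuming v6 syntax): beegfs-ctl --listmirrorgroups --nodetype=storage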
 + 
 +Source: /home/hmeij 7808 files in 10G 
 + 
 +TargetID        Pool        Total         Free    %      ITotal       IFree    % 
 +========        ====        =====         ====    =      ======       =====    = 
 +   13601         low     291.4GiB      63.1GiB  22%       18.5M       18.5M 100% 
 +   13602         low     291.4GiB      63.1GiB  22%       18.5M       18.5M 100% 
 +   21701         low     291.2GiB     134.6GiB  46%       18.5M       16.2M  87% 
 +   21702         low     291.2GiB     134.6GiB  46%       18.5M       16.2M  87% 
 +[root@cottontail2 ~]# rsync -ac --bwlimit=2500 /home/hmeij /mnt/beegfs/home1/ 
 +[root@cottontail2 ~]# rsync -ac --bwlimit=2500 /home/hmeij /mnt/beegfs/home2/ 
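 +# note: rsync --bwlimit is in KiB/s, so 2500 caps each copy at roughly 2.5 MiB/s;
 +# the table below is the same capacity snapshot taken after both copies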
 +TargetID        Pool        Total         Free    %      ITotal       IFree    % 
 +========        ====        =====         ====    =      ======       =====    = 
 +   13601         low     291.4GiB      43.5GiB  15%       18.5M       18.5M 100% 
 +   13602         low     291.4GiB      43.5GiB  15%       18.5M       18.5M 100% 
 +   21701         low     291.2GiB     114.9GiB  39%       18.5M       16.1M  87% 
 +   21702         low     291.2GiB     114.9GiB  39%       18.5M       16.1M  87% 
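 +# per-target free space dropped ~19.6GiB (63.1-43.5) and ~19.7GiB (134.6-114.9), which
 +# fits ~10G per rsync per server if both targets on a server share one filesystem (assumption)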
 + 
 +# the first rsync drops roughly 5G into each primary, which then gets copied to the secondaries.
 +# the second rsync does the same, so each storage server loses roughly 20G
 +# now shut one storage server down and the whole filesystem can still be accessed (HA)
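 +# a sketch of how that failover test might look (assumes v6 tooling; 13601 is just an
 +# example target id, and resync normally restarts on its own when the target returns):
 +# service beegfs-storage stop                             # on one storage server
 +# beegfs-ctl --listtargets --nodetype=storage --state     # reachability goes Offline
 +# service beegfs-storage start
 +# beegfs-ctl --startresync --targetid=13601 --nodetype=storage   # force a full resync if needed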
  
 </code>