
Cannot add SDX and VPX on MAS



Hi,

 

Our client asked us to install MAS, but after installing it on VMware we cannot add any VPX or SDX instances.

 

At the moment we have MAS 12.0 build 53.6 installed, while the VPX and SDX are on 11.1 build 54.14. Do we need to upgrade them to the 12.0 release for this to work?

 

When we try to add an SDX, MAS responds:

 

For example:

Trying to connect to 10.55.81.15 
Error: Either the device type is incorrect or login has failed.

 

When we try to add a VPX, MAS responds:

 

For example:

Trying to connect to 10.55.81.17 
Error: License cannot be retrieved. Either the NetScaler is unresponsive or the login credentials are incorrect.
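
For reference, this is roughly how the same logins can be tested by hand with curl (just a sketch on our side; assumes curl is available and <password> is replaced with the real nsroot password):

# VPX: standard NITRO login endpoint. A JSON reply with "errorcode": 0
# means the network path and the credentials are fine.
curl -s -X POST http://10.55.81.17/nitro/v1/config/login \
     -H 'Content-Type: application/json' \
     -d '{"login":{"username":"nsroot","password":"<password>"}}'

# SDX Management Service: login is POSTed form-urlencoded to /nitro/v1/config.
# The object= framing is our assumption from the SDX NITRO API conventions.
curl -s -X POST http://10.55.81.15/nitro/v1/config \
     --data-urlencode 'object={"login":{"username":"nsroot","password":"<password>"}}'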

 

We have created the NetScaler Profile using the nsroot credentials.

 

On the VPX and SDX we have installed a Platinum license, while on MAS we have not installed any license yet.

 

Can someone help us?

 

Thanks to all


2 hours ago, Max Lindqvist1709152463 said:

Any firewalls between the MAS and SVM/VPX?

 

Yes, we will ask the SOC to review their firewall policies.
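
While we wait, a quick reachability check we can run from the MAS shell (a sketch, assuming nc is available; 22, 80 and 443 are the usual TCP ports MAS needs towards the instances for SSH and NITRO):

for port in 22 80 443; do
    # -z: scan only, -w 3: three-second timeout per port
    nc -z -w 3 10.55.81.15 $port && echo "tcp/$port open" || echo "tcp/$port blocked"
done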

 

 

EDIT:

 

The SOC has opened all ports, and we can now add VPX instances, but we still cannot add the SDXs.

 

We always get the same error:

 

Trying to connect to 10.55.81.15 
Error: Either the device type is incorrect or login has failed.

 


Here is part of the mps_inventory.log file:

 

 

[root@netscaler-sdx ~]#
Monday, 5 Mar 18 10:42:57.811 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: iovirt-sdx, function_name=get_cavium_vf_summary
Monday, 5 Mar 18 10:42:58.170 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: iovirt-sdx, function_name=get_cavium_vf_summary
Monday, 5 Mar 18 10:42:58.170 +0100 [Debug] [HealthMonitorMain] <?xml version="1.0" ?>
<iovirt>
    <socket>
        <socketnum>
            0
        </socketnum>
        <vfstats>
            <freevfcount>
                4
            </freevfcount>
            <allocatedvfcount>
                4
            </allocatedvfcount>
            <total>
                8
            </total>
        </vfstats>
        <pfcount>
            1
        </pfcount>
        <pfs>
            <pf>
                <id>
                    0000:05:00.0
                </id>
                <allocatedvfs>
                    0000:05:00.4, 0000:05:00.1, 0000:05:00.3, 0000:05:00.2
                </allocatedvfs>
                <freevfs>
                    0000:05:00.5, 0000:05:00.6, 0000:05:00.7, 0000:05:01.0
                </freevfs>
            </pf>
        </pfs>
    </socket>
    <socket>
        <socketnum>
            1
        </socketnum>
        <vfstats>
            <freevfcount>
                5
            </freevfcount>
            <allocatedvfcount>
                3
            </allocatedvfcount>
            <total>
                8
            </total>
        </vfstats>
        <pfcount>
            1
        </pfcount>
        <pfs>
            <pf>
                <id>
                    0000:14:00.0
                </id>
                <allocatedvfs>
                    0000:14:00.3, 0000:14:00.1, 0000:14:00.2
                </allocatedvfs>
                <freevfs>
                    0000:14:00.4, 0000:14:00.5, 0000:14:00.6, 0000:14:00.7, 0000:14:01.0
                </freevfs>
            </pf>
        </pfs>
    </socket>
    <allocatedtovms>
        <vm>
            <uuid>
                27ca8d30-e6b4-46b9-2f0e-c9bf557dd84c
            </uuid>
            <allocatedvfs>
                0000:14:00.2, 0000:05:00.3
            </allocatedvfs>
        </vm>
        <vm>
            <uuid>
                9f401157-3967-41f1-2716-8b36801b875e
            </uuid>
            <allocatedvfs>
                0000:14:00.3, 0000:14:00.1, 0000:05:00.4, 0000:05:00.1, 0000:05:00.2
            </allocatedvfs>
        </vm>
    </allocatedtovms>
</iovirt>

Monday, 5 Mar 18 10:42:58.234 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=ipmi_sensor_list
Monday, 5 Mar 18 10:42:58.425 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=ipmi_sensor_list
Monday, 5 Mar 18 10:42:58.425 +0100 [Debug] [HealthMonitorMain] ipmi_sensor_list:CPU1 Temp        | 43.000     | degrees C  | ok    | 0.000     | 0.000     | 0.000     | 90.000    | 93.000    | 96.000    
CPU2 Temp        | 37.000     | degrees C  | ok    | 0.000     | 0.000     | 0.000     | 90.000    | 93.000    | 96.000    
System Temp      | 24.000     | degrees C  | ok    | 13.000    | 16.000    | 18.000    | 77.000    | 82.000    | 90.000    
FAN 1            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 2            | 5476.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 3            | na         | RPM        | na    | na        | na        | na        | na        | na        | na        
FAN 4            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 5            | 5476.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 6            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 7            | na         | RPM        | na    | na        | na        | na        | na        | na        | na        
FAN 8            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
CPU1 Vcore       | 1.080      | Volts      | ok    | 0.808     | 0.816     | 0.824     | 1.352     | 1.360     | 1.368     
CPU2 Vcore       | 1.048      | Volts      | ok    | 0.808     | 0.816     | 0.824     | 1.352     | 1.360     | 1.368     
+1.5 V           | 1.504      | Volts      | ok    | 1.320     | 1.328     | 1.336     | 1.656     | 1.664     | 1.672     
+5 V             | 5.088      | Volts      | ok    | 4.416     | 4.448     | 4.480     | 5.536     | 5.568     | 5.600     
+5VSB            | 5.004      | Volts      | ok    | 4.428     | 4.464     | 4.500     | 5.472     | 5.508     | 5.544     
+12 V            | 12.084     | Volts      | ok    | 10.600    | 10.653    | 10.706    | 13.250    | 13.303    | 13.356    
+3.3VCC          | 3.240      | Volts      | ok    | 2.880     | 2.904     | 2.928     | 3.648     | 3.672     | 3.696     
+3.3VSB          | 3.240      | Volts      | ok    | 2.880     | 2.904     | 2.928     | 3.648     | 3.672     | 3.696     
VBAT             | 3.168      | Volts      | ok    | 2.880     | 2.904     | 2.928     | 3.648     | 3.672     | 3.696     
CPU1 VTT         | 1.560      | Volts      | ok    | 1.320     | 1.328     | 1.336     | 1.656     | 1.664     | 1.672     
CPU2 VTT         | 1.560      | Volts      | ok    | 1.320     | 1.328     | 1.336     | 1.656     | 1.664     | 1.672     
Chassis Intru    | 0x0        | discrete   | 0x0000| na        | na        | na        | na        | na        | na        
PS_1 Status      | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_1 Fan Status  | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_1 Temp Status | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_1 Temp        | 34.000     | degrees C  | ok    | na        | na        | na        | na        | na        | na        
PS_2 Status      | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_2 Fan Status  | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_2 Temp Status | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_2 Temp        | 35.000     | degrees C  | ok    | na        | na        | na        | na        | na        | na        
PDB Status       | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        

Monday, 5 Mar 18 10:42:58.572 +0100 [Debug] [HealthMonitorMain] Done StatHealthMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:08.902 +0100 [Debug] [Stat[#1]] StatXenHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:08.920 +0100 [Debug] [Stat[#1]] StatXenHandler: http://169.254.0.1/rrd_updates?session_id=**********&start=1520242988&host=true
Monday, 5 Mar 18 10:43:08.921 +0100 [Debug] [StatMain] Status of all the subsystems :
  401 root          76  49    0   271M 92968K ucond   94.7H  0.00% svm_inventory
  422 root          80  44    0   339M   157M ucond  214:53  0.00% svm_event
  439 root         112  57    0   337M   116M ucond   85:07  0.00% svm_config
  457 root          62  44    0   276M   101M ucond   79:39  0.00% svm_service

Monday, 5 Mar 18 10:43:08.927 +0100 [Debug] [Stat[#1]] StatXenHandler: Retrying because step != 5, step = 3600
Monday, 5 Mar 18 10:43:08.927 +0100 [Debug] [Stat[#1]] StatXenHandler: http://169.254.0.1/rrd_updates?session_id=**********&start=1520254800&host=true
Monday, 5 Mar 18 10:43:08.929 +0100 [Debug] [Stat[#1]] StatXenHandler: Retrying because row_list->length() < 1
Monday, 5 Mar 18 10:43:08.929 +0100 [Debug] [Stat[#1]] StatXenHandler: http://169.254.0.1/rrd_updates?session_id=**********&start=1520251365&host=true
Monday, 5 Mar 18 10:43:08.992 +0100 [Debug] [Stat[#1]] Done StatXenHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:09.023 +0100 [Debug] [Stat[#2]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.17/nitro/v1/stat/ns?format=json
Monday, 5 Mar 18 10:43:09.028 +0100 [Debug] [Stat[#3]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.18/nitro/v1/stat/ns?format=json
Monday, 5 Mar 18 10:43:09.084 +0100 [Debug] [Stat[#3]] Done StatNSHandler for 10.55.81.18
Monday, 5 Mar 18 10:43:09.117 +0100 [Debug] [DiskMonitorMain] StatDiskMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:09.141 +0100 [Debug] [DiskMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=disk_status_get
Monday, 5 Mar 18 10:43:09.144 +0100 [Debug] [MPSMonitorMain] Done getting MPS stats
Monday, 5 Mar 18 10:43:09.156 +0100 [Debug] [Stat[#2]] Done StatNSHandler for 10.55.81.17
Monday, 5 Mar 18 10:43:09.703 +0100 [Debug] [DiskMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=disk_status_get
Monday, 5 Mar 18 10:43:09.703 +0100 [Debug] [DiskMonitorMain] 1 : Localstorage : good
2 : VPX-SR : good

Monday, 5 Mar 18 10:43:09.735 +0100 [Debug] [DiskMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=disk_io_use
Monday, 5 Mar 18 10:43:10.215 +0100 [Debug] [DiskMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=disk_io_use
Monday, 5 Mar 18 10:43:10.215 +0100 [Debug] [DiskMonitorMain] 1 : Localstorage : /dev/sda4 VG_XenStorage-c6f9eb31-24fa-0852-9d56-d3426ade737e lvm2 a-- 226.45G 106.20G
2 : VPX-SR : /dev/sdb VG_XenStorage-f97838ae-6cf2-1ba7-b6bf-46f62db9fa6b lvm2 a-- 931.50G 849.31G

Monday, 5 Mar 18 10:43:10.215 +0100 [Debug] [DiskMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=disk_io_get
Monday, 5 Mar 18 10:43:10.649 +0100 [Debug] [DiskMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=disk_io_get
Monday, 5 Mar 18 10:43:10.649 +0100 [Debug] [DiskMonitorMain] 1 : sda4 24.96 284.76 891.04 5432891069 16999930141
2 : sdb 13.26 8.96 616.54 170870930 11762712178

Monday, 5 Mar 18 10:43:10.650 +0100 [Debug] [DiskMonitorMain] Resetting DB ODBC connection after 301
Monday, 5 Mar 18 10:43:10.675 +0100 [Debug] [DiskMonitorMain] Done StatDiskMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:12.582 +0100 [Debug] [HealthMonitorMain] StatHealthMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:12.586 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: iovirt-sdx, function_name=show_summary
Monday, 5 Mar 18 10:43:12.926 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: iovirt-sdx, function_name=show_summary
Monday, 5 Mar 18 10:43:12.955 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=nic_stats_get_all
Monday, 5 Mar 18 10:43:13.902 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=nic_stats_get_all
Monday, 5 Mar 18 10:43:14.054 +0100 [Debug] [HealthMonitorMain] Entering critical section: ssh on 169.254.0.1
Monday, 5 Mar 18 10:43:14.122 +0100 [Debug] [HealthMonitorMain] Out of critical section: ssh on 169.254.0.1
Monday, 5 Mar 18 10:43:14.263 +0100 [Debug] [HealthMonitorMain] Executing "xenpm get-cpu-topology" command on 169.254.0.1
Monday, 5 Mar 18 10:43:14.308 +0100 [Debug] [HealthMonitorMain] Result of "xenpm get-cpu-topology" command on 169.254.0.1
xenpm get-cpu-topology
CPU    core    socket    node
CPU0     0     0     0
CPU1     0     0     0
CPU2     1     0     0
CPU3     1     0     0
CPU4     2     0     0
CPU5     2     0     0
CPU6     8     0     0
CPU7     8     0     0
CPU8     9     0     0
CPU9     9     0     0
CPU10     10     0     0
CPU11     10     0     0
CPU12     0     1     1
CPU13     0     1     1
CPU14     1     1     1
CPU15     1     1     1
CPU16     2     1     1
CPU17     2     1     1
CPU18     8     1     1
CPU19     8     1     1
CPU20     9     1     1
CPU21     9     1     1
CPU22     10     1     1
CPU23     10     1     1
[root@netscaler-sdx ~]#
Monday, 5 Mar 18 10:43:14.421 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: iovirt-sdx, function_name=get_cavium_vf_summary
Monday, 5 Mar 18 10:43:14.781 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: iovirt-sdx, function_name=get_cavium_vf_summary
Monday, 5 Mar 18 10:43:14.781 +0100 [Debug] [HealthMonitorMain] <?xml version="1.0" ?>
<iovirt>
    <socket>
        <socketnum>
            0
        </socketnum>
        <vfstats>
            <freevfcount>
                4
            </freevfcount>
            <allocatedvfcount>
                4
            </allocatedvfcount>
            <total>
                8
            </total>
        </vfstats>
        <pfcount>
            1
        </pfcount>
        <pfs>
            <pf>
                <id>
                    0000:05:00.0
                </id>
                <allocatedvfs>
                    0000:05:00.4, 0000:05:00.1, 0000:05:00.3, 0000:05:00.2
                </allocatedvfs>
                <freevfs>
                    0000:05:00.5, 0000:05:00.6, 0000:05:00.7, 0000:05:01.0
                </freevfs>
            </pf>
        </pfs>
    </socket>
    <socket>
        <socketnum>
            1
        </socketnum>
        <vfstats>
            <freevfcount>
                5
            </freevfcount>
            <allocatedvfcount>
                3
            </allocatedvfcount>
            <total>
                8
            </total>
        </vfstats>
        <pfcount>
            1
        </pfcount>
        <pfs>
            <pf>
                <id>
                    0000:14:00.0
                </id>
                <allocatedvfs>
                    0000:14:00.3, 0000:14:00.1, 0000:14:00.2
                </allocatedvfs>
                <freevfs>
                    0000:14:00.4, 0000:14:00.5, 0000:14:00.6, 0000:14:00.7, 0000:14:01.0
                </freevfs>
            </pf>
        </pfs>
    </socket>
    <allocatedtovms>
        <vm>
            <uuid>
                27ca8d30-e6b4-46b9-2f0e-c9bf557dd84c
            </uuid>
            <allocatedvfs>
                0000:14:00.2, 0000:05:00.3
            </allocatedvfs>
        </vm>
        <vm>
            <uuid>
                9f401157-3967-41f1-2716-8b36801b875e
            </uuid>
            <allocatedvfs>
                0000:14:00.3, 0000:14:00.1, 0000:05:00.4, 0000:05:00.1, 0000:05:00.2
            </allocatedvfs>
        </vm>
    </allocatedtovms>
</iovirt>

Monday, 5 Mar 18 10:43:14.845 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=ipmi_sensor_list
Monday, 5 Mar 18 10:43:15.038 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=ipmi_sensor_list
Monday, 5 Mar 18 10:43:15.038 +0100 [Debug] [HealthMonitorMain] ipmi_sensor_list:CPU1 Temp        | 43.000     | degrees C  | ok    | 0.000     | 0.000     | 0.000     | 90.000    | 93.000    | 96.000    
CPU2 Temp        | 37.000     | degrees C  | ok    | 0.000     | 0.000     | 0.000     | 90.000    | 93.000    | 96.000    
System Temp      | 24.000     | degrees C  | ok    | 13.000    | 16.000    | 18.000    | 77.000    | 82.000    | 90.000    
FAN 1            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 2            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 3            | na         | RPM        | na    | na        | na        | na        | na        | na        | na        
FAN 4            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 5            | 5476.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 6            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
FAN 7            | na         | RPM        | na    | na        | na        | na        | na        | na        | na        
FAN 8            | 5929.000   | RPM        | ok    | 400.000   | 576.000   | 784.000   | 33856.000 | 34225.000 | 34596.000 
CPU1 Vcore       | 1.088      | Volts      | ok    | 0.808     | 0.816     | 0.824     | 1.352     | 1.360     | 1.368     
CPU2 Vcore       | 1.048      | Volts      | ok    | 0.808     | 0.816     | 0.824     | 1.352     | 1.360     | 1.368     
+1.5 V           | 1.504      | Volts      | ok    | 1.320     | 1.328     | 1.336     | 1.656     | 1.664     | 1.672     
+5 V             | 5.088      | Volts      | ok    | 4.416     | 4.448     | 4.480     | 5.536     | 5.568     | 5.600     
+5VSB            | 5.004      | Volts      | ok    | 4.428     | 4.464     | 4.500     | 5.472     | 5.508     | 5.544     
+12 V            | 12.084     | Volts      | ok    | 10.600    | 10.653    | 10.706    | 13.250    | 13.303    | 13.356    
+3.3VCC          | 3.264      | Volts      | ok    | 2.880     | 2.904     | 2.928     | 3.648     | 3.672     | 3.696     
+3.3VSB          | 3.240      | Volts      | ok    | 2.880     | 2.904     | 2.928     | 3.648     | 3.672     | 3.696     
VBAT             | 3.168      | Volts      | ok    | 2.880     | 2.904     | 2.928     | 3.648     | 3.672     | 3.696     
CPU1 VTT         | 1.560      | Volts      | ok    | 1.320     | 1.328     | 1.336     | 1.656     | 1.664     | 1.672     
CPU2 VTT         | 1.560      | Volts      | ok    | 1.320     | 1.328     | 1.336     | 1.656     | 1.664     | 1.672     
Chassis Intru    | 0x0        | discrete   | 0x0000| na        | na        | na        | na        | na        | na        
PS_1 Status      | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_1 Fan Status  | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_1 Temp Status | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_1 Temp        | 34.000     | degrees C  | ok    | na        | na        | na        | na        | na        | na        
PS_2 Status      | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_2 Fan Status  | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_2 Temp Status | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        
PS_2 Temp        | 35.000     | degrees C  | ok    | na        | na        | na        | na        | na        | na        
PDB Status       | 0x1        | discrete   | 0x0100| na        | na        | na        | na        | na        | na        

Monday, 5 Mar 18 10:43:15.184 +0100 [Debug] [HealthMonitorMain] Done StatHealthMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:22.932 +0100 [Debug] [Stat[#1]] StatXenHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:22.950 +0100 [Debug] [Stat[#1]] StatXenHandler: http://169.254.0.1/rrd_updates?session_id=**********&start=1520243002&host=true
Monday, 5 Mar 18 10:43:22.951 +0100 [Debug] [StatMain] Status of all the subsystems :
  401 root          76  49    0   271M 92968K ucond   94.7H  0.00% svm_inventory
  422 root          80  44    0   339M   157M ucond  214:53  0.00% svm_event
  439 root         112  57    0   337M   116M ucond   85:07  0.00% svm_config
  457 root          62  44    0   276M   101M ucond   79:39  0.00% svm_service

Monday, 5 Mar 18 10:43:22.957 +0100 [Debug] [Stat[#1]] StatXenHandler: Retrying because step != 5, step = 3600
Monday, 5 Mar 18 10:43:22.957 +0100 [Debug] [Stat[#1]] StatXenHandler: http://169.254.0.1/rrd_updates?session_id=**********&start=1520254800&host=true
Monday, 5 Mar 18 10:43:22.959 +0100 [Debug] [Stat[#1]] StatXenHandler: Retrying because row_list->length() < 1
Monday, 5 Mar 18 10:43:22.959 +0100 [Debug] [Stat[#1]] StatXenHandler: http://169.254.0.1/rrd_updates?session_id=**********&start=1520251375&host=true
Monday, 5 Mar 18 10:43:23.027 +0100 [Debug] [Stat[#1]] Done StatXenHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:23.053 +0100 [Debug] [Stat[#2]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.17/nitro/v1/stat/ns?format=json
Monday, 5 Mar 18 10:43:23.058 +0100 [Debug] [Stat[#3]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.18/nitro/v1/stat/ns?format=json
Monday, 5 Mar 18 10:43:23.118 +0100 [Debug] [Stat[#3]] Done StatNSHandler for 10.55.81.18
Monday, 5 Mar 18 10:43:23.125 +0100 [Debug] [Stat[#2]] Done StatNSHandler for 10.55.81.17
Monday, 5 Mar 18 10:43:23.166 +0100 [Debug] [MPSMonitorMain] Done getting MPS stats
Monday, 5 Mar 18 10:43:24.682 +0100 [Debug] [DiskMonitorMain] StatDiskMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:24.698 +0100 [Debug] [DiskMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=disk_status_get
Monday, 5 Mar 18 10:43:25.214 +0100 [Debug] [DiskMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=disk_status_get
Monday, 5 Mar 18 10:43:25.214 +0100 [Debug] [DiskMonitorMain] 1 : Localstorage : good
2 : VPX-SR : good

Monday, 5 Mar 18 10:43:25.246 +0100 [Debug] [DiskMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=disk_io_use
Monday, 5 Mar 18 10:43:25.772 +0100 [Debug] [DiskMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=disk_io_use
Monday, 5 Mar 18 10:43:25.772 +0100 [Debug] [DiskMonitorMain] 1 : Localstorage : /dev/sda4 VG_XenStorage-c6f9eb31-24fa-0852-9d56-d3426ade737e lvm2 a-- 226.45G 106.20G
2 : VPX-SR : /dev/sdb VG_XenStorage-f97838ae-6cf2-1ba7-b6bf-46f62db9fa6b lvm2 a-- 931.50G 849.31G

Monday, 5 Mar 18 10:43:25.772 +0100 [Debug] [DiskMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=disk_io_get
Monday, 5 Mar 18 10:43:26.319 +0100 [Debug] [DiskMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=disk_io_get
Monday, 5 Mar 18 10:43:26.319 +0100 [Debug] [DiskMonitorMain] 1 : sda4 24.96 284.76 891.04 5432891144 16999944717
2 : sdb 13.26 8.96 616.54 170871018 11762716522

Monday, 5 Mar 18 10:43:26.338 +0100 [Debug] [DiskMonitorMain] Done StatDiskMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:29.198 +0100 [Debug] [HealthMonitorMain] StatHealthMonitorHandler for 169.254.0.1
Monday, 5 Mar 18 10:43:29.206 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: iovirt-sdx, function_name=show_summary
Monday, 5 Mar 18 10:43:29.560 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: iovirt-sdx, function_name=show_summary
Monday, 5 Mar 18 10:43:29.592 +0100 [Debug] [HealthMonitorMain] Entering critical plugin section: xsnsmonitor, function_name=nic_stats_get_all
Monday, 5 Mar 18 10:43:30.600 +0100 [Debug] [HealthMonitorMain] Leaving critical plugin section: xsnsmonitor, function_name=nic_stats_get_all
Monday, 5 Mar 18 10:43:30.798 +0100 [Debug] [HealthMonitorMain] Entering critical section: ssh on 169.254.0.1
Monday, 5 Mar 18 10:43:30.871 +0100 [Debug] [HealthMonitorMain] Out of critical section: ssh on 169.254.0.1
 

 

 


Hi,

 

Here is another log:

 

bash-2.05b# tail -n 100 mps_inventory.log
Monday, 12 Mar 18 16:45:24.437 +0100 [Debug] [#4] Starting License inventory
Monday, 12 Mar 18 16:45:25.553 +0100 [Debug] [#4] License read successfully, f.num_licenses=0
Monday, 12 Mar 18 16:45:25.553 +0100 [Debug] [#4] License inventory completed
Monday, 12 Mar 18 16:45:27.206 +0100 [Debug] [MPSMonitorMain] Done getting MPS stats
Monday, 12 Mar 18 16:45:35.264 +0100 [Debug] [Main] Received on responseQ 
{ "errorcode": 0, "message": "Done", "is_user_part_of_default_group": true, "skip_auth_scope": true, "message_id": "", "resrc_driven": true, "login_session_id": "##6D96DAE415BD3752CCDF9C3AFC6ACC0B49A95E226EB0823717694501A771", "username": "ctidei", "tenant_name": "Owner", "mps_ip_address": "172.16.82.98", "client_ip_address": "172.21.45.168", "client_protocol": "http", "client_port": 27680, "mpsSessionId": "", "source": "service", "target": "INVENTORY", "version": "v1", "messageType": "inventory", "client_type": "GUI", "resourceType": "inventory", "orignal_resourceType": "inventory", "resourceName": "", "operation": "get", "asynchronous": false, "params": { "pageno": 0, "pagesize": 0, "detailview": true, "compression": false, "count": false, "total_count": 0, "action": "", "type": "", "onerror": "EXIT", "is_db_driven": false, "order_by": "", "asc": false, "duration": "", "duration_summary": 0, "report_start_time": "0", "report_end_time": "0" }, "filter_props": [ { "device_type": "nssdx" } ], "additionalInfo": { "Referer": "http:\/\/172.16.82.98\/admin_ui\/mas\/ent\/html\/main.html", "cert_present": "false", "rand_key": "f20681cb56caa7c", "request_source": "NITRO_WEB_APPLICATION", "sessionId": "##6D96DAE415BD3752CCDF9C3AFC6ACC0B49A95E226EB0823717694501A771" }, "inventory": [ { "devices": [ ] } ] }
Monday, 12 Mar 18 16:45:35.265 +0100 [Debug] [Main] Incoming request in InventoryProcessor for type "inventory"
Monday, 12 Mar 18 16:45:35.265 +0100 [Information] [Main] Inventory started with id b061a131-fe7b-4a01-878f-1049f96253cd and typenssdx
Monday, 12 Mar 18 16:45:35.267 +0100 [Debug] [Main] Sent on responseQ 
{ "errorcode": 0, "message": "Done", "is_user_part_of_default_group": true, "skip_auth_scope": true, "message_id": "", "resrc_driven": true, "login_session_id": "##6D96DAE415BD3752CCDF9C3AFC6ACC0B49A95E226EB0823717694501A771", "username": "ctidei", "tenant_name": "Owner", "mps_ip_address": "172.16.82.98", "client_ip_address": "172.21.45.168", "client_protocol": "http", "client_port": 27680, "mpsSessionId": "", "source": "service", "target": "INVENTORY", "version": "v1", "messageType": "inventory", "client_type": "GUI", "resourceType": "inventory", "orignal_resourceType": "inventory", "resourceName": "", "operation": "get", "asynchronous": false, "params": { "pageno": 0, "pagesize": 0, "detailview": true, "compression": false, "count": false, "total_count": 0, "action": "", "type": "", "onerror": "EXIT", "is_db_driven": false, "order_by": "", "asc": false, "duration": "", "duration_summary": 0, "report_start_time": "0", "report_end_time": "0" }, "filter_props": [ { } ], "additionalInfo": { "Referer": "http:\/\/172.16.82.98\/admin_ui\/mas\/ent\/html\/main.html", "cert_present": "false", "rand_key": "f20681cb56caa7c", "request_source": "NITRO_WEB_APPLICATION", "rest_params_key_present": "true", "sessionId": "##6D96DAE415BD3752CCDF9C3AFC6ACC0B49A95E226EB0823717694501A771" }, "inventory": [ { "device_ipaddress": "", "device_type": "nssdx", "act_id": "b061a131-fe7b-4a01-878f-1049f96253cd", "agent_id": "", "inventory_status": "Rediscovery started", "devices": [ ] } ] }
Monday, 12 Mar 18 16:45:35.269 +0100 [Debug] [Inventory[#213]] InventorySDXHandler, start for 10.55.81.20
Monday, 12 Mar 18 16:45:35.269 +0100 [Debug] [Inventory[#214]] InventorySDXHandler, start for 10.55.81.15
Monday, 12 Mar 18 16:45:35.278 +0100 [Debug] [Inventory[#213]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.20/nitro/v1/config/mps?format=json
Monday, 12 Mar 18 16:45:35.279 +0100 [Debug] [Inventory[#214]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.15/nitro/v1/config/mps?format=json
Monday, 12 Mar 18 16:45:35.280 +0100 [Debug] [Inventory[#213]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.280 +0100 [Debug] [Inventory[#213]] MPSHTTPClient: MPS GET Request :Not deleting session from table for errorcode : -1
Monday, 12 Mar 18 16:45:35.281 +0100 [Debug] [Inventory[#214]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.281 +0100 [Debug] [Inventory[#214]] MPSHTTPClient: MPS GET Request :Not deleting session from table for errorcode : -1
Monday, 12 Mar 18 16:45:35.281 +0100 [Debug] [Inventory[#213]] Setting old session id for login request for IP: 10.55.81.20
Monday, 12 Mar 18 16:45:35.282 +0100 [Debug] [Inventory[#214]] Setting old session id for login request for IP: 10.55.81.15
Monday, 12 Mar 18 16:45:35.285 +0100 [Debug] [Inventory[#213]] Checking session validity, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.285 +0100 [Debug] [Inventory[#213]] Checking session validity, HTTP Request URL: http://10.55.81.20/nitro/v1/config/mps?view=summary
Monday, 12 Mar 18 16:45:35.285 +0100 [Debug] [Inventory[#214]] Checking session validity, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.285 +0100 [Debug] [Inventory[#214]] Checking session validity, HTTP Request URL: http://10.55.81.15/nitro/v1/config/mps?view=summary
Monday, 12 Mar 18 16:45:35.286 +0100 [Debug] [Inventory[#213]] Checking session validity, HTTP Response: 
Monday, 12 Mar 18 16:45:35.287 +0100 [Debug] [Inventory[#214]] Checking session validity, HTTP Response: 
Monday, 12 Mar 18 16:45:35.287 +0100 [Debug] [Inventory[#213]] HTTP SDX logout request uri path: /nitro/v1/config/login?args=sessionid:##BF444DC425A577B6E842868529EA9564FC2C5BD18AE058044C8291C1C290
Monday, 12 Mar 18 16:45:35.287 +0100 [Debug] [Inventory[#213]] Sending logout request, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.288 +0100 [Debug] [Inventory[#214]] HTTP SDX logout request uri path: /nitro/v1/config/login?args=sessionid:##B8288F5DB5CEA2B7FDC7D41F16EAD348E47935B4F91DC6C00FA062124B81
Monday, 12 Mar 18 16:45:35.288 +0100 [Debug] [Inventory[#214]] Sending logout request, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.288 +0100 [Debug] [Inventory[#213]] HTTP Response for SDX logout request: 
Monday, 12 Mar 18 16:45:35.289 +0100 [Debug] [Inventory[#214]] HTTP Response for SDX logout request: 
Monday, 12 Mar 18 16:45:35.289 +0100 [Debug] [Inventory[#213]] NITRO Message Body: { "login": { "username": "nsroot", "password": "***********" } }, URL: http://10.55.81.20/nitro/v1/config
Monday, 12 Mar 18 16:45:35.290 +0100 [Debug] [Inventory[#214]] NITRO Message Body: { "login": { "username": "nsroot", "password": "***********" } }, URL: http://10.55.81.15/nitro/v1/config
Monday, 12 Mar 18 16:45:35.291 +0100 [Debug] [Inventory[#213]] HTTP Request Protocol: http, ContentType: application/x-www-form-urlencoded, Method: POST, URL: http://10.55.81.20/nitro/v1/config
Monday, 12 Mar 18 16:45:35.291 +0100 [Debug] [Inventory[#214]] HTTP Request Protocol: http, ContentType: application/x-www-form-urlencoded, Method: POST, URL: http://10.55.81.15/nitro/v1/config
Monday, 12 Mar 18 16:45:35.292 +0100 [Debug] [Inventory[#213]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.292 +0100 [Debug] [Inventory[#213]] NITRO Response Received for (login): http://10.55.81.20/nitro/v1/config
Monday, 12 Mar 18 16:45:35.292 +0100 [Error] [Inventory[#213]] executeNITROCommand: Send HTTPMessage Failed:Exception: Blank Response: Json Parsing Error for 10.55.81.20
Monday, 12 Mar 18 16:45:35.292 +0100 [Debug] [Inventory[#213]] Renew Session Login for :10.55.81.20 return code is :10001
Monday, 12 Mar 18 16:45:35.292 +0100 [Debug] [Inventory[#213]] DeviceSessionManager renewHTTPSession: now throwing again exception with errorcode:10001, for IP: 10.55.81.20
Monday, 12 Mar 18 16:45:35.292 +0100 [Error] [Inventory[#213]] http://10.55.81.20/nitro/v1/config/mps?format=json, Reason: Exception: DEVICE_NOT_REACHABLE
Monday, 12 Mar 18 16:45:35.292 +0100 [Error] [Inventory[#213]] SDXNITROBaseHandler::execute and MPSException for url http://10.55.81.20/nitro/v1/config/mps?format=json and returning string 
Monday, 12 Mar 18 16:45:35.293 +0100 [Debug] [Inventory[#214]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.293 +0100 [Debug] [Inventory[#214]] NITRO Response Received for (login): http://10.55.81.15/nitro/v1/config
Monday, 12 Mar 18 16:45:35.293 +0100 [Error] [Inventory[#214]] executeNITROCommand: Send HTTPMessage Failed:Exception: Blank Response: Json Parsing Error for 10.55.81.15
Monday, 12 Mar 18 16:45:35.293 +0100 [Debug] [Inventory[#214]] Renew Session Login for :10.55.81.15 return code is :10001
Monday, 12 Mar 18 16:45:35.293 +0100 [Debug] [Inventory[#214]] DeviceSessionManager renewHTTPSession: now throwing again exception with errorcode:10001, for IP: 10.55.81.15
Monday, 12 Mar 18 16:45:35.293 +0100 [Error] [Inventory[#214]] http://10.55.81.15/nitro/v1/config/mps?format=json, Reason: Exception: DEVICE_NOT_REACHABLE
Monday, 12 Mar 18 16:45:35.293 +0100 [Error] [Inventory[#214]] SDXNITROBaseHandler::execute and MPSException for url http://10.55.81.15/nitro/v1/config/mps?format=json and returning string 
Monday, 12 Mar 18 16:45:35.306 +0100 [Debug] [Inventory[#213]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.20/nitro/v1/config/vm_device?format=json
Monday, 12 Mar 18 16:45:35.306 +0100 [Debug] [Inventory[#214]] HTTP Request Protocol: http, ContentType: , Method: GET, URL: http://10.55.81.15/nitro/v1/config/vm_device?format=json
Monday, 12 Mar 18 16:45:35.307 +0100 [Debug] [Inventory[#213]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.307 +0100 [Debug] [Inventory[#213]] MPSHTTPClient: MPS GET Request :Not deleting session from table for errorcode : -1
Monday, 12 Mar 18 16:45:35.308 +0100 [Debug] [Inventory[#214]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.308 +0100 [Debug] [Inventory[#214]] MPSHTTPClient: MPS GET Request :Not deleting session from table for errorcode : -1
Monday, 12 Mar 18 16:45:35.308 +0100 [Debug] [Inventory[#213]] Setting old session id for login request for IP: 10.55.81.20
Monday, 12 Mar 18 16:45:35.309 +0100 [Debug] [Inventory[#214]] Setting old session id for login request for IP: 10.55.81.15
Monday, 12 Mar 18 16:45:35.311 +0100 [Debug] [Inventory[#213]] Checking session validity, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.311 +0100 [Debug] [Inventory[#213]] Checking session validity, HTTP Request URL: http://10.55.81.20/nitro/v1/config/mps?view=summary
Monday, 12 Mar 18 16:45:35.312 +0100 [Debug] [Inventory[#214]] Checking session validity, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.312 +0100 [Debug] [Inventory[#214]] Checking session validity, HTTP Request URL: http://10.55.81.15/nitro/v1/config/mps?view=summary
Monday, 12 Mar 18 16:45:35.312 +0100 [Debug] [Inventory[#213]] Checking session validity, HTTP Response: 
Monday, 12 Mar 18 16:45:35.313 +0100 [Debug] [Inventory[#214]] Checking session validity, HTTP Response: 
Monday, 12 Mar 18 16:45:35.314 +0100 [Debug] [Inventory[#213]] HTTP SDX logout request uri path: /nitro/v1/config/login?args=sessionid:##BF444DC425A577B6E842868529EA9564FC2C5BD18AE058044C8291C1C290
Monday, 12 Mar 18 16:45:35.314 +0100 [Debug] [Inventory[#213]] Sending logout request, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.315 +0100 [Debug] [Inventory[#214]] HTTP SDX logout request uri path: /nitro/v1/config/login?args=sessionid:##B8288F5DB5CEA2B7FDC7D41F16EAD348E47935B4F91DC6C00FA062124B81
Monday, 12 Mar 18 16:45:35.315 +0100 [Debug] [Inventory[#214]] Sending logout request, rand_key is blank: 
Monday, 12 Mar 18 16:45:35.315 +0100 [Debug] [Inventory[#213]] HTTP Response for SDX logout request: 
Monday, 12 Mar 18 16:45:35.316 +0100 [Debug] [Inventory[#214]] HTTP Response for SDX logout request: 
Monday, 12 Mar 18 16:45:35.316 +0100 [Debug] [Inventory[#213]] NITRO Message Body: { "login": { "username": "nsroot", "password": "***********" } }, URL: http://10.55.81.20/nitro/v1/config
Monday, 12 Mar 18 16:45:35.317 +0100 [Debug] [Inventory[#214]] NITRO Message Body: { "login": { "username": "nsroot", "password": "***********" } }, URL: http://10.55.81.15/nitro/v1/config
Monday, 12 Mar 18 16:45:35.317 +0100 [Debug] [Inventory[#213]] HTTP Request Protocol: http, ContentType: application/x-www-form-urlencoded, Method: POST, URL: http://10.55.81.20/nitro/v1/config
Monday, 12 Mar 18 16:45:35.318 +0100 [Debug] [Inventory[#214]] HTTP Request Protocol: http, ContentType: application/x-www-form-urlencoded, Method: POST, URL: http://10.55.81.15/nitro/v1/config
Monday, 12 Mar 18 16:45:35.319 +0100 [Debug] [Inventory[#213]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.319 +0100 [Debug] [Inventory[#213]] NITRO Response Received for (login): http://10.55.81.20/nitro/v1/config
Monday, 12 Mar 18 16:45:35.319 +0100 [Error] [Inventory[#213]] executeNITROCommand: Send HTTPMessage Failed:Exception: Blank Response: Json Parsing Error for 10.55.81.20
Monday, 12 Mar 18 16:45:35.319 +0100 [Debug] [Inventory[#213]] Renew Session Login for :10.55.81.20 return code is :10001
Monday, 12 Mar 18 16:45:35.319 +0100 [Debug] [Inventory[#213]] DeviceSessionManager renewHTTPSession: now throwing again exception with errorcode:10001, for IP: 10.55.81.20
Monday, 12 Mar 18 16:45:35.319 +0100 [Error] [Inventory[#213]] http://10.55.81.20/nitro/v1/config/vm_device?format=json, Reason: Exception: DEVICE_NOT_REACHABLE
Monday, 12 Mar 18 16:45:35.319 +0100 [Error] [Inventory[#213]] SDXNITROBaseHandler::execute and MPSException for url http://10.55.81.20/nitro/v1/config/vm_device?format=json and returning string 
Monday, 12 Mar 18 16:45:35.319 +0100 [Debug] [Inventory[#213]] VM_DEVICE NO: 0
Monday, 12 Mar 18 16:45:35.320 +0100 [Debug] [Inventory[#214]] Possible JSON Parsing issue for: 
Monday, 12 Mar 18 16:45:35.320 +0100 [Debug] [Inventory[#214]] NITRO Response Received for (login): http://10.55.81.15/nitro/v1/config
Monday, 12 Mar 18 16:45:35.320 +0100 [Error] [Inventory[#214]] executeNITROCommand: Send HTTPMessage Failed:Exception: Blank Response: Json Parsing Error for 10.55.81.15
Monday, 12 Mar 18 16:45:35.320 +0100 [Debug] [Inventory[#214]] Renew Session Login for :10.55.81.15 return code is :10001
Monday, 12 Mar 18 16:45:35.320 +0100 [Debug] [Inventory[#214]] DeviceSessionManager renewHTTPSession: now throwing again exception with errorcode:10001, for IP: 10.55.81.15
Monday, 12 Mar 18 16:45:35.320 +0100 [Error] [Inventory[#214]] http://10.55.81.15/nitro/v1/config/vm_device?format=json, Reason: Exception: DEVICE_NOT_REACHABLE
Monday, 12 Mar 18 16:45:35.320 +0100 [Error] [Inventory[#214]] SDXNITROBaseHandler::execute and MPSException for url http://10.55.81.15/nitro/v1/config/vm_device?format=json and returning string 
Monday, 12 Mar 18 16:45:35.320 +0100 [Debug] [Inventory[#214]] VM_DEVICE NO: 0
Monday, 12 Mar 18 16:45:35.320 +0100 [Debug] [Inventory[#213]] OLD VM_DEVICE NO: 0  NEW VM DEVICE NO:0
Monday, 12 Mar 18 16:45:35.322 +0100 [Debug] [Inventory[#214]] OLD VM_DEVICE NO: 0  NEW VM DEVICE NO:0
Monday, 12 Mar 18 16:45:40.556 +0100 [Debug] [#4] Starting License inventory
Monday, 12 Mar 18 16:45:41.254 +0100 [Debug] [MPSMonitorMain] Done getting MPS stats
Monday, 12 Mar 18 16:45:41.667 +0100 [Debug] [#4] License read successfully, f.num_licenses=0
Monday, 12 Mar 18 16:45:41.667 +0100 [Debug] [#4] License inventory completed
Monday, 12 Mar 18 16:45:55.308 +0100 [Debug] [MPSMonitorMain] Done getting MPS stats
Monday, 12 Mar 18 16:45:56.670 +0100 [Debug] [#4] Starting License inventory
Monday, 12 Mar 18 16:45:57.786 +0100 [Debug] [#4] License read successfully, f.num_licenses=0
Monday, 12 Mar 18 16:45:57.786 +0100 [Debug] [#4] License inventory completed
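
The repeated "Blank Response: Json Parsing Error" followed by DEVICE_NOT_REACHABLE means MAS receives an empty reply when it sends the NITRO login to the SDXs over plain HTTP. One thing we are going to verify (our guess): whether the SDX Management Service is set to accept secure access only, in which case port 80 would answer with nothing. A quick probe (sketch; -k skips certificate validation):

# Compare plain HTTP vs HTTPS against the same URL MAS calls in the log above;
# curl prints 000 when the server returns an empty reply.
curl -s  -o /dev/null -w 'http  -> %{http_code}\n' 'http://10.55.81.15/nitro/v1/config/mps?view=summary'
curl -sk -o /dev/null -w 'https -> %{http_code}\n' 'https://10.55.81.15/nitro/v1/config/mps?view=summary'

If only the HTTPS probe answers, the fix would be to let MAS talk HTTPS to the instances (or to relax the secure-access-only setting on the SDX).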


